Surviving Among Giants: How HMD Thrives Through Partnerships

By Hervé Legenvre

In less than a decade, HMD has created a niche in the mobile phone market despite fierce competitors. Here is a look at its survival strategies other small businesses could borrow.

In an industry dominated by behemoths such as Apple, Samsung, and Huawei, entering and sustaining a position in the mobile device market is a daunting task for any small company. Yet, HMD, the Finnish company responsible for the resurgence of Nokia-branded smartphones, provides an intriguing story on how to thrive in this challenging environment.

Established in 2016, HMD has leveraged strategic partnerships and differentiation strategies to compete in a saturated market. This article examines the company’s collaborative origins, its focus on reparability and mental health as differentiators, and the broader implications for small businesses aiming to survive against corporate giants.

The Genesis of HMD: A Collaborative Foundation

A Bold Vision for Nokia’s Revival

HMD was formed with a singular goal: to bring the iconic Nokia brand back into the smartphone market. This vision was bold, given the rapid advancements in mobile technology and the market’s domination by entrenched players. Yet, rather than attempting to build every capability in-house, HMD Global embraced a partnership-driven model that allowed it to focus on core competencies while leveraging the strengths of industry leaders.

Partnerships at the Core

Four pivotal partnerships shaped HMD’s trajectory in its early years:

  • Nokia: The partnership with Nokia allowed HMD to license one of the most recognized names in mobile technology. Nokia’s legacy brought immediate brand recognition, credibility and goodwill to HMD’s offerings, an asset in a crowded market.
  • Foxconn: Through a manufacturing partnership with Foxconn, HMD avoided the capital-intensive process of establishing its own production facilities. Foxconn’s global manufacturing expertise ensured that HMD’s devices met high-quality standards while maintaining cost efficiency.
  • Google: By integrating Google’s pure Android operating system into its devices, HMD provided a user experience free from the bloatware often associated with competing brands. This decision aligned with consumer preferences for clean, reliable, and regularly updated software.
  • Qualcomm: Partnering with Qualcomm enabled HMD to leverage industry-leading chipset technology, ensuring that its devices delivered competitive performance, energy efficiency, and connectivity.

HMD’s growth is also supported by global partnerships that facilitate access to new audiences and industry segments. Collaborations with companies such as Heineken, Mattel, and FC Barcelona enable the integration of mobile technology into specific consumer experiences. By engaging with diverse audience groups, including sports enthusiasts and families, HMD enhances the visibility and relevance of its products.

In 2017, HMD launched its first line of Nokia-branded smartphones alongside feature phones that harkened back to Nokia’s heyday. These devices quickly gained traction, praised for their minimalist design and affordability. By the end of its first year, HMD had sold over 70 million devices, a remarkable achievement for a nascent company operating in a highly competitive industry. The collaborative model had laid a solid foundation for HMD’s future.

Jean-François Baril, HMD Founder and CEO, advocates a four-ingredient recipe for partnerships:

Ambition: Collaborations need a bold, shared vision that transcends the status quo. This vision might involve reshaping existing markets or creating entirely new ones, but it must be transformative.

Viability: Partnerships require clear expectations and transparency. Both parties should understand their roles and rewards to minimize conflicts and establish fairness.

Benevolence: Greed is a poison to collaboration. Success comes from focusing on mutual value creation and addressing risks together. This builds confidence and strengthens the partnership.

Passion and Trust: Genuine enthusiasm and trust are non-negotiable. Shared passion for the mission enables partners to navigate obstacles and remain united in pursuit of their goals.

Differentiating through Reparability and Mental Health


As HMD matured, the company recognized the need to differentiate itself in a crowded market. While competitors focused on cutting-edge camera technologies or foldable screens, HMD adopted a unique approach: prioritizing reparability and mental health. This strategy not only set the company apart but also aligned with growing consumer demand for responsible products.

Modern smartphones are often criticized for their “planned obsolescence”—a design philosophy that prioritizes frequent upgrades over durability and reparability. HMD sought to challenge this norm by designing devices that consumers could easily repair themselves. Key elements of this strategy included:

  • User-Centric Design: Models such as the Nokia G22 and G42 5G were designed with DIY repairs in mind. Users could replace common wear-and-tear components like screens, batteries, and back covers using basic tools. This approach made repairs accessible and cost-effective, reducing the financial and environmental burden of device replacement.
  • Collaboration with iFixit: Recognizing that reparability requires more than just hardware, HMD partnered with iFixit, a leading provider of repair guides and replacement parts. This partnership ensured that consumers had the resources they needed to maintain their devices, reinforcing HMD’s commitment to usability and sustainability.

HMD Global embraced a partnership-driven model that allowed it to focus on core competencies while leveraging the strengths of industry leaders.

HMD’s emphasis on reparability aligns seamlessly with broader societal trends toward sustainability. The growing “right to repair” movement, coupled with increasing awareness of electronic waste, has fostered an environment where repairable devices are viewed not merely as novelties but as necessities. By extending the lifespan of its smartphones and incorporating recycled materials into its designs, HMD has successfully appealed to environmentally conscious consumers, particularly younger demographics. This commitment to sustainability has positioned HMD among the top 1% of companies globally, earning it a prestigious EcoVadis Platinum rating for three consecutive years.

HMD also plays a purposeful role in the digital detox movement by offering consumers tangible solutions to manage screen time and mental well-being. The company has spearheaded the resurgence of feature phones, like the Nokia 2660 Flip, as an alternative for those seeking a break from digital overload. With initiatives like The Better Phone Project, HMD is going even further, working with parents, experts, and campaigners to co-create devices that offer balance and control over smartphone use—particularly for younger generations. As concerns about screen addiction and mental health rise, HMD is not just highlighting the problem but providing real, practical solutions that empower consumers to reconnect with themselves and their surroundings.

A Partnership-Centric Growth Path

HMD’s future growth is anchored in the power of partnerships with three key growth paths:

Device Financing

HMD’s device financing initiatives are powered by a partnership with M-KOPA, a leading micro-finance institution. This innovative model integrates HMD’s proprietary Softlock technology with its partner’s financing capabilities. By eliminating the need for large upfront payments, HMD enables customers—particularly in emerging markets—to adopt a pay-as-you-go model, allowing them to own high-quality smartphones through affordable and flexible payment plans. This strategy addresses the digital divide and could enable millions of people in underserved markets to participate in the digital economy. This partnership facilitates access to premium devices, enhances productivity, connects individuals with opportunities, and improves quality of life. HMD’s partnership-driven financing model is positioned as a catalyst for social and economic inclusion.

Secure Devices

In the secure device market, HMD recognizes that partnerships are the driving force behind innovation and growth. Through joint R&D investments with strategic partners, the company is expanding its presence in critical sectors such as the military and healthcare, where security and reliability are paramount.

By extending the lifespan of its smartphones and incorporating recycled materials into its designs, HMD has successfully appealed to environmentally conscious consumers, particularly younger demographics.

One such partnership is HMD’s OffGrid initiative, in collaboration with Bullitt and FocusPoint, which represents a strategic move into the rugged and emergency communication segment. By partnering with Bullitt, a leader in satellite and rugged mobile technology, HMD ensures that its OffGrid solutions meet the demands of users in extreme environments, from outdoor adventurers to remote workers. FocusPoint’s expertise in crisis response and global assistance further enhances the offering, providing users with reliable safety and emergency connectivity services. This partnership positions HMD at the intersection of durability, security, and advanced mobile connectivity, expanding its portfolio beyond traditional consumer devices.

Family-Oriented Services

HMD’s family-focused solutions are built on its collaboration with Xplora, a leader in wearable technology. Xplora combines safety, communication, and activity-tracking features to offer families practical tools for staying connected. In this partnership, HMD’s device portfolio serves as the foundation for Xplora’s innovative services. Xplora’s technology allows parents to monitor their children’s location in real-time, set safe zones, and communicate with pre-approved contacts via voice calls or text messages—eliminating the risks associated with traditional smartphones. By combining HMD’s robust hardware capabilities with Xplora’s expertise in family-oriented software and services, the partnership delivers comprehensive solutions tailored to the unique needs of families.

Across all these initiatives, partnerships are not merely a minor complementary aspect of HMD’s growth strategy—they are its foundation. By fostering collaboration, co-innovation, and shared vision with key partners, HMD is unlocking new opportunities, expanding its reach, and delivering solutions that drive mutual success in an increasingly competitive landscape.

Reflections and Lessons for Small Businesses

HMD’s journey offers valuable insights for other small companies navigating industries dominated by large incumbents. Key lessons include:

  1. Leverage Strategic Partnerships: By collaborating with established players like Nokia, Foxconn, Google and Qualcomm, as well as global players like Heineken, Mattel and FC Barcelona, HMD was able to punch above its weight, accessing resources and expertise that would have been unattainable independently.
  2. Identify and Own a Niche: HMD’s focus on reparability and mental health allowed it to stand out in a crowded market. For small businesses, finding and owning a specific niche can be a powerful strategy for differentiation.
  3. Align with Societal Trends: HMD’s emphasis on sustainability and tangible solutions to mental health reflects an astute understanding of emerging consumer values. Companies that align their offerings with emergent trends are better positioned to achieve long-term relevance.
  4. Be Adaptive: HMD’s ability to pivot from nostalgic feature phones to sustainability and detox-focused smartphones demonstrates the importance of adaptability. Small businesses must remain flexible to respond to changing market dynamics.

Overcoming Challenges in a Competitive Landscape


HMD’s story is one of resilience, innovation, and collaboration that led to the creation of the largest European mobile phone company, with 520 employees across 30+ countries and shipping to approximately 100 nations.

By embracing a partnership-driven model and focusing on reparability, the company has carved out a unique position in the mobile device market. Yet, in its quest for long-term survival, HMD faces formidable challenges. The mobile device market is characterized by intense competition, rapid innovation cycles, and significant marketing expenditures—factors that favour established giants.

While reparability and mental health are compelling differentiators, they are not yet primary purchasing criteria for most consumers. HMD must balance the need to innovate in other areas, such as camera technology and processing power, while maintaining its commitment to affordability. Also, as a smaller company, HMD is more susceptible to disruptions in its supply chain, whether due to COVID-19, geopolitical tensions, market volatility, or natural disasters. Finally, competing against companies with massive advertising budgets presents a significant hurdle. HMD must continue to rely on creative and cost-effective marketing strategies, as well as global partners for broader reach, to build brand awareness and communicate its value proposition.

As HMD looks to the future, it must double down on its collaborative ethos, forging new partnerships and strengthening existing ones. The company’s ability to survive and thrive will depend not only on its innovations but also on its capacity to build a robust ecosystem of allies. In a world where no company is an island, the spirit of partnership is not just a strategy—it is the only way forward.

Collaboration is a mindset that demands ambition, clarity, and trust. Organizations must approach partnerships with a commitment to shared growth, openness, and the willingness to tackle challenges together. When this mindset is in place, partnerships can unlock new market opportunities, enhance operational excellence, and drive meaningful innovation.

About the Author

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. He has been one of TEBR’s esteemed columnists since 2023, having contributed thought-provoking insights to the publication since 2019. Lately, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org).

The Future of AI Language Models (LMs): Three Scenarios that could Reshape Business and Society
By Hervé Legenvre, Erkko Autio and Xule Lin

This article is part six of an ongoing series – AI Power Plays – that explores the fiercely competitive AI landscape, where tech giants and startups battle for dominance while navigating the delicate balance of competition and collaboration. In this final article, we focus on the ongoing ‘dominant design’ battle among Language Models (LMs) such as ChatGPT, Gemini, and DeepSeek and consider three scenarios for how AI LMs might evolve.

The AI LM Inflection Point

AI LMs are approaching an inflection point. Rapid advances in model architecture (DeepSeek’s V3 and R1), computing power (NVIDIA’s Project DIGITS democratising access to AI infrastructure), and autonomous agency (OpenAI’s Deep Research enabling AI to autonomously explore and analyse web information) have propelled AI to the forefront of business agendas, national policies, and everyday life. But the path forward is anything but settled.

The recent breakthroughs in AI LMs reflect fundamental design choices about how AI LMs are built and deployed. We predict that over time, different ‘dominant designs’ of AI LMs will emerge, based on fundamental design choices and the use cases where LMs are applied. Understanding these parameters is crucial for grasping future scenarios for AI LMs.

Design Choices Shaping AI LM Futures

Large vs Small LMs. The relationship between model size and capability is undergoing a fundamental transformation. What began as a simple correlation – “bigger is better” – has evolved into a more nuanced interplay of architecture, efficiency, and specialised expertise. This dichotomy is reflected in many of the design parameters we highlight below.

Scale and Training Cost. Related to size, the general trend for LMs has been towards large models that contain tens of billions of parameters and are trained with vast datasets. This suggests a massive upfront investment in model training and hardware infrastructure and implies a future dominated by tech giants and government-backed corporations. However, technological breakthroughs such as those heralded by DeepSeek may upend this trend and open the door for resourceful startups, academic institutions, and open-source communities to enter the market with smaller and specialised LMs. Access to high-quality data for model training may become a key differentiator for LMs.

Operating Costs are shaped by the efficiency of AI hardware and the complexity of user queries. Use cases dominated by simple queries do not require heavy processing, whereas agentic LMs with deep reasoning abilities will be more costly to operate. This has implications for the cost of adding new users and applicable revenue models. We see different use cases emerging, addressed by differently designed LMs that are operated by different players.

Proprietary vs Open LMs. Currently, the LM landscape features a mix of proprietary (e.g., OpenAI’s GPT and o series, Google’s Gemini, Anthropic’s Claude) and more open LMs (e.g., Meta’s LLaMA, NVIDIA’s NVLM, Mistral’s Pixtral, DeepSeek’s R1) made available through Hugging Face. The two represent radically different approaches to the LM business. Whereas proprietary LMs tend to be large and premium ones, models with less restrictive licenses have been adopted by many players and applied to a wide range of use cases. Often the same players offer both proprietary and more open LMs (e.g., Google’s Gemini and Gemma models).

Through model distillation or quantisation techniques, the capabilities of models can be preserved while reducing resource requirements.

Model Architecture and Training Process: Brute Force vs Efficiency. The Mixture-of-Experts (MoE) architecture exemplifies the evolution from brute force to efficiency. Consider DeepSeek R1: while housing 671 billion (B) parameters, it activates only about 37B for any specific task, while matching the performance of larger models. The future lies not in sheer size, but in the intelligent orchestration of specialised experts. The trend toward efficiency extends beyond architecture to the training process. Low-Rank Adaptation (LoRA) allows models to adapt to new domains without significant computational overhead. Through model distillation or quantisation techniques, the capabilities of models can be preserved while reducing resource requirements.
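To make the LoRA idea concrete, the sketch below shows, in PyTorch, a linear layer whose pre-trained weight is frozen while a small, trainable low-rank correction is learned on top of it. The layer size, rank, and scaling factor are illustrative assumptions rather than the configuration of any specific model.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen base projection plus a trainable low-rank update (B @ A)."""
    def __init__(self, in_features: int, out_features: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)   # the pre-trained weight stays frozen
        self.base.bias.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank              # standard LoRA scaling

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base output plus the low-rank correction; only lora_a and lora_b are trained.
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling

layer = LoRALinear(768, 768, rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"{trainable} trainable parameters out of {total}")  # a small fraction of the layer
```

Because only the two small matrices are updated, the same pre-trained model can in principle be adapted to many domains at a fraction of the cost of full fine-tuning.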

Regulation. The regulation of AI LMs ranges from tightly controlled regimes that prioritise national security and sovereignty to more open, market-driven regimes that foster rapid innovation and global collaboration. The regulatory dimension shapes strategic dependencies, military applications, government investment, technological and data sovereignty, cultural norms, and trade policies.

The extent of AI regulation directly influences whether a single dominant design will emerge or whether multiple regional standards will coexist. Tightly regulated AI LM environments promote fragmentation, sovereignty-driven divergence, and government-imposed standards, reducing the likelihood of a globally unified design. Flexible regulations foster convergence, enabling a few powerful firms or open-source communities to establish dominant designs through competitive selection.

In the long term, AI LM regulation will determine whether AI LMs follow the trajectory of global technological convergence (as seen in internet protocols and semiconductors) or regional divergence (as seen in telecom standards and cybersecurity models). Beyond technical aspects, the future of AI LM dominant designs will also be shaped by the balance between regulatory intervention, geopolitical constraints, and market forces.

Deployment Architectures shape how AI systems operate and interact with users. This includes both physical architecture (where processing happens) and logical architecture (how systems are controlled and moderated).

For physical architecture, the choice largely depends on computing requirements. Large models typically demand cloud deployment in centralised data centres, creating provider dependencies but simplifying deployment. Edge computing runs smaller models on local devices, offering autonomy but facing computational limits. Apple Intelligence demonstrates an emerging hybrid approach: specialised models handle simple tasks locally while routing complex operations to the cloud, suggesting future systems may emphasise intelligent resource distribution over centralisation.
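The sketch below illustrates this hybrid pattern: simple requests stay on a small local model while more complex ones are sent to a hosted model. The routing heuristic, function names, and threshold are hypothetical placeholders for illustration; production systems typically rely on learned routers and real model endpoints.

```python
def answer_locally(prompt: str) -> str:
    # Placeholder for a small on-device model (e.g., a quantised LM running at the edge).
    return f"[local model] quick answer to: {prompt}"

def answer_in_cloud(prompt: str) -> str:
    # Placeholder for a call to a large hosted model behind a provider API.
    return f"[cloud model] detailed answer to: {prompt}"

def route(prompt: str, complexity_threshold: int = 30) -> str:
    """Crude complexity proxy: long or analytical prompts go to the cloud."""
    is_complex = len(prompt.split()) > complexity_threshold or "analyse" in prompt.lower()
    return answer_in_cloud(prompt) if is_complex else answer_locally(prompt)

print(route("What time is it in Helsinki?"))                        # handled on-device
print(route("Analyse last quarter's figures and draft a summary"))  # escalated to the cloud
```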

The logical architecture determines how model capabilities are accessed and controlled. Model providers (e.g., OpenAI, DeepSeek) enforce strict moderation through safety classifiers. Cloud platforms (e.g., Microsoft Azure, Amazon Bedrock, Nebius) offer flexible controls over model behaviour within platform guidelines. Self-managed deployments through rented compute or local installations provide complete control over model boundaries – crucial for enterprises handling sensitive data or requiring specialised behaviours. These deployment choices shape market dynamics and innovation patterns. While integrated cloud solutions attract enterprises seeking reliability, self-managed deployments appeal to those prioritising autonomy. Regional deployments using hybrid infrastructures serve specific market and regulatory needs.

From Design Choices to scenarios

These parameters interact and influence each other, creating the conditions for different possible futures. Based on how these deployment patterns interact with fundamental design parameters—scale of investment, proprietary versus open approaches, and regulatory frameworks—we see three scenarios emerging: (1) Corporate-Led Standardisation, (2) Decentralised Innovation, and (3) Geopolitical Fragmentation. Each scenario arises from specific dynamics—such as the amount of up-front investment required, control dynamics, technology maturity, implementation patterns and operational costs—and provides insight into how AI might evolve over the next decade. By understanding these scenarios, business leaders, policymakers, and technologists can better prepare for whichever future becomes dominant.

Scenario 1: Corporate-Led Standardisation

Dominant design choices at play:

  • Upfront Investment: Large
  • Deployment Architecture: Cloud
  • Proprietary vs open: mainly proprietary
  • Operational Costs: High

In this scenario, well-funded technology giants who control or partner with cloud platforms—think Google, Microsoft, OpenAI, and a few others—take the lead in building and maintaining AI technologies. Because the up-front costs are immense, only these behemoths can afford to invest.

These dominant firms would offer AI capabilities via cloud platforms (subscription and API access). Businesses, governments, and individuals gain access to state-of-the-art (SOTA) models but remain heavily dependent on corporate providers, such as Microsoft’s Azure, Amazon’s Bedrock, and various providers available on Hugging Face and OpenRouter. These providers enforce strict validation and safety controls through their official deployments, maintaining tight governance over how their models are used.
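From the buyer’s side, consuming such state-of-the-art models typically looks like the short sketch below, shown here with the OpenAI Python SDK as one example provider. The model name and prompts are illustrative, and a paid account with an API key exposed via the OPENAI_API_KEY environment variable is assumed.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example hosted model; the provider controls pricing and access
    messages=[
        {"role": "system", "content": "You are an enterprise copilot for financial analysis."},
        {"role": "user", "content": "Summarise the key risks in our Q3 cash-flow statement."},
    ],
)
print(response.choices[0].message.content)  # the model runs on the provider's infrastructure
```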

As running inference on large AI models consumes large amounts of computing power, a few players who can manage these costs will set the price for accessing AI technologies. Smaller organisations may be priced out, limiting competition and reinforcing a cycle where AI remains an elite tool controlled by a few firms.

An emblematic use case: Industry and Enterprise-specific AI Copilots where large corporations in finance, healthcare, and legal industries rely on AI copilots for tasks like financial analysis or healthcare diagnostics. These systems would be standardised, secure, and integrated with existing enterprise software.

In essence, Scenario 1 paints a world where scale and control of computing power win the day. While it delivers highly efficient, well-tested AI solutions, it risks locking businesses into proprietary ecosystems that limit choice, hamper competition, and concentrate profits and power at the top. The true moat in this scenario isn’t just money – it is the integration of specialised hardware, vast data centres, and proprietary training methods. Like oil refineries of the digital age, these AI factories require both enormous capital and deep technical expertise to operate efficiently.

Scenario 2: Decentralised Innovation

Dominant design choices at play:

  • Upfront Investment: Small
  • Deployment Architecture: Edge
  • Proprietary vs open: mainly open
  • Operational Costs: Low

Using clever training approaches rather than brute force computing power, smaller teams proved they could match tech giants’ capabilities.

In this scenario, AI development is driven by vibrant open-source communities and a diverse range of stakeholders rather than a handful of dominant tech companies. Consider how recent breakthroughs by DeepSeek challenged conventional wisdom: using clever training approaches rather than brute force computing power, smaller teams proved they could match tech giants’ capabilities. This suggests a future where innovation comes from unexpected places, as tools and knowledge become more widely accessible by large audiences instead of a few large players.

In this scenario, research collectives, universities, non-profits, startups, open-source linchpins such as Hugging Face, and even some large corporations—such as Meta and Alibaba, which integrate AI into their existing platforms without commercialising AI technologies—collaborate actively. They share new model architectures, training datasets, and software tools through public repositories, fostering transparency and generative innovation by large, distributed developer communities.

With affordable specialised hardware and efficient model training techniques, even small teams can develop and refine AI models (e.g., Mistral and DeepSeek). This accessibility fosters a culture of rapid experimentation, democratising technological and use case innovation.

Open-source projects constantly iterate, pivoting quickly with each new breakthrough. For instance, a new training algorithm discovered by a small research lab can be rapidly adopted worldwide. AI-native platforms like Aimlapi accelerate this pace by providing seamless access to a broad range of foundation models through lightweight, developer-friendly APIs. This enables rapid prototyping and experimentation, fostering a decentralized model of progress—faster, but potentially more chaotic.

Instead of sending data to cloud platforms, most processing takes place on local data centres and installations. This approach reduces reliance on cloud services and can lead to substantial cost savings. Edge-based AI also enhances privacy by keeping sensitive data local, making it particularly valuable in scenarios where confidentiality is key. Users have full control over model behaviour and validation parameters, enabling more flexible and customised deployments, while taking on greater responsibility for safety and governance.

An emblematic use case: Local AI Assistants where individuals run personal AI assistants on their phones and computers (e.g., Apple’s M-series laptops and NVIDIA’s Project DIGITS), without relying on centralised servers. These local AI tools learn from personal data privately, respecting user privacy and control.
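A minimal sketch of such a local assistant, assuming the Hugging Face transformers library is installed, is shown below. The model named is a small public example chosen only because it runs on an ordinary laptop CPU without any cloud call; it is not a recommendation.

```python
from transformers import pipeline

# Download once, then run entirely on the local machine: no data leaves the device.
generator = pipeline("text-generation", model="distilgpt2")

prompt = "Reminder for this afternoon: take a short screen break and"
result = generator(prompt, max_new_tokens=30, do_sample=False)
print(result[0]["generated_text"])  # the prompt plus the locally generated continuation
```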

On the flip side, Scenario 2 may come with challenges around standardisation. Without a powerful central authority, ensuring consistent security, data governance, and reliability becomes more difficult. Still, this vision highlights an exciting possibility: AI technology that is truly of the people, by the people, and for the people—grassroots, inventive, and broadly accessible.

Scenario 3: Geopolitical Fragmentation

Dominant design choices at play:

  • Upfront Investment: Small
  • Deployment Infrastructure: Mix of Cloud and Edge
  • Proprietary vs open: combination
  • Operational Costs: Medium

Unlike the first two scenarios, which emphasise either corporate dominance or grassroots innovation, Scenario 3 places governments at the centre of AI development where nations develop, adopt and customise models, creating relatively Balkanised regional AI ecosystems to safeguard national interests and technological sovereignty.

Medium-sized and large countries in particular tend to want to avoid overreliance on foreign corporations, especially when it comes to strategic technologies. To preserve a degree of technological sovereignty, countries may promote open-source standards not only in LMs but also in related technologies (e.g., RISC-V for microprocessor architectures, the Open Compute Project for data centre hardware). They may promote investment in local cloud infrastructures and fine-tune open models to align with regional priorities. This way, different regions may promote distinct “flavours” of AI that reflect their unique characters. DeepSeek’s R1 model, for instance, demonstrates deep understanding of both classical traditions (Tang Dynasty poetry) and contemporary cultural dynamics (Baidu Tieba and RedNote social networks), while Claude and Grok models excel at parsing complex social dynamics on platforms like Reddit and 4chan (from meme culture to community-specific discourse patterns). This could herald a future where regional AI ecosystems diverge, supporting different languages, ethical frameworks, and security protocols.

An emblematic use case: France’s public authorities have introduced a sovereign LM, Albert, which is progressively expanding within the country’s public administration. The system aims to reduce reliance on foreign technologies and reinforce national control over sensitive data. Today, it assists administrative advisors in responding to citizen inquiries with reliable information and is embedded within the government’s secure messaging system. Albert also serves as an API-based infrastructure, providing computational resources and machine learning algorithms for public institutions developing AI-powered solutions. However, the tax authorities prefer to develop their own LM and to avoid using Albert for sensitive data.

For nations pursuing technological sovereignty, Scenario 3 could provide strategic autonomy and localised innovation. But it also risks deepening divisions between regions, making global cooperation on AI ethics, safety, and research more difficult.

Conclusion: Preparing for the AI Worlds Ahead

We expect that multiple dominant designs will co-exist, each optimised for different use cases and constraints.

The three scenarios are not mutually exclusive. We expect that multiple dominant designs will co-exist, each optimised for different use cases and constraints. The different dominant design parameters are also not mutually exclusive and often interact. It is virtually guaranteed that technological breakthroughs will continue to emerge and upend different scenarios and their technological and use case drivers. The evolution will be iterative, and dominant designs will shift over time. We further expect that open-source communities and commercial providers will co-exist in a dynamic equilibrium: corporations continue to adopt open-source breakthroughs, and open-source projects benefit from corporate-funded infrastructure (e.g., LLaMA and DeepSeek models running on Groq servers in Saudi Arabia or on Nebius servers in Finland).

What do these scenarios mean for business leaders, policymakers, and innovators charting their paths today?

Anticipate Power Shifts. In a corporate-led world, forging strong alliances with tech giants and maintaining sufficient capital reserves for AI solutions will be essential. In a decentralised innovation logic, adaptability, open-source collaborations, and edge-based solutions become key. Meanwhile, in a fragmented globe, the ability to understand and comply with diverse national regulations will become a prerequisite to success.

Balance Innovation with Governance. Whichever direction AI takes, companies must keep one eye on short-term performance gains and the other on long-term ethical and regulatory obligations. Stakeholders need to champion responsible data use, equity, and security, or risk public backlash and legal scrutiny.

Balance AI Investments. Given the unpredictability of breakthroughs and the fluid nature of regulations, spreading resources across multiple strategies—corporate partnerships, open-source initiatives, and strategic national collaborations—helps hedge against sudden disruptions. Non-profit organisations should also prioritise training and governance: adopting a learning management system (LMS) suited to non-profits can help scale ethics, compliance, and AI-literacy programmes across distributed teams.

No matter which paths AI LMs take, and there will be several, AI’s influence on business, society, and global politics is set to intensify. The key question isn’t just who will own the dominant AI designs—it is how we can guide AI’s development to serve the broadest possible set of human interests. By understanding potential AI LM futures, stakeholders can better position themselves while working toward an AI ecosystem that benefits all society.

About the Authors

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org).

Erkko Autio FBA FFASL is a Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. He co-founded the Global Entrepreneurship Monitor (www.gemconsortium.org), the Global Entrepreneurship Index (thegedi.org), and Wicked Acceleration Labs (www.wickedacceleration.org).

Xule Lin is a PhD Candidate in Management and Entrepreneurship at Imperial College Business School, studying how human and machine intelligences shape the future of organizing. His work received the 2024 Strategic Management Society PhD Paper Prize and research grants from OpenAI, Google Cloud, and Cohere for AI. He co-organizes the “Human & Artificial Intelligence in Organizations” symposium at Imperial (www.haiosymposium.com).

Hugging Face: Why Do Most Tech Companies in AI Collaborate with Hugging Face?
By Hervé Legenvre and Erkko Autio

This article is part five of an ongoing series – The AI Power Plays – that explores the fiercely competitive AI landscape, where tech giants and startups battle for dominance while navigating the delicate balance of competition and collaboration to stay at the cutting edge of AI innovation.

This article examines how Hugging Face has emerged as a linchpin in the AI landscape through its innovative community-centric business model and seamless integration of AI models, datasets, frameworks, hardware, and cloud platforms. By fostering open-source collaboration and building partnerships with major technology providers, Hugging Face enables developers, researchers, and enterprises to co-create, share, and scale AI solutions efficiently. 

The history of Hugging Face

Hugging Face was established in New York City in 2016 by French entrepreneurs Clément Delangue, Julien Chaumond, and Thomas Wolf. As they wanted to explore state-of-the-art natural language processing (NLP) technology, they created an AI-based chatbot for teenagers. From this, they expanded their focus to developing tools and resources to advance AI, particularly in NLP. By 2018, Hugging Face expanded its focus to creating open-source tools that democratize access to AI models and released the Transformers library, a toolkit that made it easier to use and fine-tune transformer-based language models (like BERT, GPT, and others) for various NLP tasks. This sparked the emergence of a community of developers that used this library and the tools added by Hugging Face, whose mission had by now consolidated around democratizing artificial intelligence through open-source contributions. Today, Hugging Face has received investments from Google, Amazon, Nvidia, AMD, Intel, IBM, Qualcomm, and others. The company maintains a repository of over 1.2 million pre-trained models for a variety of AI tasks. These models can be used with little configuration, which reduces the complexity and resource requirements of building AI solutions from scratch.

Hugging Face: a linchpin in the AI landscape

Today, Hugging Face has established itself as a linchpin in the AI landscape. The company facilitates the seamless integration of technological capabilities, fosters community collaboration, and acts as a catalyst for innovation. It empowers organizations, developers, and researchers to co-create and use pre-trained models and a variety of tools for building AI applications.

The platform creates seamless connection and integration between AI models, datasets, cloud platforms, and hardware, thus accelerating the adoption and deployment of AI solutions.


Figure 1: Hugging Face – a linchpin for the AI landscape

On the Hugging Face platform, the Model Hub is the central repository for hosting, sharing, selecting, and accessing pre-trained models across different domains (e.g., NLP, computer vision, audio processing, and multimodal tasks). Hugging Face facilitates the connection between AI models and thousands of datasets, enabling these datasets to be pre-processed with just a few lines of code before being used to train and fine-tune AI models for specific use cases.
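In practice, the “few lines of code” workflow described above might look like the sketch below, which assumes the transformers and datasets libraries and uses common public model and dataset names purely as examples.

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Pull a public dataset and a pre-trained model straight from the Hub.
dataset = load_dataset("imdb", split="train[:1%]")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

def preprocess(batch):
    # Tokenise the raw text so it is ready for fine-tuning.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

tokenized = dataset.map(preprocess, batched=True)
print(tokenized)  # the tokenised split, ready for a Trainer or a custom training loop
```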

Hugging Face integrates AI models with popular deep learning frameworks such as PyTorch and TensorFlow. This integration simplifies the use of machine learning models and offers developers and researchers the flexibility to work within their preferred framework. The platform thus eliminates the complexity associated with running AI models across different frameworks. These integrations are the result of Hugging Face’s collaboration with PyTorch (Meta) and TensorFlow (Google) teams and related contributions from the open-source community.

Hugging Face also facilitates the integration of models with diverse cloud platforms to enable scalable training, deployment, and ‘running the model’. Major cloud providers are keen to simplify users’ access to their infrastructure, and they therefore partner with Hugging Face to ensure that AI workflows can be executed efficiently on their platforms.

Finally, the same collaboration and integration logic also applies to hardware providers. These collaborations and integrations ensure that AI models run optimally on specific hardware configurations, making them faster and more scalable. Hardware providers are therefore keen to support Hugging Face so AI workflows can be executed efficiently on their hardware.

Hugging Face serves as a linchpin in the AI ecosystem because it sits at the intersection of AI models, datasets, frameworks, hardware, and cloud platforms, creating a cohesive environment where these technologies can work together seamlessly. Key technology providers—such as NVIDIA, AMD, Google, AWS, Meta and others—have a vested interest in ensuring that their tools, hardware, and platforms integrate effortlessly with Hugging Face’s offerings. This integration not only enhances the usability of their technology but also provides them with access to Hugging Face’s vast and engaged community of developers, researchers, and enterprises. By fostering interoperability and collaboration, Hugging Face helps technology providers amplify their reach while empowering users to build, deploy, and scale AI solutions faster and more effectively. This unique position makes Hugging Face an indispensable linchpin in the AI community and facilitates the evolution of the entire AI landscape.

How does Hugging Face capture value?

Its strong open-source ethos is the reason why Hugging Face has been able to establish itself as a linchpin in the AI community – yet, this ethos also means that Hugging Face needs a business model that balances community collaboration with monetization. Accordingly, its primary revenue streams derive from subscription-based enterprise offerings, advanced tools, and partnerships.

Access to the Hugging Face Hub remains free for individual developers. Yet, it also offers paid Pro and Enterprise Plans for businesses. These plans provide private model hosting, advanced collaboration tools, and integration with cloud platforms to enable business customers to build AI projects while protecting their proprietary assets. Hugging Face has also created advanced proprietary tools, offered at a price, that can be used to train and access pre-trained AI models.

Hugging Face also generates revenue from partnerships with major cloud providers, including AWS, Microsoft Azure, and Google Cloud, which further expand Hugging Face’s reach. These partnerships facilitate seamless integrations and related revenue-sharing opportunities, enabling enterprises to more effectively deploy Hugging Face’s tools on their platforms.

This combination of paid and open-source offerings ensures that Hugging Face captures value while continuing to democratize AI through open-source innovation and OSS community-driven progress.

Hugging Face’s community-centric business model

Hugging Face’s community-centric business model casts various AI community stakeholders—developers, researchers, contributors, technology providers and others—as active co-creators of shared open-source resources rather than passive consumers of Hugging Face’s offerings. By fostering shared value creation and collective engagement, community-centric open-source business models empower their communities to drive innovation while sustaining long-term growth within the developer community. This business model builds on five key success factors:

  • Shared Purpose: Hugging Face unites the AI developer community around a common purpose: advancing the frontiers of AI through collaboration and open-source ethos. By offering access to tools like the Transformers library, Hugging Face aligns academic researchers, businesses, and developers with this shared purpose. This helps foster a sense of ownership and motivates community members to contribute and engage meaningfully.
  • Delivering Tangible Value: Hugging Face provides a platform for accessing, sharing, and deploying machine learning models and datasets. It also ensures seamless connectivity and compatibility across multiple AI resources. By lowering barriers to entry, the Hugging Face platform empowers developers to focus on building innovative solutions and use cases instead of duplicating foundational work. This ability to build on the contributions of others helps deliver immediate, tangible value and encourages continued engagement and adoption.
  • Recognition: Hugging Face ensures that contributors receive meaningful recognition for their participation. Users who upload pre-trained models, datasets, or who make other contributions gain visibility and recognition within the Hugging Face community. This community recognition helps maintain a positive dynamic where contributors’ efforts are acknowledged and amplified while enriching the overall ecosystem and its shared resources.
  • Collective and decentralized innovation: Hugging Face harnesses the collective intelligence of its community while encouraging decentralized innovation. By empowering developers to actively shape open-source tools and technologies, the platform remains responsive to emerging trends. This adaptability allows Hugging Face to remain the community of choice for developers and researchers in the rapidly evolving AI landscape.
  • Trust Through Transparency: Transparency is a cornerstone of Hugging Face’s success. Open governance, clear and transparent decision-making and related rules, and transparent resource allocation foster trust and accountability. By aligning stakeholders with the platform’s mission, Hugging Face builds trust and encourages long-term participation and collaboration.

Hugging Face’s ability to integrate these principles demonstrates the power of community-centric open-source business models. By uniting its community around shared purpose, delivering tools that create tangible value, and fostering trust, Hugging Face has built a self-sustaining developer ecosystem to help drive progress in AI.


Coming next in the AI Power Play series:

  • The forces shaping AI open-source dynamics

About the Authors

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org).

Erkko Autio FBA FFASL is a Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. He co-founded the Global Entrepreneurship Monitor (www.gemconsortium.org), the Global Entrepreneurship Index (thegedi.org), and Wicked Acceleration Labs (www.wickedacceleration.org).

NVIDIA: Harnessing Open Innovation to Promote User Lock-in
By Hervé Legenvre and Erkko Autio

This article is part four of an ongoing series – The AI Power Plays – that explores the fiercely competitive AI landscape, where tech giants and startups battle for dominance while navigating the delicate balance of competition and collaboration to stay at the cutting edge of AI innovation.

NVIDIA is now the world’s most valuable semiconductor company. Its high-end GPUs (Graphics Processing Units) power approximately 80% of the market for generative AI. This dominance stems from its early investment in programmable GPUs and the development of CUDA, its software development platform that enabled developers to leverage the parallel processing power of GPUs for diverse use cases.

We explore how NVIDIA built a complete digital technology stack and combined proprietary and open innovation strategies to promote customer lock-in. We start with a brief account of the NVIDIA story.

History of NVIDIA

NVIDIA was founded in 1993 with a mission to design Graphics Processing Units and introduced 3D graphics to computer gaming and multimedia markets. As its GPUs rapidly gained traction in the marketplace, NVIDIA launched CUDA, its Compute Unified Device Architecture platform, in 2006 to help expand GPU use cases. This move was inspired by the adoption of NVIDIA’s GeForce GPUs by French researchers for scientific computing, which revealed a powerful new use case for NVIDIA hardware. Although proprietary to NVIDIA, CUDA was openly accessible for external developers to develop different CUDA-based applications, eventually helping build an ecosystem around CUDA.

NVIDIA dedicated considerable resources to CUDA’s development and promotion. The company built a dedicated compiler team, developed SDKs and libraries, and actively engaged with the developer community, promoting the advantages of GPU computing. NVIDIA also supported developers in marketing their CUDA-based applications, thereby facilitating the growth of an ecosystem that would ultimately become a cornerstone behind NVIDIA’s success.
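To give a flavour of what the CUDA programming model exposes, the sketch below runs a vector addition across many GPU threads using Numba’s Python bindings for CUDA rather than NVIDIA’s own C/C++ toolkit. It assumes a CUDA-capable GPU, working drivers, and the numba and numpy packages, and is purely illustrative.

```python
import numpy as np
from numba import cuda

@cuda.jit
def add_vectors(a, b, out):
    i = cuda.grid(1)          # global index of this GPU thread
    if i < out.size:          # guard against threads beyond the array length
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.arange(n, dtype=np.float32)
b = 2 * a

d_a, d_b = cuda.to_device(a), cuda.to_device(b)   # copy inputs to GPU memory
d_out = cuda.device_array_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_vectors[blocks, threads_per_block](d_a, d_b, d_out)  # each thread adds one element

print(d_out.copy_to_host()[:5])  # copy the result back and inspect the first values
```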

Another pivotal event in NVIDIA’s history occurred in 2012, which fundamentally altered the company’s trajectory. Researchers at the University of Toronto used NVIDIA GPUs to train a deep learning model known as “AlexNet”. This model achieved unprecedented performance in image recognition. This breakthrough demonstrated the potential of GPUs for AI applications, and it helped spark widespread interest in deep learning.

NVIDIA’s years-long cultivation of the CUDA ecosystem meant that NVIDIA was now ideally positioned to seize this new opportunity. This “AlexNet moment” marked the beginning of NVIDIA’s rapid expansion in the AI and data center markets, and it helped establish NVIDIA’s GPUs as essential foundations for deep learning and AI research.

In 2019, NVIDIA further strengthened its position by acquiring Mellanox Technologies, a leader in high-performance networking solutions. Mellanox’s technology complemented NVIDIA’s GPUs, as they alleviated a key bottleneck in data centers: the need for super-fast, massive-volume data transfer between GPUs within and across data centers. This acquisition helped further solidify NVIDIA’s standing in the data center market and allowed it to offer a comprehensive hardware and software platform for AI and high-performance computing.

The 2020s have witnessed a further consolidation of NVIDIA’s role as a keystone provider of AI computing solutions, with NVIDIA’s data center revenue soaring in response to the growing adoption of deep learning across industries. NVIDIA continues to innovate and release successive generations of powerful GPUs, and it continues to expand its software platform to support an ever-widening array of AI applications, thereby helping cement its influence across technology layers.

While AI has been a significant growth driver for NVIDIA, the company has broadened its reach by developing solutions tailored to specific use cases and markets.

In the automotive sector, NVIDIA introduced the DRIVE platform—a comprehensive hardware and software stack designed to integrate AI and autonomous capabilities into self-driving vehicles. Within the DRIVE platform, CUDA enables the execution of complex algorithms essential for autonomous driving, including real-time image processing, sensor fusion, and deep learning inference. NVIDIA’s automotive strategy focuses on partnering with established automakers and equipping them with the necessary tools and technology to develop and deploy self-driving vehicles.

NVIDIA has also made significant strides with Omniverse, a platform for creating and managing digital twins, which are virtual representations of real-world objects and environments. Leveraging NVIDIA’s expertise in graphics processing, AI, simulation, and robotics, Omniverse allows companies to simulate and test modifications to their real-world assets in a virtual environment before implementation. By 2022, over 700 companies had adopted Omniverse, which demonstrates its value in enabling realistic and interactive simulations that help drive operational efficiencies and innovation across industries.

NVIDIA Technology Stack

This account of NVIDIA’s history illustrates how various layers and interconnections have been progressively added to its digital stack over the years.

Figure 1: NVIDIA’s technology stack


Figure 1 illustrates NVIDIA’s technology stack in a layered format and showcases how its components build on each other to power advanced applications. At the bottom is the GPU Layer, which provides the hardware for processing data and supporting applications. Above that is the CUDA platform, which includes the software tools developers use to make full use of NVIDIA’s hardware.

The next level is domain-specific platforms such as NVIDIA Drive (for autonomous driving), Clara (for healthcare), and Omniverse (for digital simulations). These platforms are tailored to meet the needs of specific industries and tasks.

On either side, we see complementary resources: AI Framework Integration on the left, which connects NVIDIA’s stack with popular AI tools and frameworks, and AI Models, Data Centers, and Cloud Solutions on the right, which represent the large-scale infrastructure needed to handle and store the vast amounts of data processed by these systems.

At the very top are end-user applications that consumers and businesses use, all built on top of this powerful technology stack.

Three Open Innovation Models

When building and consolidating its dominant position in the AI and Machine Learning technology stack, NVIDIA has skilfully combined proprietary and open-source strategies. We identified three models of open innovation used by NVIDIA and the digital affordances each harnessed. The first was collaborative innovation, through which NVIDIA facilitated the joint development of specific use cases with selected partners. This approach allowed NVIDIA to expand and enhance platform functionalities. The affordance here is controlled co-creation, which has enabled innovation and customization while helping preserve proprietary control over the core CUDA platform. As an example, NVIDIA leveraged collaborative innovation with companies like Google and Meta to enhance the performance and usability of Google’s and Meta’s widely-used deep learning frameworks – TensorFlow and PyTorch – on NVIDIA’s hardware.

The second open innovation model used by NVIDIA consisted of providing openly accessible resources to developers. These included APIs, software libraries, and tools that support the CUDA platform’s technical ecosystem. The digital affordance here was usability without modification: users could leverage these resources for application development but could not alter or redistribute them. This approach broadened the developer community’s access to essential resources and enabled wider developer participation while allowing NVIDIA to keep the related resources proprietary. A wide range of tools, libraries, and resources are accessible to developers on the CUDA platform.

The third open innovation strategy harnessed by NVIDIA, open-source resources, offered a critical affordance: customizability. By providing full access to the source code, open-source resources enable users to customize, extend, and redistribute the resource itself. Open-source software supports collaborative development and community-driven innovation. For instance, all 500 of the world’s most powerful supercomputers, which support specialized scientific applications, run some variant of Linux to meet their unique demands. NVIDIA’s decision to open-source its Linux GPU kernel modules empowers developers to customize and optimize GPU drivers, addressing the specific performance and scalability requirements of high-performance computing (HPC) environments.

Each of the three open innovation strategies offered unique affordances for NVIDIA, allowing it to support diverse types of user and developer engagement and to foster innovation consistent with the platform’s strategic goals and collaboration needs.

How NVIDIA’s Open Innovation Strategies Fostered User Lock-In

The three models of open innovation allowed NVIDIA to expand its footprint in the AI technology stack and foster user lock-in. Collaborative innovation with the developers of TensorFlow and PyTorch helped optimize these frameworks for NVIDIA GPUs and the CUDA platform, thus encouraging widespread adoption and user lock-in without compromising NVIDIA’s proprietary architecture.

Similarly, when NVIDIA engaged in collaborative innovation with vehicle manufacturers to advance autonomous driving and customize the DRIVE platform for the client, the client’s systems became optimized for NVIDIA hardware. This optimization increased switching costs for the client, thereby promoting user lock-in.

NVIDIA’s strategy of providing broad access to CUDA resources, such as the free CUDA Toolkit, documentation, and educational materials, effectively reduced entry barriers for new users. This accessibility drove widespread adoption and also user lock-in, as developers became invested in the CUDA ecosystem and increasingly dependent on NVIDIA’s tools, hardware, and expertise for application development. Once developers had invested time and effort in mastering CUDA, they became less inclined to learn alternative platforms that would require new skillsets and offer less community support.
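
The following sketch illustrates the kind of investment involved: writing even a trivial GPU kernel means learning CUDA’s thread-and-block execution model. The example uses the open-source Numba package to express a CUDA kernel in Python; it is illustrative only and assumes a machine with an NVIDIA GPU and the CUDA toolkit installed.

    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_kernel(a, b, out):
        i = cuda.grid(1)              # global index of this GPU thread
        if i < out.size:
            out[i] = a[i] + b[i]

    n = 1_000_000
    a = np.ones(n, dtype=np.float32)
    b = np.ones(n, dtype=np.float32)
    out = np.zeros(n, dtype=np.float32)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    add_kernel[blocks, threads_per_block](a, b, out)   # Numba copies the arrays to and from the GPU

Skills of this kind (thinking in threads, blocks, and device memory) transfer poorly to rival platforms, which is why the learning investment itself becomes a switching cost.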

Finally, NVIDIA’s open-source strategy, under which only specific components (such as Linux kernel modules and certain libraries) were made open source, helped build trust among developers and expand CUDA’s appeal without relinquishing control of the core platform. This approach allowed users to customize and integrate CUDA into their workflows without fully enabling independence from NVIDIA’s ecosystem.

By selectively open-sourcing its Linux GPU kernel modules, NVIDIA strategically positioned itself within the supercomputing market, where developers favor open-source solutions for customizing their infrastructure. This approach incentivized institutions to invest in CUDA-optimized applications and NVIDIA hardware, thereby fostering ecosystem lock-in.

In 2024, NVIDIA went one step further and decided to release leading-edge Large Language Models under an open-source licence. NVIDIA’s NVLM 1.0, a multimodal LLM, performs well on both vision-language and text-only tasks, and its performance rivals that of proprietary LLMs from OpenAI and Google. NVIDIA also released an open-source AI model named Nemotron that builds on Meta’s Llama-3 framework and outperforms OpenAI’s latest models in various benchmark tests.

The release of NVLM 1.0 and Nemotron sent a clear message to NVIDIA clients that they do not need proprietary models such as ChatGPT or Gemini for applications built on top of NVIDIA tools, as the NVIDIA technology stack is now able to support image- and video-heavy ML models.

Conclusion

In conclusion, NVIDIA’s journey from a graphics card manufacturer to a platform company at the heart of the AI revolution showcases a powerful ability to leverage a mix of open innovation models to drive user lock-in without losing control of proprietary foundations.

Table 1: Open Innovation Strategies Used by NVIDIA


NVIDIA’s strategy demonstrates that a platform can be both generous in sharing resources and strategic in creating a robust, loyal user base. Through a blend of collaborative innovation, openly accessible resources, and selective open-source offerings, NVIDIA has crafted an ecosystem that attracts users while creating lasting dependence on its platform. As illustrated in Table 1, this nuanced approach to open innovation exemplifies how selective openness drives platform dominance and user loyalty. As a consequence, NVIDIA is now the dominant force in the GPU market, holding an impressive 84% market share and counting over 4 million CUDA developers.

Previously in the AI Power Play series

Coming next in the AI Power Play series

  • Why do most tech companies want to collaborate with Hugging Face?
  • AI open-source dynamics: a complementarity perspective

About the Authors

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org).

Erkko Autio

Erkko Autio FBA FFASL is Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. He co-founded the Global Entrepreneurship Monitor (www.gemconsortium.org), the Global Entrepreneurship Index (thegedi.org), and Wicked Acceleration Labs (www.wickedacceleration.org). 

Google and AI: The Tech Leader That Had a Perfect AI Plan Until November 2022

By Hervé Legenvre and Erkko Autio

This article is part three of an ongoing series – The AI Power Plays – that explores the fiercely competitive AI landscape, where tech giants and startups battle for dominance while navigating the delicate balance of competition and collaboration to stay at the cutting edge of AI innovation. 

For several years, Google was widely regarded as the gravity well in the AI ecosystem, owing to its ability to steer developments through the judicious release of AI tools and technologies as open-source. However, the meteoric rise of ChatGPT, the AI-driven chatbot developed by OpenAI, has exposed the limits in Google’s capacity to translate the resulting ecosystem momentum into proprietary competitive advantage. In this article, we examine these dynamics. 

Open Technologies and Ecosystem Momentum: The TensorFlow story 

In 2015, Google open-sourced TensorFlow, a framework library that became a cornerstone of the AI and Machine Learning technology stack. By virtue of being openly available to researchers and developers, TensorFlow helped consolidate the hitherto fragmented AI and ML technology landscape and it rapidly became the dominant platform for the design and deployment of AI models. This move had a dramatic effect on the AI ecosystem momentum, as it boosted the development and sharing of AI models, complementary technologies and related tools. Large cohorts of developers were trained in online courses or using books published by high-profile AI researchers. The growing TensorFlow community quickly started to build upon and contribute to the TensorFlow platform. This momentum attracted a new generation of computer scientists and developers to AI. Importantly, this development accelerated the discovery and validation of new AI use cases, as AI start-ups started to attract large amounts of capital and clients across different sectors, thereby driving market creation for AI applications. 
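
As a minimal illustration of what TensorFlow offered this community, the sketch below defines and compiles a small neural network in a few lines of Python, which is why the framework lowered the barrier to building AI models so dramatically. The layer sizes are arbitrary and the example is illustrative only.

    import tensorflow as tf

    # A small feed-forward network defined with the Keras API bundled in TensorFlow.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.summary()    # training would then be a single call to model.fit(features, targets)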

By 2020, TensorFlow had reached 153,000 GitHub stars and 115 million downloads—a testament to its widespread adoption. By open-sourcing TensorFlow, Google had not simply forfeited intellectual property; rather, it was establishing and consolidating its AI ecosystem leadership. 

Capturing Profit from AI: A Nearly Perfect Three-Horizon Strategy 

While Google boosted open innovation through its selective open-source strategy, it also implemented a ‘Three-Horizon Strategy’ to capture value from AI. Horizon 1 aimed at harvesting AI advances within Google’s core products. Horizon 2 aimed to catch up with Amazon Web Services and Microsoft Azure in the cloud service business by integrating AI advances into Google’s cloud offerings. Horizon 3 aimed at harvesting AI advances within ‘moonshot’ projects. 

Horizon 1: Integrating AI into Core Products 

The first horizon in Google’s AI strategy focused on embedding AI capabilities across its existing product portfolio—including Google Search, Google Ads, Google Docs, and YouTube. For instance, following the introduction of the transformer architecture in 2017, Google integrated its BERT language model into Google Search in 2019 to provide more nuanced and contextually appropriate results. Many such integrations, while enhancing the functionalities of Google’s core products, were often almost imperceptible to Google’s search users. 
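
As an illustration of what such a model does, the sketch below queries the publicly released bert-base-uncased research checkpoint through the open-source Hugging Face transformers library; it predicts a masked word from its context, the core capability that made BERT useful for interpreting search queries. This is an illustrative example only, not the production system used inside Google Search.

    from transformers import pipeline

    # Public research checkpoint, not Google's internal Search model.
    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    for candidate in unmasker("Travellers to Brazil [MASK] a visa."):
        print(candidate["token_str"], round(candidate["score"], 3))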

Horizon 2: Cloud AI Tools and Solutions 

The second horizon of Google’s AI strategy sought to differentiate its cloud services with enhanced AI capabilities. In the mid-2010s, Google was trailing behind industry leaders Amazon Web Services (AWS) and Microsoft Azure. To catch up, Google launched several AI tools, including AutoML in 2018, Google AI Platform in 2019, and Vertex AI in 2021. These tools provided advanced interfaces for building and deploying AI and machine learning applications within Google Cloud. However, AWS and Microsoft quickly imitated many of Google’s innovations, as many models and tools essential for AI were accessible under permissive open-source software (OSS) licenses. 

Horizon 3: Moonshot Projects Leveraging AI 

The third horizon of Google’s strategy involved moonshot projects that pushed the boundaries of AI to create powerful new applications. Among these, Waymo stood out as a particularly ambitious initiative—leveraging Google’s AI expertise to bring self-driving technology to market and lead the autonomous vehicle industry. Waymo represents an attempt to harness AI advances to establish leadership in a sector beyond Google’s core markets. However, Horizon 3 projects have yet to generate significant revenue, in spite of helping expand Google’s strategic reach. 

The Unexpected Shock: Generative AI and ChatGPT 

Google’s three-horizon strategy was disrupted by the rapid rise of generative AI, particularly by the launch of OpenAI’s ChatGPT in the fall of 2022. This conversational AI model instantly captured consumer imagination and enterprise interest worldwide, and Microsoft moved quickly to incorporate ChatGPT’s technology into Bing and its Office products. 

This launch exposed what was considered a weakness in Google’s selectively open AI strategy, as it struggled to quickly launch a conversational AI that matched ChatGPT’s capabilities. 

While Google’s three-horizon approach looked effective in both capturing current value and laying the foundation for future growth, the rapid adoption of ChatGPT demonstrated that in a dynamic and rapidly evolving technology ecosystem, momentum can be seized through bold and well-timed moves.  

Despite Google’s extensive investments into nurturing the AI and machine learning ecosystem, Google’s response to ChatGPT appeared reactive rather than proactive. This delay has prompted observers to question why a company as well-positioned as Google was taken by surprise by a competitor.  

Below, we examine three key hypotheses for why Google found it challenging to respond rapidly to OpenAI’s threat and examine how Google’s response has reshaped its AI strategy. 

Hypothesis 1: Myopia—An Innovation Blind Spot 

Our first hypothesis centres on the concept of myopia—a strategic blind spot that prevented Google from fully appreciating the transformative impact of conversational AI. While Google viewed AI as a powerful complementary technology that boosts its core businesses and moonshot projects, the company underestimated the disruptive potential of a freely accessible AI chatbot. In contrast, OpenAI quickly framed ChatGPT not merely as a technology demonstrator, but as a user-centric product in its own right that offered a virtually limitless range of potential use cases. 

Hypothesis 2: Fear of Hallucinations and Ethical Concerns 

The second hypothesis highlights Google’s concerns regarding the inherent risks of releasing generative AI models—particularly the ‘hallucination’ issue, which causes AI models to generate content that may be factually inaccurate, misleading, or potentially offensive. This concern was underscored by an incident in 2022, when a Google engineer publicly claimed that the company’s AI chatbot, LaMDA, had achieved sentience. This episode served as a stark reminder of the challenges associated with deploying powerful language models. While the prospect of reputational damage made Google hesitant to release its AI-powered chatbot, OpenAI adopted a bolder posture and released ChatGPT early. 

Hypothesis 3: The Wall Between Advanced Research and Business Activities 

Our third hypothesis highlights a disconnect between Google’s advanced research and its business operations, which slowed down the transformation of technological achievements into commercially viable applications. Agile innovation requires close alignment between research and business units. However, the structural separation between these activities at Google meant that breakthroughs in AI were not as efficiently translated into new products as they ideally should be. 

This disconnect was likely further exacerbated by Google’s open-sourcing strategy for AI technologies. By making many of its models and tools open-source, Google was able to generate significant ecosystem momentum. However, this strategy also allowed competitors, including OpenAI, Microsoft, and AWS to capitalize on AI advances developed within the open-source community. 

The combined impact of internal structural separation and the open-sourcing strategy meant that Google may have inadvertently undermined its ability to fully capitalize on its research leadership. This issue highlights the tension between an open-source strategy and proprietary business development—an ongoing challenge for organizations that seek both technological innovation and market leadership. 

Google’s Response: New Products and Reorganization 

Faced with the competitive pressure from OpenAI and Microsoft’s integration of ChatGPT into Bing, Google initiated a multi-faceted response that sought to both catch up technologically and adapt organizationally. The response consisted of two major initiatives. 

Launching Competing Products. Google responded to the rise of ChatGPT by launching Bard, its own conversational AI model, in February 2023. However, the public launch was marred by inaccuracies, which detrimentally affected public perception of Bard. In December 2023, Google announced an upgrade to Bard by powering it with its new LLM, Gemini. Google rebranded Bard as Gemini in February 2024 and started to compete directly with OpenAI’s offerings. Some critics noted that Google’s response to OpenAI was slower than expected, given Google’s extensive resources and expertise in AI research. 

Organizational Reorganization: Bridging Research and Business. The second, and arguably more strategic, response was a reorganization aimed at reducing the gap between Google’s research and business divisions. Reports of internal reorganization efforts suggest that Google has sought to bring its AI research closer to product teams, thereby enabling a more seamless transition from innovation to market. This reorganization also involved a shift away from the previous emphasis on open-sourcing technologies, which, while not abandoned, has been de-prioritized. 

Conclusion 

Google’s delayed response to OpenAI’s ChatGPT can be attributed to a combination of strategic myopia, risk aversion, and internal organizational disconnects. These challenges also illustrate that open-technology strategies, while powerful when building momentum around general-purpose technologies, are not always a panacea when it comes to value appropriation. We should keep in mind, however, that the competitive situation in AI remains highly dynamic, and AI and ML technologies continue to evolve at a breakneck pace. We expect to witness numerous shifts in competitive positions as new capabilities are introduced and use cases demonstrated. 

Previously in the AI Power Play series  

Coming next in the AI Power Play series: 

  • Why do most tech companies want to collaborate with Hugging Face?
  • The rise of Nvidia and its recent open-source stride 

About the Authors 

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org). 

Erkko Autio

Erkko Autio FBA FFASL is Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. He co-founded the Global Entrepreneurship Monitor (www.gemconsortium.org), the Global Entrepreneurship Index (thegedi.org), and Wicked Acceleration Labs (www.wickedacceleration.org). 

Why is OpenAI Moving Towards a Closed Source Strategy?

By Hervé Legenvre and Erkko Autio

This article is part two of the ongoing series—The AI Power Plays—that explores the fiercely competitive AI landscape where tech giants and startups battle for dominance while navigating the delicate balance of competition and collaboration to stay at the cutting edge of AI innovation. 

The Mission of OpenAI: Benefiting Humanity 

OpenAI was founded with a bold vision: to ensure that the development of artificial intelligence would benefit all humanity. Contrary to popular belief, OpenAI was never meant to be a purely open-source research group. Instead, its primary focus was to pre-empt a scenario in which AI technologies would be controlled by a few large tech companies at the expense of humankind. Established as a non-profit organization, OpenAI sought to democratize AI development and ensure that this powerful technology would serve the common good instead of narrow private interests. 

But as the field of AI has evolved, OpenAI has started to migrate towards a more closed-source strategy. What prompted this volte-face in OpenAI’s strategy?  

The Technical Turning Point: Google’s Discovery of Transformers 

In 2017, Google introduced transformers to the world—a groundbreaking model architecture that changed the course of AI technology. By allowing a model to weigh every part of its input against every other part, the transformer architecture enabled AI systems to capture the patterns and nuances that underpin powerful AI models. The catch, however, was that this architecture requires very large datasets to train such models. The transformer breakthrough initiated a shift in the AI landscape, where size, data, and computing power became key drivers of technological advancement. 
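
At the heart of the transformer is scaled dot-product attention, which lets every token weigh every other token when building its representation; this is the property that rewards ever larger datasets and models. A minimal NumPy sketch of the computation, with arbitrary dimensions, is shown below.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
        d_k = Q.shape[-1]
        scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))   # numerically stable softmax
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V

    Q = np.random.randn(8, 64)   # 8 tokens, 64-dimensional queries
    K = np.random.randn(8, 64)
    V = np.random.randn(8, 64)
    print(scaled_dot_product_attention(Q, K, V).shape)   # (8, 64)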

Suddenly, AI leadership was driven by a “bigger is better” philosophy. The more data a model could be trained on and the more computing power it could leverage, the better the outcomes became. To compete at the cutting edge of this new AI race, organizations needed access to massive and costly computing capabilities.  

OpenAI’s Microsoft Partnership: Navigating the Need for Scale 

OpenAI soon realized that to continue developing its frontier models and keep pace with the new “bigger is better” trend, it needed access to vast computing resources—resources that only a tech giant like Microsoft could provide. This led OpenAI to form a series of partnerships with Microsoft between 2019 and 2023, culminating in Azure, Microsoft’s cloud platform, becoming OpenAI’s exclusive commercial partner. OpenAI had already created a for-profit subsidiary in 2019, a move that naturally raised eyebrows. To mitigate any backlash, OpenAI introduced a capped-profit structure to ensure that its core mission of benefiting humanity would remain intact. Nevertheless, this shift represented a critical evolution in how OpenAI operated, elevating revenue generation to a strategic objective. 

The Release of ChatGPT: A Global Phenomenon 

In late 2022, OpenAI released ChatGPT, a conversational AI tool that took the world by storm. The tool quickly became a brand in its own right, with adoption of the service skyrocketing. The chatbot’s success put OpenAI in the spotlight, but it also placed the company under immense pressure to continue developing and monetizing its technology. 

Given the ultra-rapid pace of AI advances, which make Moore’s Law look like a snail, AI models that are cutting-edge today can be obsolete within just six months. For OpenAI to stay ahead of its competitors and maintain its technology leadership, it had to keep boosting its model development effort—a process that requires enormous financial and computational resources.  

Why Is OpenAI Moving Towards a Closed-Source Strategy? 

One of the most significant shifts in OpenAI’s strategy has been its pivot towards a more proprietary mode that puts emphasis on harnessing ChatGPT for revenue generation. This transition was not without controversy, but it stemmed from a very practical necessity: OpenAI needs to generate revenue to sustain its costly model training and research efforts. 

Unlike tech giants such as Google and Meta that can leverage AI to boost their revenue from their established services such as advertising, OpenAI does not have complementary revenue streams that its AI models could support. For OpenAI, the only way to generate significant revenue is by monetizing its technology directly—either through paid access to its chatbot, ChatGPT, or by providing companies access to its advanced AI models. 

Keeping Up with a Hypercompetitive AI Market 

The pace of innovation in AI is relentless. Without complementary businesses to support its AI advancements, OpenAI’s survival hinges upon its ability to remain at the cutting edge of AI development. The company must not only maintain its technology leadership but also effectively monetize its technology. According to press reports, OpenAI generates revenue through ChatGPT subscriptions, projected to reach $2.7 billion in 2024, and through API access to its models, which should bring in an additional $1 billion by enabling clients to build their own AI-based applications. 
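
The API side of this revenue works roughly as sketched below: a client application sends a request to OpenAI’s hosted models and is billed for the tokens processed. The example uses the official openai Python package; the model name is a placeholder and the prompt is invented for illustration.

    from openai import OpenAI

    client = OpenAI()   # reads the OPENAI_API_KEY environment variable

    response = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name; actual choice and pricing vary
        messages=[{"role": "user", "content": "Summarise this customer complaint in one sentence: ..."}],
    )
    print(response.choices[0].message.content)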

Today’s Reality: Revenue Is Growing, but Costs Are Growing Even More Rapidly 

Although OpenAI is expected to continue to increase its revenue from AI models, the company’s operational costs still far exceed its earnings. The cost of the computing power needed to train and run advanced AI models is astronomical, even for a startup as hyped as OpenAI. As a result, the company is considering raising the price of ChatGPT subscriptions.  

Despite generating impressive revenue growth, the company is expected to post a staggering $5 billion loss in 2024. To address this financial strain, OpenAI is considering transitioning to a fully for-profit model, a move that has already led to the departure of several key employees who feel the shift conflicts with the company’s original mission. 

Adding to this uncertainty, reports suggest that Apple is reconsidering its potential investment in OpenAI, raising further questions about the company’s financial future. Nevertheless, this did not prevent OpenAI from raising $6.6 billion from investors including Microsoft and Nvidia. 

A Business Model Unlike Web 2.0 Startups 

OpenAI operates in a vastly different economic landscape compared to web 2.0 startups such as Facebook and Instagram. In the world of AI, scaling does not necessarily mean lower costs and more network effects. In fact, the opposite may be true. As AI models grow in complexity, their training costs may increase disproportionately. 

Where traditional tech startups could rely on a large user base, powerful network effects, and interconnected open-source software built on top of an affordable cloud infrastructure, OpenAI must continuously invest in costly proprietary technology development to stay ahead of competitors. And as OpenAI’s long-term revenue prospects likely lie in the business-to-business (B2B) sector, the company will need to develop strong sales and business development capabilities, which are difficult to build from scratch. 

At OpenAI, the shift towards a proprietary closed-source model reflects the harsh economic realities of running an AI company without complementary products that benefit from the technology and help offset development costs. The economics of AI are fundamentally challenging, offering no easy path to profitability. 

Coming next in the AI Power Play series: 

  • Google and AI: The tech leader that had a perfect AI plan until November 2022 
  • Why do most tech companies want to collaborate with Hugging Face?

Also in the AI Power Play series: 

About the Authors

Herve Legenvre

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org).

Erkko Autio

Erkko Autio FBA FFASL is Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. He co-founded the Global Entrepreneurship Monitor (www.gemconsortium.org), the Global Entrepreneurship Index (thegedi.org), and Wicked Acceleration Labs (www.wickedacceleration.org). 

Why Meta is Positioning Itself as the Champion of Open-Source AI

By Hervé Legenvre and Erkko Autio

This article is part one of an ongoing series – The AI Power Plays – that explores the fiercely competitive AI landscape, where tech giants and startups battle for dominance while navigating the delicate balance of competition and collaboration to stay at the cutting edge of AI innovation. 

Why Meta is Positioning Itself as the Open-Source Champion for Frontier AI Models 

The artificial intelligence sector is going through a period of rapid change. At the moment, frontier AI models represent the cutting edge of AI technologies and are pushing the boundaries of what AI can achieve. These models drive advances in natural language processing, computer vision, and other high-impact applications. As Meta is positioning itself as a leader in open-source AI, the question arises: Why is Meta, a tech giant with vast resources, making its most advanced AI models openly available, and what strategic benefits does it gain from this? 

The LLaMA Models: Open, but Not Without Limits 

Meta’s LLaMA (Large Language Model Meta AI) models, introduced in recent years, have become popular among developers and researchers for powerful capabilities that rival proprietary models from competitors such as OpenAI’s o1 and Google’s Gemini. However, while Meta promotes LLaMA models as open-source, there are restrictions. For instance, companies with more than 700 million monthly active users must obtain a special license from Meta to use LLaMA, ensuring that large enterprises cannot freely exploit Meta’s technology without permission. 

Despite these limitations, Meta’s decision to release the LLaMA models to the open-source community has been met with enthusiasm. The permissive licensing conditions, along with high performance, position LLaMA as a strong alternative for those seeking advanced AI without the costs associated with proprietary systems.  

The LLaMA Models: An Emerging Dominant Open-Source Model  

Following the launch of LLaMA 2 in July 2023, downloads of LLaMA on Hugging Face increased by 10x to nearly 350 million. In August 2024 alone, LLaMA models were downloaded over 20 million times, driven by the release of LLaMA 3.1 405B.  
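
In practice, many of those downloads are made through the open-source Hugging Face transformers library, roughly as sketched below. The model identifier is indicative only: LLaMA checkpoints are licence-gated, so the example assumes the developer has accepted Meta’s terms on Hugging Face, and loading the larger variants also requires substantial GPU memory and the accelerate package.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-hf"   # indicative identifier; access is licence-gated

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # needs the accelerate package

    inputs = tokenizer("Open foundation models matter because", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))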

Enterprises such as AT&T, Goldman Sachs, Spotify, Zoom, Infosys and KPMG are now using LLaMA models for their developments. Additionally, many developers access LLaMA through Meta’s partnerships with cloud providers like AWS, Microsoft Azure, and Google Cloud.  

But why is Meta so generously open-sourcing such a valuable asset? 

Meta’s Strategic Generosity 

The answer lies in Meta’s wider strategy. Unlike OpenAI and Microsoft, the company is not primarily looking to monetize access to its LLaMA technology. Instead of monetizing LLaMA directly, Meta views its AI technology as a key complement that enhances its existing products and services, supports their monetization, and fuels the innovation of new ones. 

Enhancing Existing Products and Services

Meta’s core businesses, including social media platforms like Facebook and Instagram, and its advertising ecosystem, already rely heavily on AI. From personalizing user experiences to optimizing ad targeting, AI powers the company’s current value propositions. By making its frontier LLaMA AI models openly accessible, Meta ensures that these models are constantly refined by a global community of developers, an open innovation dynamic that ultimately benefits Meta’s own AI-powered products and services. 

Pioneering New Products in the Metaverse

In addition to souping up its existing products and services, Meta has embarked on an ambitious vision for the metaverse and invested heavily in its development, though tangible results have yet to fully materialize. However, advanced AI will be critical in powering the interactive, responsive, and intelligent systems needed for immersive metaverse environments. By encouraging widespread use of its AI models, Meta can accelerate the development of technologies that will power its next-generation products, helping it maintain a leadership position in both the metaverse and immersive technology space. 

The Advantages of Openness 

Meta’s decision to make its frontier AI models open-source offers several strategic benefits. 

First, it helps Meta attract, motivate, and retain top AI talent. By positioning itself as an open-source leader, Meta strengthens its reputation within the developer and AI research communities, making it a more attractive workplace for AI professionals. Open-source initiatives foster an environment of innovation and collaboration, significantly enhancing Meta’s ability to recruit and retain top-tier employees. It also provides opportunities to collaborate with external talent who are already familiar with the company’s technologies. 

Second, Meta benefits from community-driven innovation. As seen with the success of Meta’s previous open-source projects such as React and PyTorch, open-source models tend to evolve more rapidly as the global developer community identifies new applications and refines training and fine-tuning techniques. This collaborative approach allows Meta to tap into a vast pool of expertise and creativity, helping the company maintain its leading position in the competitive AI race. 

So, while Google increasingly leans towards a closed-source approach, Meta further solidifies its role as a “white knight,” making life easier for developers and academics who rely on accessible AI tools. For academics, many of whom prefer solutions they can use freely regardless of their institution, Meta’s open-source models are especially appealing—allowing them to continue their work even when moving between universities or research centres. 

By embracing openness, Meta benefits from a global community contributing to the development of AI systems while still retaining control over the most commercially valuable applications. 

Conclusion 

Meta’s positioning as an open-source champion for frontier AI models is not purely altruistic. By making LLaMA models openly accessible, Meta fosters an ecosystem of innovation that directly feeds back into its own development, benefiting both its social media and advertising businesses as well as its metaverse ambitions. 

In an industry where access to talent, innovation, and speed are critical, Meta’s open-source strategy enables it to achieve all three while keeping its AI advancements at the cutting edge. The foundations of digital technologies are reinvented every few years—with the shift to mobile, cloud solutions, and now AI as a core foundation for digital infrastructure. 

By embracing openness, Meta can ride the wave of a growing community of innovators and developers, maintaining its position at the forefront of technological advancement. The company is positioning itself to remain a dominant force in AI, not by charging for access to its models, but by reaping the long-term rewards of a global community that advances the very technologies Meta seeks to capitalize on. 

Coming next in the AI Power Play series: 

  • Why is OpenAI so closed? 
  • Google and AI: The tech leader that had a perfect AI plan until November 2022 
  • Why do most tech companies want to collaborate with Hugging Face?

About the Authors

Herve Legenvre

Hervé Legenvre is Professor and Research Director at EIPM. He manages education programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations (www.eipm.org).

Erkko Autio

Erkko Autio FBA FFASL is Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. He co-founded the Global Entrepreneurship Monitor (www.gemconsortium.org), the Global Entrepreneurship Index (thegedi.org), and Wicked Acceleration Labs (www.wickedacceleration.org). 

Decarbonizing the Construction Industry: A Case Study of the Eiffage Platform and Circularity Strategies


By Hervé Legenvre and Bertrand Touzet

Introduction  

This article explores the role of platforms and circularity strategies in mitigating climate change generated by the construction industry. It describes how the integration of procurement processes, platform technologies, and circular economy principles fosters decarbonization. We first discuss the construction industry’s impact on climate change, then explore specific strategies developed by Eiffage, a construction company. Throughout the article, we illustrate how these initiatives are being implemented to drive widespread adoption of low-carbon solutions across the sector.  

The role of the construction industry in combating climate change 

In recent times, climate change isn’t the only environmental issue within the construction industry. Other significant challenges include managing water use, reducing waste, and securing access to essential raw materials. These are all areas where the industry must make improvements. 

The construction industry contributes significantly to climate change. According to the United Nations Environment Programme, it accounts for 21 per cent of global greenhouse gas (GHG) emissions. In 2022, buildings were responsible for 34 per cent of global energy consumption and 37 per cent of energy- and process-related carbon dioxide emissions. The production of construction materials such as cement, steel, and concrete is notably emission-intensive. According to the European Union, the construction sector is responsible for approximately 50 per cent of all extracted materials and contributes to over 35 per cent of the EU’s total waste generation. Additionally, buildings consume considerable energy for heating, ventilation, air conditioning, and lighting. 

Achieving sustainable construction promises to enhance quality of life by fostering healthier, more durable, and more energy-efficient living environments. As governments increasingly implement sustainable building regulations and standards to promote environmentally friendly practices, the construction sector faces significant challenges. Despite its vital social and economic importance, the industry is sensitive to costs, and efforts to reduce emissions are likely to lead to higher prices. The sector not only significantly bolsters the economy through job creation but also profoundly influences social outcomes, which makes addressing these environmental challenges all the more crucial for its continued improvement and sustainability. 

Battling climate change: the Eiffage strategy  

Eiffage is one of Europe’s leading construction companies, renowned for its expertise in civil engineering, construction, and infrastructure projects. The company is also engaged in energy systems and operates various public-private-partnership projects. Eiffage is committed to sustainable development and actively implements strategies to reduce its environmental impact across its diverse range of services and projects. 

Eiffage’s GHG emission-reduction strategy aims to reduce its own emissions while providing clients with more environmentally friendly products and services. In line with the Task Force on Climate-Related Financial Disclosures (TCFD) guidelines, Eiffage has published a climate report that supports the 1.5°C target established by the 2015 Paris Agreement. Eiffage annually tracks and reports its greenhouse gas emissions to the Carbon Disclosure Project (CDP). And the company has set ambitious targets in accordance with the Science Based Targets initiative: by 2030, Eiffage aims to reduce its Scope 1 and 2 emissions by 46 per cent and its upstream and downstream Scope 3 emissions by 30 per cent. Consequently, Eiffage encourages its employees, suppliers, and partners to engage in emission-reduction initiatives. This includes initiating innovation projects, experimenting with innovative technologies, and sharing best practices. 

Key challenges for decarbonizing the construction industry 

Reducing emissions in the construction industry presents significant challenges due to its inherent complexity and long lifecycle, which slow down the pace of change. Since buildings are designed to last for decades, if not centuries, changes in the construction sector have a long-term impact on GHG emissions. The emissions produced during construction can affect the environment for an extended period after completion. Moreover, decisions during the construction phase can influence a building’s energy consumption throughout its lifecycle, impacting emissions during the use phase. 

For effective emission reduction, the industry must improve building designs, adopt new technologies and operational practices, and develop materials that emit less. 

However, innovation in these areas can be slow and costly, despite the availability of more and more low-emission options. Many materials marketed as “recycled” or “low-carbon” offer only modest reductions in GHG emissions. The performance and durability of these alternatives are often questioned, which limits their adoption. The absence of standardized definitions for low-carbon products by governments or regulatory bodies further complicates reliable and comparable life-cycle assessments, making the selection of effective solutions challenging. 

Moreover, the construction sector’s diversity, ranging from minor renovations to large-scale infrastructure projects, means that the implementation of uniform changes is more complex. The uniqueness of each building, for example, complicates material reuse efforts. 

Construction projects also involve a complex network of stakeholders, including architects, project management services, construction companies, and contractors. Unless ambitious decarbonization goals are set by clients from the start, significant emission reductions are difficult to achieve. Early decisions by architects and project management services heavily influence the emissions of a building.  

The proliferation of so-called “low-carbon” solutions, the diversity of projects, and the complexity of stakeholder relationships act as centrifugal forces (see Table 1). These forces pull in different directions, complicating the coordination of actions and the adoption of unified emission-reduction strategies. 

Decarbonizing the Construction Industry: Table 1

To address these challenges, there is a pressing need to realign all industry players around common goals, practices, and standards. This involves reinventing supply chains and fostering collaboration across the entire value chain, including multiple suppliers, to drive innovation. Only through such coordinated efforts can the construction industry overcome its fragmented nature and make substantial strides in reducing its environmental impact. 

Eiffage marketplace strategy 

To align all industry players around common goals, practices, and standards, the digitalization of procurement processes is instrumental. This can facilitate distributed decision-making, allowing each construction project to make informed purchase decisions based on a shared data foundation and robust procurement processes. This involves transforming complex environmental product declarations that contain information on GHG emissions into accessible and actionable data, empowering employees on construction sites to make the right decisions. 

Addressing the decarbonization challenges in the construction sector can be achieved through a data-driven strategy. Data from various sources can be aggregated and synthesized to provide clear accountability and support informed decision-making. Eiffage has led the development of a unified marketplace that integrates various types of data. This marketplace, named BlueOn (https://blueon.io/), while initiated by Eiffage, is accessible to everyone in the industry as a standalone platform and brand. 

BlueOn aims to democratize access to environmental data by providing reliable information that enables buyers to make informed comparisons and automatically generate carbon reports. Serving as a commercial marketplace, BlueOn features products alongside their environmental data, offering comprehensive environmental, technical, and economic insights for any products that suppliers choose to make available in the marketplace. A key requirement for including a product in the marketplace is that its environmental impact, particularly GHG emissions, must be assessed through a life-cycle analysis specific to that product. 
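
The kind of calculation such a report automates is, at its core, an aggregation of product-level life-cycle data, as in the illustrative Python sketch below. The field names, figures, and products are invented for illustration and do not reflect BlueOn’s actual data model.

    # Invented example records: each product carries a life-cycle emission factor.
    products = [
        {"name": "Low-carbon concrete C25/30", "unit": "m3", "quantity": 120, "kg_co2e_per_unit": 210.0},
        {"name": "Recycled aluminium frame",   "unit": "kg", "quantity": 850, "kg_co2e_per_unit": 4.5},
    ]

    def embodied_emissions(items):
        """Sum product-level emissions into a simple project-level figure (kgCO2e)."""
        return sum(p["quantity"] * p["kg_co2e_per_unit"] for p in items)

    print(f"Estimated embodied emissions: {embodied_emissions(products) / 1000:.1f} tCO2e")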


For suppliers, BlueOn presents an opportunity to showcase their environmental performance and investments, enhancing product visibility and potentially creating a new sales channel. While Eiffage does not mandate its suppliers to join BlueOn, it strongly encourages participation in this initiative. The primary objective is transformative, to motivate suppliers to produce reliable environmental data at the product level, which can be publicly shared through digital means. Such initiatives amplify the momentum toward creating publicly accessible product carbon footprints across the industry. 

Although some suppliers may initially hesitate, the commitment to transmitting unaltered data often alleviates their concerns. Additionally, industry mimicry plays a crucial role; when one supplier commits to sharing their product and information in the marketplace, others are likely to follow. This peer influence can significantly accelerate the industry’s transition. While no supplier has refused to participate thus far, engagement levels do vary, and peer pressure frequently serves as a powerful motivator. 

The success of BlueOn hinges on several factors: starting from a blank page to focus on core needs, prioritizing accurate carbon footprint data derived from lifecycle analysis, and ensuring user-friendly design. Simple, intuitive interfaces have attracted internal users without formal mandates, proving that ease of use can drive adoption. 

Although the platform was originally intended for internal use by Eiffage, there is now an intention to open it up to major clients, particularly those involved in internal renovations. Subcontractors and smaller construction firms also stand to benefit from the marketplace as future users. The integration and aggregation of supplier data without alteration is complex but essential for maintaining data integrity. But strong leadership and a focus on essential features have been pivotal in navigating these challenges. 

In conclusion, as BlueOn continues to evolve and expand, its role in transforming the construction industry will become increasingly vital. By fostering collaboration and transparency, digitalization and decarbonization, BlueOn is setting new standards for environmental accountability and enabling a more sustainable future. This forward-thinking approach not only aligns with global sustainability goals but also paves the way for other industries to follow suit in their environmental endeavours. 

BlueOn as a marketplace operates within an ecosystem logic, acting as a hub that connects a broad and evolving network of providers and users supported by multiple data streams. This platform is just one part of Eiffage’s broader strategy to facilitate the decarbonization of the construction industry. Alongside BlueOn, Eiffage is developing new supply chains that bring together complementary capabilities to deliver solutions with a lower carbon footprint. These initiatives are designed to harness the collective strengths of various stakeholders, fostering an environment where sustainable practices are not just encouraged but integral to the industry’s evolution. 


Reinventing the raw material foundations of the construction industry 

The construction sector relies heavily on carbon-intensive materials such as cement and steel. To advance decarbonization, alternative materials like fly ash, slag, calcined clay, and recycled concrete are being utilized, which can partially replace traditional cement and reduce new production demand. 

The adoption of environmentally friendly concrete offers a smaller carbon footprint, since the inclusion of industrial by-products can lessen the environmental impact of construction projects. However, concrete with a lower carbon footprint can take two to four days to set, depending on the weather, which may delay project progress and jeopardize performance. To address this, Eiffage has collaborated with two suppliers to develop a new system that insulates formworks and facilitates the thermo-reactive process of the concrete, enabling it to set much faster. The setting time was successfully reduced to 11-12 hours at a temperature of 40°C, thus preserving the project schedule. This collaborative innovation not only maintained the project timeline but also achieved a 30 per cent reduction in carbon emissions and a 15 per cent improvement in energy efficiency. 

Eiffage is also using wood as a sustainable alternative due to its carbon sequestration benefits, which significantly lower emissions. Wood absorbs and stores carbon during its growth phase, reducing the overall carbon footprint of projects. Although wooden structures have a lighter environmental impact compared to cement, their use is limited in certain types of construction, and their sustainability depends on responsible forestry practices. Eiffage ensures material traceability through audits by Product DNA, which verifies material origins and authenticates them on a blockchain, providing transparent information along the wood-to-construction supply chain. 

Additionally, Eiffage is innovating with the use of prefabricated modules. These modules are manufactured in a factory and then transported to the construction site. For instance, the company has established a business entity called HVA that designs and produces modular bathrooms made of wood. These modules are prefabricated in a controlled factory environment and arrive at the construction site ready to be integrated into the building. The extensive use of wood for some of these modules significantly reduces emissions at construction sites. From a cost perspective, such a solution can be advantageous when considering the increase in quality. Instead of being constructed on-site, a bathroom is pre-assembled as a module in a factory, which reduces the risk of quality issues and the need for rework. 

Towards circularity 

Eiffage is dedicated to advancing a circular economy, emphasizing eco-design and efficient resource use. The company’s strategy includes designing products that require fewer new materials and enhance the use of recycled content, focusing on modular designs that facilitate disassembly and reuse. 

Through its subsidiary Demcy, Eiffage has developed expertise in selective deconstruction, helping clients exceed the law’s 70 per cent recycling target, with 97 per cent of materials from demolition sites being recycled or recovered. This success is achieved through new methods and worker training for systematic selective removal of equipment and materials, supported by resource diagnostics to identify and inventory recoverable items. Some suppliers also repair or remanufacture components such as emergency exit blocks, demonstrating a practical approach to extending material lifecycles. 

Goyer, another Eiffage subsidiary, has introduced FairFaçade, a new façade concept that significantly reduces carbon footprint using a wood-and-aluminium curtain wall system, lowering emissions by 45-70 per cent compared to traditional aluminium structures. This modular system offers high repairability and thermal performance while utilizing low-carbon glass and recycled aluminium. FairFaçade is a strong example of the readiness of Eiffage’s circular economy initiatives for mass production. 

Looking into the future, a platform strategy is necessary to scale the circular economy in the construction industry. Some digital platforms have emerged to provide a second life for materials and equipment. However, these platforms often struggle to operate at the necessary scale for significant impact, handling relatively small volumes. To truly transform the industry, a systematic approach involving larger-scale remanufactured and reconditioned equipment is crucial. Collaboration with suppliers to integrate circular principles into product design, along with practical, scalable strategies, can enhance reuse and recycling processes. A robust, industry-wide commitment is needed to make these practices fundamental shifts in resource management. 

Effective collaboration across multiple actors and supportive public policies are essential for sustainable building resource management. While detailed information about materials and resources exists during construction or renovation, it often isn’t maintained or updated afterwards. By implementing public policies that mandate property owners to preserve and regularly update this data and encouraging collaboration between architects, builders, and regulators, a more transparent and efficient resource management system can be established. This integrated approach can lead to significant advancements in sustainability within the construction industry. 

Conclusion 

Decarbonizing the construction industry is critical for climate action. The industry must tackle challenges such as material efficiency and waste reduction while integrating sustainable building practices to lower carbon footprints. Eiffage is at the forefront, leveraging procurement, platforms, and circularity to reduce carbon emissions. The company has embraced a procurement strategy that embraces digitalization to enable better decision-making based on robust environmental data, promoting the use of materials with verified low emissions. Platforms like BlueOn developed by Eiffage extend these capabilities industry-wide, offering access to data for environmentally friendly products and fostering an ecosystem of collaboration among suppliers and contractors. This approach not only streamlines the adoption of sustainable practices but also propels the industry towards significant emission reduction by ensuring that all stakeholders align with shared decarbonization goals. Circular strategies further enhance sustainability by prioritizing the reuse and recycling of materials, supported by innovations in product design and supply chain management. Collectively, these strategies embody a holistic approach to environmental responsibility in construction, aiming to transform industry standards and achieve a sustainable future. 

About the Authors

Herve Legenvre

Hervé Legenvre is a Professor and Research Director at EIPM. He manages educational programmes for global clients. He conducts research and teaches on digitalization, innovation, and supply chain. Recently, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations. Hervé is part of the project advisory committee and also of the Independent Committee for the Socioeconomic Studies (www.eipm.org). 

Bertrand Touzet

Bertrand Touzet is in charge of the low-carbon strategy for the Purchasing Department at Eiffage. Over the years, he has been implementing environmental data management tools to measure and reduce the carbon footprint of Eiffage Group’s purchasing activities (Low carbon at the heart of purchasing | Eiffage Group). 

The post Decarbonizing the Construction Industry: A Case Study of the Eiffage Platform and Circularity Strategies appeared first on The European Business Review.

]]>
https://www.europeanbusinessreview.com/decarbonizing-the-construction-industry-a-case-study-of-the-eiffage-platform-and-circularity-strategies/feed/ 0
Innovation, the Art of Abolishing Distance: Building the ATTRACT DeepTech Ecosystem  https://www.europeanbusinessreview.com/innovation-the-art-of-abolishing-distance-building-the-attract-deeptech-ecosystem/ https://www.europeanbusinessreview.com/innovation-the-art-of-abolishing-distance-building-the-attract-deeptech-ecosystem/#respond Thu, 25 Jan 2024 01:48:20 +0000 https://www.europeanbusinessreview.com/?p=199282 By Hervé Legenvre DeepTech technologies have significant effects on the economy and are an avenue to solve many problems humanity faces. However, the vitality of DeepTech projects is linked to […]

The post Innovation, the Art of Abolishing Distance: Building the ATTRACT DeepTech Ecosystem  appeared first on The European Business Review.

]]>

By Hervé Legenvre

DeepTech technologies have significant effects on the economy and are an avenue to solve many problems humanity faces. However, the vitality of DeepTech projects is linked to the innovation ecosystems in which they are embedded. In this article, Hervé Legenvre explores the role of innovation in DeepTech success using the ATTRACT programme.

Innovation is the art of abolishing distances. Innovation thrives by combining distinct, diverse, and previously unconnected capabilities and resources to solve problems. This is particularly true for DeepTech Innovation. ATTRACT is an initiative that created a Pan-European DeepTech ecosystem that spans detection, imaging, and computing technologies. It assembled around 170 DeepTech projects in an ecosystem comprising scientific research institutions, universities, business networks and hundreds of students from across Europe. ATTRACT takes the art of innovation to its next level by steering open and interdisciplinary cooperation across a diverse network of stakeholders.

Innovation as the art of combining the power of distant objects 

Adam Smith’s book The Wealth of Nations is renowned for its description of the division of labour, where the production process is segmented into a series of small, specialised tasks, each carried out by a distinct worker, enhancing overall efficiency. Smith’s illustration of this through a pin factory is well known. Yet the subsequent passages in the book, which portray innovation as the art of linking disparate resources and expertise, are less cited. He noted:

“Many improvements have been made by […] philosophers or men of speculation, whose trade it is not to do anything, but to observe everything; and who, upon that account, are often capable of combining together the powers of the most distant and dissimilar objects.”

In today’s era of high specialisation and continuing knowledge expansion, philosophers are scientists, and men of speculation are entrepreneurs, but innovation remains the art of combining disparate capabilities, resources, and expertise to solve problems. This art is essential to convert scientific discoveries into economic and social benefits. Bringing DeepTech projects to market requires uniting many elements that are diverse, distinct, and initially disconnected.

What are DeepTech technologies? 

DeepTech solutions require long and uncertain development cycles, significant capital investment, and a deep understanding of the underlying scientific principles.

In the rapidly evolving world of technology, the term “DeepTech” describes technologies that are rooted in groundbreaking scientific discoveries and high-tech innovations. These are not the apps or software we use every day at work or in our lives; they are technologies born out of significant scientific advancements and discoveries. DeepTech projects can sometimes be taken to market by new ventures but also by collaborative projects where large companies and scientists work together. The applications of DeepTech innovations are often not confined to a single industry. They are often pervasive and impact a wide array of sectors. They have broad-ranging effects on the economy, improving efficiency and productivity across many different industries. They are also regarded as an avenue to solve some of the most pressing problems humanity faces. Unlike consumer-focused apps or software, DeepTech solutions require long and uncertain development cycles, significant capital investment, and a deep understanding of the underlying scientific principles. DeepTech ventures often struggle with raising money due to their longer gestation periods and higher research and development (R&D) investments. Entering different markets is a major hurdle for DeepTech projects. Different countries and sectors have their own unique culture, language, and business practices.

The role of innovation ecosystems and public policies in DeepTech success 

Given such challenges, DeepTech projects need support from powerful innovation ecosystems and public policies. DeepTech projects can gain considerably from supportive public policies. Governments can allocate funds for DeepTech R&D. This can take the form of grants, tax incentives, or direct investment in research initiatives. Such funding helps DeepTech projects overcome the initial capital-intensive phase of development where private investment might be risk-averse due to the long time horizons and uncertain outcomes associated with these projects. Public policies also support education and training at the intersection of technology and the economy to enlarge the talent pool available to DeepTech companies. DeepTech challenges and opportunities transcend national borders; it is therefore key to have public policies that promote international collaboration so DeepTech projects can connect to a more global innovation network. In a nutshell, governments play a crucial role in shaping the ecosystem within which DeepTech projects operate.

The vitality of DeepTech projects is linked to the innovation ecosystems in which they are embedded. These ecosystems bring together complementary resources and expertise to nurture these projects. Research institutions and universities are the seedbeds of science and DeepTech innovation. They are also the training grounds for the next generation of scientists, engineers, and entrepreneurs. Large companies act as partners for DeepTech projects, providing the scale, resources, and industry expertise necessary to bring DeepTech innovations forward. Startups can also emerge out of DeepTech projects; they are often the vehicles that take nascent technologies out of the lab and into the market. Venture capital firms, angel investors, and other financing entities can also play a role when the time is right. These investors also bring their business acumen, mentorship, and networks that can help DeepTech entrepreneurs.


The origin of ATTRACT 

ATTRACT is a pioneering initiative bringing together Europe’s fundamental research and industrial communities to lead the next generation of detection and imaging technologies. Funded by the European Union’s Horizon 2020 programme, it aims to help revamp Europe’s economy and improve people’s lives by creating products, services, companies, and jobs. By funding this project, the European Union aims to foster innovation from the earliest stages of technological development to market entry and beyond and to create positive societal impacts.

The ATTRACT initiative represents a strategic and collaborative effort to harness the innovative potential of pan-European research infrastructures and their associated communities. The consortium that orchestrates ATTRACT activities includes Aalto University, the European Organization for Nuclear Research (CERN), the European Industrial Research Management Association (EIRMA), the European Molecular Biology Laboratory (EMBL), ESADE Business School, the European Southern Observatory (ESO), the European Synchrotron Radiation Facility (ESRF), the European X-Ray Free Electron Laser Facility (European XFEL), and the Institut Laue-Langevin (ILL).

The core objective of ATTRACT is to bridge the gap between fundamental research and marketable solutions by engaging entities capable of translating high-level research into societal and commercial gains. ATTRACT has created a rich ecosystem that meshes Pan-European research infrastructure, universities and industry representatives. Research infrastructure initiatives and universities are hotbeds for advanced scientific inquiry and technological development. They are equipped with state-of-the-art facilities and are staffed by researchers working at the frontiers of knowledge across various disciplines. Industry players bring a commercial perspective to the table. They can help create connections between the generation of knowledge and its practical application.

A structured innovation process for DeepTech projects 

ATTRACT offers a structured process to identify and develop breakthrough technologies in the field of detection and imaging. By selecting 170 potential projects, ATTRACT focusses resources on the most promising ideas that have the potential to address societal challenges and fill gaps in the market. In its first phase, ATTRACT helped validate and test these concepts through a gradual and phased approach as illustrated in Figure 1.

In the context of ATTRACT, a demonstrator is an early version of the technology that is used to demonstrate its potential, performance and viability to stakeholders including early investors or partners. A prototype is a more advanced application of the technology, dedicated to a specific market. Prototypes help test and refine the functionality of a product, identify issues, and present the concept to stakeholders including potential users.

The first phase of the ATTRACT project also provided initial financing backed by the European Union’s Horizon 2020 programme. A total of €20 million in Horizon 2020 funding supported the selection of 170 technology concepts via an Open Call to consortia comprising Research Infrastructures (RIs), academic institutions, Research and Technology Organisations (RTOs), and industry partners. Each successful proposal received a one-time payment of €100,000 to initiate the first steps of the project.

The second phase of ATTRACT helps bring selected projects closer to a market-ready stage. Here, the focus is on the proven and most promising breakthrough technology concepts from the previous phase that show strong potential for scientific, industrial, and societal applications.

A springboard for interdisciplinary education 

ATTRACT focusses resources on the most promising ideas that have the potential to address societal challenges and fill gaps in the market.

The ATTRACT Programme also adopted an “Open Science to Open Innovation” approach by training students in an interdisciplinary manner. This closes a gap in the current education landscape, which often limits students to specialised activities without interdisciplinary interaction or entrepreneurial training. The goal is to move beyond the traditional, siloed approach to research and instead imbue students with Design Thinking and similar methodologies. Using these methodologies, groups of students from different backgrounds envision how the DeepTech projects of ATTRACT can be used to solve real-world problems. This fosters an entrepreneurial and co-creation mindset among students, demonstrating how these approaches can mesh scientific research and social innovation. The initiative is designed not only to enrich the educational experience of students but also to contribute to a more innovative and more evenly distributed innovation landscape.


What is ATTRACT’s secret recipe? 

Innovation is the art of abolishing boundaries and distances. Innovation often emerges from the combination of different and previously unrelated elements such as specialised knowledge, practical skills, advanced equipment, financial backing, and real-world challenges across various industries.

The ATTRACT initiative is about fostering a community as much as it is about advancing technology. It’s not merely a source of funding; it’s a networked environment where sharing ideas and support is standard practice. Central to ATTRACT is the idea that a diverse range of contributors can enable the development of DeepTech innovation. This community includes research institutions and universities that lay the groundwork for new ideas through fundamental research and provide the intellectual and physical resources to develop these ideas further. Students and business networks contribute new perspectives and energy, while public funding, companies and venture capitalists bring the necessary investment and business expertise to help turn promising ideas into market-ready products.

This collaborative approach within ATTRACT is designed to encourage openness and interdisciplinary cooperation, making it easier for people to learn from one another and work together effectively. By facilitating connections among these varied elements, ATTRACT helps to develop nascent ideas into mature projects. ATTRACT speeds up the development and application of new technologies, broadening their potential impact. This approach showcases the value of enabling collaboration across different disciplines and sectors to drive innovation forward.

Example of projects supported by ATTRACT 

ULTRARAM 

ULTRARAM is a new type of computer memory that uses very little energy while offering the speed of DRAM and the non-volatility of flash memory. It is designed to save a great deal of power in places like data centres and satellites in space. The next step for the project is to test ULTRARAM’s ability to retain data over long storage periods and to assess its endurance. This could change how all kinds of devices, from tiny sensors to big data centres, store and use data. The ULTRARAM team, led by Manus Hayne from Lancaster University, is now supported by the startup Quinas Technology, which won the title of “Most Innovative Flash Memory Startup” in Silicon Valley.

RANDOM POWER 

Random Power taps into the quantum characteristics of semiconductors to create a secure and continuous supply of random data bits, crucial for encryption. In its first phase, the project team, which included research institutions and companies, successfully engineered a compact, credit-card-sized device that generates these random bits and passed US National Institute of Standards and Technology tests, confirming its practicality for real-world use. The project, led by Massimo Caccia from the Università dell’Insubria, has since grown to include a new company formed from the original team and has broadened its expertise by collaborating with other projects. The goal now is to develop a range of True Random Bit Generators that can be used across various industries, from large-scale infrastructure to the IoT and automotive sectors, enhancing security with quantum-level unpredictability.
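
To give a flavour of the statistical checks involved, the sketch below implements the simplest test from the NIST SP 800-22 suite, the monobit frequency test, on a random bitstream. It is a didactic illustration only, not the certified test battery the Random Power team used, and the bitstream here comes from the operating system rather than a quantum device.

```python
# Sketch of the NIST SP 800-22 monobit (frequency) test: a sequence of random
# bits should contain roughly as many ones as zeros. This is only one of many
# tests in the suite and is shown here purely for illustration.
import math
import secrets

def monobit_p_value(bits: list[int]) -> float:
    """Return the p-value of the frequency test; small values suggest bias."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)          # map 0/1 to -1/+1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))    # two-sided normal tail probability

# Example bitstream from the OS generator, standing in for a hardware source.
bits = [secrets.randbits(1) for _ in range(100_000)]
p = monobit_p_value(bits)
print(f"p-value = {p:.3f}  ->  {'pass' if p >= 0.01 else 'fail'} at the 1% level")
```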

The author would like to thank the ATTRACT team for their support and for making this article possible. This includes Pablo Garcia Tello, Markus Nordberg and John Wood. 

About the Author


Hervé Legenvre is a Professor and Research Director at EIPM. He manages educational programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations. Hervé is part of the project advisory committee and part of the Independent Committee for the Socioeconomic Studies call in ATTRACT (www.eipm.org). 

The post Innovation, the Art of Abolishing Distance: Building the ATTRACT DeepTech Ecosystem  appeared first on The European Business Review.

]]>
https://www.europeanbusinessreview.com/innovation-the-art-of-abolishing-distance-building-the-attract-deeptech-ecosystem/feed/ 0
Building the Data Foundations of Supply Chain Decarbonisation https://www.europeanbusinessreview.com/building-the-data-foundations-of-supply-chain-decarbonisation/ https://www.europeanbusinessreview.com/building-the-data-foundations-of-supply-chain-decarbonisation/#respond Tue, 03 Oct 2023 13:41:37 +0000 https://www.europeanbusinessreview.com/?p=193134 By Hervé Legenvre The journey towards the decarbonisation of supply chains requires harmonised and reliable information on emissions as well as industry collaborations by not-for-profit organisations operating safe and trustworthy […]

The post Building the Data Foundations of Supply Chain Decarbonisation appeared first on The European Business Review.

]]>
By Hervé Legenvre

The journey towards the decarbonisation of supply chains requires harmonised and reliable information on emissions as well as industry collaborations by not-for-profit organisations operating safe and trustworthy data-sharing initiatives.

Decarbonisation of supply chains: why data sharing is vital

As the decarbonisation of supply chains is becoming imperative, companies need standards for calculating and reporting the greenhouse gas (GHG) emissions of their products. Over time, suppliers along complete supply chains will share their calculations with their clients and other stakeholders. Fortunately, emissions can be quantified as carbon-emission equivalents, simplifying the measurement process – hence the term carbon footprint. However, today, we are far from having access to harmonised, detailed and reliable information on emissions. While some generic guidelines such as the Greenhouse Gas Protocol exist, consistent industry approaches at product level are lacking. Achieving this will require collaboration along supply chains in the years to come; undertaking this collaborative effort is vital for several reasons. First, consistency and comparability are key to understanding the relative footprint of different raw materials, components, or products. Having estimates at the company level is not enough to decarbonise our supply chains. Second, solid data foundations help understand emission drivers and initiate incremental and radical improvement projects. In a nutshell, to find alternative raw materials and design more circular supply chains, companies need data they can trust. Finally, product carbon footprint transparency is important for stakeholders and policymakers who expect companies to fulfil their climate commitments and targets.
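
To make the idea of carbon-emission equivalents concrete, the short sketch below converts a small, invented set of gas emissions into a single CO2e figure by weighting each gas with an illustrative global-warming-potential factor. The quantities and factors are examples only, not reference values from TfS or the Greenhouse Gas Protocol.

```python
# Minimal sketch: expressing mixed greenhouse gas emissions as CO2-equivalents.
# The GWP factors below are illustrative 100-year values; real reporting should
# use the factors mandated by the applicable standard (e.g. the GHG Protocol).

GWP_100 = {"CO2": 1, "CH4": 28, "N2O": 265}  # illustrative factors

def co2_equivalent(emissions_kg: dict[str, float]) -> float:
    """Weight each gas by its GWP factor and sum into a single CO2e figure."""
    return sum(mass * GWP_100[gas] for gas, mass in emissions_kg.items())

# Invented example: one tonne of CO2, 10 kg of methane, 2 kg of nitrous oxide.
example = {"CO2": 1000.0, "CH4": 10.0, "N2O": 2.0}
print(f"{co2_equivalent(example):.0f} kg CO2e")  # 1000 + 280 + 530 = 1810 kg CO2e
```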

In the future, achieving climate change ambitions will require even more data sharing between companies and digital solutions that span entire supply chains. Beyond the sharing of product-specific greenhouse gas (GHG) emissions, EU regulations are pushing companies to share information that facilitates the adoption and scaling of circular economic models. One EU initiative focuses on batteries. With the rise of vehicle electrification, demand for battery raw materials will increase rapidly. To achieve a sustainable transition, a circular approach is necessary to ensure sustainable material sourcing, efficient battery production, and effective end-of-life processing that maximises the reuse of materials. By doing this, the whole industry will reduce emissions and reduce its dependence on certain raw material sources.

In this context, EU regulation will soon require digital product passports from companies along battery supply chains – these passports will contain information about the battery’s composition, environmental impacts, and end-of-life procedures. This data-sharing scheme is designed to provide transparency and accountability throughout the battery’s life cycle, from production to disposal. Companies such as BASF, Umicore and BMW have taken the lead in creating such a passport.
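
As a rough illustration of what such a passport might carry, the sketch below defines a hypothetical battery passport record. The field names and values are invented for illustration and do not reproduce the EU Battery Regulation’s actual data model.

```python
# Hypothetical sketch of a digital battery passport record. Field names and
# values are invented; the EU regulation defines the authoritative content.
from dataclasses import dataclass, field

@dataclass
class BatteryPassport:
    battery_id: str                      # unique identifier carried with the battery
    manufacturer: str
    chemistry: str                       # e.g. "NMC 811"
    material_shares: dict[str, float] = field(default_factory=dict)   # material -> share of mass
    recycled_content: dict[str, float] = field(default_factory=dict)  # material -> recycled share
    carbon_footprint_kg_co2e: float = 0.0        # cradle-to-gate product carbon footprint
    end_of_life_instructions: str = ""           # dismantling / recycling guidance

passport = BatteryPassport(
    battery_id="EXAMPLE-0001",
    manufacturer="Example Cells GmbH",
    chemistry="NMC 811",
    material_shares={"nickel": 0.33, "cobalt": 0.04, "lithium": 0.07},
    recycled_content={"cobalt": 0.16, "lithium": 0.06},
    carbon_footprint_kg_co2e=62.5,
    end_of_life_instructions="Return to licensed recycler; see dismantling sheet.",
)
print(passport.battery_id, passport.carbon_footprint_kg_co2e)
```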

To achieve a sustainable transition, a circular approach is necessary to ensure sustainable material sourcing, efficient battery production, and effective end-of-life processing.

In this context, no company will be able to make information exchange on emissions and other sustainability factors happen on its own. We will need industry collaborations orchestrated by not-for-profit organisations that operate safe and trustworthy data-sharing initiatives. In this regard, Together for Sustainability (TfS) is a role model in the chemical sector that can inspire leaders across industries. In the present article, we review the history of TfS, describe its current product carbon footprint initiative, and discuss how its model could be mainstreamed across different industries.

Together for Sustainability

TfS is a global sustainability initiative for the chemical industry. It was founded in 2011 by six leading chemical companies: BASF, Bayer, Evonik, Henkel, Lanxess, and Solvay. TfS aims to improve sustainability practices in the chemical industry supply chain through data-sharing initiatives. It also provides training and support to suppliers so they can improve their sustainability practices. Today TfS has 47 members whose aggregated revenue exceeds $800 billion; this not-for-profit foundation brings together more than 14,000 suppliers along a common improvement path.

How TfS started

Back in 2011, supply chain due diligence had emerged as a critical issue for businesses. The idea for TfS took shape when six Chief Procurement Officers (CPOs) from the chemical industry came together to address a common problem. They realised that approaching suppliers separately with different standards and questions would prevent these suppliers from concentrating their efforts on progress. To solve this problem, they decided to create a standard that would reflect the needs of the industry. Their motto was simple: “An audit for one is an audit for all.”

A virtuous change cycle fuelled by trust.

Over the years, TfS has evolved into a professional not-for-profit organisation that orchestrates a virtuous change cycle. As TfS onboards new members, a broader pool of suppliers and data is created. As the pool of data broadens, more members see the benefit of joining TfS. Such a virtuous change cycle was possible because TfS created the conditions for industry-wide trust among its members and their suppliers.

On the member side, trust was established through three means: responsibility, exemplarity, and compliance with antitrust laws. First, companies that join TfS must be represented by their CPO.

With the TfS guideline and the associated digital platform, the data used by R&D, sales and procurement across the chemical industry can progressively be unified, making the development of competing methods of calculations for the carbon footprint of chemicals superfluous.

With this decision-making power, each member dedicates the resources needed to support the TfS workstreams. Furthermore, no member can get a free ride: all of them commit, every year, to bringing their share of suppliers into the pool. TfS has a dashboard that outlines the number of suppliers involved, the aggregated scores of suppliers, and the number of corrective actions closed across the industry. This creates a sense of responsibility and instils a healthy spirit of emulation. Second, over the years TfS has focused on quality over quantity. To become a member, a company needs to score high against TfS’s own criteria so that it can lead by example. By doing so, members also have a clear understanding of the stakes and efforts required to spearhead sustainability improvement. Third, the TfS initiative is antitrust compliant: companies share data without knowing with whom a supplier works. This is an important foundation that requires a professional organisation and solid governance, so everyone can be confident.

On the supplier side, suppliers also gain benefits beyond the elimination of paperwork. TfS helps them save time but also gain access to resources that help them improve. While members benefit from experience and best-practice sharing, TfS also offers capability-building initiatives to suppliers who gain help from members. This allows them to develop their capabilities, image, and reputation. As suppliers own their assessment and audit results, they can use them to gain credibility in the market and to support their commercial efforts. Finally, TfS has established partnerships with diverse industry associations and has developed active connections across the world, creating an expanding and supportive ecosystem.

Democratising carbon footprint data

As members of Together for Sustainability started to commit to reducing their greenhouse gas emissions, they realised the importance of calculating the carbon footprint of their products in a consistent way. TfS consequently launched its Product Carbon Footprint (PCF) Guideline in September 2022. The guideline is open source, available in five languages, and accessible to everyone. The guideline provides a standard for calculating the carbon footprint of chemical materials. TfS is now piloting a digital solution that enables corporations and suppliers to share upstream product carbon footprints and manage their emissions of purchased goods and services. In a nutshell, TfS promotes the democratisation of product carbon footprint measurement, and it offers a safe digital space to share such information.

With the TfS guideline and the associated digital platform, the data used by R&D, sales and procurement across the chemical industry can progressively be unified, making the development of competing methods of calculations for the carbon footprint of chemicals superfluous. As this occurs, providing information on product carbon footprint to clients will soon be regarded as a basic service and sharing such information on a broader scale needs to rapidly become one of the foundations for more systemic changes that will help fight climate change.

Measurement is a necessity, but it will not reduce emissions by itself; companies will need to turn measurement into priorities and priorities into results. No company should wait for perfection in measurement and reporting to start acting! But sound and rapid progress can only be achieved if we reduce the time it takes to establish a common and recognised standard.

To help visualise this trust and measurement challenge, Table 1 helps decision makers assess the maturity of a company’s approach to measuring scope 3 emissions. This table is not part of the TfS guideline; it applies to any situation where a company depends on suppliers to measure its scope 3 emissions. Thanks to the TfS Product Carbon Footprint Guideline, we can rapidly progress from level 0 to level 2, so emissions are calculated and shared by suppliers based on sound industry guidelines and methods. Then, we will need to progress to level 4, where comparable, detailed, and reliable information on emissions exists across industries and along supply chains. Some large companies already have the capabilities needed to reach level 4, while others are earlier in their journey. Through TfS they can share their expertise and practices so everyone in the industry can reach level 4.

Table 1: Progressing toward a reliable measure of GHG emissions.
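
The maturity progression behind Table 1 can be pictured as gradually replacing generic emission factors with supplier-reported product carbon footprints. The sketch below, using entirely invented figures and simplified logic, estimates purchased-goods emissions by preferring supplier-specific PCF data and falling back to an average factor when none is available; it is an illustration of the idea, not a TfS calculation method.

```python
# Sketch: estimating purchased-goods (scope 3) emissions for a buyer. Where a
# supplier has shared a product carbon footprint (PCF), that figure is used;
# otherwise an industry-average factor is applied. All numbers are invented.

supplier_pcf = {            # kg CO2e per kg of material, as reported by suppliers
    "solvent_a": 1.8,
    "polymer_b": 3.1,
}
average_factors = {         # secondary data used when no supplier PCF exists
    "solvent_a": 2.4,
    "polymer_b": 3.6,
    "additive_c": 5.0,
}
purchases_kg = {"solvent_a": 12_000, "polymer_b": 4_500, "additive_c": 800}

total = 0.0
supplier_specific = 0.0
for material, qty in purchases_kg.items():
    factor = supplier_pcf.get(material, average_factors[material])
    total += qty * factor
    if material in supplier_pcf:
        supplier_specific += qty * factor

print(f"Purchased-goods emissions: {total / 1000:.1f} t CO2e")
print(f"Based on supplier-reported PCFs: {100 * supplier_specific / total:.0f}%")
```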

Gaining trust in the emissions data shared across companies cannot be achieved without widely accepted standards and guidelines. No company should spend time reinventing the wheel on its own. Competing initiatives should not flourish. With sound collective efforts within an industry, we can reach level 4 in five years; with fragmentation and competing standards, it will likely take 15 years to get there.

Other industries should adopt the TfS model!

Over time, it is important to democratise access to data on GHG emissions to ensure everyone understands the impact of their actions on the environment. This data should be used to inform policy decisions, create awareness, and help individuals and organisations reduce all their emissions. Additionally, having access to data helps ensure that all stakeholders are held accountable for their actions and have the necessary information to make informed decisions.

In this context, other industries can build on the efforts undertaken by TfS. Establishing organisations such as TfS is critical. Professional and trusted not-for-profit foundations facilitate safe data exchange and help remove commercial sensitivity from the data shared. Three important lessons learned from TfS can be outlined to increase the chances of success for such an initiative.

Additionally, having access to data helps ensure that all stakeholders are held accountable for their actions and have the necessary information to make informed decisions.

First, it is of utmost importance to have a dedicated professional organisation. Orchestrating data sharing requires establishing professional not-for-profit foundations. Only not-for-profit foundations can rapidly establish themselves as trusted bodies; they can ensure that diverse regulations, including antitrust laws, are respected and that potentially sensitive data is protected. Such not-for-profit organisations are best served by a transparent governance process where decisions are documented and widely available. Decisions need to be taken collectively within key projects and endorsed by all members. Good governance also requires an active board that sets the ambitions, defines the functioning of the organisation, and steers its direction.

Second, such initiatives need to balance short- and long-term benefits for participants. In the case of TfS, all participants benefit, right from the start, from the elimination of excessive paperwork, the availability of a data-sharing infrastructure, and access to knowledge. However, over time, participants can gain further benefits both at the company level and at the industry level. They can strengthen their credibility in the market and take advantage of a more transparent and sustainable supply chain.

Third, such initiatives need to create a positive ecosystem momentum and mobilise a broad array of members. A not-for-profit foundation that manages a data-sharing initiative plays a pivotal role in attracting new members, but it should also set the bar high for participants. It needs to ensure that members act responsibly, contribute to the collective efforts, and remain exemplary. Members’ contributions and participation should be visible and publicly recognised. At the same time, the not-for-profit foundation that supports data-sharing initiatives should ensure that only exemplary members can join and that all members contribute a fair share of resources to the collective effort. The initiative will be successful if a critical mass of industry players can be rapidly created to instil a virtuous circle of mobilisation and progress.

Conclusion

At a time when we need solid data foundations for the decarbonisation of supply chains, TfS is a prime example of how collaboration can enable the transformation of industries. TfS has created a standard that serves the chemical industry well; the organisation is also ready to support other industries. We need to promote and progress towards the democratisation of product carbon footprint data, which will help create actionable insights and drive progress. Some of the TfS members even believe that they have a responsibility to help other industries fast-track their development. Our collective stake is to create standards that are available for free, avoiding total fragmentation in the way product carbon footprint is measured. Let’s make it happen fast!

About the Author

Hervé Legenvre is a Professor and Research Director at EIPM. He manages educational programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations. (www.eipm.org)

The post Building the Data Foundations of Supply Chain Decarbonisation appeared first on The European Business Review.

]]>
https://www.europeanbusinessreview.com/building-the-data-foundations-of-supply-chain-decarbonisation/feed/ 0
Connecting the Unconnected in the Automotive Industry Four Ecosystems that are Reshaping Automotive Industry Collaborations https://www.europeanbusinessreview.com/connecting-the-unconnected-in-the-automotive-industry-four-ecosystems-that-are-reshaping-automotive-industry-collaborations/ https://www.europeanbusinessreview.com/connecting-the-unconnected-in-the-automotive-industry-four-ecosystems-that-are-reshaping-automotive-industry-collaborations/#respond Mon, 23 Jan 2023 00:41:14 +0000 https://www.europeanbusinessreview.com/?p=172637 By Hervé Legenvre and Erkko Autio Connectivity is a defining feature of digitalisation. As companies increase their use of digital technologies in their products and processes, digitalisation will ensure that, […]

The post Connecting the Unconnected in the Automotive Industry Four Ecosystems that are Reshaping Automotive Industry Collaborations appeared first on The European Business Review.

]]>
By Hervé Legenvre and Erkko Autio

Connectivity is a defining feature of digitalisation. As companies increase their use of digital technologies in their products and processes, digitalisation will ensure that, ultimately, all economic activities are interconnected, thereby enabling companies to innovate in more systemic ways. Many industries are not there yet and, in many sectors, data and software remain largely trapped within isolated islands of machinery, products, and companies. To harness the full potential of digitalisation, new forms of collaboration are needed where non-profit foundations orchestrate data and innovation ecosystems to the advantage of industry participants. In this article, we describe four such non-profit foundations and explain why they are expected to play an important role in shaping the future of the automotive industry.

For decades, automotive manufacturers have relied on complex networks of suppliers to solve technical challenges, innovate components, and coordinate supply chains. While this pipeline logic is good for designing, selling, and servicing cars, it now appears inflexible and wasteful, as innovation is slow and incremental, and the resulting value chain lacks circularity. Today, however, digitalisation is beginning to rewrite the rules of innovation in the automotive industry. New automotive entrants such as Tesla are ‘born digital’ and adopt a much more horizontal and open approach to organising their business activities. Companies like Tesla interact with their clients directly through digital channels, and they experiment with innovations such as subscription business models. They also favour vertical integration to cut through the complexity of the industry. This unorthodox competition is now challenging traditional OEM manufacturers to adopt new, more transparent approaches to collaboration, something they aren’t used to.

With advances in digitalisation, innovation in the automotive industry is no longer a solo game or a two-party collaboration where everyone concentrates on their own process and products. Advances in digitalisation facilitate more open and flexible multi-party collaborations and the emergence of industry-level ecosystems. These ecosystems enable the tracking of the carbon footprint of vehicles, they drive circularity through more effective refurbishment and recycling of car parts, they enable digital twins that support product design, transform supply chains and after-market activities, and they provide the software foundations for electrified, autonomous, and connected vehicles.

These transformations are made possible by holistic industry-wide initiatives that build new standards, new systems, and new tools that enable the creation of shared data and shared code, and that connect corporations together. We identified four non-profit organisations that, in driving these changes, orchestrate ecosystems of automotive players and play a significant role in delivering the digital future of the automotive industry.

Manufacturing: the Open Manufacturing Platform and the Industrial Digital Twins Association

Instead of multiplying proprietary solutions, the Eclipse Foundation working group aims at creating both open-source software for software-defined vehicles and the associated development tools to implement and expand in-vehicle digital infrastructure. The competitive stakes are high.

The Open Manufacturing Platform (OMP) brought together manufacturing companies and software providers to enable smart manufacturing capabilities. This organisation was established as part of the Linux Foundation in 2019, and its key projects were transferred to the Eclipse Foundation in early 2023. The Open Manufacturing Platform projects are available not only to members of OMP, but to all industry players. Although a global alliance, it has attracted strong participation from leading German automotive companies such as BMW, ZF, Robert Bosch, and Schaeffler. The equipment and digital tools acquired by automotive manufacturers are typically proprietary solutions, which makes it difficult to integrate data from different pieces of machinery in a common cloud platform. Data ends up trapped locally and can only be accessed with specific software. This data insulation inhibits large-scale analysis and systemic improvements. By making every piece of data accessible in a shared cloud platform, frameworks and tools developed on the OMP allow the dots to be connected across entire factories and beyond. For instance, when a defect is identified by a client, it will increasingly be possible to rapidly track the sequence of events along the supply chain that surrounded the defective part. Digital twins of a production environment can be created to train employees, while other digital twins can help predict when and where unwanted events might occur. The Open Manufacturing Platform drives the development of new industry standards thanks to open-source reference architectures and common data models.
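
The defect-tracing idea can be sketched in a few lines: once events from different machines and partners land in one shared data model, finding the history of a defective part becomes a simple query over a unified event log. The code below is a conceptual illustration with invented records and field names, not OMP’s actual reference architecture or APIs.

```python
# Conceptual sketch: once machine and logistics events share one data model,
# tracing a defective part is a filter-and-sort over a unified event log.
# Records and field names are invented for illustration.
from datetime import datetime

event_log = [
    {"part": "SN-1042", "step": "stamping", "site": "Plant A",   "ts": "2024-03-01T06:12"},
    {"part": "SN-1042", "step": "coating",  "site": "Plant A",   "ts": "2024-03-01T09:40"},
    {"part": "SN-2001", "step": "stamping", "site": "Plant A",   "ts": "2024-03-01T06:15"},
    {"part": "SN-1042", "step": "shipping", "site": "Carrier X", "ts": "2024-03-02T17:05"},
    {"part": "SN-1042", "step": "assembly", "site": "Plant B",   "ts": "2024-03-04T08:22"},
]

def trace(part_id: str) -> list[dict]:
    """Return all events for one part, ordered in time, across sites."""
    events = [e for e in event_log if e["part"] == part_id]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["ts"]))

for e in trace("SN-1042"):
    print(e["ts"], e["site"], e["step"])
```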

The Industrial Digital Twins Association (IDTA) shares some of the goals of OMP. Digital twins are digital replicas of physical industrial equipment and products. They can be used to deliver multiple use cases and to simulate and optimise real-world systems. IDTA is a member-based organisation that aims to help companies integrate data across different systems and assets through the implementation of open standards and the development of open-source software. IDTA brings together industrial and technology companies to advance these developments.

Collaborative developments such as OMP and IDTA reduce obstacles to the implementation of integrated smart manufacturing solutions such as digital twins. They create the necessary foundations to build platforms that are independent of any proprietary provider and eliminate data silos, thus helping move beyond local optimisation to address industry-level challenges such as sustainability and the circular economy.


Data sharing along automotive supply chains: Catena-X

Catena-X is a data sharing initiative dedicated to the automotive industry and supported by the German government. BMW, Mercedes Benz, Robert Bosch, Siemens, Volkswagen, and ZF are among the 130 companies who have joined Catena-X and who encourage companies of all sizes to participate in data sharing. To improve their operational performance and to strengthen their product development activities, many corporations have harvested large amounts of data from their operations. To enhance resilience, sustainability, and systemic capacity to innovate, data sharing throughout the automotive industry is now perceived as the Holy Grail. The recent component shortages that have disrupted automotive companies have convinced every decision maker that no automotive company can solve such problems alone. However, sharing data with other industry players, including clients, competitors, and suppliers, is not an easy decision to make. Decades of focus on intellectual property have reinforced an industry mindset where no company wants to lose control over its own data by sharing it with others, even if the sharing is for mutual benefit.
Furthermore, creating systems costs money, and companies also fear cybersecurity issues. To help address these obstacles, Catena-X seeks to simplify data exchanges. It does this by creating protocols for data sharing and providing open-source software that facilitates data interconnections. Catena-X also promotes data sharing to automotive companies of all sizes, especially to small firms with limited resources. As every company is concerned with security and potential misuse of their data, Catena-X treats data sovereignty as an important priority. With Catena-X, automotive companies can retain control of their data even when sharing it. No information is uploaded onto a central platform. Instead, data remains on the system of the company that agrees to share it, and exchange occurs using peer-to-peer interconnections. To reinforce data sovereignty, usage policies are enforced by the system. Companies agree that their data can be used solely for specific use cases, and the system ensures that their data can only be used for the purpose they choose.
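
The data-sovereignty principle can be pictured as a policy check that runs before any data leaves the owner’s system: a request is served only if its declared use case and requester match what the owner has agreed to. The sketch below is a deliberately simplified illustration of that idea, with invented dataset names and policies; it is not the actual Catena-X connector software.

```python
# Simplified illustration of usage-policy enforcement in peer-to-peer data
# sharing: data stays on the owner's system and is released only for use cases
# the owner has explicitly allowed. Names and policies are invented.

usage_policies = {
    # dataset id -> (allowed use cases, allowed partner ids)
    "pcf/battery-cell-42": ({"sustainability-reporting"}, {"oem-x"}),
    "traceability/lot-7":  ({"quality-investigation", "recall"}, {"oem-x", "tier1-y"}),
}

def release_allowed(dataset: str, requester: str, use_case: str) -> bool:
    """Check the owner's policy before any peer-to-peer transfer."""
    policy = usage_policies.get(dataset)
    if policy is None:
        return False                      # no policy means no sharing
    allowed_uses, allowed_partners = policy
    return use_case in allowed_uses and requester in allowed_partners

print(release_allowed("pcf/battery-cell-42", "oem-x", "sustainability-reporting"))  # True
print(release_allowed("pcf/battery-cell-42", "oem-x", "marketing-analytics"))       # False
```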
So far, ten use cases have been agreed by the board of Catena-X. These range from initiatives to improve sustainability, recycling, traceability, and quality, to supply chain orchestration and smart manufacturing. Connecting a multitude of datasets across an entire industry will be a long and challenging journey that requires all players across the industry to be incentivised and mobilised. Nevertheless, data sharing offers opportunities that cannot be ignored within an industry that faces new competitive challenges and that needs to address its environmental impact.


Software-defined vehicles: an Eclipse Foundation working group

The Eclipse Foundation is an open-source foundation which announced, in 2022, a new working group on software-defined vehicles (SDVs). This working group brings together companies including Cariad (Volkswagen’s software company), Toyota, ZF, Robert Bosch, Continental, and several software and hardware specialists. More participants are expected to join in the coming years. The cars and vehicles of the future will be software-defined, with all hardware components orchestrated as a unified system.

Digitalisation is beginning to rewrite the rules of innovation in the automotive industry. New automotive entrants such as Tesla are ‘born digital’ and adopt a much more horizontal and open approach to organising their business activities.

This is a dramatic advance, as the automotive industry has been used to creating dedicated chips and software separately for each component in a vehicle, and now it needs to reverse its thinking and build an intelligent architecture that will swiftly orchestrate entire sets of interconnected car components. The digital infrastructure of software-defined vehicles will be hosted on a few powerful chips that will control entire zones of the vehicle. This will reduce the density of in-vehicle connections, the amount of hardware, and the overall cost, while processing broader and more complex flows of data. This in-vehicle digital infrastructure will allow users to buy and manage, over the air, subscriptions for differentiating features. It will also allow car manufacturers to aggregate data and sell digital services to other companies. Such in-vehicle digital infrastructures will comprise millions of lines of code and AI models, and most of it will consist of non-differentiated software. Thus, instead of multiplying proprietary solutions, the Eclipse Foundation working group aims at creating both open-source software for software-defined vehicles and the associated development tools to implement and expand in-vehicle digital infrastructure. The competitive stakes are high. Car manufacturers cannot afford to do this alone, and nor can they afford to depend on proprietary solutions for this digital infrastructure. Hence the choice of an open-source platform that provides car manufacturers with both cost benefits and freedom to innovate. As of today, the Eclipse Foundation working group has presented an overall template for the in-vehicle digital infrastructure, and seven software projects have been initiated. Participants in this working group are also learning how to operate with open-source principles of openness, transparency, and vendor neutrality. The working group is in its early stages, and it still needs to reach a critical mass of active participants to become a success. While automotive companies now recognise the imperatives of such large-scale collaboration, achieving it nevertheless requires a major culture change in an industry where all the players are used to operating their own proprietary systems.
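
One small facet of this infrastructure, selling differentiating features as over-the-air subscriptions, can be sketched as an entitlement check the vehicle software performs before activating a feature. The example below is a conceptual toy with invented identifiers, not code from the Eclipse Foundation working group’s projects.

```python
# Toy sketch of an over-the-air feature entitlement check in a software-defined
# vehicle: before activating a feature, the in-vehicle software verifies that
# the owner holds a valid subscription. Names and structures are invented.
from datetime import date
from typing import Optional

subscriptions = {
    # vehicle id -> {feature: expiry date}, kept in sync over the air
    "VIN-EXAMPLE-001": {"heated_seats": date(2026, 1, 31), "park_assist": date(2025, 6, 30)},
}

def feature_enabled(vin: str, feature: str, today: Optional[date] = None) -> bool:
    """Return True if the vehicle holds an unexpired subscription for the feature."""
    today = today or date.today()
    expiry = subscriptions.get(vin, {}).get(feature)
    return expiry is not None and today <= expiry

print(feature_enabled("VIN-EXAMPLE-001", "heated_seats", today=date(2025, 3, 1)))     # True
print(feature_enabled("VIN-EXAMPLE-001", "adaptive_cruise", today=date(2025, 3, 1)))  # False
```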


Monetising connected car data: regulation ahead

To provide a complete overview of data sharing and data monetisation stakes in the automotive industry, we now briefly describe how software-defined vehicles will power large flows of data. Beyond selling subscriptions, in-car advertising, and features as a service to vehicle drivers, car manufacturers are also planning to monetise data to build, for instance, aftersales, insurance, fleet management, and road management applications. By 2030, Stellantis plans to generate €20 billion every year from data, software, and subscriptions. While car manufacturers are best placed to use the data generated, they have already committed to sharing a basic volume of data with diverse industry players. However, these car manufacturers are perceived by some other industry players and by user associations as holding an unfair advantage. So forthcoming regulations could oblige car manufacturers to go beyond their current plans and force them to share more data with other companies in the future. This will be the result of intense lobbying activities. So, while car manufacturers ask other industry players to share data, they are also aiming to keep as much control as they can over the data they can access in cars. This could have an impact on the willingness of everyone across the industry to play the data-sharing game.

Conclusion

Today the automotive industry is working hard to connect the unconnected. We have highlighted how four not-for-profit foundations support this effort by orchestrating data and innovation ecosystems. These industry-wide collaborations create new technologies and a seamless flow of data that will power the performance of all industry players.

The Open Manufacturing Platform can drive the development of new industry standards thanks to open source reference architectures and common data models.

Will these four not-for-profit foundations be successful? To what extent will data sharing occur seamlessly at each stage of the value chain? Will open source software dominate the automotive software stack? This remains to be seen. The four not-for-profit foundations presented here are pointing in the right direction. They are creating a complex web of digital infrastructures that could reshape the automotive industry. Connecting the unconnected will be on the agenda of the automotive industry and other industries for the years to come. And many not-for-profit foundations will continue to set common goals and mobilise complete ecosystems to make this happen.

One certainty is that connecting the unconnected will not be achieved through bilateral relations along value or supply chains. It requires ecosystems that mobilise many industry players around common goals, and calls for industry-wide actions and global ecosystem orchestration. Our knowledge of how this can be done successfully on the scale of industries such as automotive, health, agriculture, and aerospace remains thin and calls for further research.

Why non-profit foundations play a key role in industry digital transformation

If we picture an industry as a complex irrigation network that conveys value towards a final user, we can then witness three types of bottlenecks1 that can prevent value from flowing seamlessly through the network: technical, strategic, and ecosystem bottlenecks. Technical bottlenecks are technical limitations that constrain the performance of products and processes. Strategic bottlenecks are companies who extract a disproportionate share of value out of specific technologies or resources that they control. Ecosystem bottlenecks are technical or commercial misalignments across industry players that slow the adoption and integration of new technologies. A lack of common technical standards, poorly coupled technologies, or the multiplication of competing technical solutions that provide limited differentiation are classic examples of ecosystem bottlenecks.

Not-for-profit foundations help eliminate ecosystem bottlenecks. They bring together heterogeneous groups of industry players to collaborate on shared developments. As they foster collaboration, these organisations help build industry consensus and develop common solutions. In the past, standard organisations were the main vehicle used to eliminate bottlenecks by agreeing on shared specifications and technical architectures. Today, bringing people into meetings to agree on standards is not enough. Not-for-profit foundations support collaborations that develop code, prototypes, and designs to ensure that bottlenecks are properly eliminated, so value can flow towards the final user.

Footnote
1. The three bottlenecks described here were presented in an academic paper authored by Erkko Autio, Herve Legenvre, Ari-Pekka Hameri titled “Driving Infrastructural Advantage: Google’s Strategic Moves in AI Digital Technology Commons”

About the Authors

Hervé Legenvre is Professor and Research Director at EIPM. He manages educational programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately, Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations. (www.eipm.org)

Erkko Autio FBA is Professor in Technology Venturing at Imperial College Business School, London. His research focuses on digitalisation, open technology ecosystems, entrepreneurial ecosystems, innovation ecosystems, and business model innovation. His research has been cited over 49,000 times in Google Scholar. He co-founded the Wicked Acceleration Labs (www.wickedacceleration.org).

The post Connecting the Unconnected in the Automotive Industry Four Ecosystems that are Reshaping Automotive Industry Collaborations appeared first on The European Business Review.

]]>
https://www.europeanbusinessreview.com/connecting-the-unconnected-in-the-automotive-industry-four-ecosystems-that-are-reshaping-automotive-industry-collaborations/feed/ 0
Triple A.I. Supply Chains https://www.europeanbusinessreview.com/triple-a-i-supply-chains/ https://www.europeanbusinessreview.com/triple-a-i-supply-chains/#respond Wed, 20 Jul 2022 08:50:18 +0000 https://www.europeanbusinessreview.com/?p=149603 By Fabrice Thomas and Hervé Legenvre Hau L. Lee, a Stanford professor described high performing supply chains back in 2004 as Agile, Adaptative and Aligned. Agile refers to the ability […]

The post Triple A.I. Supply Chains appeared first on The European Business Review.

]]>
By Fabrice Thomas and Hervé Legenvre

Hau L. Lee, a Stanford professor, described high-performing supply chains back in 2004 as Agile, Adaptive and Aligned. Agile refers to the ability of supply chains to face daily uncertainties and variations; adaptive refers to the ability to adjust to more significant changes over time; and aligned refers to the quality of the relationships with partners along the chain that contribute to delivering sustainable performance. While the triple-A framework is as relevant as ever for supply chain leaders, in this article we put forward the concept of Triple A.I. supply chains: Agile, Adaptive and Aligned needs to be complemented by Informed, Interconnected and Intelligent. In a nutshell, this means that information gathered along and beyond the supply chain is combined into short-, mid- and long-term insights and scenarios that facilitate human decisions (see figure 1).

Decision-making within the supply chain needs to address different time horizons. First, within a short-term time lens, a supply chain needs to be agile and informed, allowing all teams to orchestrate the supply chain in real time. Acting as an effective control tower impacts on quality, cost and delivery performance while ensuring a positive P&L impact. Second, within a mid-term time lens, a supply chain needs to be aligned and interconnected allowing all teams to strengthen supplier and customer relationships and to optimise end-to-end flows. In terms of performance this contributes to achieving customer centricity and delivering on the corporate strategy goals.  Finally, taking a long-term time lens, supply chains need to be adaptable and intelligent. They mobilise a broad ecosystem of partners that can collectively address any challenge or seize any opportunities based on facts. Being future-ready creates strategic resilience and generates inclusive performance well beyond operational and financial results. This contributes to delivering on the broader goals that matter to all external stakeholders, such as social and environmental impacts.

Figure 1: Triple AI Supply Chain


We start by describing what Informed, Interconnected and Intelligent supply chains are. Then, we will look at four key stages of implementation for Triple A.I. supply chains. The last stage describes how Chief Supply Chain Officers can assemble all the pieces of this strategic jigsaw puzzle.

Informed Supply Chains

Supply chain teams face complex and dynamic environments that require aggregating reliable information. Some of these are internal sources accessible in existing systems; others are external sources such as alerts on weather issues, strikes, or geopolitical events. To support decision-making, multiple sources of information are combined. Providing access to all the information available creates transparency, but it can result in information overload and paralysis. Therefore, to support operations effectively, activity information needs real-time updates and powerful visualisation. Multiple information sources need to be combined to reveal what is at stake and how it can be addressed. Operational teams then need to be able to look at one event from different perspectives and to drill down through different layers of information. If a component is blocked within a harbour, this means accessing information about alternative transportation routes from the supplier location; or visualising the level of inventory for this component in different locations; or accessing information about alternative suppliers of this component.
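
The harbour example can be made concrete with a small sketch: an alert on a blocked component triggers look-ups across inventory, routing, and sourcing data so that the operational team sees its options in one view. The data and function names below are invented for illustration; a real control tower would pull these views from live systems.

```python
# Illustrative sketch of a drill-down triggered by a disruption alert: combine
# inventory, routing, and sourcing views for the affected component. All data
# is invented; a real control tower would query live systems.

inventory = {"COMP-7": {"Rotterdam DC": 120, "Lyon plant": 40}}
alternative_routes = {"COMP-7": ["air freight via Frankfurt", "rail via Duisburg"]}
alternative_suppliers = {"COMP-7": ["Supplier B (qualified)", "Supplier C (pending audit)"]}

def drill_down(component: str, alert: str) -> dict:
    """Assemble the views an operations team needs to act on one alert."""
    return {
        "alert": alert,
        "stock_on_hand": inventory.get(component, {}),
        "alternative_routes": alternative_routes.get(component, []),
        "alternative_suppliers": alternative_suppliers.get(component, []),
    }

view = drill_down("COMP-7", "Shipment blocked in harbour (customs hold)")
for key, value in view.items():
    print(f"{key}: {value}")
```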

Interconnected Supply Chains

Supply chains consist of a myriad of assets through which goods, information, and money circulate. Gaining real-time visibility on a supply chain can quickly escalate into virtualising a complex network of physical activities. Anyone trying to assess their scope 3 carbon emissions knows how difficult it is to build such a picture, even over a few steps of a supply chain. While we can start connecting information across a certain scope, over time this can be expanded to broaden visibility and strengthen the ability to anticipate and accelerate when needed.

The Internet of Things enables these interconnections. As data are increasingly produced by objects and can be visualised on many devices, the tracking and monitoring of many activities is facilitated.

How does this work in practice?

It is 6.00 am in a San Francisco warehouse 

“The warehouse manager is looking at a large screen in a Star Trek-like control room. A few touches and the couple of cases that require attention are scrutinised. Decisions are made rapidly with the right data. Interconnecting systems has made it possible to bring together customer requirements, the real-time location of goods, storage conditions and arrival movements from sales teams and warehouses. Recently, key suppliers have started to share real-time data as well. Contingency planning is in good hands now.”

While we all dream of having such a control room, it is a challenge to bring everything together. In some areas, information might abound, while other areas are still dark spots. While IoT gives access to more data, some information-processing activities are still poorly digitalised. Customs-related information, for instance, is still in most circumstances produced and shared by humans.


This expansion also faces obstacles because of interoperability problems. Today we still lack data-processing standards for some basic information, such as procure-to-pay documents including invoices. Most managers know how challenging it is to interconnect internal and external sources of data. Such interconnection challenges grow as we expand visibility from deep-tier suppliers to clients, as we consider multiple linkages such as transport, customs and regulatory bodies, and as we look at different dimensions such as money, real-time localisation, traceability, certificates, carbon impacts and others.

Building the right digital infrastructure is also essential. Companies cannot implement an ERP and a couple of complements and claim victory. They must move forward and think carefully about their digital infrastructure. We should stop patching legacy systems and move towards the cloud. Supply chain executives must think like architects who envision the future use cases they will need to implement.

While there are challenges to be addressed, accessing information from a myriad of assets and combining it intelligently, from real-time operations to strategy war rooms, can be a source of competitive advantage.

Intelligent Supply Chains

Moving towards intelligent supply chains requires considering each piece of information as a gem. Some gems are shiny but of little value: many humans share a passion for the weather forecast but rarely act on it. Other pieces of data are uncut diamonds which need to be cut and assembled with many other gems to shed light on important issues. Finally, some information can be useless for some time… until it becomes critical. For instance, many of us reinvented ourselves as junior epidemiologists during the COVID period.

So, while it is challenging, helping information shine and assembling the right sources of information helps take human bias out of decision-making processes. It can help optimise the rational side of supply chains. It can help supply chain teams focus on strategic decisions without losing time on reporting and report interpretation. It can transform firefighting into anticipation and facilitate real-time acceleration of decisions.

In 2016, Flex, the manufacturing solutions provider, unveiled Flex Pulse.

This is the company’s supply chain analytics solution. It supported $26 billion in revenue, 1.2 million active parts, 20,000 suppliers, 1,000+ customers with their distinct supply chains and 120 manufacturing sites.

Today, Flex Pulse is a cloud-based visibility platform dedicated to coordinating the supply chain across sourcing, transportation, manufacturing and inventory. Flex now has nine centres across the globe that orchestrate the Flex supply chain on a real-time basis thanks to Pulse.

Intelligent information management provides a real-time, unified view of how an extended supply chain behaves today, tomorrow and under any severe condition we might consider. Intelligent supply chains are not solely about leveraging artificial-intelligence technologies. Often, visualisation that reveals what matters can already be powerful. Pareto diagrams have long offered powerful visualisations of inventory levels, statistical process control reveals potential quality risks, and value stream mapping – a lean visualisation tool – reveals waste across a few sites and can help a team think differently about the future. Now, artificial intelligence also helps. It can provide insights and foresight on the evolution of demand, on delivery performance, and on prices and costs. But even the most powerful visualisation should be complemented by the flexibility to look at an event from different angles.
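
As an illustration of how a simple visualisation-oriented analysis can reveal what matters, here is a minimal Python sketch of a Pareto-style ABC classification of inventory value. The items, figures and 80%/95% cut-offs are hypothetical and only serve the example.

```python
# Minimal sketch of a Pareto (ABC) classification of inventory value.
# Items and figures are hypothetical illustrations.

inventory = {  # item -> annual usage value in euros
    "item-01": 120_000, "item-02": 45_000, "item-03": 30_000,
    "item-04": 8_000, "item-05": 5_000, "item-06": 2_000,
}

total = sum(inventory.values())
cumulative = 0.0
classification = {}

# Rank items by value and assign A/B/C classes based on cumulative share.
for item, value in sorted(inventory.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += value
    share = cumulative / total
    classification[item] = "A" if share <= 0.80 else "B" if share <= 0.95 else "C"

for item, cls in classification.items():
    print(f"{item}: class {cls} ({inventory[item]:,} EUR)")
```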

Now that we have described what Informed, Interconnected and Intelligent supply chains are, we can consider how Triple A.I. supply chains can be implemented in four stages.

Stage 1. Designing a client-centric and sustainable supply chain

Supply chains need to be client-centric, and they need to be properly designed before being powered with information, interconnection and intelligence. Client-centric supply chains are designed to respond to the complete set of client expectations. First, this means that client expectations in terms of delivery, accuracy, volumes, order confirmation and transparency are integrated into the design of the supply chain. Second, supply chain design needs to consider complementary products or services of value to clients. These complements include premium delivery, synchronisation of delivery with installation and training activities, as well as ease of access to consumables, maintenance, updates, upgrades and end-of-life services. Well handled, these complementary services and products are a source of client satisfaction and revenue. If they are ignored, even a well-functioning supply chain can generate client dissatisfaction. The COVID period revealed that major gaps exist between the simplicity expected by clients and the reality.

Designing a client-centric supply chain requires moving away from optimising technical connections across operational silos. Supply chains need to be segmented so they can bring together multiple product flows that share common customer value, product attributes, manufacturing and supply capabilities, and performance considerations. The chains of relationships from suppliers to clients need to be considered so alignment, adaptability and agility can be achieved, and distinctive value can be offered to clients.

We have already distinguished strategic planning from operational decisions. Designing the supply chain aims at creating distinctive, long-term performance. It comprises some of the most strategic decisions executives in modern corporations need to take. While strategic, these decisions need to keep some options open at the planning and operational levels so adaptability and agility can be achieved. Too many constraints make a supply chain crack when unexpected events challenge it.


This can require reintegrating activities to regain control of data so that new integrated, data-led services can be delivered to clients. It can also be achieved through the further outsourcing of activities to high-performing suppliers. It can entail establishing a global hub or localising some activities. Blindly following trends is not always the best recipe for success. Supply chains need to be designed to respond to client expectations and to handle diverse constraints. They should also answer to social and environmental considerations, as these are both imperative and scrutinised by many external stakeholders. While many good practices exist, none of them is universal; only the thorough comparison of alternative options can help find the best solution and its limits of application.

While customer-centric supply chains focus on the horizontal flow of goods, money and information, some vertical perspectives need to be integrated within their design. These include empowering finance with visibility on revenue and costs so they can manage financial analysts’ expectations; according to research, a supply disruption can impact a firm’s valuation for over a year. Another vertical perspective is providing the right information to sales so they can inform and support clients effectively. Finally, it also consists in providing added value for tax or legal departments.

Supply chains should be designed to deliver customer centricity, profitability, growth and sustainability. We have a wealth of good practices that can be used to optimise the flow of goods, data and money, but only the thorough comparison of alternative options can identify the best one.

How does this work in practice?

Somewhere in Europe

“In 2020, one medical company had started to rethink its logistics strategy. The 2020 lockdown had led the team to experiment with a couple of new transportation tactics. The data collected highlighted the benefits of substituting some air transportation with ocean and train transportation. The figures were encouraging. Over 1,300 tons of CO2 could be eliminated while the complete logistics cost base could be reduced, even if some inventory levels would go up in a few locations. The team was enjoying the outcomes of a rapidly executed project. Having access to the right information along a full supply chain creates power, consensus and speed.”

Stage 2. Getting the right data and the right system

Large corporations that have operated supply chains for many years are sitting on a large amount of data. Data might be a gold mine if these datasets have been structured and regularly updated. However, without sound data-management practices, companies end up relying on erroneous, missing or obsolete data, and individuals lose trust in data. Today, the focus on digital transformation is forcing most companies across all industries to rethink how they collect, structure, store, aggregate, update and complement their data. But such efforts need to be future-proof and allow any future business option to scale.

From a financial standpoint, data-management initiatives are hard to justify. Return on investment, based on operational improvements, is difficult to assess and tends to take a year or two to materialise. Therefore, it is necessary to associate these initiatives with new perspectives that outline the complete business potential. Considering new perspectives helps envision and establish the right data foundation that can scale any future strategic option. This calls for ambition, leadership and strategic thinking. Building the foundation of a valuable digital transformation requires taking into consideration customers, competition, innovation and value creation opportunities, not just existing data quality and operational efficiency.

Focusing on the Volume, Veracity and Velocity of existing data can fuel technical excellence. Well-structured and reliable bills of materials, with consistent information on demand, lead times, inventory levels and logistics routes, are a good foundation for operational success.

In terms of Velocity, as competition occurs between supply chains, real-time access to information brings competitive advantage. Veracity is also key: getting closer to raw data sources brings benefits, as it shunts multiple interpretation lenses. Velocity and Veracity combined allow the best choices to be made from the best data in real time.


But the future value data can deliver, and the Variety of data required to achieve customer centricity, should also be considered. This starts with a thorough understanding of the end-to-end client experience and, most specifically, of the concerns, worries or problems clients face. From this, future data requirements can be envisioned. For instance, a hospital that orders large equipment should be able to anticipate any issue in the last 100 metres of the logistics chain. And the equipment supplier needs to consider how it can provide either the right data or the right service to process the client’s data so that issues within these last 100 metres can be avoided.

While it will never be possible to get everything right first time in terms of data, some goals need to be set. For operational decisions, data need to enable an accurate and rapid reaction. At planning level, teams should have an improved capability to perform reliable comparisons of scenarios through a good understanding of the total cost of each scenario. And strategic decisions can become more fact-based and allow better supplier selection. Many companies, for instance, still poorly integrate the cost of non-quality or the cost of late delivery in their supplier selection process.

Then comes the choice of the system that powers the data and empowers people to deliver customer value. Here, ambition needs to drive decisions, not the immediate ease of implementation. Cloudification and robust implementation of IoT solutions can offer major steps forward, while patching and bridging legacy systems cannot be a viable long-term option. Like a supply chain, the technical infrastructure needs to be aligned with the business ambition, adaptable to future business options and agile enough to respond to day-to-day demands.

Future performance also requires improvements at institutional level. Digitalisation of customs activities is a necessity for leading economies. New standards and modern infrastructures need to be established. The EU project ALICE (Alliance for Logistics Innovation through Collaboration in Europe), for instance, brings together a complete ecosystem that aims at a 10–30% increase in efficiency in the EU logistics sector, resulting in €100–300 billion of cost relief. This will require an open global logistics system based on interconnections and intelligence.

Stage 3. From supply chain analytics to digital twins

Supply chain analytics aims at bringing and combining all relevant information in one place in real time, while digital twins aim at performing dependable simulations using real-time data to anticipate supply chain dynamics.

The long-term goal associated with the use of analytics and digital twins is to have machines take ownership of simple decisions while humans focus on key priorities and the options at hand. A container powered by a smart contract could, within a set budget, decide what routes to take over time to achieve an optimal cost-versus-speed performance. A warehouse operator would have at his or her fingertips a clear and ordered overview of what needs to be done, with priorities that are clear and informed by reliable information. A supply chain planner would know the ten most critical products he or she should work on and have a few costed options for each of them. Finally, a buyer would have all the information needed to effectively manage alignment with key suppliers and prepare in time for future supplier-selection activities.

But the reality is less glamorous. We all need to start with the basics first.

Moving towards supply chain analytics starts with assembling a set of measures to ensure “on-time delivery” with good accuracy. Over time, more options can be added, further accuracy can be achieved, and a broader scope can be considered. This can sound quite basic, but it is a necessary level of maturity. In terms of functionality, a first step with supply chain analytics is to receive warnings and priorities; a second step is to gain access to scenarios for each priority. Going further, supply chain analytics also provides the opportunity to understand weak signals. By combining different sources of data, you might realise that suppliers who change items’ delivery dates more often than usual are likely to deliver late a few weeks afterwards. Such signals help anticipate future issues and manage relationships with suppliers effectively.
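
As a purely illustrative sketch of such a weak signal, the Python snippet below flags suppliers whose recent rate of delivery-date changes exceeds their historical baseline. Supplier names, figures and the threshold multiplier are hypothetical and not drawn from any real dataset.

```python
# Hypothetical weak-signal detection: suppliers rescheduling deliveries
# more often than usual may be at risk of delivering late later on.

history = {  # supplier -> average delivery-date changes per month (baseline)
    "Supplier A": 1.0,
    "Supplier B": 2.5,
    "Supplier C": 0.5,
}
last_month = {  # supplier -> delivery-date changes observed last month
    "Supplier A": 4,
    "Supplier B": 3,
    "Supplier C": 1,
}

def flag_suppliers(threshold=2.0):
    """Flag suppliers whose recent reschedule rate exceeds their baseline
    by more than the given multiplier."""
    flagged = []
    for supplier, baseline in history.items():
        recent = last_month.get(supplier, 0)
        if baseline > 0 and recent / baseline >= threshold:
            flagged.append((supplier, recent, baseline))
    return flagged

for supplier, recent, baseline in flag_suppliers():
    print(f"{supplier}: {recent} changes last month vs baseline {baseline} -> watch for late deliveries")
```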

Supply chain analytics also needs to deal with challenges in terms of data visualisation. Complex maps and graphs might be accurate but poorly actionable. The challenge is always to move easily from the full picture to the right details. For this, the user interface is key, yet challenging to get right; some existing market solutions are very poor in this respect. The user interface should also be easy to share within a small team, as more than one type of expertise can be required to interpret what happens and agree on what should be done.

Digital twins were first used in manufacturing environments back in the 1990s. With the advent of the Internet of Things, the ability to sense and track objects has made it economical to capture more data in real time. This enables the creation of digital twins for supply chains and for products throughout their full lifecycle. Using real-time data, simulations become available and allow us to take informed decisions on what is likely to happen.

In a nutshell, digital twins help replace hope and fear with more certainty and confidence in the decisions taken. For instance, you can visualise the complete supply chain and spot a supply item on its way to a sub-contractor. This item is on a boat; you can access all relevant cost and quality information on it, as well as the current capacity and performance of the sub-contractor who will soon integrate it into a larger system. As you are informed that this item is on a critical path, you are provided with options. To ensure that all clients will be served on time, one option is to assemble the system in one of your own sites; another suggests an alternative logistics route that can be used to accelerate delivery. The digital twin can provide scenarios: if a supply is late, it can tell you what the options are and what backups exist.
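
As a toy illustration of this kind of option comparison (not an actual digital twin), the snippet below ranks hypothetical recovery options for a late item, keeping only those that meet the customer deadline and picking the cheapest. All options, dates and costs are invented for the example.

```python
# Toy scenario comparison for a late supply item. All figures are hypothetical.
from datetime import date

customer_deadline = date(2024, 7, 15)

options = [
    {"name": "Keep current plan",     "arrival": date(2024, 7, 22), "extra_cost": 0},
    {"name": "Assemble in own site",  "arrival": date(2024, 7, 14), "extra_cost": 18_000},
    {"name": "Alternative air route", "arrival": date(2024, 7, 10), "extra_cost": 25_000},
]

# Keep only options that meet the deadline, then pick the cheapest one.
feasible = [o for o in options if o["arrival"] <= customer_deadline]
best = min(feasible, key=lambda o: o["extra_cost"]) if feasible else None

for o in options:
    status = "meets deadline" if o["arrival"] <= customer_deadline else "too late"
    print(f'{o["name"]}: arrives {o["arrival"]}, extra cost {o["extra_cost"]:,} EUR ({status})')

print("Recommended:", best["name"] if best else "no feasible option")
```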

With the right supply chain analytics and digital twin capabilities, the dream is to unify performance calculation in real time. In this ideal world, impacts on the quality-cost-delivery triangle are understood, measured and simulated instantly as needed. This changes the jobs of planners and buyers. Planners become supply chain leaders: they set the goals and the boundary conditions in line with customer demand. Buyers become supply chain designers who provide options in terms of capabilities, price and contract to planners so they can put the flow in motion.

The road to the ultimate goals of supply chain analytics and digital twins remains long. It will require new ways of sharing data, using standard formats across complete supply chains. We are still far from this in many sectors.

Stage 4. The CSCO’s new operating model

Establishing a Triple A.I. supply chain requires the Chief Supply Chain Officer to implement a new operating model. The role of the Chief Supply Chain Officer is not to be an expert on each function, but to assemble all the pieces together over time. This operating model is best explained by distinguishing what needs to be done along different time horizons.

Looking at the short-term horizon, the supply chain has a control-tower capability. It aims at orchestrating the supply chain and reacting to changes using real-time information. This contributes to making the supply chain informed and agile. It delivers on quality, cost and delivery performance while ensuring a positive P&L impact.

Looking at the mid-term horizon, the supply chain has an end-to-end flow-manager capability. Teams can assess and implement new supply chain scenarios based on complete cost/benefit projections. This strengthens supplier and customer relationships and contributes to making the supply chain aligned and interconnected. In terms of performance, this contributes to achieving customer centricity and delivering on corporate strategy goals.

Looking at the long-term horizon, the supply chain has a strategic decision-maker capability and will be future-ready. It anticipates challenges and opportunities. This contributes to making the supply chain adaptable and intelligent. It creates value well beyond operational and financial results and contributes to delivering on the broader goals that matter to all external stakeholders, including social and environmental impacts.

The following table can be used to assess progress and to build and mobilise the right team to implement a Triple A.I. supply chain.

[Table: Triple A.I. supply chain operating model across the short-, mid- and long-term horizons]

Conclusion: Leadership and vision

We start this conclusion with a thought experiment.

It is 2024 – Singapore – the supply chain HQ of a manufacturing company. The bi-yearly strategic scenario-building meeting has started.

“The CSCO has invited some team members, some stakeholders and some external partners and experts to a strategic scenario-building event. The aim is to explore how the supply chain could survive a major event.

A professor describes how a conflict between two countries could evolve and lead other countries to take sides. The head of the local harbour infrastructure describes how this could impact its operations and other major harbours in the region. The supply chain team pulls from the data the suppliers and routes that are likely to be disrupted… This would lead to an impossible-to-manage situation.

It was decided that the team needed to build a scenario where the company would be able to swiftly operate in two separate parts of the world. Scenarios with solutions, models and data should be ready for the next meeting, planned in a week.”

This experiment illustrates how Triple A.I. supply chains will have the ability to anticipate and handle major shifts related to geopolitics, climatic and environmental risks, pandemics, and economic changes such as taxes and trade tariffs. This positions the supply chain as a guardian of company performance, a source of customer satisfaction, a vector of growth, and a major driver of financial performance.

Triple A.I. supply chains are Agile & Informed, Aligned & Interconnected, and Adaptable & Intelligent. Within Triple A.I. supply chains, information gathered along and beyond the supply chain is combined along a triple time horizon. Short-, mid- and long-term insights and scenarios facilitate human decisions.

Triple A.I. supply chains are more than a combination of outstanding technologies and operational processes; they deliver outstanding performance when a CSCO shares a daring vision and facilitates leadership within the team, among stakeholders and with external partners. Triple A.I. supply chains create trust for clients, suppliers, finance teams and other stakeholders. This is all about motivating individuals, being attractive to clients and suppliers, and delivering a daring brand promise to clients. As we conclude this article, we hope to pursue the discussion in the future. Many questions come to mind: How can the CSCO deliver real digital transformations focused on clients and value created? How can we ensure the best management of supply chains across different time horizons? How can we build supply chains that foster trust, innovation, environmental performance and customer centricity at the same time? How do we create systems that simplify the user experience without sacrificing the richness of options? And how can we bring everyone in our teams forward and perform a smooth implementation of the changes?

About the Authors

Fabrice Thomas

Fabrice Thomas has been Vice-President of Global Supply Chain in various industries (transport, robotics and healthcare) for the past 15 years, where he has designed and led the execution of digital transformation strategies in supply chain, leveraging adaptive operating models and digital technology principles. He is based in Singapore, where he leads a global supply chain organisation. He holds a Master’s degree from EM Lyon, an MBA from HEC and several certifications from Columbia University.

Hervé Legenvre

Hervé Legenvre is Professor and Research Director at EIPM. He manages educational programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations.

The post Triple A.I. Supply Chains appeared first on The European Business Review.

How Technology Commons Revolutionise Industry Foundations: The Story of the Open Compute Project (OCP) https://www.europeanbusinessreview.com/how-technology-commons-revolutionise-industry-foundations-the-story-of-the-open-compute-project-ocp/ https://www.europeanbusinessreview.com/how-technology-commons-revolutionise-industry-foundations-the-story-of-the-open-compute-project-ocp/#respond Tue, 25 Jan 2022 08:29:03 +0000 https://www.europeanbusinessreview.com/?p=138261 By Hervé Legenvre and Steve Helvie In this article, we will feature a largely ignored cornerstone of digital transformation – the adoption of technology commons: open-source software, open hardware, open data, […]

The post How Technology Commons Revolutionise Industry Foundations: The Story of the Open Compute Project (OCP) appeared first on The European Business Review.

By Hervé Legenvre and Steve Helvie

In this article, we will feature a largely ignored cornerstone of digital transformation – the adoption of technology commons: open-source software, open hardware, open data, etc. Using the story of the Open Compute Project, currently celebrating its 10th anniversary, we illustrate how such ecosystems revolutionise industry foundations by removing market bottlenecks and transforming technology development into collaborative efforts.

Technology commons: the invisible force that shaped the digital economy


Malcolm P. McLean was a truck driver who revolutionised the transportation industry when he invented the shipping container, the basic unit of transportation that powered global trade growth during the latter part of the 20th century. Awarded a patent for this invention, he was nevertheless conscious that the container needed to become a widely adopted standard, which pushed him to issue a royalty-free license of his invention to the International Organization for Standardization (ISO). As this sparked a growth in the usage of containers, his company, SeaLand Industries, was in the best position to capture the benefits, and it became the largest cargo shipping business. This story illustrates how open designs and technology commons can change the foundation of an entire industry.

Years later, as technology became more widespread, open-source software, another form of technology commons, became a dominant force. Today, the largest contributors to open-source software projects are companies such as Google, Facebook and Microsoft. Google and Facebook have benefited from open-source software since their early days. As they evolved, they turned open-source development into a large-scale innovation engine. It powered their growth and the growth of a myriad of digital-native companies. Today, start-ups pay only a fraction of the costs needed to create applications, thanks to open-source software. It has powered the transition to a mobile world and unleashed the adoption of artificial intelligence. Google and Facebook have created and open-sourced the most-used machine learning frameworks: TensorFlow and PyTorch. These digital giants have also used open source to their advantage: to bring flows of users to their platforms and to challenge their own proprietary solutions. It also makes them attractive employers, as talented developers love open source.


While the development of technology commons has taken place deep in the technology stacks of these companies, many managers who are not familiar with digital infrastructures are unaware of the power of technology commons. Facebook and a group of companies have even brought open-source development to the hardware world. Celebrating its 10th anniversary in 2021, the Open Compute Project (OCP) offers a great illustration of forces at play deep within the technology world. It is an ecosystem that has transformed the foundations of the data centre sector. 

The story of the Open Compute Project (OCP)

In 2009, Facebook was experiencing capacity issues detrimental to its users’ experience and started to design its own data centre. At that time, Google and Microsoft had ambitions in the cloud market and were already working on custom designs for their data centres. However, they were keeping their designs to themselves. Facebook did not see data centres as a differentiating factor and decided to open source its designs. This was done in collaboration with Intel, Rackspace, Goldman Sachs and Andy Bechtolsheim, who together established the Open Compute Project Foundation in 2012. Today, thousands of engineers are working on dozens of projects across the ecosystem. The designs are available under permissive and reciprocal licenses. The Foundation has an inclusive decision-making process and focuses on promoting the adoption of OCP designs.


At the time, the focus for Facebook was on cost and energy efficiency. The designs started from a blank sheet of paper. This led to a reduction in the number of parts and to an optimised server layout that Facebook described as follows: “Our chassis is beautiful… functionally beautiful. In fact, we like to call it ‘vanity free.’ It was designed with utility in mind. We didn’t use plastic bezels, paint, or even mounting screws, which lowered the cost and weight. Our key customers — our data center technicians — provided a lot of input to the chassis design. The result was an easy to service chassis almost free of screws. A server actually can be assembled by hand in less than eight minutes and thirty seconds!1” While the initial focus was on components such as racks, servers, storage boxes and motherboards, OCP started to disaggregate the network layer of the data centre. This meant separating the software from the hardware to increase modularity and gain flexibility on the supply side.

Companies such as Microsoft and Apple later joined the Open Compute Project and contributed their designs. In 2020, Google joined the board of OCP and announced some upcoming contributions, while Facebook has already submitted over 50 contributions and Microsoft has contributed 35 to date. As Alibaba, Baidu and Tencent have joined, large-scale OCP adoption is expected in Asia. Over time, the focus of OCP projects has also shifted from what is inside the racks to new challenges and technologies, such as advanced cooling and energy re-use. Some new projects, such as the Nokia open edge server, have taken OCP out of the data centre to the edge, to achieve the best latency-cost trade-off in 5G deployments.

Today, OCP has been widely adopted by the “hyperscale” companies – the 20+ internet giants that make extensive use of data centres – but it is also progressing beyond this market. Companies across many sectors are taking advantage of colocation solutions to benefit from the cost and energy efficiencies; others with more technical expertise have developed their own systems. One area where OCP is gaining significant traction is telecommunications, with operators such as Telefonica, AT&T and Deutsche Telekom. The emerging disaggregation of the telecommunications value chain is likely to accelerate this.

Over the past 10 years, with the widespread adoption of cloud services, OCP has transformed the design of data centres. Suppliers of OCP components see their markets growing, while traditional integrators are losing market share. HPE’s server unit, one of the market incumbents, is quite involved in OCP projects such as the Open Systems Firmware (OSF) project, and HPE uses OCP components in its own line of servers. AMD now sees OCP as a vehicle to challenge Intel’s dominant position in the data centre chip market segment. These are clear signs that technology commons can shake the hardware world and rewrite the rules of competition.

The four OCP success factors

The success of OCP can be explained by four factors: OCP brings cost reduction, OCP offers flexibility and speed, OCP enables sustainability improvements, and OCP unites standards and innovation.

1. Cost reduction


The adoption of a modular and open architecture, coupled with a rigorous development process, eliminates unnecessary costs. This reduces investments and operating costs by optimising installation, maintenance and uninstallation activities. It also focuses on energy efficiency. The sharing of designs leads to further economies of scale through standardisation and the combined volume of buyers. Yahoo Japan reported a 35% price advantage in 2019 alone by using OCP servers, and a 41% cost saving on racks.

2. Flexibility: speed and multiple sources of supply

As the 5,000+ engineers involved in OCP work on the next generation of data centre components, they focus on radically simplifying installation, maintenance and uninstallation activities. This creates a highly flexible environment where installing a new rack takes no more than a few minutes. This is a powerful advantage: it eases the installation of a new data centre and keeps it at its best level of performance over time. Flexibility is also the result of the disaggregation of the value chain. Instead of depending on a unique supplier to bundle multiple technologies and services, OCP adopters benefit from multiple sources of supply, reducing dependence on single providers.

3. A sustainability enabler

Over the past ten years, OCP has had a strong focus on sustainability performance. The overall carbon emission reduction is sizeable. Facebook estimated in 2018 that, thanks to OCP innovations, it had saved 400,000 metric tons of carbon, the equivalent of 95,000 cars over an entire year. Today, Advanced Cooling Solutions (ACS) and energy re-use are amongst the priorities for OCP members, and the ACS projects are among the most active in the Community.

One of the most recent developments within the OCP Community is the implementation of the circular economy. Companies like Facebook adopt and acquire technologies ahead of others. These technologies can last a long time, but after a few years they are replaced so that these companies benefit from the latest technical developments. Thanks to the modular and open architecture of OCP, the equipment can be wiped and resold to other companies, which gain access to recent technologies at a significantly reduced price. This extends the life of the equipment and reduces the CO2 impact of data centres.

4. Standard and innovation united 


Finally, as all key industry players work together to innovate around a common architecture, a standard innovative solution is emerging. This eliminates the risk of fragmentation associated with innovation. James Pearce, who was head of open source at Facebook, commented: “honestly the idea of open sourcing our designs has really helped accelerate the pace of innovation throughout that sort of ecosystem, it’s helped us to iterate quickly, and we know that many other companies in the industry, from Microsoft, now through to Google and many other hardware partners have been involved with the open compute project to, we think, a huge amount of success industry-wide in terms of driving the pace of innovation, driving down the costs of much of this hardware which we think benefits the technology industry as a whole.”2

Managing the OCP ecosystem 

Managing a technology commons ecosystem is a challenging exercise.  For an ecosystem such as OCP, the diversity of competencies required, the variety of stakeholders involved and the focus on collaborative effort offer very interesting leadership and management challenges. We have outlined some of them here.

Ensuring the architecture is simple, modular and open 

Open development within OCP is structured around simple modular architectures. Components can be easily interconnected thanks to open interfaces. Modularity reduces complexity and openness creates architectural flexibility as components can be easily mixed and matched together. Together modularity and openness allow users to organise collaborative developments around specific technical projects that evolve independently from one another.

Devising supportive IP policies

OCP strikes a fine balance between maximising openness and attracting contributors to the overall project. The Open Compute Project releases specifications. With these open specifications, vendors build products that fit into the architecture and adopters assemble and integrate all the required technology together. Two types of contributions exist.

The first one consists of releasing the complete design files under an open-source license. These design files need to be precise enough so anyone can understand and use them. As technology evolves rapidly, leading vendors within the ecosystem maintain a time advantage even if they share their design.

The second consists in specifying the interfaces, leaving suppliers free to provide specific technologies, including proprietary ones, that fit into the architecture. With both options available, the overall ecosystem is attractive for all participants. And even if designs are not fully shared, OCP adopters still have the choice to source components from multiple vendors who offer different technologies.

Creating a governance which supports collaborative development

OCP has created a decentralised and transparent decision-making process and fosters a collaborative approach. Technical directions are defined within OCP by the Incubation Committee. Its members, who are elected by OCP members, review the specifications and designs submitted; they set technical directions and encourage open collaboration and contribution. Each project then has a project lead. Anyone can join a project as a participant; participants receive all email exchanges on the project, can join meetings and can access documentation. As meetings are recorded, they remain accessible on the OCP website. For companies who contribute, design and manufacture OCP components, being part of such a Community offers opportunities to influence project technology roadmaps, gain visibility and go to market faster with new products.

Mobilising ecosystem participants

Ecosystem participants include companies who either adopt, integrate or contribute to OCP solutions. They need to be mobilised and integrated within the Community. This is achieved through a variety of methods. In the early years of OCP, there was a focus on growing the membership and on well-known companies joining OCP. The reputation of the members of the board was also a way to create legitimacy during the early years. The OCP team organises conferences and workshops, and encourages adoption of and contributions to projects. Messaging focused on the benefits of OCP solutions and was targeted at ecosystem participants and newcomers alike. Today, as OCP reaches maturity, there is a strong focus on research and case studies which demonstrate the growing adoption and benefits of OCP solutions.


Changing an industry architecture using technology commons

Leveraging technology commons is a counter-intuitive strategy for many executives who, for many years, have focused on controlling unique, valuable resources developed in-house. The prevailing belief has been: “Own it, and it will help you beat competition.”

However, as the complexity of digital systems increases, differentiation comes from only a fraction of the overall digital infrastructure companies use. Technology commons help establish standards. They offer shortcuts by avoiding the lengthy market selection process and the myriad of competing options. But for technology commons to succeed, the executive reaction needs to become: “This one, we can share it, and competition and many others will help us improve it.”

Open architectures and technology commons should be considered across industries as a strategic opportunity. They have the power to change an industry’s architecture and to re-shuffle competitive advantages. They are slowly but assuredly conquering the telecom and automotive sectors. With connectivity penetrating every market, they will continue to spread.


The impact of technology commons on industry architectures is twofold: it allows the decoupling of technology development from product development, and it creates frictionless supply markets. For many years, we have assumed that competitive advantages are created in the early stages of the innovation process. However, open development is rewriting the rules of competition. Technology development can be performed by ecosystems that bring many companies together. They create software and hardware designs that are tested, used and improved by anyone. Effective coordination of ecosystems, based on a radically transparent decision-making process, allows the most valuable and useful solutions to emerge. This creates, over time, a technology commons accessible to everyone. Technology development is becoming a pre-competitive, shared investment that helps scale innovation and establishes standards. This drastically reduces development costs through the re-use of past designs and through collaborative new developments.

This also creates frictionless markets by removing industry bottlenecks and diminishing entry barriers, bringing the functioning of the market closer to a state of pure and perfect competition, where technologies can be easily sourced from multiple suppliers.

The Open Compute Project illustrates all these points and shows how the data centre industry has been transformed over the past decade.

About the Authors

Hervé Legenvre

Hervé Legenvre is Professor and Research Director at EIPM. He manages educational programmes for global clients. He conducts research and teaches on digitalisation, innovation, and supply chain. Lately Hervé has conducted extensive research on how open-source software and open hardware are transforming industry foundations.  

Steve Helvie

Steve Helvie is currently the VP of Channel for the Open Compute Project (OCP). In this role he helps to educate organisations on the benefits of open hardware designs and the value of “community-driven” engineering for the data centre. He is an advocate of open-source business models and promotes the decarbonisation of the data centre across the OCP supply chain.

The post How Technology Commons Revolutionise Industry Foundations: The Story of the Open Compute Project (OCP) appeared first on The European Business Review.

Powering Costing with Artificial Intelligence: The case of Vodafone Procurement https://www.europeanbusinessreview.com/powering-costing-with-artificial-intelligence-the-case-of-vodafone-procurement/ https://www.europeanbusinessreview.com/powering-costing-with-artificial-intelligence-the-case-of-vodafone-procurement/#respond Tue, 08 Jun 2021 17:34:31 +0000 https://www.europeanbusinessreview.com/?p=116963 By Hervé Legenvre, Gavin Hodgson and Govind Khandelwal The present article describes how the Vodafone Procurement Company (VPC) has adopted Artificial Intelligence technology to boost its ability to perform “Design2Cost” and […]

The post Powering Costing with Artificial Intelligence: The case of Vodafone Procurement appeared first on The European Business Review.

By Hervé Legenvre, Gavin Hodgson and Govind Khandelwal

The present article describes how the Vodafone Procurement Company (VPC) has adopted Artificial Intelligence technology to boost its ability to perform “Design2Cost” and has achieved significant cost optimisation as a result. The adoption of digital technologies, including Artificial Intelligence, allows the automation of activities performed routinely by humans. This positively impacts the efficiency of these individuals and allows them, in the best case, to focus on more value-adding activities. At the same time, such technologies can be used to augment the work of procurement professionals by allowing teams to deliver more value. Augmentation typically supports complex and collaborative activities that were not systematically performed before.

How it started

The Vodafone Procurement Company was established in 2008 to serve as Vodafone’s centralised procurement hub. By 2017, VPC had reached a plateau in terms of maturity. Price negotiations were facing limits, leading the organisation to ask: what’s the next step? The company needed further opportunities to eliminate costs, but it also needed healthy suppliers who could invest in R&D. Procurement experts at Vodafone knew that they were most valued by their stakeholders when they brought not only savings, but also knowledge, facts and options to the table. Knowledge of the design and of the detailed costs of the products and services provided by suppliers plays a key role here. All this led to the idea of further investing in the integration of cost-analysis capabilities within the procurement team. So-called ‘design to cost’ allows the setting of an objective cost goal for a product by breaking the product down into sub-elements and assessing their respective costs and benefits. This allows products and services to be rethought, cost reductions to be achieved and the value delivered to be increased.


The Vodafone team performed benchmarks with automotive companies and looked across the telecom industry to assess existing practices. This provided a good basis to start a pilot, the results of which were very promising, and a business case was developed. The initial aim presented in the business case was to save 300 million euros over five years, but the stretch goal was to achieve cost reductions of 1 billion euros within this five-year period. So far, after three and a half years, over 50% of Vodafone’s global spend has been influenced by the team and hundreds of millions of euros have been saved, putting Vodafone well on track to achieve its stretch ambition. Over 250 pieces of hardware have been analysed to date and the pipeline is full for the coming six months.

Setting up a Design to Cost Lab

The project, branded Design2Cost, was launched at VPC in a temporary lab located in a meeting room. Following the success of the pilot, it was clear that more space was needed to perform cost teardowns, so Ninian Wilson, CEO of VPC, said: “Take the boardroom and build your lab in it. This is the best way forward. You need more space!” The Design2Cost initiative started small with a team of just five people, and today boasts 16 trained Design2Cost experts from a range of industry backgrounds, including electrical engineering, mechanical engineering and manufacturing. On the ground in the lab, the team focuses on hardware teardowns with the support of a team in India who do detailed analyses of the findings. For services cost teardowns, VPC’s category managers are trained in cost-analysis methodology, enabling them to perform their own costing activities with the support of the Design2Cost team, who coach them and validate their results.

The process and how it was developed

Vodafone’s Design2Cost process was developed by creating a series of in-house tools. The team developed a systematic seven-step process and invested in access to many sources of information on costs. From the start, the process was designed as a cumulative learning process. Right from the beginning, the team built a database that could be updated and re-used over time, and it is continuously refreshed. For instance, the data on the cost of labour are updated every quarter. So, if you have a supplier in a specific region of Vietnam, the database gives you the labour cost for the appropriate skill level, as well as energy and factory floor-space costs in that region. Many sources of information contribute to the database, and different sources are regularly compared so that it remains reliable.


The analysis process can be described as follows. Whenever a procurement project above a certain value starts, category managers must confirm they have considered a Design2Cost approach. When appropriate, the Design2Cost team starts by defining the scope and the context. This provides a basic understanding of the category and of the business issues. The team then tears down representative pieces of hardware and scans all the components using a proprietary, internally developed system. This scanning equipment takes hundreds of images that are combined into a single ultra-high-resolution image. Using a neural network, which is currently patent pending, software then recognises and defines the locations of components of interest. Other internally developed software then takes the images already captured, and markings extracted using Optical Character Recognition, to compare and match these components with the internal database built over the life of the lab. Up to 92% of electronic components on an unknown board are now automatically recognised in this way. Each item within the hardware is automatically allocated a cost thanks to the lab’s database. If the information is not already in the database, the team creates a new cost model or looks for external cost information, always seeking multiple sources to ensure reliability. From this, the team performs all the analysis using its software. The final cost model is completed by incorporating the value-adding steps as well as overheads, R&D and any other third-party costs. This allows the team to produce the information kit for category managers and support them as needed. The process is outlined in Figure 1.

Figure 1: The Design2Cost analysis process

The information kit provided at the end of the process is comprehensive, offering a complete overview of the teardown. It includes what the team calls a ‘clean-sheet’ costing: the lab’s own version of the costing. The team also decomposes the costs by subassembly and by component, and offers detailed views on labour impact. The lab can also provide competitive comparisons by doing a teardown on competitors’ products, and it suggests levers together with their cost impact.

Typically, the info kit provides category managers with a list of design optimisation opportunities – levers that can be used to reduce costs. These can run to 20 or more opportunities, but the actionable number depends on the maturity of the product. This provides options. It is then down to the decision makers who specify the needs, and to the supplier, to see how this can be taken forward.

To create this process and the system that supports it, the team created a ‘garage innovation’ environment. Orlando Grigoriadis, the lab’s AI specialist, described the development: “This started in my kitchen, I was putting together all the elements for scanning hardware together at home. Everything was developed in house, and sometimes really in my own house! This is often the case with machine learning, you can start with some algorithms that are available on open source but then, at some point you need to make it work for you.” Before this, the team was using magnifying glasses and manual tools; they were looking at every single component on the board and identifying everything by themselves. Now they are teaching the machine to recognise all components so that everything is automated. After the picture is assembled, the components are identified and the data is aggregated automatically.
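
To give a feel for what such automated matching might involve, here is a small, purely illustrative Python sketch that matches OCR-extracted component markings against a cost database and rolls them up into a rough clean-sheet estimate. The markings, costs and overhead assumption are invented and do not represent Vodafone’s actual tools or data.

```python
# Illustrative sketch: match OCR-extracted component markings to a cost
# database and roll up a rough clean-sheet estimate. All data are invented.
import difflib

cost_database = {  # known part marking -> unit cost in euros
    "TPS54331DR": 0.42,
    "STM32F103C8T6": 2.10,
    "W25Q64JVSSIQ": 0.65,
}

ocr_markings = ["TPS54331DR", "STM32F103C8T", "W25Q64JVSSIQ", "UNKNOWN-PART"]

def match_component(marking, cutoff=0.85):
    """Return the closest known part marking, or None if no good match."""
    matches = difflib.get_close_matches(marking, cost_database, n=1, cutoff=cutoff)
    return matches[0] if matches else None

matched_cost, unmatched = 0.0, []
for marking in ocr_markings:
    part = match_component(marking)
    if part:
        matched_cost += cost_database[part]
    else:
        unmatched.append(marking)  # would trigger a new cost model or external search

overheads = 0.15  # assumed share for labour, overheads and R&D in this toy example
clean_sheet = matched_cost * (1 + overheads)
print(f"Matched component cost: {matched_cost:.2f} EUR")
print(f"Unmatched markings needing manual work: {unmatched}")
print(f"Rough clean-sheet estimate: {clean_sheet:.2f} EUR")
```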

These developments have had a tremendous impact on the team’s own productivity and performance. The time it takes to do a complex costing has been reduced from 30 to 10 days thanks to the AI and machine learning tools that were developed as part of setting up the lab. Today, the neural network technology developed by the team is registered as a patent.

Figure 2: Sources of cost data used by the Design2Cost team

Wide access to external data is also key to this approach. Figure 2 describes the sources of data used by the team. All these sources are needed to do a cost teardown quickly and effectively. When the team does not have access to a cost within its existing database, team members start to look at a wide range of external information. By looking at multiple sources, the information is triangulated. The lab has used thousands of sources of data to get information on more than 20,000 components. The level of accuracy can now exceed 90% thanks to the machine learning algorithms. But the team still plays Sherlock Holmes sometimes: they dig deeper into technical papers to understand some of the unknown costs, and they exchange and work with suppliers to better understand others. However, this is less and less required as information is accumulated and updated on an ongoing basis.

The benefits to the company

Since the lab was established, the team has performed detailed teardowns and cost analyses on hundreds of complex electronic and electromechanical products and services, including x86 servers, remote radio units, customer premises equipment and various types of deployment, marketing, call centre, IT and other services.

The impact of these activities has been significant: using the lab’s results, category managers have achieved a step-change in the quality of discussions with suppliers. They have scrutinised vendor cost structures at the most detailed level, uncovered hidden margins, and opened up joint cost-reduction and value-creation opportunities with suppliers. Sometimes this has helped uncover inefficiencies due to years of specifications piling up on one another. Many examples show double-digit cost improvements.


A category manager interviewed suggested that the benefits are more significant when there is a competitive tender at stake. When representatives from a supplier know they are in a dominant position, it is more challenging to get their team to be open and discuss detailed design. But he also highlighted that benefits on services can be significant. Indeed, the Design2Cost team performs analyses of services in the field: they map processes, analyse them and spot improvement opportunities.

To summarise the benefits: first, the lab and its Design2Cost process provide new insights that can be used in commercial negotiations. Second, they offer a full understanding of designs and of their impact on cost and customer preferences, creating options for the company to choose what is essential and what is not critical from a customer standpoint. This allows the value of every component to be challenged. For instance, the team in the lab highlights on a recurring basis that black paint on a board that nobody will see is “vanity, not value”. Finally, this allows in-depth, fact-based conversations about the opportunities to jointly optimise the end-to-end value chain with Vodafone’s partners. This is key to opening the door to more collaborative and innovative relationships with suppliers.

How was the change implemented?

Implementing such an initiative requires a change in mindset. On the supplier side, not all were open to discussing and sharing their costs. The CPO and CTO of Vodafone sent a letter to all key suppliers’ customer account teams and CEOs, explaining that the Design2Cost Lab had been established and that it would lead to new ways of working. Also, from the start of a project with a supplier, an evaluation of their openness is carried out. This is important to create the right relationships, and it forms part of the tender evaluation criteria. The team has seen some suppliers coming to them with services or products on which they lose money and offering to work together to see how this could be improved. New entrants in the market often want to work with VPC’s Design2Cost team and are open to discussing cost. Incumbent suppliers are more reluctant, but in a competitive context or when the process brings real, tangible opportunities for both sides, progress has been made. Perseverance and consistency are essential here.

On the internal side, a lot of time has been invested in explaining the benefits and the way it works. This is perceived as an opportunity for procurement teams to be empowered with more information and knowledge, so it really helps them make progress with some vendors.


Looking into the future

As the team progressed, it realised that additional benefits could be unearthed from the data collected and the assets it had created. For instance, some of the data can be used to map sources of components and react to any disruptions or bans. There is also the possibility to offer a Purpose-led Design2Cost capability. This means identifying impacts on the environment and society as part of the exercise so further feedback can be provided to suppliers. This can help determine whether refurbishing a product is an option, can help eliminate plastic, and serves as a good basis for understanding CO2 emissions. It can be used to make the necessary total cost calculations for implementing circularity. Finally, one option for the team is also to monetise its capabilities outside Vodafone, offering them to other telecom operators who cannot invest in such a lab. So, the future is looking bright for Vodafone’s Design2Cost Lab.

All this demonstrates that technology often makes an organisation more efficient thanks to automation. But when technology augments the work of the best professionals, the benefits are far greater. This requires maturity and, for many organisations, a closer look at where they invest.

About the Authors

Hervé Legenvre

Hervé Legenvre is Professor and Research Director at EIPM, an Education and Training Institute for Purchasing and Supply Management. He manages educational programmes for global clients, conducts research and teaches on innovation and purchasing transformation. Hervé is the author of the book “Fifth Generation Purchasing”.

Gavin Hodgson

Gavin Hodgson manages Vodafone’s hardware teardown lab. He has two decades of global experience in procurement and supply chain management, and holds degrees from Cambridge University and KEDGE. He is currently working on solutions to assess hardware CO2 footprints by using teardown insights.

Govind Khandelwal

Govind Khandelwal is Head of Core, Software & Transmission Technology Procurement and Design2Cost Lab at Vodafone. Govind is a strategic and business-focused senior international telecommunications industry executive with more than 20 years of supply chain experience, gained in India and Europe.

The post Powering Costing with Artificial Intelligence: The case of Vodafone Procurement appeared first on The European Business Review.

Business and Governments and Civil Society and a Healthy Planet https://www.europeanbusinessreview.com/business-and-governments-and-civil-society-and-a-healthy-planet/ https://www.europeanbusinessreview.com/business-and-governments-and-civil-society-and-a-healthy-planet/#respond Fri, 25 Sep 2020 00:01:51 +0000 https://www.europeanbusinessreview.com/?p=101728 By Sylvain Guyoton and Hervé Legenvre The following discussion took place In April 2020. We aimed at confronting our points of views on the contradictory forces that needs to be […]

The post Business and Governments and Civil Society and a Healthy Planet appeared first on The European Business Review.

By Sylvain Guyoton and Hervé Legenvre

The following discussion took place in April 2020. We aimed to confront our points of view on the contradictory forces that need to be handled to address the sustainability challenges ahead of us. The key ideas are summarised in Figure 1.

Hervé Legenvre (HL) – The equation is becoming clearer every day. The planet can no longer sustain the present rate of economic and demographic growth without significant damage. Our ability to enjoy a safe and prosperous life is being eroded by constraints in resources and environmental conditions. Our world today is interconnected and uncertain, so even if we put all of our efforts and attention into addressing a single challenge, unintended and unexpected consequences can create other challenges and issues. An unknown virus pops up in a wet market in the centre of China and quickly spreads around the planet. Unless we re-invent how we operate both in business and as a society, the limits we are experiencing will constrain and de-scale our economic activities. Whether we call this prevention or adaptation, change is needed. We are facing a tremendous challenge with a multitude of contradictory forces. Social and environmental aspirations offer contradictory imperatives. Personal, corporate, and political agendas conflict. Tensions are growing across geographies and generations. These contradictory forces are also very visible within industries, supply chains and organisations.

Sylvain Guyoton (SG) – Even if the diagnosis is clear and daunting, we need to stress that we may actually have reached a positive inflexion point. In August 2019, 180 of the top US CEOs who are members of the Business Roundtable declared that shareholder interest cannot be the sole purpose of a corporation. They agreed that investing in people, protecting the environment, and being fair with suppliers necessitates commitment. Furthermore, the Financial Times recently asked the question, “Does capitalism need saving from itself?” And the Dean of INSEAD, Ilian Mihov, stated that “it’s time to rethink capitalism”, otherwise we will be facing enormous challenges. According to John Elkington’s new book, Green Swans, a profound market shift may be emerging. Even though we are still very much at the stage where principles are proclaimed and there are lots of unknowns, in a few years we may look back at today and say that this was the year something started to move in the right direction.

 

HL: Unfortunately, there are numerous examples of economic actors still operating the old way, and this may not leave enough room for new ways of thinking to scale up, not to mention that environmental degradation and increasing social inequalities may lead to populism and severe social conflicts.

SG: Yes. However, when a system is in transition it carries the old way of thinking while the new way is still being implemented. I agree that this creates massive tensions. I see four types of macro tensions at play here. First, you have the tensions between financial, environmental, and social concerns. This one in particular is experienced by both corporations and governments. The second one relates to everyone’s experience. All of us, as individuals, aspire to enjoy the present days, but we fear the future. We hope something will change but we feel paralyzed. Third, technology is seen as both a solution and a problem. Some people accuse technology of being the cause of our troubles while others believe the next wave of technologies will solve many of our problems. Finally, while corporations have focused for years on creating higher added value, they have created complex global value chains. Consequently, they have externalized risks, which sometimes backfire. Now they want to take back ownership of these risks and as a result, a tension between local and global has emerged.

HL: We are witnessing similar tensions within corporations. Let me describe four of them. First, any decision ends up being an arbitrage between financial outcomes and environmental and social impact. It is very difficult for decision-makers to break through this logic. Money always comes first; other concerns are adjusted against the financial gold standard. Second, most decision-makers pay full attention to compliance matters while only a few are pushing for a social and environmental performance improvement agenda. They feel accountable for a compliance breach, so the logic of protection dominates, and the logic of progression comes second. Third, I keep seeing competing forces between risk and innovation within companies. However, innovation and risk go hand in hand. Implementing concepts such as the circular economy can require a significant entrepreneurial mindset, yet large corporations remain fundamentally conservative. Finally, on a human level, personal ethos and company policies and actions can clash. This can create embarrassment, demotivation, misunderstanding and even implementation failures.

SG: We need to keep in mind how the social and environmental agenda has unfolded over time. Looking back at the 80’s, concerns about social and environmental considerations were very limited. This was not on the corporate radar screen except in very specific industries. Then, with globalisation, concerns started to rise and became public; we moved into the era of corporate social responsibility (CSR). However, decisions were still driven primarily by financial parameters. This was not the corporation’s top priority. It was the era of greenwashing. Then CSR became more serious and it aimed primarily at managing risks and mitigating negative impacts on stakeholders. Today, we need to move towards a regenerative economy that creates a fair and inclusive society while restoring and protecting natural ecosystems. We need a CSR that brings positive impact to BOTH corporations and society as a whole. We need to move towards a BOTH/ AND logic rather than an EITHER/OR logic.

HL: Adopting a BOTH/AND logic is demanding. After so many years of focusing on short-term financial parameters, re-inventing the decision-making process is challenging. And let’s not forget all the forces of inertia. For instance, within companies we have an internal division of labour: finance, engineering, purchasing, quality, and corporate social responsibility teams, all with conflicting goals. They tend to operate within a context of a hidden but sometimes harsh power dynamic that prevents common goals from emerging. Also, we see how performance measurements and incentive systems make it even more difficult to adopt demanding and contradictory common goals. Tricking the target system is a corporate sport. Some people are very skilled at it and are happy to live with this. Doing something that does not fully contribute to your own objectives is often seen as an extremely brave behaviour in companies. A lot of things need to change to implement a BOTH/AND logic.

SG: Yes, and you have the current multiplication of transformation initiatives across companies. Mandates to improve or change are coming from every direction. People are getting lost. They don’t see the bigger picture. Creative chaos and agile ways of working were supposed to make companies more efficient, but I am not that convinced today. When people come to me with new ideas, I tell them “yes of course you should try it!”. But trying one idea after another is insufficient. We need a sense of direction. The act of making a decision needs to be powered by a creative, holistic and long-term thinking approach, in other words a purpose. This is the pre-requisite to move into a BOTH/AND decision-making logic.

We need both cost reduction and a better social and environmental performance. This requires us to frame problems as paradoxical directives and to broaden the search for solutions.

HL: I see your point, but implementing a BOTH/AND economy is a grand vision that requires very practical approaches. First, every decision needs to be backed up by all the data that is necessary to understand financial, environmental and social impacts. This is essential and a lot of progress can be made on this. Then, there are three ways to move from an EITHER/OR logic to a BOTH/AND one. Let’s consider contradictory forces A & B. The first way to handle these contradictory forces is to break down priorities over time. First you do A and then you do B. But this creates a risk that environmental and social aspects will come in second far too often. The second way to handle conflicting forces is to separate the activities. We do A on one side and B on the other side. We invest in a very sustainable activity while we keep running the old-world activities. In both cases, what matters is to make sure that an overall positive trajectory is taking shape. The third option is to recognize the tension and seek BOTH A and B. This requires creative and collective thinking that breaks through existing perspectives. We need both cost reduction and a better social and environmental performance. This requires us to frame problems as paradoxical directives and to broaden the search for solutions. To do this, people from different functions need to take time to understand each other’s concerns and aspirations. But they also need to explore multiple options simultaneously before making a final decision. Doing a bit more of B on top of A is not enough.

SG: But when you talk about a “positive trajectory”, you realize that such a decision-making process needs to be supported by a vision that defines the direction of the trajectory. In an AND economy we need to communicate the need to achieve BOTH A & B continuously. This vision also needs to reconcile the short- and long-term perspectives. One of the best ways to do this is to bring the stakeholders into the decision-making hub so that they can input into the company ambitions and feedback on progress. When necessary, we should recognize that it is difficult to create both A & B but also explain why we need to reconcile the conflicting perspectives. In my organisation, we try to do this by defining four pillars: customers, employees, society and the planet, and shareholders. We map the impact of our main project onto these pillars. They sometimes appear contradictory, but in the end, it’s very easy to see that these pillars can also complement each other because they are unified by a positive “raison d’être”, a kind of rallying cry that will last in the long-term. In other words, what brings them together is stronger than the contradictory forces that may be playing out in some circumstances.

HL: For me, recognizing tension is more important than claiming a purpose. In Europe, in the 80’s, companies decided they needed both economic efficiencies and quality. A lot of progress was achieved at the time thanks to a problem-solving discipline and a focus on a few projects that reconciled these grand contradictory goals. In our context, I also see three macro-solutions that go beyond a single firm perspective. The first one is to always consider what it would take to localise and operate closed-loop activities. There are lots of long-term benefits from this change. The global approach can still be relevant for some activities. However, focusing first on localisation and closed-loop will force us to think differently. Second, we need more open data and open algorithms that aid decision-making. This needs to be built by region and by industry. Openly sharing information and letting anyone at all extract wisdom from it is very powerful. Third, we need more open architecture for products and services. This prevents lock-in situations and creates flexibility. It is a necessary condition for implementing circular economies and for regaining some degree of freedom in our thinking. Proprietary interfaces limit our possibilities.

The global approach can still be relevant for some activities. However, focusing first on localisation and closed-loop will force us to think differently.

SG: From the local-global perspective, I believe that although economies need to rebalance in favor of shorter supply loops, it would be a mistake to go 100% local. There are still lots of opportunities provided by global interactions. We can also leverage local experiments and replicate them. Take for example the Eden Project of Sir Tim Smit in England, a project aimed at showcasing and experimenting with sustainable environmental practices that is being replicated in other parts of the world. Local and global will remain two sides of the same coin whose tensions we must learn to better manage. Take for example the open data logic which is essential to finding tomorrow’s solutions; it’s a global logic not a local one. Global interaction will still be needed to foster collaboration. For instance, Scope 3 carbon emissions, which represent the majority of greenhouse gas emissions, will only be tackled through global industry collaboration. Finally, we also need new forms of governance, able to better associate external stakeholders with decision-making. And last but not least, if we want to transform the economy and make it a BOTH/AND economy powered by a BOTH/AND decision-making process, we also need to transform capitalism’s master discipline of Economics, as advocated by John Elkington.

Sylvain Guyoton is Senior Vice-President of Research at EcoVadis since its founding in 2007. He brings 20 years of experience in Sustainability and CSR. At EcoVadis, he oversees rating operations and methodology development, and sits on the Executive Committee. Sylvain holds an M.S. in Industrial Management from Cranfield University and an MBA from INSEAD. 

Hervé Legenvre is Professor and Research Director at EIPM, an Education and Training Institute for Purchasing and Supply Management. He manages educational programmes for global clients, conducts research and teaches on innovation and purchasing transformation. Hervé holds a PhD from Université Paris Sud.

The post Business and Governments and Civil Society and a Healthy Planet appeared first on The European Business Review.

The Procurement Call for Agile, What does it mean? https://www.europeanbusinessreview.com/the-procurement-call-for-agile-what-does-it-mean/ https://www.europeanbusinessreview.com/the-procurement-call-for-agile-what-does-it-mean/#respond Mon, 25 Nov 2019 14:50:40 +0000 https://www.europeanbusinessreview.com/?p=87590 By Andressa Reis and Hervé Legenvre Today, Procurement teams need to embrace the agile ways of working in order to work in synch with their stakeholders. The present article describes […]

The post The Procurement Call for Agile, What does it mean? appeared first on The European Business Review.

By Andressa Reis and Hervé Legenvre

Today, Procurement teams need to embrace agile ways of working in order to work in sync with their stakeholders. The present article describes how an agile vendor selection process has been adopted by a software procurement team. It describes its advantages, how it is performed and when it is relevant. This methodology can be adapted to other market domains if the mindset and the success factors are maintained.

 

Agile methodology has been rocking the IT world in recent years. Companies use it for developing software; it helps them to be more collaborative and effective when uncertainty prevails. The Agile transformation is now calling for change in the Procurement function as the standard supplier selection process does not fit the agile mindset. The present article will outline what agile means for procurement; how a software procurement team has embraced agile to select suppliers for software packages like Microsoft Power Apps; and what the success factors for agility in procurement are.

 

The key for strategic agility is to recognise external changes early enough and to allocate resources to adapt to these changing environments.

What is Agile?

The word agile appears everywhere today. On a day-to-day basis, being agile is often associated with short daily stand-up meetings where people share feedback on projects and prioritise their next steps. Agile project management is an iterative development process, where feedback is continuously gathered from users and stakeholders to create the right user experience. This is very common for software development. Developers focus on producing manageable chunks of projects; they update their priorities daily, thanks to the feedback provided to them. Different methods can be used to perform an Agile process; these include Scrum, eXtreme Programming, Lean and Kanban. Agile project management metrics help reduce confusion, identify weak points, and measure a team’s performance throughout the development cycle. Supply chain agility is the ability of a supply chain to cope with uncertainty and variability in supply and demand. An agile supply chain can increase and reduce its capacity rapidly, so it can adapt to fast-changing customer demand. Finally, strategic agility is the ability of an organisation to change its course of action as its environment is evolving. The key for strategic agility is to recognise external changes early enough and to allocate resources to adapt to these changing environments.


Why must Procurement become Agile?

For many market segments such as software, anticipating the needs of our stakeholders and delivering the right value to each of them is an ongoing challenge. Over the years, Procurement teams have moved well beyond placing orders and helping to finalise contracts. They support the business. They deliver value. They help to gain new competitive advantages. Savings are no longer the sole focus and unique yardstick of Procurement teams. They need to bring speed and expertise, and to actively manage demand. Moving towards an Agile vendor selection process is one of the steppingstones that can help procurement teams with this. Mirko Kleiner, Agile Enterprise Coach and co-founder of Flowdays, pioneered the introduction of agile to Procurement by implementing Lean in the supplier selection process. He describes it as transforming months into days, wants into needs and pain into fun. The traditional supplier selection process we use today remains useful in a predictable and easy-to-understand world. Procurement teams can focus on exploiting what they know about the needs and the market, to come up with the best and most economical solution. However, such a process is becoming less relevant as companies are changing at a rapid pace. They become more digital and employee centric. Solutions evolve fast and new ideas pop up across global and dynamic business ecosystems. In this context we need to re-invent the supplier selection process to rapidly explore opportunities. By making the supplier selection process agile, we can investigate new ideas while strengthening our relationship with “business owners”. We can create more value while simplifying our ways of working. We can gain efficiency and speed while being more collaborative. And when we do it well, everyone gains motivation and engagement is reinforced.

The following diagram explains the core principles behind implementing an agile vendor selection process. It works well when requirements are difficult to formalise due to the diversity of users and the variety of solutions available. To address this unpredictability, we need face-to-face interactions and teamwork with internal stakeholders and representatives from suppliers. Collaborating is the only way to generate feedback and transparency, and to ensure adjustments are rapidly taken on board as people learn from each other. Working like this is an opportunity to simplify how we work; the quality of the discussions across all parties involved helps to focus on the essentials and to spot issues early. This creates speed and efficiency as the team progresses at a steady pace with deep engagement. It also generates motivation as people feel safe to speak and enjoy working in an environment that fosters engagement. All this reinforces the spirit of collaboration and creates speed, value and satisfaction.

 

Agile Vendor selection: How does it work?

In an Agile vendor selection process, the same steps are followed, but these steps are performed with a different mindset and approach. A dedicated team with full decision-making power is established from the start of a project. Each step of the process becomes a sprint performed in a collaborative mode. Responsibilities are clear from the start and everyone has visibility on what takes place from the beginning to the end. Face to face team meetings enable creativity, efficiency and rapidity. And as the Team members complete the tender process in about 10 weeks, they can focus on the project during this short time period.

The agile vendor selection process can now be described:

Step 1: Preparation

The team consists of representatives from the business, Procurement, IT technical team, Data Protection Officer, Legal department, etc. During the preparation stage, responsibilities are assigned to them. A project plan is developed to ensure each responsibility is performed with the right mindset. The development of a team charter brings strategy alignment and builds the foundations for effective teamwork.  Requirements are expressed as user needs. They are not technical specifications, but “personas” that describe the users and “user journeys’ that illustrate a desirable user experience. One of the business owners within the Core Project team described the benefits of this: “This process is pragmatic and focused on the user needs. Before we used to describe in detail all the requirements as they came into our heads, now we don’t spend hours writing detailed requirements that were rarely fully considered by the suppliers”.

 

Step 2: The invitation letter

The suppliers receive an invitation letter requesting their participation in the agile vendor selection process. This letter outlines the five Ws and one H of the tender (Who? What? When? Where? Why? How?). The user needs are communicated with the invitation letter. Suppliers are asked to focus on how they can integrate them into their systems.

 

Step 3: Briefing and Q&A call

Following the invitation letters, suppliers are invited to a conference call to further explain the Agile vendor selection process. The Core Team presents the goals of the project, the project team, the agenda of the Agile vendor selection event and the needs. The “User Journey” presents both the touch point where humans meet and the virtual interactions. Everything is summarised in one page. This helps everyone to be concise and to focus on the key points.

 

Step 4: The Agile vendor selection event

A two-day event is organised with all the suppliers, the agile vendor selection team (also called Core Team) and some end-users. This event is an opportunity to stimulate competition amongst suppliers, while favouring collaboration with them. It is also a great opportunity to have a “live” presentation of the solutions and direct exchanges with the suppliers. The agenda of the Agile vendor selection event consists of:

  • A plenary session to present the event and its goals
  • A description of the needs, the “Personas” and the “user journey”
  • Suppliers are given some time to adjust their solutions. They can call the Core Project team to ask additional questions.
  • Individual sessions with each supplier take place. Members of the Core Project team evaluate the suppliers on technical and business requirements, and on commercial, legal, data privacy, security and implementation matters. The focus is on how to be successful together. Having all the suppliers and the internal subject matter experts in the same room helps avoid misunderstandings and vague answers to questions.
  • Individual final presentation of the offers to the Core Project team by each supplier.
  • During these events you capture the real creativity of the suppliers. This prevents copy-paste solutions. One of the Core Project team members we interviewed for this article is very keen on the agile vendor selection event: “There is no misinterpretation thanks to the prototypes and the exchanges. We can perform on-the-spot clarifications if needed. As a point is mentioned by one supplier you can easily check with another. This helps solve problems fast. It is very powerful”. The Legal teams also appreciate having breakout sessions with each supplier. This helps to gain speed on the contract side, as related blocking points are discussed on the spot. It also helps to understand if the negotiation will be demanding or not.

 

Step 5: Evaluation

Following the presentation of the offers during the agile vendor selection event, the Core Project team develops a consensus on the findings and evaluations. Some calls to reference customers provided by suppliers can be placed to validate key assumptions.
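As a purely illustrative sketch of how such a consensus could be recorded, the snippet below averages team members' scores per criterion and applies weights. The criteria, weights and scores are hypothetical assumptions for the example, not the Core Project team's actual evaluation grid.

WEIGHTS = {"business fit": 0.35, "technical": 0.25, "commercial": 0.20,
           "legal & data privacy": 0.10, "implementation": 0.10}

def consensus_score(scores_by_criterion):
    # scores_by_criterion: {criterion: list of 1-5 scores from team members}
    total = 0.0
    for criterion, weight in WEIGHTS.items():
        scores = scores_by_criterion.get(criterion, [])
        if scores:
            total += weight * (sum(scores) / len(scores))
    return round(total, 2)

vendors = {
    "Vendor A": {"business fit": [4, 5], "technical": [4, 4], "commercial": [3, 3],
                 "legal & data privacy": [4, 4], "implementation": [4, 3]},
    "Vendor B": {"business fit": [3, 3], "technical": [5, 4], "commercial": [4, 4],
                 "legal & data privacy": [3, 3], "implementation": [3, 3]},
}
# Rank vendors by weighted consensus score, highest first
for name, scores in sorted(vendors.items(), key=lambda kv: -consensus_score(kv[1])):
    print(name, consensus_score(scores))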

 

Step 6: Contract Negotiation and Final Offer

This step enables the conclusion of contractual negotiations, building on all the discussions that have already taken place during the Agile vendor selection event. This is the opportunity to negotiate the supplier’s final offer.

 

Step 7: Award

Following the negotiations with the suppliers, the project team selects the contract awardee. Feedback is shared with the other suppliers who participated in the vendor selection process.

 

This Agile vendor selection process is first a mindset change. The Team members lead together the tender thanks to the support of a Procurement facilitator. They are empowered, equally considered and they feel safe to intervene.

The benefits of an Agile vendor selection process

This Agile vendor selection process is first a mindset change. The team members lead the tender together thanks to the support of a Procurement facilitator. They are empowered, equally considered and they feel safe to intervene. It also creates a high degree of engagement and ownership. The Agile vendor selection process helps gain considerable speed. The team focuses on the major commercial aspects that matter, and low-importance details are put on the back burner. This tremendously improves efficiency and speed. Agile vendor selections also provide transparency for both Core Project teams and suppliers on the needs and the decision-making process during the call for tenders. This transparency stimulates competition between suppliers as they are physically face to face to win the project. By adopting an agile approach, internal stakeholders perceive a heightened value delivered by the procurement teams. The following cases illustrate when and how the agile vendor selection process has been used to select software vendors for a large company.

 

Case 1 – Real time Employee Performance Management Solution

First, the Agile vendor selection process was applied to select a Software Vendor for Real Time Performance Management. Four vendors were invited to this process. Their size, scope and solutions were quite different from each other. During the Plenary Session of the agile vendor selection event a Business HR Executive presented the business view of the current problems, existing processes and expectations around the solution. One of the vendors invited was an internal team that had developed its own solution; however, it was considered as just another vendor in the selection process and was given no preferential treatment. The Core Project Team and the suppliers saw this project as innovative, fun, motivating and collaborative. One of them concluded: “Without the agile vendor selection event, we would have received a lot of nice slides, but slides don’t tell you what it is like to work with the vendors. With such a project you need to be sure you can work in a collaborative mode with the supplier.”

 

Case 2 – Asset Management Solution

The second case is the purchase of an Asset Management solution. The agile vendor selection event took place in an office in India since key decision stakeholders were based there. The team members were engaged from the beginning even though internal workshops were carried out remotely. Due to the complexity of the case and the number of suppliers invited, the event was performed over three days. Seven vendors were invited and were extensively questioned on the architecture of their solution and on their implementation strategy. They had to perform a demo using the “personas” and “user cases” shared with them in advance. The three days enabled the selection of the right solution, one that matched the requirements, and the creation of a straightforward relationship with the vendors. One team member was enthusiastic: “the event was memorable; the vendors were surprised. It was great to discuss with them and to see their reactions. Vendors did great work. It was motivating on both sides. I was impressed by the change procurement had brought to the process”.

 

Case 3 – Procurement Source to Contract Solution

In this case, the Agile vendor selection process was used as a second round of a request for proposal that was not providing the expected outputs. The first round of the procurement project had covered all the standard process steps of sending the written requirements to the vendors and receiving their formal responses with lots of documentation to evaluate. The outcome was unsatisfactory, and the Agile approach was adopted to rebuild the interest of the suppliers and reduce the full cost of the software adoption. The three finalists from the first round were invited to the event. They had the opportunity to exchange with Core Project team members and to align on multiple fronts including technical, business and commercial aspects. Best and final offers were presented during the second day of the event. The Core Project team could see that vendors had aligned their proposals to the information received and had decreased their prices. A considerable amount of misunderstanding about the needs had been cleared up thanks to the event. The intimacy established with the vendors contributed to better contract negotiation and a more collaborative finalisation of the project. One of the Core Project team members summarised it: “We were able to demonstrate that competing vendors could be in the same event. The energy in the room was remarkable. When they shared the results, it was good to see they understood the requirement.”

 

Agile vendor selection: Four success factors

Building a great team

Throughout an agile vendor selection, the right people need to be involved from beginning to end otherwise the process can easily derail. Everyone within the team needs to be fully engaged and to fulfil their role. It requires business owners with the right expertise and the power to take decisions. The Procurement Project Manager is there to orchestrate the work of the team and to promote the agile mindset.

 

Diversity and attitude

An agile vendor selection brings together a diverse set of skills capable of tackling the complexity and the uncertainty of the project. Team members need to be proactive, open minded, capable of challenging their initial assumptions and flexible enough to tackle issues as they emerge. They are guided throughout the process, but they need to be at ease with working in an informal process.

 

Company readiness

Company readiness is essential to succeed with an agile vendor selection; you need a supportive business sponsor. An agile vendor selection process is not suited to a risk-averse culture. Everyone should be ready to embrace changing requirements and to transform mistakes into opportunities to do better.

 

Logistics

The logistics of an agile vendor selection event are essential, the facilities need to be right. Performing it in an innovation centre with the support of facilitators is ideal. It is demanding in terms of organisation as it is essential to ensure that everyone attends the full event.

 

When Is an Agile vendor selection the right approach?

Not all RFPs can be performed using an Agile methodology. The following picture can help assess when it is best suited. The agile vendor selection process is well suited when there is a high degree of novelty in what is purchased, but a limited set of risks and stakes associated with the purchase.

The Agile vendor selection process is particularly suited to projects that are inherently low risk and that do not involve highly technical and demanding requirements. The agile vendor selection team needs a good knowledge of the problem to be solved, and the suppliers need to be able to rapidly demonstrate an effective response to it. When higher stakes and risks are at play, and the solution is easy to develop, a classic tender process performed in a systematic way is best suited; if the problem is hard to define and the solution very new, a comprehensive open innovation process with multiple workshops is required to stimulate creativity and ensure solid validations.
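The fit described above can be summed up as a simple decision rule. The sketch below is one possible reading of that logic, with illustrative labels rather than a formal rule taken from the figure.

def recommended_approach(novelty, stakes_and_risk):
    # novelty and stakes_and_risk are either "low" or "high"
    if novelty == "high" and stakes_and_risk == "low":
        return "agile vendor selection"
    if novelty == "high" and stakes_and_risk == "high":
        return "open innovation process with multiple workshops"
    if novelty == "low" and stakes_and_risk == "high":
        return "classic tender process, performed systematically"
    return "classic tender process"

print(recommended_approach("high", "low"))  # -> agile vendor selection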

 

The Agile vendor selection process is a great way to meet a small number of suppliers in a short amount of time. This avoids all the emails; the long-winded conversations; the full process can be completed in less than two months.

Conclusion

The Agile vendor selection process is a great way to meet a small number of suppliers in a short amount of time. This avoids all the emails and the long-winded conversations; the full process can be completed in less than two months. It makes the selection fair as you can collect comparable information across vendors on the spot. It provides an objective view of which one is the best and leaves no room for political manoeuvres. Nothing is hidden. You see the limits and the future perspectives of each of the suppliers. The quality of the decision-making is enhanced, and it creates great conditions for a quick start after the tender is completed. This helps to gain time for everyone. The requirement collection phase avoids going into details that don’t add value. The supplier assessment can be performed without re-assessing consistency along disjointed flows of information. It also saves time as you can quickly start a collaborative project with the supplier. But most importantly, it is a fun and enjoyable team experience where everyone ends with a great sense of achievement. The CPO of Capgemini, Emmanuel ERBA, sees the agile vendor selection process as a great asset for his organisation. It is for him a means to address complex cross-functional topics: “Describing real life user journeys, confronting in an open way the service delivered by vendors and walking the participants from several functions through a process is a rapid and valuable selection process. Obviously, it won’t replace hard commitments, but it will bring an organisation more efficiently towards the most viable solution in a given environment.”


About the Authors

Andressa Reis has worked as an IT Procurement buyer in several industries over the past 7 years. She has a Master’s in Computer Science for Business Management (Brazil) and works as a Global Category Buyer in Heineken Global Procurement (Netherlands), specialised in Software and IT purchases. She developed the Agile Vendor Selection process in 2018 while completing her Purchasing Manager certification at EIPM.

Hervé Legenvre is Professor and Research Director at EIPM, an Education and Training Institute for Purchasing and Supply Management. He manages educational programmes for global clients, conducts research and teaches on innovation and purchasing transformation. Hervé holds a PhD from Université Paris Sud.

The post The Procurement Call for Agile, What does it mean? appeared first on The European Business Review.

Mapping and Strategising Across Business Ecosystems https://www.europeanbusinessreview.com/mapping-and-strategising-across-business-ecosystems/ https://www.europeanbusinessreview.com/mapping-and-strategising-across-business-ecosystems/#respond Fri, 10 Mar 2017 10:37:11 +0000 http://www.europeanbusinessreview.com/?p=27268 By Hervé Legenvre and Isabelle Herbet Business ecosystems are the new unit of analysis for strategic thinking; they offer fertile grounds for innovation. This article discusses how managers can map, […]

The post Mapping and Strategising Across Business Ecosystems appeared first on The European Business Review.

By Hervé Legenvre and Isabelle Herbet

Business ecosystems are the new unit of analysis for strategic thinking; they offer fertile grounds for innovation. This article discusses how managers can map, analyse and take advantage of business ecosystems. It is illustrated by a case study inspired by the work done at Groupe Seb, a leader in the Small Domestic Appliances and Cookware industry.

 

Mapping and analysing ecosystems is about identifying, testing and selecting options to create and capture value. It is about forming new hypotheses and defining how they can be tested and implemented.

Business strategies should be established based on how a business ecosystem is likely to evolve, not on what we think we can excel at. Mapping and analysing ecosystems is about identifying, testing and selecting options to create and capture value. It is about forming new hypotheses and defining how they can be tested and implemented. To achieve this, a new approach is needed that will help to navigate the complexity and uncertainties of the business ecosystem landscape.

With the ongoing transformation of the business landscape, many industry boundaries have drifted, blurred and changed. New players emerge while other business activities are unbundled. Value chains are continuously sliced and reshaped. Radical innovation and experimentation are led by communities of small disruptive players while cost competitiveness has to be built on the back of existing and emerging champions that leverage scale effect. In between, collaborations with integrators and technology leaders can remain essential to succeed. In this context, thinking in terms of industry and value chains can be misleading; the business landscape is best described as a continuously changing ecosystem where some relationships and collaboration need to be abandoned while others need to be strengthened, initiated or nourished.

Evidence for this is plentiful. The automotive industry has to re-invent itself with the development of self-driving car technologies and the rise of mobility services. For many years the key players tended to work within closed circles. But now they turn to new players to access technology and capabilities that are new to the industry. In other industries such as banking or logistics, startups are challenging established players without disrupting the industry. This leads established and emerging players to simultaneously compete and collaborate. Utility companies need to co-create smart technologies with industrial science leaders in order to offer new business models and services. In the current economic and business environment, companies have to reconfigure their ecosystem of clients, suppliers and partners to transform their value proposition, their business models and secure new competitive advantages.


Navigating a business ecosystem requires new frameworks, tools and checklists to map where the key players are and to make effective decisions. Executives and professionals have to confront a few fundamental questions. They need to define how they can strive for success in the future. They have to foresee against whom they might compete. And they should explore with whom they will build the needed partnerships and alliances. A great value proposition needs the right business model to deliver its full potential, scale up and capture profit. A great value proposition and a great business model can only flourish if a fertile ecosystem exists. Winning in times of change calls for a certain level of alignment between these three strategic lenses.

Since the late 1970’s, Traditional Strategy Frameworks have been widely used across business functions. Strategy teams look for entry barriers and attractive positions within an industry. Marketing teams identify where competition comes from and the most profitable market segments.  Procurement teams see where they can best leverage competition across suppliers and where they might have to face risks of dependency. However, to go further and address the full business ecosystem we have identified 4 key requirements that should help us to enrich the traditional analysis:

1. Adopting a broad perspective while diving when necessary: Focusing on the main lines of tension with customers, suppliers and competitors should not prevent us from zooming out and looking at the broader perspective to see where opportunities and threats are forming. When the broad view is secured, it is possible to look at more specific sections of the ecosystem.

2. Thinking collaboration: We need to move beyond looking mainly at who competes with whom and dedicate similar levels of attention to who collaborates with whom across the full ecosystem.

3. Being future-oriented: Analysing the current business game is useful but limited. Looking ahead to anticipate forthcoming transformations and to keep options open is of equal importance.

4. Searching for new sources of competitive advantages: Looking at how existing entry barriers currently canalise profit in certain directions, should be complemented by understanding how new business models, technology, data, patents or other specific business advantages could change the rules of the game in the future.

Mapping and strategising across ecosystems do not always require lengthy analyses. This needs a structured and systematic approach that can be used to sense and frame opportunities. It is best used as an iterative process that can be extended and refined over time. It should not be approached as a solitary, short and dense analysis that is quickly set aside. While the templates proposed here are a good basis for communication and decision making, the real value is in the quality of the interactions established to gather all the useful pieces of information together. In some instances, one person is able to access the key people who hold the relevant information to sketch a first set of hypotheses that can be enriched, challenged and tested. In other instances, multifunctional or even consortium teams might be needed. In such cases, the value includes the enriched perspective and mutual understanding gained by all players.

The rest of this article offers a six-step process that managers can use to map, analyse and take advantage of business ecosystems. It is illustrated by a case study on blenders. It is inspired by the work carried out at Groupe Seb, a leader in the Small Domestic Appliances and Cookware industry. The case is meant to illustrate the use of the six-step process. It is not meant to describe accurately how Groupe Seb approaches its business ecosystem.

The key steps proposed: (see Figure 1)

 

 

1. Define the scope

To start on the right foot, you need to clearly define the scope. This can include expressing specific business segments, market segments, and type of transformation you will investigate. In any case it is essential to state the orientation you want to base your ecosystem analysis on.

The SEB Group is a leader in the Small Domestic Appliances and Cookware industry. It has multiple brands and each brand has a specific positioning. One of the products present in more than one of the brands is a blender. A blender is used for food preparation; it mixes things together by liquidising, chopping or pureeing ingredients. The major components of a blender include the motor, the electronic components (PCBA), the glass jar, the blades and the material housing. In order to understand how the supplier network and the wider ecosystem could support the different product segments and brands, it is essential to understand and focus on how the brands try to differentiate themselves from each other.

  • KRUPS values: Precision (high cooking performance) – Reliability (robust design) – Passion
  • MOULINEX values: Intuitive (easy to use and clean) – Performing solutions (multi-function) – liberate desire

It was also decided that the focus should be on the European market as market dynamics differ throughout the world.

2. List key trends

Thinking in terms of trends before mapping ecosystems is very valuable. It helps to anticipate forthcoming transformations. It ensures that important future players that are not yet on the radar are more likely to be identified. Beginning with megatrends is a good starting point. Many companies have already looked at the impact of the megatrends on the business as part of their strategic planning process. Then it can be useful to survey the business trends that can affect the company and its ecosystem. A simple but useful checklist consists of expected changes in terms of economic conditions, customer needs, market evolutions and technology development. Finally, evolution of the political, legal or environmental context can be considered. Cross-functional work and interviews can be used to ensure that important trends are identified. They should be precise enough to foresee their possible implications for the company and its ecosystem. For Groupe SEB and the analysis of the blender ecosystem, the following trends were identified:

  • The European market is mature and therefore the growth potential is low
  • Omni-channel retailing (connection between stores, e-commerce, apps and social media) is becoming pervasive across Europe
  • Cooking is becoming a valued life experience. TV series such as MasterChef play an important role in driving this change
  • Couples and individuals have active lives. They value everything that is fast, efficient and easy for daily cooking. However, they enjoy more sophisticated and elaborate cooking on occasion (friends, family…)
  • Touchscreens are increasingly used for high-end domestic appliances
  • New technologies (electronics, sensors and actuators) are being integrated in the final product
  • Healthy food as well as simple, nice and tasty cooking is more and more valued by consumers. This includes smoothies and whole-fruit juice
  • There is a growing tendency to offer connected products to consumers
  • European standards & norms are strict about material & safety
3. For each trend, identify relevant players or clusters of players

The next step consists of identifying the players or cluster of players that could be relevant to include in the ecosystem. Existing partners, suppliers or distributors are easy to identify. The trends can help identify the ones that are taking a rising role or that could play a role in the ecosystem in the future.

For Groupe SEB, existing important players for blenders include traditional distributors, specialist retailers, motor suppliers, electronic components (PCBA) suppliers, glass jar suppliers, blade suppliers and material housing suppliers. From the trends, a number of players who will play a more significant role in the future were identified. They include:

  • Online distributors and Social Media players as part of the Omni channel trend
  • Nutritionists due to the healthy food trends
  • Ergonomists due to the ease of use and safety imperatives
  • Technology suppliers further down the value chain (touchscreen, electronics, sensors, actuators)
  • Connected technology startups due to the connected product trend

After a quick check it was identified that new material suppliers who were traditionally seen as suppliers of suppliers should also be taken into consideration in the analysis as their importance was growing.

4. Map players within the ecosystem

The evolution of an ecosystem results from the knitting of collaboration and competition forces over time.

When most players have been identified, the ecosystem can be mapped. Four groups of players can be identified: (1) customers, (2) existing value chain players, (3) potential or rising members of the value chain, (4) influencers. They can be displayed on a map as presented underneath. Existing value chain players can include players that are not directly in contact with the company studied. The supplier of a supplier can be considered as part of the ecosystem. It is not necessary to be exhaustive. Only key players relevant to the scope and ambition of the ecosystem analysis should be considered. We mention both potential and rising players in the third group because, when ecosystems change fast, clear cuts cannot easily be made. Influencers are typically organisations, institutions and groups of people who can influence the evolution of the ecosystem without being part of the overall supply chain. They influence decisions and enable or hinder changes without having a strong economic stake in the process. Standardisation bodies, economic development organisations and non-profit organisations often figure as influencers. Here, in terms of influencers, two new players were identified: Standard Bodies and Social Regulation Bodies.
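A simple way to keep such a map workable is to store it as data. The sketch below captures the four groups and a couple of candidate links of the kind explored in the next step; the player names echo the blender case, but the structure and the helper function are hypothetical, not Groupe SEB's actual tooling.

ecosystem = {
    "customers": ["consumers", "traditional distributors", "online distributors"],
    "existing value chain players": ["motor suppliers", "glass jar suppliers", "blade suppliers"],
    "potential or rising players": ["nutritionists", "ergonomists", "connected-tech start-ups"],
    "influencers": ["standard bodies", "social regulation bodies"],
}

# Candidate collaboration links of the kind identified in step 5
collaborations = [
    ("motor suppliers", "nutritionists"),
    ("glass jar suppliers", "ergonomists"),
]

def partners_of(player):
    # Return every player linked to the given one
    return [b if a == player else a for a, b in collaborations if player in (a, b)]

print(partners_of("motor suppliers"))  # -> ['nutritionists']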

 

 

5. Analyse the dynamics across the ecosystem

At this stage, we have adopted a forward looking perspective and identified all the key players within the ecosystem. To analyse the potential evolution and the dynamic of the ecosystem it is important to identify on the one hand: Who competes with whom today? Who could compete with whom in the future? And on the other hand who collaborates with whom today? Who could collaborate with whom tomorrow? The evolution of an ecosystem results from the knitting of collaboration and competition forces over time.

Here a number of questions can be used to support such an analysis.

A first level of questions includes:

  • Across the ecosystem, who competes with whom? Today? Tomorrow?
  • Across the ecosystem, who collaborates with whom? Today? Tomorrow?

Second level questions to deepen the analysis can include:

  • Which players have similar agendas or interests?
  • Which players have conflicting agendas or interests?
  • Who is attractive to whom as a partner?
  • Who could own specific data, patent, assets or other special business advantages that will be critical in the future?

For the analysis of the Blender ecosystem, we present some of the main areas of competition and collaboration. What appeared here are opportunities to widen existing collaboration to new players (see Figure 3 below).

 

 

6. Establish some recommendations

Building on the previous analysis, some recommendations should be developed. These can include a diversity of actions or even a set of “what if” scenarios. However, classic key actions fall into three broad areas and include:

  1. Scouting and intelligence
  • Performing scouting activities to gain further market and technology intelligence or to find new partners
  • Monitoring specific evolutions and developments within the ecosystem
  • Taking an active role in knowledge exchange hosted by clusters or other networks
  2. Early exchange of information
  • Establishing accelerators and innovation centres to engage new players in exploratory activities
  • Encouraging key people to start exchanging with specific ecosystem players
  • Establishing early exchanges with specific key players within the ecosystem
  3. Managing a portfolio of collaborations
  • Establishing new collaborations with one or more ecosystem players
  • Reducing progressively the level of collaboration with some ecosystem players
  • Strengthening existing collaborations

For the Blender Ecosystem, the conclusion was that it would be fruitful to encourage collaborations amongst:

  • Motor and blades suppliers, and nutritionists
  • Designer and glass jar supplier, and ergonomists
  • Housing material and Tier 2 material suppliers
  • PCBA supplier, Electronics suppliers and some start-ups which are working on connected technologies

Each brand would be expected to take the lead in specific areas while sharing information as progress materialises.

The focus included the development of a strong relationship with key motor manufacturers in order to develop high performance motors while leveraging the market power of the company to source cost effective motors with variable power or speed.

In order to contribute to improving the cooking performance, it was decided to initiate, with a transversal team – including participation from purchasing, marketing and R&D – some technical workshops with motor suppliers, blade suppliers and nutritionist partners. The idea was to run some “design thinking” workshops with all the players in the same room. At the same time, the team decided to gain market intelligence related to technologies from other industries.

As the glass jar has a big impact on the design, R&D and Purchasing decided to work together to facilitate the collaboration between the Designer and the glass jar manufacturer.

R&D and marketing decided to work upstream and prepare a medium-to-long-term (five-year) roadmap defining the direction to take in terms of cooking performance, to be shared with the members of the future collaboration before workshops with players from the ecosystem were organised.

It was also decided to identify start-ups working on connected technologies and to initiate a co-development with PCBA suppliers, key tier 2 component suppliers and a start-up.

Finally, through market intelligence, it was also decided that the company should identify new materials offering new performance, new aesthetics, and new decoration or painting processes.

 

Conclusions

As existing value chains are sliced, diced and undergo significant transformation, mapping and strategising across business ecosystems becomes a critical capability that companies need to master. The process outlined above is a proven starting point for progressing in this direction. Each step is essential, but most importantly the process needs to become a continuous collaborative one.

The following diagram outlines how ecosystem mapping integrates with other strategic lenses such as the business model canvas and the value proposition designer (Osterwalder, 2014) (see Figure 5 below). Combining the three views provides a truly holistic picture of how a business can co-evolve with its environment. It also shows that ecosystem mapping benefits from being used in conjunction with other business tools.

 

 

How This Methodology Was Developed

Over the past five years, one of the co-authors has taught a course on innovation and entrepreneurship. Participants were executives and high-potential staff working in the procurement departments of large and mid-sized companies. The course led them through the use of innovation frameworks to sense, seize and realise business opportunities. While many existing tools and frameworks were readily applicable by the participants, it quickly appeared that the traditional ways of looking at the supply market needed to be significantly expanded. Participants required new techniques to snorkel and dive across business ecosystems to anticipate forthcoming changes and spot new opportunities ahead of others. As the need for new tools and frameworks was recognised, a multiyear action learning initiative was soon under way. The methodology was enriched over time and evolved into a solid methodological flow that has been applied, with rich learning, to many business contexts. Participants of the course used and tested it individually as well as in small groups to study a diversity of industry challenges. This methodology owes a lot to the many people who embarked on this learning journey. Their challenges, questions and suggestions nourished its development. They used it, bent it and sometimes changed it according to their needs.

The second author was keen to use and experiment with the methodology within her own company environment. The blender ecosystem appeared as a very interesting case: rich, easy to understand and illustrative of the key methodological issues. She then refined her early work and contributed to the development of a case to be used for the dissemination of the methodology to a wider public and for educational purposes. The case presented in this paper has been adapted from real business challenges and relevant experiences. However, it has been simplified, first to avoid any confidentiality concerns and second to offer pedagogical value.


 

About the Authors

Hervé Legenvre is Professor and Global Executive MBA Director at the EIPM, a Training Institute for Purchasing and Supply Management. He manages educational programmes for global clients, conducts research and teaches in the fields of innovation and sustainability across the value chain. Hervé holds a PhD from Université Paris Sud.

Isabelle Herbet is Asia Purchasing Director for raw materials, components and subassemblies at Groupe SEB, and a specialist in cookware and small electrical appliances. She seeks to bring a sustainable competitive advantage to the Group, managing purchasing centres, purchasing performance and operations, and category management in Asia to drive value creation in support of the business.

References

1. Osterwalder, A. (2014). Value Proposition Design: How to Create Products and Services Customers Want (Strategyzer). 1st edition. Wiley.
2. Moore, J. (1993). “Predators and Prey: A New Ecology of Competition”. Harvard Business Review, 71(3).

 


Leveraging Collaborations to Create Shared Value

By Hervé Legenvre, Francois Bacalou & Hugues Schmitz

In this article the authors discuss how the competitiveness of an organisation and the health of the communities around it are mutually dependent, connecting the two ideas and showing how to make them work through a series of case studies in the water and sanitation industry. This offers a thought-provoking perspective for redefining and achieving business performance.


We are on the verge of a change in how companies improve their performance and gain market advantages. Business leaders are taking more seriously the idea that the competitiveness of their company and the health of the communities around it are mutually dependent. The same business leaders realise they can gain benefits from developing and strengthening collaborations with suppliers and partners.


In the midst of a change

Over the past decades, the management of suppliers has been dominated by supplier reduction, arm's-length communication and short-term action. It is now difficult for many companies to seize innovation opportunities along their supply chain (Choi, Linton, 2011). They need to attract and work collaboratively with existing and new partners to gain a competitive position.

At the same time, companies have become subject to the scrutiny of stakeholders for their social and environmental performance. This impacts business results through customer preferences, brand and people engagement. It also offers opportunities to innovate and grow. Business leaders increasingly embrace shared value creation (Porter & Kramer, 2011), which means that well-designed value creation for the business can simultaneously yield profit and greater social impact.

Today social and environmental performance offers opportunities to innovate and grow.

By bringing these two streams together, companies can deliver value for the business while increasing their environmental and social performance. This can be achieved by developing and strengthening collaborations with partners along and beyond the supply chain. Through a series of four case studies in the water and sanitation industry, we have identified five success factors that can help business leaders embark successfully on this transformation.

 

The 5 key success factors

Strategy and culture

Creating shared value through collaborations needs to be part of the company strategy. It has to become central to how decisions are taken and to how progress is reviewed. Two questions need to be revisited on an ongoing basis:

How can social and environmental issues help us reengineer our offerings, our value chains and our business models?

What are the opportunities for collaboration that can positively impact environmental, social and business performance?

This requires a continuous strategic dialogue between customer-facing and supplier-facing functions. Our tendency to continuously simplify our understanding of the external environment quickly brings people back to the old paradigm. Hence, perseverance is essential; you need to continuously remind people of the logic that underpins this emerging paradigm. If this new way of thinking is not yet part of the strategy, one can harness an entrepreneurial spirit and show that it works through grassroots initiatives.

 

An open mind-set about collaboration

This transformation requires looking at partners from multiple perspectives. By understanding the trends at work in your industry, you can foresee the players that will matter tomorrow. When potential partners have been identified, it is essential to assess whether a strategic fit exists amongst them. This is not about mutual dependence but about sharing common agendas, interests and long-term aspirations. In some instances you will strengthen collaborations with existing suppliers and partners to improve further. In other cases, when breakthrough changes are on your radar, you might have to work with new players. Some relationships might need to be abandoned; others have to be developed. Leaders willing to maximise the value they create need to look at collaborations with three questions in mind:

If they strengthen their collaboration with us, how can this help enhance our performance and deliver our strategy?

If we strengthen our collaboration with them, how can this help enhance their performance and deliver their strategy?

Looking further into the future, what can we achieve together that we could not accomplish alone? Can we reach new levels of performance together?

 

Trust and Transparency

Trust and transparency were instrumental in the four case studies. First, they need to underpin internal collaborations before being extended to external partners. After detecting where strategic fits exist, a shared vision has to be established. This requires facilitating iterative strategic dialogues in which leaders from all sides adopt an open mind-set. Preferential treatment needs to be earned by both parties on an ongoing basis. Reaching deeply rooted trust calls for patience, mutual understanding and common processes. The more progress you make together, the more trust you build between the partners. Ultimately, you reach a level of transparency that offers real advantages. This generates value and leads to employee motivation, engagement and improved performance. Continuity of attitude is critical. As people change jobs, newcomers might be tempted to seek short-term benefits by using a more combative approach; the collaborative approach therefore needs to be continuously monitored.

 

Measurement and value sharing

Fourth, there is a need for a new measurement and sharing culture. Collaborations thrive on common ambitions and complementarities. All players can share common goals and targets, but they also need specific goals and targets that reflect their unique contribution to the collaboration. These should be reflected in contracts using effective incentives such as revenue- and risk-sharing models. Jointly defined and transparent measurement of performance, costs and revenues helps to develop fair solutions. In the end, the overall value is shared amongst the partners, but everyone should keep in mind that it is created together. For the projects presented below, the social and environmental impacts were also measured using relevant performance indicators, and these results were presented alongside the business results. This helps to maintain the holistic perspective of shared value creation.

 

People skills and leadership perseverance

Specific skills are needed to support collaborations that create shared value. People need the right mind-set to work collaboratively within and outside the company. They need a combination of business acumen, partnership management skills and soft skills, which can be difficult to find, develop or retain. Partnership management skills include the ability to identify and validate opportunities, to assess the strategic fit with partners, and to continuously design and facilitate meetings that support the collaboration. From a soft-skills point of view, it requires an ability to empathise with internal and external players and to positively engage them in teamwork. This demands continuous attention from leaders, who need to take the long view and create a climate where people are encouraged to persevere and where error is accepted and considered an integral part of the learning process.

The four case studies

The following four case studies outline how shared value was created together with existing and new partners by SUEZ's water activities in France. Each case highlights what was essential to creating shared value through effective collaborations.

 

The smart metering project

In the water utility business, metering is the cornerstone of a fair billing capability (see table below). Historically, readings were taken manually once a year. Advanced Meter Reading (AMR) technology is a breakthrough: the availability of frequent and accurate data is an unprecedented platform for offering new services to customers. As water is a basic need, it makes it possible to offer customised billing schemes and assistance programmes to low-income populations.

Specific skills are needed to support collaborations that create shared value. People need the right mind-set to work collaboratively within and outside the company.

Ten years ago, SUEZ won the service contract for a French city that was particularly interested in deploying AMR technology. No off-the-shelf solution existed, and ground-breaking developments were needed. A cross-functional team scouted the ecosystem and looked for technology and partners. The final objective was to accurately collect, clean, analyse and communicate useful data to customers. SUEZ Water needed the expertise of both a meter supplier and a radio system provider. They would concentrate on data production, transfer and management, while the operator would boost its leadership by providing advanced services to the end customer, such as on-time leakage alerts and repairs. The stakes were high! An integrated solution calls for all parties to be jointly responsible for delivering the innovation and to openly share their expertise. With three actors regarded as market leaders, collaborating effectively was critical but demanding. This led to a 10-year partnership structured around five points:

  • A co-investment in, and co-ownership of, the technology.
  • An innovative cost and profit sharing model: the partners shared the initial investment and the returns according to their respective contributions (see the illustrative sketch below).
  • A coordinated commercial strategy: the partners brought together their sales networks to boost the promotion of the solution.
  • A royalty mechanism: products were intended to be sold to new customers, and the resulting profit is shared among the partners.
  • A periodic technology review to continuously challenge the roadmap and maintain a leading-edge position.
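To make the arithmetic of a contribution-proportional sharing model concrete, here is a minimal sketch. All figures and contribution shares below are assumptions invented for illustration only; they do not reflect the actual terms of the SUEZ smart metering partnership.

```python
# Illustrative sketch of a contribution-proportional cost and profit sharing model.
# Every number and share below is a made-up assumption for illustration purposes.

contributions = {"Operator": 0.40, "Meter supplier": 0.35, "Radio provider": 0.25}

initial_investment = 12_000_000   # assumed joint development cost (EUR)
annual_return = 4_500_000         # assumed yearly profit from the solution (EUR)
royalties = 600_000               # assumed royalties from sales to new customers (EUR)

# Shares must add up to the whole partnership.
assert abs(sum(contributions.values()) - 1.0) < 1e-9

for partner, share in contributions.items():
    invested = share * initial_investment
    received = share * (annual_return + royalties)
    print(f"{partner:>15}: invests {invested:>12,.0f}, receives {received:>12,.0f} per year")
```

The design choice that matters here is not the specific numbers but the principle: whoever contributes more also invests more and receives more, which keeps incentives aligned over the life of the partnership.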

The collaboration required open and transparent governance. It is sometimes hard to restrain people from favouring short-term gains at the expense of longer-term benefits, and all attempts to derail the collaboration had to be addressed in a timely manner.

 

[Table 1]

 

Today a first generation of products has been launched and adapted to the gas market. Millions of units have been sold, and a new generation is under development. The value created benefits the three partners: the solution set a new technical standard on the market, which led to differentiation. As implementation started, it also provided value to society by offering a more resource-efficient water management system and enabling the implementation of billing schemes for low-income populations.

Joint Improvement Program for sewage cleaning services

The cleaning of sewerage networks is a core business activity of SUEZ (see table below). It can impact its performance and reputation. It requires a truck fleet equipped with high-pressure pumps, operated by qualified operators who intervene in difficult conditions. Pressurised water is forced through pipes to scrub the sides of dirty drains, break apart clogs and flush out residue. Security and compliance with regulation are of utmost importance. The French market is fragmented and lacks reliable service providers. The operator had developed a relationship with one service provider, but the quality of service and the productivity did not match SUEZ's expectations, and the service provider was unsatisfied with its profitability. As all signals were turning red, SUEZ decided to try another approach. A joint improvement programme was offered to the supplier. This aimed at improving overall performance through a long-term collaboration focused on continuous improvement. The aim was to increase the competitiveness of both players through cost improvement and innovation without compromising on quality.

 

[Table 2]

 

The key challenge was first to convince the service provider to move from a focus on price to a focus on improvement. This happened through extensive dialogue. A fair approach to sharing costs and benefits was agreed, along with targets based on shared value creation principles. A cross-functional project team was established to manage this newly formed collaboration.

The following months saw the development of a partnership supported by both sides. The team collected data and developed detailed cost models for each step of the operation process, which were shared openly to establish a common baseline. Value engineering techniques were used to identify improvement opportunities. Workshops helped scope and develop new ideas that led to new business practices, processes, equipment and organisational changes. This helped to determine the impact of each improvement on the final cost structure.
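To illustrate the mechanics of such a step-by-step cost model, a minimal sketch follows. The operation steps, baseline costs and savings are invented assumptions for illustration; they are not the real figures from this sewage cleaning programme.

```python
# Minimal sketch of a step-by-step cost model and the impact of improvements
# on the final cost structure. All steps, costs and savings are assumptions.

baseline = {                      # assumed cost per intervention (EUR), by operation step
    "Planning & dispatch":     60,
    "Travel to site":          90,
    "High-pressure cleaning": 240,
    "Residue disposal":        70,
    "Reporting":               40,
}

improvements = {                  # assumed relative savings per step identified in workshops
    "Planning & dispatch":    0.30,   # e.g. a new planning system
    "Travel to site":         0.15,   # e.g. better route grouping
    "High-pressure cleaning": 0.10,   # e.g. upgraded equipment
}

total_before = sum(baseline.values())
after = {step: cost * (1 - improvements.get(step, 0.0)) for step, cost in baseline.items()}
total_after = sum(after.values())

for step in baseline:
    print(f"{step:<24} {baseline[step]:>6.0f} -> {after[step]:>6.0f}")
print(f"{'Total':<24} {total_before:>6.0f} -> {total_after:>6.0f} "
      f"({(1 - total_after / total_before):.0%} saving)")
```

The value of a shared model of this kind is less the arithmetic than the transparency: both parties can see where each improvement lands in the overall cost structure.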

As a consequence, a new planning system was implemented by the operator. As visibility improved, the service provider was willing to invest in specific equipment and to redesign its organisation. A new contract was established between the two companies, including a formal review process to follow up on the joint improvement plan and the overall financial goals. Over the years, this allowed the companies to meet their respective goals by moving from a combative to a collaborative logic. Both were able to gain a better competitive position.

This enduring partnership between SUEZ and its service provider created value for both. Results related to productivity gains and late interventions improved year on year. For the service provider, this translated into a positive impact on its EBITDA. From a social and environmental performance perspective, SUEZ could claim that it had helped a supplier regain a sound competitive position.

Working with key partners to position the operator as a key player in local development

Today French municipalities and local authorities are concerned with the development of small and medium-sized enterprises on their territory (see table below). They are keen to help entrepreneurs grow their businesses, whose competitiveness depends on their ability to gain access to competitive suppliers. However, the buying power and purchasing expertise of these small companies are limited. SUEZ saw this as an opportunity. Thanks to its size and procurement expertise, it benefits from established relationships with strategic partners who offer favourable terms and conditions. At the same time, a critical component of its strategy is to become the preferred partner of municipalities and local authorities. Beyond its traditional core business of supplying utilities, it works with its clients on new initiatives to improve environmental and societal performance. By connecting procurement capability and business development strategy, it became relevant to offer small and medium-sized companies located on its clients' territories exclusive access to competitive conditions on a large range of products and services. The benefits go beyond attractive prices; these companies also enjoy premium services and environmentally friendly products.

Beyond its traditional core business of supplying utilities, SUEZ works with its clients on new initiatives to improve environmental and societal performance.

The operator had to set up new partnerships with a sub-set of historical partners. The objective was to develop an attractive offer that matched the needs of small companies and provided them with a competitive advantage. This included a wide range of general supplies and services as well as business-specific mechanical, electrical, and health and safety products. For each category, a medium-term exclusive agreement was signed with a preferred partner, outlining the conditions to be proposed to future users.

 

[Table 3]

 

For the partners, this was an opportunity to develop their market share in the small and medium-sized company segment. The exclusivity agreement was perceived as a real chance to consolidate their position on the market.

In practical terms, local companies subscribe to become members of the purchasing service. They can deal directly with the exclusive partners and benefit right away from favourable conditions. The objective is to propose this service to municipalities and to work closely with their economic development departments to offer its added value to small and medium-sized companies.

The first pilots were conclusive. Beyond favourable pricing, the companies that subscribed appreciated the quality of the products and services provided. However, the most significant benefits were the time and resources saved by easily and immediately accessing relevant suppliers and products.

The municipalities are enthusiastic and appreciate this unique contribution to their own objectives. It is perceived as a positive contribution to public-private partnership; indeed, partners can investigate new avenues of collaboration and new opportunities to boost the attractiveness of their territory.

Co-innovation with network equipment suppliers

In the water business, having innovative and simple network equipment is of utmost importance (see table below). This network connects the water production installation, the water transport and distribution network, reservoirs, storage tanks, fire hydrants and the final users. Asset management and ongoing maintenance are sensitive customer issues, and new functionalities have to be integrated frequently to match customers' evolving expectations. Achieving standardisation is valuable but demanding, and all operations, including installation, need to be easy to perform.

 

[Table 4]

 

Over the years, the relationship between SUEZ and its local network equipment suppliers consisted of discussions on price and volumes. No relevant collaboration on innovation existed across the value chain. The supply market was increasingly dominated by low-cost-country suppliers. This was reducing local production, quality problems were on the rise, and environmental impacts were not going in the right direction. Price pressure was exacerbated by inflation and raw material prices. Furthermore, there was a lack of exchange and, consequently, of alignment between technology roadmaps across the industry. As concerns started to surge, the operator decided to develop co-innovation projects with existing strategic suppliers. The operator wanted to position itself as a lead user that stimulates innovation. Benefits were expected in terms of performance and cost across the value chain.

The first project investigated a composite water surface box and the second a stainless steel connection collar for water networks. Both started with a joint assessment of new ideas with the partner. The significance of the market opportunity and the technological compatibility with the industry were assessed. This included defining the product functionalities and the expected performance. Then some assumptions about the market potential and the target price could be developed. Cross-functional project teams were formed, gathering all the technical, operational and business skills needed from both sides. The partners signed letters of intent with SUEZ that covered joint development goals and intellectual property. In the following step, the project team integrated some of the suppliers' engineers to develop a proof of concept based on value analysis, with validations to be carried out during the design phase. Afterwards, long-term contracts including commercial agreements specifying value-sharing rules were signed with both suppliers. The team could then move to product design, industrialisation, qualification and deployment. Toll-gate reviews allowed the development to be managed effectively.

Co-innovation with a local partner is a strong tool to develop local competitiveness.

As raw material prices appeared to be a key issue not only for the suppliers involved in these projects but also for other network equipment providers, SUEZ decided to look further into its ecosystem and to extend its network of strategic partners to tier 2 polymer producers. The focus was to qualify innovative polymers for water applications, to be used by the suppliers to bring significant cost and performance benefits.

The two co-innovation projects were a success. They allowed all partners to meet their goals in terms of cost optimisation and innovation. Investing in such projects offers SUEZ productivity gains and increased expertise. On the societal side, the projects delivered improvements in terms of environmental footprint and the development of local employment. Co-innovation with a local partner is a strong tool for developing local competitiveness. In these cases, one of the projects helped maintain employment in the country, and the other allowed production to be brought back into the country.

About the Authors

Hervé Legenvre is Professor and Global Executive MBA Director at the EIPM, a Training Institute for Purchasing and Supply Management. He manages educational programmes for global clients, conducts research and teaches in the fields of innovation and sustainability across the value chain. Hervé holds a PhD from Université Paris Sud.

Francois Bacalou started to work with Suez in 2002 in the role of Chief Procurement Officer for Lyonnaise des Eaux. He joined Suez Australia as Chief Procurement Officer in September 2014. Prior to this, Francois held roles in operations, finance and sourcing with Motorola Electronic Group in France, the UK and the USA. Francois holds an engineering degree in electronics and an MBA from Purdue University (USA).

Hugues Schmitz has worked within the utility industry over the past 20 years in several procurement and logistics functions. He joined Suez Water France in 2007 as procurement manager and has been Chief Procurement Officer since September 2014. Hugues holds an international business degree and an MBA from EIPM (France).

 


 
