Published February 10, 2026

Alibaba Chairman Explains Why Full-Cycle Companies Win in Open-Source AI

At the World Government Summit 2026, Alibaba Chairman Joe Tsai discussed which companies will dominate the development of open-source AI models and why owning the entire technology chain is becoming a decisive factor for success.

Event Source: Alibaba Cloud

Joe Tsai, Chairman of Alibaba, spoke at the World Government Summit 2026, sharing his thoughts on the trajectory of the open-source AI market. His key thesis: strategic advantage will go to companies that control the entire tech stack – from microchips to the final software product.

Definition of a Full-Cycle AI Company

What Is a Full-Cycle Company in the Context of AI

When people talk about the "full stack", they usually mean traditional development: the interface, the back end, and databases. In the artificial intelligence industry, however, this concept has taken on a much broader meaning.

A full-cycle company in the AI world is an organization capable of independently handling every stage: from designing its own processors and building infrastructure for model training to creating ready-to-use services for users. In other words, such a company has no critical dependence on external suppliers at key production stages.

Alibaba is exactly that type of company: the holding group has its own cloud division, manufactures chips, develops models, and builds services on top of them. Tsai's stance therefore reads not just as a forecast but as a rationale for the corporation's long-term strategy.

Challenges of Implementing Open Source AI Models

Why Open Source Is Accessible to All, but Beneficial Only to Some

Open-source models are neural networks whose code and weights can be downloaded, studied, modified, and run. At first glance, such accessibility levels the playing field: the code is open to everyone, so anyone can use it and build on it.

However, Tsai emphasizes a crucial nuance: the availability of a model does not mean equality of opportunity. It is one thing to gain access to a model's weights, and quite another to have the resources to deploy, fine-tune, and scale it effectively.

Full-cycle companies can not only release an open model but also offer a complete ecosystem around it: cloud infrastructure for running it, adaptation tools, and technical support. This creates added value that is virtually impossible to copy, even when the model itself is in the public domain.

Infrastructure Control as a Competitive Advantage

Training large language models requires colossal computing power. If a company rents servers from third-party providers, it becomes a hostage to their pricing, limits, and terms. Having its own infrastructure allows it to optimize processes for specific tasks, strictly control costs, and significantly speed up the cycle of experiments.

The same applies to hardware. Developing custom AI chips allows for tailoring the architecture to the specifics of particular models. This is not a requirement for everyone, but in the long run, it provides a major edge, especially at the scale of Chinese or American tech giants.

Essentially, Tsai argues that while open code is important, the true power lies in the ability to operate that code effectively. And for that, a powerful infrastructure is needed – ideally, one's own.

Leading Companies Building Full-Cycle AI Ecosystems

Who Else Falls Into This Category

Alibaba is not the only player building a full cycle. Meta releases open models from the Llama family, training them on its own infrastructure. Google designs specialized TPU processors and develops a cloud platform for its needs. Amazon is actively improving AWS and investing in custom hardware.

Notably, there are almost no traditional startups among these players. Building a full cycle requires massive investments, years of experience, and a scale that is currently only available to the largest tech corporations.

Future Outlook for the Open Source AI Market

What This Means for the Industry

If Tsai's forecast is correct, the future of open-source AI will be shaped not by the efforts of independent enthusiast researchers, but by major corporations with the resources to maintain a full development cycle.

This is not necessarily a negative scenario. Open models from these giants remain accessible to everyone: they can be integrated into projects and adapted for local tasks. However, the rules of the game in this ecosystem will be dictated by those who control the clouds and chip manufacturing.

The main question remains open: will small companies manage to find their niche in this paradigm, or will the market ultimately consolidate around a narrow circle of tech giants? While there is no definitive answer yet, Tsai's speech clearly outlines the direction in which the leaders of the AI arms race are moving.

Original Title: Joe Tsai on the Future of Open-Source AI: Why Full-Stack Companies Will Excel
Publication Date: Feb 9, 2026
Alibaba Cloud www.alibabacloud.com A Chinese cloud and AI division of Alibaba, providing infrastructure and AI services for businesses.

From Source to Analysis

How This Text Was Created

This material is not a direct retelling of the original publication. First, the news item itself was selected as an event important for understanding AI development. Then a processing framework was set: what needs clarification, what context to add, and where to place emphasis. This allowed us to turn a single announcement or update into a coherent and meaningful analysis.

Neural Networks Involved in the Process

We openly show which models were used at different stages of processing. Each performed its own role — analyzing the source, rewriting, fact-checking, and visual interpretation. This approach maintains transparency and clearly demonstrates how technologies participated in creating the material.

1. Claude Sonnet 4.5 (Anthropic) — Analyzing the Original Publication and Writing the Text: the neural network studies the original material and generates a coherent text.

2. Gemini 3 Pro (Google DeepMind) — Translation into English.

3. Gemini 3 Flash Preview (Google DeepMind) — Text Review and Editing: correction of errors, inaccuracies, and ambiguous phrasing.

4. Gemini 3 Flash Preview (Google DeepMind) — Preparing the Illustration Description: generating a textual prompt for the visual model.

5. FLUX.2 Pro (Black Forest Labs) — Creating the Illustration: generating an image based on the prepared prompt.
