Joe Tsai, Chairman of Alibaba, spoke at the World Government Summit 2026, sharing his thoughts on the trajectory of the open-source AI market. His key thesis: strategic advantage will go to companies that control the entire tech stack – from microchips to the final software product.
Definition of a Full-Cycle AI Company
What Is a Full-Cycle Company in the Context of AI
When people talk about the "full stack", they usually mean traditional software development: the user interface, the back end, and databases. In the artificial intelligence industry, however, this concept has taken on a broader meaning.
A full-cycle company in the AI world is an organization capable of independently handling every stage: from designing its own processors and building infrastructure for model training to creating ready-to-use services for users. In other words, such a company has no critical dependence on external suppliers at key production stages.
Alibaba is exactly that type of company: the holding group has its own cloud division, manufactures chips, develops models, and builds services on top of them. Tsai's stance therefore reads not just as a forecast but as a rationale for the corporation's long-term strategy.
Challenges of Implementing Open Source AI Models
Why Open Source Is Accessible to All, but Beneficial Only to Some
Open-source models are neural networks whose code and weights can be downloaded, studied, modified, and run. At first glance, such accessibility levels the playing field: the code is open to everyone, so anyone can use it and build on it.
However, Tsai emphasizes a crucial nuance: access to a model does not mean equality of opportunity. It is one thing to obtain a model's "weights", and quite another to have the resources for its effective deployment, fine-tuning, and scaling.
Full-cycle companies can not only release an open model but also offer a complete ecosystem around it: cloud infrastructure for deployment, adaptation tools, and technical support. This creates added value that is virtually impossible to copy, even when the model's code itself is in the public domain.
Infrastructure Control as a Competitive Advantage
Training large language models requires colossal computing power. If a company rents servers from third-party providers, it becomes a hostage to their pricing, limits, and terms. Having its own infrastructure allows it to optimize processes for specific tasks, strictly control costs, and significantly speed up the cycle of experiments.
The same applies to hardware. Developing custom AI chips allows for tailoring the architecture to the specifics of particular models. This is not a requirement for everyone, but in the long run, it provides a major edge, especially at the scale of Chinese or American tech giants.
Essentially, Tsai argues that while open code is important, the true power lies in the ability to operate that code effectively. And for that, a powerful infrastructure is needed – ideally, one's own.
Leading Companies Building Full-Cycle AI Ecosystems
Who Else Falls Into This Category
Alibaba is not the only player building a full cycle. Meta releases open models from the Llama family, training them on its own infrastructure. Google designs specialized TPU processors and develops its own cloud platform. Amazon is actively improving AWS and investing in custom hardware.
Notably, there are almost no traditional startups among these players. Building a full cycle requires massive investment, years of experience, and a scale currently available only to the largest tech corporations.
Future Outlook for the Open Source AI Market
What This Means for the Industry
If Tsai's forecast is correct, the future of open-source AI will be shaped not by the efforts of independent enthusiast researchers, but by major corporations with the resources to maintain a full development cycle.
This is not necessarily a negative scenario. Open models from these giants remain accessible to everyone: they can be integrated into projects and adapted for local tasks. However, the rules of the game in this ecosystem will be dictated by those who control the clouds and chip manufacturing.
The main question remains open: will small companies manage to find their niche in this paradigm, or will the market definitively consolidate around a narrow circle of tech giants? There is no clear answer yet, but Tsai's speech plainly outlines the direction in which the leaders of the AI race are moving.