The Complex Landscape of Cloud Computing, Big Models, and AI Chips

Vincent Hsu


Oct 13, 2023 · 5 min read



In a surprising move, Amazon recently announced an investment of up to $4 billion in Anthropic, a leading large-model company known for Claude, its chatbot and rival to ChatGPT. The investment follows Microsoft's $10 billion deal early this year with OpenAI, the creator of ChatGPT and a major customer of Microsoft's Azure cloud. The relationships among players in the large-model industry are clearly growing more complex. Amazon's investment in Anthropic, however, goes beyond "locking in" customers for AWS or betting on large models alone. A closer look at the details of the partnership reveals Amazon's intentions, and in particular its pursuit of AI chips.

Amazon's Strategic Investment in Anthropic:

As part of the deal, Amazon will initially invest $1.25 billion for a minority stake in Anthropic, with the option to increase the total to $4 billion. The official partnership agreement states that Anthropic will use AWS Trainium and Inferentia chips to build, train, and deploy its future foundation models, and that the two companies will collaborate on the further development of Trainium and Inferentia. AWS Trainium is a custom machine learning training chip that AWS launched in late 2020, while Inferentia is a high-performance machine learning inference chip it introduced in 2019. By deepening its collaboration with Anthropic through this investment, Amazon aims to accelerate the development of its own AI chips.

Amazon's Ambitions and the Quest for AI Chips:

To understand Amazon's motivations, it helps to consider its long-term objectives. One obvious reason for the investment is to secure customers. Large-model companies and AI application enterprises are the key customers of the next era of cloud computing, making them highly sought after by the major cloud providers. This year, Google, Microsoft, AWS, Oracle, and NVIDIA have all made strategic investments to "lock in" such customers, an approach that has also drawn financial scrutiny.

Anthropic itself has been an AWS customer since 2021. By deepening the relationship with an investment of up to $4 billion, Amazon aims to gain a stronger foothold in two critical areas: large models and AI chips. In effect, Amazon is paying a substantial tuition fee to learn how to excel in the large-model arena. The investment not only positions Amazon to compete more directly with OpenAI but also creates an opportunity to develop AI chips that could challenge NVIDIA's GPU dominance. GPUs have been extensively adapted for neural network training, but they were not originally designed for it. Amazon's CEO, Andy Jassy, acknowledged as much indirectly, stating that the partnership with Anthropic could improve both short-term and long-term customer experiences, a framing that aligns with Amazon's focus on large models and AI chips.

Amazon's Approach to Large Models and Cloud Computing:

Despite initially developing its own large model, Titan, and announcing flagship customers for it, Amazon drew criticism from one of those customers, who claimed the model was subpar. The incident exposed how little groundwork Amazon had laid for self-developed large models. Amazon subsequently shifted its focus to promoting the Amazon Bedrock platform, which gives customers access to models from several leading providers, including Anthropic.

Amazon is also keen to solidify its position in cloud computing. As the era of large models unfolds, cloud providers face increasingly diverse workloads and must explore new technologies to achieve faster inference. Amazon has been at the forefront of this exploration, developing its own data center chips and servers that offer higher speed and better energy efficiency. Among the three major cloud providers (Microsoft, Google, and AWS), it has also been an early mover in AI-specific chips and servers. However, the progress and performance of Amazon's AI chips have not been disclosed independently: the chips are bundled into servers and delivered to customers through cloud services. Customers therefore perceive only the performance of Amazon's cloud services and have little direct visibility into the underlying chips.

The Collaboration with Anthropic as a Means to an End:

By collaborating with Anthropic, Amazon hopes to learn which processors are best suited to which workloads. Of the 69 companies in The Information's database of generative-AI companies, 32 use Amazon as their cloud provider, 26 use Google, and 13 use Microsoft; some companies use more than one provider. In the era of large models, the collaboration and competition among cloud computing, large models, and AI applications are becoming increasingly intricate, and this complexity creates an opening for change in a long-settled cloud computing industry.
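A quick back-of-the-envelope check makes the "multiple providers" remark concrete. Assuming every company in the database uses at least one of the three providers, the per-provider counts can only sum to more than the total if some companies are counted more than once:

```python
# Figures quoted above from The Information's generative-AI database.
total = 69                        # companies tracked
aws, google, msft = 32, 26, 13    # companies on each cloud provider

# The counts sum to 71, two more than the 69 companies tracked.
# Each multi-cloud company inflates the sum by one per extra provider,
# so there must be at least two extra provider relationships.
extra_relationships = aws + google + msft - total
print(extra_relationships)  # -> 2
```

This is a lower bound: companies the database counts under none of the three providers, or under all three, would change the exact overlap, but the quoted figures guarantee at least some multi-cloud usage.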


In conclusion, Amazon's investment in Anthropic is a strategic move to solidify its position in the evolving landscape of cloud computing, large models, and AI chips. By leveraging the collaboration with Anthropic, Amazon aims to strengthen its large-model capabilities while accelerating the development of its own AI chips. As the industry evolves, organizations must adapt and seek partnerships that keep them at the forefront of technological change.

Actionable Advice:

1. Businesses interested in leveraging large models should consider partnering with leading large-model companies such as Anthropic to gain access to cutting-edge technology and improve customer experiences.

2. Cloud computing providers should invest in developing AI chips to meet the growing demand for faster inference in the era of large models.

3. Organizations should regularly evaluate their cloud provider options and consider utilizing multiple providers to diversify their capabilities and mitigate risks associated with vendor lock-in.

