Leveraging Strategic Investments in Large Models: The Battle for AI Dominance Between Amazon and OpenAI's Rivals

Vincent Hsu

Hatched by Vincent Hsu

Oct 05, 2023

5 min read



In a surprising move, Amazon announced an investment of up to $4 billion in Anthropic, a prominent large-model company known for Claude, its ChatGPT-like chatbot. The move follows Microsoft's deal with OpenAI earlier this year: a roughly $10 billion investment that made OpenAI a major customer of Microsoft's Azure cloud platform and gave Microsoft priority access to ChatGPT. Clearly, Amazon's investment in Anthropic is about more than securing customers for AWS or betting on large models alone.

According to the official collaboration details, Anthropic will use AWS Trainium and Inferentia chips to build, train, and deploy its future foundation models, and the two companies will jointly work on advancing Trainium and Inferentia technology. AWS Trainium is a custom machine learning training chip that AWS launched at the end of 2020, while Inferentia is a high-performance machine learning inference chip AWS introduced in 2019.
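For context on how these chips actually reach customers, here is a minimal sketch (an illustration only, not Anthropic's setup): Trainium and Inferentia are exposed on AWS as EC2 instance families, trn1 for training and inf2 for inference, whose specifications can be queried with boto3. The region below is an assumption and must be one where these instance types are offered.

```python
# Minimal sketch: Trainium and Inferentia surface to AWS customers as EC2
# instance families (trn1 for training, inf2 for inference). This queries
# their specs; us-east-1 is an assumed region where both types are offered.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.describe_instance_types(
    InstanceTypes=["trn1.32xlarge", "inf2.xlarge"]
)
for itype in resp["InstanceTypes"]:
    name = itype["InstanceType"]
    vcpus = itype["VCpuInfo"]["DefaultVCpus"]
    mem_gib = itype["MemoryInfo"]["SizeInMiB"] / 1024
    print(f"{name}: {vcpus} vCPUs, {mem_gib:.0f} GiB memory")
```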

By deepening their partnership through this investment, Amazon aims to accelerate the development of its in-house AI chips. This move aligns with a recent report suggesting that NVIDIA approached major cloud providers, including Amazon, with a proposal to lease their servers instead of selling chips directly. However, Amazon rejected this proposal, indicating its commitment to advancing its proprietary AI chips.

The motivation behind Amazon's investment in Anthropic becomes clearer in the broader context of the cloud computing market. As large-model and AI-application companies become the most sought-after customers in cloud computing, major providers such as Google, Microsoft, AWS, Oracle, and NVIDIA are investing aggressively to lock them in. Although these strategic investments have raised financial concerns, they reflect a shared conviction about the significance of large models and AI applications.

In AWS's case, Anthropic has been a customer since 2021. By deepening the relationship with a $4 billion investment, Amazon gains a more substantial stake in large models and, most importantly, in the development of its self-designed AI chips. The investment lets Amazon learn from Anthropic's expertise in large models while challenging, and potentially disrupting, NVIDIA's dominance in AI chips: its GPUs were originally designed for graphics processing and only later repurposed for training neural networks.

Amazon CEO Andy Jassy hinted at this intention, stating, "We believe we can help improve many short-term and long-term customer experiences through deeper collaboration." The short-term and long-term customer experiences he refers to map directly onto Amazon's large models and self-designed AI chips.

Jassy also emphasized customer excitement around Amazon Bedrock, AWS's new managed service for building generative AI applications on a range of foundation models, and highlighted AWS Trainium, the company's AI training chip, noting that the collaboration with Anthropic could enhance the value of both for customers.
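To make Bedrock concrete, here is a minimal sketch of calling an Anthropic Claude model through the service with boto3. It assumes a recent boto3 release that includes the bedrock-runtime client, that Bedrock model access has been enabled for the account, and that the model ID anthropic.claude-v2 is available in the chosen region.

```python
# Minimal sketch: invoking an Anthropic Claude model via Amazon Bedrock.
# Assumes boto3 with the bedrock-runtime client, Bedrock model access enabled,
# and the anthropic.claude-v2 model available in us-east-1.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    # Claude's text-completion format requires the Human/Assistant framing.
    "prompt": "\n\nHuman: Summarize the Amazon-Anthropic partnership in one sentence.\n\nAssistant:",
    "max_tokens_to_sample": 200,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",
    contentType="application/json",
    accept="application/json",
    body=body,
)

print(json.loads(response["body"].read())["completion"])
```

The same invoke_model call works across the other foundation models Bedrock exposes; only the model ID and the request body's schema change.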

Interestingly, Amazon has faced setbacks with Titan, its own large model. After the company announced its early customers, feedback was negative, suggesting the model was not performing as expected. The episode highlighted how much further Amazon needs to go in large models, and the company has since shifted its emphasis to the Amazon Bedrock platform, where customers can access services from a range of leading large-model providers, including Anthropic.

On another front, Amazon wants to defend its dominance of the cloud computing market, a core engine of its profits. In the era of large models, cloud providers must explore new technologies that deliver faster inference. Here Amazon has been a pioneer, investing in proprietary data center chips and servers that offer higher speed and better energy efficiency, setting it apart from Microsoft and Google.

Among the three major cloud providers (AWS, Microsoft Azure, and Google Cloud), Amazon has been at the forefront of AI-specific chips and servers. However, the progress and performance of its AI chips have not been disclosed as standalone products; the chips are bundled into servers and delivered to customers through cloud services. Customers therefore see only the performance of the cloud service, with no direct visibility into the capabilities of the underlying silicon.

Now Amazon wants to learn which workloads are best suited to which processors, and the collaboration with Anthropic is one way to gain that insight. According to The Information, of the 69 companies in its Generative AI Database, 32 use Amazon as a cloud provider, 26 use Google, and 13 use Microsoft; some companies use more than one provider.

In the era of large models, the web of collaboration and competition among cloud computing, large models, and AI applications is growing ever more complex. The cloud computing industry, relatively stable for a long time, is finally on the verge of transformative change.

Actionable Advice:

  • 1. Stay updated on strategic investments in large models: Keep a close eye on the investments major cloud providers make in large models and AI applications. They signal where the landscape is heading and where opportunities for collaboration may emerge.
  • 2. Explore the capabilities of cloud providers: Understand the distinct offerings and technologies of each cloud provider. Assess their AI chip performance and consider which provider best fits your organization's needs and workloads.
  • 3. Embrace generative AI applications: Explore how generative AI applications can enhance your existing products and customer experiences. Platforms such as Amazon Bedrock offer access to a variety of foundation models that can be customized to your requirements.

In conclusion, Amazon's strategic investment in Anthropic signals its ambition to secure a prominent position in large models and AI chip development. Through this collaboration, Amazon aims to strengthen its cloud computing services, bolster its large-model platform, and potentially challenge NVIDIA's dominance in AI chips. As the battle for AI dominance intensifies, organizations should stay informed about these developments and use advances in large models and AI to drive innovation and improve customer experiences.
