Apple Just Exposed a Massive Risk for Nvidia Stock: The AI Chip Battle with Google and TPU Technology

In the ever-evolving landscape of artificial intelligence (AI) and machine learning (ML), major tech companies are constantly vying for the upper hand. Recently, Apple has made headlines by revealing its use of Google’s Tensor Processing Units (TPUs) instead of Nvidia’s highly sought-after graphics processing units (GPUs) to train its AI models. This decision has significant implications for Nvidia, a company that has long dominated the AI chip market. By opting for Google’s TPUs, Apple not only highlights the intense competition in the AI chip industry but also exposes a potential vulnerability in Nvidia’s business model.

Apple’s announcement came in a detailed research paper published on Monday, in which the tech giant described the foundation language models it developed for Apple Intelligence features. The paper specifically cites Google’s TPUv4 and TPUv5p chips as the hardware used to train these AI tools. Notably absent was any mention of Nvidia chips, which have been in high demand across the tech industry. The move underscores a broader trend of large technology companies seeking alternatives to Nvidia’s GPUs, which have become increasingly difficult to obtain as demand has surged.

Nvidia’s GPUs have been the cornerstone of many AI and ML projects, with companies like Meta, Microsoft, OpenAI, and Anthropic investing billions to acquire them. According to Mizuho Securities, Nvidia controls more than 70% of the AI chip market, with each chip costing tens of thousands of dollars. Google’s TPUs, however, offer a compelling alternative. These specialized chips are designed specifically for AI workloads and are available through Google’s cloud service, making them accessible to clients like Apple. Apple leaned on the TPUv5p chips in particular, employing 2,048 of them to build the AI model that is expected to run on iPhones and other Apple hardware.

The use of Google’s TPUs by Apple raises important questions about the future of AI chip technology. While Nvidia has enjoyed a dominant position in the market, the emergence of cost-effective and energy-efficient alternatives like Google’s TPUs could disrupt the status quo. Google’s TPUs are not only competitive on performance but also carry significant cost advantages: the latest generation costs under $2 per hour when booked three years in advance, making them an attractive option for companies looking to rein in AI training costs.
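To put that pricing in perspective, here is a back-of-the-envelope estimate of what a training run on a full TPU slice might cost. The chip count comes from Apple’s paper, the hourly rate is the committed-use figure cited above (treated here as a per-chip price), and the training duration is a purely hypothetical round number, so the result is illustrative rather than a claim about Apple’s actual spend.

```python
# Back-of-the-envelope cost sketch -- illustrative assumptions only.
chips = 2_048                  # one TPU slice, per Apple's research paper
rate_per_chip_hour = 2.0       # USD; the "~$2/hour with a 3-year commitment" figure, assumed per chip
training_hours = 24 * 30       # hypothetical: roughly one month of continuous training

estimated_cost = chips * rate_per_chip_hour * training_hours
print(f"Estimated slice cost: ~${estimated_cost:,.0f}")  # ~$2.9 million under these assumptions
```

Even under these rough assumptions, the committed-use price puts a multi-week run on an entire slice in the low millions of dollars, which helps explain why the pricing is pitched as a draw for large AI customers.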

Apple’s decision to use Google’s TPUs instead of Nvidia’s GPUs is a strong endorsement of the chips’ effectiveness. It also raises concerns for Nvidia, which relies heavily on a small group of tech giants for a significant portion of its business. Google, one of Nvidia’s biggest customers, is designing its own AI chips, potentially threatening Nvidia’s sales and profits. Other major customers, such as Microsoft, Amazon, and Meta, are also exploring custom silicon for AI, further intensifying competition in the AI chip market.

The potential for more companies to adopt cost-effective application-specific integrated circuits (ASICs) like Google’s TPUs instead of Nvidia’s GPUs represents a major risk for Nvidia. Despite its current dominance, Nvidia’s future success is not guaranteed. The company’s high stock price, trading at over 42 times forward earnings and an enterprise value-to-revenue multiple of around 34, reflects investors’ high expectations. However, the risk of relying heavily on a few customers and the potential for a slowdown in sales and profits could significantly impact Nvidia’s stock price.

Apple’s use of Google’s TPUs also highlights the growing importance of AI infrastructure investment. Executives at Meta and Alphabet have acknowledged that, despite concerns about overinvesting, the greater risk lies in not investing in AI infrastructure at all. Apple’s relatively late entry into the AI race, with the introduction of Apple Intelligence, demonstrates the company’s commitment to catching up with its competitors. Apple Intelligence includes new features such as a revamped Siri and improved natural language processing, with plans to introduce generative AI functions, including image and emoji generation, in the coming year.

The technical details in Apple’s research paper offer a glimpse into the scale and complexity of its AI training efforts. The on-device AFM model used in Apple Intelligence was trained on a single slice of 2,048 TPUv5p chips, while the larger AFM-server model was trained on 8,192 TPUv4 chips configured to work together. This level of investment in training infrastructure underscores the importance of efficient, scalable solutions, which Google’s TPUs appear to provide.
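For readers curious what “configured to work together” looks like in practice, the sketch below shows the general pattern of spreading model data across every chip in a slice using JAX, a framework commonly used on TPUs. It is a minimal illustration under stated assumptions, not Apple’s actual training code: the array shapes, the single mesh axis, and the variable names are all hypothetical, and the snippet simply falls back to whatever accelerators are visible when run outside a TPU slice.

```python
import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec

# Minimal sketch of a multi-chip setup: build a device mesh over all visible
# accelerators (the chips of a TPU slice, or local CPU devices when run
# elsewhere) and shard a parameter array across that mesh.
devices = np.array(jax.devices()).reshape(-1)
mesh = Mesh(devices, axis_names=("data",))

# A toy "weight" matrix; real model parameters would be far larger.
weights = jax.numpy.ones((len(devices) * 128, 256))

# Split the first axis of the array across every chip in the mesh, so each
# device holds only its own shard and compiled work runs in parallel.
sharded_weights = jax.device_put(weights, NamedSharding(mesh, PartitionSpec("data", None)))

print(sharded_weights.sharding)  # shows how the array is laid out across devices
```

The point of the sketch is simply that a “slice” is not one giant chip but thousands of chips coordinated as a single training fabric, which is why slice size and per-chip pricing matter so much to the economics.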

Google’s TPUs, first introduced in 2015, have become a popular choice for AI workloads. Google itself remains a major Nvidia customer, using Nvidia’s GPUs for its own AI training and offering them through its cloud services, but the growing adoption of TPUs by companies like Apple suggests a shift in market dynamics. Apple’s decision to publicly disclose its use of Google’s TPUs is particularly noteworthy given the company’s typically guarded approach to product development details.

It remains uncertain whether Apple will continue to use Google’s TPUs for its future AI features or develop its own in-house solutions. The company’s preference for in-house development suggests that it may eventually transition to using its own silicon for AI training. However, the current collaboration with Google highlights the competitive landscape and the demand for high-performing AI chips. The ongoing battle between Nvidia and Google in the AI chip market will likely shape the future of AI technology and its applications.

For investors, Apple’s decision to use Google’s TPUs instead of Nvidia’s GPUs serves as a major warning. The reliance on a few key customers and the potential for these customers to seek alternative solutions poses a significant risk to Nvidia’s business. As the AI chip market continues to evolve, companies that can offer cost-effective, efficient, and scalable solutions will likely emerge as leaders. Google’s TPUs have proven to be a formidable competitor, and their adoption by major players like Apple signals a potential shift in the market.

In conclusion, Apple’s use of Google’s TPUs to train its AI models exposes a massive risk for Nvidia stock. The competition between Nvidia and Google in the AI chip market is intensifying, with Google’s TPUs offering a viable alternative to Nvidia’s GPUs. As more companies explore custom silicon for AI, Nvidia’s reliance on a small group of tech giants for a significant portion of its business could become a vulnerability. Investors should carefully consider these risks and the evolving dynamics of the AI chip market before making investment decisions. The future of AI technology and its applications will be shaped by the ongoing battle between Nvidia and Google, with significant implications for the tech industry as a whole.