Meta is reportedly in talks with Google to use Google's custom AI chips, known as Tensor Processing Units (TPUs). Media reports say Meta could start renting TPUs through Google Cloud next year and may begin installing the chips in its own data centres by 2027.
If confirmed, this would be a major shift in Meta’s AI strategy and a possible challenge to Nvidia, which currently dominates the AI chip market. The discussions come at a time when AI companies are struggling with the rising cost and limited supply of Nvidia hardware.
Google has been positioning TPUs as a cheaper and more reliable alternative for training large AI models. For Meta, the chips could cut costs and reduce its dependence on a single supplier. The company needs massive computing power to train AI models for Facebook, Instagram, WhatsApp and its new AI assistants.
Reports indicate that the plan would begin with the rental phase through Google Cloud next year, giving Meta time to evaluate the chips before deploying them fully in its own systems by 2027. Google insiders reportedly believe such deals could strengthen the TPUs' market presence, capturing a share of spending that currently flows mainly to Nvidia.
The market reacted quickly to the news. Nvidia’s stock fell by around 2.7%, while Alphabet shares rose as investors viewed the talks as a boost for Google’s cloud and AI ambitions.
So far, Meta, Google and Nvidia have not commented publicly, but the discussions point to a more competitive phase in the AI chip race.






