June 17, 2024

How Apple used Google’s help to train its AI models

Apple CEO Tim Cook announced on stage Monday a stunning deal with OpenAI to integrate OpenAI’s powerful AI model into Apple’s Siri voice assistant.

However, the fine print of a white paper Apple released after the event shows that Alphabet’s Google has emerged as another winner in the Cupertino, California-based company’s race to catch up in artificial intelligence.

To create Apple’s foundation AI models, the company’s engineers used its own framework software with a range of hardware, including its own on-premises graphics processing units (GPUs) and chips available only in Google’s cloud, called tensor processing units (TPUs).

Google has been making TPUs for about 10 years and has publicly discussed two variants of its fifth-generation chips that can be used for AI training. The performance version of the fifth generation offers performance that Google says rivals Nvidia’s H100 AI chips.

Google announced at its annual developer conference that the sixth generation will launch later this year.

The processors are designed specifically for running AI applications and training models, and Google has built a cloud computing hardware and software platform around them.

Apple and Google did not immediately respond to requests for comment.

Apple did not comment on how much it relies on Google’s chips and software compared to Nvidia’s hardware or other AI vendors.

However, using Google’s chips typically requires a customer to purchase access to them through Google’s cloud division, similar to how customers buy compute time from Amazon.com’s AWS or Microsoft’s Azure. (Reporting by Max A. Cherny; Editing by Sandra Maler)