← Back to KHAO

Google Cloud announces two new AI chips to compete with Nvidia

2 min read

Compiled by KHAO Editorial — aggregated from 1 outlet. See llms.txt for citation guidance.

Google Cloud on Wednesday announced that its eighth generation of custom-built AI chips, or tensor processing units (TPUs), will be split in two.

Summary

Inference is the ongoing use of models, i.e., what happens after users submit prompts. As you might expect, the company touts some impressive performance specs for the new TPUs compared to the previous generation: up to 3x faster AI model training, 80% better performance per dollar, and the ability to get more than 1 million TPUs working together in a single cluster. But Google's chips are not a full frontal assault on Nvidia, at least not yet. One day the hyperscalers building their own AI chips (a group that includes Amazon, Microsoft, and Google) may need Nvidia less, as enterprises move their AI workloads to these clouds and port their apps to these chips.

Read full article at TechCrunch AI →

#nvidia #google