Google launches the third version of its A.I. chips, an alternative to Nvidia's

  • Google first announced its custom chip effort two years ago.
  • Big groups, or "pods," full of the third-generation chips will be eight times more powerful than pods full of the second-generation chips, which were announced a year ago.
Google CEO Sundar Pichai talks about the company's third-generation artificial intelligence chips.
Source: YouTube screenshot

Google on Tuesday announced that it has developed a third generation of its special chips for artificial intelligence.

The new Tensor Processing Units (TPUs) will help Google improve applications that use artificial intelligence to do things like recognize words people are saying in audio recordings, spot objects in photos and videos, and pick up underlying emotions in written text. As such, the chips represent an alternative to Nvidia's graphics processing units.

And if the new version is anything like its predecessor, it will become accessible to third-party developers through Google's public cloud service, which could help Google compete with Amazon and Microsoft. Earlier this week Microsoft announced the early availability of special chips in its Azure cloud.
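To give a rough sense of what that cloud access looks like for developers, here is a minimal sketch of reaching a Cloud TPU from TensorFlow. The TPU name, the toy model, and the specific API calls are illustrative assumptions about typical usage, not details from Google's announcement.

```python
import tensorflow as tf

# Illustrative sketch: connecting to a Cloud TPU from TensorFlow.
# The TPU name "demo-tpu" and the tiny Keras model are placeholders.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="demo-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    # A model built inside the strategy scope is replicated across the TPU's cores.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```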

Pichai boasted about the vast computing power that's possible when people use large fleets of these third-generation TPUs -- pods, to use Google's term.

"Each of these pods is now eight times more powerful than last year's version -- well over 100 petaflops," he said. For context, a box containing 16 of Nvidia's latest GPUs offers two petaflops of computing power.

The chips are liquid-cooled -- a cooling approach more often found in high-performance computing systems and some performance-oriented consumer PCs.

Last year's version is already showing good results. Test results posted in recent months suggest that the second-generation TPUs could deliver better performance than existing options with GPUs in certain scenarios, although the TPUs do have certain limitations, like lacking support for the Facebook-backed PyTorch AI software framework. The PyTorch open-source community has been working to change that.

Google first announced the TPU initiative in 2016.