Google using self-designed AI processors

Specialized technology ramps up performance of its software, says Urs Hölzle.

Google has begun to use computer processors its engineers designed to increase the performance of the company’s artificial intelligence software, potentially threatening the businesses of traditional chip suppliers such as Intel Corp. and Nvidia Corp.

During the past year, Google has deployed “thousands” of these specialized artificial intelligence chips, called Tensor Processing Units (TPUs), in servers within its data centers, Urs Hölzle, the company’s senior vice president of infrastructure, said Wednesday at the company’s developer conference. Google declined to specify precisely how many of the chips it is using, but stressed that it continues to use many conventional central processing units and graphics processing units made by other companies.

“If you use cloud voice recognition, then it goes to TPU. If you use Android voice recognition, then it goes to TPUs,” Hölzle said. “It’s been in pretty widespread use for about a year.”

Google has no plans to sell the specialized chips to third parties, said Diane Greene, Google’s senior vice president of cloud services.

Google and other big data center operators are the largest consumers of server processors, the main engine of growth and profit for Intel, the world’s largest manufacturer of semiconductors. Graphics-chip maker Nvidia is also pinning much of its future growth on the bet that its chips will play a larger role in data processing, including artificial intelligence and machine learning.

Google’s chip connects to computer servers via the PCI Express (PCIe) interface, which means it can be slotted into the company’s existing machines, rapidly augmenting them with faster artificial intelligence capabilities. It represents Google’s first attempt at designing specialized hardware for its AI workloads, Hölzle said.
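The article describes software routing AI workloads (such as voice recognition) to an attached accelerator rather than the CPU. As a purely illustrative sketch of that dispatch pattern, not Google’s actual stack, a toy scheduler might look like this (the `Workload` type, device names, and routing rule are all hypothetical):

```python
# Illustrative sketch only: route AI inference jobs to a specialized
# accelerator card when one is attached, and fall back to the CPU
# otherwise. Names here are invented for the example.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    kind: str  # "inference" for AI tasks, "general" for everything else


def pick_device(workload: Workload, accelerators: list[str]) -> str:
    """Send AI inference to the first available accelerator
    (e.g. a PCIe-attached card); run everything else on the CPU."""
    if workload.kind == "inference" and accelerators:
        return accelerators[0]
    return "cpu"


# Usage: with a hypothetical "tpu0" card attached, inference lands on it.
devices = ["tpu0"]
print(pick_device(Workload("voice-recognition", "inference"), devices))  # tpu0
print(pick_device(Workload("web-serving", "general"), devices))          # cpu
```

The point of the sketch is only that the accelerator is addressable like any other device in the server, so existing machines gain the capability simply by having a card slotted in and the scheduler made aware of it.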

As the field matures, Google “might very well” build more specialized processors for specific AI tasks, he said, adding that, over time, Google expects to design more system-level components.

Even Nvidia, which makes traditional graphics processing units that have been adopted for machine learning, is beginning to add more custom elements to its hardware. “In some sense, a GPU is too general for machine learning,” Hölzle said.

Google wouldn’t disclose which company is manufacturing the specially designed chips.
