Google Has Its Very Own Chip For Artificial Intelligence


Google has been testing out a custom chip designed for artificial intelligence and machine learning that will work with TensorFlow

Google has grand ambitions for artificial intelligence (AI). A large chunk of the web giant’s future business will rely on machine learning to power its range of upcoming services and products like self-driving cars, AI chatbots, and virtual reality devices.

But AI requires powerful computing. While no stranger to building its own hardware, Google has traditionally bought components, such as chips, from established players like Intel.

This relationship took a twist this week when the company revealed that it has indeed been making its very own long-rumoured chips, designed specifically for machine learning.

TPU

A custom ASIC (application-specific integrated circuit), called a ‘Tensor Processing Unit’ (TPU), is now part of Google’s armoury, with the Alphabet subsidiary admitting it has in fact been using the chips in various applications for a year.

Software works best when running on hardware built for it, and that, Google said, is why it made the decision to design its own chip.

The entire project, which Google has kept under wraps for a few years, essentially gives the company its own power for machine learning applications. The chip is tailored for TensorFlow, Google’s own open source machine learning software library, and has been up and running in Google data centres since 2015.

“We’ve been running TPUs inside our data centres for more than a year, and have found them to deliver an order of magnitude better-optimised performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law),” said Google’s Norm Jouppi this week.

“TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation,” he added.
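The reduced-precision idea Jouppi describes can be illustrated with simple linear quantization: mapping 32-bit floating-point values onto 8-bit integers, which take far less silicon to multiply and add. The TPU’s actual number format is not detailed in the article, so the NumPy sketch below is purely illustrative, and the function names are hypothetical:

```python
import numpy as np

def quantize_int8(weights):
    """Map float32 values onto signed 8-bit integers with a linear scale.

    Illustrative only -- the TPU's real quantization scheme is not
    described in the article.
    """
    scale = np.abs(weights).max() / 127.0   # largest value maps to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 values from the 8-bit representation."""
    return q.astype(np.float32) * scale

weights = np.array([0.42, -1.3, 0.07, 0.99], dtype=np.float32)
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# restored approximates weights to within half a quantization step,
# while each stored value now occupies 8 bits instead of 32.
```

The trade-off is exactly the one Jouppi names: each value carries less precision, but each arithmetic operation needs fewer transistors, so more operations fit on the same chip.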


Data Centre Use

“Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models and apply these models more quickly, so users get more intelligent results more rapidly. A board with a TPU fits into a hard disk drive slot in our data centre racks.”

TPUs have already been tested in Google applications such as RankBrain, which is used to improve the relevancy of search results. Street View has also been drawing on the TPUs to boost the accuracy of Google Maps.

But perhaps the highest-profile case of Google’s stealthy use of the chips is AlphaGo’s recent victory over world champion Lee Sedol at the ancient Chinese board game Go.

“Our goal is to lead the industry on machine learning and make that innovation available to our customers,” Jouppi said.

“Building TPUs into our infrastructure stack will allow us to bring the power of Google to developers across software like TensorFlow and Cloud Machine Learning with advanced acceleration capabilities. Machine Learning is transforming how developers build intelligent applications that benefit customers and consumers, and we’re excited to see the possibilities come to life.”

But questions remain over the chip, which was unveiled at Google’s annual I/O conference. It is not yet clear whether the chips will work exclusively with TensorFlow or will be open to other machine learning libraries. Google also did not reveal where the TPUs are manufactured; since the company has no chip factory of its own, the job is likely outsourced to a specialist such as GlobalFoundries.
