
Google's Custom TPUs Push AI 7 Years Into The Future

Google I/O 2016 was, as usual, an exciting event for smartphone users and Android enthusiasts alike, but the yearly event was also a place for Google to discuss its broader business and product lineup. Surprisingly enough, yesterday Google also unveiled a new custom Tensor Processing Unit that the company says it has been using in secret for more than a year. What’s special about the custom TPU is that, according to Google, it is powerful enough to advance machine learning technology seven years into the future, or “three generations of Moore’s Law”.

Google’s Tensor Processing Unit, or TPU for short, appears to be one of the company’s most recent technological advancements in the field of machine learning. The same technology also powered AlphaGo, the computer that beat Go world champion Lee Sedol 4-to-1 in early 2016, in a series of epic matches whose results took the world’s biggest AI innovators by surprise. According to Google’s CEO, Sundar Pichai, the custom TPUs have been running inside the company’s data centers for more than a year, and the company found the technology to “deliver an order of magnitude better-optimized performance per watt for machine learning”. Mr. Pichai added that these results can be compared to the equivalent of “fast-forwarding technology about seven years into the future (three generations of Moore’s Law)”.

The search engine giant didn’t reveal much about the TPU’s hardware, but the CEO added that a TPU is “tailored to machine learning”: thanks to a higher tolerance for reduced computational precision, TPUs require fewer transistors per operation and thus “can squeeze more operations per second into the silicon”, letting Google use and apply “more sophisticated and powerful machine learning models”. The custom TPU is small enough to fit into a hard drive slot, and Google revealed that the technology already powers Street View and RankBrain. According to analyst Patrick Moorhead of Moor Insights & Strategy, Google’s custom TPU isn’t “doing the teaching or learning”, but rather, “it’s doing the production or playback”. He compared the TPU’s role to that of an application-specific integrated circuit (ASIC) alongside a CPU: just as ASICs are highly optimized chips designed to do one thing very well, TPUs likely execute machine learning models that were first created using more power-hungry CPUs and GPUs.
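
To give a sense of what “reduced computational precision” means in practice, here is a minimal sketch. It is not Google’s TPU design, just a toy NumPy illustration under assumed parameters (256-dimensional weights, per-tensor 8-bit quantization): float32 weights and inputs are mapped to 8-bit integers, the matrix multiply runs in integer arithmetic, and the result is rescaled and compared against the full-precision answer.

```python
import numpy as np

# Toy illustration of reduced-precision inference, not Google's hardware.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)
inputs = rng.standard_normal((1, 256)).astype(np.float32)

def quantize(x):
    """Map float32 values onto signed 8-bit integers with a per-tensor scale."""
    scale = np.abs(x).max() / 127.0
    return np.round(x / scale).astype(np.int8), scale

q_w, w_scale = quantize(weights)
q_in, in_scale = quantize(inputs)

# Accumulate in 32-bit integers (as low-precision hardware typically does),
# then rescale the result back into floating point.
int_result = q_in.astype(np.int32) @ q_w.astype(np.int32)
approx = int_result.astype(np.float32) * (w_scale * in_scale)

exact = inputs @ weights
rel_error = np.abs(approx - exact).mean() / np.abs(exact).mean()
print(f"mean relative error of the 8-bit path: {rel_error:.4f}")
```

The 8-bit path lands close to the full-precision result, and that tolerance is the point Pichai alludes to: a trained model’s outputs survive coarser arithmetic, so a chip can spend fewer transistors on each operation and fit more operations per second into the silicon.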