This is what the processor Google created to power its artificial intelligence platform looks like.
Google I/O 2016 is already underway: three days of workshops and talks of all kinds for developers.
This year there is a common theme in virtually everything presented: the importance artificial intelligence will have in our future. Not in the sense of replacing the tasks we humans perform, but of helping us in our day-to-day lives as machines become more and more intelligent, learning on their own.
Tensor Processing Unit, the heart of Google's machine learning
On paper this looks very nice, but hardware is needed to achieve it, and Google has it. This chip, named the “Tensor Processing Unit” (TPU), is an integrated circuit created specifically for machine learning.
In addition, the TPU has been tailored to work with TensorFlow, Google's open-source software library for developing artificial intelligence, whose code anyone can access.
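To give a sense of what TensorFlow code looks like, here is a minimal sketch using the library's Python API; the tensor values are made up for the example:

```python
import tensorflow as tf

# Define two small constant tensors (example values).
a = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])
b = tf.constant([[1.0],
                 [1.0]])

# Matrix multiplication -- the kind of numerical operation
# that hardware like the TPU is built to accelerate.
result = tf.matmul(a, b)
print(result.numpy())  # a 2x1 tensor: [[3.], [7.]]
```

The same program can run unchanged on a CPU, a GPU, or a TPU; TensorFlow maps the operations onto whatever hardware is available.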
The chip is designed for machine learning applications: it performs more operations per second than traditional hardware at the same energy consumption, can prioritize tasks, and learns from failures in order to correct those actions automatically. Most interesting of all, this is not a theoretical project: Google has kept it secret for more than a year while using it in different services and projects. TPUs have been running in Google's data centers and are part of projects like RankBrain and Street View, and are used to improve the accuracy of Maps and navigation.
Google has made it clear it wants to lead the development of machine learning and artificial intelligence. The TPU is another step, and the company will most likely end up licensing the technology to other companies. Did anyone say Skynet?