Machine learning and artificial intelligence are finding their way into a staggering number of products and services these days, some more conventionally useful than others. You would be hard-pressed to find a tech company that isn’t involved with AI in some form, whether by using somebody else’s AI-powered product or by developing its own solutions. This push is not a fad or just another random happening in the tech world; it has been long in the making, but the technology simply wasn’t up to snuff until now. Artificial neural networks and other machine learning methods have been researched since at least the 1970s, though they spent years on the shelf once researchers recognized that the hardware of the day wasn’t ready for them. Now it is, and the race is in full swing to find the next great AI platform for the services of tomorrow to run on. That AI platform may just be able to run from your pocket.
A team of researchers from MIT has created what is essentially a small neural network on a chip, and they’ve done so in such a way that it requires one-tenth the power of the average mobile GPU. With this special chip, nicknamed Eyeriss, the average smartphone no longer has to reach out to the cloud for basic machine learning tasks. The chip has 168 cores, each of which not only has its own memory but can also communicate directly with the cores adjacent to it. This means data can pass from one core to the next without going through a central bus, shaving precious time and energy off of every calculation and essentially allowing the Eyeriss chip to adapt itself to the task at hand, shifting its dataflow on the fly. Rather than the fixed pipelines found in most types of processors, the cores can form their own data pipelines in a sort of “grapevine” system.
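To make the “grapevine” idea concrete, here is a toy sketch of cores that each hold an operand in local memory and hand their partial result directly to an adjacent core, forming an ad-hoc pipeline with no central bus in the path. This is purely illustrative and assumes a simple multiply chain; the real Eyeriss dataflow is far more sophisticated than this.

```python
# Toy model: a chain of cores, each with its own local memory,
# passing partial results directly to a neighboring core.
# Illustrative only -- not the actual Eyeriss architecture.

class Core:
    def __init__(self, core_id, weight):
        self.core_id = core_id
        self.local_memory = weight  # each core stores its own operand locally
        self.neighbor = None        # direct link to the adjacent core

    def process(self, value):
        # Compute against the locally stored operand, then hand the
        # running result straight to the neighbor (no central bus).
        result = value * self.local_memory
        if self.neighbor is not None:
            return self.neighbor.process(result)
        return result

def build_pipeline(weights):
    """Wire up cores into a pipeline via core-to-core links."""
    cores = [Core(i, w) for i, w in enumerate(weights)]
    for left, right in zip(cores, cores[1:]):
        left.neighbor = right
    return cores[0]  # feed data into the head of the pipeline

head = build_pipeline([2, 3, 5])
print(head.process(1))  # 1 * 2 * 3 * 5 = 30
```

Because the links are set up in software here, the same set of cores could be rewired into a different pipeline for a different workload, which is the spirit of the reconfigurability described above.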
Armed with this new chip, the smartphones of tomorrow would not only perform their daily functions a bit better, but could also undertake AI and deep learning tasks that they would normally have to outsource. While a smartphone with an Eyeriss chip inside would certainly be no match for the likes of Google Brain, more basic tasks, such as keeping track of a user’s interests, schedule, and usage patterns to better optimize the mobile experience, would be a piece of cake, and would make for a vastly different and much better user experience.
Sort through the daily usage scenarios of average users, and the various uses for onboard AI become quite clear. A user browsing the Play Store for a new game to play or a new launcher app to tinker with could get suggestions based on the apps they already have installed. The chip could look at things like those apps’ sizes, their feature sets as revealed in the code, the user’s usage statistics, and online opinions, and recommend something the user might like based on that. For example, somebody who plays an MMORPG like Avabel Online for a few minutes each day while sinking hours into competitive shooters like Shadowgun Deadzone might see a recommendation for something like Bounty Hunter: Black Dawn, an open-world shooter with a storyline, heavy RPG elements, and online competitive play, in the same vein as Borderlands.

Another user might find themselves roaming downtown looking for a new bar for the night. They ask the AI, “Find me a good bar I haven’t been to yet.” From there, the AI could look over bank statements to see how often the user visits certain bars and how much they spend there, then check reviews of those bars for keywords like “good ambiance” or “good selection of craft beer,” and recommend a new bar in the immediate area based on what it finds. Right now, all of these computations take place on a distant server and get beamed back to the phone, which makes it harder to integrate other data and apps on the device, and harder to manage user data. Onboard AI stands to revolutionize personal computing devices, especially smartphones, and it looks as though Eyeriss will be leading the charge in the near future.
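The bar-finding scenario boils down to a simple content-based recommendation loop: infer what the user likes from where they already go, then score unvisited places against that profile. The sketch below shows one minimal way to do that on-device; every bar name, keyword, and visit count here is invented for illustration, and a real assistant would draw on far richer signals.

```python
# Hypothetical on-device recommendation sketch: weight review keywords
# by how often the user visits each bar, then score unvisited bars
# against that taste profile. All data below is made up.

from collections import Counter

def build_taste_profile(visits, reviews):
    """visits: {bar: visit_count}; reviews: {bar: [keywords]}."""
    profile = Counter()
    for bar, count in visits.items():
        for keyword in reviews.get(bar, []):
            profile[keyword] += count  # frequent haunts weigh more
    return profile

def recommend(candidates, reviews, profile, visited):
    """Return the best-scoring bar the user has not been to yet."""
    def score(bar):
        return sum(profile[kw] for kw in reviews.get(bar, []))
    new_bars = [bar for bar in candidates if bar not in visited]
    return max(new_bars, key=score)

visits = {"Hop House": 5, "Vinyl Lounge": 2}
reviews = {
    "Hop House": ["craft beer", "good ambiance"],
    "Vinyl Lounge": ["live music", "good ambiance"],
    "Barrel & Oak": ["craft beer", "good ambiance"],
    "Neon Room": ["cocktails", "dance floor"],
}

profile = build_taste_profile(visits, reviews)
print(recommend(reviews.keys(), reviews, profile, visits))  # Barrel & Oak
```

The point of running this locally is that `visits` could come straight from on-device payment or location history without ever leaving the phone, which is exactly the data-management advantage onboard AI offers over a cloud round trip.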