
Tech Talk: How Can Google Assistant Continue To Improve?

Google has poured substantial resources into A.I. innovations and there’s still a lot more to come, especially with regards to Google Assistant. That shouldn’t come as a shock considering how repeatedly the company’s CEO, Sundar Pichai, has emphasized the technology. In fact, app teardowns over the past several months show that Google is more focused on Assistant than ever. Central themes of those teardowns have included deeper integration with various hardware and software ecosystems, new functionality, and the possibility of new A.I.-enhanced hardware. Better still, with Google I/O just around the corner, the expected improvements and changes shouldn’t be too far off. Of course, the company will almost certainly reveal new ideas that haven’t yet been spotted or have been kept under wraps. In the meantime, there’s already quite a bit to look forward to.

To begin with, as mentioned above, there seems to be an overall drive to make Google Assistant a more integral part of platforms other than Google’s Home smart speakers. The A.I. in those is essentially as fully integrated as it can be. That’s not to say the company isn’t still improving those devices; a more natural-sounding voice, better incorporation of smart home accessories and devices, and more are all areas where they could be better. However, the virtual helper has yet to become quite so ingrained into Android or Chromebooks. That may be the one area where Assistant lags behind Apple’s Siri, which interacts quite well with most first- and third-party apps on iOS devices. With Chrome OS steadily gaining support for Android apps and even for Google Assistant, now is the time for Google to buckle down and make those deeper interactions possible. Moreover, given the company’s more recently expressed interest in optimizations for low-spec devices, that integration could include device-management features to help a given smartphone or laptop perform better and last longer. That’s a tactic Huawei’s HiSilicon has taken with its own A.I., particularly on the Kirin 970 SoC, so it isn’t entirely far-fetched.

Aside from those more general integration improvements, several teardowns across a number of Google apps have suggested that Google may be working on a screen-equipped device to rival Amazon’s own screen-equipped offerings. If any of those devices include a camera, it is arguably time for Google to get creative with how its Assistant sees the world. Of course, that includes improvements to Lens, which have also been spotted in teardowns. The feature has shown up in a few Google apps for some users but is not yet widespread or included as part of Assistant itself. Setting aside the fact that Google really needs to incorporate Lens into Assistant, the company could take things quite a bit further. It isn’t difficult to imagine how it could accomplish that. For example, Amazon sells a dedicated Alexa-branded device that can offer up fashion advice. For Google to mimic that capability, and perhaps build on it, would be a substantial step forward for its A.I. helper. Better still, the company could make that feature and features like it available on every Assistant-enabled device with a camera. If included on Android, it would hold immediate appeal over Amazon’s implementation simply because it wouldn’t require the purchase of a secondary device. It goes without saying that Google has not so much as hinted at those kinds of capabilities, but something along those lines is certainly possible.

Finally, there are more obvious improvements likely to arrive in the near future as Google continues to push its Assistant to a more global audience. The addition of even more spoken and recognized languages jumps immediately to mind, but that could be accompanied by improvements to the speed and usability of translation features. Beyond that, the company could add more natural-sounding Assistant voices to give each user a bit more room for personalization. In fact, something along those lines is already on the way, if code found in teardowns is to be believed. Namely, the company appears to be on the verge of letting users set custom hotwords. For those who aren’t aware, hotwords are the spoken phrases used to wake an Assistant-enabled device and interact with the A.I. itself. For now, the only options are “Hey Google” and “Okay Google.” Being able to set custom words or phrases would be a huge improvement for more tech-savvy users who want to get more out of their devices. However, the company could also include a wider selection of preset hotwords, preferably as a secondary option alongside custom ones, for users who don’t want to invest that much time or effort.

Unfortunately, there’s no way to know whether any of this will require Assistant to be implemented as a core component of Android itself. Regardless of whether that turns out to be the case, Google is likely to put these improvements front and center, either as part of or tied to the announcement of a new version of the OS, so at least some of them will probably be shown at the upcoming Google I/O 2018 event. On the other hand, Google may opt to only tease some of the smaller features, holding their full unveiling back to keep interest up between major updates. As mentioned above, there will almost certainly be other improvements over time, whether that means more general optimizations or major new features. In any case, the developer-focused conference takes place in early May, so nobody should have to wait too long to find out.