
Tech Talk: Amazon Is Showing AWS Devs The Power Of Alexa

Once a year, Amazon holds an exclusive developer summit for their Amazon Web Services platform. Amazon Web Services is the cloud service of choice for a huge number of businesses, which has put it in the number one spot among cloud providers and made it a serious thorn in the sides of Google and Microsoft, both of which are trying hard to break up Amazon’s stranglehold on the infrastructure-as-a-service business. The sheer scale of Amazon Web Services’ user base makes the annual AWS re:Invent conference a hot destination for developers. Working with web development and IaaS in any capacity isn’t terribly far off from developing for an IoT and AI platform like Alexa, so Amazon is leveraging the conference’s huge attendance this year to push their nascent AI platform.

Amazon’s Alexa is a full-fledged platform in every sense of the word. Bots, which may soon replace traditional apps, provide services alongside individual apps made for Alexa and the backbone AI itself. Armed with machine learning, Alexa is set to become one of the top dogs of the consumer-facing AI world in the near future, with Amazon’s smart speaker and IoT hub, the Echo, as the centerpiece of it all. While Alexa may end up outgunned in raw power and R&D man-hours by Google’s Assistant, the real battle will come down to two fronts: users and developers. Having a large number of users on board gives the AI more nodes to work with for machine learning, letting it gain experience at a faster pace than it otherwise would. Having a large number of third-party developers on board, likewise, provides more capabilities, more experience, and more features to bait consumers into dropping their dollars on systems featuring the AI.

With Google Home right around the corner and the backbone Assistant AI soon to make its way to Android phones running 7.0 Nougat and up, Assistant’s ubiquity is unquestionable even before release. Amazon is hoping to counter that with a slew of third-party developers extending Alexa’s capabilities, which is why they’re holding an entire session track for Alexa at this year’s AWS re:Invent conference. Not only will existing AI and bot developers have a chance to get an in-depth look at Alexa; curiosity may also drive some AWS system admins and developers to check it out, and some may even see dollar signs or an easy path to fruition for an idea they’ve been nurturing for a while. Amazon is counting on this happening in massive numbers in order to compete with Google’s install base for Assistant. Amazon also allows Alexa to ship on other companies’ hardware, so that, too, may drive Alexa toward the same kind of ubiquity that Assistant is already bound to enjoy upon release. With word on the street being that Apple is working on their own high-powered AI personal assistant with machine learning tech at its heart, Amazon is racing against the clock to turn the Alexa name from a tech-sector term into a household name.

Developers at the summit who decide to check out the Alexa session track will find a fairly comprehensive crash course covering most of what they need to know to begin developing for Alexa, along with how to use and implement a number of key features. Developers can learn via examples, like a voice-activated banking experience built on Alexa by Capital One, or about topics like voice-enabled cars, smart homes, and testing and QA for Alexa-based products. The wide range of situations, examples, and fundamentals on offer to any summit attendee should be more than enough to entice a good number of developers who would otherwise never have touched Alexa, and will likely spur those who were already curious about Alexa into action to develop for the platform. This is likely the most prudent path Amazon could have chosen to keep Alexa competitive, since Apple’s platform is still supposedly in early development, and Google Home and Assistant will be fully opened up at launch, with the usual set of developer advocacy tools that come with a major Google product that welcomes third-party development.
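For AWS developers in the audience, the barrier to entry is low: a custom Alexa skill is essentially a web service, often an AWS Lambda function, that receives Alexa’s JSON request envelope and returns a spoken response. Below is a minimal, hypothetical sketch of such a handler in Python; the intent name GetBalanceIntent and the reply text are illustrative stand-ins (a nod to the banking example above), not part of any real skill.

```python
# Minimal sketch of an Alexa custom-skill handler running on AWS Lambda.
# The envelope shapes follow the Alexa Skills Kit request/response JSON format;
# the intent name "GetBalanceIntent" and the reply text are hypothetical examples.

def build_speech_response(text, end_session=True):
    """Wrap plain text in the Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }


def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda for each Alexa request."""
    request = event.get("request", {})
    request_type = request.get("type")

    if request_type == "LaunchRequest":
        # The user opened the skill without asking for anything specific yet.
        return build_speech_response(
            "Welcome. You can ask me for your account balance.", end_session=False
        )

    if request_type == "IntentRequest":
        intent_name = request.get("intent", {}).get("name")
        if intent_name == "GetBalanceIntent":
            # A real skill would call a backend service here; this is a stub.
            return build_speech_response("Your balance is one hundred dollars.")

    # Fallback for session-ended events or unrecognized intents.
    return build_speech_response("Sorry, I didn't catch that. Goodbye.")
```

In an actual deployment, the skill’s interaction model (intents and sample utterances) would be defined separately in the Alexa developer console, with the Lambda function wired up as the skill’s endpoint.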