Industrious YouTuber PatchBOTS has built himself a new L3-37 robot, and it may be the best example yet of just how versatile modern A.I. SDKs, and Amazon's Alexa Voice Service (AVS) in particular, can be. For those who haven't seen the new Solo: A Star Wars Story, the robot in question is a partial buildout of a key character from the film. Specifically, PatchBOTS used the AVS Device SDK, which allows for a custom wake word and a custom initial response from Alexa. Aside from those two differences, the functionality is exactly what it would be on a more traditional Amazon Echo device. However, its installation in a fairly straightforward robotic head-and-neck assembly makes for a unique and relatively convincing experience, even if the voice and attitude are a bit off compared to the movie.
The build itself is composed entirely of custom-designed 3D-printed parts that act as mounting surfaces for the components, the LED lights that make up the eyes, and all the connected wiring. Those parts are painted and assembled to resemble L3-37, including some extra non-functional wiring and wiring tubes. Various miscellaneous parts are used as well, such as a custom-cut welding mask that gives the areas which would be glass a shiny, glass-like appearance. An oversized servo at the top allows the head to tilt back while Alexa responds to queries. Setting all of that aside, though, the real magic is in the use of a Raspberry Pi 3 and a small speaker to give the robot a voice. The AVS Device SDK effectively allows for full customization of Alexa in a hardware-independent build. That means that instead of using an Amazon Echo Dot, PatchBOTS could use a custom board and chipset, set the wake word to "L3," and have the assistant reply with L3's dry "What?" It also enables additional functionality, such as control of servos and other non-Alexa hardware, on top of the IoT functionality built into Alexa itself.
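The head-tilt trick is the kind of thing a Raspberry Pi handles easily: hobby servos are driven by a PWM signal, and the core of the job is just mapping a tilt angle to a duty cycle. Below is a minimal sketch of that mapping; the 50 Hz frame, the 1.0–2.0 ms pulse range, and GPIO pin 18 are typical hobby-servo assumptions, not details taken from PatchBOTS' actual build.

```python
# Sketch: converting a head-tilt angle into a hobby-servo PWM duty cycle.
# Assumes a standard servo: 50 Hz frame, 1.0 ms pulse = 0 deg, 2.0 ms = 180 deg.

FRAME_MS = 20.0  # one 50 Hz servo frame, in milliseconds


def angle_to_duty_cycle(angle_deg, min_pulse_ms=1.0, max_pulse_ms=2.0):
    """Convert a servo angle (0-180 degrees) to a PWM duty-cycle percentage."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("servo angle must be within 0-180 degrees")
    # Linearly interpolate the pulse width, then express it as a
    # percentage of the full frame.
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180.0
    return pulse_ms / FRAME_MS * 100.0


# On an actual Raspberry Pi, the duty cycle would feed a software-PWM
# channel, e.g. with the RPi.GPIO library (pin 18 is a hypothetical choice):
#   pwm = GPIO.PWM(18, 50)
#   pwm.start(angle_to_duty_cycle(90))
# Here we just print the computed duty cycles for a few angles:
for angle in (0, 90, 180):
    print(angle, angle_to_duty_cycle(angle))
```

Tilting the head back while Alexa speaks would then be a matter of ramping the angle up when the response audio starts and back down when it ends.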
The result is impressive but, more than anything else, it shows that there's quite a bit that could be built with Amazon's AVS Device SDK. The robot in the video simply tilts to "look" at the user while otherwise responding just as any other Alexa device would. However, it should be entirely possible to take the build quite a bit further, using PatchBOTS' uploaded video and assets as a base. That's only going to become more common as more companies build fully mobile robots that could be torn down and modified with builds similar to the one presented in the video below.