Amazon Developing an AI-Powered Wearable that Reads Human Emotions

Tech giant Amazon is already a force to be reckoned with when it comes to innovation, so the recent behind-the-scenes news about the potential development of a wrist-worn, AI-powered device that can discern the wearer's emotional state from the sound of their voice, while striking in its nature and ambition, comes as no surprise given the company's ongoing efforts to revolutionize the tech market.

  • Internally, Amazon is said to refer to the device reportedly in development as a health and wellness product, and to use the codename Dylan for it.

It might sound like something pulled straight out of a sci-fi story written by Isaac Asimov, but so did Echo and Alexa a decade ago.

Let's face it, no one would have guessed that we would have Alexa and voice-command our way through a plethora of household chores and actions. Yet we are now at a point where there are more than enough Alexa-compatible gadgets on the market to make our lives easier, and Amazon has inspired manufacturers to produce new versions of their products that work with the assistant, since those versions sell at a much higher rate thanks to their usability.

The same goes for this apparent project: if it is released to consumers, works with Alexa, and provides some ingenious benefits, it is bound to appeal to the modern market.

As hard to achieve as the project sounds, Amazon has put two teams with proven track records on the task, so there is a good chance that what seems almost impossible right now will actually be achieved. The entities working towards developing Dylan are Lab126 and the Alexa software team.

Lab126 is the department we have to thank for the development of the Echo, the Kindle, and the lesser-appreciated Fire Phone. Reportedly, the device currently under development is equipped with two microphones that, with the help of the accompanying software, tune in on the user's voice to pick up their emotional state.

  • Lab126 is also reportedly developing a home robot known internally by the codename Vesta. Allegedly, the robot will act as a "mobile Alexa", following users around the house. This way, even when you are not able to speak directly to an Echo speaker, your command will still come to fruition, as the robot will be by your side to pick it up. If developed the right way, the robot will map your home and understand its layout as a result of following you around, so it won't only let you start the cleaning cycle of the robotic pool cleaner while you are tanning on the sunbed away from your Echo speaker; it will also be able to turn on the lights in the kitchen, for example, as it will be aware of the house's structure.

On the other hand, the Alexa software team is most probably drawing on part of its AI research into adversarial training for the development of the new wearable, and it's more than likely that once the project is finished, Amazon will implement the technology in its other devices as well to expand their usability.
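Neither report explains how the emotion detection would work under the hood. In broad strokes, systems of this kind typically extract a handful of acoustic features (such as pitch and loudness) from short audio frames and pass them to a trained classifier. The Python sketch below is purely illustrative of that general idea; the feature set, the thresholds, and the classify_emotion helper are hypothetical stand-ins, not Amazon's actual pipeline.

```python
import numpy as np

SAMPLE_RATE = 16_000  # assumed microphone sample rate


def frame_energy(frame: np.ndarray) -> float:
    """Root-mean-square energy of one audio frame (a rough loudness cue)."""
    return float(np.sqrt(np.mean(frame ** 2)))


def estimate_pitch(frame: np.ndarray, sr: int = SAMPLE_RATE) -> float:
    """Crude pitch estimate via autocorrelation (illustrative only)."""
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    min_lag = sr // 500  # ignore very short lags to skip the zero-lag peak
    lag = int(np.argmax(corr[min_lag:])) + min_lag
    return sr / lag if lag > 0 else 0.0


def classify_emotion(features: dict) -> str:
    """Placeholder for a trained model; a real system would use ML here."""
    if features["energy"] > 0.1 and features["pitch"] > 220:
        return "agitated"
    if features["energy"] < 0.02:
        return "tired"
    return "neutral"


def analyse(audio: np.ndarray) -> str:
    """Split audio into 25 ms frames, average the features, then classify."""
    frame_len = int(0.025 * SAMPLE_RATE)
    frames = [audio[i:i + frame_len]
              for i in range(0, len(audio) - frame_len, frame_len)]
    features = {
        "energy": float(np.mean([frame_energy(f) for f in frames])),
        "pitch": float(np.mean([estimate_pitch(f) for f in frames])),
    }
    return classify_emotion(features)


if __name__ == "__main__":
    # A synthetic one-second 240 Hz tone stands in for a voice recording.
    t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
    fake_voice = 0.2 * np.sin(2 * np.pi * 240 * t)
    print(analyse(fake_voice))
```

In practice, the hard part is the classifier itself, which would have to be trained on large amounts of labeled speech rather than the hand-written rules shown here.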

How would the wearable aid users?

At this point, the project feels a bit far-fetched, it is unclear whether it will ever become a commercial device, and we are left mostly with speculation, but there are countless ways Dylan could change the world. Let's go over some 'what if?' scenarios that could become reality if the device is developed and released for public use:

  • As it recognizes your emotional state through voice analysis, Dylan could give you advice or guidance to help you work more successfully with others, acting like a counselor on human interaction or a rudimentary psychologist.

  • It could even help expand the ecosystem of Alexa-based devices: based on the emotions you are experiencing at a given moment, it could suggest a product that helps in your situation.

Thoughts on expanding Alexa's skill set

As mentioned above, the device could extend the current ecosystem of Alexa-based gadgets. Seeing how Amazon filed a patent in 2017 that describes a framework using software analysis of vocal patterns to determine a person's feelings, it's easy to see how the technology could be used if released to the public.

  • The patent includes a diagram depicting a woman saying she is hungry while sniffling; an Echo smart speaker detects her ailment, asks if she needs a recipe for chicken soup, and later suggests that she order cough drops to ease her symptoms. The patent also notes that the algorithms take other factors into consideration before making any suggestions, including age, gender, and ethnicity.
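The patent describes this flow only at a high level, but the decision logic it illustrates might look roughly like the toy Python sketch below. The VoiceSignal and UserProfile types, their fields, and the suggestion rules are hypothetical stand-ins for illustration, not the patented implementation.

```python
from dataclasses import dataclass


@dataclass
class VoiceSignal:
    transcript: str          # what the user said, e.g. "I'm hungry"
    detected_condition: str  # condition inferred from vocal cues, e.g. "cold"


@dataclass
class UserProfile:
    age: int  # the patent reportedly also weighs factors such as gender and ethnicity


def suggest(signal: VoiceSignal, profile: UserProfile) -> list[str]:
    """Turn a detected condition plus a spoken request into Alexa-style suggestions."""
    suggestions = []
    if signal.detected_condition == "cold":
        if "hungry" in signal.transcript.lower():
            suggestions.append("Would you like a recipe for chicken soup?")
        if profile.age >= 12:  # arbitrary illustrative check: profile factors gate suggestions
            suggestions.append("Would you like to order cough drops?")
    return suggestions


if __name__ == "__main__":
    signal = VoiceSignal(transcript="I'm hungry", detected_condition="cold")
    for line in suggest(signal, UserProfile(age=34)):
        print(line)
```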


With two major projects apparently under development at the moment, namely Dylan and Vesta, Amazon seems eager to make smart homes even smarter, exploring new opportunities for a future where our lives are simplified as much as possible by the electronics in our homes, and even one where our mental health needs are somewhat catered to by the gadgets and devices we use.
