Thanks to the popularity of Amazon’s Alexa virtual assistant and its online shopping hub, the company likely already knows a great deal about you – including your online purchases, the music you listen to, and your address, among other things.
But soon enough, Amazon might be able to discern your emotional state, too. The online retail giant is reportedly working on a new wrist-worn gadget that can recognize human emotions, according to a new report from Bloomberg.
The gadget would work in conjunction with a smartphone app and would include microphones to recognize speech. The software present on the device would be capable of understanding the wearer’s emotional state based on the sound of his or her voice, says the report, citing internal Amazon documents and a person familiar with the situation.
It’s said to be a collaborative effort between Amazon’s Alexa voice software team and Lab126, its consumer gadgets division.
It’s unclear how far along the project is or whether it will ever launch as a consumer product, but Amazon has filed patents for software that analyzes a person’s voice patterns to determine how he or she is feeling, as Bloomberg notes.
When contacted by Business Insider, Amazon said it does not comment on rumors or speculation.
If the product does launch, there’s a chance it could raise additional concerns about consumer privacy, just as the Echo did when it launched in 2014. Privacy woes bubbled up again last month when Bloomberg reported that Amazon employees and contractors manually review and transcribe conversations that Alexa users have with the digital assistant.
Amazon’s reported efforts to help its products understand human emotion would come as technology companies are under heightened scrutiny regarding the information they collect about consumers. Google and Facebook, for example, have both testified before Congress to answer questions about their data collection policies among other topics.
It would also come as rivals are making significant enhancements to their voice assistants, particularly Google. Earlier this month, the search giant unveiled a faster new version of the Google Assistant that’s better at holding a conversation without requiring the trigger phrase to be repeated and can personalize answers based on previous requests.
But getting voice assistants like Alexa or the Google Assistant to understand a user’s emotional state is challenging because of the sheer amount of data that would be required. Amazon would likely need to train its technology using vast amounts of data depicting the human voice in different emotional states to get such a capability to work.
Google, for example, recently announced an effort to help the Google Assistant work better for people living with disabilities that impact speech, which it calls Project Euphonia. Under this initiative, Google has partnered with the ALS Therapy Development Institute and the ALS Residence Initiative to record the voices of people living with the disease to improve the accuracy of its speech recognition technology.
To create a device that uses voice to understand a user’s emotional state, Amazon would likely have to similarly collect a large trove of data.