Empath AI Technology Recognizes User Emotions in Real-Time

Amazon has been pushing the boundaries of artificial intelligence (AI) with Alexa, but some capabilities remain out of reach, such as enabling Alexa to determine a user’s emotions by analyzing the tone of their voice. That barrier now appears to have been broken by Empath, a Tokyo-based company that uses AI to detect emotion in a user’s voice.

Empath

As previously mentioned, Empath is based in Tokyo and was founded in 2017. The company’s algorithms are trained on tens of thousands of voice samples provided by Smartmedical Corp., a Japanese health-tech company. Despite a staff of only 20 people, Empath has built an innovative platform dubbed Emotion AI.

Emotion AI automatically detects one of four emotions in a user’s voice: joy, anger, calmness, or sorrow. More impressively, it does this by analyzing real-time speech in any language, even in high-noise environments such as a busy market.
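
To illustrate how output from a classifier like this might be consumed (a minimal sketch; the score format and values below are assumptions, not Empath’s documented response), the four emotions could be returned as numeric scores from which the dominant one is picked:

```python
# Hypothetical per-emotion scores for one utterance (illustrative values only,
# not Empath's documented output format).
scores = {"joy": 12, "anger": 3, "calmness": 30, "sorrow": 5}

# Treat the highest-scoring emotion as the dominant one for this utterance.
dominant = max(scores, key=scores.get)
print(f"Dominant emotion: {dominant}")  # -> "calmness" for these sample values
```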

The API is Supported Across Windows, iOS and Android

Empath has announced that the Web Empath API is now supported across Windows, iOS, and Android, which means app developers can use the API to give their apps the ability to detect a user’s emotions. It may therefore not be long before Empath’s AI technology shows up in some of our favorite apps.
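
As a rough sketch of how an app might integrate a speech-emotion API of this kind over HTTP (the endpoint URL, parameter names, and response fields below are assumptions for illustration, not Empath’s confirmed interface), a developer could upload a short voice recording and read back per-emotion scores:

```python
import requests

# Hypothetical endpoint and parameters -- consult Empath's developer
# documentation for the actual Web Empath API interface.
API_URL = "https://api.example.com/analyzeWav"
API_KEY = "your-api-key"

# Send a short WAV recording of the user's voice for analysis.
with open("utterance.wav", "rb") as wav_file:
    response = requests.post(
        API_URL,
        data={"apikey": API_KEY},
        files={"wav": wav_file},
    )
response.raise_for_status()

# Assumed response shape: one numeric score per detectable emotion.
result = response.json()
for emotion in ("joy", "anger", "calmness", "sorrow"):
    print(emotion, result.get(emotion))
```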

Call Centers

The AI technology has already been deployed in multiple call centers. According to Empath, it has reduced supervisor overtime by 20% and boosted sales conversions by up to 400%.