Hume AI Unveils Its Empathic Voice Interface, EVI
Hume AI recently announced that it has closed a $50 million Series B round led by EQT Ventures. The company also unveiled its new flagship product, the Empathic Voice Interface (EVI), a conversational AI built with emotional intelligence.
The round drew participation from a number of well-known investors, including Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, and LG Technology Ventures. The funds will be used to expand Hume's team, accelerate its AI research, and further develop EVI.
EVI is a universal voice interface that combines transcription, a cutting-edge large language model (LLM), and text-to-speech in a single API. Under the hood, it pairs the LLM with Hume's expression measures to form what the company calls an empathic large language model (eLLM), which adjusts both its language and its tone of voice based on context and the user's emotional expressions. Developers can integrate EVI into applications with just a few lines of code, and the product is scheduled for public release in April.
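To give a sense of what "a few lines of code" could look like, here is a minimal Python sketch that streams one audio utterance to an EVI-style WebSocket endpoint and collects the spoken reply. The URL, query-string authentication, and message fields (`audio_input`, `audio_output`, `assistant_end`) are assumptions for illustration, not Hume's documented API.

```python
import asyncio
import base64
import json

import websockets  # third-party: pip install websockets

EVI_URL = "wss://api.hume.ai/v0/evi/chat"  # assumed endpoint, for illustration only
API_KEY = "your-api-key"


async def chat(audio_chunk: bytes) -> None:
    # Authenticate via query parameter (assumed scheme; check Hume's docs).
    async with websockets.connect(f"{EVI_URL}?api_key={API_KEY}") as ws:
        # Send one chunk of user audio, base64-encoded (assumed message shape).
        await ws.send(json.dumps({
            "type": "audio_input",
            "data": base64.b64encode(audio_chunk).decode(),
        }))
        # Stream back events until the assistant finishes its spoken turn.
        async for raw in ws:
            event = json.loads(raw)
            if event.get("type") == "audio_output":
                save_reply(base64.b64decode(event["data"]))
            elif event.get("type") == "assistant_end":
                break


def save_reply(audio: bytes) -> None:
    # Placeholder for playback: append the assistant's audio to a file.
    with open("reply_audio", "ab") as f:
        f.write(audio)


with open("utterance.wav", "rb") as f:
    asyncio.run(chat(f.read()))
```

The single-socket design matters here: because transcription, the eLLM, and speech synthesis live behind one connection, the client only ships audio in and plays audio out.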
Hume's research in emotional AI is grounded in Semantic Space Theory (SST), a data-driven framework for understanding emotion. Through large-scale data collection and statistical modeling, SST maps the full spectrum of human emotions, showing that emotional experience is high-dimensional and continuous rather than a small set of discrete categories. This theoretical foundation guides how Hume trains its models and designs its products.
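As a toy illustration of the kind of question SST asks, the sketch below decomposes a matrix of emotion ratings and counts how many dimensions are needed to explain most of its variance. Random numbers stand in for the large-scale human judgments Hume actually collects, and the 48 categories are an arbitrary choice, so this is a methodological sketch only, not a reproduction of Hume's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
ratings = rng.random((1000, 48))  # 1000 expressions rated on 48 emotion categories

# Singular value decomposition of the mean-centered rating matrix.
centered = ratings - ratings.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)

# An SST-style question: can a handful of dimensions capture the data,
# or does emotional meaning remain high-dimensional?
dims_needed = int(np.searchsorted(explained, 0.95)) + 1
print(f"dimensions needed for 95% of variance: {dims_needed}")
```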
EVI ships with several advanced conversational features. It uses the user's tone of voice for state-of-the-art end-of-turn detection, so it knows when to respond and avoids awkward overlaps. It is also interruptible: when the user talks over it, it stops speaking and starts listening, as in natural human conversation. And it responds to prosody, picking up the natural fluctuations in tone that convey meaning beyond words and generating appropriately expressive intonation in reply.
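On the client side, interruptibility largely comes down to cutting off playback the instant the server signals that the user has started speaking. The sketch below shows one way a client might wire that up; the event names are illustrative, not Hume's actual message schema.

```python
import threading


class InterruptiblePlayer:
    """Toy client-side player that can be cut off mid-utterance."""

    def __init__(self) -> None:
        self._muted = threading.Event()

    def play(self, chunk: bytes) -> None:
        if not self._muted.is_set():
            pass  # hand the chunk to the audio device here

    def interrupt(self) -> None:
        self._muted.set()  # drop any remaining assistant audio

    def resume(self) -> None:
        self._muted.clear()


player = InterruptiblePlayer()


def handle(event: dict) -> None:
    # Dispatch on illustrative event types; real EVI messages will differ.
    kind = event["type"]
    if kind == "audio_output":
        player.play(event["data"])
    elif kind == "user_interruption":  # server heard the user talking over EVI
        player.interrupt()             # stop speaking, go back to listening
    elif kind == "assistant_start":
        player.resume()                # a new assistant turn begins
```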
EVI is designed to optimize for user happiness and satisfaction, learning continuously from user feedback. This ongoing coordination with users makes EVI a nearly human-like conversationalist that better understands and meets user needs.
Hume's technology has broad potential applications in healthcare, customer service, productivity tools, and other fields. For example, the Icahn School of Medicine at Mount Sinai is using Hume's expression AI models to monitor the mental health of patients undergoing experimental deep brain stimulation therapy, and the productivity chatbot Dot uses Hume's technology to provide context-aware emotional support to its users.
This announcement comes at a time of remarkable growth for Hume. In the past year, the company has launched two key products: the Expression Measurement API, an advanced toolkit for measuring human emotional expression, and Custom Models, which uses transfer learning from those measurements to predict human preferences. Hume has also expanded its foundational dataset to include naturalistic data from more than one million participants, doubled its workforce from 15 to 30 employees, and published multiple papers in top-tier journals.
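For a rough picture of how a measurement-style API is typically consumed, here is a hypothetical Python call that submits a media file for vocal-expression analysis. The endpoint path, header name, and payload fields are assumptions for illustration; only Hume's documentation defines the real interface.

```python
import requests  # third-party: pip install requests

API_KEY = "your-api-key"

# Hypothetical request shape for an expression-measurement job; the real
# endpoint, payload, and response schema are defined by Hume's API docs.
resp = requests.post(
    "https://api.hume.ai/v0/batch/jobs",           # assumed endpoint
    headers={"X-Hume-Api-Key": API_KEY},
    json={
        "urls": ["https://example.com/clip.mp4"],  # media to analyze
        "models": {"prosody": {}},                 # measure vocal expression
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # e.g. a job ID to poll for per-utterance expression scores
```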
Alan Cowen, CEO and Chief Scientist of Hume, believes that empathic AI is essential to aligning artificial intelligence with human well-being. He stated, "By building AI that learns directly from proxies of human happiness, we are effectively teaching it to reconstruct human preferences from first principles and then update that knowledge with every new person it talks to and every new application it is embedded in."
Hume's technology is compelling. As the company continues to push the boundaries of machine emotional intelligence, it has the potential to redefine how we interact with technology, paving the way for a more intuitive, empathetic, and ultimately human-centered AI experience. Hume's demo is worth trying to get a feel for the interactive possibilities that emotional intelligence opens up.