I had a great conversation with Dave Kemp of Oaktree Products recently about the technology introduced by the hearing aid brands at CES 2019. Like me, Dave is fascinated by smart assistants and the voice first strategy we are beginning to see in technology. He recently wrote an article for Voicebot called "Hearing Aids as a Home for Smart Assistants – The Hearables Market Expands" which detailed the latest changes in hearing aid functionality in this area. You can watch our conversation below.
Voice First
Voice first is basically the switch from physical user inputs to voice inputs: instead of typing a command, you simply say it. Voice recognition has been around for a very long time, but with the introduction of smarter and faster systems it has become exponentially better. Voice first has driven the new wave of smart home assistants like the Amazon Echo and Google Home smart speakers. With the advent of a deeper connection between hearing aids and smartphones, voice first features make a lot of sense in hearing aids.
Smart Assistants & Machine Learning
Three of the big-name hearing aid brands have recently introduced a personal assistant aspect to their latest hearing aids. Oticon was first, introducing Kaizn, which it billed as "the world's first AI personal assistant for your ears." GN ReSound then announced at CES 2019 that a Siri app integration will soon be available for their Quattro and LiNX 3D hearing aids through an app update in February. Finally, Starkey announced at CES that it will be introducing its Thrive personal assistant through its Thrive app for its new line of Livio AI hearing aids.
I absolutely love the idea of my hearing aids being my smart assistant; I can see clearly how it could make my life better. With the explosion in the functionality and use of smart assistants like Siri and Alexa, it makes perfect sense to integrate these systems with hearing aids. Dave pointed out in his article that, according to Deloitte, the 55 to 75 age group is actually the fastest growing cohort of smart assistant users.
What's My Schedule?
Smart assistants offer a huge amount of functionality, and integrating them with your hearing aids makes real sense. The concept moving forward is that your hearing aids function as your personal assistant: telling you what your day looks like, notifying you of your emails and text messages and reading them to you. In turn, you will instruct them to switch to a particular programme, increase the amount of noise reduction or narrow the microphone focus.
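To make that interaction concrete, here is a minimal sketch in Python of how a spoken instruction, once recognised, might be mapped to a hearing aid adjustment. The command phrases, settings and scales below are entirely hypothetical and are not any brand's actual app or API.

```python
# Hypothetical illustration only: the settings, scales and phrases
# below are invented, not any manufacturer's real interface.

hearing_aid_state = {
    "programme": "general",
    "noise_reduction": 2,   # invented scale: 0 (off) to 4 (maximum)
    "mic_focus": "wide",    # "wide" or "narrow"
}

def handle_command(phrase, state):
    """Naive keyword matching standing in for real intent recognition."""
    phrase = phrase.lower()
    if "restaurant" in phrase:
        state["programme"] = "restaurant"
    elif "increase" in phrase and "noise" in phrase:
        state["noise_reduction"] = min(state["noise_reduction"] + 1, 4)
    elif "narrow" in phrase or "focus" in phrase:
        state["mic_focus"] = "narrow"
    return state

print(handle_command("Switch to my restaurant programme", hearing_aid_state))
print(handle_command("Increase the noise reduction", hearing_aid_state))
```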
The introduction of these types of strategies allows the hearing aid brands to use machine learning to improve the user experience. Widex was first to the fray with machine learning, introducing it in their Evoke Fusion 2. The idea is simple: the brands will leverage user input and preferences, alongside sound environment data, to make their hearing aids function better. Machine learning really is the future; it puts the idea of big data to its best effect.
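As a rough illustration of that idea, the Python sketch below combines logged user adjustments with simple sound-environment features to suggest a programme. Every feature, programme name and figure here is invented for illustration; no manufacturer's actual learning algorithm is being described.

```python
# Toy sketch only: the features, programmes and figures are invented.
from collections import defaultdict
import math

# Each time the wearer manually picks a programme, log the sound
# environment (overall level in dB SPL, estimated SNR in dB) with it.
logged_adjustments = [
    ((55.0, 20.0), "general"),     # quiet office, good SNR
    ((78.0, 3.0),  "restaurant"),  # loud babble, poor SNR
    ((80.0, 2.0),  "restaurant"),
    ((62.0, 15.0), "general"),
    ((70.0, 25.0), "music"),       # loud but clean signal
]

def centroids(samples):
    """Average the environment features seen for each programme."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (level, snr), programme in samples:
        s = sums[programme]
        s[0] += level
        s[1] += snr
        s[2] += 1
    return {p: (s[0] / s[2], s[1] / s[2]) for p, s in sums.items()}

def suggest_programme(level, snr, model):
    """Suggest the programme whose typical environment is closest."""
    return min(model, key=lambda p: math.dist((level, snr), model[p]))

model = centroids(logged_adjustments)
print(suggest_programme(76.0, 4.0, model))   # expected: "restaurant"
print(suggest_programme(58.0, 18.0, model))  # expected: "general"
```

The design choice here is deliberately simple: average the environments in which each programme was chosen, then suggest the nearest match. Real systems will be far more sophisticated, but the principle of learning from the wearer's own adjustments is the same.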
It's an exciting new world. Like us on Facebook by clicking the link below to keep up to date with our latest articles.