I was contacted recently by a rather nice bloke from The Verge called Jacob Kastrenakes, who wanted me to talk him through the Starkey Livio AI and to say whether I felt it was a true game-changer. I do. He was contributing to The Verge's Gadgets of the Decade article, and I was quoted as saying "The Livio kind of represents a merging of hearing aids and #hearable tech and how that will probably look in the future". Since that conversation, and because of a couple of other conversations over the last few weeks, my thinking around the direction of hearing aid features has expanded. I also had a very clear insight: the features that are attractive to younger consumers and the features that are attractive to older, traditional hearing aid wearers are converging, not diverging. Let's talk about augmented reality and hearing aid features.
Those Features
The feature set of the Starkey Livio AI is fascinating, and I have touched on it before here on Know. Some of the features mentioned below are desired features that Starkey says it wants to introduce rather than existing ones, but it is what they all represent that so excites me, and I believe they give us the future roadmap. I think we can look at it in this way:
Easier Communication
- Translation in 27 languages
- Voice-to-text Transcription
- Augmented Audio
Augmented Reality
- Amazon® Alexa connectivity
- Thrive Virtual Assistant, built on Google Assistant
- An Expanded Personal Assistant
- Augmented Audio
Health Monitoring
- Fall Detection and Alerts
- Heart Rate Measurement
- Emotion Tracking
- Sociability Monitoring
- Temperature Sensor
- Sleep Assistance
Ease of Use
- Natural user interface
- Adaptive Fine-Tuning & Cloud Connectivity
- Voice-first integration
- Self Check for hearing performance
The Features of The Future
If we look at the overall feature set, both existing and desired, it is easier to understand its attractiveness to both health-conscious consumers and to the more traditional hearing aid wearer. At the core of that attractiveness is the desire to understand and monitor our health, to communicate more easily and to have our everyday experiences augmented in a useful way.
Yes, our core business is amplification to correct a hearing loss, but what we are really delivering is augmented audio, and we are at the forefront of innovation in that space. Augmented audio is attractive to people without a hearing loss as well; its appeal has driven the explosion of hearables.
I have heard some people say that these features or abilities aren't core. I beg to differ; I think they are as core to the design of modern hearing aids as Bluetooth radios are. We help people with hearing loss live better and fuller lives through the augmentation of their ability to hear and communicate. Many of these features will make their lives better or easier, help them hear better, or augment their ability to communicate.
While we manipulate audio augmentation to deliver better hearing, why should we not manipulate it to deliver better living?
Voice First & Hearing Aids
Voice-first is already exploding as the technology becomes better and more relevant to more users. Just a short time ago here on Know, I spoke about voice-first and what it might mean for hearing aids. The Echo Buds recently introduced by Amazon represent a massive step forward for voice-first technology. With Alexa integration and Alexa's expanding abilities, they represent the beginning of a sea change in how we will interact with our computing devices and indeed our world.
Voice-first represents complete ease of use, and as its abilities and integration increase, it will also represent augmentation of our reality. It is much more than access to the music you want to hear or the web searches you want to undertake. It can also mean access to information, directions to a place, the answer to a burning question, or a rundown of what your day looks like.
It doesn't stop there; in fact, it only stops at the limit of the technology companies' imagination. That is where hearing aids come in: they are an ideal access device for voice-first. You wear them every day, and most modern hearing aids will connect to smartphones. It is that combination that drives so much opportunity.
Hearing Aids Are Ideal For Voice First
Hearing aids are an ideal vehicle for the future of computing, especially when we factor in voice-first as a leading technology. It's simple: you wear them all the time and, more importantly, they are listening all the time. Voice-first is an attractive technology that makes life easier, and it can be leveraged to do so for a young thirty-something office worker just as well as for a retired seventy-year-old.
It is also the core technology behind the smarter personal assistants coming to our phones, something I have spoken about probably too much (anyone remember Bob?). A smart personal assistant whispering sweet nothings in our ear can be as relevant to a forty-year-old corporate worker as it is to an eighty-year-old retiree who needs to remember to take their meds.
Voice-first can also help with the day-to-day management of the hearing augmentation from situation to situation. No need to pull out your smartphone and go to the app; simply say "turn down the background noise" or "make speech a little clearer". Tied to a machine learning system, which I will talk about a bit later, it would deliver huge power with simplicity of use.
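To make that concrete, here is a minimal sketch in Python of how a voice-first layer might map a spoken request onto hearing aid settings. The keyword matching, the parameter names and the HearingAidSettings class are all hypothetical illustrations for this article, not any manufacturer's actual API.

```python
from dataclasses import dataclass

@dataclass
class HearingAidSettings:
    """Hypothetical settings a voice command might adjust."""
    noise_reduction: int = 5   # 0 (off) to 10 (maximum)
    speech_clarity: int = 5    # 0 to 10

def apply_voice_command(text: str, settings: HearingAidSettings) -> HearingAidSettings:
    """Map a recognised phrase onto a settings change.

    A real assistant would use proper intent recognition; simple
    keyword matching is enough to illustrate the idea.
    """
    phrase = text.lower()
    if "background noise" in phrase and "down" in phrase:
        settings.noise_reduction = min(10, settings.noise_reduction + 2)
    elif "speech" in phrase and "clearer" in phrase:
        settings.speech_clarity = min(10, settings.speech_clarity + 2)
    return settings

# Example: the user speaks, the assistant adjusts the aids, no app required.
settings = apply_voice_command("turn down the background noise", HearingAidSettings())
print(settings)  # noise_reduction has been nudged up
```

The point isn't the keyword matching; it is that the hearing aid microphones and the smartphone link already give a voice-first layer everything it needs.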
The language translation system in the Starkey Livio AI represents a very intelligent use of the voice-first concept leveraging the microphones of the hearing aids. As does their virtual assistant. It simply makes sense.
Health Monitoring
The Livio AI was the first hearing aid to offer relatively decent health monitoring. As well as tracking activity and now heart rate, it also tries to monitor socialisation and how much time is spent engaging with people.
Health monitoring functions are hugely attractive to younger consumers, just as they are to more traditional hearing aid wearers. An understanding of activity levels and general health is as important, if not more so, for older people.
The fall detection system that Starkey offers is an outstanding development; I honestly think it is a stroke of genius. I have said for many years that hearing aids could be the ideal platform for monitoring the activity and health of older people, ensuring that they can lead independent lives for longer. This system represents a big part of that concept.
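Starkey hasn't published its algorithm, so treat the following as a generic sketch of the textbook approach rather than how the Livio AI actually does it: look for a hard impact in the accelerometer signal followed by a period of stillness. The thresholds here are purely illustrative.

```python
import math

IMPACT_G = 2.5        # acceleration spike suggesting an impact (in g), illustrative only
STILLNESS_G = 0.2     # deviation from 1 g that still counts as "not moving"
STILL_SAMPLES = 50    # samples of stillness required after the impact

def detect_fall(samples: list[tuple[float, float, float]]) -> bool:
    """Flag a possible fall: a hard impact followed by sustained stillness.

    `samples` is a stream of (x, y, z) accelerometer readings in g.
    Real systems combine this with gyroscope data and learned models;
    this is only the basic threshold idea.
    """
    impact_index = None
    for i, (x, y, z) in enumerate(samples):
        magnitude = math.sqrt(x * x + y * y + z * z)
        if impact_index is None:
            if magnitude > IMPACT_G:
                impact_index = i          # hard jolt detected
        elif i - impact_index >= STILL_SAMPLES:
            return True                   # jolt followed by sustained stillness
        elif abs(magnitude - 1.0) > STILLNESS_G:
            impact_index = None           # wearer moved again, probably not a fall
    return False
```

A real product layers a lot more on top of this, confirmation prompts, alerts to a chosen contact and so on, but the basic sensing idea really is that simple, and the ears turn out to be a good place to do it.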
Machine Learning
Machine learning is a branch of artificial intelligence in which large amounts of data are processed so that a system can learn from them. Machine learning and other artificial intelligence strategies have been used in healthcare for several years, but only recently in hearing aids.
Widex has pointed the way forward here. Their system learns the sound preferences of hearing aid users in different listening situations in real-time, which allows it to make better judgements about what the hearing aids should do in complex sound environments.
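Widex's actual method is more sophisticated than anything I could show here, but as a rough illustration of the general idea, here is a toy sketch that learns a preferred adjustment per sound environment from the changes a user keeps making. The environment labels and the single learned parameter are my own simplifications.

```python
from collections import defaultdict

class PreferenceLearner:
    """Toy sketch: learn a preferred volume offset per sound environment.

    Environments ("quiet", "restaurant", "traffic"...) and the single
    learned parameter are illustrative; a real system would classify the
    sound scene itself and tune many parameters at once.
    """

    def __init__(self, learning_rate: float = 0.2):
        self.learning_rate = learning_rate
        self.preferred_offset = defaultdict(float)  # environment -> dB offset

    def record_adjustment(self, environment: str, user_offset_db: float) -> None:
        """Nudge the stored preference toward what the user just chose."""
        current = self.preferred_offset[environment]
        self.preferred_offset[environment] = (
            current + self.learning_rate * (user_offset_db - current)
        )

    def suggest(self, environment: str) -> float:
        """What the aids should apply next time this environment is detected."""
        return self.preferred_offset[environment]

learner = PreferenceLearner()
for _ in range(5):
    learner.record_adjustment("restaurant", -4.0)   # user keeps turning things down
print(round(learner.suggest("restaurant"), 1))      # drifts toward -4.0 dB
```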
Designing Simple
Widex took great pains to make the user-facing part of the machine learning function simple, and anyone else who follows this concept needs to do the same. This is really important, because if it is difficult to use, users just won't use it. Widex offers a very simple A/B choice: does profile A sound better in this situation, or profile B? The user picks, and the data is stored and sent to the cloud via the smartphone app.
If this system were paired with voice-first technology, it would be an even simpler exercise. With a few words, a user could quickly cycle through a few changes in real-time and pick the best setting for that moment. That would be a huge step forward in ease of use.
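For the A/B idea specifically, the flow might look something like the sketch below, where a spoken answer replaces a tap in the app. The send_to_cloud function, the profile handling and the question wording are placeholders of my own, not Widex's actual interface.

```python
import json
from datetime import datetime, timezone

def run_ab_comparison(environment: str, listen_with_profile, ask_user) -> dict:
    """Play profile A, then profile B, ask which sounded better, log the answer.

    `listen_with_profile` would switch the aids to a profile for a few
    seconds; `ask_user` would come from a voice assistant ("A or B?").
    Both are stand-ins here.
    """
    for profile in ("A", "B"):
        listen_with_profile(profile)

    choice = ask_user("Did A or B sound better?")  # the user simply says "B"

    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "environment": environment,
        "preferred_profile": choice.strip().upper(),
    }
    send_to_cloud(record)   # placeholder: upload via the smartphone app
    return record

def send_to_cloud(record: dict) -> None:
    """Stand-in for the app's upload step; just show the payload."""
    print(json.dumps(record))

# Example with trivial stand-ins for the hardware and the voice assistant:
run_ab_comparison("busy cafe", listen_with_profile=lambda p: None, ask_user=lambda q: "B")
```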
Over time, the evolution of machine learning will make a huge difference for hearing aid users everywhere. The always-on, continuous data exchange between the hearing aids and the cloud will lead to higher functionality in hearing aids than ever before. The hearing technology of the future will continuously improve as more people use it in more situations. That will deliver a better augmented audio reality for anyone who wants it.
Smartphone Apps
The core interface for all of this technology will be the smartphone app. Many of the hearing aid brands now offer in-depth control of their hearing aids through a smartphone interface. I think that ReSound is still probably the king of the apps right now, but only because Widex doesn't offer quite as much in-depth control and its machine learning system is confined to very few of its hearing aids.
For the foreseeable future, the smartphone app is where all the magic will truly happen; it will be the interface for all of these amazing possibilities. The app and its connectivity are what will drive innovation moving forward. As that happens, though, the paradox is that we will use the app less and less. Voice-first integration will make sure of that.
I think the future of hearing aids will be about augmented reality through audio: an augmented reality that makes life better and easier for many people, and that may well include people who don't have hearing loss. The concept can be all-encompassing; it can include everything from your health data, through what your day looks like, to the details of the beautiful piece of art you are standing in front of.
It is important to remember: a hearing aid isn't properly fitted unless Real Ear Measurement is done.