Wearables have been part of our lives for years, but we are now at a convergence point where hardware and software, in combination with artificial intelligence (AI), can help improve lives. In this blog, we’ll take a closer look at how developers can use AI to enhance the functionality of wearables through new form factors and experiences, and at how you can take advantage of this growing space.
The Growth of AI and Wearables
The Wearable Artificial Intelligence opportunity is estimated to reach US$180 billion (€160 billion) by 2025. Some of the trends in AI and wearable technology driving this growth are:
- Continuous improvements in design, functionality, and types of wearable devices available.
- Emergence of edge computing on powerful processors like the Qualcomm® Snapdragon™ Wear 3100 Platform, with its quad-core Arm Cortex-A7 main processor, QCC1110 co-processor, and Qualcomm® Hexagon™ QDSP6 v56 to accelerate AI tasks.
- Enhanced AI algorithms and advancements in wireless connectivity such as 5G, which are increasing functionality and providing a more seamless user experience.
More Form Factors Than Smartwatches
When most of us think of wearables, smartwatches are usually the first devices that come to mind. However, as you plan your AI development for wearables, consider other form factors that may be better suited to your application and better able to improve your users’ overall quality of life.
How AI Is Making Wearables Even Smarter
While wearable hardware continues to evolve, it is new algorithmic models that identify the high-value data serving as the fuel for useful AI. AI technologies such as machine learning, computer vision, gesture recognition, and even emotion recognition feed these sophisticated models, creating personalized and convenient user experiences and expanding the opportunities for these new devices. To give you an idea of just how innovative and beneficial these new AI wearables are becoming, here are a few examples to get you thinking about what you could create:
Health and Safety
- Computer vision and natural language processing AIs can “learn to walk like a human”. By learning the visual cues that sighted people recognize in cities, such as buildings, paths, street furniture, sidewalks, curbs, and corners, a wearable can give natural voice cues that help visually impaired wearers get from one place to another safely.
- Advanced machine learning with real-time monitoring can learn the unusual physiological patterns that precede a seizure. Worn by a person with epilepsy, the wearable could alert them when such a pattern is detected while driving, giving them time to pull off the road safely.
- Machine learning can create meaningful data by monitoring physiological signals and markers of emotional arousal and stressors of children with autism spectrum disorder. This can identify the potential precursors to challenging behavior so caregivers can offer preventive care.
- Computer vision can provide real-time, actionable audio and visual feedback on performance metrics, acting as a personal coach that helps improve performance while decreasing the chance of injury. Machine learning algorithms may also be used to cheer you on and keep you motivated if you fall outside your normal parameters.
- Deep learning, natural language processing algorithms, and a set of hearables (wearables for the ears) can help travelers, or anyone needing to converse in different languages, translate incoming speech so they can have a direct conversation without the need for a third party (i.e., a human translator).
- Machine learning can analyze conversations to support people with conditions like social anxiety or Asperger syndrome in understanding and detecting social cues in everyday conversations. Computer vision with facial detection can also provide emotion recognition analysis. Together they can act like a social coach, helping people understand the overall mood of a conversation so they can respond appropriately.
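Several of the examples above depend on the same core idea: continuously monitoring a signal on-device and flagging readings that deviate from the wearer's personal baseline. As a minimal sketch (using a simple rolling z-score with hypothetical thresholds, not a clinical algorithm), such a detector might look like this:

```python
import math
from collections import deque

class RollingAnomalyDetector:
    """Flags readings that deviate sharply from the wearer's recent baseline."""

    def __init__(self, window=100, z_threshold=3.0):
        self.values = deque(maxlen=window)  # rolling baseline of recent readings
        self.z_threshold = z_threshold      # hypothetical sensitivity setting

    def update(self, x):
        """Ingest one reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.values) >= 10:  # wait for a minimal baseline first
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var) or 1e-9  # avoid division by zero
            anomalous = abs(x - mean) / std > self.z_threshold
        self.values.append(x)
        return anomalous
```

A production system would replace the z-score with a trained model personalized to the user, but the streaming, baseline-relative structure is the same.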
As you work with your AI and wearable development team, here are some development challenges specific to battery life, privacy, and security to keep in mind:
Battery – There are times when battery life is critical. Having a personal sports-coach wearable run out of charge is one thing; depending on a device to alert you to a possible oncoming seizure while driving is another level of required reliability. So choose your battery based on those requirements, and account for recharge time in your design. To further extend battery life, process data only when you need to (consider edge processing) and optimize your code to minimize power usage.
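One common way to "process data only when you need to" is to gate the expensive inference step behind a cheap per-sample check, so the model (and the radio) stays idle most of the time. A minimal sketch, where the threshold, window size, and `run_model` stand-in are all hypothetical:

```python
from collections import deque

WAKE_THRESHOLD = 1.2  # hypothetical motion magnitude (g) that wakes the pipeline
WINDOW = 50           # samples to buffer before running the expensive model

def run_model(samples):
    """Stand-in for an on-device inference call; only invoked when needed."""
    return sum(samples) / len(samples) > WAKE_THRESHOLD

buffer = deque(maxlen=WINDOW)

def on_sensor_sample(magnitude):
    """Cheap per-sample check; the costly model runs only on active windows."""
    buffer.append(magnitude)
    if magnitude < WAKE_THRESHOLD:
        return None  # stay on the low-power path: no inference, no radio
    if len(buffer) == WINDOW:
        return run_model(list(buffer))
    return None
```

On real hardware this gating would typically live on a low-power co-processor (such as the QCC1110 mentioned above), waking the main processor only for the inference step.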
Privacy – Using AI to make wearables useful generally involves a lot of data. With privacy concerns a top priority, developing with a privacy-first mindset is important. During development, ask “do I need to collect this?” and if you don’t need it, don’t collect it. For a wearable collecting conversation data for translation, for example, you can compute the derived data on the fly instead of storing the raw audio. And where possible, let users turn privacy controls on or off themselves.
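Both ideas, deriving data on the fly and user-controlled collection, can be combined in one pattern: compute only the result you need in memory, and persist anything beyond that only behind an opt-in toggle. A sketch under those assumptions (all names here are hypothetical, not a real translation API):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """User-facing toggles; collection is opt-in and off by default."""
    store_transcripts: bool = False

def handle_utterance(audio, settings, translate_fn, transcript_log=None):
    """Compute the derived result on the fly; persist only if the user allows."""
    translation = translate_fn(audio)  # derived data, computed in memory
    if settings.store_transcripts and transcript_log is not None:
        transcript_log.append(translation)  # store text only, never raw audio
    return translation  # raw audio goes out of scope and is never persisted
```

The key design choice is that the default path stores nothing: the raw audio exists only for the lifetime of the function call, and even the derived text is kept only when the user has explicitly opted in.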
Security – For security-sensitive AIs, such as a wearable camera used for facial recognition, you can opt for on-device data processing. Edge computing, besides adding real-time speed, means data should not need to be transmitted through private or public cloud services, and avoiding transmission altogether offers the strongest privacy. Of course, if you must transmit data, look at ways to encrypt it in transit.
We have only scratched the surface of AI’s potential in wearable technology. As mobile processors get smaller, more powerful, and more efficient, AI technologies will support multiple functions, with vision, hearing, emotion, touch, and even cognition rolled into a single application. Focusing on wearables that use AI to solve real problems can propel them from occasional nice-to-haves to ubiquitous must-haves. How are you going to change someone’s life today with your AI development?