Typing, clicking, speaking, gesturing, beeping, gazing, oh my! Blend modalities into natural and graceful experiences.
The rise of Natural User Interfaces (NUIs) such as speech, gaze, and gesture recognition, along with the skyrocketing adoption of connected devices like smart speakers and wearables, has ushered in the age of multi-modal interactions.
Multi-modal interactions allow us to create beautifully complex transitions between touchpoints, devices, and input modalities.
But tackling such interfaces can be scary and overwhelming. I would like to share with the IxDA community the lessons I’ve learned creating these experiences for medical professionals.
