The King of Input
We get eased into interaction paradigms.
We're slowly presented with the same motion, the same gesture, the same feeling. The physio-mechanical interaction between your mind and the UI you're presented with is becoming more gestural, more reflexive.
A simple example is the adoption of the "swipe up to dismiss image" gesture popularized by Facebook's mobile app… it's finally being adopted by UI/UxD devs more widely. We've been trained, eased, into something that now takes on a "natural," intuitive feeling.
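Under the hood, a swipe-up-to-dismiss gesture usually boils down to two checks on the touch: did the finger travel far enough upward, and fast enough? Here's a minimal, framework-free sketch of that decision; the threshold values and the function name are illustrative assumptions, not taken from any real library.

```javascript
// Illustrative thresholds — real apps tune these per platform and screen density.
const SWIPE_DISTANCE = 80;   // minimum upward travel, in px
const SWIPE_VELOCITY = 0.3;  // minimum speed, in px/ms

// Decide whether a touch that started at startY (at startTime, ms) and
// ended at endY (at endTime, ms) counts as a "swipe up to dismiss".
function isSwipeUpDismiss(startY, startTime, endY, endTime) {
  const distance = startY - endY;               // positive means upward
  const elapsed = Math.max(endTime - startTime, 1); // avoid divide-by-zero
  const velocity = distance / elapsed;
  return distance >= SWIPE_DISTANCE && velocity >= SWIPE_VELOCITY;
}

// A quick 120px flick in 200ms dismisses; a slow 30px drag does not.
console.log(isSwipeUpDismiss(500, 0, 380, 200)); // true
console.log(isSwipeUpDismiss(500, 0, 470, 400)); // false
```

In a real app you'd feed this from `touchstart`/`touchend` events (or a gesture recognizer), but the point is how little logic it takes for a motion to start feeling "natural."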
We're pretty far away from the days of skeuomorphic design and the jumbled, overly complicated, process-driven interactions we suffered through in the past decade.
This reality has made the finger The King of Input, at the moment. You can watch users attempt 'gestures' on desktop apps that are reserved for mobile devices. It's kind of funny.
The idea of cross-platform development with branched interaction structures is fading away, replaced by apps with consistent analogs of the same traits: character, personality, and aesthetic.
What happens when you throw IoT into the mix?
Google recently announced its new IoT initiative. They're focused on geolocation, machine learning, and cloud services. This is a logical next step for IoT, but there's more coming, and it's not just iBeacon and IFTTT.
Whether you treat IoT in your application as a form of information (data), a form of sensing (input), or both (…or inverted), that choice will shape the course of your UxD vision. It can still be something users feel.
The world is encroaching on your app.
Brillo and Weave can help you get started integrating Google's vision, but you're on your own when it comes to the UX/UxD of your own. If you can't fully accept this elephant-in-the-room sea change, you may end up playing catch-up… because the next King of Input is the world itself.
Get ready. It's going to be fun!