Interaction between humans and machines is getting more and more exciting. We see magic happen every day as we swipe and pinch our smartphones. We see concepts influenced by Steven Spielberg’s “Minority Report”. We talk to Siri & Co. But things just got even better: at Google I/O in San Francisco, Project Soli was announced: a miniaturized radar system that tracks gestures to bring interaction with devices to a new level. Watch the video!
The idea behind it: gestures with our hands are so meaningful and precise, but we can’t leverage them yet. Optical systems are either too big or not effective enough. Who wants a bulky 3D camera integrated into their smartwatch or other wearable device? And why limit interaction to the space of a tiny screen on your wrist? This is why Project Soli is looking into miniaturized radar technology.
Now imagine this technology in your clothing, your car, your TV. It will need an initial activation so that not every gesture is mistakenly interpreted, but beyond that everything seems possible. There are so many interesting opportunities to design this interaction. Experiences with products and services will change – in a positive way.
I am very excited to see this in a broader range of products in a few years.