A startup called HyperSurfaces wants to completely change how you interact with the physical world — and based on some recently released demo videos, it might just meet that lofty goal.
The London-based startup recently unveiled a new technology that can transform any object into a user interface. Essentially, this tech lets you communicate with a computing system using virtually anything you like as a conduit — a glass wall, a car door, even a metal clothes rack — and it has the potential to end our reliance on keyboards, buttons, and touch screens forever.
The HyperSurfaces system comprises two parts: sensors that detect the vibrations produced when a user touches an object, and a system-on-a-chip that uses AI to process that sensory information. Because all the processing takes place on the chip — and not, for instance, in the cloud — the feedback is nearly instantaneous.
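To make the two-stage design concrete, here is a minimal sketch of such a pipeline: a raw vibration waveform comes in from a sensor, lightweight features are extracted, and a local model labels the touch event. Everything here — the function names, thresholds, and the toy rule-based "model" standing in for HyperSurfaces' deep learning — is an illustrative assumption, not the company's actual API.

```python
# Hypothetical sketch of an on-device vibration-classification pipeline.
# A real system would run a trained neural network on the chip; a simple
# threshold rule stands in for it here.

def extract_features(samples):
    """Reduce a raw vibration waveform to simple summary features."""
    peak = max(abs(s) for s in samples)                  # sharpest spike
    energy = sum(s * s for s in samples) / len(samples)  # average signal energy
    return peak, energy

def classify(samples):
    """Toy stand-in for the on-chip model: map features to an event label."""
    peak, energy = extract_features(samples)
    if peak > 0.8:       # a sharp, high-amplitude spike reads as a knock
        return "knock"
    if energy > 0.05:    # sustained lower-amplitude vibration reads as a drag
        return "drag"
    return "idle"

print(classify([0.0, 0.9, 0.1, 0.0]))   # -> knock
print(classify([0.3, 0.4, 0.35, 0.3]))  # -> drag
```

Because the classification runs entirely on the local chip, no network round trip is needed — which is what makes the near-instant feedback the demos show possible.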
This wouldn’t have been possible just a few years ago, according to HyperSurfaces CEO Bruno Zamborlin.
“The HyperSurfaces algorithms belong to the current state of the art in deep learning research,” he told TechCrunch. “On top of this, the computational power of microchips literally exploded over the last years allowing for machine learning algorithms to run locally in real-time whilst achieving a bill of material of just a few dollars. These applications are possible now and were not possible 3 or 5 years ago.”
Scratch the Surface
In the demos, the HyperSurfaces system-on-a-chip connects to a laptop that declares the action taking place (“knock wall” or “bounce ball on wall,” for example). However, it’s not hard to imagine how the system could be calibrated to be much more useful. Want to turn on your home audio system? Just tap your wall. Think the music is too quiet? Drag a finger down it to decrease the volume.
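That kind of calibration amounts to mapping event labels to actions. The sketch below shows one way an application layered on top of the chip might do it; the specific labels ("knock wall", "drag wall down") and the home-audio actions are hypothetical examples, not anything HyperSurfaces has published.

```python
# Hypothetical mapping from classified touch events to smart-home actions.

def handle_event(label, state):
    """Update the home-audio state in response to a recognized gesture."""
    if label == "knock wall":
        state["audio_on"] = not state["audio_on"]      # tap toggles the system
    elif label == "drag wall down":
        state["volume"] = max(0, state["volume"] - 1)  # drag down lowers volume
    elif label == "drag wall up":
        state["volume"] = min(10, state["volume"] + 1)
    return state

state = {"audio_on": False, "volume": 5}
handle_event("knock wall", state)      # audio toggled on
handle_event("drag wall down", state)  # volume drops from 5 to 4
```

The appeal of the design is that this mapping lives in software: the same wall could control lights tomorrow without touching the hardware.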