Google has unveiled a new system for recognizing gestures that doesn’t require physical contact – technology with far-reaching implications, particularly for wearable tech. It’s Project Soli, a new project from Google ATAP that can recognize a wide array of user gestures through radar – and even George Bluth could use it, since there’s no touching required.
According to Google, the goal of Project Soli was to address the bulkiness of today’s wearable technology without sacrificing the precision of a larger touchscreen. The solution was to eliminate the touchscreen entirely – and Soli can recognize more gestures than a screen ever could anyway, including simple points and snaps.
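Google hasn’t published the details of Soli’s processing pipeline here, but the basic idea – turning radar readings into gesture labels – can be sketched with a toy example. The feature values, gesture signatures, and the `classify` function below are all hypothetical illustrations, not Soli’s actual method:

```python
# Illustrative only: a toy nearest-centroid classifier that maps a
# hypothetical radar feature vector (mean Doppler velocity, range spread)
# to a micro-gesture label. Soli's real processing is far more sophisticated.
import math

# Hypothetical reference signatures for two gestures Google mentions.
CENTROIDS = {
    "finger_snap": (0.8, 0.1),   # fast motion, tight range spread (assumed)
    "point":       (0.1, 0.4),   # slower motion, wider range spread (assumed)
}

def classify(features):
    """Return the gesture whose reference signature is nearest."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(CENTROIDS, key=lambda g: dist(features, CENTROIDS[g]))

print(classify((0.75, 0.15)))  # near the snap signature -> "finger_snap"
```

The point of the sketch is just that contact-free input reduces to measuring motion signatures and matching them against known gestures – which is why no screen is needed at all.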
But the Google Project Soli team can explain how the whole thing works a lot better than I can, and you can watch them show off their new toys in the video below. And for more from Google Advanced Technology and Projects (ATAP), check out its official YouTube channel right here.
Here’s the video: