Intel Discovers How to Turn Any Surface into a Touchscreen
Humankind being naturally inventive, many people have, over the years, demonstrated ideas with the potential to advance one segment of the IT industry or another. In the touchscreen segment in particular, the emphasis is on recognizing multiple inputs and gestures. Some solutions, however, are far less common than one might think, and Intel's latest idea seems to be one of them.
Intel has developed an algorithm that gives any surface the usability of a touch-input panel. A camera is placed above the workspace and, through a method Intel has not yet detailed, creates so-called “virtual islands” that keep track of whatever is on the desk. On top of that, users can simply tap the counter or table to bring up menus, make shopping lists or look up recipes.
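Intel has not explained how these “virtual islands” are built, but a rough idea of how such regions could be derived from an overhead depth camera is sketched below. This is only an illustrative Python example under assumed inputs (a depth frame in millimetres and a calibrated depth for the empty table plane), not Intel's actual algorithm.

```python
import numpy as np
from scipy import ndimage

def find_virtual_islands(depth_frame, table_depth_mm,
                         min_height_mm=15, min_area_px=200):
    """Label regions ("virtual islands") of objects resting on a flat surface.

    depth_frame    -- 2D array of depth readings (mm) from an overhead camera
    table_depth_mm -- calibrated depth of the empty table plane
    Returns the label image and the ids of islands large enough to matter.
    """
    # Anything sufficiently closer to the camera than the table is an object.
    height_above_table = table_depth_mm - depth_frame
    object_mask = height_above_table > min_height_mm

    # Group touching pixels into connected components: one component per object.
    labels, count = ndimage.label(object_mask)

    # Keep only components big enough to be real objects, not sensor noise.
    islands = [region for region in range(1, count + 1)
               if np.count_nonzero(labels == region) >= min_area_px]
    return labels, islands
```

In this sketch, each surviving connected component corresponds to one item on the desk, which the system could then track from frame to frame.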
In short, real-time 3D object recognition software builds a model of almost anything placed on the counter, even dirty hands, and overlays touchscreen-style menus that respond to the user's movements. This is, practically, the very definition of putting technology to use in everyday life.
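As for the “tap the counter to bring up a menu” behaviour, one plausible way to implement it is to watch how close a tracked fingertip gets to the surface over successive frames. The sketch below is a hypothetical illustration of that idea; the fingertip-tracking step itself is assumed to come from the recognition software.

```python
class TapDetector:
    """Turn a stream of fingertip heights (mm above the surface) into tap events.

    A tap is registered when the fingertip stays within `contact_mm` of the
    surface for at least `hold_frames` consecutive frames.
    """

    def __init__(self, contact_mm=5, hold_frames=3):
        self.contact_mm = contact_mm
        self.hold_frames = hold_frames
        self._contact_run = 0
        self._fired = False

    def update(self, fingertip_height_mm):
        """Feed one frame's fingertip height; return True when a tap is detected."""
        if fingertip_height_mm <= self.contact_mm:
            self._contact_run += 1
            if self._contact_run >= self.hold_frames and not self._fired:
                self._fired = True       # fire only once per touch
                return True
        else:
            self._contact_run = 0
            self._fired = False          # lifting the finger re-arms the detector
        return False


# Example: fingertip heights from successive frames; the tap fires once,
# on the third consecutive contact frame.
detector = TapDetector()
for height in [40, 22, 9, 4, 3, 2, 30]:
    if detector.update(height):
        print("tap detected -> open menu")
```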