Turning ‘natural interface’ input into a new data standard

Khronos Group announces a new open standard working group for advanced device and sensor input. Two game input developers are already on board.

Sensors are all around us, gathering a wealth of information. Unfortunately, much of the data these sensors collect is not being used to its full potential. The dots have not yet been connected.

The Khronos Group says it has received requests and proposals from multiple members and, as a result, has formed the StreamInput working group. The group will examine the ways in which information from devices like depth cameras, motion-tracking sensors, touch screens, and haptic devices can be accessed and put to work in applications.

TransGaming has volunteered to lead the working group; CEO Gavriel State is looking for more input. He says the nascent group already has a variety of supporters, adding, “we hope that many other companies will join us to help build a lasting standard that can be broadly adopted across multiple devices and market segments.”

Sensor data doesn't have to be location-bound or device dependent. Data Logger for iPhone from Pachube enables users to store and graph a wide variety of sensor data types, along with a time stamp and geolocation. The Khronos StreamInput standard would be able to work with data from this and many other device types. (Image courtesy Pachube).

Microsoft’s Kinect and the similar devices being developed to capture 3D data are an obvious catalyst for the group. Eric Krzeslo, chief strategy officer at Softkinetic, says considerable opportunity is coalescing around natural and intuitive human-machine interfaces, in particular those using 3D depth sensing and gesture recognition. His company is interested in standardized APIs for natural interface tools as well as traditional interfaces, to make it easier for developers to create reusable and consistent applications. Quite a few devices are being built that take advantage of low-cost camera sensors for 3D depth sensing and gesture recognition.

Khronos president Neil Trevett describes the variety of devices that are already out there and the new products being developed. The complexity is daunting; Trevett says the industry needs “a robust input API,” which will help drive market adoption of these new sensors.

The new working group will develop a general-purpose framework for consistently handling new-generation sensors as well as traditional input devices. The group plans an API that will enable system-wide sensor synchronization for advanced multi-sensor applications, including augmented reality. Khronos says its strategy of enabling extensions for new uses, situations, and API variations will allow the group to respond quickly as new input devices are introduced.
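The StreamInput API itself has not been published, so no real code exists yet. Still, the idea of a unified, timestamp-synchronized event stream can be illustrated with a small sketch. Everything below is invented for this article: the type names, the queue, and the polling function are assumptions about what such a framework might look like, not the working group's actual design.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical sketch only: these names do NOT come from the
 * StreamInput specification, which has not been released. */

typedef enum { SENSOR_TOUCH, SENSOR_DEPTH_CAMERA, SENSOR_MOTION } SensorType;

typedef struct {
    SensorType type;
    double     timestamp;  /* shared clock: the basis for cross-sensor sync */
    float      data[3];    /* e.g. x/y/pressure for touch, x/y/z for depth */
} SensorEvent;

/* One queue for very different devices, so an application consumes
 * touch, depth-camera, and motion events through a single code path. */
#define QUEUE_CAP 16
typedef struct {
    SensorEvent events[QUEUE_CAP];
    int         count;
} EventQueue;

static int queue_push(EventQueue *q, SensorEvent ev) {
    if (q->count >= QUEUE_CAP) return 0;
    q->events[q->count++] = ev;
    return 1;
}

/* Return the next event whose timestamp is at or before `now`, so a
 * multi-sensor application can process all streams in lockstep. */
static int queue_poll(EventQueue *q, double now, SensorEvent *out) {
    for (int i = 0; i < q->count; i++) {
        if (q->events[i].timestamp <= now) {
            *out = q->events[i];
            memmove(&q->events[i], &q->events[i + 1],
                    (size_t)(q->count - i - 1) * sizeof(SensorEvent));
            q->count--;
            return 1;
        }
    }
    return 0;
}
```

The timestamp-gated `queue_poll` is the key idea: an augmented-reality application can ask for "everything that happened up to frame time t" and receive touch, depth, and motion events in a consistent order, regardless of which physical device produced them.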

The group says it plans a first public release within 12 months.

What do we think?

It is interesting that just last month we described how sensors and the data they collect are becoming connected. In that case the applications were primarily being explored by the architectural community to incorporate information about the environment into design and maintenance strategies for buildings. (See “Pachube crowdsourcing real-world data for fun and progress.”) We also talked about scientist/performance artist Usman Haque and his Pachube project, a site that enables enthusiasts to plug in their sensor data and make it available for others to incorporate in applications. And now this.

There is a tipping point out there somewhere and it doesn’t seem too far away. The Internet of Things is practically building itself. Khronos’ first role will be to help developers take advantage of sensors for mobile and console devices, but the day is not far off when the applications for sensors broaden further into our everyday lives and capabilities. Gesture control and facial recognition will come into play every time we enter a room, and our mobile devices will routinely use augmented reality to help us access critical information about the things and people around us. Won’t you be glad when you can surreptitiously point your camera phone at someone and it will tell you who the person is and why they look so damned familiar? – K.M.