Inventing the future: Alex Kipman reflects on HoloLens at LiveWorx

Meanwhile, back in Redmond, Microsoft opens access to HoloLens sensors.

PTC’s LiveWorx conference is all about the Internet of Things and AR. It has grown out of PTC’s acquisition of ThingWorx and the subsequent remaking of the company to put it in the middle of the digital lifecycle of products. In January 2018, PTC and Microsoft announced a strategic alliance around the HoloLens, which would encourage the development of AR apps for HoloLens using PTC’s tools and content created in Creo.

It’s a sensible deal, platform to platform, which makes life easier for developers, who get more standardized tools to work with and support. Such deals can make AR a little less like the Wild West it really is right now.

Alex Kipman, the “father of HoloLens,” is something of a patron saint at LiveWorx this year. He gave a keynote on day 2 of the conference. Kipman not only led the development of the HoloLens, but he also ushered the Kinect into our homes. Earlier this year, at Microsoft Build, Microsoft announced Kinect for Azure. The Kinect for gaming was discontinued in 2017, but the Kinect lives on in HoloLens as the device’s depth-sensing capabilities. At LiveWorx, Kipman said the Microsoft vision is for an intelligent edge connected to an intelligent cloud.

Kinect lives: the Kinect for Azure. Microsoft’s latest take on Kinect is a connected device with onboard compute, access to Azure AI, an advanced time-of-flight (ToF) sensor, RGB camera, 360-degree mic array, and accelerometer. Microsoft asks developers to use it and help build an ambient intelligence.

Kipman has worked at Microsoft since 2000. He has gone through some of the worst, or at least most uncertain, periods of the company, and he has seen dramatic transitions and tensions. He was there in the Ballmer years and survived the transition to Satya Nadella as CEO.

Kipman seems a creature of the Nadella era, for nothing says post-PC like the HoloLens. In a recent Microsoft research blog, Microsoft Director of Science Marc Pollefeys describes the HoloLens as “the world’s first self-contained holographic computer.” It is something like Computer 4.0, or at least that idea plays well at LiveWorx, a conference that explores the next era of engineering and design.

Kipman told the audience that the HoloLens lived a perilous existence throughout its early development, which began back in 2007. He said the period between 2010 and 2016, its incubation period, was the most dangerous. “HoloLens was always on the chopping block,” but Kipman now sees that process as a necessary part of product development. The HoloLens developers had to fight for their project every day, said Kipman.

Today, the HoloLens is assured life through at least a second version. While Kipman was speaking at LiveWorx, rumors about the due date for the HoloLens 2 leaked out of Microsoft HQ. It is said the lighter and less expensive version will be announced later this year and will arrive in 2019. It’s also being reported that the HoloLens 2 will have a Qualcomm Snapdragon XR1 processor. This processor has already been picked up by HTC for its Vive, by Pico (which debuted its Neo headset at E3), and by Vuzix.

More immediately, HoloLens developers have gotten a significant update from Microsoft Research. With the latest update of Windows 10, Microsoft has included Research Mode for the HoloLens and future products.

Designed purely for researchers to experiment and investigate the potential for computer vision and robotics, Research Mode opens up access to the HoloLens sensors, including:

  • The four environment tracking cameras used by the system for map building and head tracking.
  • Two versions of the depth camera data—one for high-frequency (30 FPS) near-depth sensing, commonly used in hand tracking, and the other for lower-frequency (1 FPS) far-depth sensing, currently used by Spatial Mapping.
  • Two versions of an IR-reflectivity stream, used by the HoloLens to compute depth, but valuable in its own right as these images are illuminated from the HoloLens and reasonably unaffected by ambient light.

In his blog post, Pollefeys says researchers can now use raw sensor data with their own algorithms, and that the streams can be processed on the device as they are now, streamed to another computer, or transferred wirelessly to Azure, that intelligent cloud Kipman was talking about.
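To make the streaming idea concrete, here is a minimal sketch of the receiving end of such a pipeline. This is not the Research Mode API itself; the wire format (a small header of width, height, and payload length followed by raw 16-bit depth values) is a hypothetical framing invented for illustration, standing in for whatever relay a researcher might write between the headset and a workstation.

```python
import struct

# Hypothetical wire format, for illustration only: a 12-byte header
# (uint32 width, uint32 height, uint32 payload length, little-endian)
# followed by raw 16-bit depth values in row-major order.
HEADER = struct.Struct("<III")

def parse_depth_frame(buf: bytes):
    """Parse one length-prefixed depth frame from a byte buffer.

    Returns (width, height, depths) where depths is a flat list of
    16-bit depth values, one per pixel.
    """
    width, height, nbytes = HEADER.unpack_from(buf, 0)
    payload = buf[HEADER.size:HEADER.size + nbytes]
    if len(payload) != nbytes:
        raise ValueError("truncated frame")
    depths = [d for (d,) in struct.iter_unpack("<H", payload)]
    return width, height, depths

# Example: build a tiny 2x1 frame and parse it back.
frame = HEADER.pack(2, 1, 4) + struct.pack("<HH", 100, 200)
w, h, depths = parse_depth_frame(frame)
```

A real implementation would read frames like this off a TCP socket and hand the depth buffer to whatever computer-vision algorithm the researcher is testing, on the workstation or in the cloud.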

Microsoft is opening up all the sensors in the HoloLens headset to researchers. The company does warn them to stay in the realm of research, because opening up the sensors also decreases the security of the device, and it will run hot. It sure is going to be fun to see what comes of it all. (Source: Microsoft)

Back in Boston, Kipman talked about the future for devices like HoloLens and Kinect and other devices creating an intelligent edge. He says AR and AI are inevitable and intertwined. He described real life as an “analog substrate” on which digital content like AR and AI is being built, to enrich our lives.

He sees the rollout of HoloLens as gradual, dependent on its value to the user, and that value does not exist right now for consumers. He says devices like the HoloLens will evolve along a price-to-value ratio, and they’ll be used more as they become more comfortable and battery life improves. For the users right now, he says, if a device enables someone to do something they have not been able to do before, then they’ll use it and become habituated to using it. Right now, those people are likely to be front-line workers. He also mentioned actual use cases, such as a surgeon projecting a patient’s MRI on top of the patient’s body in preparation for an operation, or an automotive designer replacing the clay model with an AR version of a possible design.

So, what’s next? Kipman says these are lonely experiences, and he predicts presence is the next big thing. The next step will be the ability to have shared experiences and to collaborate on projects. One of the aspects of HoloLens that’s already there, said Kipman, is that when you’re looking at something and you share that vision, perhaps annotating it, the other person also knows the context of your view. They can tell where you were when you were looking and what the angle was. That’s a new level of information and perspective that has not been available in looking at shared models.

Kipman had advice for engineers at LiveWorx. He said, don’t think so much about the near future; think 10 years out. “To innovate,” he said, “you want to push ahead.” If you tell a colleague about an idea and that colleague says, yeah, that’s a great idea, then, said Kipman, it might be a good idea, but it’s not a great idea. But if your colleague says you’re crazy, well then, you might have something. “My job as an engineer is to make the future happen.”

What do we think?

As we have said, Microsoft never gives up. The Kinect was a magnificent innovation that no one knew how to use for a while, but obviously, Microsoft never wholly gave up on the device, and now it’s going where it belongs: out on the edge.

The HoloLens will probably continue to struggle, as it has considerably more evolving to do, but Microsoft sees such projects as valuable because they push the company into new markets and force it to build out technology to meet them, whether in the cloud with Azure or in the gaming community with Xbox.

Doesn’t all this make you curious about what Microsoft’s geniuses in research are thinking about mobile? HoloLens is one example, but there are probably more in the skunkworks. Let’s ask: 10 years from now, what will you want in a phone?

One of the interesting things Kipman said about product development was that he thinks about product lines in terms of seasons. For mainframes, it’s Winter; PCs might be living in Fall; but AR and AI are Spring products. The industry is new and there are lots of opportunities. Kipman says, “you have to keep looking for Springs,” and, “Microsoft may have missed the mobile Spring.” Knowing Microsoft, we’re betting there are interesting products in the pipeline that are mobile, but they’ll be able to see, think with AI, and communicate with a higher intelligence: the Azure cloud.