Data is needed everywhere, and virtualization is the new transport. A report from the recent Virtualize Conference.
By Kathleen Maher
Jon Peddie Research’s second annual Virtualize Conference featured an ambitious agenda designed to reach a wider audience interested in the potential of virtualization technology. We probably went a little far, cramming in discussions of virtualized GPUs, cloud gaming, design, construction, geospatial applications, virtual reality (VR), and virtualized GPUs for automotive. On the other hand, it turned out to be a really informative and frank discussion.
Virtualization became a practical reality when Intel and AMD announced hardware support for virtualization in their CPUs in 2006. The technology continued to improve, and software providers such as VMware and Citrix made it accessible. Microsoft stepped up with Azure cloud services, Amazon with AWS, and Google with the Google Cloud Platform. Nvidia gets credit for kicking off the next wave of virtualization, which enables visualization via virtualized GPUs. Intel followed suit, and in 2015 AMD announced its own implementation.
The stage is set and the potential is huge, but virtualization doesn’t really make headlines; it’s still just too geeky. Make no mistake, though: change is coming. Imagine a world where we are unburdened by our devices, especially any kind of personal computer. Instead, compute power, your applications, your work, and your content are all out there, waiting to flow through the most appropriate portal. Work anywhere, anytime, on any machine is already a reality, but not yet for everyone. In the meantime, the Virtualize Conference served as a check-up on where we are now.
The conference featured speakers who have a need to share visual information, collaborate, and extend their capabilities. Interestingly, it is not always the sectors on the bleeding edge that make the most use of advanced technologies. Construction and geographic information systems (GIS) are two such industries, and we learned much about how professionals in these areas are already using virtualization to their advantage.
McCarthy Builders has been transitioning to BIM workflows, which in turn has stimulated the company’s commitment to centralized, cloud-based document management. McCarthy is a US-based design and construction firm doing $3.3 billion in business annually, ranking among the top 25 US contractors. The company is thriving as the construction industry comes roaring back to life after more than a decade of slow growth and recession. Healthcare is among the largest segments McCarthy serves, but the company’s customer list also spans education, federal, water/wastewater, science and technology, solar, industrial, and civil projects. The company also does a good business in parking structures.
Early in his career at McCarthy, David Burns worked with the company’s Oracle-based ERP system; he is now part of the effort to add capabilities to the company’s cloud-based workflow. The company is rummaging around in the whole toolbox: it is working with VMware for hardware virtualization and taking advantage of Citrix’s XenApp for application virtualization. Burns noted that the virtualized applications are distinguished by their nature: “usually those applications are very chatty,” he said, and they include scheduling, cost estimating, and model-based estimation software. The company is also working with Microsoft to provide virtualization via Azure to its project partners and contractors.
The company is still in the building phase, beginning with document-based workflows and working its way up to more demanding visualization applications such as Revit. The real evolution is happening in the field. Design has been an electronic process at McCarthy for some time, but Burns says work in the field is much different. It’s paper based, and more than that, he said, there’s a lot of arts and crafts involved: plans are cut, pasted, and taped together to communicate revisions. In this, McCarthy reflects the transition going on across the AEC field. Burns noted that the move to electronic plans and shared 3D models began in 2010 and 2011 as BIM became more prevalent in the industry. Now the industry is also adding reality capture to communicate the truth on the ground. Currently, the sheer size of the data makes it difficult to share reality-capture results interactively, but the company looks forward to adopting higher-bandwidth technologies, including Google Fiber and 5G networks.
The digitization of AEC field work has picked up speed over the last five years, Burns says, as mobile connectivity has become ubiquitous, putting digital content in the hands of everyone with a phone. The phone helped make the argument; now tablets are also being used, and on-site sheds and trailers are becoming high-tech communication centers. The result is that AEC is finally transforming itself.
Geospatial applications vendor Hexagon Geospatial is a major company in the industry, formed by the acquisition of leading companies in design, mapping, metrology, and process and power. The company has interests in oil and gas, mining, agriculture, mapping, and utilities. Hexagon CTO Brad Skelton recognized that many of the company’s customers, who work in remote locations but require access to information and analysis, could benefit from cloud-based applications. However, much of the data Hexagon’s customers manage is also very sensitive.
Skelton’s team built an online version of the company’s Producer product, which features remote sensing, GIS, and ERDAS Imagine tools for analysis of imaging information. Presenter Patrick Bergen from Hexagon says Hexagon’s end users can be up and running in 30 seconds. He called it native software as a service, emphasizing that the software made available via Producer Online is the same as the company’s desktop software.
Hexagon worked with Numecent to create Producer Online. Numecent’s technology includes an authoring tool that pre-virtualizes software, which is key to the security side of cloud-paging. Numecent says it does not transmit pixels from the cloud or execute the actual application on a server. Instead, cloud-paging transmits the pre-virtualized software instructions, which are executed on the user’s machine in a transient manner. Once it’s gone, it’s gone. This implementation, Bergen noted, satisfies the security requirements typical of Hexagon’s customers.