A milestone for the SPECviewperf benchmark

For 30 years, SPEC’s SPECviewperf benchmark has been the go-to tool for comparing graphics performance across different systems—and it just got a major upgrade. The new SPECviewperf 15 represents eight modern applications, including Blender, Unreal Engine, and SolidWorks, and measures performance across the OpenGL, DirectX, and Vulkan APIs.

What’s really exciting? It now captures cutting-edge tech like advanced ray tracing, Unreal’s Nanite virtualized geometry, and GPU-accelerated ray tracing. The best part? You can still run it without expensive software licenses!

As AI and neural rendering reshape graphics, SPEC promises to keep evolving this trusted benchmark to help everyone—from individual buyers to major enterprises—make smarter hardware decisions.

(Source: SPEC)

New graphics technologies, from advanced ray tracing to virtualized geometry, offer exciting new capabilities for application developers, design engineers, game developers, and users—all of whom demand ever more graphics power. However, the added complexity of these technologies has made it harder for hardware vendors and system buyers to compare the performance of graphics applications on differently configured computing systems.

Fortunately, SPEC, which is celebrating 30 years of providing the SPECviewperf benchmark to the industry, has delivered a new version that keeps pace with these technical advances, enabling unbiased, vendor-neutral comparisons of application performance on the latest generation of platforms.

History of the SPECviewperf benchmark

Three decades ago, the SPEC Graphics and Workstation Performance Group (SPEC/GWPG) developed the SPECviewperf benchmark—then called Viewperf—the first benchmark to use real-world datasets (viewsets), tests, and weighting to provide consistent results across OpenGL implementations. It was also the first benchmark developed in cooperation with the independent software vendors (ISVs) creating the major graphics applications of the day, including PTC’s CDRS, IBM Data Explorer, Intergraph DesignReview, Alias Wavefront’s Advanced Visualizer, and the Lightscape Visualization System.

A key to the huge success of the benchmark was that it could be run without installing licenses for the represented applications, which meant vendors and system buyers could make accurate comparisons of hardware performance without the prohibitive cost of purchasing software licenses.

The SPECviewperf 15 benchmark

Over the years, the SPECviewperf benchmark has continued to evolve to provide the industry with a standard and consistent way of measuring graphics performance, keeping pace with evolving hardware, applications, and user requirements.

Today, eight modern graphics applications are represented in the SPECviewperf 15 benchmark: 3ds Max, Catia, Creo, Maya, SolidWorks, Unreal Engine, Blender, and Enscape. The benchmark now measures the 3D graphics performance of systems running under the OpenGL, DirectX, and Vulkan application programming interfaces (APIs). As before, it can be run without installing licenses for the represented applications, and its diverse set of modern workloads is easy to install and run and provides high-quality, consistent results.

The new benchmark also provides insight into the performance impact of new processor-intensive graphics technologies, including:

  • Advanced ray tracing, which simulates how light behaves in the real world to create realistic images and real-time graphics. It works by tracing the path of light rays from a virtual camera through a 3D scene, simulating interactions with objects and gathering information about visual appearance, including color, texture, and lighting (see the sketch following this list).
  • Unreal Engine’s Nanite, a virtualized geometry system that uses a new internal mesh format and rendering technology to render pixel-scale detail and high object counts. Nanite offers orders-of-magnitude increases in real-time geometry complexity and higher triangle and object counts than were previously possible.
  • Frame-rate and latency improvements when Temporal Super Resolution, Unreal Engine’s built-in temporal upscaling technology, is enabled.
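
To make the ray-tracing description above concrete, here is a minimal, self-contained Python sketch of the core idea: a primary ray is cast from a virtual camera through each pixel, intersected with scene geometry, and shaded based on the surface it hits. The single-sphere scene, light direction, and ASCII output are illustrative assumptions for this sketch only; production ray tracers (and the benchmark’s workloads) are vastly more sophisticated.

    # Minimal ray-tracing sketch: one primary ray per pixel, one sphere,
    # simple Lambertian shading. All scene values are illustrative only.
    import math

    WIDTH, HEIGHT = 64, 32                       # tiny image for demonstration
    SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
    LIGHT_DIR = (0.577, 0.577, 0.577)            # roughly normalized light direction

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def intersect_sphere(origin, direction):
        # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
        oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
        b = 2.0 * dot(oc, direction)
        c = dot(oc, oc) - SPHERE_RADIUS ** 2
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return None                          # the ray misses the sphere
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0.0 else None

    def trace(px, py):
        # Map the pixel to a point on a virtual image plane at z = -1,
        # then cast a normalized ray from the camera at the origin.
        x = 2.0 * (px + 0.5) / WIDTH - 1.0
        y = 1.0 - 2.0 * (py + 0.5) / HEIGHT
        length = math.sqrt(x * x + y * y + 1.0)
        direction = (x / length, y / length, -1.0 / length)
        t = intersect_sphere((0.0, 0.0, 0.0), direction)
        if t is None:
            return 0.0                           # background: no light gathered
        hit = tuple(t * d for d in direction)
        normal = tuple((h - c) / SPHERE_RADIUS for h, c in zip(hit, SPHERE_CENTER))
        return max(0.0, dot(normal, LIGHT_DIR))  # Lambertian (diffuse) shading

    SHADES = " .:-=+*#%@"                        # brightness rendered as ASCII
    for row in range(HEIGHT):
        print("".join(SHADES[int(trace(col, row) * 9)] for col in range(WIDTH)))

Real workloads extend this same loop with secondary rays for reflections, shadows, and global illumination, which is precisely what makes advanced ray tracing so demanding on GPUs.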

Under the hood, the SPECviewperf 15 benchmark has several new and updated workloads:

  • New workloads representing significant new use cases include:
    • Blender-01—An OpenGL benchmark highlighting the use of Blender 3.6 LTS in content-creation use cases.
    • Unreal_engine-01—A DirectX 12 benchmark highlighting content-creation use cases that rely on Epic’s Unreal Engine 5.4 with advanced rendering technologies such as Lumen, Nanite, and Temporal Super Resolution.
    • Enscape-01—A Vulkan benchmark highlighting GPU-accelerated ray tracing as used by the Chaos Enscape 4.0 application in architectural visualization.
  • Updated workloads include:
    • 3dsmax-08 workload updated with traces from Autodesk 3ds Max 2023, including subsets of KitBash3D’s Mission to Minerva model and Materials Kit, based on real-world production data commonly used by game developers and filmmakers.
    • Catia-07 workload updated with traces from the 2022x version of Dassault Systèmes 3DExperience Catia. Traces from Catia v5 are also included in the workload.
    • Creo-04 workload updated with traces from PTC Creo 9.
    • Maya-07 workload updated with traces from Autodesk Maya 2025. The update also includes two new models: Apollo as well as Sol and Solette.
    • Solidworks-08 workload updated with traces from Dassault Systèmes SolidWorks 2024.

How individuals, enterprises, and vendors use the SPECviewperf 15 benchmark

The following are typical scenarios demonstrating the value of the benchmark among the various types of users.

Enterprises—Configuring system upgrades.

John’s team of video game designers is beginning to use the Nanite virtualized geometry system to import film-quality source art, such as ZBrush sculpts and photogrammetry scans, directly into Unreal Engine 5 to create dramatically more realistic worlds. However, the slow performance of their existing workstations has led to slipped release cycles, missed deadlines, and frustrated staff.

John has been given a budget for upgrading the workstations and needs to specify the optimal configuration within that budget. Given the pace of the technology and the complexity of the graphics tasks his group will be working on, however, John knows he needs expert help to make the best decision, so he turns to SPEC and the SPECviewperf 15 benchmark. On the benchmark’s web page, John finds many published results from vendors and enterprises that have measured the performance of their systems using the viewsets included in the benchmark, giving him confidence that the results reflect the performance gains his team would see with different configurations.

To make his final decision, John develops a request for proposal (RFP) to send to three different vendors, asking for demo systems with two different configurations within his budget. He then downloads the SPECviewperf 15 benchmark so he can directly compare the results from his current systems to the performance of the demo systems.

Individuals—Balancing performance and cost.

Joyce has been using the same workstation to develop product designs since 2020, but after upgrading to the latest version of SolidWorks, she’s noticing significant lag when trying to manipulate models. After looking at options for upgrading her GPU, she feels overwhelmed, until she notices that several system reviews make GPU performance comparisons based on viewsets in the SPECviewperf 15 benchmark. She decides to download the benchmark, install the SolidWorks viewset, and run it on her current system—all for free. 

In just a few minutes, she can compare the results with those published on the SPEC website and in articles she is reading. This enables her to confidently select a midrange workstation GPU that offers the best balance between performance and cost.

Vendors—Ensuring product performance and competitive positioning.

Alex, a product manager at a graphics hardware manufacturer, is preparing for the launch of their latest professional graphics card. Months before release, Alex works closely with engineering and validation teams to run the full suite of SPECviewperf workloads on pre-production hardware, verifying that performance is consistent across driver builds and matches or exceeds expectations. When an unexpected performance regression appears in one of the workloads, quality assurance engineers turn to SPECviewperf’s repeatable tests to isolate the cause, then work closely with driver engineers to implement and verify the fix. This process ensures that any issues are addressed well before the product reaches customers.

In parallel, performance engineers analyze SPECviewperf benchmark results to identify opportunities for refining hardware and driver behavior in ways that benefit the professional applications and workflows represented by the benchmark. These insights guide adjustments that improve efficiency, stability, and responsiveness for the real-world scenarios the benchmark is designed to emulate.

When the product is ready, Alex partners with the marketing team to publish SPECviewperf results as vendor-neutral proof points that support performance claims and reinforce competitive positioning in product launches, customer proposals, and partner engagements.

The future of the SPECviewperf benchmark

The next wave of graphics technology innovation is on the way: AI and machine learning (ML) will enable more realistic graphics and more immersive experiences, and will also make content creation more accessible.

These new technologies include improvements in neural rendering and real-time graphics, such as:

  • Neural radiance fields (NeRFs) and derivatives that reconstruct incredibly realistic 3D scenes from 2D images (a conceptual sketch follows this list).
  • Enhanced AI-powered upscaling and frame generation to reconstruct higher-resolution frames from lower-resolution inputs and generate entirely new frames.
  • Neural shaders that can learn and generate textures, materials, and lighting.
  • AI-powered ray tracing to enable real-time ray tracing by reducing noise in rendered images.  
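
As a rough illustration of the NeRF idea in the first item, the sketch below shows the core data flow: a small neural network maps a 3D position and viewing direction to a color and a density, and a pixel’s color is produced by alpha-compositing those values along a camera ray. The network here is randomly initialized rather than trained, and the layer sizes, sampling range, and example ray are illustrative assumptions only.

    # Conceptual NeRF sketch: an (untrained) network queried along one ray.
    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(6, 32))   # input: (x, y, z, dx, dy, dz)
    W2 = rng.normal(scale=0.5, size=(32, 4))   # output: (r, g, b, density)

    def field(position, view_dir):
        # Query the radiance field at one point: returns (rgb, density).
        h = np.tanh(np.concatenate([position, view_dir]) @ W1)
        out = h @ W2
        rgb = 1.0 / (1.0 + np.exp(-out[:3]))   # sigmoid keeps colors in [0, 1]
        density = np.log1p(np.exp(out[3]))     # softplus keeps density >= 0
        return rgb, density

    def render_ray(origin, direction, near=0.0, far=4.0, samples=32):
        # Volume-render one ray by alpha-compositing sampled field values.
        dt = (far - near) / samples
        color = np.zeros(3)
        transmittance = 1.0                    # fraction of light not yet absorbed
        for t in np.linspace(near, far, samples):
            rgb, density = field(origin + t * direction, direction)
            alpha = 1.0 - np.exp(-density * dt)    # opacity of this segment
            color += transmittance * alpha * rgb
            transmittance *= 1.0 - alpha
        return color

    print(render_ray(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0])))

In a real NeRF, the network weights are optimized so that rendered rays match the pixels of captured 2D photographs; evaluating millions of such rays per frame is exactly the kind of load that will stress future GPUs.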

These new technical capabilities will put more pressure on system performance, and therefore on vendors, enterprises, and individuals to make informed choices about how and when to upgrade computing systems. SPEC is committed to continuing to update the SPECviewperf benchmark to help guide the decision-making of all these industry stakeholders.

Ross Cunniff is the chair of the Standard Performance Evaluation Corporation’s (SPEC) Graphics Performance Characterization committee. He has more than 40 years of experience in the tech industry, including nearly 25 years with Nvidia, where he serves as a systems software development manager.

Anthony Mansur is the vice chair of SPEC’s Graphics Performance Characterization Committee. He currently serves as a graphics software engineer with Intel, where he focuses on delivering performance and quality for professional workloads on Intel Arc Pro-series GPUs. He holds a master’s degree in computer graphics and game technology from the University of Pennsylvania.