The renamed performance evaluation software features new and updated workloads throughout the suite; it is free of charge to users.
By Bob Cramblitt
SPECworkstation 3, an all-new version of the benchmark formerly known as SPECwpc, was released on October 31. It is developed and maintained by the SPEC Workstation Performance Characterization (SPECwpc) group, which includes AMD, Dell, Fujitsu, HP, Intel, Lenovo, and NVIDIA. Users can download the benchmark free of charge from the SPEC website.
Five years in the making
The first version of the SPEC workstation benchmark was introduced five years ago. It was the first benchmark to provide a comprehensive measure of total workstation performance based on professional applications. The benchmark did not require the target applications to be installed on the user’s system. It was relatively easy to use and could be run without intervention in a few hours.
In the years since its introduction, the benchmark has been expanded to provide a wider scope of testing for new features and technologies. Improvements have been made for greater ease of use, more representative capturing of real-world applications, and better reporting and validation of results.
SPECwpc 2.0, released in November 2015, improved scalability measurement for workstations with large numbers of processing cores. It fully integrated solid-state storage into performance testing and updated several workloads within the benchmark suite.
SPECworkstation 3 comprises more than 30 workloads containing nearly 140 tests to exercise CPU, graphics, I/O, and memory bandwidth. The workloads are divided by application categories that include media and entertainment (3D animation, rendering), product development (CAD/CAM/CAE), life sciences (medical, molecular), financial services, energy (oil and gas), general operations, and GPU compute. A full description of the workloads can be found on the SPEC website.
New features and upgrades in SPECworkstation 3 include:
- A totally redesigned storage workload based on traces of nearly two dozen applications.
- Workloads that reflect updated versions of the Blender, Handbrake, Python, and LuxRender applications.
- GPU-accelerated workloads based on LuxRender, Caffe, and Folding@Home applications.
- Refreshed graphics workloads from SPECviewperf 13, including new viewsets representing Autodesk Maya, PTC Creo, energy (oil & gas), and medical applications.
- A more robust GUI and improved results validation and error reporting.
- An option to report results based on subsystems in addition to vertical market segments.
Most of the development effort for SPECworkstation 3 went into the new workloads dedicated to measuring storage and GPU compute performance.
Keeping up with storage changes
Tom Fisher, SPECwpc chair, says that the ever-changing storage environment created the need for a different type of testing in SPECworkstation 3.
“Storage is an area undergoing rapid change with the proliferation of solid-state devices, new technologies like NAND and 3D XPoint, and various PCIe connectivity options. The new storage tests take the performance aspects of these new developments into consideration. They give users a much better idea of how various devices affect the performance of their workstations when running the applications most relevant to their day-to-day work.”
Taking a page from the SPECviewperf design philosophy, the SPECwpc group developed a storage workload that uses actual traces from real applications and plays them back as faithfully as possible.
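SPEC has not published the internals of its trace format, but the record-and-replay idea itself is straightforward. The following is a minimal sketch, assuming an invented trace format of (operation, offset, size) records and a hypothetical replay_trace helper; the real benchmark replays far richer traces captured from the applications named in the next paragraph.

```python
import os
import time

# Hypothetical trace records: (operation, byte offset, transfer size).
# This format is invented purely to illustrate record-and-replay; it is
# not the format SPECworkstation 3 actually uses.
TRACE = [
    ("write", 0, 4096),
    ("read", 0, 4096),
    ("write", 1_048_576, 65_536),
    ("read", 1_048_576, 65_536),
]

def replay_trace(path, trace):
    """Replay a list of I/O records against a scratch file, returning seconds."""
    buf = os.urandom(max(size for _, _, size in trace))
    start = time.perf_counter()
    with open(path, "r+b") as f:
        for op, offset, size in trace:
            f.seek(offset)
            if op == "write":
                f.write(buf[:size])
                os.fsync(f.fileno())  # make sure the write reaches the device
            else:
                f.read(size)
    return time.perf_counter() - start

# Pre-size a scratch file large enough for every trace record, then replay.
with open("scratch.bin", "wb") as f:
    f.truncate(2 * 1_048_576)
print(f"replayed {len(TRACE)} operations in {replay_trace('scratch.bin', TRACE):.6f} s")
```

The faithfulness Fisher describes comes from preserving the original applications' access patterns, offsets, and transfer sizes rather than generating synthetic I/O.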
Applications that were traced for the new storage workload include 7-Zip, Adobe Media Encoder, Adobe Premiere Pro, Ansys Icepak, Ansys Mechanical, Autodesk 3ds Max, Autodesk Maya, Autodesk Revit, Blender, CalculiX, Dassault Systèmes SolidWorks, Handbrake, LAMMPS, Microsoft Visual Studio 2015, NAMD, the SPECviewperf 13 energy viewset, and the SPECworkstation 3 WPCcfd workload.
Characterizing GPU performance
Accurately representing GPU performance for a wide range of professional applications poses a unique set of challenges for benchmark developers.
“Applications behave very differently, so producing a benchmark that measures a variety of application behaviors and runs in a reasonable amount of time presents difficulties,” says Jon Konieczny, an AMD representative in the SPECwpc group.
“Even within a given application, different models and modes can produce very different GPU behavior, so ensuring sufficient test coverage is a key to producing a comprehensive performance picture.”
Another major consideration is recognizing the differences between CPU and GPU performance measurement, according to Fisher.
“Generally speaking, the CPU has an architecture with many complexities that allow it to execute a wide variety of codes quickly. The GPU, on the other hand, is purpose-built to execute pretty much the same set of operations on many pieces of data, such as shading every pixel on the screen with the same set of operations.”
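A toy sketch can make that contrast concrete (this is illustration only, not SPEC benchmark code): the scalar loop below mimics a CPU visiting one data element at a time, while the vectorized NumPy expression stands in for a GPU applying the identical shading operation to every pixel at once. The array shape and shading formula are arbitrary.

```python
import numpy as np

# Contrast between the two execution styles Fisher describes.
# NumPy's vectorized arithmetic is used here as a CPU-side stand-in
# for a GPU's data-parallel execution model.

pixels = np.random.rand(64, 64, 3)  # a small tile of RGB values

def shade_scalar(img):
    """CPU-style code: visit each pixel one at a time."""
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = img[i, j] * 0.8 + 0.1  # same math, applied serially
    return out

def shade_parallel(img):
    """GPU-style data parallelism: express the operation once for all pixels."""
    return img * 0.8 + 0.1

# Both produce identical results; the difference is how the work is mapped
# onto the hardware, which is exactly what a GPU compute benchmark measures.
assert np.allclose(shade_scalar(pixels), shade_parallel(pixels))
```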
SPECworkstation 3 includes a dedicated suite for measuring GPU compute performance. It includes three workloads:
- LuxRender, which uses LuxMark, a benchmark based on the new LuxCore physically based renderer, to render a chrome sphere resting on a grid of numbers in a beach scene.
- Caffe, a deep-learning framework developed by Berkeley AI Research (BAIR) and by community contributors. Yangqing Jia created the project during his PhD studies at UC Berkeley.
- Folding@home (FAH or F@h), a distributed computing project for disease research that simulates protein folding, computational drug design, and other types of molecular dynamics.
Benchmarks can never rest
Accompanying the release of SPECworkstation 3 are initial benchmarking results from SPECwpc members, which will be expanded over time. Results will also begin to appear shortly in new product announcements, workstation reviews, and requests for proposals, as users gravitate to the newest version of the benchmark.
“We think SPECworkstation 3 represents the computing industry’s most comprehensive benchmark for measuring performance based on professional workstation applications,” says Fisher, “but we can’t rest on our laurels. It takes continuous improvement to keep up with the evolution of hardware, software, and applications. As always, we welcome any and all input from the user community.”
Bob Cramblitt is communications director for SPEC. He writes frequently about performance issues and digital design, engineering, and manufacturing technologies. To find out more about graphics and workstation benchmarking, visit the SPEC/GWPG website, subscribe to the SPEC/GWPG e-newsletter, or join the Graphics and Workstation Benchmarking LinkedIn group: https://www.linkedin.com/groups/8534330.