Standard Performance Evaluation Corporation

"SPEC" redirects here. For other uses, see SPEC (disambiguation).
Standard Performance Evaluation Corporation
Formation 1988
Type Not-for-profit
Headquarters Gainesville, Virginia
Membership Hardware & Software Vendors, Universities, Research Centers
Staff 4

The Standard Performance Evaluation Corporation (SPEC) is an American non-profit organization that aims to "produce, establish, maintain and endorse a standardized set" of performance benchmarks for computers.[1]

SPEC was founded in 1988.[2][3] SPEC benchmarks are widely used to evaluate the performance of computer systems; the test results are published on the SPEC website. Results are sometimes informally referred to as "SPECmarks" or just "SPEC".

SPEC evolved into an umbrella organization encompassing four groups: the Graphics and Workstation Performance Group (GWPG), the High Performance Group (HPG), the Open Systems Group (OSG), and the newest, the Research Group (RG). More details are available on SPEC's website.


Membership in SPEC is open to any interested company or entity that is willing to commit to SPEC's standards. Membership allows:

  • Participation in benchmark development
  • Participation in review of results
  • Complimentary software based on group participation

The list of members is available on SPEC's membership page.

Membership levels

  • Sustaining Members pay full dues; these are typically hardware or software companies.
  • SPEC "Associates" pay a reduced fee; these are typically universities.
  • SPEC "Supporting Contributors" are invited to participate in the development of a single benchmark and do not pay fees.

SPEC benchmark suites

The benchmarks aim to test "real-life" situations. There are several benchmarks testing Java scenarios, from simple computation (SPECjbb) to a full system with Java EE, database, disk, and network (SPECjEnterprise).

The SPEC CPU suites test CPU performance by measuring the run time of several programs, such as the compiler gcc, the chemistry program gamess, and the weather program WRF. The various tasks are equally weighted; no attempt is made to weight them by their perceived importance. The overall score is the geometric mean of the individual benchmark results.
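The scoring scheme above can be sketched in a few lines. This is an illustrative example only: the reference and measured run times below are invented, not actual SPEC data, and the per-benchmark ratio (reference time divided by measured time) follows the general SPEC CPU approach of equal weighting combined by geometric mean.

```python
import math

# Hypothetical reference and measured run times in seconds for three
# CPU2006-style workloads; names and numbers are illustrative only.
ref_times = {"gcc": 8050.0, "gamess": 19581.0, "wrf": 11170.0}
run_times = {"gcc": 402.5, "gamess": 652.7, "wrf": 558.5}

# Each benchmark's ratio is reference time / measured time, so every
# program carries equal weight regardless of its absolute run time.
ratios = [ref_times[b] / run_times[b] for b in ref_times]

# The overall score is the geometric mean of the per-benchmark ratios,
# which keeps any single workload from dominating the result.
overall = math.prod(ratios) ** (1.0 / len(ratios))
print(round(overall, 1))
```

Because the geometric mean multiplies the ratios, halving the run time of one benchmark improves the overall score by the same factor no matter which benchmark it is.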


SPEC benchmarks are written in a platform-neutral programming language (usually C, Java, or Fortran), and interested parties may compile the code using whatever compiler they prefer for their platform, but may not change the code. Manufacturers have been known to optimize their compilers to improve performance on the various SPEC benchmarks; SPEC has rules that attempt to limit such optimizations.


To use a benchmark, a license must be purchased from SPEC; costs vary from test to test, typically ranging from several hundred to several thousand dollars. This pay-for-license model might seem to violate the GPL, as the benchmarks include software such as GCC that is licensed under the GPL. However, the GPL does not require software to be distributed for free, only that recipients be allowed to redistribute any GPLed software that they receive; the SPEC license agreement specifically exempts items under "licenses that require free distribution", and those files are placed in a separate part of the overall software package.



  • SPECapc for 3ds Max™ 2011, performance evaluation software for systems running Autodesk 3ds Max 2011.
  • SPECapcSM for Lightwave 3D 9.6, performance evaluation software for systems running NewTek LightWave 3D v9.6 software.
  • SPEC CPU2006, combined performance of CPU, memory and compiler.
    • CINT2006 ("SPECint"), testing integer arithmetic, with programs such as compilers, interpreters, word processors, chess programs etc.
    • CFP2006 ("SPECfp"), testing floating point performance, with physical simulations, 3D graphics, image processing, computational chemistry etc.
  • SPECjbb2005, evaluates the performance of server side Java by emulating a three-tier client/server system (with emphasis on the middle tier). "SPECjbb2005 will be retired on October 1, 2013 in favor of its successor, SPECjbb2013"
  • SPECjEnterprise2010, a multi-tier benchmark for measuring the performance of Java 2 Enterprise Edition (J2EE) technology-based application servers.
  • SPECjms2007, Java Message Service performance
  • SPECjvm2008, measuring basic Java performance of a Java Runtime Environment on a wide variety of both client and server systems.
  • SPECapc, performance of several 3D-intensive popular applications on a given system
  • SPEC MPI2007, for evaluating performance of parallel systems using MPI (Message Passing Interface) applications.
  • SPEC OMP2001 V3.2, for evaluating performance of parallel systems using OpenMP (Open Multi-Processing) applications.
  • SPECpower_ssj2008, evaluates the energy efficiency of server systems.
  • SPECsfs2008, file server throughput and response time, supporting both NFS and CIFS protocol access
  • SPECsip_Infrastructure2011, SIP server performance
  • SPECviewperf 11, performance of an OpenGL 3D graphics system, tested with various rendering tasks from real applications
  • SPECvirt_sc2010 ("SPECvirt"), evaluates the performance of datacenter servers used in virtualized server consolidation environments. "SPECvirt_sc2010 will be retired on February 26, 2014 in favor of its successor, SPECvirt_sc2013"


Benchmarks under development

  • SOA: according to SPEC's website in late 2010, a subcommittee is investigating benchmarks for Service Oriented Architecture (SOA).


Retired benchmarks

  • SPEC CPU2000
  • SPEC CPU95
  • SPEC CPU92
  • SPEC HPC96
  • SPEC HPC2002 (no longer available)
  • SPECjAppServer2001
  • SPECjAppServer2002
  • SPECjAppServer2004
  • SPECjbb2000
  • SPECjvm98
  • SPECmail2009
  • SPECmail2008
  • SPECmail2001
  • SPEC SDM91
  • SPEC SFS97_R1
  • SPEC SFS93
  • SPEC SMT97
  • SPECweb96
  • SPECweb99
  • SPECweb99_SSL
  • SPECweb2005
  • SPECweb2009


SPEC attempts to create an environment where arguments are settled by appeal to notions of technical credibility, representativeness, or the "level playing field". SPEC representatives are typically engineers with expertise in the areas being benchmarked. Benchmarks include "run rules", which describe the conditions of measurement and documentation requirements. Results that are published on SPEC's website undergo a peer review by members' performance engineers.


  1. ^ "SPEC Frequently Asked Questions". Retrieved 15 March 2010. 
  2. ^ "The SPEC Organization". Retrieved 15 March 2010. 
  3. ^ "SPEC Membership". Retrieved 15 March 2010. 
  • Kant, Krishna (1992). Introduction to Computer System Performance Evaluation. New York: McGraw-Hill Inc. pp. 16–17. ISBN 0-07-033586-9. 

External links