Gauge factor

From Wikipedia, the free encyclopedia

Gauge factor (GF) or strain factor of a strain gauge is the ratio of the relative change in electrical resistance R to the mechanical strain ε. The gauge factor is defined as:[1]

GF = \frac{\Delta R / R}{\varepsilon} = \frac{\Delta \rho / \rho}{\varepsilon} + 1 + 2\nu

where

  • ε = strain = ΔL/L
    • ΔL = absolute change in length
    • L = original length
  • ν = Poisson's ratio
  • ρ = resistivity
  • ΔR = change in strain gauge resistance due to axial strain and lateral strain
  • R = unstrained resistance of strain gauge
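The defining ratio can be sketched numerically. The resistance and strain values below are illustrative assumptions, not figures from the article:

```python
# Gauge factor from its definition: GF = (dR/R) / strain.
# Illustrative numbers (assumed): a 120-ohm gauge whose resistance
# rises by 0.12 ohm under 500 microstrain.
def gauge_factor(delta_r, r0, strain):
    """Ratio of relative resistance change to mechanical strain."""
    return (delta_r / r0) / strain

delta_r = 0.12      # ohms, change in resistance
r0 = 120.0          # ohms, unstrained resistance
strain = 500e-6     # dimensionless (500 microstrain)

gf = gauge_factor(delta_r, r0, strain)
print(gf)  # 2.0
```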

Piezoresistive effect[edit]

It is a common misconception that the change in resistance of a strain gauge is based solely, or most heavily, on the geometric terms. This is true for some materials whose resistivity does not change with strain (Δρ = 0), and the gauge factor is then simply:

GF = 1 + 2\nu

However, most commercial strain gauges utilise resistors made from materials that demonstrate a strong piezoresistive effect. The resistivity of these materials changes with strain, accounting for the (Δρ/ρ)/ε term of the defining equation above. In constantan strain gauges (the most commercially popular), the effect accounts for 20% of the gauge factor, but in silicon gauges, the contribution of the piezoresistive term is much larger than the geometric terms. This can be seen in the general examples of strain gauges below:

Material                           Gauge factor
Metal foil strain gauge            2–5
Thin-film metal (e.g. constantan)  2
Single crystal silicon             −125 to +200
Polysilicon                        ±30
Thick-film resistors               100
p-type Ge                          102
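The split between the geometric and piezoresistive contributions can be checked with a short calculation. The sketch below assumes a constantan gauge with GF = 2.0 and a Poisson's ratio of about 0.3 (illustrative values):

```python
# Split a measured gauge factor into its geometric part (1 + 2*nu)
# and the piezoresistive remainder, (d_rho/rho)/strain.
def piezoresistive_fraction(gf, nu):
    geometric = 1 + 2 * nu   # contribution from the shape change alone
    piezo = gf - geometric   # contribution from the resistivity change
    return piezo / gf        # fraction of GF due to piezoresistivity

# Assumed constantan values: GF = 2.0, nu = 0.3.
frac = piezoresistive_fraction(2.0, 0.3)
print(round(frac, 2))  # 0.2, i.e. roughly the 20% quoted for constantan
```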

Effect of temperature[edit]

The definition of the gauge factor does not involve temperature; however, the gauge factor only relates resistance to strain if temperature effects are absent. In practice, where changes in temperature or temperature gradients exist, the equation relating resistance to strain acquires a temperature term. The total effect is:

\frac{\Delta R}{R} = GF\,\varepsilon + \alpha_T\,\Delta T

where α_T is the temperature coefficient of resistance and ΔT is the change in temperature.

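A minimal numeric sketch of this combined effect, assuming the linear form ΔR/R ≈ GF·ε + α·ΔT and illustrative coefficient values (not from the article):

```python
# Combined relative resistance change from strain and temperature,
# assuming dR/R = GF*strain + alpha*dT.
GF = 2.0         # gauge factor (typical metal-foil value)
alpha = 1e-5     # per kelvin, temperature coefficient of resistance (assumed)
strain = 100e-6  # applied strain
dT = 5.0         # kelvin, temperature change

strain_part = GF * strain    # 2e-4: the signal we want
temp_part = alpha * dT       # 5e-5: spurious temperature contribution
dr_over_r = strain_part + temp_part

# Here the temperature term is 25% of the strain signal, which is why
# practical gauges need temperature compensation.
print(dr_over_r)
```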
  1. ^ Beckwith, Thomas G.; Buck, N. Lewis; Marangoni, Roy D. (1982). Mechanical Measurements (3rd ed.). Reading, MA: Addison-Wesley Publishing Co. p. 360. ISBN 0-201-00036-9.