# Settling time

Settling time is the time required for an output to reach and remain within a given error band following some input stimulus.

In control theory, the settling time of a dynamical system such as an amplifier or other output device is the time elapsed from the application of an ideal instantaneous step input to the time at which the output has entered and remained within a specified error band.

Settling time includes a propagation delay, plus the time required for the output to slew to the vicinity of the final value, recover from the overload condition associated with slewing, and finally settle to within the specified error band.

Systems with energy storage cannot respond instantaneously and will exhibit transient responses when they are subjected to inputs or disturbances.[1]

## Definition

Tay, Mareels and Moore (1998) defined settling time as "the time required for the response curve to reach and stay within a range of certain percentage (usually 5% or 2%) of the final value."[2]
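This definition can be applied directly to sampled response data: scan the response from the end and find the last instant at which it lies outside the tolerance band. The sketch below (not from the source) measures the 2% settling time of an underdamped second-order system, using the standard closed-form unit-step response; the step size `1e-4` and simulation length are arbitrary choices.

```python
import math

def settling_time(y, t, tol=0.02, final=1.0):
    """Return the last time the response y(t) lies outside the
    +/- tol band around its final value; the band is entered for
    good just after this instant."""
    for yi, ti in zip(reversed(y), reversed(t)):
        if abs(yi - final) > tol * abs(final):
            return ti
    return t[0]  # never left the band

# Analytic unit-step response of an underdamped second-order system
# with damping ratio zeta and natural frequency wn (rad/s).
zeta, wn = 0.1, 10.0
wd = wn * math.sqrt(1 - zeta**2)          # damped natural frequency
t = [i * 1e-4 for i in range(100_000)]    # 0 .. 10 s
y = [1 - math.exp(-zeta * wn * ti) *
     (math.cos(wd * ti) + zeta / math.sqrt(1 - zeta**2) * math.sin(wd * ti))
     for ti in t]

ts = settling_time(y, t, tol=0.02)
```

For these values ζω_n = 1, so the measured time lands close to the approximation -ln(0.02)/(ζω_n) ≈ 3.9 s derived below; it is slightly smaller because the oscillating response re-enters the band before its envelope does.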

## Mathematical detail

Settling time depends on the system's damping ratio and natural frequency.

The settling time for a second-order, underdamped system responding to a step input can be approximated, if the damping ratio ${\displaystyle \zeta \ll 1}$, by

${\displaystyle T_{s}=-{\frac {\ln({\text{tolerance fraction}})}{\zeta \omega _{n}}}}$

A general form is

${\displaystyle T_{s}=-{\frac {\ln \left({\text{tolerance fraction}}\times {\sqrt {1-\zeta ^{2}}}\right)}{\zeta \omega _{n}}}}$

Thus, if the damping ratio ${\displaystyle \zeta \ll 1}$, the settling time to within a 2% tolerance band (tolerance fraction 0.02) is:

${\displaystyle T_{s}=-{\frac {\ln(0.02)}{\zeta \omega _{n}}}\approx {\frac {3.9}{\zeta \omega _{n}}}}$