Nuclear timescale

From Wikipedia, the free encyclopedia

In astrophysics, the nuclear timescale is an estimate of the lifetime of a star based solely on its rate of fuel consumption. Along with the thermal and dynamical time scales, it is used to estimate how long a star will remain in a particular phase of its life, and its overall lifespan under idealized assumptions. In reality, the lifespan of a star is greater than what is estimated by the nuclear time scale because as one fuel becomes scarce, another will generally take its place—hydrogen burning gives way to helium burning, and so on. However, all the phases after hydrogen burning combined typically add up to less than 10% of the duration of hydrogen burning.

Stellar astrophysics

Hydrogen generally determines a star's nuclear lifetime because it is the main source of fuel in a main sequence star. Hydrogen fuses into helium in the star's core, so a helium core gradually builds up there; when the core hydrogen has been exhausted, the star moves on to another phase of its life, burning hydrogen in a shell around the helium core and eventually burning the helium itself.


\tau_{\text{nuc}} = \frac{\text{total mass of fuel available}}{\text{rate of fuel consumption}} \times \text{fraction of star over which fuel is burned} = \frac{MX}{L/Q} \times F


where M is the mass of the star, X is the fraction of the star (by mass) that is composed of the fuel, L is the star's luminosity, Q is the energy released per unit mass of fuel by nuclear fusion (obtained from the mass deficit of the reaction), and F is the fraction of the star's fuel that is actually burned (F is typically about 0.1). As an example, the Sun's nuclear time scale is approximately 10 billion years.
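The estimate above can be sketched numerically. The following Python snippet evaluates the formula for the Sun; the numerical inputs (hydrogen mass fraction, energy yield per kilogram of hydrogen fused, and the burned fraction F) are assumed illustrative values, not figures taken from this article.

```python
# Rough estimate of the Sun's nuclear timescale:
#   tau_nuc = (M * X) / (L / Q) * F
# All input values below are assumed round numbers for illustration.

M = 1.989e30   # mass of the Sun, kg
X = 0.7        # hydrogen mass fraction (assumed)
L = 3.828e26   # solar luminosity, W
Q = 6.3e14     # energy released per kg of hydrogen fused (~0.7% of m*c^2), J/kg
F = 0.1        # fraction of the star over which fuel is burned (assumed)

SECONDS_PER_YEAR = 3.156e7

tau_nuc_seconds = (M * X) / (L / Q) * F
tau_nuc_years = tau_nuc_seconds / SECONDS_PER_YEAR

print(f"Nuclear timescale: {tau_nuc_years:.2e} years")
```

With these inputs the result comes out on the order of 10^10 years, consistent with the roughly 10-billion-year figure quoted for the Sun; the exact number shifts with the assumed values of X, Q, and F.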