User:Forkandwait/Lee Carter Model
= Overview =


The Lee-Carter model is a numerical algorithm used in [[mortality forecasting]]. The input to the method is a matrix of mortality rates sorted monotonically by time, usually with ages in columns and years in rows; the output is a forecasted matrix of mortality rates, from which life expectancy and other [[life table]] measures can be derived. The algorithm (essentially) uses the [[Singular Value Decomposition]] (SVD) to find a [[univariate]] [[time series]] that captures 80-90% of the mortality trend; in the literature this series is referred to as <math>k_t</math>, with <math>t</math> signifying time. The algorithm also creates a base set of age-specific mortality rates, <math>a_x</math>, computed as the mean of the logged rates at each age (the log rates are centered by subtracting this mean before the SVD), and a vector <math>b_x</math> that describes the amount of mortality change at a given age for a unit of overall mortality change; the subscript <math>x</math> refers to age.

Future mortality is derived by forecasting <math>k_t</math> with univariate [[ARIMA]] methods, using <math>b_x</math> and <math>a_x</math> to recover a set of logged mortality rates for each forecast year, and finally recovering ordinary mortality rates by exponentiating the forecasted log rates. In most implementations, [[confidence intervals]] for the forecasts are generated by simulating multiple <math>k_t</math> paths with [[Monte Carlo method|Monte Carlo]] methods, using the [[standard error]]s of the time series parameters; mortality between the 5th and 95th percentiles of the simulated results is taken as the forecast interval. Additionally, many researchers adjust the fitted <math>k_t</math> vector by refitting it so that, given the <math>a_x</math> and <math>b_x</math> just estimated, it reproduces the life expectancy calculated for each year; the resulting changes to <math>k_t</math> are usually small.

Without SVD or some other method of [[dimension reduction]], the table of mortality data is a highly correlated multivariate time series (each age group forms a dimension in addition to time), and the complexity of such multidimensional series makes them almost impossible to forecast.
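As a concrete illustration of the pipeline just described, the sketch below implements the fit-and-forecast steps in Python with NumPy. It is a minimal illustration rather than a reference implementation: the function name <code>lee_carter_forecast</code>, the array layout (years in rows, ages in columns, as above), the choice of a random walk with drift as the time series model for <math>k_t</math>, and the way drift uncertainty is folded into the simulation are assumptions made for the example.

<syntaxhighlight lang="python">
import numpy as np

def lee_carter_forecast(rates, horizon=25, n_sims=1000, seed=0):
    """Fit the Lee-Carter model to a (n_years, n_ages) matrix of central
    death rates and forecast `horizon` years ahead.

    Returns (central, lo, hi): the point forecast and the 5th/95th
    percentile rate matrices, each of shape (horizon, n_ages).
    """
    log_m = np.log(rates)

    # a_x: average of the logged rates at each age (column means over years).
    a_x = log_m.mean(axis=0)

    # SVD of the centered log rates; the leading singular vectors give the
    # age pattern b_x and the mortality index k_t.
    U, s, Vt = np.linalg.svd(log_m - a_x, full_matrices=False)
    b_x = Vt[0]
    k_t = s[0] * U[:, 0]

    # Standard normalization: sum(b_x) = 1, with k_t rescaled to compensate.
    scale = b_x.sum()
    b_x, k_t = b_x / scale, k_t * scale

    # Random walk with drift for k_t, estimated from first differences.
    diffs = np.diff(k_t)
    drift = diffs.mean()
    sigma = diffs.std(ddof=1)
    drift_se = sigma / np.sqrt(len(diffs))   # standard error of the drift

    # Monte Carlo simulation of future k_t paths, propagating both the
    # innovation variance and the uncertainty in the estimated drift.
    rng = np.random.default_rng(seed)
    sim_drift = rng.normal(drift, drift_se, size=(n_sims, 1))
    shocks = rng.normal(0.0, sigma, size=(n_sims, horizon))
    k_paths = k_t[-1] + np.cumsum(sim_drift + shocks, axis=1)

    # Recover mortality rates: m_{x,t} = exp(a_x + b_x * k_t).
    sim_rates = np.exp(a_x + b_x * k_paths[..., None])   # (n_sims, horizon, n_ages)
    steps = np.arange(1, horizon + 1)
    central = np.exp(a_x + b_x * (k_t[-1] + drift * steps)[:, None])
    lo, hi = np.percentile(sim_rates, [5, 95], axis=0)
    return central, lo, hi
</syntaxhighlight>

Given a matrix of historical rates (for example 40 years by 20 age groups), the call <code>lee_carter_forecast(rates, horizon=25)</code> returns the point forecast and a 90% interval for every age and forecast year.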


The model was introduced by [[Ronald D. Lee]] and [[Lawrence Carter]] in 1992 in the article "Modeling and Forecasting the Time Series of U.S. Mortality" (''Journal of the American Statistical Association'' 87 (September): 659-671). The model grew out of their work in the late 1980s and early 1990s attempting to use [[inverse projection]] to understand historical demography.

= Detailed math explanation =

More formally, let <math>m_{x,t}</math> denote the central death rate at age <math>x</math> in year <math>t</math>. The model is

:<math>\ln(m_{x,t}) = a_x + b_x k_t + \epsilon_{x,t},</math>

where <math>a_x</math> is the general (time-averaged) shape of log mortality by age, <math>b_x</math> measures how quickly mortality at age <math>x</math> changes for a unit change in <math>k_t</math>, <math>k_t</math> is the time-varying index of the overall level of mortality, and <math>\epsilon_{x,t}</math> is a residual term.
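The decomposition into <math>b_x</math> and <math>k_t</math> is only determined up to scale, so the normalizing constraints standard in the Lee-Carter literature are spelled out here for completeness:

:<math>\sum_x b_x = 1, \qquad \sum_t k_t = 0.</math>

With the second constraint, <math>a_x</math> is simply the average of <math>\ln(m_{x,t})</math> over time at each age, and the estimates of <math>k_t</math> (the year dimension) and <math>b_x</math> (the age dimension) are obtained from the leading pair of singular vectors of the SVD of the centered matrix <math>\ln(m_{x,t}) - a_x</math>, rescaled so that the constraints hold.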

= Examples =

= Uses / Users =

= Extensions and nuances =

= Re-fixing e0. Coherent etc. =

= Software =

= Other coolness =

SVD.  Linear quality of change.  Idea of regime of mortality change.  

= Links =

To people: Larry Carter, Ron Lee, CEDA, Aussie guy.

See [http://www.soa.org/library/journals/north-american-actuarial-journal/2000/january/naaj0001_5.pdf this article] in the North American Actuarial Journal for a more contemporary overview, or some of the papers on Prof. Lee's website for extensions and applications.

* Ron's papers: http://www.ceda.berkeley.edu/papers/rlee/welcome.html
* Inverse projection and history of LCFIT: http://escholarship.org/uc/item/76b3712p
* http://github.com/webbs/lcfit/tree/
* http://lcfit.demog.berkeley.edu/
* http://robjhyndman.com/software/demography/

Papers -- King, SOA, Ron's history of LC