100-year flood

A one-hundred-year flood is a flood event that has a 1% probability of occurring in any given year. The 100-year flood is also referred to as the 1% flood, since its annual exceedance probability is 1%,[1] or as having a return period of 100 years. The 100-year flood is generally expressed as a flow rate. Based on the expected 100-year flood flow rate in a given creek, river, or surface water system, the flood water level can be mapped as an area of inundation. The resulting floodplain map is referred to as the 100-year floodplain, which may figure importantly in building permits, environmental regulations, and flood insurance.

Probability[edit]

A common misunderstanding is that a 100-year flood is likely to occur only once in a 100-year period. In fact, there is approximately a 63.4% chance of one or more 100-year floods occurring in any 100-year period. The probability Pe that at least one flood exceeding the T-year flood threshold will occur during a period of n years can be expressed as

P_{e}=1-\left[ 1-\left( \frac{1}{T} \right) \right]^{n}

where T is the return period of a given storm threshold (e.g. 100-yr, 50-yr, 25-yr, and so forth), and n is the number of years. The exceedance probability Pe is also described as the natural, inherent, or hydrologic risk of failure.[2][3] However, the expected value of the number of 100-year floods occurring in any 100-year period is 1.

Ten-year floods have a 10% chance of occurring in any given year (Pe = 0.10); 500-year floods have a 0.2% chance of occurring in any given year (Pe = 0.002); and so on. The percent chance of an X-year flood occurring in a single year can be calculated by dividing 100 by X.
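The exceedance probability formula above can be illustrated with a short sketch (function and variable names are my own):

```python
# Exceedance probability P_e = 1 - (1 - 1/T)^n for a T-year flood over n years.

def exceedance_probability(return_period_years: float, n_years: int) -> float:
    """Probability of at least one T-year flood occurring in n years."""
    annual_probability = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_probability) ** n_years

# A 100-year flood has a 1% annual chance, but over a century the chance of
# seeing at least one is about 63.4%, not 100%.
print(round(exceedance_probability(100, 100), 3))  # → 0.634
print(round(exceedance_probability(100, 30), 3))   # chance within a 30-year mortgage
```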

The field of extreme value theory was created to model rare events such as 100-year floods for the purposes of civil engineering. This theory is most commonly applied to the maximum or minimum observed stream flows of a given river. In desert areas where there are only ephemeral washes, the method is applied to the maximum observed rainfall over a given period of time (24 hours, 6 hours, or 3 hours). The extreme value analysis considers only the most extreme event observed in a given year: between a large spring runoff and a heavy summer rain storm, whichever produced more runoff would be counted as the extreme event, while the smaller event would be ignored in the analysis (even though both may have been capable of causing terrible flooding in their own right).
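Building the annual-maximum series described above amounts to keeping the single largest event per year. A minimal sketch, with invented (year, peak flow) records:

```python
# Annual-maximum series: extreme value analysis keeps only the largest observed
# event in each year and discards all smaller events from that year.
from collections import defaultdict

# Invented observations: (year, peak flow in cubic feet per second).
observations = [
    (1984, 410.0), (1984, 980.0),   # spring runoff vs. summer storm: keep 980
    (1985, 620.0), (1985, 305.0),
    (1986, 1150.0),
]

annual_maxima = defaultdict(float)
for year, flow_cfs in observations:
    annual_maxima[year] = max(annual_maxima[year], flow_cfs)

print(sorted(annual_maxima.items()))
# → [(1984, 980.0), (1985, 620.0), (1986, 1150.0)]
```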

Statistical assumptions[edit]

A number of assumptions are made to complete the analysis that determines the 100-year flood. First, the extreme events observed in each year must be independent from year to year. In other words, the maximum river flow rate observed in 1984 must not be significantly correlated with the flow rate observed in 1985, 1985 must not be correlated with 1986, and so forth. Second, the observed extreme events must come from the same probability distribution function. Third, the probability distribution must relate to the largest storm (rainfall or river flow rate measurement) that occurs in any one year. Fourth, the probability distribution function must be stationary, meaning that the mean, standard deviation, and maximum and minimum values are not increasing or decreasing over time; this property is referred to as stationarity.[3][4]

The first assumption has a very low chance of being valid in all places. Studies have shown that extreme events in certain watersheds in the U.S. are not significantly correlated,[citation needed] but this must be determined on a case-by-case basis. The second assumption is often valid if the extreme events are observed under similar climate conditions. For example, if the extreme events on record all come from late-summer thunderstorms (as in the southwest U.S.) or from snowpack melting (as in the north-central U.S.), then this assumption should be valid. If, however, some extreme events come from thunderstorms, others from snowpack melting, and others from hurricanes, then this assumption is most likely not valid. The third assumption is a problem only when trying to forecast a low, but still annual-maximum, flow event (for example, the maximum event for a 1-year return period). Since this is not typically a goal in extreme value analysis or in civil engineering design, the situation rarely presents itself. The final assumption, stationarity, has come into question in light of research on climate change. In short, the argument is that if temperatures are changing and precipitation cycles are being altered, then there is compelling evidence that the probability distribution is also changing.[5] The simplest implication is that not all of the historical data are, or can be, considered valid input into the extreme event analysis.
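One simple screening for the independence assumption is the lag-1 serial correlation of the annual-maximum series, which should be near zero. A stdlib-only sketch with invented data; a real study would apply a formal significance test:

```python
# Lag-1 serial correlation of an annual-maximum series: the Pearson correlation
# between each year's peak and the following year's peak.
import statistics

def lag1_correlation(series):
    """Pearson correlation between the series and itself shifted by one year."""
    x, y = series[:-1], series[1:]
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented annual peak flows; values near 0 support year-to-year independence.
annual_peaks = [980.0, 620.0, 1150.0, 540.0, 870.0, 1300.0, 710.0, 960.0]
print(f"lag-1 correlation: {lag1_correlation(annual_peaks):+.2f}")
```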

Probability uncertainty[edit]

When these assumptions are violated, an unknown amount of uncertainty is introduced into the reported value of the 100-year flood, whether expressed as a rainfall intensity or a river flood depth. When all of the inputs are known, the uncertainty can be measured in the form of a confidence interval; for example, one might say there is a 95% chance that the 100-year flood is greater than X but less than Y.[1] Without analyzing the statistical uncertainty of a given 100-year flood, scientists and engineers can reduce the uncertainty by following two practical rules. First, forecast only extreme events with return periods no more than double the number of observation years (for example, 27 years of observed river measurements support estimating a 50-year event, since 27 × 2 = 54, but not a 100-year event). Second, forecast a value that is less than the maximum observed value (for example, if the maximum rainfall event on record is 5.25 inches per hour, the estimated 100-year storm event should be less than this).
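The two rules of thumb above can be sketched directly; function and variable names are my own:

```python
# Rule 1: do not forecast events rarer than twice the record length.
# Rule 2: keep the forecast magnitude below the largest value on record.

def max_supportable_return_period(n_observation_years: int) -> int:
    """Longest return period (years) supported by the record length."""
    return 2 * n_observation_years

def cap_forecast(forecast_value: float, observed_max: float) -> float:
    """Limit a forecast magnitude to the maximum observed value."""
    return min(forecast_value, observed_max)

print(max_supportable_return_period(27))      # → 54 (27 years of data)
print(cap_forecast(6.1, observed_max=5.25))   # → 5.25 (capped at record max)
```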

Upslope factors[edit]

The amount, location, and timing of water reaching a drainage channel from natural precipitation and controlled or uncontrolled reservoir releases determines the flow at downstream locations. Some precipitation evaporates, some slowly percolates through soil, some may be temporarily sequestered as snow or ice, and some may produce rapid runoff from surfaces including rock, pavement, roofs, and saturated or frozen ground. The fraction of incident precipitation promptly reaching a drainage channel has been observed from nil for light rain on dry, level ground to as high as 170 percent for warm rain on accumulated snow.[6]

Most precipitation records are based on a measured depth of water received within a fixed time interval. Frequency of a precipitation threshold of interest may be determined from the number of measurements exceeding that threshold value within the total time period for which observations are available. Individual data points are converted to intensity by dividing each measured depth by the period of time between observations. This intensity will be less than the actual peak intensity if the duration of the rainfall event was less than the fixed time interval for which measurements are reported. Convective precipitation events (thunderstorms) tend to produce shorter duration storm events than orographic precipitation. Duration, intensity, and frequency of rainfall events are important to flood prediction. Short duration precipitation is more significant to flooding within small drainage basins.[7]
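The depth-to-intensity conversion and threshold-frequency count described above can be sketched as follows (the depths and record length are invented):

```python
# Fixed-interval rainfall depths (inches) recorded every 6 hours.
depths_in = [0.10, 1.80, 0.00, 0.45, 2.40, 0.05]
interval_hr = 6.0

# Intensity understates the true peak if the storm lasted less than 6 hours.
intensities = [d / interval_hr for d in depths_in]

# Frequency of exceedance: count of measurements over a threshold, divided by
# the length of the observation record.
threshold_in = 1.0
record_years = 3.0
exceedances = sum(1 for d in depths_in if d > threshold_in)
print(f"events over {threshold_in} in: {exceedances}")        # 2 events
print(f"frequency: {exceedances / record_years:.2f} per year")
```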

The most important upslope factor in determining flood magnitude is the land area of the watershed upstream of the area of interest. Rainfall intensity is the second most important factor for watersheds of less than approximately 30 square miles or 80 square kilometres. The main channel slope is the second most important factor for larger watersheds. Channel slope and rainfall intensity become the third most important factors for small and large watersheds, respectively.[8]

Downslope factors[edit]

Water flowing downhill ultimately encounters downstream conditions slowing movement. The final limitation is often the ocean or a natural or artificial lake. Elevation changes such as tidal fluctuations are significant determinants of coastal and estuarine flooding. Less predictable events like tsunamis and storm surges may also cause elevation changes in large bodies of water. Elevation of flowing water is controlled by the geometry of the flow channel.[8] Flow channel restrictions like bridges and canyons tend to control water elevation above the restriction. The actual control point for any given reach of the drainage may change with changing water elevation, so a closer point may control for lower water levels until a more distant point controls at higher water levels.

Effective flood channel geometry may be changed by growth of vegetation, accumulation of ice or debris, or construction of bridges, buildings, or levees within the flood channel.

Prediction[edit]

Statistical analysis requires all data in a series be gathered under similar conditions. A simple prediction model might be based upon observed flows within a fixed channel geometry.[9] Alternatively, prediction may rely upon assumed channel geometry and runoff patterns using historical precipitation records. The rational method has been used for drainage basins small enough that observed rainfall intensities may be assumed to occur uniformly over the entire basin. Time of Concentration is the time required for runoff from the most distant point of the upstream drainage area to reach the point of the drainage channel controlling flooding of the area of interest. The time of concentration defines the critical duration of peak rainfall for the area of interest.[10] The critical duration of intense rainfall might be only a few minutes for roof and parking lot drainage structures, while cumulative rainfall over several days would be critical for river basins.
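The rational method mentioned above estimates peak discharge as Q = C·i·A, where C is a dimensionless runoff coefficient, i is the rainfall intensity for a duration equal to the time of concentration, and A is the drainage area. In U.S. customary units (i in inches per hour, A in acres), Q comes out very nearly in cubic feet per second, since 1 acre-inch/hour ≈ 1.008 cfs. A minimal sketch with invented example values:

```python
# Rational method: Q = C * i * A for small basins where rainfall can be
# assumed uniform over the whole drainage area.

def rational_method_q(c: float, i_in_per_hr: float, area_acres: float) -> float:
    """Peak runoff (approximately cfs) for a small, uniformly rained-on basin."""
    return c * i_in_per_hr * area_acres

# A 10-acre paved parking lot (C ~ 0.9) under a 2 in/hr design storm:
print(rational_method_q(c=0.9, i_in_per_hr=2.0, area_acres=10.0))  # → 18.0
```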

Extreme flood events often result from coincidence such as unusually intense, warm rainfall melting heavy snow pack, producing channel obstructions from floating ice, and releasing small impoundments like beaver dams.[11] Coincident events may cause flooding outside the statistical distribution anticipated by simplistic prediction models.[12] Debris modification of channel geometry is common when heavy flows move uprooted woody vegetation and flood-damaged structures and vehicles, including boats and railway equipment.

References[edit]

  1. ^ a b Holmes, R.R., Jr., and Dinicola, K. (2010) 100-Year flood–it's all about chance U.S. Geological Survey General Information Product 106
  2. ^ Mays, L.W. (2005) Water Resources Engineering Hoboken: J. Wiley & Sons[page needed]
  3. ^ a b Maidment, D.R. ed.(1993) Handbook of Hydrology New York:McGraw-Hill[page needed]
  4. ^ Water Resources Council Bulletin 17B "Guidelines for Determining Flood Flow Frequency"
  5. ^ "Stationarity is Dead". Science Magazine (Sciencemag.org). 2008-02-01. Retrieved 2011-08-29. 
  6. ^ Babbitt, Harold E. and Doland, James J., Water Supply Engineering, McGraw-Hill Book Company, 1949
  7. ^ Simon, Andrew L., Basic Hydraulics, John Wiley & Sons, 1981, ISBN 0-471-07965-0
  8. ^ a b Simon, Andrew L., Practical Hydraulics, John Wiley & Sons, 1981, ISBN 0-471-05381-3
  9. ^ Linsley, Ray K. and Franzini, Joseph B., Water-Resources Engineering, McGraw-Hill Book Company, 1972
  10. ^ Urquhart, Leonard Church , Civil Engineering Handbook, McGraw-Hill Book Company, 1959
  11. ^ Abbett, Robert W., American Civil Engineering Practice, John Wiley & Sons, 1956
  12. ^ United States Department of the Interior, Bureau of Reclamation, Design of Small Dams, United States Government Printing Office, 1973