Chip famine

From Wikipedia, the free encyclopedia

A chip famine is a phenomenon in the integrated circuit (chip) industry, appearing approximately every four years, in which demand for silicon chips outstrips supply.[1]


Chip famines typically occur when some sociological or physical change prevents certain chips from being produced in large enough numbers to satisfy demand. A severe chip famine occurred in 1988 after a pact between American and Japanese chip manufacturers resulted in severely reduced production. Transitions to newer production methods also cause chip famines, as new factories are not built quickly enough to meet demand for the newer chips. Examples include the shortage of smart card chips in 1999 and the rationing of other types of chips in 2004. More recently, the 2011 Japanese earthquake led to a period when it was difficult to source all the parts needed for various systems.

Chip famines can have a major effect on the electronics industry: manufacturers may change their sourcing of chips and suffer major losses of profit, as when PC manufacturer Gateway switched from Intel to AMD microprocessors in 2000. Some manufacturers may have to redesign their products to account for the shortage of certain chips, or leave design options open so that alternative chips can be incorporated.

Cases of shortages

In 1988 there was a shortage due to high demand; workers at seven Hitachi factories had to work through their summer vacations to meet it.[2] In 1994, there was a shortage due to newly developed technologies: the newer manufacturing processes required significantly cleaner “clean rooms”, and many batches of integrated circuits were discarded due to manufacturing errors.

Intel suffered a shortage of several products in 2000. Larger companies were able to receive the products they needed, but smaller companies such as Gateway had to wait or find other chips.[3]

There was a shortage of CDMA chips in 2004,[4] due to the strong push by mobile phone companies to introduce and establish CDMA in both the United States and India.

After the 2011 earthquake in Japan, there was a severe shortage of NAND memory and displays. Qualcomm announced on April 19, 2012 that it expected a shortage of its Snapdragon chip due to a lack of facilities able to manufacture the 28 nm part.


The 1986 U.S.–Japan semiconductor trade pact was designed to help US chip manufacturers compete with Japanese companies, and it resulted in severe cuts in Japanese production.[5] A 1993 DRAM chip famine was caused by an explosion at the factory that produced 60% of the world's supply of the resin used in chips.[6] From 1993 to 1994, there was a glut of chips and companies lost the incentive to build new leading-edge factories; when the new generations of chips came out, there were not enough factories to produce them.[7]

A previous chip famine can also cause a market slump, which results in fewer chips being manufactured. When the slump ends, demand may grow faster than companies can fulfill it, resulting in a second shortage.

New-generation chips are difficult to produce because manufacturing capabilities lag behind. In the first few production runs, many batches of product are discarded due to manufacturing defects, so capacity that could have produced older chips is consumed without shipping newer chips either. Furthermore, customers who want the newest chips may not be willing to settle for older ones, so companies must wait for the newer chips before putting them into their products.[8]

1986 chip pact

A chip pact enacted in 1986 was designed to help the United States compete with Japanese manufacturers, but it had unintended consequences. The pact required Japanese companies to stop selling chips below cost ("dumping"), which led them to produce and export fewer chips, since overproduction had been the root cause of the dumping.[9] American companies did not reenter the market as expected, due to the high cost and risk of production.[10]

Results of shortages

The 1988 chip famine caused the delay of Zelda II: The Adventure of Link due to a lack of SRAM.[11]

Shortages of DRAM, SRAM, and processors have raised the prices of the remaining chips. A lack of ICs for Nintendo's Wii console caused the global Wii shortage in 2007.[12]


References

  1. ^ "Printed Electronics: Chip famines (Glossary)". Printed Electronics World. Retrieved 2010-01-05.
  2. ^ "Chip Shortage Spurs Japanese Workers to Forgo Vacations". LA Times. Retrieved 2012-05-01.
  3. ^ Magee, Mike (2000-01-06). "Gateway to use AMD because of Intel chip famine". The Register. Retrieved 2010-01-05.
  4. ^ Malik, Om (2004-05-10). "CDMA chip shortage developing?". gigaom. Retrieved 2012-05-05.
  5. ^ "Chip Shortage Strains Computer Makers". LA Times. Retrieved 2012-05-01.
  6. ^ "Real Chip Shortage Or Just A Panic, Crunch Is Likely To Boost Pc Prices". Chicago Tribune. Retrieved 2012-05-01.
  7. ^ "Electronics industry fears chip shortage/Dearth of factories may drive up prices". AP. Retrieved 2012-05-01.
  8. ^ Nystedt, Dan. "Intel laptop chip shortage worsens". IDG. Retrieved 2012-05-01.
  9. ^ Pollack, Andrew. "Shortage of Memory Chips Has Industry Scrambling". NY Times. Retrieved 2012-05-01.
  10. ^ Pollack, Andrew. "Chip Pact Falls Short of Goals". NY Times. Retrieved 2012-05-01.
  11. ^ Lazzareschi, Carla. "High-Tech Crisis Forces Publishers to Make Tough Choices : Shortage of Memory Chips Hurting Video Game Makers". LA Times. Retrieved 2012-05-01.
  12. ^ "Nintendo unable to ramp up Wii production". Digitimes. Retrieved 2012-05-01.