The gigabit is a multiple of the unit bit for digital information or computer storage. The prefix giga (symbol G) is defined in the International System of Units (SI) as a multiplier of 10⁹ (1 billion, short scale), and therefore:
- 1 gigabit = 10⁹ bits = 1,000,000,000 bits.
The gigabit has the unit symbol Gbit or Gb.
The gigabit is closely related to the gibibit, a unit multiple derived from the binary prefix gibi (symbol Gi) of the same order of magnitude, which is equal to 2³⁰ bits = 1,073,741,824 bits, or approximately 7% larger than the gigabit.
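The size difference between the two prefixes can be checked with a short calculation. The following Python snippet is a minimal illustrative sketch (the variable names are chosen here for clarity and are not part of any standard):

```python
# Decimal (SI) gigabit vs. binary gibibit, in bits.
GIGABIT = 10 ** 9   # SI prefix giga: 1,000,000,000 bits
GIBIBIT = 2 ** 30   # binary prefix gibi: 1,073,741,824 bits

# How much larger the gibibit is, as a percentage of the gigabit.
difference_percent = (GIBIBIT / GIGABIT - 1) * 100

print(f"1 Gbit  = {GIGABIT:,} bits")
print(f"1 Gibit = {GIBIBIT:,} bits")
print(f"The gibibit is about {difference_percent:.1f}% larger than the gigabit.")
# Output: about 7.4%, i.e. "approximately 7% larger" as stated above.
```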