
Gigabit

From Wikipedia, the free encyclopedia


Multiples of bits

Decimal
Value     Metric
1000      kbit    kilobit
1000²     Mbit    megabit
1000³     Gbit    gigabit
1000⁴     Tbit    terabit
1000⁵     Pbit    petabit
1000⁶     Ebit    exabit
1000⁷     Zbit    zettabit
1000⁸     Ybit    yottabit
1000⁹     Rbit    ronnabit
1000¹⁰    Qbit    quettabit

Binary
Value     IEC                Memory
1024      Kibit   kibibit    Kbit  Kb  kilobit
1024²     Mibit   mebibit    Mbit  Mb  megabit
1024³     Gibit   gibibit    Gbit  Gb  gigabit
1024⁴     Tibit   tebibit
1024⁵     Pibit   pebibit
1024⁶     Eibit   exbibit
1024⁷     Zibit   zebibit
1024⁸     Yibit   yobibit

Orders of magnitude of data

The gigabit is a multiple of the unit bit for digital information or computer storage. The prefix giga (symbol G) is defined in the International System of Units (SI) as a multiplier of 10⁹ (1 billion, short scale),[1] and therefore

1 gigabit = 10⁹ bits = 1,000,000,000 bits.

The gigabit has the unit symbol Gbit or Gb.

Using the common byte size of 8 bits, 1 Gbit is equal to 125 megabytes (MB) or approximately 119 mebibytes (MiB).
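For illustration, this arithmetic can be expressed as a short Python sketch (the variable names are illustrative, not drawn from any library):

    BITS_PER_BYTE = 8

    gigabit_in_bits = 10**9                        # decimal (SI) definition of 1 Gbit
    bytes_total = gigabit_in_bits / BITS_PER_BYTE  # 125,000,000 bytes

    print(bytes_total / 10**6)   # 125.0  -> megabytes (MB, decimal)
    print(bytes_total / 2**20)   # ~119.2 -> mebibytes (MiB, binary)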

The gigabit is closely related to the gibibit, a unit multiple derived from the binary prefix gibi (symbol Gi) of the same order of magnitude,[2] which is equal to 2³⁰ bits = 1,073,741,824 bits, approximately 7% larger than the gigabit.
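The size of that difference can be checked directly; a minimal Python illustration:

    gibibit = 2**30   # 1,073,741,824 bits (binary prefix gibi)
    gigabit = 10**9   # 1,000,000,000 bits (SI prefix giga)
    print(gibibit / gigabit)   # 1.073741824, i.e. about 7% larger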
