See also: Nibble · Byte · Bit and byte prefixes
The megabit is a multiple of the unit bit for digital information or computer storage. The prefix mega (symbol M) is defined in the International System of Units (SI) as a multiplier of 10^6 (1 million), and therefore
- 1 megabit = 10^6 bits = 1,000,000 bits = 1000 kilobits.
The megabit has the unit symbol Mb or Mbit.
The megabit is closely related to the mebibit, a unit multiple derived from the binary prefix mebi (symbol Mi) of the same order of magnitude, which is equal to 2^20 bits = 1,048,576 bits, approximately 5% larger than the megabit. Despite the definition of these prefixes for binary-based quantities of storage by international standards organizations, memory semiconductor chips are still marketed using the metric prefix names to designate binary multiples.
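The roughly 5% gap between the two units follows directly from the definitions above; a minimal sketch (using only the SI and IEC values stated in the text):

```python
# Compare the SI megabit with the IEC mebibit.
megabit = 10**6   # SI mega: 1,000,000 bits
mebibit = 2**20   # IEC mebi: 1,048,576 bits

diff = (mebibit - megabit) / megabit
print(f"A mebibit is {diff:.2%} larger than a megabit")  # about 4.86%
```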
The megabit is widely used when referring to data-transfer rates of computer networks or telecommunications systems. Network transfer rates and download speeds often use the megabit as the amount transferred per unit of time, e.g., a 100 Mbit/s (megabit per second) Fast Ethernet connection or a 10 Mbit/s Internet access service, whereas the sizes of the data units (files) transferred over these networks are usually measured in megabytes. Because a byte comprises eight bits, achieving a transfer rate of one megabyte per second requires a network connection of eight megabits per second. This can be confusing for Internet users who assume advertised rates are in kilobytes or megabytes per second: a 2 megabit per second Internet plan delivers at most 0.25 megabytes (250 kilobytes) per second. In other words, one megabit is only one-eighth of a megabyte; put another way, eight megabits equal one megabyte.
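The divide-by-eight conversion described above can be sketched as follows (the function name is illustrative, not from any standard library):

```python
BITS_PER_BYTE = 8

def mbit_to_mbyte_per_s(mbit_per_s):
    """Convert a line rate in Mbit/s to a transfer rate in MB/s (SI units throughout)."""
    return mbit_per_s / BITS_PER_BYTE

print(mbit_to_mbyte_per_s(2))    # 0.25  -> a 2 Mbit/s plan moves 0.25 MB/s
print(mbit_to_mbyte_per_s(100))  # 12.5  -> Fast Ethernet moves 12.5 MB/s
```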
- In telecommunications, use of the SI definition of the unit is standard practice.
- Standard industry practice in RAM and ROM manufacture has been to use the Mb abbreviation in reference to the binary interpretation of the megabit. For example, a single discrete DDR3 chip specified at 512 Mb invariably contains 2^29 bits = 536,870,912 bits = 512 Mibit of storage, or 67,108,864 eight-bit bytes, variously referred to as 64 mebibytes or 64 (binary) megabytes.
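The chip-capacity arithmetic in the preceding bullet can be verified with a short sketch (values taken directly from the 512 Mb DDR3 example):

```python
# A "512 Mb" DRAM chip uses the binary interpretation of the prefix:
bits = 512 * 2**20        # 512 Mibit = 2**29 bits
print(bits)               # 536870912

bytes_ = bits // 8        # eight-bit bytes
print(bytes_)             # 67108864

print(bytes_ // 2**20)    # 64 mebibytes ("binary megabytes")
```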
- During the 16-bit game-console era, the megabit was a commonly used measure of the storage capacity of game cartridges; in this context it denoted one mebibit (Mibit). The vast majority of SNES and Mega Drive (Genesis) games were produced on 8-megabit cartridges, although other sizes such as 4, 12, 16, 24, 32, and 48 megabits appeared. This usage continued on the Nintendo 64, with cartridge sizes ranging from 32 to 512 megabits.