Gigabit

The gigabit is a multiple of the unit bit for digital information or computer storage. The prefix giga (symbol G) is defined in the International System of Units (SI) as a multiplier of 10⁹ (1 billion, short scale),[1] and therefore:

1 gigabit = 10⁹ bits = 1,000,000,000 bits.
Decimal
Value   Metric
1000    kbit   kilobit
1000²   Mbit   megabit
1000³   Gbit   gigabit
1000⁴   Tbit   terabit
1000⁵   Pbit   petabit
1000⁶   Ebit   exabit
1000⁷   Zbit   zettabit
1000⁸   Ybit   yottabit

Binary
Value   IEC             Legacy
1024    Kibit  kibibit  Kbit/Kb  kilobit
1024²   Mibit  mebibit  Mbit/Mb  megabit
1024³   Gibit  gibibit  Gbit/Gb  gigabit
1024⁴   Tibit  tebibit  Tbit/Tb  terabit
1024⁵   Pibit  pebibit
1024⁶   Eibit  exbibit
1024⁷   Zibit  zebibit
1024⁸   Yibit  yobibit

The gigabit has the unit symbol Gbit or Gb.

Using the common byte size of 8 bits, 1 Gbit is equal to 125 megabytes (MB) or approximately 119 mebibytes (MiB).
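The conversion above is straightforward arithmetic; a short sketch, assuming the common 8-bit byte named in the text:

```python
BITS_PER_BYTE = 8  # common byte size assumed in the text

gigabit_bits = 10**9                         # 1 Gbit = 10^9 bits
bytes_total = gigabit_bits // BITS_PER_BYTE  # 125,000,000 bytes

megabytes = bytes_total / 10**6  # decimal megabytes (MB)
mebibytes = bytes_total / 2**20  # binary mebibytes (MiB)

print(megabytes)            # 125.0
print(round(mebibytes, 1))  # 119.2
```

The decimal division gives exactly 125 MB, while dividing by 2²⁰ gives the approximate 119 MiB figure.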

The gigabit is closely related to the gibibit, a unit multiple derived from the binary prefix gibi (symbol Gi) of the same order of magnitude,[2] which is equal to 2³⁰ bits = 1,073,741,824 bits, or approximately 7.4% larger than the gigabit.
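The size difference between the two units can be checked directly:

```python
# Compare the decimal gigabit with the binary gibibit.
gigabit = 10**9  # 1 Gbit
gibibit = 2**30  # 1 Gibit

excess = (gibibit - gigabit) / gigabit  # fractional excess of the gibibit

print(gibibit)                 # 1073741824
print(round(100 * excess, 1))  # 7.4
```

This confirms the figure in the text: the gibibit exceeds the gigabit by about 7.4%.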


This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.