Gigabyte

Summary
The gigabyte (/ˈɡɪɡəbaɪt, ˈdʒɪɡəbaɪt/) is a multiple of the unit byte for digital information. The prefix giga means 10⁹ in the International System of Units (SI); therefore, one gigabyte is one billion bytes. The unit symbol for the gigabyte is GB. This definition is used in all contexts of science (especially data science), engineering, business, and many areas of computing, including storage capacities of hard drives, solid-state drives, and tapes, as well as data transmission speeds.

However, the term is also used in some fields of computer science and information technology to denote 1 073 741 824 (1024³ or 2³⁰) bytes, particularly for sizes of RAM. Thus, prior to 1998, usage of the term gigabyte was ambiguous. To resolve this ambiguity, IEC 80000-13 clarifies that a gigabyte (GB) is 10⁹ bytes and specifies the term gibibyte (GiB) to denote 2³⁰ bytes. The difference is still readily seen, for example, when a 400 GB drive's capacity is displayed by Microsoft Windows as 372 GB, using a binary interpretation of the unit.
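To make the two conventions concrete, here is a minimal Python sketch (the names GB, GIB, and gb_to_gib are illustrative, not from any standard library) that converts an SI-gigabyte capacity into binary gibibytes and reproduces the 400 GB figure above:

    # SI gigabyte vs. binary gibibyte, per IEC 80000-13.
    GB = 10**9    # gigabyte: 1 000 000 000 bytes
    GIB = 2**30   # gibibyte: 1 073 741 824 bytes

    def gb_to_gib(gigabytes: float) -> float:
        """Convert a capacity in SI gigabytes to binary gibibytes."""
        return gigabytes * GB / GIB

    # A "400 GB" drive holds 400 * 10**9 bytes; interpreted in powers of two,
    # that is about 372.5 GiB, which explains the "372 GB" figure Windows reports.
    print(f"400 GB = {gb_to_gib(400):.1f} GiB")  # -> 400 GB = 372.5 GiB

The two unit sizes differ by a factor of 2³⁰/10⁹ ≈ 1.074, so the gap between a drive's advertised and displayed capacity grows with size: roughly 7% at the gigabyte scale.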