- A unit of measure consisting of eight bits; the byte is the basic unit of measurement for most computer data, with larger quantities expressed as multiples of it. One million bytes is a "megabyte," while one billion bytes is a "gigabyte." 
- Eight bits. 
- A computer word or a sequence of bits used as one unit, usually eight bits long. In word processing, a single character, such as a letter, is usually one byte in size. 
- Eight bits. The ASCII standard uses one byte to define letters, numbers, and characters – a maximum of 256. KB – kilobyte, a thousand bytes (actually 2^10, or 1,024 bytes). MB – megabyte, a million bytes (actually 2^20, or 1,024 KB = 1,048,576 bytes). GB – gigabyte, a billion bytes (actually 2^30, or 1,024 MB = 1,073,741,824 bytes). 
- Eight bits. A byte is a collection of bits used by computers to represent a character (i.e., "a", "1", or "&"). A "megabyte" is one million bytes (eight million bits), and a "gigabyte" is one billion bytes (eight billion bits). 1 gigabyte = 1,000 megabytes. 1 terabyte = 1,000 gigabytes. 
- Standard unit of measure for computer storage. A byte is 8 bits (binary digits) and corresponds to about 1 English character.
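The arithmetic in the definitions above can be checked directly. A minimal Python sketch (the variable names are illustrative, not part of any standard API):

```python
# Binary ("actually") unit sizes from the definitions above.
KB = 2 ** 10          # 1,024 bytes
MB = 2 ** 20          # 1,048,576 bytes (1,024 KB)
GB = 2 ** 30          # 1,073,741,824 bytes (1,024 MB)

# Decimal usage ("a million bytes") and the bits-per-byte conversion.
BITS_PER_BYTE = 8
megabyte_decimal = 10 ** 6                    # one million bytes
megabits = megabyte_decimal * BITS_PER_BYTE   # eight million bits

print(KB, MB, GB)   # 1024 1048576 1073741824
print(megabits)     # 8000000
```

Note the two conventions in play: storage vendors usually count in decimal (10^6, 10^9), while operating systems and memory sizes traditionally count in binary (2^20, 2^30), which is why a "gigabyte" drive reports slightly less than 2^30 bytes.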