Bit
Definition:
A bit (short for binary digit) is the most basic unit of information in classical computing and digital communication. It can take on one of two values, 0 or 1.
Scientific context:
In classical computers, bits are physically realized using systems that have two distinguishable states, such as voltage levels in circuits, magnetization directions in storage media, or the presence or absence of light in optical systems. Bits are the foundation of all classical data processing, encoding, and transmission.
Example in practice:
A single bit can represent a binary choice (e.g., yes/no, on/off, true/false). Multiple bits can be combined to represent numbers, characters, or more complex data; a group of 8 bits is called a byte. In a file stored on your computer, every pixel, letter, or sound sample is ultimately encoded as bits.
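The short sketch below (written in Python, chosen here purely for illustration since the entry names no language) shows how 8 bits pack into one byte and how that byte can stand for a character. The particular bit pattern and its ASCII reading are assumptions made for the example, not part of the definition of a bit.

    # A minimal sketch: 8 bits form one byte, and that byte can encode a character.
    # The bit pattern below happens to be the ASCII code for the letter A.

    bits = [0, 1, 0, 0, 0, 0, 0, 1]  # 8 bits = 1 byte

    # Pack the bits into a single integer, most significant bit first.
    value = 0
    for b in bits:
        value = (value << 1) | b

    print(value)                 # 65
    print(chr(value))            # A  (the character this byte encodes in ASCII)
    print(format(value, "08b"))  # 01000001  (the byte written back out as bits)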
Did you know?
The term “bit” was coined by the statistician John W. Tukey in a 1947 Bell Labs memo and popularized by Claude Shannon, the father of information theory, in his 1948 paper “A Mathematical Theory of Communication.”