The term “bit” is overloaded. It can mean any of the following:
- An element of a set which has two elements (an abstract bit). These elements are sometimes called 0 and 1, or true and false, or yes and no. See Bit (abstract).
- A unit of data, namely, the amount of data required to single out one message from a set of two, or, equivalently, the amount of data required to cut a set of possible messages in half. See Bit of data.
- A unit of information, namely, the difference in certainty between having no idea which way a coin is going to come up, and being entirely certain that it’s going to come up heads. While this unit of information is colloquially known as a “bit” (for historical reasons), it is more properly known as a shannon (see the worked check just after this list). See Shannon.
- A unit of evidence, namely, an observation that provides twice as much support for one hypothesis as it does for another.
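As a quick check of the coin example in the shannon item above (a worked calculation of ours, using the standard entropy formula \(H = -\sum_i p_i \log_2 p_i\)): a fair coin has

\[
H = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = \tfrac{1}{2} + \tfrac{1}{2} = 1 \text{ shannon},
\]

so learning the outcome takes you from maximal uncertainty to complete certainty, resolving exactly one shannon.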
The common theme across all the uses listed above is the number 2. An abstract bit is one of two values. A bit of data is a portion of a representation of a message that cuts the set of possible messages in half (i.e., the \(\log_2\) of the number of possible messages). A bit of information (aka a Shannon) is the amount of information in the answer to a yes-or-no question about which the observer was maximally uncertain (i.e., the \(\log_2\) of a probability). A bit of evidence in favor of A over B is an observation which provides twice as much support for A as it does for B (i.e., the \(\log_2\) of a relative likelihood). Thus, if you see someone using bits as units (a bit of data, or a bit of evidence, etc.) you can bet that they took a \(\log_2\) of something somewhere along the way.
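To make the “took a \(\log_2\) of something” theme concrete, here is a minimal Python sketch; the message counts, probabilities, and likelihoods below are illustrative assumptions, not values from the text:

```python
from math import log2

# Bit of data: log2 of the number of possible messages.
# Singling out one of 8 equally likely messages takes 3 bits of data.
num_messages = 8
data_bits = log2(num_messages)            # 3.0

# Shannon: log2 of 1/p, where p is the probability of what you observed.
# Seeing a fair coin land heads resolves log2(1 / 0.5) = 1 shannon.
p_heads = 0.5
shannons = log2(1 / p_heads)              # 1.0

# Bit of evidence: log2 of a relative likelihood.
# An observation twice as likely under hypothesis A as under B
# is 1 bit of evidence favoring A over B.
likelihood_given_a = 0.8
likelihood_given_b = 0.4
evidence_bits = log2(likelihood_given_a / likelihood_given_b)  # 1.0

print(data_bits, shannons, evidence_bits)  # 3.0 1.0 1.0
```

Note that the abstract bit is the one sense with no \(\log_2\) in sight, which is exactly the pattern break discussed next.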
Unfortunately, abstract bits break this pattern, so if you see someone talking about “bits” without disambiguating what sort of bit they mean, the most you can be sure of is that they’re talking about something that has to do with the number 2. Unless they’re just using the word “bit” to mean “small piece,” in which case you’re in a bit of trouble.