The characteristic of the logarithm

Consider the interpretation of logarithms as the cost of communicating a message. Every time the number of possible messages to send doubles, your communication costs increase by the price of a coin, or whatever cheaper storage medium you have that can communicate one of two messages. It doesn't matter whether the number of possible messages goes from 4 to 8 or whether it goes from 4096 to 8192; in both cases, your costs go up by the price of a coin. It is the factor by which the set grew (or shrank) that affects the cost, not the absolute number of messages added to (or removed from) the space of possibilities. If the space of possible messages halves, your costs go down by one coin, regardless of how many possibilities there were before the halving.
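
To make this concrete, here is a minimal sketch in Python (the helper name `coin_cost` is ours, chosen for illustration; it assumes coins as the two-state storage medium):

```python
import math

def coin_cost(num_messages: int) -> float:
    """Coins needed to single out one of `num_messages` equally likely
    messages, where each coin can communicate one of two messages."""
    return math.log2(num_messages)

# Doubling the message space always costs exactly one more coin,
# no matter how large the space already is:
print(coin_cost(8) - coin_cost(4))        # 1.0  (4 -> 8)
print(coin_cost(8192) - coin_cost(4096))  # 1.0  (4096 -> 8192)

# Halving the space saves one coin:
print(coin_cost(2048) - coin_cost(4096))  # -1.0
```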

Algebraically, writing \(c\) for the function that measures your costs, \(c(x \cdot 2) =\) \(c(x) + c(2),\) and, in general, \(c(x \cdot y) =\) \(c(x) + c(y),\) where we can interpret \(x\) as the number of possible messages before the increase, \(y\) as the factor by which the possibilities increased, and \(x \cdot y\) as the number of possibilities after the increase.
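
For a concrete instance, take \(c = \log_2\) (the cost function when coins are the medium): \(c(4 \cdot 8) = \log_2 32 = 5 = 2 + 3 = \log_2 4 + \log_2 8 = c(4) + c(8).\)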

This is the key characteristic of the logarithm: It says that, when the input goes up by a factor of \(y\), the quantity measured goes up by a fixed amount (that depends on \(y\)). When you see this pattern, you can bet that \(c\) is a logarithm function. Thus, whenever something you care about goes up by a fixed amount every time something else doubles, you can measure the thing you care about by taking the logarithm of the growing thing. For example:

  • Consider the problem of checking whether a date is contained in a gigantic sorted list of dates. You can do this by jumping to the middle of the list, seeing whether your date is earlier or later than the date in the middle, and thereby cutting the search space in half. Each time you do this, you cut the list of dates you're searching in half, so the total number of elements you need to look at goes up by one every time the size of the list doubles. Thus, the cost of searching an ordered list grows logarithmically in the size of the list (a runnable sketch of this procedure follows this list). See also binary search.

  • Consider a colony of bacteria in which each bacterium reproduces once per day, so that the size of the colony roughly doubles each day. If you care about how long this colony has been growing, you can measure the days by taking the logarithm of the number of bacteria in the colony. The logarithm (base 2) counts how many times the colony has doubled (and the log base 3 counts how many times it has tripled, and so on).

  • The length of a number in Decimal notation grows more-or-less logarithmically in the magnitude of the number: When the magnitude of the number goes up by a factor of 10, the number of digits it takes to write the number down grows by 1. However, the correspondence is not perfect: Sometimes, multiplying a number by two does not increase its length (consider the number 300), and sometimes, dividing a number by 10 does not decrease its length by one digit (consider the number 1); the digit-count comparison after this list illustrates this. See also Length isn't quite logarithmic.
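
Here is a minimal sketch of the binary search from the first bullet, in Python, with a comparison counter added to exhibit the logarithmic growth (the counting scheme is ours, for illustration):

```python
def binary_search(sorted_list, target):
    """Return (found, comparisons): whether `target` is in `sorted_list`,
    and how many elements were examined along the way."""
    lo, hi, comparisons = 0, len(sorted_list) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_list[mid] == target:
            return True, comparisons
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, comparisons

# Each doubling of the list adds about one comparison to the worst case
# (an absent item), which is the signature of logarithmic cost:
for size in (1024, 2048, 4096):
    _, comps = binary_search(list(range(size)), -1)
    print(size, comps)  # 1024 -> 10, 2048 -> 11, 4096 -> 12
```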

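Similarly, a quick check of the digit-length claim from the last bullet (a sketch; \(\lfloor \log_{10} n \rfloor + 1\) is the standard digit-count formula for a positive integer \(n\)):

```python
import math

for n in (1, 9, 10, 99, 100, 300, 600, 1000):
    digits = len(str(n))
    via_log = math.floor(math.log10(n)) + 1
    print(f"{n}: {digits} digits, log estimate {via_log}")

# Multiplying by 10 always adds a digit, but doubling usually does not:
print(len(str(300)), len(str(600)))  # 3 3 -- same length after doubling
```
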
Conversely, whenever you see a \(\log_2\) in an equation, you can deduce that someone wants to measure some sort of thing by counting the number of doublings that another sort of thing has undergone. For example, let's say you see an equation where someone takes the \(\log_2\) of a relative likelihood. What should you make of this? Well, you should conclude that there is some quantity that someone wants to measure which can be measured in terms of the number of doublings in that likelihood ratio. And indeed there is! It is known as (Bayesian) evidence, and the key idea is that the strength of evidence for a hypothesis \(A\) over its negation \(\lnot A\) can be measured in terms of \(2 : 1\) updates in favor of \(A\) over \(\lnot A\). (For more on this idea, see What is evidence?).
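
A small sketch of that bookkeeping, with "bits" (\(2 : 1\) updates) as the unit of evidence; the function name is ours, not a standard API:

```python
import math

def bits_of_evidence(likelihood_ratio: float) -> float:
    """Strength of evidence for A over not-A, counted in 2:1 updates."""
    return math.log2(likelihood_ratio)

# An observation that is 4 times as likely under A as under not-A
# is worth two 2:1 updates in favor of A:
print(bits_of_evidence(4))  # 2.0

# Evidence adds where likelihood ratios multiply: two independent
# 4:1 observations act like a single 16:1 observation.
print(bits_of_evidence(4) + bits_of_evidence(4) == bits_of_evidence(16))  # True
```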

In fact, any function \(f\) such that \(f(x \cdot y) = f(x) + f(y)\) is almost guaranteed to be a logarithm function, modulo a few technicalities.

This puts you in a position to derive all the main properties of the logarithm (such as \(\log_b(x^n) = n \log_b(x)\) for any base \(b\)) yourself.
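
For instance, the power rule falls out of the characteristic property alone (sketched here for a positive integer \(n\), by applying the property \(n - 1\) times): \(\log_b(x^n) = \log_b(\underbrace{x \cdot x \cdots x}_{n\text{ times}}) = \underbrace{\log_b(x) + \cdots + \log_b(x)}_{n\text{ times}} = n \log_b(x).\)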
