The logarithm base \(b\) of a number \(n,\) written \(\log_b(n),\) is the answer to the question “how many times do you have to multiply 1 by \(b\) to get \(n\)?” For example, \(\log_{10}(100)=2,\) and \(\log_{10}(316) \approx 2.5,\) because \(316 \approx 10 \cdot 10 \cdot \sqrt{10},\) and multiplying by \(\sqrt{10}\) corresponds to multiplying by 10 “half a time”.

In other words, \(\log_b(x)\) counts the number of \(b\)-factors in \(x\). For example, \(\log_2(100)\) counts the number of “doublings” in the number 100, and \(6 < \log_2(100) < 7\) because scaling an object up by a factor of 100 requires more than 6 (but less than 7) doublings. For an introduction to logarithms, see the Arbital logarithm tutorial. For an advanced introduction, see the advanced logarithm tutorial.

Formally, \(\log_b(n)\) is defined to be the number \(x\) such that \(b^x = n,\) where \(b\) and \(n\) are numbers. \(b\) is called the “base” of the logarithm, and has a relationship to the base of a number system. For a discussion of common and useful bases for logarithms, see the page on logarithm bases. \(x\) is unique if by “number” we mean \(\mathbb R\), but may not be unique if by “number” we mean \(\mathbb C\). For details, see the page on complex logarithms.
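This definition can be checked numerically via the change-of-base formula. A minimal Python sketch (the name `log_base` is chosen here for illustration):

```python
import math

def log_base(b, n):
    """Compute log_b(n): the power x such that b**x == n."""
    return math.log(n) / math.log(b)

print(log_base(10, 100))        # 2.0
print(log_base(10, 316))        # ~2.4997, i.e. just under two and a half
print(10 ** log_base(10, 316))  # recovers 316 (up to floating-point rounding)
```

Raising the base back to the computed power recovers the original number, which is exactly the defining property \(b^x = n.\)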

Basic properties

Logarithms satisfy a number of desirable properties, including:

  • \(\log_b(1) = 0\) for any \(b\)

  • \(\log_b(b) = 1\) for any \(b\)

  • \(\log_b(x\cdot y) = \log_b(x) + \log_b(y)\)

  • \(\log_b(x^n) = n\log_b(x)\)

  • \(\log_a(n) = \frac{\log_b(n)}{\log_b(a)}\)

For an expanded list of properties, explanations of what they mean, and the reasons why they hold, see Logarithmic identities.
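These identities can be spot-checked numerically. A short Python sketch using arbitrary sample values (`log` here is a local change-of-base helper):

```python
import math

def log(b, v):
    """log base b of v, computed via change of base."""
    return math.log(v) / math.log(b)

b, x, y, n = 2.0, 100.0, 5.0, 3.0
assert math.isclose(log(b, 1), 0)                          # log_b(1) = 0
assert math.isclose(log(b, b), 1)                          # log_b(b) = 1
assert math.isclose(log(b, x * y), log(b, x) + log(b, y))  # product rule
assert math.isclose(log(b, x ** n), n * log(b, x))         # power rule
assert math.isclose(log(10, x), log(b, x) / log(b, 10))    # change of base
print("all identities hold for these sample values")
```

A numeric check like this is no substitute for the proofs on the identities page, but it makes the rules concrete.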


  • Logarithms can be interpreted as a generalization of the notion of the length of a number: 103 and 981 are both three digits long, but, intuitively, 103 is only barely using three digits, whereas 981 is pushing its three digits to the limit. Logarithms quantify this intuition: the common logarithm of 103 is approximately 2.01, and the common log of 981 is approximately 2.99. Logarithms give rise to a notion of exactly how many digits a number is “actually” making use of, and give us a notion of “fractional digits.” For more on this interpretation (and why it is 316, not 500, that is two and a half digits long), see Log as generalized length.

  • Logarithms can be interpreted as a measure of how much data it takes to carry a message. Imagine that you and I are both facing a collection of 100 different objects, and I’m thinking of one of them in particular. If I want to tell you which one I’m thinking of, how many digits do I need to transmit to you? The answer is \(\log_{10}(100)=2,\) assuming that by “digit” we mean “some method of encoding one of the symbols 0-9 in a physical medium.” Measuring data in this way is the cornerstone of information theory.

  • Logarithms are the inverse of exponentials. The function \(\log_b(\cdot)\) inverts the function \(b^{\ \cdot}.\) In other words, \(\log_b(n) = x\) implies that \(b^x = n,\) so \(\log_b(b^x)=x\) and \(b^{\log_b(n)}=n.\) Thus, logarithms give us tools for analyzing anything that grows exponentially. If a population of bacteria doubles each day, then logarithms measure days in terms of bacteria — that is, they can tell you how long it will take for the population to reach a certain size. For more on this idea, see Logarithms invert exponentials.
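The “generalized length” reading above is easy to reproduce with the common logarithm. A quick Python check:

```python
import math

for num in (103, 316, 981):
    # floor(log10) + 1 gives the ordinary digit count;
    # log10 itself gives the "fractional" length.
    print(num, len(str(num)), round(math.log10(num), 2))
# 103 is barely using 3 digits (log10 ≈ 2.01);
# 316 sits right in the middle  (log10 ≈ 2.50);
# 981 is nearly maxing them out (log10 ≈ 2.99).
```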
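The object-naming thought experiment above can likewise be computed directly. A short Python sketch (the helper name `digits_needed` is chosen here for illustration):

```python
import math

def digits_needed(num_objects):
    """Decimal digits required to give each of num_objects items a distinct label."""
    return math.ceil(math.log10(num_objects))

print(digits_needed(100))  # 2: the labels 00 through 99 suffice
print(digits_needed(101))  # 3: one extra object forces a third digit
```

The jump from 2 to 3 at 101 objects shows why the number of digits needed is the logarithm rounded up, not the logarithm itself.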
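The bacteria example above becomes a one-line computation once the growth law is written down. A minimal Python sketch, assuming a population that exactly doubles once per day:

```python
import math

initial, target = 1_000, 1_000_000
# The population follows p(t) = initial * 2**t, so the day count t
# satisfies 2**t = target / initial, i.e. t = log2(target / initial).
days = math.log2(target / initial)
print(days)  # ≈ 9.97: a daily-doubling colony grows 1000-fold in about 10 days
```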


Logarithms are ubiquitous in many fields, including mathematics, physics, computer science, cognitive science, and artificial intelligence, to name a few. For example:

  • In mathematics, the most natural logarithmic base is \(e\) (Why?) and the log base \(e\) of \(x\) is written \(\ln(x)\), pronounced “natural log of x.” The natural logarithm of a number gives one notion of the “intrinsic length” of a number, a concept that proves useful when reasoning about other properties of that number. For example, the quantity of prime numbers smaller than \(x\) is approximately \(\frac{x}{\ln(x)};\) this is the prime number theorem.

  • Logarithms also give us tools for measuring the runtime (or memory usage) of algorithms. When an algorithm repeatedly cuts the problem down to a constant fraction of its previous size, as in a divide-and-conquer search, the amount of time (or memory) it uses grows only logarithmically as the input size grows. For example, the amount of time that it takes to perform a binary search through \(n\) possibilities is proportional to \(\log_2(n),\) which means that the search takes one unit longer to run every time the set of things to search through doubles in size.

  • Logarithms give us tools for studying the tools we use to represent numbers. For example, humans tend to use ten different symbols to represent numbers (0, 1, 2, 3, 4, 5, 6, 7, 8, and 9), while computers tend to use two symbols (0 and 1). Are some representations better or worse than others? What are the pros and cons of using more or fewer symbols? For more on these questions, see Number bases.

  • The human brain encodes various perceptions logarithmically. For example, the perceived tone of a sound goes up by one octave every time the frequency of air vibrations doubles. Your perception of tone is proportional to the logarithm (base 2) of the frequency at which the air is vibrating. See also Hick’s law.
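The prime number theorem mentioned above can be checked against an exact count for small \(x.\) A sieve-based Python sketch:

```python
import math

def prime_count(x):
    """Exact count of primes <= x, by the sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(x ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

for x in (1_000, 100_000):
    print(x, prime_count(x), round(x / math.log(x), 1))
# 1000:   168 primes vs. an estimate of 144.8
# 100000: 9592 primes vs. an estimate of 8685.9
```

The estimate is rough at these sizes; the prime number theorem guarantees only that the relative error shrinks as \(x\) grows.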
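The binary search claim above (“one unit longer per doubling”) can be observed directly by counting worst-case probes. A Python sketch (`binary_search_steps` is an illustrative helper):

```python
def binary_search_steps(n):
    """Worst-case number of probes a binary search makes over n sorted items."""
    steps, low, high = 0, 0, n - 1
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        low = mid + 1  # simulate always being told "look higher"
    return steps

for n in (100, 200, 400):
    print(n, binary_search_steps(n))  # each doubling adds one probe: 7, 8, 9
```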
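On number representations: one concrete question logarithms answer is how many symbols a given base needs to write a number. A Python sketch (the helper name is illustrative; floating-point logarithms can be off by one near exact powers, so an integer digit-counting loop is safer in production code):

```python
import math

def length_in_base(n, b):
    """Number of base-b digits needed to write the positive integer n."""
    return math.floor(math.log(n, b)) + 1

print(length_in_base(100, 10))  # 3 decimal digits
print(length_in_base(100, 2))   # 7 binary digits: 100 = 0b1100100
```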
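The octave example above is just a base-2 logarithm of a frequency ratio. A minimal Python sketch:

```python
import math

def octaves_between(f1, f2):
    """Perceived pitch distance, in octaves, between frequencies f1 and f2 (Hz)."""
    return math.log2(f2 / f1)

print(octaves_between(440, 880))   # 1.0: doubling the frequency raises pitch one octave
print(octaves_between(440, 1760))  # 2.0: quadrupling raises it two octaves
```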


