Why is log like length?

If a number \(x\) is \(n\) digits long (in decimal notation), then its logarithm (base 10) is at least \(n-1\) and less than \(n\). This follows directly from the definition of the logarithm: \(\log_{10}(x)\) is the number of times you have to multiply 1 by 10 to get \(x\), and each new digit lets you write down ten times as many numbers.

In other words, if you have one digit, you can write down any one of ten different things (0-9); if you have two digits, you can write down any one of a hundred different things (00-99); if you have three digits, you can write down any one of a thousand different things (000-999); and in general, each new digit lets you write down ten times as many things. Thus, the number of digits you need to write \(x\) is close to the number of times you have to multiply 1 by 10 to get \(x\). The only difference is that, when computing logs, you multiply 1 by 10 exactly as many times as it takes to get \(x\), which might require multiplying by 10 a fraction of a time (if \(x\) is not a power of 10), whereas the number of digits in the base 10 representation of \(x\) is always a whole number.
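The relationship can be checked with a short script (a sketch in Python, using the standard library's `math.log10`; the sample values are arbitrary):

```python
import math

# For any positive integer x with n decimal digits,
# 10^(n-1) <= x < 10^n, so n - 1 <= log10(x) < n.
for x in [1, 7, 10, 99, 100, 12345, 10**6]:
    n = len(str(x))          # digits in the decimal representation
    log = math.log10(x)      # how many times 10 multiplies into x
    assert n - 1 <= log < n
    print(f"x={x:>8}  digits={n}  log10={log:.3f}")
```

Note that the log lands exactly on the whole number \(n-1\) only when \(x\) is a power of 10; for every other \(x\) it falls a fraction short of \(n\).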

Parents: