The logarithm function shows up throughout mathematics and computer science. In this note, we will give a brief survey of the logarithm function’s properties.

## Definition and Basic Properties

The logarithm function is typically defined as the inverse of the exponential function. That is, given values $$b > 0$$ with $$b \neq 1$$ and $$x > 0$$, $$y = \log_b x$$ is the unique solution to the equation $$b^y = x$$. Here $$b$$ is referred to as the base of the logarithm. Throughout this course, we will use $$\log$$ (without the subscript) to refer to $$\log_2$$.

The (base-$$2$$) logarithm function has some useful (and curious) properties, listed below.

1. $$\log 2 = 1$$,
2. $$\log(x \cdot y) = \log(x) + \log(y)$$,
3. $$\log(x^a) = a \log(x)$$,
4. Change of base formula: $$\log_b(x) = \frac{\log(x)}{\log(b)}$$.
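These properties are easy to spot-check numerically. The sketch below uses Python's `math` module; the values $$x = 8$$, $$y = 32$$, $$a = 3$$, and $$b = 10$$ are arbitrary examples, not anything special.

```python
import math

# Arbitrary example values for a numerical spot-check of the four properties.
x, y, a, b = 8.0, 32.0, 3.0, 10.0

assert math.isclose(math.log2(2), 1.0)                              # property 1
assert math.isclose(math.log2(x * y), math.log2(x) + math.log2(y))  # property 2
assert math.isclose(math.log2(x ** a), a * math.log2(x))            # property 3
assert math.isclose(math.log(x, b), math.log2(x) / math.log2(b))    # property 4
print("all four properties check out")
```

(We use `math.isclose` rather than `==` because floating-point logarithms are only computed approximately.)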

These are the basic properties characterizing the logarithm function, and other useful properties can be derived from these as well. For example,

$$\log(x / y) = \log(x \cdot y^{-1}) = \log(x) + \log(y^{-1}) = \log(x) - \log(y).$$

The second equality comes from property 2, while the third comes from property 3.

## Growth of the Logarithm

One characteristic of $$\log n$$ is that it grows very slowly with its input. Properties 1 and 2 above give some indication of why: $$\log (2n) = \log (n) + \log (2) = \log(n) + 1$$. That is, if we double the input to the $$\log$$ function, its value only increases by $$1$$. In fact, we can show that $$\log n$$ grows more slowly than every positive power of $$n$$. (The statement below uses big O notation.) We will frequently appeal to the following fact about $$\log n$$.

Fact. For every $$a > 0$$, $$\log n = O(n^a)$$.
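One way to get a feel for this fact is to compare $$\log n$$ against $$n^a$$ directly for a small exponent. The sketch below uses the arbitrary choice $$a = 0.1$$; any positive exponent would eventually win, just at a different threshold.

```python
import math

# Compare log2(n) with n**a for a small exponent.
# a = 0.1 is an arbitrary choice; any a > 0 eventually overtakes log2(n).
a = 0.1
for exponent in [1, 6, 12, 18]:
    n = 10 ** exponent
    print(f"n = 10^{exponent}: log2(n) = {math.log2(n):6.2f}, n**0.1 = {n ** a:6.2f}")
```

By the last row, $$n^{0.1} \approx 63$$ has finally passed $$\log_2 n \approx 60$$; for larger exponents $$a$$, the crossover happens much sooner.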

To get a more concrete sense of how slowly the logarithm function grows, note the following powers of $$2$$:

| power of $$2$$ | approximate value |
|----------------|-------------------|
| $$2^{10}$$ | $$\approx 1,000$$ |
| $$2^{20}$$ | $$\approx 1,000,000$$ |
| $$2^{30}$$ | $$\approx 1,000,000,000$$ |

Correspondingly, we get the following table of logarithms:

| logarithm | approximate value |
|-----------|-------------------|
| $$\log(1,000)$$ | $$\approx 10$$ |
| $$\log(1,000,000)$$ | $$\approx 20$$ |
| $$\log(1,000,000,000)$$ | $$\approx 30$$ |

In class, we showed that the binary search procedure for finding an element in a sorted array has a running time of $$O(\log n)$$. The table above gives some indication of how efficient binary search is. For example, suppose we use binary search to find a student in an array containing all Amherst College students ($$n \approx 2,000$$), and find the program takes about 10 ms to complete. If we used the same procedure to find a name in a sorted array containing the names of all Massachusetts residents ($$n \approx 7,000,000$$), we might expect that the program would only take about twice as long to complete: 21 ms. Running the same program on a sorted array containing the names of every person on earth ($$n \approx 8,000,000,000$$) would only require about 3 times as many operations as searching for an Amherst College student, so we might expect our program to complete in about 30 ms! Even though the world population is about 4 million times Amherst College’s population, a procedure with $$O(\log n)$$ running time might take only about 3 times as long to run on the entire world population as it does on just Amherst College students.
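A standard implementation of binary search looks like the sketch below (the names in the usage example are illustrative, and this is not necessarily the exact code from class):

```python
def binary_search(sorted_names, target):
    """Return the index of target in sorted_names, or -1 if absent.

    Each iteration halves the remaining search range, so the loop runs
    at most about log2(n) + 1 times -- the O(log n) bound discussed above.
    """
    lo, hi = 0, len(sorted_names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_names[mid] == target:
            return mid
        elif sorted_names[mid] < target:
            lo = mid + 1      # target can only be in the upper half
        else:
            hi = mid - 1      # target can only be in the lower half
    return -1

names = sorted(["Avery", "Blake", "Casey", "Devon", "Emery"])
print(binary_search(names, "Casey"))   # -> 2
print(binary_search(names, "Zoe"))     # -> -1 (not present)
```

Since each iteration discards half of the remaining range, even an array of 8 billion names requires only about $$\log(8{,}000{,}000{,}000) \approx 33$$ iterations.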