Theories of Exponential and Logarithmic Functions: Part 1

In the past few months I have seen a lot of questions on MSE regarding exponential and logarithmic functions. Most students are familiar with the idea of defining $e$ by $\lim\limits_{n \to \infty}\left(1 + \dfrac{1}{n}\right)^{n}$ and then defining the exponential function as $e^{x}$. I tried to answer some of these questions and, based on the suggestion of a user, I am consolidating my answers into a series of posts here. One thing I must mention is that most students do have an intuitive idea of the exponential and logarithmic functions, but many lack a sound theoretical foundation. In this series of posts I will provide multiple approaches to developing a theory of exponential and logarithmic functions. We will restrict ourselves to real variables only.

Properties of Exponential and Logarithmic Functions

Let us first revise the common properties of these functions which any coherent theory must explain and establish. I first list down the properties of the exponential function $\exp(x)$ (a quick numerical spot-check of a few of these properties appears after the lists):
  • $\exp(x)$ is a function from $\mathbb{R}$ to $\mathbb{R}^{+}$ which is strictly increasing, continuous and differentiable for all $x$.
  • $\exp(0) = 1, \exp(x + y) = \exp(x)\exp(y), \exp(-x) = 1/\exp(x)$ for all $x, y \in \mathbb{R}$.
  • $e = \exp(1)$ and $e = \lim\limits_{n \to \infty}\left(1 + \dfrac{1}{n}\right)^{n}$
  • For all rational $x$ we have $\exp(x) = e^{x}$ so that the function $\exp(x)$ can be used to define irrational exponents for a specific base $e$.
  • $\lim\limits_{x \to 0}\dfrac{\exp(x) - 1}{x} = 1$
  • $\dfrac{d}{dx}\{\exp(x)\} = \exp(x)$
  • $\exp(x) = \lim\limits_{n \to \infty}\left(1 + \dfrac{x}{n}\right)^{n}$ for all $x \in \mathbb{R}$
  • $\displaystyle \exp(x) = 1 + x + \dfrac{x^{2}}{2!} + \dfrac{x^{3}}{3!} + \cdots = \sum_{n = 0}^{\infty}\frac{x^{n}}{n!}$ for all $x \in \mathbb{R}$
Next we list down the properties of logarithmic function:
  • $\log x$ is a function from $\mathbb{R}^{+}$ to $\mathbb{R}$ which is strictly increasing, continuous and differentiable for all $x > 0$.
  • $\log 1 = 0, \log(xy) = \log x + \log y, \log (1/x) = -\log x$ for all $x, y > 0$
  • $\log e = 1$
  • $\log a^{b} = b \log a$ for $a > 0$ and rational $b$ so that $a^{b}$ can be defined for irrational $b$ as $\exp(b\log a)$.
  • $\lim\limits_{x \to 0}\dfrac{\log (1 + x)}{x} = 1$
  • $\dfrac{d}{dx}\{\log x\} = \dfrac{1}{x}$
  • $\log x = \lim\limits_{n \to \infty}n(\sqrt[n]{x} - 1)$
  • $\displaystyle \log (1 + x) = x - \frac{x^{2}}{2} + \frac{x^{3}}{3} - \cdots = \sum_{n = 1}^{\infty}(-1)^{n - 1}\frac{x^{n}}{n}$ for all $x$ satisfying $-1 < x \leq 1$.
Once we have defined $a^{x}$ for $a > 0$ and all $x$ by $\exp(x\log a)$ we have additional properties of the general exponential function $a^{x}$:
  • $a^{x + y} = a^{x}a^{y}$ for all $x, y \in \mathbb{R}$
  • $\lim\limits_{x \to 0}\dfrac{a^{x} - 1}{x} = \log a$
  • $\dfrac{d}{dx}\{a^{x}\} = a^{x}\log a$
  • $\lim\limits_{h \to 0}(1 + xh)^{1/h} = \exp(x)$
  • $\lim\limits_{x \to \infty}\dfrac{\log x}{x^{a}} = 0$ for all $a > 0$.
  • $\lim\limits_{x \to \infty}\dfrac{x^{a}}{\exp(x)} = 0$ for all $a > 0$.
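Before proceeding further, here is the promised numerical spot-check, in Python, of a few of the limits and series listed above. This is purely an illustration and not part of the theory; only the standard math module is used.

```python
import math

# e as the limit of (1 + 1/n)^n; the convergence is slow, so n is taken large
n = 10**6
print((1 + 1 / n) ** n, math.e)              # 2.71828..., 2.718281828...

# exp(x) as the series 1 + x + x^2/2! + ... (30 terms suffice here)
x = 1.5
print(sum(x**k / math.factorial(k) for k in range(30)), math.exp(x))

# (exp(h) - 1)/h -> 1 and log(1 + h)/h -> 1 as h -> 0
for h in (1e-2, 1e-4, 1e-6):
    print((math.exp(h) - 1) / h, math.log1p(h) / h)
```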
I have listed down the elementary properties of exponential and logarithmic functions, and any coherent theory of these functions must establish these properties in a non-circular fashion. Note that the functions $\exp(x)$ and $\log x$ are inverses of each other, and hence many of the properties of $\log x$ can be deduced from the corresponding properties of $\exp(x)$ and vice versa; we only need to establish one of each pair of corresponding properties. We now start with the simplest, though somewhat non-intuitive, theory.

Definition of Logarithm as an Integral

This is the most usual approach found in books on "mathematical analysis", but when I studied this in Hardy's Pure Mathematics I was really amazed at its beauty and elegance. We start by defining $\log x$ as $$\log x = \int_{1}^{x}\frac{dt}{t}\tag{1}$$ for all $x > 0$. Clearly this definition is valid because if $x > 0$ then the function $1/t$ is defined and continuous in $[1, x]$ (or $[x, 1]$ if $x < 1$) and a continuous function is integrable. We immediately get some of the properties of $\log x$ namely $$\log 1 = 0, \frac{d}{dx}\{\log x\} = \frac{1}{x}\tag{2}$$ so that the derivative $(\log x)' = 1/x > 0$ and therefore $\log x$ is strictly increasing for $x > 0$. It follows that if $x < 1$ then $\log x < \log 1 = 0$ and if $x > 1$ then $\log x > \log 1 = 0$. Thus $\log x$ is negative if $0 < x < 1$ and positive if $x > 1$.
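As an aside, definition $(1)$ lends itself to a direct numerical check. The sketch below (the helper name log_via_integral is mine, and the midpoint rule is just one convenient choice of quadrature) approximates the integral and compares it with the built-in logarithm.

```python
import math

def log_via_integral(x, n=200000):
    """Approximate log(x) = integral of dt/t from 1 to x by the midpoint rule.
    The signed integral also handles 0 < x < 1, where the result is negative."""
    h = (x - 1.0) / n
    return h * sum(1.0 / (1.0 + (k + 0.5) * h) for k in range(n))

for x in (0.25, 1.0, 2.0, 10.0):
    print(x, log_via_integral(x), math.log(x))
```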

Next we see that the derivative of $\log x$ at $x = 1$ is $1/1 = 1$ and hence $$\lim_{x \to 1}\frac{\log x}{x - 1} = 1,\,\lim_{x \to 0}\dfrac{\log(1 + x)}{x} = 1\tag{3}$$ A novice reader may find this definition of $\log x$ somewhat artificially crafted to meet the needs of continuity and differentiability, and probably far removed from the basic functional property of $\log x$, namely $\log (xy) = \log x + \log y$. However it turns out to be a matter of simple calculus: \begin{align} \log (xy) &= \int_{1}^{xy}\frac{dt}{t}\notag\\ &= \int_{1}^{x}\frac{dt}{t} + \int_{x}^{xy}\frac{dt}{t}\notag\\ &= \log x + \int_{1}^{y}\frac{d(vx)}{vx}\text{ (putting }t = vx)\notag\\ &= \log x + \int_{1}^{y}\frac{dv}{v}\notag\\ &= \log x + \log y\tag{4} \end{align} Putting $y = 1/x$ and noting that $\log 1 = 0$ we get $\log (1/x) = -\log x$.

Next we define the number $e$ by $\log e = 1$. Since $\log x$ is strictly increasing, this definition of $e$ is unambiguous and we have $e > 1$. Let us then analyze the expression $\log a^{b}$. First we take $b$ to be a positive integer. Then we have \begin{align} \log a^{b} &= \log (\underbrace{a\cdot a\cdots a}_{b\text{ times}})\notag\\ &= \underbrace{\log a + \log a + \cdots + \log a}_{b\text{ times}}\notag\\ &= b\log a\notag \end{align} If $b = 0$ then $\log a^{b} = \log a^{0} = \log 1 = 0 = 0\cdot\log a = b\log a$ so that the relation $\log a^{b} = b\log a$ holds for all non-negative integers $b$. If on the other hand $b$ is a negative integer, say $b = -m$, then $\log a^{b} = \log a^{-m} = \log (1/a^{m}) = -\log a^{m} = -m\log a = b\log a$ so that the desired relation is valid for negative integers $b$.

Suppose that $b$ is a rational number, say $b = p/q$ where $p$ is an integer and $q$ is a positive integer. Then we have $p\log a = \log a^{p} = \log (a^{p/q})^{q} = q\log a^{b}$ so that $\log a^{b} = (p/q)\log a = b\log a$. Thus the relation $$\log a^{b} = b\log a\tag{5}$$ holds for all $a > 0$ and all rational numbers $b$.
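Both the substitution step in $(4)$ and relation $(5)$ can be illustrated numerically. The sketch below (helper names are mine; the trapezoidal rule is an arbitrary choice) compares $\int_{x}^{xy}dt/t$ with $\int_{1}^{y}dv/v$ and then tests $(5)$ for a rational exponent.

```python
import math
from fractions import Fraction

def integral(f, a, b, n=100000):
    """Trapezoidal approximation of the integral of f from a to b."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + k * h) for k in range(1, n)))

x, y = 2.5, 3.0
# key step in (4): the substitution t = v*x maps [x, x*y] onto [1, y]
print(integral(lambda t: 1 / t, x, x * y), integral(lambda v: 1 / v, 1, y))

# relation (5): log(a^b) = b*log(a) for the rational exponent b = 7/3
a, b = 5.0, Fraction(7, 3)
print(math.log(a ** float(b)), float(b) * math.log(a))
```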

We now analyze the behavior of $\log x$ as $x \to \infty$. Suppose $N > 0$ is any pre-assigned number. Now note that $\log 2 > 0$ and hence we can choose an integer $n > 0$ such that $n > N/\log 2 > 0$ so that $n\log 2 > N$. Let us now choose $x > 2^{n}$ so that $\log x > \log 2^{n} = n\log 2 > N$. It thus follows that $\log x \to \infty$ as $x \to \infty$. Replacing $x $ by $1/x$ we can see that $\log x \to -\infty$ as $x \to 0^{+}$.
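The argument above is constructive: given $N$ it tells us exactly how large $x$ must be. A tiny sketch of that choice (the function name witness is mine, and math.log simply plays the role of our $\log$):

```python
import math

def witness(N):
    """Given N > 0, pick an integer n > N / log(2) and return x = 2**n + 1,
    so that log(x) > n*log(2) > N (the choice made in the text)."""
    n = math.floor(N / math.log(2)) + 1
    return 2**n + 1

for N in (5, 20, 50):
    x = witness(N)
    print(N, x, math.log(x))   # log(x) exceeds N in every case
```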

We now move to the exponential function. The exponential function $\exp(x)$ is defined as the inverse of the logarithm function defined above. Thus we write $y = \exp(x)$ if $\log y = x$. The inverse function exists and is strictly increasing, continuous and differentiable because the original function $\log$ is strictly increasing, continuous and differentiable with a non-vanishing derivative. Moreover, since $\log x$ assumes every real value (as shown in the previous paragraph), $\exp(x)$ is a function from $\mathbb{R}$ to $\mathbb{R}^{+}$. Immediate consequences of this definition are $$\exp(0) = 1, \exp(1) = e, \exp(x + y) = \exp(x)\exp(y)\tag{6}$$ for all $x, y$. The derivative of $\exp(x)$ can be calculated via the technique of differentiation of inverse functions: $$y = \exp(x) \Rightarrow \log y = x \Rightarrow \frac{dx}{dy} = \frac{1}{y}\Rightarrow \frac{dy}{dx} = y = \exp(x)$$ so that the exponential function $\exp(x)$ is its own derivative. Therefore the derivative of $\exp(x)$ at $x = 0$ is $\exp(0) = 1$ so that $$\lim_{x \to 0}\frac{\exp(x) - 1}{x} = 1\tag{7}$$
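Since $\exp$ is defined purely as the inverse of a strictly increasing $\log$, it can even be computed that way: solve $\log y = x$ by bisection, using nothing but monotonicity. A minimal sketch (the name exp_via_inverse is mine, and math.log stands in for the integral-defined logarithm):

```python
import math

def exp_via_inverse(x, tol=1e-12):
    """Solve log(y) = x for y by bisection, relying only on log being
    strictly increasing with log(y) -> -infinity as y -> 0+."""
    lo, hi = 0.0, 1.0
    while math.log(hi) < x:          # grow the bracket until log(hi) >= x
        hi *= 2.0
    while hi - lo > tol * max(1.0, hi):
        mid = 0.5 * (lo + hi)
        if math.log(mid) < x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for x in (-2.0, 0.0, 1.0, 3.0):
    print(x, exp_via_inverse(x), math.exp(x))
```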
Let's now establish the fundamental limit formula for $\exp(x)$, namely $$\exp(x) = \lim_{n \to \infty}\left(1 + \frac{x}{n}\right)^{n}\tag{8}$$ This is easy to establish once we compute (for $x \neq 0$; the case $x = 0$ is trivial) \begin{align} \lim_{n \to \infty}\log\left(1 + \frac{x}{n}\right)^{n} &= \lim_{n \to \infty}n\log\left(1 + \frac{x}{n}\right)\notag\\ &= \lim_{n \to \infty}x\cdot\frac{\log\{1 + (x/n)\}}{x/n}\notag\\ &= x\cdot 1 = x\notag \end{align} Since $\left(1 + \dfrac{x}{n}\right)^{n} = \exp\left\{\log\left(1 + \dfrac{x}{n}\right)^{n}\right\}$ for all large $n$ and the function $\exp$ is continuous, it follows that the limit in $(8)$ exists and $$\lim_{n \to \infty}\left(1 + \frac{x}{n}\right)^{n} = \exp(x)$$ In exactly the same manner we can prove that $$\lim_{n \to \infty}\left(1 - \frac{x}{n}\right)^{-n} = \exp(x)\tag{9}$$ and thus we obtain the link to the common definition of $e$: $$e = \exp(1) = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^{n}$$
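Numerically, the convergence in $(8)$ and $(9)$ is rather slow (the error behaves roughly like $1/n$), as a quick experiment shows:

```python
import math

x = 2.0
for n in (10, 1000, 10**6):
    print(n, (1 + x / n) ** n, (1 - x / n) ** (-n))
print("exp(2) =", math.exp(2.0))   # the two sequences approach this from below and above
```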
We now use Taylor's theorem, namely $$f(a + h) = f(a) + hf'(a) + \frac{h^{2}}{2!}f''(a) + \cdots + \frac{h^{n - 1}}{(n - 1)!}f^{(n - 1)}(a) + \frac{h^{n}}{n!}f^{(n)}(a + \theta h)$$ where $f^{(n)}$ exists in a certain neighbourhood of $a$ (with $a + h$ also lying in the same neighbourhood) and $\theta$ is some number in $(0, 1)$, to get the series for $\exp(x)$. We just need to replace $h$ by $x$, put $a = 0$ and note that the remainder term $\dfrac{x^{n}}{n!}\exp(\theta x)$ does not exceed $\dfrac{|x|^{n}}{n!}\exp(|x|)$ in absolute value and therefore tends to zero as $n \to \infty$. Thus we get $$\exp(x) = 1 + x + \frac{x^{2}}{2!} + \cdots + \frac{x^{n}}{n!} + \cdots\tag{10}$$
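The series $(10)$ converges very fast; its partial sums can be compared with math.exp in a few lines (the helper name exp_partial_sum is mine):

```python
import math

def exp_partial_sum(x, n):
    """Partial sum 1 + x + x^2/2! + ... + x^n/n! of the exponential series."""
    term, total = 1.0, 1.0
    for k in range(1, n + 1):
        term *= x / k
        total += term
    return total

x = 3.0
for n in (5, 10, 20):
    print(n, exp_partial_sum(x, n), math.exp(x))
```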
Again suppose that $x$ is rational so that $e^{x}$ is defined and then we have $\log e^{x} = x\log e = x$ so that $e^{x} = \exp(x)$ for rational $x$. This is expressed in a grand fashion as $$\left(1 + 1 + \frac{1}{2!} + \frac{1}{3!} + \cdots\right)^{x} = 1 + x + \frac{x^{2}}{2!} + \frac{x^{3}}{3!} + \cdots$$ where $x$ is rational.

The next step is to define the general power $a^{b}$ as $$a^{b} = \exp(b\log a)\tag{11}$$ where $a > 0$ and $b$ is any real number. Because of equation $(5)$ above this agrees with the usual definition of $a^{b}$ when $b$ is rational. The laws of exponents are proved easily using the corresponding laws for the exponential and logarithm functions. Using this definition we can see that $$\frac{d}{dx}\{a^{x}\} = \frac{d}{dx}\{\exp(x\log a)\} = \exp(x\log a)\log a = a^{x}\log a\tag{12}$$ which implies that the derivative of $a^{x}$ at $x = 0$ is $a^{0}\log a = \log a$ and this means that $$\lim_{x \to 0}\frac{a^{x} - 1}{x} = \log a\tag{13}$$ Putting $x = 1/n$ and letting $n \to \infty$ we get the limit formula $$\lim_{n \to \infty}n(\sqrt[n]{a} - 1) = \log a\tag{14}$$
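Formulas $(12)$ and $(14)$ are easy to probe numerically; below, a central difference quotient stands in for the derivative of $a^{x}$ (the step size $10^{-6}$ is an arbitrary small value):

```python
import math

a, x = 3.0, 1.7
h = 1e-6
# (12): central difference quotient for d/dx a^x, compared with a^x * log(a)
print((a ** (x + h) - a ** (x - h)) / (2 * h), a**x * math.log(a))

# (14): n * (a^(1/n) - 1) -> log(a)
for n in (10, 1000, 10**6):
    print(n, n * (a ** (1 / n) - 1), math.log(a))
```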
The final result we establish out of this theory is the logarithmic series $$\log(1 + x) = x - \frac{x^{2}}{2} + \frac{x^{3}}{3} - \cdots\tag{15}$$ for $-1 < x \leq 1$. We note that \begin{align} \log(1 + x) &= \int_{0}^{x}\frac{dt}{1 + t}\notag\\ &= \int_{0}^{x}\left(1 - t + t^{2} - t^{3} + \cdots + (-1)^{n - 1}t^{n - 1} + \frac{(-1)^{n}t^{n}}{1 + t}\right)\,dt\notag\\ &= x - \frac{x^{2}}{2} + \frac{x^{3}}{3} - \cdots + \frac{(-1)^{n - 1}x^{n}}{n} + R_{n}\notag \end{align} where $$R_{n} = (-1)^{n}\int_{0}^{x}\frac{t^{n}}{1 + t}\,dt$$ Clearly if $0 \leq x \leq 1$ then we have $$0 \leq |R_{n}| \leq \int_{0}^{x}t^{n}\,dt = \frac{x^{n + 1}}{n + 1} \leq \frac{1}{n + 1}$$ so that $R_{n} \to 0$ as $n \to \infty$. If $-1 < x < 0$ then we can put $x = -y$ (so that $0 < y < 1$) and see that $$|R_{n}| = \int_{0}^{y}\frac{t^{n}}{1 - t}\,dt \leq \frac{1}{1 - y}\int_{0}^{y}t^{n}\,dt = \frac{y^{n + 1}}{(1 - y)(n + 1)}$$ so that $R_{n} \to 0$ as $n \to \infty$. Thus we have established the logarithmic series.
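The remainder estimates above also explain what one sees numerically: the series converges quickly for small $|x|$ but painfully slowly at $x = 1$. A short sketch (the helper name log1p_series is mine):

```python
import math

def log1p_series(x, n):
    """Partial sum x - x^2/2 + ... + (-1)^(n-1) * x^n / n of the log series."""
    return sum((-1) ** (k - 1) * x**k / k for k in range(1, n + 1))

for x in (0.5, -0.5, 1.0):
    for n in (10, 100):
        print(x, n, log1p_series(x, n), math.log(1 + x))
```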

Next we come to the limit formula $$\exp(x) = \lim_{h \to 0}(1 + xh)^{1/h}\tag{16}$$ This can be shown by taking logarithms and showing that the resulting expression tends to $x$ as $h \to 0$, so that the original limit is $\exp(x)$. We have also mentioned two fundamental limits $$\lim_{x \to \infty}\frac{\log x}{x^{a}} = 0,\,\, \lim_{x \to \infty}\frac{x^{a}}{\exp(x)} = 0\tag{17}$$ for $a > 0$. For the first limit let us take a number $b$ such that $0 < b < a$ and note that $$\log x = \int_{1}^{x}\frac{dt}{t} < \int_{1}^{x} \frac{dt}{t^{1 - b}} = \frac{x^{b} - 1}{b} < \frac{x^{b}}{b}$$ for $x > 1$. Then we can see that $$0 < \frac{\log x}{x^{a}} < \frac{1}{bx^{a - b}}$$ and letting $x \to \infty$ we see that $\dfrac{\log x}{x^{a}} \to 0$ as $x \to \infty$.
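A numerical glance at $(16)$ (a sketch only; $h$ runs through positive values here, though the limit is two-sided):

```python
import math

x = 1.5
for h in (0.1, 1e-3, 1e-6):
    print(h, (1 + x * h) ** (1 / h))
print("exp(1.5) =", math.exp(1.5))
```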

The second limit can be established using the exponential series. Clearly we can find an integer $n$ such that $n > a$, and then by the exponential series $\exp(x) > x^{n}/n!$ for $x > 0$, so that $0 < x^{a}/\exp(x) < n!/x^{n - a}$; taking the limit as $x \to \infty$ we see that $x^{a}/\exp(x) \to 0$.
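Both limits in $(17)$ can also be observed directly, keeping $x$ below roughly $700$ in the second case so that $\exp(x)$ stays within floating-point range:

```python
import math

a = 0.5
# log(x) / x^a -> 0 (slowly when a is small)
for x in (1e2, 1e6, 1e12):
    print(x, math.log(x) / x**a)

# x^a / exp(x) -> 0 (very fast)
for x in (10.0, 50.0, 200.0):
    print(x, x**a / math.exp(x))
```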

We have thus established all the common properties of exponential and logarithmic functions. It will be found that most of the proofs depend on establishing the derivatives of these functions. While discussing alternative theories in later posts we will try to establish the derivatives and let the reader carry on from that point onwards.


2 comments :: Theories of Exponential and Logarithmic Functions: Part 1


  1. Thanks Paramanand.
    I've been trawling Stackexchange to find derivations for exponential / logarithmic properties to add to my personal notes (all the while ensuring they aren't circular), when you have a comprehensive list here!

  2. Thank you very much, Paramanand Singh, for a very readable post. I think we must show that $\lim_{n \to \infty} \left(1 + \frac{x}{n}\right)^n$ exists when we prove (8).