Theories of Exponential and Logarithmic Functions: Part 2


Exponential Function as a Limit

In the last post we developed the theory of exponential and logarithmic functions using the standard approach of defining the logarithm as an integral. In this post we will examine various alternative approaches to developing a coherent theory of these functions. We will start with the most common definition of $\exp(x)$ as the limit of a specific sequence. For users of MSE this is the approach outlined in this answer on MSE.

Thus for all $x \in \mathbb{R}$ we define $$\exp(x) = \lim_{n \to \infty}\left(1 + \frac{x}{n}\right)^{n}\tag{1}$$ For this definition to be valid we must show that the limit in question exists for all $x$. Clearly it exists for $x = 0$ and the limit is $1$, so that $\exp(0) = 1$. Let us first suppose that $x > 0$. Then we have by the binomial theorem \begin{align} F(x, n) &= \left(1 + \frac{x}{n}\right)^{n}\notag\\ &= 1 + x + \frac{n(n - 1)}{2!}\left(\frac{x}{n}\right)^{2} + \cdots \text{ up to }(n + 1)\text{ terms}\notag\\ &= 1 + x + \dfrac{\left(1 - \dfrac{1}{n}\right)}{2!}x^{2} + \dfrac{\left(1 - \dfrac{1}{n}\right)\left(1 - \dfrac{2}{n}\right)}{3!}x^{3} + \cdots\notag \end{align} As $n$ increases, the number of terms in the expansion of $F(x, n)$ increases and each individual term increases as well (every factor of the form $(1 - k/n)$ grows with $n$). It follows that for $x > 0$ the sequence $F(x, n)$ increases as $n$ increases.

Next we need to show that the sequence $F(x, n)$ is bounded above by some constant independent of $n$. Clearly $$F(x, n) \leq 1 + x + \frac{x^{2}}{2!} + \cdots + \frac{x^{n}}{n!}$$ and the series on the right, when extended to an infinite series, is convergent. It follows that $F(x, n)$ is bounded above by a fixed constant independent of $n$. Thus $F(x, n)$ is an increasing and bounded sequence and hence $\lim_{n \to \infty}F(x, n)$ exists. It follows that the definition $(1)$ of $\exp(x)$ makes sense for $x > 0$.
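
None of the proofs depend on it, but the monotone convergence just described is easy to watch numerically. Here is a minimal Python sketch (the name `F` mirrors the notation above; the sample values of $x$ and $n$ are arbitrary choices):

```python
import math

def F(x, n):
    """The sequence F(x, n) = (1 + x/n)^n from definition (1)."""
    return (1.0 + x / n) ** n

x = 2.5  # any fixed x > 0
for n in (1, 10, 100, 10_000, 1_000_000):
    print(f"n = {n:>9}: F(x, n) = {F(x, n):.10f}")

# The values increase with n and approach exp(x) from below.
print(f"exp(x)       = {math.exp(x):.10f}")
```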

We next need to see that the same definition is valid for negative $x$ as well. To that end we consider another sequence $G(x, n) = \left(1 - \dfrac{x}{n}\right)^{-n}$ for $x > 0$. We will show that this sequence eventually decreases. By the binomial theorem (for a negative integral exponent) we have \begin{align} G(x, n) &= 1 + x + \frac{n(n + 1)}{2!}\left(\frac{x}{n}\right)^{2} + \cdots\notag\\ &= 1 + x + \dfrac{\left(1 + \dfrac{1}{n}\right)}{2!}x^{2} + \dfrac{\left(1 + \dfrac{1}{n}\right)\left(1 + \dfrac{2}{n}\right)}{3!}x^{3} + \cdots\notag \end{align} Note that the above expansion is valid only when $0 < x/n < 1$, and this will be the case from a certain value of $n$ onwards. Clearly each term in the series above decreases as $n$ increases, so for $x > 0$ the sequence $G(x, n)$ decreases from a certain value of $n$ onwards. It should also be clear that from a certain value of $n$ onwards $G(x, n) \geq 1 + x$. Being eventually decreasing and bounded below, $G(x, n)$ tends to a limit as $n \to \infty$, and this limit is not less than $1 + x$. Noting that $G(x, n) = 1/F(-x, n)$, it follows that $\lim_{n \to \infty}F(-x, n)$ exists for all $x > 0$ and this limit is positive.

We have thus shown that $\lim_{n \to \infty}F(x, n)$ exists and is positive for all values of $x$. Thus the function $\exp(x)$ is well defined and positive for all values of $x$; in other words, $\exp(x)$ is a function from $\mathbb{R}$ to $\mathbb{R}^{+}$. We next show that $$\exp(x + y) = \exp(x)\exp(y),\,\, \exp(-x) = 1/\exp(x)\tag{2}$$ for all $x, y$. It is convenient to first show that $\exp(-x) = 1/\exp(x)$ for all $x$. Clearly this holds for $x = 0$, and from the nature of the relation it is sufficient to prove it for $x > 0$. Again the functions $F(x, n), G(x, n)$ help us in a very nice way.

First we can see that $$\frac{F(x, n)}{G(x, n)} = \left(1 - \frac{x^{2}}{n^{2}}\right)^{n}$$ and from a certain value of $n$ onwards the expression on the right is less than $1$. It follows that $G(x, n) > F(x, n)$ from a certain value of $n$ onwards. Also note that $F(x, n)$ increases and tends to $\exp(x)$ as $n \to \infty$, hence $F(x, n) \leq \exp(x)$. We can now see that if $f(x, n) = G(x, n) - F(x, n)$ then \begin{align} 0 < f(x, n) &= \left(1 - \frac{x}{n}\right)^{-n} - \left(1 + \frac{x}{n}\right)^{n}\notag\\ &= \left(1 + \frac{x}{n}\right)^{n}\left\{\left(1 - \frac{x^{2}}{n^{2}}\right)^{-n} - 1\right\}\notag\\ &\leq \exp(x)\left\{\left(1 - \frac{x^{2}}{n}\right)^{-1} - 1\right\}\notag\\ &= \frac{x^{2}\exp(x)}{n - x^{2}}\notag \end{align} where the third step uses Bernoulli's inequality $\left(1 - \dfrac{x^{2}}{n^{2}}\right)^{n} \geq 1 - \dfrac{x^{2}}{n}$, valid for $n > x^{2}$. Letting $n \to \infty$ and using the squeeze theorem we get $\lim_{n \to \infty}G(x, n) = \lim_{n \to \infty}F(x, n)$, and noting that $G(x, n) = 1/F(-x, n)$ it follows that $\exp(-x) = 1/\exp(x)$. At the same time we have shown that $$\exp(x) = \lim_{n \to \infty}\left(1 + \frac{x}{n}\right)^{n} = \lim_{n \to \infty}\left(1 - \frac{x}{n}\right)^{-n}\tag{3}$$ Using the above result it is easy to establish the series for $\exp(x)$, namely $$\exp(x) = 1 + x + \frac{x^{2}}{2!} + \frac{x^{3}}{3!} + \cdots\tag{4}$$ The result is clearly true for $x = 0$. For $x > 0$ let us define $$E(x, n) = 1 + x + \frac{x^{2}}{2!} + \cdots + \frac{x^{n}}{n!}$$ so that the limit $\lim_{n \to \infty}E(x, n) = E(x)$ gives the infinite series on the right side of equation $(4)$. From the expansions of $F(x, n), G(x, n)$ via the binomial theorem we can easily see that $$F(x, n) \leq E(x, n) \leq G(x, n)$$ Letting $n \to \infty$ and noting that $\lim_{n \to \infty}F(x, n) = \lim_{n \to \infty}G(x, n) = \exp(x)$ we see that $\exp(x) = \lim_{n \to \infty}E(x, n)$. To establish the same result for $x < 0$ we need only note that the infinite series $E(x)$ satisfies the relation $E(x)E(-x) = 1$ for all $x$, matching the relation $\exp(x)\exp(-x) = 1$.
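
The chain $F(x, n) \leq E(x, n) \leq G(x, n)$ can also be observed numerically. The following Python sketch computes all three for a sample $x$ and $n$ (arbitrary choices; the partial sum is built incrementally to avoid huge factorials):

```python
import math

def F(x, n):
    return (1.0 + x / n) ** n        # increasing in n for x > 0

def G(x, n):
    return (1.0 - x / n) ** (-n)     # eventually decreasing in n for x > 0

def E(x, n):
    # partial sum 1 + x + x^2/2! + ... + x^n/n!, built term by term
    total, term = 1.0, 1.0
    for k in range(1, n + 1):
        term *= x / k
        total += term
    return total

x, n = 1.5, 1000
print(F(x, n), E(x, n), G(x, n))     # F <= E <= G
print(math.exp(x))                   # the common limit exp(x)
```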

We now establish $\exp(x + y) = \exp(x)\exp(y)$. Since $\exp(-x) = 1/\exp(x)$, it is enough to establish the relation for positive values of $x, y$ (for mixed signs one then writes, for example, $\exp(x) = \exp((x + y) + (-y))$). It holds trivially if $x = 0$ or $y = 0$. Let us consider the function $f(x, y, n) = F(x, n)F(y, n) - F(x + y, n)$. We have \begin{align} 0 < f(x, y, n) &= \left(1 + \frac{x}{n}\right)^{n}\left(1 + \frac{y}{n}\right)^{n} - \left(1 + \frac{x + y}{n}\right)^{n}\notag\\ &= \left(1 + \frac{x + y}{n} + \frac{xy}{n^{2}}\right)^{n} - \left(1 + \frac{x + y}{n}\right)^{n}\notag\\ &= \left(1 + \frac{x + y}{n}\right)^{n}\left\{\left(1 + \frac{xy}{n(n + x + y)}\right)^{n} - 1\right\}\notag\\ &< \exp(x + y)\left\{\left(1 + \frac{xy}{n^{2}}\right)^{n} - 1\right\}\notag\\ &= \exp(x + y)\left\{\frac{xy}{n} + \frac{(1 - 1/n)}{2!}\left(\frac{xy}{n}\right)^{2} + \cdots\right\}\notag\\ &< \exp(x + y)\left\{\frac{xy}{n} + \left(\frac{xy}{n}\right)^{2} + \cdots\right\}\notag\\ &= \frac{xy\exp(x + y)}{n - xy}\notag \end{align} and letting $n \to \infty$ we see that $f(x, y, n) \to 0$, so that $\lim_{n \to \infty}F(x, n)F(y, n) = \lim_{n \to \infty}F(x + y, n)$. We have thus established that $\exp(x)\exp(y) = \exp(x + y)$.
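
The error bound $0 < f(x, y, n) < \dfrac{xy\exp(x + y)}{n - xy}$ derived above is sharp enough to watch in action; a small sketch (sample values of $x, y$ chosen arbitrarily):

```python
import math

def F(x, n):
    return (1.0 + x / n) ** n

x, y = 1.2, 0.7
for n in (10, 100, 1000, 10_000):
    f = F(x, n) * F(y, n) - F(x + y, n)
    bound = x * y * math.exp(x + y) / (n - x * y)
    print(n, f, bound)   # 0 < f < bound, and both decay like 1/n
```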

We can now easily prove the fundamental limit $$\lim_{x \to 0}\frac{\exp(x) - 1}{x} = 1\tag{5}$$ We first establish the limit when $x \to 0^{+}$. We have \begin{align} \lim_{x \to 0^{+}}\frac{\exp(x) - 1}{x} &= \lim_{x \to 0^{+}}\dfrac{{\displaystyle \lim_{n \to \infty}\left(1 + \dfrac{x}{n}\right)^{n} - 1}}{x}\notag\\ &= \lim_{x \to 0^{+}}\lim_{n \to \infty}\frac{1}{x}\left\{\left(1 + \dfrac{x}{n}\right)^{n} - 1\right\}\notag\\ &= \lim_{x \to 0^{+}}\lim_{n \to \infty}\frac{1}{x}\left\{\left(1 + x + \dfrac{(1 - 1/n)}{2!}x^{2} + \cdots\right) - 1\right\}\notag\\ &= \lim_{x \to 0^{+}}\lim_{n \to \infty}\left(1 + \dfrac{(1 - 1/n)}{2!}x + \dfrac{(1 - 1/n)(1 - 2/n)}{3!}x^{2} + \cdots\right)\notag\\ &= \lim_{x \to 0^{+}}\lim_{n \to \infty}(1 + \psi(x, n))\notag \end{align} where $\psi(x, n)$ is the finite sum $$\psi(x, n) = \frac{(1 - 1/n)}{2!}x + \cdots + \frac{(1 - 1/n)(1 - 2/n)\cdots(1 - (n - 1)/n)}{n!}x^{n - 1}$$ For fixed positive $x$, the sum $\psi(x, n)$ increases as $n$ increases and is clearly bounded above by the convergent series $$S(x) = \frac{x}{2!} + \frac{x^{2}}{3!} + \cdots$$ Hence $\lim_{n \to \infty}\psi(x, n) = \psi(x)$ exists and we have $0 \leq \psi(x) \leq S(x)$. Again we can see that if $0 < x < 2$ then $$S(x) \leq \frac{x}{2} + \frac{x^{2}}{2^{2}} + \cdots = \frac{x}{2 - x}$$ so that $S(x) \to 0$ as $x \to 0^{+}$. Hence $\lim_{x \to 0^{+}}\psi(x) = 0$. We thus have \begin{align} \lim_{x \to 0^{+}}\frac{\exp(x) - 1}{x} &= \lim_{x \to 0^{+}}\lim_{n \to \infty}(1 + \psi(x, n))\notag\\ &= \lim_{x \to 0^{+}}(1 + \psi(x))\notag\\ &= 1 + 0 = 1\notag \end{align} This result also implies that $\exp(x) = 1 + x\left(\dfrac{\exp(x) - 1}{x}\right) \to 1 + 0\cdot 1 = 1$ as $x \to 0^{+}$. Now let $x \to 0^{-}$ and put $x = -y$ so that $y \to 0^{+}$. We then have \begin{align} \lim_{x \to 0^{-}}\frac{\exp(x) - 1}{x} &= \lim_{y \to 0^{+}}\frac{1 - \exp(-y)}{y}\notag\\ &= \lim_{y \to 0^{+}}\frac{\exp(y) - 1}{y\exp(y)} = 1/1 = 1\notag \end{align} It has thus been established that $(\exp(x) - 1)/x \to 1$ as $x \to 0$.
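
As a quick numerical illustration of $(5)$ using nothing but definition $(1)$, one can fix a large $n$ as a proxy for the inner limit (the choice $n = 10^{7}$ is arbitrary):

```python
def F(x, n):
    return (1.0 + x / n) ** n

n = 10 ** 7   # large fixed n as a stand-in for the limit over n
for x in (0.1, 0.01, 0.001, -0.001, -0.01, -0.1):
    print(x, (F(x, n) - 1.0) / x)   # tends to 1 as x -> 0
```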

We now come to the fundamental result $$\frac{d}{dx}\{\exp(x)\} = \exp(x)\tag{6}$$ We have \begin{align} \{\exp(x)\}' &= \lim_{h \to 0}\frac{\exp(x + h) - \exp(x)}{h}\notag\\ &= \lim_{h \to 0}\frac{\exp(x)\exp(h) - \exp(x)}{h}\notag\\ &= \exp(x)\lim_{h \to 0}\frac{\exp(h) - 1}{h}\notag\\ &= \exp(x)\cdot 1 = \exp(x)\notag \end{align} and since $\exp(x) > 0$ for all $x$ it follows that $\exp(x)$ is strictly increasing, continuous and differentiable for all $x$. This allows us to define the inverse function $y = \log x$ by $x = \exp(y)$, and using the rule for differentiating inverse functions we can show that $(\log x)' = 1/x$. Once the derivatives of these functions are established it is an easy exercise to derive their common properties, and I leave that to the reader.
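
The derivative formula $(6)$ can likewise be illustrated with difference quotients (a sketch; the point $x = 1$ and the step sizes are arbitrary):

```python
import math

x = 1.0
for h in (1e-1, 1e-3, 1e-5):
    dq = (math.exp(x + h) - math.exp(x)) / h
    print(h, dq)          # the quotient tends to exp(x)
print(math.exp(x))
```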

An alternative approach based on the same definition $(1)$ is available in my answer on MSE. This approach is simpler than the one provided above and hence I reproduce it here with some more details. The starting point is to establish that the limit in $(1)$ exists for any positive integer $x$, and this is first done for $x = 1$. We know that the sequence $F(x, n)$ is increasing, and for $x = 1$ it is bounded above by the sum $$1 + 1 + \frac{1}{2!} + \cdots + \frac{1}{n!}$$ which is less than $$1 + 1 + \frac{1}{2} + \frac{1}{2^{2}} + \cdots = 3$$ Therefore the sequence $F(1, n)$ tends to a limit, normally denoted by $e$, and clearly $2 < e < 3$.

Next we show that if $x$ is any positive integer then $F(x, n)$ tends to the limit $e^{x}$. We assume $x > 1$ as the case $x = 1$ is already handled. This is done via a simple algebraic manipulation as shown below \begin{align} \lim_{n \to \infty}\left(1 + \frac{x}{n}\right)^{n} &= \lim_{n \to \infty}\left(\frac{n + 1}{n}\cdot\frac{n + 2}{n + 1}\cdots \frac{n + x}{n + x - 1}\right)^{n}\notag\\ &= \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^{n}\left(1 + \frac{1}{n + 1}\right)^{n}\cdots\left(1 + \frac{1}{n + x - 1}\right)^{n}\notag \end{align} Each term (except the first) in the product on the right is of the form $$\left(1 + \frac{1}{n + i}\right)^{n}$$ where $i$ is a positive integer independent of $n$. This can be written as $$\dfrac{\left(1 + \dfrac{1}{n + i}\right)^{n + i}}{\left(1 + \dfrac{1}{n + i}\right)^{i}}$$ Clearly the numerator tends to $e$ and the denominator tends to $1$, so that each term $$\left(1 + \frac{1}{n + i}\right)^{n}$$ tends to $e$. Since the number of such terms is $x$ it follows that the desired limit is $e^{x}$.
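
Numerically the claim is easy to check; a sketch (with $n = 10^{7}$ as an arbitrary large index):

```python
def F(x, n):
    return (1.0 + x / n) ** n

n = 10 ** 7
e = F(1, n)                    # approximation of e = lim (1 + 1/n)^n
for x in (2, 3, 5):
    print(x, F(x, n), e ** x)  # F(x, n) is close to e^x for integer x
```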

Next we need to establish that $F(x, n)$ tends to a limit even when $x$ is not a positive integer. First we deal with a real number $x > 0$. Let $m$ be a positive integer with $m > x$. Then clearly $F(x, n) < F(m, n)$. Since $F(m, n)$ is strictly increasing and tends to $e^{m}$ (shown in the last paragraph), it follows that $F(m, n) < e^{m}$ and therefore $F(x, n) < e^{m}$. Thus $F(x, n)$ is bounded above and tends to a limit. Moreover since $F(x, n)$ is strictly increasing the limit is not less than $F(x, 1) = 1 + x$. Therefore we have $\exp(x) \geq 1 + x$ for all $x > 0$.

Next we handle the case $x < 0$. Let $x = -y$ with $y > 0$, so that $F(y, n)$ tends to a positive limit denoted by $\exp(y)$. Consider the product $$F(x, n) F(y, n) = \left(1 - \frac{x^{2}}{n^{2}}\right)^{n}$$ If $n > |x|$ then by Bernoulli's inequality we have $$1 - \frac{x^{2}}{n} \leq \left(1 - \frac{x^{2}}{n^{2}}\right)^{n} \leq 1$$ and hence by the squeeze theorem the product $F(x, n)F(y, n)$ tends to $1$ as $n \to \infty$. It now follows that $F(x, n)$ tends to the positive limit $1/\exp(y)$. We have thus shown that the limit $(1)$ exists for all $x$ and is positive. At the same time we have also proved that $\exp(x)\exp(-x) = 1$ for all $x$.

Next let $0 < x < 1$; then $\exp(-x) = 1/\exp(x)$, so that $$\exp(-x) = \lim_{n \to \infty}\left(1 + \frac{x}{n}\right)^{-n} = \lim_{n \to \infty}1/F(x, n)$$ and by the general binomial theorem (for an arbitrary index) we can see that $$\frac{1}{F(x, n)} \geq 1 - x$$ so that $\exp(-x) \geq 1 - x$ for all $0 < x < 1$. We thus have the inequality $$\exp(x) \geq 1 + x$$ for all $x \in (-1, \infty)$. We are now ready to prove the fundamental limit concerning the exponential function, namely that $(\exp(x) - 1)/x \to 1$ as $x \to 0$.

Let $0 < x < 1$. Then we have $$\exp(x) \geq 1 + x,\, \exp(-x) \geq 1 - x$$ which shows that $$\frac{\exp(x) - 1}{x} \geq 1,\, \exp(x) \leq \frac{1}{1 - x}$$ The last inequality gives $\exp(x) - 1 \leq \dfrac{1}{1 - x} - 1 = \dfrac{x}{1 - x}$ and hence $$\frac{\exp(x) - 1}{x} \leq \frac{1}{1 - x}$$ Thus we have $$1 \leq \frac{\exp(x) - 1}{x} \leq \frac{1}{1 - x}$$ for $0 < x < 1$. Letting $x \to 0^{+}$ we see that $$\lim_{x \to 0^{+}}\frac{\exp(x) - 1}{x} = 1$$ This limit also shows that $\exp(x) \to 1$ as $x \to 0^{+}$. Using this fact and the relation $\exp(x)\exp(-x) = 1$ it is easy to prove that $(\exp(x) - 1)/x \to 1$ as $x \to 0^{-}$. Thus the fundamental limit concerning the exponential function is established.
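
The squeeze $1 \leq (\exp(x) - 1)/x \leq 1/(1 - x)$ is also visible numerically; a sketch in which $\exp$ is approximated via definition $(1)$ with a large fixed $n$ (an arbitrary choice):

```python
def exp_approx(x, n=10 ** 7):
    return (1.0 + x / n) ** n

for x in (0.5, 0.1, 0.01):
    q = (exp_approx(x) - 1.0) / x
    print(x, 1.0, q, 1.0 / (1.0 - x))   # 1 <= q <= 1/(1 - x)
```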

Next we prove the functional equation $\exp(x + y) = \exp(x)\exp(y)$ for all $x, y$. To prove this we need the following result:
Lemma: If $a_{n}$ is a sequence of real or complex terms such that $n(a_{n} - 1) \to 0$ as $n \to \infty$ then $a_{n}^{n} \to 1$ as $n \to \infty$.

To prove the lemma let us write $b_{n} = a_{n} - 1$ so that $nb_{n} \to 0$. We have $$a_{n}^{n} = (1 + b_{n})^{n} = 1 + nb_{n} + \frac{n(n - 1)}{2}b_{n}^{2} + \cdots$$ Using $\dbinom{n}{k} \leq \dfrac{n^{k}}{k!} \leq \dfrac{n^{k}}{2^{k - 1}}$ it follows that $$|a_{n}^{n} - 1| \leq |nb_{n}| + \frac{|nb_{n}|^{2}}{2} + \frac{|nb_{n}|^{3}}{2^{2}} + \cdots \leq \dfrac{|nb_{n}|}{1 - \dfrac{|nb_{n}|}{2}}$$ where the last step sums a geometric series and is valid once $|nb_{n}| < 2$, which certainly holds for all large $n$. Since $nb_{n} \to 0$ it follows from the above inequality that $a_{n}^{n} \to 1$.

We now set $$a_{n} = \dfrac{\left(1 + \dfrac{x + y}{n}\right)}{\left(1 + \dfrac{x}{n}\right)\left(1 + \dfrac{y}{n}\right)}$$ A short computation gives $$n(a_{n} - 1) = \dfrac{-xy/n}{\left(1 + \dfrac{x}{n}\right)\left(1 + \dfrac{y}{n}\right)} \to 0$$ and hence $a_{n}^{n} \to 1$ by the lemma. Thus we get $\exp(x + y) = \exp(x)\exp(y)$. Using this functional equation and the limit $\dfrac{\exp(x) - 1}{x} \to 1$ as $x \to 0$ we can prove as before that the derivative of $\exp(x)$ is $\exp(x)$ itself. The positive derivative gives the strictly increasing nature of $\exp(x)$ and the existence of the inverse function $\log x$, and thus the theory of these functions can be developed completely.
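
Both the hypothesis of the lemma and its conclusion can be checked numerically for this particular $a_{n}$ (sample $x, y$ arbitrary):

```python
x, y = 0.8, 1.3
for n in (10, 1000, 100_000):
    a = (1 + (x + y) / n) / ((1 + x / n) * (1 + y / n))
    print(n, n * (a - 1), a ** n)   # n(a_n - 1) -> 0 and a_n^n -> 1
```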

Approach Based on Differential Equations

Another approach, which has its origins in this answer at MSE, is quite surprising and novel. It begins with the definition of $\exp(x)$ as the unique solution of the differential equation $$\frac{dy}{dx} = y, \,\, y(0) = 1\tag{7}$$ The fundamental challenge is to show that such a solution exists. We thus need to find a function $f(x)$ such that $f'(x) = f(x)$ and $f(0) = 1$. We begin by analyzing the properties such a function $f(x)$ must have.

We first show that $f(x) > 0$ for all $x$. Let $g(x) = f(x)f(-x)$, so that $$g'(x) = f'(x)f(-x) - f(x)f'(-x) = f(x)f(-x) - f(x)f(-x) = 0$$ Hence $g(x)$ is a constant and $g(x) = g(0) = f(0)f(0) = 1$. Thus $f(x)f(-x) = 1$ and hence $f(x) \neq 0$ for all $x$. It follows by continuity that $f(x)$ must be of constant sign, and since $f(0) > 0$ it follows that $f(x) > 0$ for all $x$.

This means that $f'(x) = f(x) > 0$, so that $f(x)$ is strictly increasing, continuous and differentiable for all $x$ and hence possesses an inverse $F(x)$ such that $x = f(y)$ implies $y = F(x)$. By the rule for differentiating inverse functions we get $F'(x) = 1/f'(F(x)) = 1/f(F(x)) = 1/x$, and $F(1) = 0$, so that $F(x) = \int_{1}^{x}(dt/t)$. We thus reach the definition of $\log x = F(x)$ as an integral and can derive all the properties of $F(x)$ and $f(x)$ from it. This shows that there is a genuine function $f(x)$ which satisfies the differential equation $(7)$.

For the uniqueness part, if we assume that there are two solutions $f(x), h(x)$ then their difference $\phi(x) = f(x) - h(x)$ satisfies $\phi'(x) = \phi(x)$ and $\phi(0) = 0$. We will show that such a function must be identically zero. Suppose on the contrary that there is a number $a$ for which $\phi(a) \neq 0$. Consider the function $\psi(x) = \phi(a + x)\phi(a - x)$; then we have $$\psi'(x) = \phi'(a + x)\phi(a - x) - \phi(a + x)\phi'(a - x) = 0$$ so that $\psi(x)$ is a constant and $$\phi(a + x)\phi(a - x) = \psi(x) = \psi(0) = \phi(a)\phi(a) > 0$$ Putting $x = a$ we get $\phi(2a)\phi(0) > 0$, which contradicts $\phi(0) = 0$. This shows that equation $(7)$ has a unique solution and the definition of $\exp(x)$ is unambiguous. Since we have found the derivatives of $\exp(x)$ and its inverse we may easily deduce all the properties of these functions.
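
Although the existence and uniqueness proofs above are purely analytic, the differential equation $(7)$ also invites a numerical construction. A minimal Euler-method sketch (step count chosen arbitrarily) follows; note that for this particular equation Euler's method with $n$ steps from $0$ to $x$ reproduces exactly the sequence $F(x, n) = (1 + x/n)^{n}$ of definition $(1)$:

```python
import math

def exp_euler(x, n=100_000):
    """Approximate the solution of y' = y, y(0) = 1 at the point x
    using Euler's method with n steps."""
    h = x / n
    y = 1.0
    for _ in range(n):
        y += h * y        # y_{k+1} = y_k + h*y_k = y_k * (1 + x/n)
    return y

print(exp_euler(1.0), math.exp(1.0))
```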

Logarithm as a Limit

Another approach, which is an extension of an exercise in Hardy's Pure Mathematics, comes from the equation $$\log x = \lim_{n \to \infty}n(\sqrt[n]{x} - 1) = \lim_{n \to \infty}n(x^{1/n} - 1)\tag{8}$$ and we take it as the definition of $\log x$ for $x > 0$. Clearly we must show that the limit above exists for $x > 0$. This is a bit tricky, but not that difficult either. We first show that $\sqrt[n]{x} \to 1$, which is clearly necessary for the limit in $(8)$ to exist.

This is clearly the case when $x = 1$, so let us take the case $x > 1$. Then we can see that $\sqrt[n]{x} > 1$ and therefore let $\sqrt[n]{x} = 1 + h$ where $h > 0$. Note that $h$ depends on $x$ as well as $n$. We then have $x = (1 + h)^{n} \geq 1 + nh$ so we have $$0 < h \leq \frac{x - 1}{n}$$ Taking limits as $n \to \infty$ we see that $h \to 0$ so that $\sqrt[n]{x} = 1 + h \to 1$. If $0 < x < 1$ then we can write $x = 1/y$ where $y > 1$. Then we can see that $\sqrt[n]{x} = 1/\sqrt[n]{y} \to 1/1 = 1$ as $n \to \infty$. Hence we have established that $\sqrt[n]{x} \to 1$ as $n \to \infty$ when $x > 0$.

Next we need to establish some inequalities. Let $\alpha, \beta$ be two numbers with $0 < \beta < 1 < \alpha$ and let $r$ be a positive integer. We can easily see that for $i = 0, 1, 2, \ldots, (r - 1)$ we have $\alpha^{i} < \alpha^{r}$ and adding these inequalities we get $$r\alpha^{r} > 1 + \alpha + \alpha^{2} + \cdots + \alpha^{r - 1}$$ Multiplying by $(\alpha - 1) > 0$ we get $$r\alpha^{r}(\alpha - 1) > \alpha^{r} - 1$$ Next we add $r(\alpha^{r} - 1)$ to both sides to get $$r(\alpha^{r + 1} - 1) > (r + 1)(\alpha^{r} - 1)$$ and thus we obtain $$\frac{\alpha^{r + 1} - 1}{r + 1} > \frac{\alpha^{r} - 1}{r}\text{ where }\alpha > 1\tag{9}$$ Similarly we can show that $$\frac{1 - \beta^{r + 1}}{r + 1} < \frac{1 - \beta^{r}}{r}\text{ where }0 < \beta < 1\tag{10}$$ It now follows that if $r, s$ are positive integers with $r > s$ then $$\frac{\alpha^{r} - 1}{r} > \frac{\alpha^{s} - 1}{s},\,\,\frac{1 - \beta^{r}}{r} < \frac{1 - \beta^{s}}{s}\tag{11}$$ where $0 < \beta < 1 < \alpha$.

Next we show that the above inequalities remain valid even when $r, s$ are positive rational numbers with $r > s$. Let $r = a/b, s = c/d$ where $a, b, c, d$ are positive integers with $ad > bc$. Let $\alpha = \gamma^{bd}$ so that $\gamma$ is the $bd^{\text{th}}$ root of $\alpha$. Then $\gamma > 1$ and by $(11)$ we have $$\frac{\gamma^{ad} - 1}{ad} > \frac{\gamma^{bc} - 1}{bc}$$ and multiplication by $bd$ gives $$\frac{\alpha^{r} - 1}{r} > \frac{\alpha^{s} - 1}{s}$$ In the same manner we can show the inequality for $\beta$. It is now established that the inequalities $(11)$ are valid for all rational $r, s$ with $0 < s < r$.

Putting $s = 1$ we get $$\alpha^{r} - 1 > r(\alpha - 1),\,\, 1 - \beta^{r} < r(1 - \beta)\tag{12}$$ where $r > 1$. Again putting $r = 1$ in $(11)$ we get $$\alpha^{s} - 1 < s(\alpha - 1),\,\,1 - \beta^{s} > s(1 - \beta)\tag{13}$$ where $0 < s < 1$. We will go one step further and note that if $\alpha > 1$ then $0 < 1/\alpha < 1$, and if $0 < \beta < 1$ then $1/\beta > 1$. Hence in the inequalities $(12), (13)$ we can replace $\alpha$ by $1/\beta$ and $\beta$ by $1/\alpha$. Doing so we get $$\alpha^{r} - 1 < r\alpha^{r - 1}(\alpha - 1),\,\, 1 - \beta^{r} > r\beta^{r - 1}(1 - \beta)$$ and $$\alpha^{s} - 1 > s\alpha^{s - 1}(\alpha - 1),\,\, 1 - \beta^{s} < s\beta^{s - 1}(1 - \beta)$$ We thus finally obtain $$r\alpha^{r - 1}(\alpha - 1) > \alpha^{r} - 1 > r(\alpha - 1),\,\,s\alpha^{s - 1}(\alpha - 1) < \alpha^{s} - 1 < s(\alpha - 1)\tag{14}$$ and $$r\beta^{r - 1}(1 - \beta) < 1 - \beta^{r} < r(1 - \beta),\,s\beta^{s - 1}(1 - \beta) > 1 - \beta^{s} > s(1 - \beta)\tag{15}$$

We are now ready to analyze the limit of the function $\phi(x, n) = n(x^{1/n} - 1)$ which is used in definition $(8)$. Clearly if $x = 1$ then $\phi(x, n) = 0$ and hence the limit is $0$. Let us suppose that $x > 1$; then from inequality $(11)$ (putting $\alpha = x, r = 1/n, s = 1/(n + 1)$) we see that $\phi(x, n) > \phi(x, n + 1)$, so that $\phi(x, n)$ is strictly decreasing as $n$ increases. Also note that $\phi(x, n) > 0$ if $x > 1$. It follows that the limit of $\phi(x, n)$ exists as $n \to \infty$. If in $(14)$ we put $\alpha = x, s = 1/n$ then we can see that $$\phi(x, n) > \sqrt[n]{x}\left(1 - \frac{1}{x}\right) > 1 - \frac{1}{x}$$ so it follows that $\phi(x, n) \to l$ where $l \geq 1 - (1/x) > 0$. Hence $\phi(x, n)$ tends to a positive limit as $n \to \infty$. It follows that $\log x = \lim_{n \to \infty}n(\sqrt[n]{x} - 1)$ is defined for $x \geq 1$, with $\log 1 = 0$ and $\log x > 0$ if $x > 1$.
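
The decreasing convergence of $\phi(x, n)$ for $x > 1$ is easy to observe; a sketch (sample $x$ arbitrary):

```python
import math

def phi(x, n):
    return n * (x ** (1.0 / n) - 1.0)

x = 5.0
for n in (1, 10, 100, 10_000, 1_000_000):
    print(n, phi(x, n))       # decreases towards log x
print("log x =", math.log(x))
```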

To analyze the limit $(8)$ when $0 < x < 1$ we put $x = 1/y$ so that $y > 1$; then $\phi(x, n) = n(1 - \sqrt[n]{y})/\sqrt[n]{y} = -\phi(y, n)/\sqrt[n]{y}$ and thus $\phi(x, n) \to -\log y = -\log(1/x) < 0$. Thus we see that the function $\log x$ is defined for all $x > 0$; it is negative when $0 < x < 1$, positive when $x > 1$, and we also have $\log (1/x) = -\log x$ for all $x > 0$. We next establish the fundamental property of $\log x$, namely $$\log (xy) = \log x + \log y\tag{16}$$ Clearly we have \begin{align} \log(xy) &= \lim_{n \to \infty}n(\sqrt[n]{xy} - 1)\notag\\ &= \lim_{n \to \infty}n(\sqrt[n]{xy} - \sqrt[n]{y} + \sqrt[n]{y} - 1)\notag\\ &= \lim_{n \to \infty}\left\{n\sqrt[n]{y}(\sqrt[n]{x} - 1) + n(\sqrt[n]{y} - 1)\right\}\notag\\ &= 1\cdot \log x + \log y = \log x + \log y\notag \end{align} where we have used $\sqrt[n]{y} \to 1$. Replacing $y$ by $1/y$ and noting that $\log (1/y) = -\log y$ we see that $\log(x/y) = \log x - \log y$. If $x > y > 0$ then $x/y > 1$ and hence $\log (x/y) > 0$, i.e. $\log x > \log y$, so that $\log x$ is a strictly increasing function of $x$ for all $x > 0$.
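
The additivity $(16)$ can be tested directly on $\phi(x, n)$ with a large fixed $n$ (arbitrary sample values):

```python
import math

def phi(x, n):
    return n * (x ** (1.0 / n) - 1.0)

x, y, n = 2.0, 3.5, 10 ** 6
print(phi(x * y, n), phi(x, n) + phi(y, n))   # both near log(xy)
print(math.log(x * y))
```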

We next need to establish the derivative of $\log x$. We first establish the limit formula $$\lim_{x \to 1}\frac{\log x}{x - 1} = 1\text{ or equivalently }\lim_{x \to 0}\frac{\log(1 + x)}{x} = 1\tag{17}$$ We establish the first form, which deals with $x \to 1$, focusing first on $x \to 1^{+}$ so that $x > 1$. Putting $\alpha = x, s = 1/n$ in $(14)$ we get $$x^{1/n}\cdot\frac{x - 1}{x} < n(\sqrt[n]{x} - 1) < x - 1$$ Taking limits as $n \to \infty$ we get $$\frac{x - 1}{x}\leq \log x \leq x - 1$$ and dividing by $(x - 1) > 0$ we get $$\frac{1}{x}\leq \frac{\log x}{x - 1} \leq 1$$ and letting $x \to 1^{+}$ we get the desired limit as $1$. If $x \to 1^{-}$ so that $0 < x < 1$ then we put $x = 1/y$ with $y \to 1^{+}$. Then we have $$\lim_{x \to 1^{-}}\frac{\log x}{x - 1} = \lim_{y \to 1^{+}}\frac{\log(1/y)}{(1/y) - 1} = \lim_{y \to 1^{+}}\frac{y\log y}{y - 1} = 1\cdot 1 = 1$$ Thus the limit formula $(17)$ is established. It is now an easy matter to calculate the derivative of $\log x$. We have \begin{align} \{\log x\}' &= \lim_{h \to 0}\frac{\log (x + h) - \log x}{h}\notag\\ &= \lim_{h \to 0}\dfrac{\log \left(\dfrac{x + h}{x}\right)}{h}\notag\\ &= \lim_{h \to 0}\dfrac{\log \{1 + (h/x)\}}{h/x}\cdot\frac{1}{x}\notag\\ &= 1\cdot\frac{1}{x} = \frac{1}{x}\notag \end{align} Once we have established the derivative of $\log x$ it is an easy route to the other properties of $\log x$ and to defining $\exp(x)$ as the inverse of $\log x$. However I must mention one point about inverses. If $\phi(x, n) = n(\sqrt[n]{x} - 1) = y$ then we automatically get $x = \left(1 + \dfrac{y}{n}\right)^{n} = F(y, n)$, so that the functions $\phi(x, n)$ and $F(x, n)$ are natural inverses of each other, and they maintain this relationship in the limit as $n \to \infty$, giving rise to $\log x$ and $\exp(x)$ respectively.
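
The exact inverse relationship between $\phi(\cdot, n)$ and $F(\cdot, n)$ noted above holds for every finite $n$, not just in the limit, and can be verified directly (sample values arbitrary):

```python
def F(y, n):
    return (1.0 + y / n) ** n

def phi(x, n):
    return n * (x ** (1.0 / n) - 1.0)

x, n = 3.7, 1000
y = phi(x, n)
print(x, F(y, n))   # F(phi(x, n), n) recovers x up to rounding
```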

Approach Based on Infinite Series

Another approach to the theory of exponential and logarithmic functions defines $$\exp(x) = 1 + x + \frac{x^{2}}{2!} + \frac{x^{3}}{3!} + \cdots\tag{18}$$ and it gives everything very easily. The property $\exp(x + y) = \exp(x)\exp(y)$ follows immediately by multiplication of infinite series and the binomial theorem. Also the fundamental limit $(\exp(x) - 1)/x \to 1$ is almost obvious, and then the derivative formula $\{\exp(x)\}' = \exp(x)$ follows. I will prove that $\exp(x) \neq 0$ for any $x$: by the functional relation $\exp(x + y) = \exp(x)\exp(y)$, if $\exp(x)$ vanished at one point then it would vanish everywhere, and since $\exp(0) = 1$ it follows that $\exp(x) \neq 0$ for all $x$. It is then easy to see that $\exp(x) = \exp(x/2)\exp(x/2) > 0$ for all $x$. The equation $\{\exp(x)\}' = \exp(x) > 0$ gives the strictly increasing nature of $\exp(x)$ and thus the existence of its inverse. Readers are advised to complete the theory based on this approach.
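
For completeness, a partial-sum sketch of definition $(18)$ (the cutoff of $30$ terms is an arbitrary choice, ample for moderate $|x|$):

```python
import math

def exp_series(x, terms=30):
    """Partial sum of the series (18), built term by term."""
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        term *= x / (k + 1)   # next term x^{k+1}/(k+1)!
    return total

for x in (-2.0, 0.5, 3.0):
    print(x, exp_series(x), math.exp(x))
```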

In the next post we will provide the most elementary approach which starts by defining the general power $a^{b}$ directly for $a > 0$ and any real $b$. This approach is the most difficult one to handle but it is more appealing to those students who have not been introduced to calculus.


Comments


  1. This is a fantastic article! Very well done. - Mark (aka, Dr.MV on MSE)

  2. Dear Mark (Dr. MV),

I am fortunate to receive such kind words from a person of your stature. Thanks a lot. Lately I have not been getting much time to add new entries to my blog (nor to contribute much on MSE), but your comment has given me enough energy to contribute something fresh on my blog as well as on MSE.

    Paramanand

  3. This article is really helpful. Thank you. - (Dragon on MSE)
