Teach Yourself Limits in 8 Hours: Part 4

After covering various techniques for evaluating limits, we now provide proofs of the results on which these techniques are founded. This material is not difficult, but it is somewhat abstract and may not suit beginners who are mainly interested in learning techniques and solving limit problems. Readers interested in the justification of these techniques, however, should pay close attention to what follows.

Proofs of Rules of Limits

We provide proofs for some of the rules and leave the remaining rules for the reader to prove along similar lines of argument. We start with the rule dealing with inequalities:
If $f(x) \leq g(x)$ in the neighborhood of $a$ then $\lim_{x \to a}f(x) \leq \lim_{x \to a}g(x)$ provided both these limits exist.

Let $A = \lim_{x \to a}f(x)$ and $B = \lim_{x \to a}g(x)$; we have to prove that $A \leq B$. Suppose instead that $A > B$ and let $\epsilon = (A - B)/2$. Then there are $\delta_{1}, \delta_{2} > 0$ such that $|f(x) - A| < \epsilon$ when $0 < |x - a| < \delta_{1}$ and $|g(x) - B| < \epsilon$ when $0 < |x - a| < \delta_{2}$. Taking $\delta = \min(\delta_{1}, \delta_{2})$, for $0 < |x - a| < \delta$ we have $$A - \epsilon < f(x) < A + \epsilon,\,\, B - \epsilon < g(x) < B + \epsilon$$ and noting that $A - \epsilon = B + \epsilon$ we see that $$g(x) < B + \epsilon = A - \epsilon < f(x)$$ which is contrary to the hypothesis $f(x) \leq g(x)$. Hence we must have $A \leq B$. Note that the same argument applies if the hypothesis is strengthened to $f(x) < g(x)$; even then the conclusion remains $A \leq B$ and not $A < B$ (consider $f(x) = -|x|$, $g(x) = |x|$ at $a = 0$, where both limits equal $0$).

Translating this formal proof into ordinary language: the values of $f(x)$ are close to $A$ and those of $g(x)$ are close to $B$, so if $A > B$ there will be values of $f(x)$ which exceed the corresponding values of $g(x)$, contrary to the hypothesis.

Next we establish the following rule of division:
$\displaystyle \lim_{x \to a}\frac{f(x)}{g(x)} = \dfrac{{\displaystyle \lim_{x \to a}f(x)}}{{\displaystyle \lim_{x \to a}g(x)}}$ provided that both the limits $\lim_{x \to a}f(x)$ and $\lim_{x \to a}g(x)$ exist and $\lim_{x \to a}g(x) \neq 0$.

Since $\lim_{x \to a}g(x) = B \neq 0$ there exists a number $\delta_{1} > 0$ such that $$|g(x) - B| < \frac{|B|}{2}$$ whenever $0 < |x - a| < \delta_{1}$, so that $|g(x)| \geq |B| - |g(x) - B| > |B|/2$ for $0 < |x - a| < \delta_{1}$. Next let $\lim_{x \to a}f(x) = A$. Then for any $\epsilon > 0$ there exists a $\delta_{2} > 0$ such that $$|f(x) - A| < \frac{|B|\epsilon}{4}$$ for $0 < |x - a| < \delta_{2}$. Further there is a $\delta_{3} > 0$ such that $$|g(x) - B| < \frac{|B|^{2}\epsilon}{4|A| + 1}$$ whenever $0 < |x - a| < \delta_{3}$. Let $\delta = \min(\delta_{1}, \delta_{2}, \delta_{3})$; then for $0 < |x - a| < \delta$ we have $1/|g(x)| < 2/|B|$ so that \begin{align} \left|\frac{f(x)}{g(x)} - \frac{A}{B}\right| &= \left|\frac{Bf(x) - AB + AB - Ag(x)}{Bg(x)}\right|\notag\\ &= \left|\frac{f(x) - A}{g(x)} + \frac{A}{B}\frac{B - g(x)}{g(x)}\right|\notag\\ &\leq \frac{|f(x) - A|}{|g(x)|} + \frac{|A|}{|B|}\frac{|B - g(x)|}{|g(x)|}\notag\\ &< \frac{|B|\epsilon}{4}\cdot\frac{2}{|B|} + \frac{|A|}{|B|}\cdot\frac{|B|^{2}\epsilon}{4|A| + 1}\cdot\frac{2}{|B|}\notag\\ &< \frac{\epsilon}{2} + \frac{\epsilon}{2}\notag\\ &= \epsilon\notag \end{align} and therefore $f(x)/g(x) \to A/B$ as $x \to a$.
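As a quick numerical sanity check of the quotient rule (illustrative only, and no substitute for the proof above), consider the example $f(x) = x^{2} + 1$ and $g(x) = x + 2$ at $a = 1$, an arbitrary choice of mine, where the rule predicts the limit $2/3$:

```python
# Sanity check of the quotient rule with f(x) = x^2 + 1, g(x) = x + 2, a = 1.
# Both limits exist and lim g = 3 != 0, so lim f/g should be 2/3.

def f(x):
    return x * x + 1

def g(x):
    return x + 2

a, predicted = 1.0, 2.0 / 3.0

# Approaching a from both sides, f(x)/g(x) closes in on A/B = 2/3;
# the error shrinks roughly linearly with the distance h from a.
for h in [1e-1, 1e-3, 1e-6]:
    for x in (a - h, a + h):
        assert abs(f(x) / g(x) - predicted) < 3 * h
```

The factor $3$ in the error bound is empirical for this particular example; it is the proof's $\delta$-bookkeeping that guarantees such a bound exists for every $\epsilon$.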

Going further we establish the Sandwich Theorem (or Squeeze Theorem):
If $f(x) \leq g(x) \leq h(x)$ in a neighborhood of $a$ and $\lim_{x \to a}f(x) = \lim_{x \to a}h(x) = L$ then $\lim_{x \to a}g(x) = L$.

For any $\epsilon > 0$ there are $\delta_{1}, \delta_{2} > 0$ such that $|f(x) - L| < \epsilon$ for $0 < |x - a| < \delta_{1}$ and $|h(x) - L| < \epsilon$ for $0 < |x - a| < \delta_{2}$. Choose $\delta > 0$ no larger than $\min(\delta_{1}, \delta_{2})$ and small enough that the inequality $f(x) \leq g(x) \leq h(x)$ also holds for $0 < |x - a| < \delta$. Then $$L - \epsilon < f(x) < L + \epsilon\text{ and }L - \epsilon < h(x) < L + \epsilon$$ so that $$L - \epsilon < f(x) \leq g(x) \leq h(x) < L + \epsilon$$ and hence $$L - \epsilon < g(x) < L + \epsilon$$ i.e. $|g(x) - L| < \epsilon$. Thus $\lim_{x \to a}g(x) = L$.
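The classic illustration of the Sandwich Theorem is $g(x) = x^{2}\sin(1/x)$, squeezed between $\pm x^{2}$. The sketch below (an illustration, not part of the proof) checks the squeeze numerically:

```python
import math

# g(x) = x^2 sin(1/x) oscillates wildly near 0, but it is squeezed between
# f(x) = -x^2 and h(x) = x^2, both of which tend to 0 as x -> 0.

def g(x):
    return x * x * math.sin(1.0 / x)

for x in [1e-1, -1e-2, 1e-3, -1e-4]:
    assert -x * x <= g(x) <= x * x   # the squeeze inequality
    assert abs(g(x)) <= x * x        # hence g(x) -> 0 with x
```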

Finally we establish the rule of substitution in limits:
If $\lim_{x \to a}g(x) = b$ and $\lim_{x \to b}f(x) = L$ and $g(x) \neq b$ in a certain neighborhood of $a$ (except possibly at $a$) then $\lim_{x \to a}f\{g(x)\} = L$.

We present this proof in informal language. For $\lim_{x \to a}f\{g(x)\}$ to equal $L$ we must be able to make the values of $f(g(x))$ arbitrarily close to $L$ by choosing $x$ sufficiently close to $a$. Now the values of $f$ can be made arbitrarily close to $L$ by choosing its argument sufficiently close to $b$ (because $\lim_{x \to b}f(x) = L$). In $f(g(x))$ the argument of $f$ is $g(x)$, so $f(g(x))$ can be made arbitrarily close to $L$ by making $g(x)$ sufficiently close to $b$, and since $\lim_{x \to a}g(x) = b$ this is possible by choosing $x$ sufficiently close to $a$. Note that the condition $g(x) \neq b$ is needed because the limit $\lim_{x \to b}f(x) = L$ deals only with values of $f$ when its argument is not equal to $b$. Hence while handling $f(g(x))$ it is essential that the argument of $f$, namely $g(x)$, is not equal to $b$.
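The role of the condition $g(x) \neq b$ is easiest to see in a counterexample. In the sketch below (a standard counterexample, not taken from the post), $g$ is identically $0$, so the argument of $f$ keeps hitting $b = 0$, where $f$ has the "wrong" value:

```python
# lim_{x->0} g(x) = 0 and lim_{y->0} f(y) = 0, yet f(g(x)) -> 1, because
# g takes the value b = 0 itself, violating the condition g(x) != b.

def f(y):
    return 1.0 if y == 0 else 0.0    # limit at 0 is 0, but f(0) = 1

def g(x):
    return 0.0                        # constant, so g(x) = b = 0 always

for x in [0.1, 0.01, 0.001]:
    assert f(g(x)) == 1.0             # naive substitution would predict 0
```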

Proof of Standard Limits

We first start with the trigonometric limit $$\lim_{x \to 0}\frac{\sin x}{x} = 1$$ We note that $(\sin (-x)) / (-x) = (\sin x) / x$ and hence it is sufficient to consider the case when $x \to 0+$. Let us then consider the case when $0 < x < \pi / 2$.
Consider a circle with center $O$ and radius $r$, and let $A, B$ be two points on the circle such that $\angle AOB = x$ (measured in radians), with $AT$ the tangent to the circle at $A$. The radius $OB$ extended meets the tangent $AT$ at $T$, so that $OAT$ is a triangle. Clearly $AT = r\tan x$, so the area of triangle $OAT$ is $\dfrac{1}{2}r^{2}\tan x$; similarly the area of triangle $OAB$ is $\dfrac{1}{2}r^{2}\sin x$ and the area of sector $OAB$ is $\dfrac{1}{2}r^{2}x$. We now see that \begin{align} &\text{area of triangle }OAB < \text{area of sector }OAB < \text{area of triangle }OAT\notag\\ &\Rightarrow \frac{1}{2}r^{2} \sin x < \frac{1}{2}r^{2}x < \frac{1}{2}r^{2}\tan x\notag\\ &\Rightarrow \sin x < x < \tan x = \frac{\sin x}{\cos x}\notag\\ &\Rightarrow \cos x < \frac{\sin x}{x} < 1\notag \end{align} Taking limits as $x \to 0+$ and noting that $\lim_{x \to 0+}\cos x = 1$ we obtain (by the Sandwich Theorem) $\lim_{x \to 0+}\dfrac{\sin x}{x} = 1$.
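The chain of inequalities $\cos x < \dfrac{\sin x}{x} < 1$ can be checked numerically; this sketch is only an illustration of the squeeze, not a proof:

```python
import math

# Verify cos x < sin(x)/x < 1 for several x in (0, pi/2) and watch the
# ratio approach 1 as x -> 0+ (in fact 1 - sin(x)/x < x^2/6 < x^2).
for x in [1.0, 0.5, 0.1, 1e-3, 1e-6]:
    ratio = math.sin(x) / x
    assert math.cos(x) < ratio < 1.0
    assert abs(ratio - 1.0) < x * x
```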

Next we focus on the algebraic limit $$\lim_{x \to a}\frac{x^{n} - a^{n}}{x - a} = na^{n - 1}$$ When $n$ is a positive integer the result follows easily from the Binomial theorem: $$\lim_{x \to a}\frac{x^{n} - a^{n}}{x - a} = \lim_{h \to 0}\frac{(a + h)^{n} - a^{n}}{h} = \lim_{h \to 0}\frac{(a^{n} + na^{n - 1}h + \cdots) - a^{n}}{h} = na^{n - 1}$$ If $n = 0$ then the numerator vanishes and the limit is $0$, so the formula holds in this case also. If $n$ is a negative integer, write $n = -m$ where $m$ is a positive integer; then $$\lim_{x \to a}\frac{x^{n} - a^{n}}{x - a} = \lim_{x \to a}\frac{a^{m} - x^{m}}{(x - a)x^{m}a^{m}} = -\frac{ma^{m - 1}}{a^{2m}} = -ma^{-m - 1} = na^{n - 1}$$ To handle the case when $n = p/q$ is a fraction we restrict to the case $a > 0$ with $p, q$ positive integers. First we note that $$\lim_{x \to a}\frac{x^{q} - a^{q}}{x - a} = qa^{q - 1}$$ so that the ratio $\dfrac{x^{q} - a^{q}}{x - a}$ is bounded and bounded away from zero when $x$ is in a certain neighborhood of $a$ (and since $\lim\limits_{x \to a}x^{q} = a^{q}$ it follows that $x^{q}$ lies in a certain neighborhood of $a^{q}$). Replacing $x$ by $x^{1/q}$ and $a$ by $a^{1/q}$ we see that $\dfrac{x - a}{x^{1/q} - a^{1/q}}$ is bounded and bounded away from zero when $x$ is in a certain neighborhood of $a$.
It follows that $\dfrac{x^{1/q} - a^{1/q}}{x - a}$ is also bounded and bounded away from zero when $x$ is near $a$, so that there is a constant $K$ such that $$0 < \frac{x^{1/q} - a^{1/q}}{x - a} < K$$ Thus we have $$0 < |x^{1/q} - a^{1/q}| < K|x - a|$$ Taking limits as $x \to a$ and using the Sandwich theorem we get $$\lim_{x \to a}x^{1/q} = a^{1/q}$$ We can now see that \begin{align} \lim_{x \to a}\frac{x^{n} - a^{n}}{x - a} &= \lim_{x \to a}\frac{x^{p/q} - a^{p/q}}{x^{1/q} - a^{1/q}}\cdot \frac{x^{1/q} - a^{1/q}}{x - a}\notag\\ &= \lim_{y \to b}\frac{y^{p} - b^{p}}{y - b}\cdot\frac{y - b}{y^{q} - b^{q}}\text{ (putting }y = x^{1/q}, b = a^{1/q})\notag\\ &= pb^{p - 1}/qb^{q - 1} = \frac{p}{q}b^{p - q} = na^{n - 1}\notag \end{align} If $n$ is a negative rational then the proof follows exactly along the same lines as the case of $n$ a negative integer. In the above proof notice that it is essential to establish $\lim_{x \to a}x^{1/q} = a^{1/q}$ first, and this requires a little ingenuity. Another, simpler, approach is to consider the various possibilities for the limit $\lim_{x\to a} x^{1/q}$ and show that if this limit does not exist then $\lim_{x\to a} x$ does not exist either. This is clearly a contradiction, and hence the limit $\lim_{x\to a} x^{1/q} = b$ exists and is non-negative. Using the product rule of limits we can easily see that $b^{q} = a$ and hence $b = a^{1/q}$.
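For a rational exponent the formula can be checked numerically. The values $n = 3/2$, $a = 4$ below are an arbitrary illustration of mine; the formula predicts the limit $\frac{3}{2}\cdot 4^{1/2} = 3$:

```python
# Difference quotient (x^n - a^n)/(x - a) for n = 3/2, a = 4; the limit
# predicted by the formula n * a^(n-1) is 1.5 * 4^0.5 = 3.
n, a = 1.5, 4.0
predicted = n * a ** (n - 1)

for h in [1e-2, 1e-4, 1e-6]:
    for x in (a - h, a + h):
        ratio = (x ** n - a ** n) / (x - a)
        assert abs(ratio - predicted) < h   # error shrinks with h
```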

When $n$ is irrational the definition of $x^{n}$ depends on the definitions of the exponential and logarithm functions, and hence we defer this part of the proof until the logarithmic and exponential limits are established.

To establish the logarithmic and exponential limits it is essential that we define these functions first. A typical easy route to their definition starts by defining $\log x$ as an integral. If we define $$\log x = \int_{1}^{x}\frac{dt}{t}$$ then we see immediately that $\log 1 = 0$ and, by the Fundamental Theorem of Calculus, $(\log x)' = 1/x$, so that the derivative at $x = 1$ is $1$ and therefore $$\lim_{x \to 1}\frac{\log x - \log 1}{x - 1} = 1$$ or $$\lim_{h \to 0}\frac{\log(1 + h)}{h} = 1$$ The exponential function $\exp(x)$ or $e^{x}$ is defined as the inverse of the logarithm function by the relation $y = \exp(x)$ if $\log y = x$. So if we put $\log(1 + h) = x$ we get $h = e^{x} - 1$, and as $h \to 0$ we also have $x \to 0$. Thus $\lim_{h \to 0}(\log(1 + h))/h = 1$ implies that $$\lim_{x \to 0}\frac{x}{e^{x} - 1} = 1$$ so that the exponential limit is also established.
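Both fundamental limits are easy to probe numerically. The sketch below uses Python's `math.log1p` and `math.expm1`, which compute $\log(1+h)$ and $e^{h}-1$ without the cancellation errors that the naive expressions would suffer for tiny $h$:

```python
import math

# log(1+h)/h -> 1 and h/(e^h - 1) -> 1 as h -> 0; both ratios behave
# like 1 - h/2 for small h, so the deviation from 1 is bounded by h.
for h in [1e-1, 1e-3, 1e-5]:
    assert abs(math.log1p(h) / h - 1.0) < h
    assert abs(h / math.expm1(h) - 1.0) < h
```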

We return to the case of the algebraic limit when $n$ is irrational. In this case $a$ must be positive, otherwise $a^{n}$ is not defined. We have $x^{n} = \exp(n\log x)$ and $a^{n} = \exp(n\log a)$. Let $\log x = y$ and $\log a = b$ so that $x \to a$ implies $y \to b$. We now have \begin{align} \lim_{x \to a}\frac{x^{n} - a^{n}}{x - a} &= \lim_{y \to b}\frac{e^{ny} - e^{nb}}{e^{y} - e^{b}}\notag\\ &= \lim_{y \to b}\frac{e^{ny} - e^{nb}}{n(y - b)}\cdot\frac{n(y - b)}{e^{y} - e^{b}}\notag\\ &= n\lim_{h \to 0}\frac{e^{n(b + h)} - e^{nb}}{nh}\cdot\frac{h}{e^{b + h} - e^{b}}\notag\\ &= n \lim_{h \to 0}\frac{e^{nb}(e^{nh} - 1)}{nh}\cdot\frac{h}{e^{b}(e^{h} - 1)}\notag\\ &= ne^{nb}/e^{b} = ne^{b(n - 1)} = n(e^{b})^{n - 1} = na^{n - 1}\notag \end{align} Next we establish the limit $\lim_{x \to \infty}\dfrac{\log x}{x^{a}} = 0$ for $a > 0$. This is based on the definition of $\log x$ as an integral. Let $0 < b < a$ and $x > 1$; then $t^{-1} < t^{b - 1}$ for $t \in (1, x]$ so that \begin{align} 0 &< \int_{1}^{x}\frac{1}{t}\,dt < \int_{1}^{x}t^{b - 1}\,dt\notag\\ \Rightarrow 0 &< \log x < \frac{x^{b} - 1}{b} < \frac{x^{b}}{b}\notag\\ \Rightarrow 0 &< \frac{\log x}{x^{a}} < \frac{1}{bx^{a - b}}\notag \end{align} Taking limits as $x \to \infty$ and noting that $a - b > 0$ we see that $$\lim_{x \to \infty}\dfrac{\log x}{x^{a}} = 0$$ The above proofs based on the definitions of the exponential and logarithm functions are not suitable for a beginner learning limits for the first time, because they depend on the higher-level concepts of the integral and derivative. These proofs also use properties of the exponential and logarithmic functions in an implicit fashion (which cannot be proved here because of the constraint of keeping the post to a reasonable length). However, once one has an understanding of the derivative and integral it is worth revisiting these proofs, which justify the use of the fundamental logarithmic and exponential limits in solving various limit problems.
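The inequality $\log x < x^{b}/b$ from the proof, and the resulting decay of $\log x / x^{a}$, can be observed directly; the exponents $a = 1/2$, $b = 1/4$ below are arbitrary choices for illustration:

```python
import math

# Check log x < x^b / b (for 0 < b < a, x > 1) and the consequent bound
# log(x)/x^a < 1/(b x^(a-b)), which forces log(x)/x^a -> 0 as x -> infinity.
a, b = 0.5, 0.25
for x in [1e2, 1e4, 1e8]:
    assert math.log(x) < x ** b / b
    assert math.log(x) / x ** a < 1.0 / (b * x ** (a - b))
```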

Before we proceed to discuss the justification of L'Hospital's Rule and power series expansions, we say a few words on the justification of limits dealing with $\infty$. Most of the thumb rules given earlier for dealing with $\infty$ can be proved easily if one uses the definitions of limits properly. For example, consider the rule which says that if $f(x) \to \infty$ and $g(x) \to \infty$ as $x \to a$ then $f(x) + g(x) \to \infty$ as $x \to a$. Clearly for any given $N > 0$ there is a deleted neighborhood of $a$ in which $f(x) > N$ and $g(x) > N$, so that $f(x) + g(x) > 2N > N$; hence $f(x) + g(x) \to \infty$. Readers should try to establish the other thumb rules dealing with $\infty$. In this connection it is important to note that $f(x) > N$ and $g(x) > N$ give no information about the values of $f(x) - g(x)$, and hence no thumb rule is provided for $f(x) - g(x)$ (or $f(x)/g(x)$).
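A small numerical sketch makes the last point concrete: with $g(x) = 1/x \to \infty$ as $x \to 0+$, three different choices of $f$ (my own examples, not from the post), all tending to $\infty$, give three different behaviors for $f(x) - g(x)$:

```python
# f -> oo and g -> oo determine nothing about f - g: with g(x) = 1/x,
# the choices f = 1/x, f = 1/x + 5 and f = 1/x^2 give differences
# tending to 0, to 5 and to infinity respectively as x -> 0+.

def g(x):
    return 1.0 / x

x = 1e-6
assert (1.0 / x) - g(x) == 0.0            # difference is identically 0
assert (1.0 / x + 5.0) - g(x) == 5.0      # difference is identically 5
assert (1.0 / x ** 2) - g(x) > 1e9        # difference blows up
```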

Proof of L'Hospital's Rule

To establish this rule we need to prove a fundamental result called Cauchy's Mean Value Theorem:
If $f(x), g(x)$ are continuous on $[a, b]$ and differentiable on $(a, b)$ and $f'(x), g'(x)$ never vanish for the same value of $x$ and $g(b) \neq g(a)$ then there is a $c \in (a, b)$ such that $$\frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(c)}{g'(c)}$$ This can be proved by using Rolle's theorem. Let $$h(x) = f(b) - f(x) - \frac{f(b) - f(a)}{g(b) - g(a)}\{g(b) - g(x)\}$$ then we have $h(a) = h(b)$ and hence by Rolle's theorem there is a $c \in (a, b)$ for which $h'(c) = 0$ i.e. $$f'(c) = \frac{f(b) - f(a)}{g(b) - g(a)}g'(c)$$ Now $g'(c) \neq 0$ otherwise $f'(c)$ would also be zero and simultaneous vanishing of $f'(x), g'(x)$ is not allowed. Hence dividing by $g'(c)$ our result follows.
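For a concrete instance of Cauchy's Mean Value Theorem, take $f(x) = x^{3}$ and $g(x) = x^{2}$ on $[1, 2]$ (an example of my own): here $f'(x)/g'(x) = 3x/2$, so the theorem's point $c$ can be solved for explicitly and checked to lie inside the interval:

```python
# Cauchy's MVT for f(x) = x^3, g(x) = x^2 on [1, 2]: the quotient
# (f(b)-f(a))/(g(b)-g(a)) equals 7/3, and f'(c)/g'(c) = 3c/2, so c = 14/9.
a, b = 1.0, 2.0
ratio = (b**3 - a**3) / (b**2 - a**2)   # (f(b)-f(a))/(g(b)-g(a)) = 7/3
c = 2.0 * ratio / 3.0                   # solve 3c/2 = ratio for c
assert a < c < b                        # c lies strictly inside (a, b)
assert abs(3.0 * c / 2.0 - ratio) < 1e-12
```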

Next we come to L'Hospital's Rule:
If $f(x), g(x)$ are differentiable in a certain neighborhood of $a$ (but not necessarily at $a$), $\lim_{x \to a}f(x) = \lim_{x \to a}g(x) = 0$ and $\lim_{x \to a}\dfrac{f'(x)}{g'(x)} = L$ then $\lim_{x \to a}\dfrac{f(x)}{g(x)} = L$.

We will consider the case of $x \to a+$ (case $x \to a-$ can be handled similarly). Let us define $f(a) = g(a) = 0$ so that $f(x), g(x)$ are continuous at $a$. Since $f'(x)/g'(x) \to L$ as $x \to a+$ there is a neighborhood of $a$ of type $(a, b]$ where $g'(x) \neq 0$. Now $f(x), g(x)$ are continuous on $[a, b]$ and differentiable in $(a, b)$. Also $g(a) \neq g(b)$ otherwise by Rolle's theorem $g'(x)$ would vanish somewhere in $(a, b)$. Hence by Cauchy's Mean Value theorem we have $$\frac{f(b)}{g(b)} = \frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(c)}{g'(c)}$$ for some $c \in (a, b)$. If $b \to a+$ then $c \to a+$ and hence $$\lim_{b \to a+}\frac{f(b)}{g(b)} = \lim_{c \to a+}\frac{f'(c)}{g'(c)} = L$$ and L'Hospital's Rule is established.
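As a worked numerical illustration of the rule (the example is mine), take $f(x) = 1 - \cos x$ and $g(x) = x^{2}$ at $a = 0$: both vanish at $0$, $f'(x)/g'(x) = \sin x/(2x) \to 1/2$, and indeed $f/g$ approaches the same value:

```python
import math

# L'Hospital's Rule on (1 - cos x)/x^2 at 0: the quotient of derivatives
# sin(x)/(2x) tends to 1/2, and f/g tracks it, as the rule asserts.
for x in [1e-1, 1e-2, 1e-3]:
    quotient = (1.0 - math.cos(x)) / (x * x)
    derivative_quotient = math.sin(x) / (2.0 * x)
    assert abs(quotient - 0.5) < x * x
    assert abs(derivative_quotient - 0.5) < x * x
```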

There is another version of L'Hospital's Rule which is not so widely known and we state (and prove) it below:
If $f(x), g(x)$ are differentiable in a certain neighborhood of $a$ (but not necessarily at $a$), $\lim_{x \to a}\dfrac{1}{g(x)} = 0$ (equivalently $|g(x)| \to \infty$ as $x \to a$) and $\lim_{x \to a}\dfrac{f'(x)}{g'(x)} = L$ then $\lim_{x \to a}\dfrac{f(x)}{g(x)} = L$.

Thus in order to apply this version of L'Hospital's Rule we need to check that $|g(x)| \to \infty$ as $x \to a$. No check apart from differentiability is needed for the function $f(x)$. We prove this rule using the $\epsilon, \delta$ definition of limit. Since $f'(x)/g'(x) \to L$ as $x \to a$, it follows that $f'(x)/g'(x)$ is bounded in a certain deleted neighborhood of $a$. Therefore there is a number $A > 0$ and a number $\delta_{1} > 0$ such that $$\left|\frac{f'(x)}{g'(x)}\right| < A$$ for all $x$ with $0 < |x - a| < \delta_{1}$. Let $\epsilon > 0$ be arbitrary. Then we know that there is a $\delta_{2} > 0$ such that $$\left|\frac{f'(x)}{g'(x)} - L\right| < \frac{\epsilon}{3}$$ for all $x$ with $0 < |x - a| < \delta_{2}$.

Let's consider the ratio $$\frac{f(x) - f(y)}{g(x) - g(y)}$$ where both $x, y$ are distinct points lying in deleted neighborhood $(a - \delta_{3}, a + \delta_{3}) - \{a\}$ of $a$ and $\delta_{3} = \min(\delta_{1}, \delta_{2})$. We can express this ratio as $$\frac{f(x) - f(y)}{g(x) - g(y)} = \dfrac{\dfrac{f(x)}{g(x)} - \dfrac{f(y)}{g(x)}}{1 - \dfrac{g(y)}{g(x)}}$$ and from this equation we obtain $$\frac{f(x)}{g(x)} = \frac{f(x) - f(y)}{g(x) - g(y)}\left(1 - \frac{g(y)}{g(x)}\right) + \frac{f(y)}{g(x)} = \frac{f'(c)}{g'(c)}\left(1 - \frac{g(y)}{g(x)}\right) + \frac{f(y)}{g(x)}$$ where $c$ is some number between $x$ and $y$. Let $y$ have a fixed value in the deleted neighborhood $(a - \delta_{3}, a + \delta_{3}) - \{a\}$ of $a$. Then we know that $g(y)/g(x) \to 0, f(y)/g(x) \to 0$ as $x \to a$. Hence there are positive numbers $\delta_{4}, \delta_{5}$ such that $$\left|\frac{g(y)}{g(x)}\right| < \frac{\epsilon}{3A}$$ for all $x$ with $0 < |x - a| < \delta_{4}$ and $$\left|\frac{f(y)}{g(x)}\right| < \frac{\epsilon}{3}$$ for all $x$ with $0 < |x - a| < \delta_{5}$. Let $\delta = \min(\delta_{3}, \delta_{4}, \delta_{5})$ and let $0 < |x - a| < \delta$ then we have \begin{align} \left|\frac{f(x)}{g(x)} - L\right| &= \left|\frac{f'(c)}{g'(c)}\left(1 - \frac{g(y)}{g(x)}\right) + \frac{f(y)}{g(x)} - L\right|\notag\\ &= \left|\frac{f'(c)}{g'(c)} - L - \frac{f'(c)}{g'(c)}\cdot\frac{g(y)}{g(x)} + \frac{f(y)}{g(x)}\right|\notag\\ &\leq \left|\frac{f'(c)}{g'(c)} - L\right| + \left|\frac{f'(c)}{g'(c)}\right|\left|\frac{g(y)}{g(x)}\right| + \left|\frac{f(y)}{g(x)}\right|\notag\\ &< \frac{\epsilon}{3} + A\cdot\frac{\epsilon}{3A} + \frac{\epsilon}{3}\notag\\ &= \epsilon\notag \end{align} It now follows that $f(x)/g(x) \to L$ as $x \to a$ and the proof of the second version of L'Hospital's Rule is complete. In both the versions of L'Hospital's Rule it is easy to prove that if $f'(x)/g'(x)$ tends to $\infty$ (or to $-\infty$) then so does $f(x)/g(x)$. 
Note however that if $f'(x)/g'(x)$ does not tend to a limit (that is, it oscillates finitely or infinitely) then we cannot conclude anything about the limit of $f(x)/g(x)$, and in this case L'Hospital's Rule does not apply.
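To see the second version at work, take $f(x) = \log(\sin x)$ and $g(x) = \log x$ as $x \to 0+$ (an example of my own choosing): $|g(x)| \to \infty$, no hypothesis on $f$ beyond differentiability is needed, and $f'(x)/g'(x) = x\cos x/\sin x \to 1$, so $f/g \to 1$:

```python
import math

# Second version of L'Hospital's Rule: g(x) = log x -> -oo as x -> 0+,
# f(x) = log(sin x); f'(x)/g'(x) = x cos(x)/sin(x) -> 1, hence f/g -> 1.
for x in [1e-2, 1e-4, 1e-6]:
    assert abs(math.log(math.sin(x)) / math.log(x) - 1.0) < 0.01
    assert abs(x * math.cos(x) / math.sin(x) - 1.0) < 0.01
```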

Proof of Taylor's Theorem

We will use L'Hospital's rule to prove a version of Taylor's theorem which forms the basis of the technique of using series expansions to evaluate certain limits:
If $f^{(n)}(a)$ exists then we have $$f(a + h) = f(a) + hf'(a) + \cdots + \frac{h^{n - 1}}{(n - 1)!}f^{(n - 1)}(a) + \frac{h^{n}}{n!}\{f^{(n)}(a) + \rho\}$$ where $\rho$ tends to $0$ with $h$.

Clearly this will be established if we show that $$\lim_{h \to 0}\dfrac{{\displaystyle f(a + h) - \sum_{k = 0}^{n - 1}\dfrac{h^{k}}{k!}f^{(k)}(a)}}{h^{n}} = \frac{f^{(n)}(a)}{n!}$$ Using L'Hospital's rule repeatedly we see that \begin{align} \lim_{h \to 0}\dfrac{{\displaystyle f(a + h) - \sum_{k = 0}^{n - 1}\dfrac{h^{k}}{k!}f^{(k)}(a)}}{h^{n}} &= \lim_{h \to 0}\dfrac{{\displaystyle f'(a + h) - \sum_{k = 1}^{n - 1}\dfrac{h^{k - 1}}{(k - 1)!}f^{(k)}(a)}}{nh^{n - 1}}\notag\\ &= \lim_{h \to 0}\dfrac{{\displaystyle f''(a + h) - \sum_{k = 2}^{n - 1}\dfrac{h^{k - 2}}{(k - 2)!}f^{(k)}(a)}}{n(n - 1)h^{n - 2}}\notag\\ &= \lim_{h \to 0}\dfrac{{\displaystyle f^{(n - 1)}(a + h) - \sum_{k = n - 1}^{n - 1}\dfrac{h^{k - n + 1}}{(k - n + 1)!}f^{(k)}(a)}}{n(n - 1)\cdots 3\cdot 2\cdot h}\notag\\ &= \lim_{h \to 0}\dfrac{f^{(n - 1)}(a + h) - f^{(n - 1)}(a)}{n! h}\notag\\ &= \frac{f^{(n)}(a)}{n!}\notag \end{align} It is important to note that in each application of L'Hospital's rule the conditions for its applicability are satisfied. Note also that the final step does not use the rule at all: it is simply the definition of $f^{(n)}(a)$ as the limit of the difference quotient of $f^{(n - 1)}$, which is the appropriate tool here since only the existence of $f^{(n)}$ at the single point $a$ is assumed.
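The conclusion that $\rho \to 0$ with $h$ can be observed numerically. Taking $f = \exp$, $a = 0$ and $n = 3$ (an arbitrary test case of mine), Taylor's theorem reads $e^{h} = 1 + h + h^{2}/2 + (h^{3}/6)(1 + \rho)$, and $\rho$ can be solved for:

```python
import math

# Peano-form Taylor remainder for f = exp at a = 0 with n = 3:
# e^h = 1 + h + h^2/2 + (h^3/6)(1 + rho); solve for rho and watch it
# shrink with h (for the exponential, rho is roughly h/4).
for h in [1e-1, 1e-2, 1e-3]:
    poly = 1.0 + h + h * h / 2.0
    rho = (math.exp(h) - poly) * 6.0 / h**3 - 1.0
    assert abs(rho) < h
```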

To conclude this series of posts on the techniques of evaluation of limits and their justification, I want to offer a few remarks. Most of the time the evaluation of a limit requires manipulating the expression under consideration using various algebraic, trigonometric, exponential or logarithmic identities. The goal of these manipulations is to reduce the expression to a form which can take advantage of the standard limit formulas and rules. Sometimes one has to take the route of inequalities and the Sandwich theorem to establish a limit (as we did for the trigonometric limit).

If the manipulations don't lead to a form which allows us to use the standard limits, we check whether the expression meets the conditions of L'Hospital's rule; if so, the rule is applied in the hope that the resulting expression will be simpler than the original one and amenable to evaluation by standard limits. As a last resort we may have to use series expansions, which are highly powerful but require some skill in the manipulation of power series. Once the reader has gained familiarity with enough limit problems it is instructive to approach theoretical problems (such as this one from MSE) which require using the definition of limits in innovative ways.

In this series of posts I have deliberately not dealt with limits of sequences, as they are not part of introductory calculus and require theoretical tools of a different nature. However some limits of this type can be handled via the following obvious result: if $\lim_{x \to \infty}f(x) = L$ then $\lim_{n \to \infty}f(n) = L$, where $n$ takes integer values only.


Comments


  1. Is it right to assume that $\epsilon$ will be the same for both functions? I think it need not be the same. Please correct me.

  2. @Rayman Vig,
    Remember that $\epsilon$ is an arbitrary positive number and hence it can be taken to be the same for both functions under consideration. It is the $\delta$ which will differ between the two functions.