# Monotone Functions: Part 1

### Introduction

A few posts ago we discussed continuous functions and their properties. In this series of posts we discuss another class of functions, namely the monotone functions, and their extensions. The word "monotone" crudely suggests that these functions have a single tone, which translates roughly to "variation in a single direction". In other words, such functions either increase all the time or decrease all the time.

### Definitions

More formally we adopt the following definitions:
Let $f$ be a function defined in an interval $I$. If for any points $a, b \in I$ with $a < b$ we have $f(a) \leq f(b)$, we say that the function $f$ is increasing in the interval $I$. If on the other hand for any points $a, b \in I$ with $a < b$ we have $f(a) \geq f(b)$, we say that the function $f$ is decreasing in the interval $I$.
If a function is either increasing or decreasing in an interval $I$ we say the function is monotone in interval $I$.

In the above definitions if we replace the inequalities regarding the values of the function by their strict versions then we get the strictly increasing and strictly decreasing functions.
Let $f$ be a function defined in an interval $I$. If for any points $a, b \in I$ with $a < b$ we have $f(a) < f(b)$, we say that the function $f$ is strictly increasing in the interval $I$. If on the other hand for any points $a, b \in I$ with $a < b$ we have $f(a) > f(b)$, we say that the function $f$ is strictly decreasing in the interval $I$.
If a function is either strictly increasing or strictly decreasing in an interval $I$ we say the function is strictly monotone in $I$.

In other words, increasing functions preserve an inequality when applied to both of its sides, while decreasing functions reverse it.

It is easy to observe that the graph of an increasing function rises as we move to the right, while that of a decreasing function falls.

*Figure: Graphs of Monotone Functions*

In the above figure $y = f(x)$ is an increasing function and $y = g(x)$ is a decreasing function; in fact they are monotone in the strict sense. Simple examples of such functions can be constructed using polynomials and their reciprocals. For example $f(x) = x$ is increasing on $\mathbb{R}$, while $g(x) = x^{2}$ is increasing on $[0, \infty)$ and decreasing on $(-\infty, 0]$. Some more examples are below:
- $\sin x$ is increasing on $[-\pi/2, \pi/2]$
- $\cos x$ is decreasing on $[0, \pi]$
- $1/x$ is decreasing on $(0, \infty)$
- $[x]$ is increasing on $\mathbb{R}$, but not in the strict sense
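As a quick sanity check, the monotonicity of these examples can be tested numerically on a grid of sample points (a finite sampled check, of course, not a proof; the helper functions below are our own):

```python
import math

def is_increasing(f, xs):
    """Sampled check: f(a) <= f(b) for every consecutive pair a < b in xs."""
    return all(f(a) <= f(b) for a, b in zip(xs, xs[1:]))

def is_decreasing(f, xs):
    """Sampled check: f(a) >= f(b) for every consecutive pair a < b in xs."""
    return all(f(a) >= f(b) for a, b in zip(xs, xs[1:]))

sin_pts = [i / 100 for i in range(-157, 158)]   # grid inside [-pi/2, pi/2]
cos_pts = [i / 100 for i in range(0, 315)]      # grid inside [0, pi]
inv_pts = [i / 100 for i in range(1, 500)]      # grid inside (0, 5)
flr_pts = [i / 10 for i in range(-50, 51)]      # grid inside [-5, 5]

print(is_increasing(math.sin, sin_pts))         # True
print(is_decreasing(math.cos, cos_pts))         # True
print(is_decreasing(lambda x: 1 / x, inv_pts))  # True
print(is_increasing(math.floor, flr_pts))       # True (though not strictly)
```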

The examples above are simple enough that their monotonicity can be determined directly from the definitions, but how do we determine the monotonicity of a function given by a more complicated formula, for example $f(x) = (x + 2)^{3}(x - 3)^{3}$? To that end we need to examine the concept of monotonicity in more detail.

### Monotonicity at a Point

Till now we have defined monotone functions on an interval and this makes monotonicity a global property applicable to a continuous range of values. Is it possible to define it as a local property valid in some neighborhood of a point? It turns out that there is a possible definition in the local sense.
Let $f$ be a function defined in a certain neighborhood $I$ of a point $a$. If $x \in I, x < a$ implies that $f(x) \leq f(a)$ and $x \in I, x > a$ implies that $f(x) \geq f(a)$, then we say that the function $f$ is increasing at the point $a$. If only the first condition is satisfied (case $x < a$) we say that the function is increasing from the left at the point $a$, and if only the second condition (case $x > a$) is satisfied we say that the function is increasing from the right at the point $a$.

Thus if $f$ is increasing at the point $a$ then the values of $f$ to the right of $a$ are greater than or equal to $f(a)$ and the values of $f$ to the left of $a$ are less than or equal to $f(a)$. Note that this does not mean that the function $f$ is increasing in the neighborhood $I$ of $a$. An actual example which clearly demonstrates this point, and reflects explicitly the difference between monotonicity at a point and monotonicity in an interval, will be presented at the end of this post.

Like monotonicity in an interval, there are definitions in the stricter sense for monotonicity at a point. Thus if the values of $f$ to the right of $a$ are greater than $f(a)$ and its values to the left of $a$ are less than $f(a)$, we say that the function $f$ is strictly increasing at the point $a$. The reader can frame corresponding definitions of functions decreasing / strictly decreasing at a given point.
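The definition of monotonicity at a point also lends itself to a direct numerical check (again a sampled check on a few values of $h$, not a proof; the helper name is our own). Note how $|x|$ fails to be increasing at $0$ even though it is increasing to the right of $0$:

```python
def increasing_at(f, a, hs=(0.5, 0.1, 0.01, 0.001)):
    """Sampled check of 'increasing at a': f(a-h) <= f(a) <= f(a+h) for small h > 0."""
    return all(f(a - h) <= f(a) <= f(a + h) for h in hs)

print(increasing_at(lambda x: x ** 3, 0.0))  # True: x^3 is increasing at 0
print(increasing_at(abs, 0.0))               # False: left of 0, |x| > |0|
```

So $|x|$ is increasing from the right at $0$, but not increasing at $0$.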

Studying the behavior of a function at a point is normally simpler than studying its behavior in an interval, and that is why we have defined the concept of monotonicity at a point. But this helps only if we can infer the global properties of a function in an interval from its local properties at points of the interval. Fortunately for monotonicity this holds true, i.e. we can infer monotonicity in an interval from monotonicity at the points of the interval.
Let a function $f$ be defined in an interval $I$ such that $f$ is increasing at every point of the interval $I$. Then $f$ is increasing in interval $I$.

Here we have to understand that if the interval contains its left hand end point then we cannot discuss the values of the function to the left of that end point, and hence in this case we require only that the function be increasing from the right at the left hand end point of the interval. Corresponding remarks apply in case the interval contains its right hand end point.

Let $a, b \in I$ with $a < b$. We shall prove that $f(a) \leq f(b)$. Dedekind's theorem is the natural tool here; we apply it not to all the real numbers but only to those belonging to the interval $[a, b]$. We define two sets $L, U$ as follows: put $a \in L$, and put a point $x$ of the interval $(a, b]$ in $L$ if the values of the function $f$ on the interval $[a, x]$ are all greater than or equal to $f(a)$. The remaining points of $[a, b]$ are put in the set $U$. We will show that the set $U$ is empty, so that $L = [a, b]$ and hence $f(a) \leq f(b)$.

On the contrary, let us assume that $U$ is non-empty. Then the conditions of Dedekind's theorem are satisfied (readers should find this easy to verify) and hence there is a number $\alpha \in [a, b]$ such that the points of $[a, b]$ lying to the left of $\alpha$ are in $L$ and the points of $[a, b]$ lying to the right of $\alpha$ are in $U$. Since the function $f$ is increasing at the point $a$, at points sufficiently close to and to the right of $a$ the values of $f$ are greater than or equal to $f(a)$. Hence these points belong to $L$, and therefore $\alpha > a$. It should be clear that $f(a) \leq f(\alpha)$: a point $c$ sufficiently close to $\alpha$ and less than $\alpha$ belongs to $L$, so that $f(a) \leq f(c)$, and since $f$ is increasing at $\alpha$ it follows that $f(c) \leq f(\alpha)$. This also shows that $\alpha \in L$.

If $\alpha < b$ then, since the function $f$ is increasing at $\alpha$, there are points $c$ to the right of $\alpha$ and sufficiently close to $\alpha$ such that $f(\alpha) \leq f(c)$, and thus $f(a) \leq f(c)$, so all these points must belong to $L$. But they should also belong to $U$ as they lie to the right of $\alpha$. Thus we cannot have $\alpha < b$; therefore $\alpha = b$, so that $b \in L$ and $U$ is empty.

Since the points $a, b \in I$ were arbitrary subject to $a < b$, it follows that the function $f$ is increasing in the interval $I$. This result can also be established using the Heine-Borel Principle, which is specifically designed to deduce global properties from local properties, and the reader should find it a good exercise to supply such a proof.

There are corresponding results for decreasing functions and their stricter cousins, for which the reader can supply proofs by changing inequalities (either reversing them or making them strict) in the above argument.

### Conditions for Monotonicity at a Point

Now the question arises: how do we check for monotonicity at a point? Let's return to the definitions and translate them into symbols. Let $f$ be increasing at a point $a$. Then we have $$f(a - h) \leq f(a),\,\, f(a) \leq f(a + h)$$ for all positive values of $h$ sufficiently close to zero. Thus $$\frac{f(a - h) - f(a)}{-h} \geq 0,\,\, \frac{f(a + h) - f(a)}{h} \geq 0$$ for all sufficiently small positive values of $h$.

If the function $f$ is differentiable at point $a$ i.e. $f'(a)$ exists then by taking limits of the above inequalities as $h \to 0+$ we see that we must have $f'(a) \geq 0$. So we have the following result:
If a function $f$ is increasing at a point $a$ and $f'(a)$ exists then $f'(a) \geq 0$.

Similarly
If a function $f$ is decreasing at point $a$ and $f'(a)$ exists then $f'(a) \leq 0$.

Note that the stricter versions of the above results don't lead to strict inequalities for the derivative $f'(a)$, because taking limits weakens a strict inequality to a non-strict one. To highlight this we explicitly state the results for the stricter versions, even though it leads to some amount of repetition.

If a function $f$ is strictly increasing at a point $a$ and $f'(a)$ exists then $f'(a) \geq 0$.
If a function $f$ is strictly decreasing at a point $a$ and $f'(a)$ exists then $f'(a) \leq 0$.

In most calculus textbooks this result is presented wrongly. Ignoring the fact that the operation of taking limits weakens an inequality, these books state the above results with strict inequality for the derivative. It is important to convince ourselves of these results with concrete examples. Clearly $f(x) = x^{3}$ is strictly increasing at $0$, but the derivative $f'(0) = 0$; the strictly decreasing case can be demonstrated by $f(x) = -x^{3}$.
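We can confirm both facts about $f(x) = x^{3}$ numerically: the symmetric difference quotient at $0$ vanishes, yet the values of $f$ straddle $f(0)$ in the right order (the step sizes below are arbitrary choices):

```python
def f(x):
    return x ** 3

# Symmetric difference quotient approximates f'(0).
h = 1e-6
deriv_at_0 = (f(h) - f(-h)) / (2 * h)   # equals h^2 = 1e-12 here
print(abs(deriv_at_0) < 1e-9)           # True: f'(0) = 0

# Yet f is strictly increasing at 0: f(-t) < f(0) < f(t) for small t > 0.
print(all(f(-t) < f(0) < f(t) for t in (0.5, 0.1, 0.01, 0.001)))  # True
```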

The above conditions are necessary but not sufficient. The examples above show that if the derivative at a point is zero, the function may be strictly increasing at that point, strictly decreasing at that point, or indeed constant at that point (same values to the left and right of the point under consideration, as can be seen by taking $f(x) = 1$). However if we exclude the case of the derivative being zero, we get sufficient conditions for monotonicity at a point.
If $f'(a) > 0$ then the function $f$ is strictly increasing at point $a$.
If $f'(a) < 0$ then the function $f$ is strictly decreasing at point $a$.

This is easy to establish. If $f'(a) > 0$ then the ratio $\{f(a + h) - f(a)\}/h$ must remain positive for all nonzero values of $h$ sufficiently close to zero. If $h$ is positive this shows that $f(a + h) > f(a)$, and if $h$ is negative it shows that $f(a + h) < f(a)$. Thus $f$ is strictly increasing at $a$. The case $f'(a) < 0$ can be handled similarly.
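A small numerical illustration of this sufficient condition, using $f(x) = x^{2}$ (the helper function and step sizes are our own choices): at $x = 1$ we have $f'(1) = 2 > 0$ and the function is indeed strictly increasing there, while at $x = -1$ we have $f'(-1) = -2 < 0$ and the check fails:

```python
def strictly_increasing_at(f, a, hs=(0.5, 0.1, 0.01, 0.001)):
    """Sampled check: f(a-h) < f(a) < f(a+h) for small h > 0."""
    return all(f(a - h) < f(a) < f(a + h) for h in hs)

square = lambda x: x * x   # f(x) = x^2, with f'(x) = 2x

print(strictly_increasing_at(square, 1.0))   # True:  f'(1) = 2 > 0
print(strictly_increasing_at(square, -1.0))  # False: f'(-1) = -2 < 0
```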

### Condition for Monotonicity in an Interval

Because monotonicity at all points of an interval leads to monotonicity in the interval, the above results can at once be used to establish the monotonicity in an interval. Therefore we have
If $f'(x) > 0$ for all values of $x$ in an interval $I$, then the function $f$ is strictly increasing in interval $I$.
If $f'(x) < 0$ for all values of $x$ in an interval $I$, then the function $f$ is strictly decreasing in interval $I$.

Note that if the interval contains its left hand end point then we must consider the right hand derivative at that end point. A corresponding remark applies if the interval contains its right hand end point. In these cases, if we assume that the function is continuous at the end points of the interval, then we don't even need to assume the existence of the one-sided derivatives there. This is easy to see.

Let's assume that the derivative $f'(x) > 0$ in the interior of the interval $I$ and let $a \in I$ be the left hand end point of $I$. We claim that $f(a) < f(x)$ for every interior point $x \in I$. Given such an $x$, we can find interior points $y, z \in I$ such that $a < z < y < x$. Then $f(z) < f(y) < f(x)$. Keeping $x, y$ fixed and letting $z \to a+$, continuity gives $f(a) \leq f(y) < f(x)$. So we finally arrive at the following theorems, which are the best we can hope for using the concept of derivatives:
If $f'(x) > 0$ at all interior points of an interval $I$ and $f$ is continuous in $I$ then $f$ is strictly increasing in $I$.
If $f'(x) < 0$ at all interior points of an interval $I$ and $f$ is continuous in $I$ then $f$ is strictly decreasing in $I$.
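These theorems finally let us settle the example $f(x) = (x + 2)^{3}(x - 3)^{3}$ raised earlier. Differentiating by the product rule and factoring gives $f'(x) = 3(x + 2)^{2}(x - 3)^{2}(2x - 1)$, whose sign is that of $2x - 1$ apart from isolated zeros at $x = -2$ and $x = 3$. So $f$ is decreasing on $(-\infty, 1/2]$ and increasing on $[1/2, \infty)$ (applying the theorems on the subintervals separated by the isolated zeros). A quick sampled verification of this sign pattern (the sample ranges are arbitrary):

```python
def fprime(x):
    # Hand-computed derivative of f(x) = (x + 2)^3 (x - 3)^3, factored:
    # f'(x) = 3 (x + 2)^2 (x - 3)^2 (2x - 1)
    return 3 * (x + 2) ** 2 * (x - 3) ** 2 * (2 * x - 1)

left = [i / 10 for i in range(-60, 5)]    # samples below x = 1/2
right = [i / 10 for i in range(6, 60)]    # samples above x = 1/2

print(all(fprime(x) <= 0 for x in left))   # True: f decreasing left of 1/2
print(all(fprime(x) >= 0 for x in right))  # True: f increasing right of 1/2
print(fprime(0.5))                         # 0.0: derivative vanishes at x = 1/2
```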

Now that we have understood the concepts of monotonicity at a point and in an interval, and also know the conditions that guarantee monotonicity, we will present an example which demonstrates the difference between monotonicity at a point and monotonicity in an interval. This example is taken from my favorite book "A Course of Pure Mathematics" by G. H. Hardy.

Let $f(x) = ax + x^{2}\sin(1/x)$ for $x \neq 0$ and $f(0) = 0$. Then the function $f$ is differentiable everywhere with $f'(0) = a$. Let $a > 0$ so that $f'(0) > 0$. Then the function $f$ is strictly increasing at the point $x = 0$. If $x \neq 0$ then $f'(x) = a + 2x\sin(1/x) - \cos(1/x)$. As $x \to 0$ clearly $2x\sin(1/x) \to 0$, but $\cos(1/x)$ oscillates between $-1$ and $1$, and therefore the derivative $f'(x)$ oscillates between values near $a - 1$ and $a + 1$ as $x \to 0$. If $0 < a < 1$ then $a - 1 < 0$, and hence $f'(x)$ takes negative values at infinitely many points arbitrarily close to zero. Therefore even though $f'(0) = a > 0$, every neighborhood of $0$ contains points where the derivative $f'(x)$ is negative, and hence there is no neighborhood of $0$ in which the function $f$ is increasing. Thus if a function $f$ is increasing at some point, it does not necessarily follow that it is increasing in a neighborhood of that point.
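Hardy's example can also be checked numerically. Taking $a = 1/2$ (any $0 < a < 1$ works), at the points $x_k = 1/(2\pi k)$ we have $\cos(1/x_k) = 1$ and $\sin(1/x_k) = 0$, so $f'(x_k) \approx a - 1 < 0$ no matter how close $x_k$ gets to $0$:

```python
import math

a = 0.5  # any value with 0 < a < 1 exhibits the phenomenon

def fprime(x):
    # derivative of f(x) = a*x + x^2*sin(1/x) at x != 0
    return a + 2 * x * math.sin(1 / x) - math.cos(1 / x)

# At x_k = 1/(2*pi*k): cos(1/x_k) = 1 and sin(1/x_k) = 0, so
# f'(x_k) = a - 1 = -0.5 < 0 at points arbitrarily close to 0.
points = [1 / (2 * math.pi * k) for k in (1, 10, 100, 1000)]
print(all(fprime(x) < 0 for x in points))  # True, even though f'(0) = a > 0
```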

The results in this post have been obtained by first establishing the monotonicity at each point of an interval and then using Dedekind's theorem (or Heine Borel Principle) to transfer this property to the entire interval itself. There is another way to approach these results which uses standard theorems from differential calculus. This approach we discuss in the next post.