
Maxima and Minima


“The art of doing mathematics consists in finding that special case which contains all the germs of generality” – David Hilbert

Introduction:

In this article, we are going to learn about the concept of Maxima and Minima, which plays a significant role in Deep Learning. We will also learn about slopes and use them to find the maxima and minima of a function.

Slopes:

Let’s assume a function y = f(x) as shown in the figure below:

Let’s take two points x_1 and x_2 on the x-axis and observe how the values f(x_1) and f(x_2) of the function f(x) change as we change the input. We can observe the situation in the animation below:


We can clearly see that if we increase the input, the output also increases, so let’s define \Delta{x} = x_2 - x_1 and \Delta{y} = f(x_2) - f(x_1). Now,

    \[rate\ of\ change = \lim_{\Delta{x} \to 0} \frac{\Delta{y}}{\Delta{x}} = \lim_{\Delta{x} \to 0}\frac{f(x_2) - f(x_1)}{x_2 - x_1}\]

Since f(x_2) - f(x_1) > 0, the rate\ of\ change is positive; in other words, the slope of the function as \Delta{x} \to 0 is positive, as the function increases on a positive nudge to the input.
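To make this limit concrete, here is a minimal Python sketch that approximates \Delta{y}/\Delta{x} for a shrinking \Delta{x}. Since the article’s f(x) is only shown graphically, the sketch assumes f(x) = x^2 (which is increasing for x > 0) purely for illustration:

    # Finite-difference approximation of the rate of change at x1.
    # Assumption: f(x) = x**2 stands in for the article's graphed f(x).
    def rate_of_change(f, x1, dx):
        return (f(x1 + dx) - f(x1)) / dx

    f = lambda x: x ** 2

    # As dx shrinks toward 0, the ratio approaches the true slope at x = 1.5,
    # which is 2 * 1.5 = 3 (positive, matching the increasing case above).
    for dx in (1.0, 0.1, 0.01, 0.001):
        print(dx, rate_of_change(f, 1.5, dx))   # 4.0, 3.1, 3.01, 3.001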

Let’s observe another situation with a different function f_1(k), as shown below:


Again, let’s repeat the same experiment on this function by taking two points k_1 and k_2 on the x-axis and observing the rate of change of the function f_1(k):


We can notice that if we increase the input, the output decreases, so let’s define \Delta{k} = k_2 - k_1 and \Delta{h} = f_1(k_2) - f_1(k_1). Now,

    \[rate\ of\ change = \lim_{\Delta{k} \to 0} \frac{\Delta{h}}{\Delta{k}} = \lim_{\Delta{k} \to 0}\frac{f_1(k_2) - f_1(k_1)}{k_2 - k_1}\]

Since f_1(k_2) - f_1(k_1) < 0, the rate\ of\ change is negative; in other words, the slope of the function as \Delta{k} \to 0 is negative, as the function decreases on a positive nudge to the input.
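The same numerical check works for the decreasing case. The article’s f_1(k) is also only shown graphically, so the sketch assumes f_1(k) = 4 - k^2, which decreases for k > 0:

    # Assumption: f_1(k) = 4 - k**2 stands in for the article's graphed f_1(k).
    f1 = lambda k: 4 - k ** 2

    k1 = 1.0
    for dk in (1.0, 0.1, 0.01, 0.001):
        dh = f1(k1 + dk) - f1(k1)     # Δh = f_1(k_2) - f_1(k_1)
        print(dk, dh / dk)            # ratios are negative, approaching -2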

There are also points where f(x_1) = f(x_2) \implies f(x_2) - f(x_1) = 0, i.e. points where the function doesn’t change when we nudge the input and stays at a constant value, so the rate\ of\ change = 0, or equivalently the derivative is 0. In the function f_1(k), let’s zoom in to a specific portion and observe the rate\ of\ change:


We can observe that, while nudging the input, the function stays almost the same in that portion and barely changes. A point where \Delta{k} \to 0 and f_1(k_2) = f_1(k_1) is exactly where a Maxima or a Minima can occur. In other words, the point(s) where the derivative/slope of a function becomes 0 are the candidates for Maxima or Minima (a zero slope can also occur at a point of inflection, such as x = 0 for f(x) = x^3, so each candidate still needs to be checked).
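Keeping the assumed f_1(k) = 4 - k^2 from the previous sketch (it peaks at k = 0), we can check numerically that the slope vanishes at that point:

    # Assumption: f_1(k) = 4 - k**2, whose maximum is at k = 0.
    f1 = lambda k: 4 - k ** 2

    for dk in (0.1, 0.01, 0.001):
        slope = (f1(0.0 + dk) - f1(0.0)) / dk
        print(dk, slope)   # -0.1, -0.01, -0.001 -> slope tends to 0 at the maximum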

Let’s take an example, find the points where the derivative of the function is 0, and visualize whether a Maxima/Minima exists at those points:

Let f(x) = x^3 - 3x + 1.

    \[\frac{dy}{dx}\ OR\ f^\prime (x) = 3x^2 - 3\]

We know that, to determine the Maxima/Minima, the derivative of the function should be 0 at some point:

    \[\frac{dy}{dx} = 0 \implies 3x^2 - 3 = 0\]

    \[OR\ \ \ 3(x^2 - 1) = 0\]

    \[OR\ \ \ x^2 = 1 \implies x = \pm 1\]

The values of x where f^\prime(x) = 0 are -1 and 1, so there should be a Maxima/Minima at x = -1 and at x = 1, as you can observe in the figure below.
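As a quick check of the algebra, here is a short SymPy sketch that recomputes f^\prime(x), solves f^\prime(x) = 0, and evaluates f at the two critical points:

    import sympy as sp

    x = sp.symbols('x')
    f = x ** 3 - 3 * x + 1

    f_prime = sp.diff(f, x)           # 3*x**2 - 3
    critical = sp.solve(f_prime, x)   # [-1, 1]

    for c in critical:
        print(c, f.subs(x, c))        # f(-1) = 3, f(1) = -1

Consistent with the figure, the function reaches a local maximum value of 3 at x = -1 and a local minimum value of -1 at x = 1.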

So, let’s conclude this article here. In the next article, we will explore the methods through which we can determine whether a point of Maxima or Minima exists for a function, and we will also explore the case where the derivative of the function is not defined.
