
THE GEOMETRY OF VECTOR CALCULUS

Section 10.1 Optimization

Before considering functions of several variables, let us first review how to find maxima and minima for functions of one variable. Recall that a local max/min can only occur at a critical point, where the derivative either vanishes or is undefined. The second derivative test can help to determine whether a critical point is a max or a min: If the second derivative is positive or negative, then the graph is concave up or down, respectively, and the critical point is a local min or a local max, respectively. However, if the second derivative vanishes, anything can happen.
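As a quick single-variable illustration, consider \(f(x) = x^3 - 3x\text{.}\) Then
\begin{equation*} f'(x) = 3x^2 - 3 , \qquad f''(x) = 6x , \end{equation*}
so the critical points are \(x=\pm 1\text{;}\) since \(f''(1)=6>0\) and \(f''(-1)=-6<0\text{,}\) there is a local min at \(x=1\) and a local max at \(x=-1\text{.}\) By contrast, \(x^4\text{,}\) \(-x^4\text{,}\) and \(x^3\) all have vanishing second derivative at the origin, yet have a local min, a local max, and neither there, respectively.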
Now imagine the graph of a function of two variables. At a local minimum, you would expect the slope to be zero, and the graph to be concave up in all directions. A typical example would be
\begin{equation*} f(x,y) = x^2 + y^2 \end{equation*}
at the origin, as shown in Figure 10.2.1. Similarly, at a local maximum, the slope would again be zero but the graph would be concave down in all directions; a typical example would be
\begin{equation*} g(x,y) = -x^2 - y^2 \end{equation*}
again at the origin, as shown in Figure 10.2.2. But now there is another possibility. Consider the function
\begin{equation*} h(x,y) = x^2 - y^2 \end{equation*}
whose graph still has slope zero at the origin, but which is concave up in the \(x\)-direction, yet concave down in the \(y\)-direction; this is a saddle point, as shown in Figure 10.2.3.
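We can verify this behavior directly from the partial derivatives of \(h\text{:}\) both first partials vanish at the origin, while the unmixed second partials have opposite signs,
\begin{equation*} \frac{\partial h}{\partial x} = 2x , \qquad \frac{\partial h}{\partial y} = -2y , \qquad \frac{\partial^2 h}{\partial x^2} = 2 > 0 , \qquad \frac{\partial^2 h}{\partial y^2} = -2 < 0 , \end{equation*}
confirming that the graph has slope zero at the origin, is concave up in the \(x\)-direction, and is concave down in the \(y\)-direction.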
Motivated by the above examples, we say that a critical point of a function \(f\) of several variables occurs where all of the partial derivatives of \(f\) either vanish or are undefined.
Critical points of a function \(f(x,y)\) of two variables can be classified using the second derivative test, which now takes the following form. Let
\begin{align*} D &= \frac{\partial^2 f}{\partial x^2}\frac{\partial^2 f}{\partial y^2} - \left(\frac{\partial^2 f}{\partial x\partial y}\right)^2\\ A &= \frac{\partial^2 f}{\partial x^2} \end{align*}
evaluated at a critical point \(P=(a,b)\text{.}\) Then
  • If \(D>0\) and \(A>0\text{,}\) then \(f(a,b)\) is a local min.
  • If \(D>0\) and \(A<0\text{,}\) then \(f(a,b)\) is a local max.
  • If \(D<0\text{,}\) then there is a saddle point at \(P\text{.}\)
  • If \(D=0\text{,}\) anything can happen.
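As a check, we can apply this test to the three examples above. For \(f(x,y)=x^2+y^2\) we have
\begin{equation*} D = (2)(2) - 0^2 = 4 > 0 , \qquad A = 2 > 0 , \end{equation*}
so the origin is a local min; for \(g(x,y)=-x^2-y^2\) we have \(D=4>0\) but \(A=-2<0\text{,}\) giving a local max; and for \(h(x,y)=x^2-y^2\) we have \(D=(2)(-2)-0^2=-4<0\text{,}\) giving a saddle point.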
Optimization problems typically seek a global max/min, rather than a local max/min. Just as for functions of one variable, in addition to finding the critical points, one must also examine the boundary. Thus, to optimize a function \(f\) of several variables, one must:
  • Find the critical points.
  • Find any critical points of the function restricted to the boundary.
  • Evaluate \(f\) at each of these points to find the global max/min.
The middle step is effectively a single-variable optimization problem, which may require considering different pieces of the boundary separately, together with any “corner points”.
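As a brief illustration of this procedure (a supplementary example), consider finding the largest and smallest values of \(F(x,y)=x+y\) on the closed unit disk \(x^2+y^2\le 1\text{.}\) The partial derivatives of \(F\) are both equal to \(1\) and never vanish, so there are no interior critical points, and the global max and min must occur on the boundary circle. Parametrizing the boundary as \(x=\cos t\text{,}\) \(y=\sin t\) reduces the problem to a single variable:
\begin{equation*} F(\cos t,\sin t) = \cos t + \sin t = \sqrt{2}\,\sin\!\left(t+\frac{\pi}{4}\right) , \end{equation*}
whose maximum value \(\sqrt{2}\) occurs at \(\left(\tfrac{1}{\sqrt2},\tfrac{1}{\sqrt2}\right)\) and whose minimum value \(-\sqrt{2}\) occurs at \(\left(-\tfrac{1}{\sqrt2},-\tfrac{1}{\sqrt2}\right)\text{.}\) Since the boundary here is a single smooth closed curve, there are no corner points to check.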
As an example, consider the function \(f(x,y)=xy\text{.}\) Where are the critical points? Where the partial derivatives of \(f\) vanish. We have
\begin{equation} \frac{\partial f}{\partial x} = y ;\qquad \frac{\partial f}{\partial y} = x\tag{10.1.1} \end{equation}
so the only critical point occurs where \(x=0=y\text{,}\) that is, at the origin. We compute second derivatives and evaluate them at the origin, obtaining
\begin{equation} \frac{\partial^2 f}{\partial x^2}\Bigg|_{(0,0)} = 0 = \frac{\partial^2 f}{\partial y^2}\Bigg|_{(0,0)} ; \qquad \frac{\partial^2 f}{\partial x\partial y}\Bigg|_{(0,0)} = 1\tag{10.1.2} \end{equation}
so that in this case
\begin{equation} D = 0 - 1 = -1 < 0\tag{10.1.3} \end{equation}
which implies that \((0,0)\) is a saddle point of \(f\text{.}\)
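We can confirm this conclusion directly: along the line \(y=x\) and along the line \(y=-x\) we have
\begin{equation*} f(x,x) = x^2 \ge 0 , \qquad f(x,-x) = -x^2 \le 0 , \end{equation*}
so arbitrarily close to the origin \(f\) takes both positive and negative values, and \((0,0)\) is neither a local max nor a local min.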