Derivative test
In calculus, derivative tests are analytical methods used to classify the critical points of a differentiable function, determining whether they represent local maxima, local minima, or points of inflection by examining the behavior of the function's first, second, or higher-order derivatives.[1] These tests are fundamental tools in optimization and curve sketching, relying on the signs of derivatives and their changes to infer the function's monotonicity and concavity without evaluating the function at many points.[2]

The first derivative test examines the sign of f'(x) around a critical point c, a point where f'(c) = 0 or f'(c) is undefined. If f'(x) > 0 for x < c (near c) and f'(x) < 0 for x > c (near c), then f has a local maximum at c; conversely, if f'(x) < 0 for x < c and f'(x) > 0 for x > c, then f has a local minimum at c. If f'(x) does not change sign at c, then f has neither a local maximum nor a local minimum there.[3] The test requires only that f be continuous at c and differentiable on an open interval around c (not necessarily at c itself), and it is particularly useful when higher derivatives are difficult to compute.[4]

The second derivative test provides a quicker alternative by evaluating the second derivative f''(c) at the critical point c. If f''(c) > 0, then f has a local minimum at c; if f''(c) < 0, then f has a local maximum at c; and if f''(c) = 0, the test is inconclusive and further analysis is required.[5] Beyond extrema, the second derivative also determines concavity: f''(x) > 0 indicates the function is concave up (shaped like a cup), while f''(x) < 0 indicates it is concave down, which helps identify inflection points where the concavity changes.[6]

For cases where the second derivative test fails (i.e., f''(c) = 0), higher-order derivative tests extend the approach using Taylor's theorem. Suppose the first n derivatives of f at c are zero and the (n+1)-th derivative satisfies f^{(n+1)}(c) \neq 0. If n+1 is even and f^{(n+1)}(c) > 0, then c is a local minimum; if n+1 is even and f^{(n+1)}(c) < 0, then c is a local maximum. If n+1 is odd, c is not a local extremum but typically a point of inflection.[7] These tests assume sufficient differentiability and are grounded in the function's Taylor expansion around the critical point.[8]
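As a rough computational illustration of the second- and higher-order tests just described, the following Python sketch uses the sympy library (assumed available); the example function x**4 - 4*x**3 and all variable names are illustrative choices, not taken from the sources cited above. It finds the critical points, then climbs the derivatives at each one until a nonzero value appears and classifies the point by the parity and sign of that first nonzero derivative.

```python
# Minimal sketch of the higher-order derivative test, assuming sympy is available.
import sympy as sp

x = sp.symbols('x', real=True)
f = x**4 - 4*x**3                       # illustrative example; critical points at x = 0 and x = 3

fp = sp.diff(f, x)                      # first derivative
critical_points = sp.solve(sp.Eq(fp, 0), x)

for c in critical_points:
    # Walk up the derivatives until one is nonzero at c.
    n = 1
    deriv = fp
    while deriv.subs(x, c) == 0:
        n += 1
        deriv = sp.diff(f, x, n)
    value = deriv.subs(x, c)
    if n % 2 == 0:                      # first nonzero derivative has even order: extremum
        kind = 'local minimum' if value > 0 else 'local maximum'
    else:                               # odd order: no extremum (inflection with horizontal tangent)
        kind = 'not an extremum'
    print(f"x = {c}: first nonzero derivative has order {n}, {kind}")
```

Running this reports x = 3 as a local minimum (the second derivative is positive there) and x = 0 as not an extremum (the first nonzero derivative is the third), matching the rules above.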
Single-Variable First-Derivative Test
Monotonicity Properties
A function f defined on an interval I is said to be increasing on I if for all x_1, x_2 \in I with x_1 < x_2, it holds that f(x_1) \leq f(x_2).[9] Similarly, f is decreasing on I if f(x_1) \geq f(x_2) whenever x_1 < x_2.[9] The function is strictly increasing on I if the inequality is strict, i.e., f(x_1) < f(x_2) for x_1 < x_2, and strictly decreasing if f(x_1) > f(x_2).[9] These definitions capture the intuitive notion that the function values consistently grow or shrink as the input advances across the interval; they rely only on order preservation, not on any epsilon-delta criterion.[10]

A fundamental result connecting derivatives to these properties is the following theorem: suppose f is continuous on a closed interval [a, b] and differentiable on the open interval (a, b). If f'(x) > 0 for all x \in (a, b), then f is strictly increasing on [a, b]; if f'(x) < 0 for all x \in (a, b), then f is strictly decreasing on [a, b].[11] The result extends to open intervals where f is differentiable, and the conclusion holds even if f'(x) = 0 at finitely many isolated points, provided the derivative does not change sign across the interval.[9]

The proof relies on the Mean Value Theorem (MVT), which states that if f is continuous on [x_1, x_2] and differentiable on (x_1, x_2) with x_1 < x_2, then there exists c \in (x_1, x_2) such that f'(c) = \frac{f(x_2) - f(x_1)}{x_2 - x_1}. Assume f'(x) > 0 on (a, b). For any x_1, x_2 \in [a, b] with x_1 < x_2, the MVT yields c \in (x_1, x_2) where f'(c) > 0, so \frac{f(x_2) - f(x_1)}{x_2 - x_1} > 0. Since x_2 - x_1 > 0, it follows that f(x_2) > f(x_1), establishing strict increase. The case f'(x) < 0 is analogous and yields strict decrease.[11]

Points where f'(x) = 0 or f' is undefined do not disrupt overall monotonicity if the sign of f' is consistent in the surrounding subintervals. Isolated zeros of f' allow the MVT to be applied across them without sign reversal, preserving the direction of the inequality; for example, f(x) = x^3 is strictly increasing on all of \mathbb{R} even though f'(0) = 0. Similarly, isolated points of non-differentiability (e.g., a point with a vertical tangent, such as f(x) = x^{1/3} at x = 0) do not break monotonicity provided f is continuous there and f' keeps the same sign on both sides.[9] In this way, the function's behavior on the whole interval is determined by the predominant sign of the first derivative.
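The MVT step in the proof above can be made concrete with a small sympy sketch (a minimal illustration, assuming sympy is available; the function x**3 and the interval [1, 2] are arbitrary choices, not from the cited sources). It solves f'(c) = (f(b) - f(a))/(b - a) for c and confirms that the root lies inside (a, b).

```python
# Minimal numerical illustration of the MVT step used in the proof, assuming sympy is available.
import sympy as sp

x = sp.symbols('x', real=True)
f = x**3                  # illustrative choice (strictly increasing on R even though f'(0) = 0)
a, b = 1, 2

fp = sp.diff(f, x)                                        # f'(x) = 3*x**2
secant_slope = (f.subs(x, b) - f.subs(x, a)) / (b - a)    # (8 - 1) / (2 - 1) = 7

# MVT guarantees some c in (a, b) with f'(c) equal to the secant slope.
candidates = sp.solve(sp.Eq(fp, secant_slope), x)
c = [r for r in candidates if a < r < b][0]               # keep the root inside (a, b)

print(c, sp.N(c))                                         # sqrt(21)/3, about 1.53
# Since f'(x) > 0 on (a, b), the argument in the proof gives f(b) > f(a):
print(f.subs(x, b) > f.subs(x, a))                        # True
```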
Statement for Local Extrema
A critical point of a function f is a point c in the domain of f where either f'(c) = 0 or f'(c) does not exist.[3] These points are the only candidates for local extrema, as established by Fermat's theorem, which states that if f has a local extremum at c and f'(c) exists, then f'(c) = 0.[12]

The first derivative test provides conditions for identifying local maxima and minima at such critical points. Suppose f is continuous at a critical point c and differentiable on some open interval around c, except possibly at c itself. If f'(x) > 0 for all x in (c - \delta, c) and f'(x) < 0 for all x in (c, c + \delta) for some \delta > 0, then f has a local maximum at c. Conversely, if f'(x) < 0 for all x in (c - \delta, c) and f'(x) > 0 for all x in (c, c + \delta), then f has a local minimum at c. If f'(x) does not change sign at c (i.e., it remains positive or negative on both sides), then f(c) is neither a local maximum nor a local minimum.[13][3]

To apply the test, a sign chart is constructed by evaluating the sign of f'(x) on the intervals determined by the critical points. This involves factoring f'(x) or checking test values in each interval adjacent to c, often summarized in a table such as the following (the local-maximum case; a programmatic version of the sign chart is sketched after the table):

| Interval | Test Value | Sign of f'(x) | Behavior of f |
|---|---|---|---|
| (c - \delta, c) | x_1 < c | Positive | Increasing |
| (c, c + \delta) | x_2 > c | Negative | Decreasing |
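The sign chart can also be tabulated programmatically. The sketch below is a minimal illustration assuming sympy is available; the example f(x) = x**3 - 3*x is an arbitrary choice with critical points at x = -1 and x = 1, so both the local-maximum and local-minimum sign patterns appear.

```python
# Minimal sign-chart sketch for the first derivative test, assuming sympy is available.
import sympy as sp

x = sp.symbols('x', real=True)
f = x**3 - 3*x
fp = sp.diff(f, x)                            # f'(x) = 3*x**2 - 3 = 3(x - 1)(x + 1)
crit = sorted(sp.solve(sp.Eq(fp, 0), x))      # critical points: [-1, 1]

# Pick one test value in each interval determined by the critical points.
edges = [-sp.oo] + crit + [sp.oo]
for left, right in zip(edges[:-1], edges[1:]):
    if left == -sp.oo:
        t = right - 1                         # any value below the first critical point
    elif right == sp.oo:
        t = left + 1                          # any value above the last critical point
    else:
        t = (left + right) / 2                # midpoint of a bounded interval
    s = fp.subs(x, t)
    behavior = 'increasing' if s > 0 else 'decreasing'
    print(f"({left}, {right}): test value {t}, f'({t}) = {s}, f is {behavior}")

# Reading the chart: f' changes from + to - at x = -1 (local maximum)
# and from - to + at x = 1 (local minimum).
```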