Real Analysis: The Rigorous Foundations of Calculus
Real analysis is the branch of mathematics that rigorously establishes the foundations of calculus. Where calculus asks "how," analysis asks "why" — providing precise definitions and proofs for limits, continuity, differentiation, and integration.
1. Axioms of the Real Numbers
Real analysis begins with a precise characterization of the real number system R. Rather than treating real numbers as an intuitive given, analysis constructs them axiomatically through three families of axioms.
Field Axioms
The real numbers form a field under addition and multiplication. The field axioms guarantee that addition and multiplication behave as expected — closure, commutativity, associativity, distributivity, and the existence of identity elements (0 for addition, 1 for multiplication) and inverses.
For all a, b, c in R:
- a + b = b + a (commutativity)
- (a + b) + c = a + (b + c) (associativity)
- a(b + c) = ab + ac (distributivity)
- a + 0 = a, and a * 1 = a (identity elements)
- For each a, there exists -a such that a + (-a) = 0
Order Axioms
The real numbers are also an ordered field. There exists a subset P (the positive reals) such that: for any x in R, exactly one of x in P, -x in P, or x = 0 holds (trichotomy); and P is closed under addition and multiplication.
The Completeness Axiom (Least Upper Bound Property)
This is the axiom that distinguishes the reals from the rationals. Every non-empty subset S of R that is bounded above has a least upper bound (supremum) in R.
Why This Matters
The set S = {x in Q : x^2 < 2} is bounded above by 2 in Q, but its supremum sqrt(2) is not rational. In R, the supremum always exists. Without completeness, limits might not exist, making calculus impossible to develop rigorously.
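As a numerical illustration (a sketch of ours, not part of the axiomatic development; the helper name is invented), bisection maintains a point of S below and an upper bound above, squeezing them together toward sup S = sqrt(2), the value missing from Q:

```python
def sup_sqrt2(tol=1e-12):
    """Approximate sup{x : x^2 < 2} by bisection."""
    lo, hi = 1.0, 2.0              # lo is in S, hi is an upper bound of S
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid               # mid still lies in S
        else:
            hi = mid               # mid is an upper bound of S
    return hi

print(abs(sup_sqrt2() - 2 ** 0.5))  # difference is below tol
```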
2. Supremum, Infimum, and Bounded Sets
The concepts of supremum and infimum formalize the idea of the "tightest" upper and lower bounds for a set.
Definitions
Supremum (sup S)
The least upper bound of S. A number M = sup S satisfies: (1) M is an upper bound: x ≤ M for all x in S, and (2) M is the least such bound: if M' is any upper bound, then M ≤ M'.
Infimum (inf S)
The greatest lower bound of S. A number m = inf S satisfies: (1) m is a lower bound: m ≤ x for all x in S, and (2) m is the greatest such bound: if m' is any lower bound, then m' ≤ m.
The Archimedean Property
A direct consequence of completeness: for any real number x, there exists a natural number n such that n > x. Equivalently, for any epsilon > 0, there exists n in N such that 1/n < epsilon. This property is used constantly in epsilon-delta proofs.
Density of the Rationals
Between any two distinct real numbers a and b (with a < b), there exists a rational number q such that a < q < b. Similarly, there exists an irrational number between any two distinct reals. This is the density theorem, proved using the Archimedean property.
Key Characterization of Supremum
M = sup S if and only if: M is an upper bound of S, and for every epsilon > 0, there exists s in S such that M - epsilon < s ≤ M. This characterization is the standard tool for proving supremum equalities.
3. Sequences, Limits, and Cauchy Sequences
A sequence is a function from the natural numbers to the reals, written (a_n). The limit theory for sequences underpins all of analysis.
Formal Definition of Sequence Limits
We say lim(n to infinity) a_n = L if for every epsilon > 0, there exists N in N such that for all n > N, we have |a_n - L| < epsilon. The limit L must be unique when it exists.
Example: Proving lim(1/n) = 0
Given epsilon > 0, by the Archimedean property choose N such that 1/N < epsilon. Then for all n > N: |1/n - 0| = 1/n < 1/N < epsilon. QED.
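The proof is constructive: it converts each epsilon into a concrete N. That recipe can be run directly (an illustrative sketch; the name N_for is our own):

```python
import math

def N_for(eps):
    # Archimedean property: any N > 1/eps satisfies 1/N < eps.
    return math.floor(1 / eps) + 1

eps = 1e-3
N = N_for(eps)
# Every term past index N is within eps of the limit 0.
assert all(abs(1 / n - 0) < eps for n in range(N + 1, N + 10_000))
```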
Limit Laws for Sequences
If lim(a_n) = A and lim(b_n) = B, then:
- lim(a_n + b_n) = A + B (sum rule)
- lim(a_n * b_n) = A * B (product rule)
- lim(a_n / b_n) = A / B provided B is nonzero (quotient rule)
- Squeeze Theorem: if a_n ≤ b_n ≤ c_n and lim(a_n) = lim(c_n) = L, then lim(b_n) = L
Monotone Convergence Theorem
Every bounded monotone sequence converges. If (a_n) is increasing and bounded above, then it converges to sup{a_n : n in N}. This theorem is a direct consequence of completeness and is used to prove the existence of e = lim(1 + 1/n)^n.
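A quick numerical check (a sketch, not a proof) verifies both hypotheses of the theorem for a_n = (1 + 1/n)^n and watches the sequence approach e:

```python
import math

# a_n = (1 + 1/n)^n is increasing and bounded above (by 3), so the
# Monotone Convergence Theorem guarantees it converges; the limit is e.
a = [(1 + 1 / n) ** n for n in range(1, 1001)]
assert all(a[i] < a[i + 1] for i in range(len(a) - 1))  # increasing
assert all(x < 3 for x in a)                            # bounded above
print(abs(a[-1] - math.e))  # small; the gap shrinks roughly like e/(2n)
```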
Cauchy Sequences
A sequence (a_n) is Cauchy if for every epsilon > 0, there exists N such that for all m, n > N: |a_m - a_n| < epsilon. The key theorem: in R, a sequence converges if and only if it is a Cauchy sequence.
Cauchy vs. Convergence
A convergent sequence is always Cauchy. The converse requires completeness: in Q, the sequence of rational approximations to sqrt(2) is Cauchy but does not converge in Q. In R, completeness guarantees every Cauchy sequence converges. This equivalence is called the Cauchy criterion.
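We can watch this phenomenon with exact rational arithmetic (an illustrative sketch using Python's fractions module): Newton's iteration for sqrt(2) stays inside Q and is Cauchy, yet no rational number can be its limit.

```python
from fractions import Fraction

# Newton iteration x -> x/2 + 1/x produces rationals converging to sqrt(2).
x = Fraction(2)
seq = []
for _ in range(8):
    seq.append(x)
    x = x / 2 + 1 / x

# Cauchy check: tail terms are within eps of one another.
eps = Fraction(1, 10**6)
tail = seq[4:]
assert all(abs(a - b) < eps for a in tail for b in tail)
# Yet no term (and no rational) squares to exactly 2.
assert all(t * t != 2 for t in seq)
```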
Subsequences and the Bolzano-Weierstrass Theorem
A subsequence of (a_n) is a sequence (a_(n_k)) obtained by selecting indices n_1 < n_2 < n_3 < ... in strictly increasing order. The Bolzano-Weierstrass Theorem states: every bounded sequence has a convergent subsequence. This is fundamental for compactness arguments and existence proofs.
4. Infinite Series and Convergence Tests
An infinite series is the sum sum(a_n) from n=1 to infinity, defined as the limit of partial sums S_N = a_1 + a_2 + ... + a_N. A series converges if this limit exists and is finite.
Necessary Condition: The n-th Term Test
If sum(a_n) converges, then lim(a_n) = 0. Contrapositive: if lim(a_n) is not 0 (or does not exist), the series diverges. Note: lim(a_n) = 0 is necessary but NOT sufficient — the harmonic series sum(1/n) diverges even though 1/n approaches 0.
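A small computation makes the harmonic counterexample vivid (illustrative only): the terms shrink to 0, but the partial sums track ln N and keep growing without bound.

```python
import math

def partial_sum(N):
    # H_N = 1 + 1/2 + ... + 1/N
    return sum(1.0 / n for n in range(1, N + 1))

# Partial sums grow like ln(N) + 0.577..., hence are unbounded.
for N in (10, 100, 1000, 10000):
    print(N, round(partial_sum(N), 4), round(math.log(N), 4))
assert partial_sum(10000) - partial_sum(1000) > 2  # gained about ln(10)
```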
The Comparison Test
If 0 ≤ a_n ≤ b_n for all n:
- If sum(b_n) converges, then sum(a_n) converges.
- If sum(a_n) diverges, then sum(b_n) diverges.
Example
sum(1/(n^2 + 1)) converges because 1/(n^2 + 1) ≤ 1/n^2 and sum(1/n^2) converges (p-series with p = 2 > 1).
The Ratio Test
Let L = lim(|a_(n+1) / a_n|):
- If L < 1, the series converges absolutely.
- If L > 1 (or L = infinity), the series diverges.
- If L = 1, the test is inconclusive.
Example: sum(n! / n^n)
Ratio = ((n+1)! / (n+1)^(n+1)) * (n^n / n!) = n^n / (n+1)^n = (n/(n+1))^n, which approaches 1/e < 1. The series converges.
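The simplification reduces the ratio to (n/(n+1))^n, and a short check (illustrative sketch) confirms it approaches 1/e:

```python
import math

def ratio(n):
    # a_(n+1)/a_n for a_n = n!/n^n simplifies to (n/(n+1))^n.
    return (n / (n + 1)) ** n

print(ratio(10), ratio(1000), 1 / math.e)
assert abs(ratio(100_000) - 1 / math.e) < 1e-4  # L = 1/e < 1: converges
```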
The Root Test
Let L = lim sup(|a_n|^(1/n)):
- If L < 1, the series converges absolutely.
- If L > 1, the series diverges.
- If L = 1, the test is inconclusive.
The Integral Test
If f is continuous, positive, and decreasing on [1, infinity) with f(n) = a_n, then sum(a_n) converges if and only if the improper integral from 1 to infinity of f(x) dx converges.
p-Series: sum(1/n^p)
By the integral test: the integral of 1/x^p from 1 to infinity converges when p > 1 (where it equals 1/(p-1)). Therefore sum(1/n^p) converges for p > 1 and diverges for p ≤ 1.
Absolute vs. Conditional Convergence
A series sum(a_n) converges absolutely if sum(|a_n|) converges. Absolute convergence implies convergence. A series that converges but not absolutely is conditionally convergent. The alternating harmonic series sum((-1)^(n+1) / n) = ln(2) is conditionally convergent — it converges, but sum(1/n) diverges.
Riemann Rearrangement Theorem
A conditionally convergent series can be rearranged to converge to any real number or to diverge to infinity. Absolutely convergent series maintain their sum under any rearrangement.
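The standard proof of the rearrangement theorem is a greedy algorithm, and it can be run directly (a sketch of ours on the alternating harmonic series, whose natural sum is ln 2 ≈ 0.693; here we steer the rearrangement toward 1 instead):

```python
def rearranged_partial_sums(target, n_terms=100_000):
    """Greedy rearrangement of sum((-1)^(n+1)/n):
    add positive terms 1, 1/3, 1/5, ... while at or below target,
    negative terms -1/2, -1/4, ... while above it."""
    pos, neg = 1, 2            # next odd / even denominator to use
    s, sums = 0.0, []
    for _ in range(n_terms):
        if s <= target:
            s += 1.0 / pos
            pos += 2
        else:
            s -= 1.0 / neg
            neg += 2
        sums.append(s)
    return sums

sums = rearranged_partial_sums(1.0)
print(sums[-1])  # oscillates ever more tightly around the target 1.0
```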
5. Continuity and Uniform Continuity
Continuity is one of the central concepts of analysis. The rigorous epsilon-delta definition replaces the informal notion of "drawing without lifting the pen."
Epsilon-Delta Definition of Continuity
f : D → R is continuous at c in D if for every epsilon > 0, there exists delta > 0 such that whenever x is in D and |x - c| < delta, we have |f(x) - f(c)| < epsilon.
Example: Proving f(x) = x^2 is continuous at x = 2
|f(x) - f(2)| = |x^2 - 4| = |x - 2||x + 2|.
If |x - 2| < 1, then 1 < x < 3, so |x + 2| < 5.
Choose delta = min(1, epsilon/5). Then |f(x) - 4| < 5 * delta ≤ epsilon.
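The choice delta = min(1, epsilon/5) can be stress-tested numerically (an illustrative sketch; delta_for is our own name): random points within delta of 2 always land within epsilon of f(2) = 4.

```python
import random

def delta_for(eps):
    # The delta from the proof above: min(1, eps/5).
    return min(1.0, eps / 5.0)

random.seed(0)
for eps in (1.0, 0.1, 0.001):
    d = delta_for(eps)
    for _ in range(1000):
        x = 2 + random.uniform(-d, d)   # |x - 2| < delta
        assert abs(x * x - 4) < eps     # |f(x) - f(2)| < epsilon
```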
Sequential Characterization of Continuity
f is continuous at c if and only if for every sequence (x_n) in D with x_n approaching c, we have f(x_n) approaching f(c). This equivalence allows sequence arguments to replace epsilon-delta arguments in many proofs.
Key Theorems for Continuous Functions on Closed Intervals
Extreme Value Theorem
If f is continuous on the closed bounded interval [a, b], then f attains both its maximum and minimum values. That is, there exist c, d in [a, b] with f(c) ≤ f(x) ≤ f(d) for all x in [a, b].
Intermediate Value Theorem (IVT)
If f is continuous on [a, b] and k is any value between f(a) and f(b), then there exists c in (a, b) with f(c) = k. The IVT is used to prove the existence of roots and fixed points.
Uniform Continuity
f : D → R is uniformly continuous on D if for every epsilon > 0, there exists delta > 0 such that for all x, y in D with |x - y| < delta, we have |f(x) - f(y)| < epsilon. The crucial difference from pointwise continuity: delta depends only on epsilon, not on the specific point.
Key Theorem: Heine-Cantor Theorem
Every continuous function on a closed bounded interval [a, b] is uniformly continuous. However, f(x) = 1/x on (0, 1) is continuous but NOT uniformly continuous — as x and y both approach 0, we cannot control |f(x) - f(y)| with a single delta.
Lipschitz Continuity
f is Lipschitz continuous with constant K if |f(x) - f(y)| ≤ K|x - y| for all x, y. Lipschitz continuity implies uniform continuity. A differentiable function with bounded derivative is Lipschitz (by the Mean Value Theorem).
6. Differentiability and the Mean Value Theorem
Differentiability is a stronger condition than continuity. In real analysis, we prove the relationship between these conditions and establish the major theorems of differential calculus rigorously.
The Derivative: Formal Definition
f is differentiable at c if the limit lim(h to 0) of (f(c + h) - f(c)) / h exists and is finite. This limit is denoted f'(c). Differentiability at c implies continuity at c, but not conversely — f(x) = |x| is continuous but not differentiable at 0.
Rolle's Theorem
If f is continuous on [a, b], differentiable on (a, b), and f(a) = f(b), then there exists c in (a, b) with f'(c) = 0. This geometric result (the function must turn around) is the foundation for the Mean Value Theorem.
The Mean Value Theorem (MVT)
Statement
If f is continuous on [a, b] and differentiable on (a, b), then there exists c in (a, b) such that: f'(c) = (f(b) - f(a)) / (b - a).
Geometrically: the instantaneous rate of change equals the average rate of change at some interior point.
The MVT has powerful corollaries: if f'(x) = 0 for all x in (a, b), then f is constant; if f'(x) > 0 on (a, b), then f is strictly increasing; and if |f'(x)| ≤ M, then |f(x) - f(y)| ≤ M|x - y| (a Lipschitz bound).
Taylor's Theorem with Remainder
If f has n+1 continuous derivatives on [a, b], then for any x in [a, b]:
f(x) = f(a) + f'(a)(x-a) + f''(a)(x-a)^2/2! + ... + f^(n)(a)(x-a)^n/n! + R_n(x)
where R_n(x) = f^(n+1)(c)(x-a)^(n+1)/(n+1)! for some c between a and x. (Lagrange remainder form)
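The Lagrange remainder gives a computable error bound. For f(x) = e^x at a = 0 with x in [0, 1], |f^(n+1)(c)| ≤ e for all c in [0, x], so |R_n(x)| ≤ e * x^(n+1)/(n+1)!. A quick check (illustrative sketch):

```python
import math

def taylor_exp(x, n):
    # Degree-n Taylor polynomial of e^x about a = 0.
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x, n = 1.0, 8
approx = taylor_exp(x, n)
# Lagrange bound: |R_n(x)| <= e * x^(n+1) / (n+1)!
bound = math.e * x**(n + 1) / math.factorial(n + 1)
print(abs(math.exp(x) - approx), bound)  # actual error is within the bound
```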
L'Hopital's Rule
If lim(x to c) f(x) = lim(x to c) g(x) = 0 (or both = infinity) and g'(x) is nonzero near c, then: lim(x to c) f(x)/g(x) = lim(x to c) f'(x)/g'(x), provided the latter limit exists. This is proved using the Cauchy Mean Value Theorem.
7. Riemann Integration and the Fundamental Theorem
The Riemann integral formalizes the area under a curve through increasingly refined partitions. This is the standard integral of undergraduate analysis.
Partitions, Upper and Lower Sums
A partition P of [a, b] is a finite set a = x_0 < x_1 < ... < x_n = b. For each subinterval [x_(i-1), x_i]:
- M_i = sup of f on [x_(i-1), x_i] (upper bound on subinterval)
- m_i = inf of f on [x_(i-1), x_i] (lower bound on subinterval)
- U(f, P) = sum of M_i * (x_i - x_(i-1)) (upper Riemann sum)
- L(f, P) = sum of m_i * (x_i - x_(i-1)) (lower Riemann sum)
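For a monotone increasing f, the sup and inf on each subinterval sit at its endpoints, so the upper and lower sums are easy to compute directly (an illustrative sketch for f(x) = x^2 on [0, 1], whose integral is 1/3):

```python
def riemann_sums(f, a, b, n):
    """Upper and lower sums of an increasing f on a uniform partition.
    For increasing f: sup on [x_(i-1), x_i] is f(x_i), inf is f(x_(i-1))."""
    dx = (b - a) / n
    xs = [a + i * dx for i in range(n + 1)]
    U = sum(f(xs[i + 1]) * dx for i in range(n))
    L = sum(f(xs[i]) * dx for i in range(n))
    return U, L

U, L = riemann_sums(lambda x: x * x, 0.0, 1.0, 1000)
print(L, U)  # the true integral 1/3 is squeezed between them
```

As the partition refines, U - L = (f(b) - f(a)) * dx shrinks to 0, which is exactly the Riemann criterion for this function.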
The Riemann Integral
f is Riemann integrable on [a, b] if the infimum of all upper sums equals the supremum of all lower sums. This common value is the Riemann integral from a to b of f(x) dx.
Riemann Criterion
f is integrable if and only if for every epsilon > 0, there exists a partition P such that U(f, P) - L(f, P) < epsilon. Every continuous function on [a, b] is integrable. Every monotone bounded function on [a, b] is integrable.
Fundamental Theorem of Calculus — Part 1
If f is continuous on [a, b] and F(x) = integral from a to x of f(t) dt, then F is differentiable on (a, b) and F'(x) = f(x). This says differentiation and integration are inverse operations: integrating then differentiating recovers the original function.
Fundamental Theorem of Calculus — Part 2
If f is continuous on [a, b] and F is any antiderivative of f (meaning F'(x) = f(x)), then:
integral from a to b of f(x) dx = F(b) - F(a)
This provides the practical computation tool: to evaluate a definite integral, find any antiderivative and evaluate it at the endpoints. The theorem connects the two major branches of calculus, making precise the sense in which differentiation and integration are inverse operations.
Improper Integrals
An improper integral arises when either the interval is unbounded or f is unbounded on [a, b]. These are defined as limits of proper integrals:
integral from 1 to infinity of f(x) dx = lim(b to infinity) of integral from 1 to b of f(x) dx
Example: the integral from 1 to infinity of 1/x^p dx converges for p > 1 and diverges for p ≤ 1.
8. Uniform Convergence, Power Series, and Approximation
The final pillar of undergraduate real analysis concerns sequences and series of functions, culminating in the Weierstrass approximation theorem.
Pointwise vs. Uniform Convergence
A sequence of functions (f_n) converges pointwise to f on D if for each x in D, lim(n to infinity) f_n(x) = f(x). Uniform convergence is stronger: f_n converges uniformly to f if for every epsilon > 0, there exists N (independent of x) such that for all n > N and all x in D, |f_n(x) - f(x)| < epsilon.
Classic Example of the Distinction
f_n(x) = x^n on [0, 1]. Pointwise limit: f(x) = 0 for x in [0,1) and f(1) = 1. Each f_n is continuous but the limit is discontinuous. Convergence is NOT uniform — uniform convergence of continuous functions always produces a continuous limit.
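Estimating the uniform gap sup|f_n(x) - f(x)| over [0, 1) makes the failure concrete (illustrative sketch; the sup is estimated on a finite grid, which is adequate for moderate n): the gap stays far from 0 however large n gets.

```python
def sup_gap(n, samples=10_000):
    # Grid estimate of sup over [0, 1) of |x^n - 0| (the pointwise limit
    # is 0 on [0, 1), so this is the uniform distance to the limit there).
    return max((k / samples) ** n for k in range(samples))

# Points near x = 1 keep the gap large: convergence is not uniform.
for n in (1, 10, 100, 1000):
    print(n, sup_gap(n))
```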
Key Theorems on Uniform Convergence
Continuity Preserved
If f_n are continuous on D and f_n converges uniformly to f, then f is continuous on D.
Integration Interchange
If f_n converges uniformly to f on [a, b], then lim(n to infinity) of integral(f_n) = integral of lim(f_n) = integral(f). Uniform convergence allows passing limits through integrals.
Weierstrass M-Test
If |f_n(x)| ≤ M_n for all x and sum(M_n) converges, then sum(f_n) converges uniformly and absolutely. This is the primary tool for establishing uniform convergence of series.
Power Series
A power series centered at a is sum(c_n * (x - a)^n) from n=0 to infinity. Every power series has a radius of convergence R ≥ 0 (possibly infinity) determined by:
1/R = lim sup(|c_n|^(1/n)) (Hadamard formula)
The series converges absolutely for |x - a| < R and diverges for |x - a| > R. On any closed interval [a - r, a + r] with r < R, convergence is uniform.
Within the radius of convergence, a power series represents an infinitely differentiable function. The derivative and antiderivative are obtained by differentiating/integrating term by term, with the same radius of convergence.
The Weierstrass Approximation Theorem
Theorem
If f is continuous on [a, b] and epsilon > 0, then there exists a polynomial p such that |f(x) - p(x)| < epsilon for all x in [a, b]. In other words, every continuous function on a closed bounded interval can be uniformly approximated by polynomials.
The constructive proof uses Bernstein polynomials: B_n(f, x) = sum from k=0 to n of f(k/n) * C(n,k) * x^k * (1-x)^(n-k). These converge uniformly to f on [0, 1]. The theorem is the foundation of numerical approximation theory and motivates the study of orthogonal polynomials.
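The Bernstein construction is short enough to run (an illustrative sketch on [0, 1], using f(x) = |x - 1/2|, which is continuous but not differentiable; convergence for this f is slow but visible):

```python
from math import comb

def bernstein(f, n, x):
    """Evaluate the n-th Bernstein polynomial of f at x in [0, 1]."""
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def f(x):
    return abs(x - 0.5)    # continuous on [0, 1], not differentiable at 1/2

xs = [i / 100 for i in range(101)]
err = max(abs(f(x) - bernstein(f, 200, x)) for x in xs)
print(err)  # uniform error on the grid; shrinks as n grows
```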
Common Taylor Series
| Function | Series | Radius of Convergence |
|---|---|---|
| e^x | sum(x^n / n!) = 1 + x + x^2/2! + ... | R = infinity |
| sin(x) | sum((-1)^n * x^(2n+1) / (2n+1)!) | R = infinity |
| ln(1 + x) | sum((-1)^(n+1) * x^n / n) = x - x^2/2 + ... | R = 1 |
| 1/(1 - x) | sum(x^n) = 1 + x + x^2 + ... | R = 1 |
Essential Theorems at a Glance
| Theorem | Hypothesis | Conclusion |
|---|---|---|
| Completeness | S non-empty, bounded above | sup S exists in R |
| Bolzano-Weierstrass | Bounded sequence | Has convergent subsequence |
| Extreme Value Thm | f continuous on [a,b] | f attains max and min |
| IVT | f continuous on [a,b] | f takes all intermediate values |
| Heine-Cantor | f continuous on [a,b] | f is uniformly continuous |
| MVT | f continuous on [a,b], diff on (a,b) | f'(c) = (f(b)-f(a))/(b-a) for some c |
| FTC Part 1 | f continuous | d/dx(integral of f) = f |
| FTC Part 2 | F' = f, f continuous | integral(a to b) f = F(b) - F(a) |
| Weierstrass Approx | f continuous on [a,b] | Uniformly approx by polynomials |