User:TakuyaMurata/Continuous functions on a compact space

From Wikibooks, open books for an open world

In this section, we will undertake a thorough study of C(K), the space of all real-valued or complex-valued continuous functions on a compact space K. The most important results in this line of study are Ascoli's theorem and the Stone-Weierstrass theorem. For simplicity, we assume functions in C(K) are real-valued. (A discussion will be given later as to why this does not diminish the generality.)

As usual, we topologize C(K) by the sup norm \| f \| = \sup_K | f |. To say that C(K) is complete is precisely:
2. Lemma The limit of a uniformly convergent sequence of continuous functions is continuous.
Proof: Suppose f_n \in C(K) is a sequence such that \sup_K |f_n - f| \to 0 for some function f defined on K. For any x \in K, by the iterated limit theorem,

\lim_{y \to x} f(y) = \lim_{n \to \infty} \lim_{y \to x} f_n(y) = \lim_{n \to \infty} f_n(x) = f(x). \square

Hence, C(K) is a Banach space. (For the definition and basic results of Banach spaces, see Functional Analysis.)
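To make sup-norm convergence in C(K) concrete, here is a small numerical sketch (the example function and the sampling grid are our own choices, not part of the text): the partial sums of the exponential series form a uniformly convergent sequence in C([0, 1]).

```python
import math

# Partial sums s_n(x) = sum_{k<=n} x**k / k! of the exponential series.
def partial_sum(x, n):
    return sum(x**k / math.factorial(k) for k in range(n + 1))

xs = [i / 200 for i in range(201)]  # sample grid on [0, 1]

def sup_err(n):
    # sampled sup-norm distance from s_n to exp on [0, 1]
    return max(abs(math.exp(x) - partial_sum(x, n)) for x in xs)

errors = [sup_err(n) for n in (2, 4, 8, 16)]
assert errors[0] > errors[1] > errors[2] > errors[3]  # sup-norm error shrinks
assert errors[-1] < 1e-12                             # n = 16 is already tiny
```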

2 Theorem (Ascoli) Let \Gamma \subset C(K). Then  \Gamma is relatively compact if and only if

  • (i) Given an \epsilon > 0 and x \in K, we can find a neighborhood G of x such that
    |f(y) - f(x)| < \epsilon for every y \in G and every f \in \Gamma
  • (ii) \sup_{f \in \Gamma} |f(x)| < \infty for every x \in K

Proof: First, assume that \Gamma is relatively compact. (ii) is then obvious. For (i), let  \epsilon > 0 and x \in K be given. For each f \in \Gamma, by continuity, we can find a neighborhood G_f of x such that:

|f(y) - f(x)| < \epsilon / 3 for every y \in G_f.

Since \Gamma is relatively compact (hence totally bounded), \Gamma contains a finite subset \gamma such that \Gamma is the union of the sets of the form

\{ f; f \in \Gamma, \sup_K |f - g| < \epsilon / 3 \}

over g \in \gamma. Let G be the intersection of G_g over g \in \gamma. Then for every f \in \Gamma, there is g \in \gamma with \sup_K |f - g| < \epsilon / 3, and so:

|f(y) - f(x)| \le |f(y) - g(y)| + |g(y) - g(x)| + |g(x) - f(x)| < \epsilon for any y \in G.

This proves (i). Next, suppose \Gamma satisfies (i) and (ii). To show that \Gamma is totally bounded, let \epsilon > 0 be given. For each x \in K, by (i), we can find a neighborhood G_x of x such that:

|f(y) - f(x)| < \epsilon / 3 for every y \in G_x and f \in \Gamma.

Since K is compact, we can find z_1, ... z_n \in K such that K is the union of G_{z_j} over j = 1, 2, ... n. Let

A = \{ (f(z_1), f(z_2), ... f(z_n)); f \in \Gamma \}.

By (ii), A is a bounded (thus totally bounded) subset of \mathbf{R}^n. That means that \Gamma contains a finite subset \gamma such that:

A \subset \bigcup_{g \in \gamma} \{ (t_1, ... t_n); t_j \in \mathbf{R}, \max_j | t_j - g(z_j) | < \epsilon / 3 \}

It now follows: given f \in \Gamma, we can find g \in \gamma such that:

\max_j | f(z_j) - g(z_j) | < \epsilon / 3.

Then, for each x \in K, since x \in G_{z_k} for some z_k,

|f(x) - g(x)| \le |f(x) - f(z_k)| + |f(z_k) - g(z_k)| + |g(z_k) - g(x)| < \epsilon

In other words, \sup_K |f - g| \le \epsilon. Hence, \Gamma is totally bounded and, since C(K) is complete, relatively compact. \square
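The total boundedness in Ascoli's theorem can be seen concretely for an illustrative family of our own choosing, \Gamma = \{ x \mapsto \sin(x + t) : t \in [0, 2\pi] \} on K = [0, 1]: a finite grid of translates is an explicit \epsilon-net in the sup norm, since |\sin(x + t) - \sin(x + t_j)| \le |t - t_j|.

```python
import math

# Γ = { x ↦ sin(x + t) : t in [0, 2π] } on K = [0, 1] is equicontinuous and
# pointwise bounded; we exhibit an explicit finite ε-net of translates, using
# |sin(x + t) - sin(x + t_j)| <= |t - t_j|.
eps = 0.1
grid = [j * eps / 2 for j in range(int(4 * math.pi / eps) + 2)]  # covers [0, 2π]
xs = [i / 100 for i in range(101)]                               # sample of K

def sup_dist(t, tj):
    # sampled sup-norm distance between sin(. + t) and sin(. + tj) on K
    return max(abs(math.sin(x + t) - math.sin(x + tj)) for x in xs)

# every sampled member of Γ lies within ε of some net element
for t in [k * 2 * math.pi / 57 for k in range(58)]:
    assert min(sup_dist(t, tj) for tj in grid) < eps
```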

2 Corollary Let f_n \in C(K). Then f_n converges uniformly if and only if it converges pointwise and \{f_n\} is equicontinuous.

2 Theorem Let f_n converge pointwise to f \in C(K), where K is a compact interval. If each f_n is differentiable and f_n' converges uniformly to g, then f_n converges uniformly to f. Moreover, f is differentiable and its derivative is g.
Proof: Let M = \sup \{ |f_n'(x)| ; n \ge 1, x \in K \}; M is finite by uniform convergence. By the mean value theorem,

|f_n(y) - f_n(x)| \le M|y-x|

Thus, f_n is equicontinuous and converges uniformly by Ascoli's theorem (or one of its corollaries). \square
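A minimal numerical illustration of the theorem (the sequence is our own choice, not from the text): for f_n(x) = x^2 + \sin(nx)/n^2 on [0, 1], the derivatives converge uniformly to 2x, and the sequence itself converges uniformly to x^2, as the theorem predicts.

```python
import math

# f_n(x) = x**2 + sin(n x)/n**2 converges pointwise to f(x) = x**2 on [0, 1],
# while f_n'(x) = 2x + cos(n x)/n converges uniformly to g(x) = 2x.
# The theorem predicts that f_n -> f uniformly (and f' = g).
xs = [i / 100 for i in range(101)]

def sup_fn_err(n):
    return max(abs((x**2 + math.sin(n * x) / n**2) - x**2) for x in xs)

def sup_deriv_err(n):
    return max(abs((2 * x + math.cos(n * x) / n) - 2 * x) for x in xs)

assert sup_deriv_err(100) < sup_deriv_err(10) < sup_deriv_err(2)  # f_n' -> g uniformly
assert sup_fn_err(100) < sup_fn_err(10) < sup_fn_err(2)           # f_n -> f uniformly
```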

2 Theorem Let \Gamma be an equicontinuous set of real-valued functions on \mathbf{R}. If \sup_{f \in \Gamma}|f(0)| = b < \infty, then there exists a > 0 such that:

\sup_{f \in \Gamma} |f(x)| \le a|x| + b + 1

Proof: Let \delta > 0 be such that:

|f(x) - f(y)| < 1 for all f \in \Gamma and |x - y| < 2\delta

( 2\delta isn't a typo; it is meant to simplify the computation.) Let x > 0 be fixed. Then, for any f \in \Gamma,

|f(x)| \le |f(x) - f(0)| + |f(0)|,

and we estimate:

|f(0) - f(x)| \le \sum_{k=0}^{n-1} |f(k\delta) - f((k+1)\delta)| + |f(n\delta) - f(x)| \le n+1

where n is such that n\delta < x \le (n+1)\delta. Thus,

|f(x)| \le \delta^{-1}|x| + 1 + b

Since we can get the same estimate for x < 0 (and the bound is clear at x = 0), we may take a = \delta^{-1}, and the proof is complete. \square

2 Corollary (Dini's theorem) Let f_n \in C(K) be a sequence such that f_n(x) \to f(x) for every x \in K, where f \in C(K). If f_n is increasing (i.e., f_n \le f_{n+1}), then f_n \to f uniformly.
Proof: Set g_n = f - f_n. Then g_n is a decreasing sequence of continuous functions converging pointwise to 0, and thus satisfies the hypotheses of Ascoli's theorem. Hence, g_n admits a uniformly convergent subsequence, which converges to 0 since it converges pointwise to 0. Since g_n is decreasing, the whole sequence converges uniformly to 0 as well. \square
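A quick numerical sketch of Dini's theorem, with an example of our own choosing: on K = [0, 0.9], the sequence f_n(x) = -x^n increases pointwise to 0, and the sup-norm error 0.9^n tends to 0, so the convergence is uniform.

```python
# On K = [0, 0.9], f_n(x) = -x**n increases pointwise to f = 0 (x**n decreases
# in n for 0 <= x < 1), and Dini's theorem predicts uniform convergence:
# sup_K |f_n - f| = 0.9**n -> 0.
xs = [i * 0.9 / 100 for i in range(101)]

def sup_err(n):
    return max(abs(-(x**n)) for x in xs)

assert all(-(x**5) <= -(x**6) for x in xs)      # f_5 <= f_6: increasing in n
assert sup_err(50) < sup_err(20) < sup_err(5)   # uniform convergence to 0
assert abs(sup_err(50) - 0.9**50) < 1e-12       # the sup sits at x = 0.9
```

On [0, 1], by contrast, the pointwise limit would jump at 1 and uniformity fails, which is why Dini's theorem asks for a continuous limit.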

2 Theorem Suppose K is a metric space. Then \Gamma \subset C(K) is equicontinuous if and only if for every \epsilon > 0 there exists \delta > 0 such that

|f(x) - f(y)| < \epsilon

for every f \in \Gamma and x, y \in K with |x - y| < \delta.
Proof: (\Leftarrow) is immediate. For the converse, let \epsilon > 0 be given. Then for each x \in K, we can find \delta_x such that

|x - y| < \delta_x implies |f(x) - f(y)| < \epsilon / 2 for every f \in \Gamma

By compactness, we find x_1, x_2, ... x_n \in K such that:

K \subset \bigcup_j B(x_j, \delta_{x_j} / 2)

Let \delta = \min \{ \delta_{x_1}, \delta_{x_2}, ... \delta_{x_n} \} / 2, and then suppose we are given x, y \in K with |x - y| < \delta. It follows: there is a j with |x - x_j| < \delta_{x_j} / 2. Since

|y - x_j| \le |y - x| + |x - x_j| < \delta + \delta_{x_j} / 2 \le \delta_{x_j},

we have:

|f(x) - f(y)| \le |f(x) - f(x_j)| + |f(x_j) - f(y)| < \epsilon

for every f \in \Gamma. \square

The Stone-Weierstrass theorem states that the polynomials are dense in C(K, R) for a compact K \subset \mathbf{R}^n. It is, however, not the case that the space of polynomials in z is dense in C(K, C) for a compact K \subset \mathbf{C}. If it were, we would have equality throughout in

\overline{P(K)} \subset A(K) \subset C(K)

But A(K) \ne C(K) if K has nonempty interior, where A(K) denotes the space of functions continuous on K and holomorphic in the interior of K.

2 Theorem (intermediate value theorem) A function f:[a, b] \to \mathbf{R} is continuous if and only if

  • (i) If x < y are in [a, b] and c lies strictly between f(x) and f(y), then c is in f((x, y)).
  • (ii) f^{-1}(\{c\}) is closed for every real c.

Proof: (\Rightarrow) Obvious. (\Leftarrow) Suppose f(x) < c. Since the complement of f^{-1}(\{c\}), which contains x, is open, we have f \ne c on some interval U in [a, b] containing x. We actually have f < c on U. In fact, if y \in U and c < f(y), then f(x) < c < f(y), which by (i) implies U contains a point z such that f(z) = c, a contradiction. Hence, f is upper semicontinuous at x. The same argument applied to -f shows that f is also lower semicontinuous at x. \square

2 Theorem Let f be a real-valued continuous function on an open interval. Then the following are equivalent.

  • (i) f is injective.
  • (ii) f is strictly monotonic.
  • (iii) f is an open mapping.

Proof: (ii) \Rightarrow (i) is obvious. (iii) \Rightarrow (ii): if (ii) is false, then we may assume there exist a < c < b such that f(a) < f(c) and f(c) > f(b). By continuity and compactness, f attains a maximum at some point x in [a, b]; by hypothesis x \in (a, b), and so f(x) is a non-interior point of f((a, b)), falsifying (iii). (i) \Rightarrow (iii): if (iii) is false, then (a, b) contains an x such that f(x) is not an interior point of f((a, b)). Since f((a, b)) is an interval, we may assume that \sup_{(a, b)} f = f(x). It then follows from the intermediate value theorem that f is not injective. \square

2 Theorem (mean value theorem) Suppose f \in C([a, b], \mathbf{R}) is differentiable on the open interval (a, b). Then

f(b) - f(a) = f'(c)(b - a)

for some c \in (a, b)
Proof: First assume 0 = f(a) = f(b). If f is identically zero, any c works; otherwise, by continuity and compactness, f attains a maximum or minimum at some x \in (a, b); say, a maximum. By the definition of the derivative, we can write:

f(y) = f(x) + f'(x)(y - x) + o(|y-x|) as y \to x

Then since x is a maximum,

0 \ge f(y) - f(x) = f'(x)(y - x) + o(|y-x|).

If y > x, then dividing by y - x gives

0 \ge f'(x) + o(1) as y \to x,

and letting y \to x gives f'(x) \le 0. If y < x, then by the same argument, we find that f'(x) \ge 0. Thus, f'(x) = 0. For the general case, let

g(x) = f(x) - rx where r = {f(b) - f(a) \over b - a}.

Then g(a) = g(b). Hence, applying the first part of the proof to g - g(a) gives: g'(c) = 0 for some c \in (a, b). Since 0 = g'(c) = f'(c) - r, c is a solution of the equation. \square
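The mean value theorem also invites a computation. In this sketch (the sample function is our choice), we locate the point c for f(x) = x^3 on [0, 2] by bisection: the mean slope is 4, so c solves 3c^2 = 4.

```python
# For f(x) = x**3 on [a, b] = [0, 2], the mean slope is (f(2) - f(0)) / 2 = 4,
# and the MVT point satisfies f'(c) = 3 c**2 = 4, i.e. c = 2 / sqrt(3).
# We locate c by bisection on h(c) = f'(c) - 4, which changes sign on [0, 2].
f = lambda x: x**3
df = lambda x: 3 * x**2
a, b = 0.0, 2.0
slope = (f(b) - f(a)) / (b - a)

lo, hi = a, b          # df(lo) - slope < 0 < df(hi) - slope
for _ in range(60):
    mid = (lo + hi) / 2
    if df(mid) - slope < 0:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2

assert a < c < b
assert abs(df(c) - slope) < 1e-9
assert abs(c - 2 / 3**0.5) < 1e-9
```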

2 Corollary Let f:\mathbf{R} \to \mathbf{R} be a differentiable function. If f'(a) < c < f'(b), then c is in f'((a, b)).
Proof: Let g(x) = f(x) - cx. Then g'(a) < 0 and g'(b) > 0. In other words, g is decreasing at a and increasing at b; in particular, g is not strictly monotonic, so by the theorem above g is not injective on (a, b); i.e., g(x) = g(x') for some x \ne x' in (a, b). It follows:

0 = g(x) - g(x') = g'(y)(x - x') for some y between x and x',

and so g'(y) = 0; that is, f'(y) = c. \square

2 Corollary A strictly monotonic continuous function with closed range is a homeomorphism onto its range.

A real-valued function f on \mathbf{R}^n is said to be homogeneous of degree k (k could be any real number) if

f(tx) = t^k f(x)

for all x \in \mathbf{R}^n and all t > 0.

2 Theorem (Euler's relation) Let f: \mathbf{R}^n \to \mathbf{R} be a function differentiable on \mathbf{R}^n \backslash \{0\}. Then f is homogeneous of degree k if and only if:

\left( \sum_j x_j \partial_j - k \right) f(x) = 0

for all x \in \mathbf{R}^n \backslash \{0\}.
Proof: (\Rightarrow) Differentiate with respect to t both sides of f(tx) = t^k f(x) and then put t = 1. (\Leftarrow) Note

(f(tx))' = \sum_j x_j {\partial_j f}(tx) = {k \over t} f(tx).

Thus, if we let

g(t) = \log \left| {f(tx) \over t^k f(x)} \right| = \log |f(tx)| - k\log t - \log|f(x)|,

then the derivative of g vanishes identically. Since g(1) = 0, g is identically zero, and f(tx) = t^k f(x) follows (the sign is determined by continuity in t). If f(x) = 0, then u(t) = f(tx) solves u' = (k/t)u with u(1) = 0, so f(tx) = 0 = t^k f(x) in that case as well. \square
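Euler's relation is easy to verify numerically for a sample homogeneous function of our own choosing, f(x, y) = x^2 y of degree k = 3, using its exact partial derivatives:

```python
# f(x, y) = x**2 * y is homogeneous of degree k = 3; its exact partials are
# f_x = 2 x y and f_y = x**2, and Euler's relation reads x f_x + y f_y = 3 f.
f = lambda x, y: x**2 * y
fx = lambda x, y: 2 * x * y
fy = lambda x, y: x**2
k = 3

for (x, y) in [(1.0, 2.0), (-0.5, 3.0), (2.0, -1.5)]:
    # Euler's relation at (x, y)
    assert abs(x * fx(x, y) + y * fy(x, y) - k * f(x, y)) < 1e-12
    # homogeneity f(t x, t y) = t**k f(x, y) for t > 0
    for t in (0.5, 2.0, 7.0):
        assert abs(f(t * x, t * y) - t**k * f(x, y)) < 1e-9
```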

The theorem permits a generalization. By definition, any (distribution) solution of the equation

\left( \sum_j x_j \partial_j - k \right) f = 0

is said to be homogeneous of degree k.

A homogeneous distribution is tempered, and its Fourier transform is again homogeneous (of degree -n-k when the distribution on \mathbf{R}^n is homogeneous of degree k).

2 Theorem (l'Hôpital's rule) Let -\infty \le a \le \infty. If f(x), g(x) \to 0 as x \to a or if g(x) \to \infty as x \to a, then

\lim_{x \to a} {f(x) \over g(x)} = \lim_{x \to a} {f'(x) \over g'(x)}

provided the limit on the right-hand side exists.
Proof: First assume a is finite and that f(x), g(x) \to 0. We may redefine f(a) = g(a) = 0, since the values of the functions at a are immaterial when we compute the limit. Fix x, and define

h(y) = f(y)g(x) - f(x)g(y)

Since h(x) = 0 = h(a), by the mean value theorem, we can find a c between x and a such that

0 = h'(c) = f'(c)g(x) - f(x)g'(c)

Since when x is close to a we may assume g' never vanishes,

{ f(x) \over g(x) } = { f'(c) \over g'(c) }

Since c \to a as x \to a, the proof of this case is complete. (TODO: handle other cases.) \square
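A numerical sketch of the rule on the classic 0/0 example \sin(x)/x as x \to 0 (the example is ours, not from the text): the original quotient and the quotient of derivatives approach the same value, \cos(0)/1 = 1.

```python
import math

# sin(x)/x -> 1 as x -> 0, and l'Hôpital's rule gives the same value via the
# derivative quotient cos(x)/1 -> 1.
for x in (1e-1, 1e-2, 1e-3):
    ratio = math.sin(x) / x          # the original quotient
    deriv_ratio = math.cos(x) / 1.0  # quotient of derivatives
    assert abs(ratio - 1.0) < x       # |sin(x)/x - 1| <= x**2 / 6
    assert abs(deriv_ratio - 1.0) < x # |cos(x) - 1| <= x**2 / 2
    assert abs(ratio - deriv_ratio) < x
```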

The next theorem can be skipped without loss of continuity, as more general results will be obtained later.

2 Theorem (The Weierstrass approximation theorem) Let f \in C([0, 1]), and define

f_n(x) = \sum_{k=0}^n f(k/n)p^n_k(x) with p^n_k(x) = {n \choose k} x^k (1-x)^{n-k} (w:Bernstein polynomial).

Then f_n \to f uniformly on [0, 1].
Proof [1]: First note that

1 = \sum_{k=0}^n p^n_k(x)

is a partition of unity, by the binomial theorem applied to (x + 1 - x)^n. Moreover, a simple computation gives the identity:

\sum_{k=0}^n (nx - k)^2 p^n_k(x) = nx(1-x)

It thus follows: for any \delta > 0

|f(x) - f_n(x)| \le \sum_{k : |nx - k| < n\delta} |f(x) - f(k / n)| p^n_k(x) + 2\sup|f| \cdot {nx(1-x) \over n^2 \delta^2}

Since f is uniformly continuous on [0, 1] by compactness, the first term is small (uniformly in x) once \delta is small, and the second term tends to 0 as n \to \infty; the theorem now follows. \square
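The Bernstein construction is directly computable. In the following sketch (the target function f(x) = |x - 1/2| is our own choice), the sampled uniform error visibly decreases with n:

```python
import math

# Bernstein polynomials f_n for f(x) = |x - 1/2| on [0, 1]:
# f_n(x) = sum_k f(k/n) * C(n, k) * x**k * (1 - x)**(n - k).
f = lambda x: abs(x - 0.5)

def bernstein(n, x):
    return sum(f(k / n) * math.comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

xs = [i / 100 for i in range(101)]

def sup_err(n):
    return max(abs(f(x) - bernstein(n, x)) for x in xs)

assert sup_err(64) < sup_err(16) < sup_err(4)  # uniform error decreases
assert sup_err(64) < 0.06                      # already fairly close at n = 64
```

The convergence is slow near the kink at x = 1/2, which is consistent with the \delta-splitting in the proof: the modulus of continuity of f controls the rate.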

2 Corollary Any continuous function vanishing at infinity can be uniformly approximated by Hermite functions. (TODO: make the statement more precise and give a proof.)

Example: Let M be the linear span of the polynomials. Then M is a dense subspace of L^2([0, 1]). Indeed, continuous functions are dense in L^2([0, 1]), and for continuous f the above theorem supplies polynomials p_n with

\int_0^1 |f - p_n|^2 dx \le \sup|f - p_n|^2 \to 0

2 Theorem If f \in C(\mathbf{R}) is uniformly continuous and integrable, then f \in C_0(\mathbf{R}).
Proof: Define F(x) = \int_{-\infty}^x f(t)dt. That f is integrable means that M = \lim_{x \to \infty} F(x) exists and is finite. Let \epsilon > 0 be given. By uniform continuity, there is a \delta > 0 such that

|f(y) - f(x)| < \epsilon whenever |y-x| < \delta.

Then there is an R > 0 such that

|F(x) - M| < \delta\epsilon whenever x > R

Now, let x > R be given. By the mean value theorem, we find y such that \delta f(y) = F(x + \delta) - F(x) and |x - y| < \delta. Thus,

|f(x)| \le |f(x) - f(y)| + \delta^{-1}|F(x + \delta) - M | + \delta^{-1}|M - F(x)| < 3\epsilon.

Since \epsilon was arbitrary, f(x) \to 0 as x \to \infty; the case x \to -\infty is similar. \square

Example No function is continuous precisely on the rational points. To see this, let f: \mathbf{R} \to \mathbf{R} be a function, and let E be the set of all points at which f is continuous. It follows readily from the definition of continuity that E is G_\delta; i.e., it is an intersection of countably many open sets. On the other hand, \mathbf{Q} is not G_\delta (by the Baire category theorem).

2 Theorem Let f be a twice differentiable real-valued function on \mathbf{R}. Then

\sup |f'| \le 2 \sqrt{\sup |f| \sup |f''|}

Proof: Let A = \sup |f| and B = \sup |f''|. We assume A and B are finite and B > 0; otherwise the inequality is trivial (if B = 0, then f is affine and bounded, hence constant). Given x and h > 0, by Taylor's formula,

f(x + h) = f(x) + f'(x)h + \tfrac{1}{2} f''(x + \theta h)h^2 (where 0 < \theta < 1)

and so:

|f'(x)| \le {2A \over h} + {Bh \over 2}

Now, take h = 2\sqrt{A / B}. \square
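As a sanity check of the inequality, consider f(x) = \sin(\omega x) on \mathbf{R} (our example), where \sup|f| = 1, \sup|f'| = \omega, and \sup|f''| = \omega^2, so the inequality reads \omega \le 2\omega:

```python
import math

# For f(x) = sin(w x) on R: sup|f| = 1, sup|f'| = w, sup|f''| = w**2, and the
# inequality sup|f'| <= 2 sqrt(sup|f| sup|f''|) reads w <= 2 w.
for w in (0.5, 1.0, 3.0, 10.0):
    supf, supf1, supf2 = 1.0, w, w**2
    assert supf1 <= 2 * math.sqrt(supf * supf2)

# the same check with sampled sups for w = 1 on a grid over one period
xs = [i * 2 * math.pi / 1000 for i in range(1001)]
A = max(abs(math.sin(x)) for x in xs)   # ~ sup|f|
D1 = max(abs(math.cos(x)) for x in xs)  # ~ sup|f'|
B = A                                   # f'' = -sin, so sup|f''| = sup|f|
assert D1 <= 2 * math.sqrt(A * B) + 1e-12
```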

2 Theorem (Whitney extension theorem) Every real-valued L-Lipschitz function on a subset of \mathbf{R}^n is the restriction of an L-Lipschitz function on \mathbf{R}^n.
Proof ([2] pg. 5): Let f be an L-Lipschitz function on a subset A. Define

F(x) = \inf_{y \in A} (f(y) + L|x - y|) \qquad (x \in \mathbf{R}^n)

It is clear that F is L-Lipschitz, being an infimum of the L-Lipschitz functions x \mapsto f(y) + L|x - y|. Moreover, F = f on A: if x \in A, then f(y) + L|x - y| \ge f(x) for every y \in A by the Lipschitz condition, with equality at y = x. \square
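The extension formula in the proof can be exercised numerically. In this sketch, f is prescribed on a finite set A \subset \mathbf{R} of our own choosing, and we check that F restricts to f on A and is L-Lipschitz on a sample grid:

```python
# f prescribed on a finite set A with Lipschitz constant L; the proof's formula
# F(x) = inf_{y in A} (f(y) + L |x - y|) extends it to all of R.
A = [0.0, 1.0, 3.0]
fvals = {0.0: 0.0, 1.0: 2.0, 3.0: 1.0}   # slopes between points: 2, -0.5, 1/3
L = 2.0                                   # so f is 2-Lipschitz on A

def F(x):
    return min(fvals[y] + L * abs(x - y) for y in A)

for y in A:                               # F restricts to f on A
    assert abs(F(y) - fvals[y]) < 1e-12

xs = [-2 + i * 0.05 for i in range(161)]  # sample grid on [-2, 6]
for p, q in zip(xs, xs[1:]):              # F is L-Lipschitz (sampled check)
    assert abs(F(p) - F(q)) <= L * abs(p - q) + 1e-12
```

The Lipschitz property of F is automatic: an infimum of L-Lipschitz functions of x is L-Lipschitz whenever it is finite.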

Exercise: Every closed set is the zero set of a C^\infty function.