%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\chapter{Functions as Limits} \label{approx:chapter}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Complex numbers}
\label{sec:complexnums}
%mbxINTROSUBSECTION
\sectionnotes{half a lecture}
\subsection{The complex plane}
In this chapter we consider approximation of functions, or in other words
functions as limits of sequences and series.
We will extend some results we already saw to a somewhat more
general setting, and we will look at some completely new results.
In particular, we consider complex-valued functions.
We gave complex numbers as examples before, but
let us start from scratch and properly define the complex number field.
A complex number is just a pair $(x,y) \in \R^2$ on which we define
multiplication (see below).
We call the set the \emph{complex numbers}\index{complex number}
and denote it by $\C$.
We identify $x \in \R$ with $(x,0) \in \C$.
The $x$-axis is then called the \emph{\myindex{real axis}} and the $y$-axis is
called the \emph{\myindex{imaginary axis}}.
As $\C$ is just the plane, we also call the set $\C$ the
\emph{\myindex{complex plane}}.
Define:
\begin{equation*}
(x,y) + (s,t) \coloneqq (x+s,y+t) , \qquad
(x,y) (s,t) \coloneqq (xs-yt,xt+ys) .
\end{equation*}
Under the identification above, we have $0 = (0,0)$ and $1 = (1,0)$. These
two operations make the plane into a field (exercise).
We write a complex number $(x,y)$ as $x+iy$, where we
define\footnote{Note that engineers use $j$ instead of $i$.}
\glsadd{not:imaginary}
\begin{equation*}
i \coloneqq (0,1) .
\end{equation*}
Notice that $i^2 = (0,1)(0,1) = (0-1,0+0) = -1$.
That is, $i$ is a solution to the polynomial equation
\begin{equation*}
z^2+1=0 .
\end{equation*}
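For example, the multiplication rule gives
\begin{equation*}
(1+2i)(3-i) = \bigl(1 \cdot 3 - 2 \cdot (-1)\bigr) + i \bigl( 1 \cdot (-1) + 2 \cdot 3 \bigr) = 5+5i .
\end{equation*}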
From now on, we will not use the notation $(x,y)$ and use only $x+iy$.
See \figureref{fig:complexplane}.
\begin{myfigureht}
\includegraphics{figures/complexplane}
\caption{The points $1$, $i$, $x$, $iy$, and $x+iy$ in the complex
plane.\label{fig:complexplane}}
\end{myfigureht}
We generally use $x,y,r,s,t$ for real values and $z,w,\xi,\zeta$
for complex values, although that is not a hard and fast rule. In
particular, $z$ is often used as a third real variable in $\R^3$.
\begin{defn}
Suppose $z= x+iy$.
We call $x$
the \emph{\myindex{real part}} of $z$, and
we call $y$
the \emph{\myindex{imaginary part}} of $z$. We write
\glsadd{not:realpart}\glsadd{not:imagpart}
\begin{equation*}
\Re\, z \coloneqq x , \qquad
\Im\, z \coloneqq y .
\end{equation*}
Define
\glsadd{not:conj}
\emph{\myindex{complex conjugate}} as
\begin{equation*}
\bar{z} \coloneqq x-iy ,
\end{equation*}
and define \emph{\myindex{modulus}} as
\glsadd{not:modulus}
\begin{equation*}
\sabs{z} \coloneqq \sqrt{x^2+y^2} .
\end{equation*}
\end{defn}
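For example, if $z = 3+4i$, then $\Re\, z = 3$, $\Im\, z = 4$,
\begin{equation*}
\bar{z} = 3-4i , \qquad \text{and} \qquad \sabs{z} = \sqrt{3^2+4^2} = 5 .
\end{equation*}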
Modulus is the complex analogue of the absolute value and
has similar properties.
For example,
$\sabs{zw} = \sabs{z} \, \sabs{w}$ (exercise).
The complex conjugate is a reflection of the plane across the real axis.
The real numbers are precisely those numbers for which the imaginary
part $y=0$. In particular, they are precisely those numbers which satisfy
the equation
\begin{equation*}
z = \bar{z} .
\end{equation*}
As $\C$ is really $\R^2$, we let the metric on $\C$ be the standard
euclidean metric on $\R^2$.
In particular,
\begin{equation*}
\sabs{z} = d(z,0) , \qquad
\text{and also} \qquad
\sabs{z-w} = d(z,w) .
\end{equation*}
So the topology on $\C$ is
the same exact topology as the standard topology on $\R^2$
with the euclidean metric,
and $\sabs{z}$ is equal to the euclidean norm on $\R^2$.
Importantly, since $\R^2$ is a complete metric space,
so is $\C$.
As $\sabs{z}$ is the euclidean norm on $\R^2$, we have the
\emph{triangle inequality}\index{triangle inequality!complex numbers}
of both flavors:
\begin{equation*}
\sabs{z+w} \leq \sabs{z}+\sabs{w} \qquad \text{and} \qquad
\big\lvert \sabs{z}-\sabs{w} \big\rvert \leq \sabs{z-w} .
\end{equation*}
The complex conjugate and the modulus are even more intimately related:
\begin{equation*}
\sabs{z}^2 =
x^2+y^2 =
(x+iy)(x-iy) =
z \bar{z} .
\end{equation*}
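This identity gives, for example, a formula for the reciprocal: if $z \not= 0$, then
\begin{equation*}
\frac{1}{z} = \frac{\bar{z}}{z \bar{z}} = \frac{\bar{z}}{\sabs{z}^2} .
\end{equation*}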
\begin{remark}
There is no natural ordering on the complex numbers.
In particular, there is no ordering that makes the complex numbers into an
ordered field: in an ordered field every nonzero square is positive,
while $i^2 = -1$.
Ordering is one of the things we lose when we go from real to complex
numbers.
\end{remark}
\subsection{Complex numbers and limits}
Algebraic operations with complex numbers are
continuous because convergence in
$\R^2$ is the same as convergence for each component, and we already know
that the real algebraic operations are continuous. For example,
write $z_n = x_n + i\,y_n$ and
$w_n = s_n + i\,t_n$, and suppose that
$\lim_{n\to\infty} z_n = z = x+i\,y$ and $\lim_{n\to\infty} w_n = w = s+i\,t$.
Let us show
\begin{equation*}
\lim_{n\to\infty} z_n w_n = zw .
\end{equation*}
First,
\begin{equation*}
z_n w_n = (x_ns_n-y_nt_n) + i(x_nt_n+y_ns_n) .
\end{equation*}
The topology on $\C$ is the same as on $\R^2$, and so
$x_n \to x$, $y_n \to y$, $s_n \to s$, and $t_n \to t$.
Hence,
\begin{equation*}
\lim_{n\to\infty} (x_ns_n-y_nt_n) = xs-yt \qquad \text{and} \qquad
\lim_{n\to\infty} (x_nt_n+y_ns_n) = xt+ys .
\end{equation*}
As
$(xs-yt)+i(xt+ys) = zw$,
\begin{equation*}
\lim_{n\to\infty} z_n w_n = zw .
\end{equation*}
Similarly the modulus and the complex conjugate are continuous functions. We
leave the remainder of the proof of the following proposition as an exercise.
\begin{prop} \label{prop:continuityofcomplex}
Suppose $\{ z_n \}_{n=1}^\infty$, $\{ w_n \}_{n=1}^\infty$ are sequences of complex numbers converging
to $z$ and $w$ respectively. Then
\begin{enumerate}[(i)]
\item
$\displaystyle \lim_{n\to \infty} z_n + w_n = z + w$.
\item
$\displaystyle \lim_{n\to \infty} z_n w_n = z w$.
\item
Assuming $w_n \not= 0$ for all $n$ and $w\not= 0$,
$\displaystyle \lim_{n\to \infty} \frac{z_n}{w_n} = \frac{z}{w}$.
\item
$\displaystyle \lim_{n\to \infty} \sabs{z_n} = \sabs{z}$.
\item
$\displaystyle \lim_{n\to \infty} \bar{z}_n = \bar{z}$.
\end{enumerate}
\end{prop}
As we have seen above, convergence in $\C$ is the same as convergence in
$\R^2$. In particular, a sequence in $\C$ converges if and only if
the real and imaginary parts converge. Therefore, feel free to apply
everything you have learned about convergence in $\R^2$, as well as
applying results about real numbers to the real and imaginary parts.
We also need convergence of complex series. Let $\{ z_n \}_{n=1}^\infty$ be a
sequence of complex
numbers. The series
\begin{equation*}
\sum_{n=1}^\infty z_n
\end{equation*}
\emph{converges}\index{converges!complex series} if the sequence of partial sums converges, that is, if
\begin{equation*}
\lim_{k\to\infty} \sum_{n=1}^k z_n \qquad \text{exists.}
\avoidbreak
\end{equation*}
A series \emph{converges absolutely}\index{converges absolutely!complex series}
if $\sum_{n=1}^\infty \sabs{z_n}$ converges.
We say a series
is \emph{Cauchy}\index{Cauchy!complex series}
if the sequence of partial sums is Cauchy. The following two
propositions have essentially the same proofs as for real series and we
leave them as exercises.
\begin{prop} \label{prop:cachysercomplex}
The complex series $\sum_{n=1}^\infty z_n$ is Cauchy if and only if for every $\epsilon > 0$,
there exists an $M \in \N$ such that for every $n \geq M$
and every $k > n$, we have
\begin{equation*}
\abs{ \sum_{j={n+1}}^k z_j }
< \epsilon .
\end{equation*}
\end{prop}
\begin{prop} \label{prop:absconvmeansconv}
If a complex series $\sum_{n=1}^\infty z_n$ converges absolutely, then it converges.
\end{prop}
The series $\sum_{n=1}^\infty \sabs{z_n}$ is a real series. All the
convergence tests (ratio test, root test, etc.)\ that talk about
absolute convergence work with the numbers $\sabs{z_n}$, that is, they
are really talking about convergence of series of nonnegative real
numbers.
You can directly apply these tests without needing to reprove anything for
complex series.
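For example, the complex series
\begin{equation*}
\sum_{n=1}^\infty \frac{i^n}{n^2}
\end{equation*}
converges absolutely, and hence converges by \propref{prop:absconvmeansconv},
since $\abs{\frac{i^n}{n^2}} = \frac{1}{n^2}$ and
$\sum_{n=1}^\infty \frac{1}{n^2}$ converges.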
\subsection{Complex-valued functions}
When we deal with complex-valued functions
$f \colon X \to \C$, what we often do is to write
$f = u+i\,v$ for real-valued functions $u \colon X \to \R$ and $v \colon X \to
\R$.
Suppose we wish to integrate
$f \colon [a,b] \to \C$. We write
$f = u+i\,v$ for real-valued $u$ and~$v$.
We say that $f$ is \emph{Riemann integrable}\index{Riemann
integrable!complex-valued function}
if $u$ and $v$ are Riemann
integrable, and in this case we define
\begin{equation*}
\int_a^b f \coloneqq \int_a^b u + i \int_a^b v .
\end{equation*}
We make the same definition for every other type of integral (improper,
multivariable, etc.).
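For example, for $f(t) \coloneqq t + i\,t^2$ on $[0,1]$,
\begin{equation*}
\int_0^1 f = \int_0^1 t \,dt + i \int_0^1 t^2 \,dt = \frac{1}{2} + i\,\frac{1}{3} .
\end{equation*}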
Similarly when we differentiate, write $f \colon [a,b] \to \C$ as
$f = u+i\,v$. Thinking of $\C$ as $\R^2$, we say that $f$ is differentiable
if $u$ and $v$ are differentiable. For a function valued in $\R^2$, the derivative
is represented by a vector in $\R^2$. Now a vector in $\R^2$ is a complex
number. In other words,
we write
the
\emph{derivative}\index{derivative!complex-valued function}
as
\glsadd{not:mvder}
\begin{equation*}
f'(t) \coloneqq u'(t) + i \, v'(t) .
\end{equation*}
The linear operator representing the derivative is the multiplication by
the complex number $f'(t)$, so nothing is lost in this identification.
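For example, for the same $f(t) = t + i\,t^2$ as above,
\begin{equation*}
f'(t) = 1 + i\,2t .
\end{equation*}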
\subsection{Exercises}
\begin{exercise}
Check that $\C$ is a field.
\end{exercise}
\begin{exercise}
Prove that for $z,w \in \C$, we have
$\sabs{zw} = \sabs{z} \, \sabs{w}$.
\end{exercise}
\begin{exercise}
Finish the proof of \propref{prop:continuityofcomplex}.
\end{exercise}
\begin{exercise}
Prove \propref{prop:cachysercomplex}.
\end{exercise}
\begin{exercise}
Prove \propref{prop:absconvmeansconv}.
\end{exercise}
\begin{samepage}
\begin{exercise}
Given $x +iy$ define the matrix
$\left[ \begin{smallmatrix} x & -y \\ y & x \end{smallmatrix} \right]$.
Prove:
\begin{enumerate}[a)]
\item
The action of this matrix on a vector $(s,t)$ is the same
as the action of multiplying $(x+iy)(s+it)$.
\item
Multiplying two such matrices is the same as multiplying the underlying complex
numbers and then finding the corresponding matrix for the product.
In other words, the field $\C$ can be identified with a subset of the 2-by-2 matrices.
\item
The matrix
$\left[ \begin{smallmatrix} x & -y \\ y & x \end{smallmatrix} \right]$
has eigenvalues $x+iy$ and $x-iy$. Recall that $\lambda$ is
an eigenvalue of a matrix $A$ if $A-\lambda I$ (a complex matrix in our case)
is not invertible, that is,
if it has linearly dependent rows: one row is a (complex) multiple
of the other.
\end{enumerate}
\end{exercise}
\end{samepage}
\begin{exercise}
Prove the Bolzano--Weierstrass theorem for complex sequences.
Suppose $\{ z_n \}_{n=1}^\infty$ is a bounded sequence of complex numbers, that
is, there exists an $M$ such that $\sabs{z_n} \leq M$ for all $n$. Prove
that there exists a subsequence $\{ z_{n_k} \}_{k=1}^\infty$ that converges to some $z
\in \C$.
\end{exercise}
\begin{exercise}
\leavevmode
\begin{enumerate}[a)]
\item
Prove that there is no simple mean value theorem for complex-valued
functions: Find a differentiable function $f \colon [0,1] \to \C$ such that
$f(0) = f(1) = 0$, but $f'(t) \not= 0$ for all $t \in [0,1]$.
\item
However, there is a weaker form of the mean value theorem as there is for vector-valued
functions. Prove: If $f \colon [a,b] \to \C$ is continuous and differentiable in
$(a,b)$, and for some $M$, $\sabs{f'(x)} \leq M$ for all $x \in (a,b)$, then
$\sabs{f(b)-f(a)} \leq M \sabs{b-a}$.
\end{enumerate}
\end{exercise}
\begin{exercise}
Prove that there is no simple mean value theorem for integrals
for complex-valued
functions: Find a continuous function $f \colon [0,1] \to \C$ such that
$\int_0^1 f = 0$ but $f(t) \not= 0$ for all $t \in [0,1]$.
\end{exercise}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\sectionnewpage
\section{Swapping limits}
\label{sec:swaplim}
%mbxINTROSUBSECTION
\sectionnotes{2 lectures}
\subsection{Continuity}
Let us get back to swapping limits and expand on
\volIref{\chapterref*{vI-fs:chapter} of volume I}{\chapterref{fs:chapter}}.
Let $\{ f_n \}_{n=1}^\infty$ be a sequence
of functions $f_n \colon X \to Y$ for a set $X$ and a metric space $Y$.
Let $f \colon X \to Y$ be a
function and for every $x \in X$, suppose
\begin{equation*}
f(x) = \lim_{n\to \infty} f_n(x) .
\end{equation*}
We say the sequence $\{ f_n \}_{n=1}^\infty$
\emph{\myindex{converges pointwise}}\index{pointwise convergence} to $f$.
For $Y=\C$, a series of functions
\emph{converges pointwise}\index{converges pointwise!complex
series}\index{pointwise convergence!complex series} to $f$ if
for every $x \in X$, we have
\begin{equation*}
f(x) = \lim_{n\to \infty} \sum_{k=1}^n f_k(x) =
\sum_{k=1}^\infty f_k(x) .
\end{equation*}
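For example, taking $f_k(x) \coloneqq x^k$ on $X = (-1,1)$, the geometric
series converges pointwise:
\begin{equation*}
\sum_{k=1}^\infty x^k = \frac{x}{1-x} \qquad \text{for every } x \in (-1,1) .
\end{equation*}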
\medskip
The question is:
If $f_n$ are all continuous, is $f$ continuous? Differentiable?
Integrable? What are the derivatives or integrals of $f$?
For example, for continuity of the pointwise limit of a sequence
of functions
$\{ f_n \}_{n=1}^\infty$, we are asking if
\begin{equation*}
\lim_{x\to x_0} \lim_{n\to\infty} f_n(x)
\overset{?}{=}
\lim_{n\to\infty} \lim_{x\to x_0} f_n(x) .
\end{equation*}
A priori, we do not even know if both sides exist, let alone if they equal each other.
\begin{example}
The functions $f_n \colon \R \to \R$,
\begin{equation*}
f_n(x) \coloneqq \frac{1}{1+nx^2},
\end{equation*}
are continuous and converge pointwise to the discontinuous function
\begin{equation*}
f(x) \coloneqq
\begin{cases}
1 & \text{if } x=0, \\
0 & \text{else.}
\end{cases}
\end{equation*}
\end{example}
So pointwise convergence is not enough to preserve continuity (nor even
boundedness). For that, we need uniform convergence.
Let $f_n \colon X \to Y$ be functions. Then
$\{f_n\}_{n=1}^\infty$ \emph{\myindex{converges uniformly}}\index{uniform convergence}
to $f$ if
for every $\epsilon > 0$, there exists an $M$ such that
for all $n \geq M$ and all $x \in X$, we have
\begin{equation*}
d\bigl(f_n(x),f(x)\bigr) < \epsilon .
\end{equation*}
A series $\sum_{n=1}^\infty f_n$ of complex-valued functions converges uniformly to $f$ if the sequence of
partial sums converges uniformly to $f$, that is, if for every $\epsilon > 0$,
there exists an $M$ such that
for all $n \geq M$ and all $x \in X$,
\begin{equation*}
\abs{\left(\sum_{k=1}^n f_k(x)\right)-f(x)} < \epsilon .
\end{equation*}
The simplest property preserved by uniform convergence is
boundedness. We leave the proof of the following proposition as an
exercise. It is almost identical to the proof for real-valued functions.
\begin{prop} \label{prop:uniformconvbounded}
Let $X$ be a set and $(Y,d)$ a metric space.
If $f_n \colon X \to Y$ are bounded functions and converge uniformly to $f
\colon X \to Y$, then $f$ is bounded.
\end{prop}
If $X$ is a set and $(Y,d)$ is a metric space, then a sequence $f_n \colon X
\to Y$ is \emph{\myindex{uniformly Cauchy}} if for every $\epsilon > 0$, there is an $M$ such that
for all $n, m \geq M$ and all $x \in X$, we have
\begin{equation*}
d\bigl(f_n(x),f_m(x)\bigr) < \epsilon .
\end{equation*}
The notion is the same as for real-valued functions.
The proof of the following proposition is
again essentially the same as in that setting and
is left as an exercise.
\begin{prop} \label{prop:unifcauchymetric}
Let $X$ be a set, $(Y,d)$ be a metric space, and
$f_n \colon X \to Y$ be functions.
If $\{ f_n \}_{n=1}^\infty$ converges uniformly,
then $\{f_n\}_{n=1}^\infty$ is uniformly Cauchy. Conversely, if
$\{f_n\}_{n=1}^\infty$ is uniformly Cauchy and $(Y,d)$ is Cauchy-complete,
then $\{f_n\}_{n=1}^\infty$ converges uniformly.
\end{prop}
For $f \colon X \to \C$, we write\glsadd{not:uniformnorm}
\begin{equation*}
\snorm{f}_X \coloneqq \sup_{x \in X} \sabs{f(x)} .
\end{equation*}
We call $\snorm{\cdot}_X$
the \emph{\myindex{supremum norm}} or \emph{\myindex{uniform norm}},
and the subscript denotes the set over which the supremum is taken.
Then a sequence of functions
$f_n \colon X \to \C$ converges uniformly to $f \colon X \to \C$ if and only if
\begin{equation*}
\lim_{n\to \infty} \snorm{f_n-f}_X = 0 .
\end{equation*}
The supremum norm satisfies the triangle inequality: For every $x \in X$,
\begin{equation*}
\sabs{f(x)+g(x)} \leq
\sabs{f(x)}+\sabs{g(x)} \leq
\snorm{f}_X+\snorm{g}_X .
\end{equation*}
Take a supremum on the left to get
\begin{equation*}
\snorm{f+g}_X \leq
\snorm{f}_X+\snorm{g}_X .
\end{equation*}
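For instance, for the functions $f_n(x) = \frac{1}{1+nx^2}$ from the example
above and their pointwise limit $f$, we have, for every $n$,
\begin{equation*}
\snorm{f_n - f}_{\R} = \sup_{x \not= 0} \frac{1}{1+nx^2} = 1 ,
\end{equation*}
so the convergence is not uniform.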
For a compact metric space $X$,
the uniform norm is a norm on the vector space $C(X,\C)$;
we leave the verification as an exercise.
While we will not need it, $C(X,\C)$ is in fact a complex
vector space, that is, in the definition of a vector space we can replace
$\R$ with $\C$.
Convergence in the metric space $C(X,\C)$ is
uniform convergence.
We will study a couple of types of series of functions, and
a useful test for uniform convergence of a series is the
so-called \emph{\myindex{Weierstrass $M$-test}}.
\begin{thm}[\myindex{Weierstrass $M$-test}] \label{thm:weiermtest}
Let $X$ be a set.
Suppose $f_n \colon X \to \C$ are functions and $M_n > 0$ numbers such
that
\begin{equation*}
\sabs{f_n(x)}\leq M_n \quad \text{for all } x \in X,
\qquad \text{and} \qquad
\sum_{n=1}^\infty M_n
\quad \text{converges}.
\end{equation*}
Then
\begin{equation*}
\sum_{n=1}^\infty f_n(x)
\quad \text{converges uniformly}.
\end{equation*}
\end{thm}
Another way to state the theorem is to say that if
$\sum_{n=1}^\infty \snorm{f_n}_X$ converges, then $\sum_{n=1}^\infty f_n$ converges uniformly.
Note that the converse of this theorem is not true.
Applying the theorem to $\sum_{n=1}^\infty \sabs{f_n(x)}$, we
see that this series also converges uniformly. So the series converges both absolutely
and uniformly.
\begin{proof}
Suppose $\sum_{n=1}^\infty M_n$ converges. Given $\epsilon > 0$,
we have that the partial sums of $\sum_{n=1}^\infty M_n$ are Cauchy so
there is an $N$ such that for all $m, n \geq N$ with $m \geq n$, we have
\begin{equation*}
\sum_{k=n+1}^m M_k < \epsilon .
\end{equation*}
We estimate a Cauchy difference of the partial
sums of the functions
\begin{equation*}
\abs{\sum_{k=n+1}^m f_k(x)} \leq
\sum_{k=n+1}^m \sabs{f_k(x)} \leq
\sum_{k=n+1}^m M_k < \epsilon .
\end{equation*}
The series converges by \propref{prop:cachysercomplex}.
The convergence is uniform, as $N$ does not depend on $x$.
Indeed, for all $n \geq N$,
\begin{equation*}
\abs{\sum_{k=1}^\infty f_k(x) - \sum_{k=1}^n f_k(x)} \leq
\abs{\sum_{k=n+1}^\infty f_k(x)} \leq \epsilon .
\qedhere
\end{equation*}
\end{proof}
\begin{example} \label{example:sinnsqfourier}
The series
\begin{equation*}
\sum_{n=1}^\infty \frac{\sin(nx)}{n^2}
\end{equation*}
converges uniformly on $\R$. See \figureref{fig:fouriersern2}.
This series is a Fourier series, and
we will see more of these in a later section. Proof:
The series converges uniformly because
$\sum_{n=1}^\infty \frac{1}{n^2}$
converges and
\begin{equation*}
\abs{\frac{\sin(nx)}{n^2}} \leq
\frac{1}{n^2} .
\end{equation*}
\end{example}
\begin{myfigureht}
\includegraphics{figures/fouriersern2}
\caption{Plot of
$\sum_{n=1}^\infty \frac{\sin(n x)}{n^2}$ including
the first 8 partial sums in various shades of gray.\label{fig:fouriersern2}}
\end{myfigureht}
\begin{example}
The series
\begin{equation*}
\sum_{n=0}^\infty \frac{x^n}{n!}
\end{equation*}
converges uniformly on every bounded interval.
This series is a power series that we will study shortly.
Proof: Take the interval $[-r,r] \subset \R$ (every bounded interval
is contained in some $[-r,r]$).
The series $\sum_{n=0}^\infty \frac{r^n}{n!}$ converges by the ratio test,
so $\sum_{n=0}^\infty \frac{x^n}{n!}$ converges uniformly on $[-r,r]$ as
\begin{equation*}
\abs{\frac{x^n}{n!} } \leq
\frac{r^n}{n!} .
\end{equation*}
\end{example}
Now we would love to say something about the limit. For example, is it
continuous?
\begin{prop} \label{prop:uniformswitch}
Let $(X,d_X)$ and $(Y,d_Y)$ be metric spaces, and suppose $(Y,d_Y)$ is
Cauchy-complete.
Suppose $f_n \colon X \to Y$ converge uniformly to
$f \colon X \to Y$.
Let $\{ x_k \}_{k=1}^\infty$ be a sequence in $X$ and $x \coloneqq \lim_{k \to \infty} x_k$. Suppose
\begin{equation*}
a_n \coloneqq \lim_{k \to \infty} f_n(x_k)
\end{equation*}
exists for all $n$. Then
$\{a_n\}_{n=1}^\infty$ converges and
\begin{equation*}
\lim_{k \to \infty} f(x_k) = \lim_{n\to\infty} a_n .
\end{equation*}
\end{prop}
In other words,
\begin{equation*}
\lim_{k \to \infty} \lim_{n\to\infty} f_n(x_k) =
\lim_{n \to \infty} \lim_{k\to\infty} f_n(x_k) .
\end{equation*}
\begin{proof}
First we show that $\{ a_n \}_{n=1}^\infty$ converges. As
$\{ f_n \}_{n=1}^\infty$ converges uniformly it is uniformly Cauchy.
Let $\epsilon > 0$ be given. There is
an $M$ such that for all $m,n \geq M$, we have
\begin{equation*}
d_Y\bigl(f_n(x_k),f_m(x_k)\bigr) < \epsilon \qquad \text{for all } k .
\end{equation*}
Note that
$d_Y(a_n,a_m) \leq
d_Y\bigl(a_n,f_n(x_k)\bigr) +
d_Y\bigl(f_n(x_k),f_m(x_k)\bigr) +
d_Y\bigl(f_m(x_k),a_m\bigr)$ and take the limit as $k \to \infty$ to find
\begin{equation*}
d_Y(a_n,a_m) \leq \epsilon .
\end{equation*}
Hence $\{a_n\}_{n=1}^\infty$ is Cauchy and converges since $Y$ is complete. Write
$a \coloneqq \lim_{n \to \infty} a_n$.
Find a $k \in \N$ such that
\begin{equation*}
d_Y\bigl(f_k(p),f(p)\bigr) < \nicefrac{\epsilon}{3}
\end{equation*}
for all $p \in X$. Assume $k$ is large enough
so that
\begin{equation*}
d_Y(a_k,a) < \nicefrac{\epsilon}{3} .
\end{equation*}
Find an $N \in \N$ such that for $m \geq N$,
\begin{equation*}
d_Y\bigl(f_k(x_m),a_k\bigr) < \nicefrac{\epsilon}{3} .
\end{equation*}
Then for
$m \geq N$,
\begin{equation*}
d_Y\bigl(f(x_m),a\bigr)
\leq
d_Y\bigl(f(x_m),f_k(x_m)\bigr)
+
d_Y\bigl(f_k(x_m),a_k\bigr)
+
d_Y\bigl(a_k,a\bigr)
<
\nicefrac{\epsilon}{3} +
\nicefrac{\epsilon}{3} +
\nicefrac{\epsilon}{3} = \epsilon . \qedhere
\end{equation*}
\end{proof}
We obtain an immediate corollary about continuity.
If the $f_n$ are all continuous, then $a_n = f_n(x)$,
so $\{ a_n \}_{n=1}^\infty$ automatically converges to
$f(x)$, and we do not require completeness of $Y$.
\begin{cor} \label{cor:metricuniformcontinuous}
Let $X$ and $Y$ be metric spaces.
If $f_n \colon X \to Y$ are continuous functions
such that
$\{ f_n \}_{n=1}^\infty$ converges uniformly to $f \colon X \to Y$,
then $f$ is continuous.
\end{cor}
The converse is not true. Just because the limit is continuous does not mean
that the convergence is uniform. For example, the functions
$f_n \colon (0,1) \to \R$ defined by $f_n(x) \coloneqq x^n$ converge to
the zero function, but not uniformly. However, if we add extra conditions
on the sequence, we can obtain a partial converse such as Dini's theorem,
\volIref{see \exerciseref*{vI-exercise:dinisthm} from volume I}{see \exerciseref{exercise:dinisthm}}.
In \exerciseref{exercise:CXCnormedspace} the reader is asked to prove
that for a compact $X$, $C(X,\C)$ is a normed vector space
with the uniform norm, and hence a metric space. We have just shown that
$C(X,\C)$ is Cauchy-complete: \propref{prop:unifcauchymetric} says that a Cauchy
sequence in $C(X,\C)$ converges uniformly to some function,
and \corref{cor:metricuniformcontinuous} shows that the limit is
continuous and hence in $C(X,\C)$.
\begin{cor}
Let $(X,d)$ be a compact metric space.
Then $C(X,\C)$ is a Cauchy-complete metric space.
\end{cor}
\begin{example}
By \exampleref{example:sinnsqfourier}
the Fourier series
\begin{equation*}
\sum_{n=1}^\infty \frac{\sin(nx)}{n^2}
\end{equation*}
converges uniformly and hence is continuous by \corref{cor:metricuniformcontinuous} (as is visible
in \figureref{fig:fouriersern2}).
\end{example}
\subsection{Integration}
\begin{prop} \label{prop:complexlimitswapintegral}
Suppose $f_n \colon [a,b] \to \C$
are Riemann integrable and suppose that $\{ f_n \}_{n=1}^\infty$ converges
uniformly to $f \colon [a,b] \to \C$. Then $f$ is Riemann integrable
and
\begin{equation*}
\int_a^b f = \lim_{n\to \infty} \int_a^b f_n .
\end{equation*}
\end{prop}
Since the integral of a complex-valued function is just the integral of
the real and imaginary parts separately,
the proof follows directly from the results of
\volIref{\chapterref*{vI-fs:chapter} of volume~I}{\chapterref{fs:chapter}}.
We leave the details as an exercise.
\begin{cor}
\pagebreak[2]
Suppose $f_n \colon [a,b] \to \C$
are Riemann integrable and suppose that
\begin{equation*}
\sum_{n=1}^\infty f_n(x)
\end{equation*}
converges uniformly. Then the series is Riemann integrable on $[a,b]$
and
\begin{equation*}
\int_a^b \sum_{n=1}^\infty f_n(x) \,dx
=
\sum_{n=1}^\infty \int_a^b f_n(x) \,dx .
\end{equation*}
\end{cor}
\begin{example}
Let us show how to integrate a Fourier series.
\begin{equation*}
\int_{0}^x \sum_{n=1}^\infty \frac{\cos(nt)}{n^2} \,dt
=
\sum_{n=1}^\infty \int_{0}^x \frac{\cos(nt)}{n^2}\,dt
=
\sum_{n=1}^\infty \frac{\sin(nx)}{n^3} .
\end{equation*}
The swapping of integral and sum is possible because of uniform convergence,
which we have proved before using the Weierstrass $M$-test
(\thmref{thm:weiermtest}).
\end{example}
We remark that we can swap integrals and limits under far less stringent hypotheses,
but for that we would need a stronger integral than the Riemann integral,
e.g.\ the Lebesgue integral.
\subsection{Differentiation}
Recall that a complex-valued function
$f \colon [a,b] \to \C$, where $f(x) = u(x)+i\,v(x)$,
is differentiable if $u$ and $v$ are differentiable
and the derivative is
\begin{equation*}
f'(x) = u'(x)+i\,v'(x) .
\end{equation*}
The proof of the following theorem is to apply the corresponding theorem for
real functions to $u$ and $v$, and is left as an exercise.
\begin{thm} \label{thm:dersconvergecomplex}
Let $I \subset \R$ be a bounded interval and let
$f_n \colon I \to \C$ be continuously differentiable functions.
Suppose $\{ f_n' \}_{n=1}^\infty$ converges uniformly to $g \colon I \to \C$,
and suppose $\{ f_n(c) \}_{n=1}^\infty$ is a
convergent sequence for some $c \in I$. Then $\{ f_n \}_{n=1}^\infty$ converges uniformly to
a continuously differentiable function $f \colon I \to \C$, and $f' = g$.
\end{thm}
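For example, consider the series $\sum_{n=1}^\infty \frac{\sin(nx)}{n^3}$.
Its partial sums are continuously differentiable, they converge at $x = 0$,
and their derivatives are the partial sums of
$\sum_{n=1}^\infty \frac{\cos(nx)}{n^2}$, which converge uniformly by the
$M$-test (\thmref{thm:weiermtest}). Applying the theorem on every bounded
interval,
\begin{equation*}
\frac{d}{dx} \left( \sum_{n=1}^\infty \frac{\sin(nx)}{n^3} \right)
=
\sum_{n=1}^\infty \frac{\cos(nx)}{n^2} .
\end{equation*}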
Uniform limits of the functions themselves are not enough, and can make
matters even worse. In \sectionref{sec:stoneweier} we will prove that
continuous functions are uniform limits of polynomials, yet as the following
example demonstrates, a continuous function need not be differentiable
anywhere.
\begin{example}
There exist continuous nowhere differentiable functions.
Such functions are often called
\emph{Weierstrass functions}\index{Weierstrass function},
although this
particular one, essentially due to
Takagi\footnote{\href{https://en.wikipedia.org/wiki/Teiji_Takagi}{Teiji
Takagi} (1875--1960) was a Japanese mathematician.}, is a different example than what Weierstrass gave.
Define
\begin{equation*}
\varphi(x) \coloneqq \sabs{x} \qquad \text{for } x \in [-1,1] .
\end{equation*}
Extend $\varphi$ to all of $\R$ by making it
2-periodic:
Decree that
$\varphi(x) = \varphi(x+2)$. The function $\varphi \colon \R \to \R$
is continuous, in fact, $\sabs{\varphi(x)-\varphi(y)} \leq \sabs{x-y}$ (why?).
See \figureref{fig:triangwave}.
\begin{myfigureht}
\includegraphics{figures/triangwave}
\caption{The 2-periodic function $\varphi$.\label{fig:triangwave}}
\end{myfigureht}
As $\sum_{n=0}^\infty {\left(\frac{3}{4}\right)}^n$ converges and $\sabs{\varphi(x)} \leq
1$ for all $x$, by the $M$-test
(\thmref{thm:weiermtest}),
\begin{equation*}
f(x) \coloneqq \sum_{n=0}^\infty
{\left(\frac{3}{4}\right)}^n \varphi(4^n x)
\end{equation*}
converges uniformly and hence is continuous.
See \figureref{fig:nowherediff}.
\begin{myfigureht}
\includegraphics{figures/nowherediff}
\caption{Plot of the nowhere differentiable function $f$.\label{fig:nowherediff}}
\end{myfigureht}
We claim $f \colon
\R \to \R$ is nowhere differentiable.
Fix $x$, and we will show $f$ is not differentiable at $x$.
Define
\begin{equation*}
\delta_m \coloneqq \pm \frac{1}{2} 4^{-m} ,
\avoidbreak
\end{equation*}
where the sign is chosen so that there is no integer
between $4^m x$ and $4^m(x+\delta_m) = 4^m x \pm \frac{1}{2}$.
We want to look at the difference quotient
\begin{equation*}
\frac{f(x+\delta_m)-f(x)}{\delta_m}
=
\sum_{n=0}^\infty
{\left(\frac{3}{4}\right)}^n
\frac{\varphi\bigl(4^n(x+\delta_m)\bigr)-\varphi(4^nx)}{\delta_m} .
\end{equation*}
Fix $m$ for a moment. Consider the expression inside the series:
\begin{equation*}
\gamma_{n} \coloneqq
\frac{\varphi\bigl(4^n(x+\delta_m)\bigr)-\varphi(4^nx)}{\delta_m} .
\end{equation*}
If $n > m$, then $4^n\delta_m$ is an even integer. As $\varphi$
is 2-periodic we get that $\gamma_n = 0$.
As there is no integer between
$4^m(x+\delta_m) = 4^m x\pm\nicefrac{1}{2}$ and $4^m x$, then on this interval
$\varphi(t) = \pm t + \ell$ for some integer $\ell$.
In particular,
$\abs{\varphi\bigl(4^m(x+ \delta_m)\bigr)-\varphi(4^mx)} =
\abs{4^mx\pm\nicefrac{1}{2}-4^mx} = \nicefrac{1}{2}$. Therefore,
\begin{equation*}
\sabs{\gamma_m} =
\abs{
\frac{\varphi\bigl(4^m(x+\delta_m)\bigr)-\varphi(4^mx)}{\pm (\nicefrac{1}{2}) 4^{-m}}
}
= 4^m .
\end{equation*}
Similarly, suppose $n < m$. Since $\sabs{\varphi(s) -\varphi(t)} \leq
\sabs{s-t}$,
\begin{equation*}
\sabs{\gamma_n} =
\abs{\frac{\varphi\bigl(4^nx\pm(\nicefrac{1}{2})4^{n-m}\bigr)-\varphi(4^nx)}{\pm
(\nicefrac{1}{2}) 4^{-m}}}
\leq
\abs{\frac{\pm(\nicefrac{1}{2})4^{n-m}}{\pm (\nicefrac{1}{2}) 4^{-m}}} = 4^n
.
\end{equation*}
And so
\begin{equation*}
\begin{split}
\abs{
\frac{f(x+\delta_m)-f(x)}{\delta_m}
}
%& =
%\abs{
%\sum_{n=0}^\infty
%{\left(\frac{3}{4}\right)}^n
%\frac{\varphi\bigl(4^n(x+\delta_m)\bigr)-\varphi(4^nx)}{\delta_m}
%}
=
\abs{
\sum_{n=0}^\infty
{\left(\frac{3}{4}\right)}^n
\gamma_n
}
& =
\abs{
\sum_{n=0}^m
{\left(\frac{3}{4}\right)}^n
\gamma_n
}
\\
& \geq
\abs{
{\left(\frac{3}{4}\right)}^m
\gamma_m}
-
\abs{
\sum_{n=0}^{m-1}
{\left(\frac{3}{4}\right)}^n
\gamma_n
}
\\
& \geq
3^m
-
\sum_{n=0}^{m-1}
3^n
=
3^m
-
\frac{3^{m}-1}{3-1}
=
\frac{3^m +1}{2} .
\end{split}
\end{equation*}
As $m \to \infty$, we have
$\delta_m \to 0$, but $\frac{3^m+1}{2}$
goes to infinity. So $f$ cannot be differentiable at~$x$.
\end{example}
\subsection{Exercises}
\begin{exercise}
Prove \propref{prop:uniformconvbounded}.
\end{exercise}
\begin{exercise}
Prove \propref{prop:unifcauchymetric}.
\end{exercise}
\begin{exercise} \label{exercise:CXCnormedspace}
Suppose $(X,d)$ is a compact metric space.
Prove that the uniform norm $\snorm{\cdot}_X$ is a norm on the vector space of
continuous complex-valued functions $C(X,\C)$.
\end{exercise}
\begin{exercise}
\pagebreak[2]
\leavevmode
\begin{enumerate}[a)]
\item
Prove that
$f_n(x) \coloneqq 2^{-n} \sin(2^n x)$
converge uniformly to zero, but there exists a dense set $D \subset \R$
such that $\lim_{n\to\infty} f_n'(x) = 1$ for all $x \in D$.
\item
Prove that
$\sum_{n=1}^\infty 2^{-n} \sin(2^n x)$
converges uniformly to a continuous function,
and there exists a dense set $D \subset \R$
where the derivatives of the partial sums do not converge.
\end{enumerate}
\end{exercise}
\begin{exercise}
Prove that $\snorm{f}_{C^1} \coloneqq \snorm{f}_{[a,b]}+\snorm{f'}_{[a,b]}$
is a norm on the vector space of
continuously differentiable complex-valued functions $C^1\bigl([a,b],\C\bigr)$.
\end{exercise}
\begin{exercise}
Prove \thmref{thm:dersconvergecomplex}.
\end{exercise}
\begin{exercise}
Prove \propref{prop:complexlimitswapintegral} by reducing to the real
result.
\end{exercise}
\begin{exercise}
Work through the following counterexample to the converse of
the Weierstrass $M$-test (\thmref{thm:weiermtest}). Define
$f_n \colon [0,1] \to \R$ by
\begin{equation*}
f_n(x) \coloneqq
\begin{cases}
\frac{1}{n} & \text{if } \frac{1}{n+1} < x < \frac{1}{n},\\
0 & \text{else.}
\end{cases}
\end{equation*}
Prove that $\sum_{n=1}^\infty f_n$ converges uniformly, but
$\sum_{n=1}^\infty \snorm{f_n}_{[0,1]}$
does not converge.
\end{exercise}