Question
If $\{x_n\}$ and $\{y_n\}$ are bounded, then $\{x_n + y_n\}$ is bounded. If $\{x_n\}$ is bounded and $\{y_n\}$ is unbounded, then $\{x_n + y_n\}$ is unbounded. How do you prove these claims using the definition of bounded/unbounded sequences?
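A minimal sketch of both claims directly from the definition, assuming "bounded" means there is some $M > 0$ with $|x_n| \le M$ for all $n$ (the bound names $M_x$, $M_y$, $M$ below are our own labels):

% Sketch of both claims from the definition of boundedness.
% The bound names M_x, M_y, M are illustrative labels.
\begin{proof}[Sketch]
(1) Suppose $|x_n| \le M_x$ and $|y_n| \le M_y$ for all $n$. By the triangle inequality,
\[ |x_n + y_n| \le |x_n| + |y_n| \le M_x + M_y \quad \text{for all } n, \]
so $\{x_n + y_n\}$ is bounded by $M_x + M_y$.

(2) Suppose $|x_n| \le M_x$ for all $n$, and suppose toward a contradiction that $\{x_n + y_n\}$ is bounded, say $|x_n + y_n| \le M$ for all $n$. Then
\[ |y_n| = |(x_n + y_n) - x_n| \le |x_n + y_n| + |x_n| \le M + M_x \quad \text{for all } n, \]
so $\{y_n\}$ would be bounded, contradicting the assumption that it is unbounded. Hence $\{x_n + y_n\}$ is unbounded.
\end{proof}

Note that part (2) is really part (1) in disguise: if $\{x_n + y_n\}$ were bounded, then $\{y_n\} = \{(x_n + y_n) + (-x_n)\}$ would be a sum of two bounded sequences, hence bounded.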
Explanation / Answer
A function $f : D \to D$ is uniformly continuous if and only if for every pair of sequences $\{x_n\}_n$ and $\{y_n\}_n$ from $D$ such that $\{x_n - y_n\}_n$ converges to $0$, the sequence $\{f(x_n) - f(y_n)\}_n$ converges to $0$.

Proof. We first prove that if $f$ is uniformly continuous and $\{x_n - y_n\}_n$ converges to $0$, then $\{f(x_n) - f(y_n)\}_n$ converges to $0$. We prove this by contradiction. Suppose that $\{f(x_n) - f(y_n)\}_n$ does not converge to $0$, i.e., for some $\varepsilon > 0$ and for any $N \in \mathbb{N}$ there is $n > N$ such that $|f(x_n) - f(y_n) - 0| = |f(x_n) - f(y_n)| \ge \varepsilon$. We choose an arbitrary $N_1 \in \mathbb{N}$ and find $n_1 > N_1$ such that $|f(x_{n_1}) - f(y_{n_1})| \ge \varepsilon$. We then choose an arbitrary $N_2 > n_1$ and find $n_2 > N_2$ such that $|f(x_{n_2}) - f(y_{n_2})| \ge \varepsilon$. Continuing this procedure, we obtain a sequence $\{n_i\} \to \infty$ such that $|f(x_{n_i}) - f(y_{n_i})| \ge \varepsilon$ for all $n_i$. On the other hand, since $\{x_n - y_n\}_n$ converges to $0$, the subsequence $\{x_{n_i} - y_{n_i}\}_i$ converges to $0$ as well. In particular, since $f$ is uniformly continuous, for the $\varepsilon$ chosen above there is $\delta > 0$ such that $|f(s) - f(t)| < \varepsilon$ whenever $|s - t| < \delta$. For all sufficiently large $i$ we have $|x_{n_i} - y_{n_i}| < \delta$, hence $|f(x_{n_i}) - f(y_{n_i})| < \varepsilon$, contradicting the choice of the subsequence. Therefore $\{f(x_n) - f(y_n)\}_n$ converges to $0$.

Next we prove the converse. Suppose the sequence condition holds but $f$ is not uniformly continuous, i.e., there is $\varepsilon > 0$ such that for any $\delta > 0$ we can find $s, t \in D$ with $|s - t| < \delta$ but $|f(s) - f(t)| \ge \varepsilon$. We choose a decreasing sequence $\delta_i$ such that $\delta_i \to 0$. It follows that for each $\delta_i$ we can find $x_i, y_i$ such that $|x_i - y_i| < \delta_i$ but $|f(x_i) - f(y_i)| \ge \varepsilon$. Due to the choice of the $\delta_i$ we know that $x_i - y_i$ converges to $0$, hence $f(x_i) - f(y_i)$ must converge to $0$ as well, contradicting the inequality $|f(x_i) - f(y_i)| \ge \varepsilon$ just derived. Hence $f$ must be uniformly continuous. $\square$
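As a sanity check on the criterion (this example is ours, not part of the original answer), take $f(x) = x^2$ on $D = \mathbb{R}$:

% The criterion detects the failure of uniform continuity of x^2 on R.
\[ x_n = n + \tfrac{1}{n}, \qquad y_n = n, \qquad x_n - y_n = \tfrac{1}{n} \to 0, \]
\[ f(x_n) - f(y_n) = \Bigl(n + \tfrac{1}{n}\Bigr)^2 - n^2 = 2 + \tfrac{1}{n^2} \to 2 \ne 0, \]

so by the theorem, $x \mapsto x^2$ is not uniformly continuous on $\mathbb{R}$.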