1. ## Taylor series MCQ

Consider this interesting Taylor series problem (more than one option may be correct):

Which of the following are true statements?

A. Any function f(x) is equal to its Taylor series for all real x.

B. There exist functions f(x) which are equal to their Taylor series for all real x.

C. There exist functions f(x) which are equal to their Taylor series for some, but not all, real numbers x.

D. A function f(x) can never equal its Taylor series. The Taylor series is only ever an approximation to the function.

A is definitely wrong.
D should be wrong, since the Taylor series does converge to f(x) in suitable situations.

C is OK...
B ---> maybe something like the expansion of sin x or e^x, etc.

Please check my work and give suggestions.

2. Well, the Taylor series for sin x doesn't converge for all x.

But otherwise I think you're right.

3. Well, the Taylor series for sin x doesn't converge for all x.
Then I am seriously mistaken... Please tell me why it doesn't converge?

4. Originally Posted by neelakash
Then I am seriously mistaken... Please tell me why it doesn't converge?
Ack, you're right. I thought it only converged for |x| < pi/2 or something, but I just looked it up and I am wrong.

Apologies!

5. Plus the Taylor series of e^x converges, and you can write e^x in terms of sine and cosine.

Damn. Think Ben!

6. Originally Posted by neelakash
Then I am seriously mistaken... Please tell me why it doesn't converge?
It does.

http://dotancohen.com/eng/taylor-sine.php

The polynomial here has degree 21.

7. You have the right answers, B and C. D is just as wrong as A. If the Taylor series for a smooth function f(x) converges at some point x, it converges to f(x). The Taylor series is not an approximation for a smooth function. It is an alternate representation of the function over the series' interval of convergence. A truncated Taylor series, on the other hand, is an approximation--although not always a good approximation. For example, truncating the Taylor series for atan(x) yields a very lousy approximation.
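To make the "lousy approximation" remark concrete, here is a quick numerical sketch of my own (not from the thread), using the standard series atan(x) = Σ (-1)^k x^(2k+1)/(2k+1). Well inside the interval of convergence a truncated series is excellent; at the edge x = 1 the same number of terms is still visibly off:

```python
import math

def atan_taylor(x, n_terms):
    """Partial sum of the Taylor series atan(x) = sum_k (-1)^k x^(2k+1) / (2k+1)."""
    return sum((-1) ** k * x ** (2 * k + 1) / (2 * k + 1) for k in range(n_terms))

# Well inside the interval of convergence, 50 terms are already excellent.
err_inside = abs(atan_taylor(0.5, 50) - math.atan(0.5))

# At the edge x = 1, the same 50 terms still miss pi/4 by about 1/200,
# since the alternating-series error is on the order of the first omitted term.
err_edge = abs(atan_taylor(1.0, 50) - math.atan(1.0))
```

The slow convergence at x = 1 is why nobody computes pi from the raw atan(1) series.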

8. The Taylor expansion of f(x) when f(x) is a polynomial of finite degree is equal to the function f(x) itself for all x.
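A quick numerical check of this, as a Python sketch of my own (the cubic p(x) = x^3 - 2x + 1 is just an arbitrary example): the Taylor series terminates because all derivatives beyond the third vanish, so the series reproduces the polynomial exactly everywhere, not just near the expansion point.

```python
import math

def p(x):
    """An arbitrary cubic: p(x) = x^3 - 2x + 1."""
    return x ** 3 - 2 * x + 1

# Derivatives of p at 0: p(0) = 1, p'(0) = -2, p''(0) = 0, p'''(0) = 6,
# and every higher derivative vanishes, so the Taylor series terminates.
derivs_at_0 = [1.0, -2.0, 0.0, 6.0]

def taylor_p(x):
    """Taylor series of p about 0: sum_k p^(k)(0) * x^k / k!."""
    return sum(d * x ** k / math.factorial(k) for k, d in enumerate(derivs_at_0))
```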

9. That's true, but it's a rather trivial result. Like a finite-degree polynomial, the Taylor expansions for functions like $f(x)=e^x$ and $f(x)=\sin x$ converge for all x. Unlike a finite-degree polynomial, these Taylor series do not terminate after a finite number of terms. On the other hand, the Taylor expansions for functions like $f(x)=\frac 1 {1-x}$ and $f(x)=\frac 1 {x^2+1}$ about $x=0$ (or any other number) have a finite interval of convergence. Why is this?

The function $f(x)=\frac 1 {1-x}$ has a pole at $x=1$. The Taylor expansion can't go beyond this pole. What about negative values of x? The Taylor expansion fails to converge for $x<-1$ as well. The existence of a pole on one side of the expansion point appears to limit the range on the other side of the expansion point.
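This is easy to see numerically with the geometric series 1/(1-x) = Σ x^k (a rough Python sketch of my own):

```python
def geom_taylor(x, n_terms):
    """Partial sum of the Taylor series of 1/(1 - x) about 0: sum_k x^k."""
    return sum(x ** k for k in range(n_terms))

# Inside (-1, 1) the partial sums converge to the function.
err_inside = abs(geom_taylor(0.5, 60) - 1 / (1 - 0.5))

# At x = -1.5 the function value is a harmless 1/(1 - (-1.5)) = 0.4,
# yet the partial sums blow up: the pole at x = +1 caps the radius of
# convergence at 1 on BOTH sides of the expansion point.
blowup = abs(geom_taylor(-1.5, 60))
```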

The function $f(x)=\frac 1 {x^2+1}$ is well-defined for all real x, yet the Taylor expansion about x=0 only converges for $x\in(-1,1)$. While this function is well-defined for all real x, it does have poles at $x=\pm\,i$. The distance between the expansion point (the origin) and the poles is equal to the distance from the expansion point to the ends of the convergence interval. The existence of complex poles appears to affect the behavior of the real Taylor expansion! This is indeed the case. The Taylor expansion of a complex function about some point has a radius of convergence defined by the distance from the expansion point to the nearest pole.
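The radius-of-convergence claim is also easy to test numerically (my own Python sketch; the series 1/(1+x^2) = Σ (-1)^k x^(2k) follows from substituting -x^2 into the geometric series):

```python
def runge_taylor(x, n_terms):
    """Partial sum of the Taylor series of 1/(1 + x^2) about 0: sum_k (-1)^k x^(2k)."""
    return sum((-1) ** k * x ** (2 * k) for k in range(n_terms))

# The poles at x = +/- i lie at distance 1 from the origin, so the
# real series converges exactly for |x| < 1 ...
err_inside = abs(runge_taylor(0.5, 40) - 1 / (1 + 0.5 ** 2))

# ... and diverges for |x| > 1, even though f(1.5) = 1/3.25 is perfectly finite.
blowup = abs(runge_taylor(1.5, 40))
```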

The Taylor series for finite-degree polynomials and for functions like $f(x)=e^x$ and $f(x)=\sin x$ converge for all x (real or complex) because these functions have no poles: they are analytic at every point in the complex plane.
