4. Taylor series

Taylor’s Theorem

Taylor’s Theorem is a powerful generalization of Lagrange’s Theorem (the Mean Value Theorem), which we studied in the previous term. It allows us to approximate a function by polynomials.

Let k\in\mathbb{N} and let f\colon\mathbb{R}\to\mathbb{R} have all derivatives up to the (k+1)-th. Let x_0\in\mathbb{R}. Then, for any x\in\mathbb{R}:

    \[f(x)=f(x_0)+\frac{f'(x_0)}{1!}(x-x_0)+\frac{f''(x_0)}{2!}(x-x_0)^2+\frac{f'''(x_0)}{3!}(x-x_0)^3+\]

    \[+\ldots+\frac{f^{(k)}(x_0)}{k!}(x-x_0)^k+\frac{f^{(k+1)}(\theta)}{(k+1)!}(x-x_0)^{k+1},\]

where \theta is some number between x and x_0.

The sum f(x_0)+\frac{f'(x_0)}{1!}(x-x_0)+\frac{f''(x_0)}{2!}(x-x_0)^2+\frac{f'''(x_0)}{3!}(x-x_0)^3+ \ldots+\frac{f^{(k)}(x_0)}{k!}(x-x_0)^k is called the k-th order Taylor polynomial of f at the point x_0, and R_k(x)=\frac{f^{(k+1)}(\theta)}{(k+1)!}(x-x_0)^{k+1} is the k-th remainder term.
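
The Taylor polynomial can also be built mechanically from this definition. Below is a minimal sketch (not part of the original notes), assuming the sympy library is available; the function name taylor_polynomial is mine:

    import sympy as sp

    x = sp.symbols('x')

    def taylor_polynomial(f, x0, k):
        # sum_{n=0}^{k} f^{(n)}(x0)/n! * (x - x0)^n, directly from the definition
        return sum(sp.diff(f, x, n).subs(x, x0) / sp.factorial(n) * (x - x0)**n
                   for n in range(k + 1))

    # Example: the second order Taylor polynomial of cos x at 0 is 1 - x^2/2
    print(sp.expand(taylor_polynomial(sp.cos(x), 0, 2)))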

Calculating approximations

Taylor’s Theorem makes it possible to compute approximate values of a function at a given point, e.g. \cos 0.01. The second order Taylor polynomial of \cos x at 0 is 1+0+\frac{-1}{2!}x^2, which for x=0.01 equals 1-\frac{0.0001}{2}=0.99995, and this equals \cos 0.01 up to the remainder term, so the error is not greater than \left|\frac{\sin\theta}{3!}(0.01)^3\right|\leq\frac{0.000001}{6}.
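
This computation can be checked numerically. The following sketch uses only Python’s standard math module (the variable names are purely illustrative):

    import math

    x = 0.01
    approx = 1 - x**2 / 2        # second order Taylor polynomial of cos at 0
    bound = x**3 / 6             # |sin(theta)/3!| * x^3 <= x^3/6

    print(approx)                              # approximately 0.99995
    print(abs(math.cos(x) - approx) <= bound)  # True: the actual error respects the bound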

Taylor series

Therefore, if R_k(x)\to 0 on a given interval, then we can write f as the sum of a series there. E.g., for e^x the remainder tends to zero on the whole real line, so the Taylor series of e^x at the point x_0=1 (which is easy to compute, since (e^x)^{(k)}=e^x) is:

    \[e^x=e+\frac{e}{1!}(x-1)+\frac{e}{2!}(x-1)^2+\ldots=\sum_{n=0}^{\infty}\frac{e(x-1)^n}{n!}.\]

Such a series, but with x_0=0, is called a Maclaurin series.
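
To see this convergence numerically, one can sum the first terms of the series above and compare the result with math.exp. The sketch below (again only an illustration, using the standard library) does exactly that; with x_0=0 the same sum would give the partial sums of the Maclaurin series instead:

    import math

    # Partial sums of the Taylor series of e^x at x0 = 1:
    # sum of the first `terms` terms of  sum_{n>=0} e^{x0} (x - x0)^n / n!
    def exp_taylor_partial_sum(x, x0=1.0, terms=20):
        return sum(math.exp(x0) * (x - x0)**n / math.factorial(n)
                   for n in range(terms))

    print(exp_taylor_partial_sum(2.5))   # approximately e^2.5
    print(math.exp(2.5))                 # about 12.1825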