# Frobenius method

Let us consider the linear homogeneous differential equation

$\displaystyle\sum_{\nu=0}^{n}k_{\nu}(x)\,y^{(n-\nu)}(x)\;=\;0$

of order $n$. If the coefficient functions $k_{\nu}(x)$ are continuous and the coefficient $k_{0}(x)$ of the highest order derivative does not vanish on a certain interval (resp. a domain in $\mathbb{C}$), then all solutions $y(x)$ are continuous on this interval (resp. domain). If all coefficients have continuous derivatives up to a certain order, the same holds for the solutions.

If, instead, $k_{0}(x)$ vanishes at a point $x_{0}$, this point is in general a singular point. After dividing the differential equation by $k_{0}(x)$, it takes the form

$\displaystyle y^{(n)}(x)+\sum_{\nu=1}^{n}c_{\nu}(x)\,y^{(n-\nu)}(x)\;=\;0,$

in which some of the new coefficients $c_{\nu}(x)$ are discontinuous at the singular point. However, if the discontinuity is mild enough that the products

$(x-x_{0})c_{1}(x),\quad(x-x_{0})^{2}c_{2}(x),\quad\ldots,\quad(x-x_{0})^{n}c_{n}(x)$

are continuous, and even analytic at $x_{0}$, then $x_{0}$ is called a regular singular point of the differential equation.
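As a concrete illustration (an example added here, not part of the surrounding text), Bessel's equation of order $\alpha$ has the origin as a regular singular point:

```latex
% Bessel's equation, written as  y'' + c_1(x)y' + c_2(x)y = 0:
x^2y''+xy'+(x^2-\alpha^2)y \;=\; 0,
\qquad
c_1(x)=\frac{1}{x},\quad c_2(x)=\frac{x^2-\alpha^2}{x^2}.
% The coefficients c_1, c_2 are discontinuous at x_0 = 0, but the products
x\,c_1(x) \;=\; 1,
\qquad
x^2c_2(x) \;=\; x^2-\alpha^2
% are analytic at the origin, so x_0 = 0 is a regular singular point.
```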

We introduce the so-called Frobenius method for finding solutions in a neighbourhood of the regular singular point $x_{0}$, confining ourselves to the case of a second-order differential equation. When we use the quotient forms

$(x-x_{0})c_{1}(x)\;:=\;\frac{p(x)}{r(x)},\quad(x-x_{0})^{2}c_{2}(x)\;:=\;\frac{q(x)}{r(x)},$

where $r(x)$, $p(x)$ and $q(x)$ are analytic in a neighbourhood of $x_{0}$ and $r(x)\neq 0$, our differential equation reads

$\displaystyle(x-x_{0})^{2}r(x)y''(x)+(x-x_{0})p(x)y'(x)+q(x)y(x)\;=\;0.$ (1)

Since the simple change of variable $x\!-\!x_{0}\mapsto x$ moves the singular point to the origin, we may assume this from the start. Thus we can study the equation

$\displaystyle x^{2}r(x)y''(x)+xp(x)y'(x)+q(x)y(x)\;=\;0,$ (2)

where the coefficients have convergent power series expansions

$\displaystyle r(x)\;=\;\sum_{n=0}^{\infty}r_{n}x^{n},\quad p(x)\;=\;\sum_{n=0}^{\infty}p_{n}x^{n},\quad q(x)\;=\;\sum_{n=0}^{\infty}q_{n}x^{n}$ (3)

and

$r_{0}\;\neq\;0.$

In the Frobenius method one examines whether equation (2) admits a series solution of the form

$\displaystyle y(x)\;=\;x^{s}\sum_{n=0}^{\infty}a_{n}x^{n}\;=\;a_{0}x^{s}+a_{1}x^{s+1}+a_{2}x^{s+2}+\ldots,$ (4)

where $s$ is a constant and $a_{0}\neq 0$.

Substituting (3) and (4) into the differential equation (2) converts the left-hand side to

$\displaystyle[r_{0}s(s\!-\!1)\!+\!p_{0}s\!+\!q_{0}]a_{0}x^{s}$

$\displaystyle{}+[[r_{0}(s\!+\!1)s\!+\!p_{0}(s\!+\!1)\!+\!q_{0}]a_{1}\!+\![r_{1}s(s\!-\!1)\!+\!p_{1}s\!+\!q_{1}]a_{0}]x^{s+1}$

$\displaystyle{}+[[r_{0}(s\!+\!2)(s\!+\!1)\!+\!p_{0}(s\!+\!2)\!+\!q_{0}]a_{2}\!+\![r_{1}(s\!+\!1)s\!+\!p_{1}(s\!+\!1)\!+\!q_{1}]a_{1}\!+\![r_{2}s(s\!-\!1)\!+\!p_{2}s\!+\!q_{2}]a_{0}]x^{s+2}+\ldots$

The equation becomes clearer with the notation $f_{\nu}(s):=r_{\nu}s(s\!-\!1)+p_{\nu}s+q_{\nu}$:

$\displaystyle f_{0}(s)a_{0}x^{s}+[f_{0}(s\!+\!1)a_{1}+f_{1}(s)a_{0}]x^{s+1}+[f_{0}(s\!+\!2)a_{2}+f_{1}(s\!+\!1)a_{1}+f_{2}(s)a_{0}]x^{s+2}+\ldots\;=\;0.$ (5)

Thus the condition for (4) to satisfy the differential equation is the infinite system of equations

$\displaystyle\begin{cases}f_{0}(s)a_{0}\;=\;0\\ f_{0}(s\!+\!1)a_{1}+f_{1}(s)a_{0}\;=\;0\\ f_{0}(s\!+\!2)a_{2}+f_{1}(s\!+\!1)a_{1}+f_{2}(s)a_{0}\;=\;0\\ \cdots\qquad\cdots\qquad\cdots\end{cases}$ (6)

In the first place, since $a_{0}\neq 0$, the indicial equation

$\displaystyle f_{0}(s)\;\equiv\;r_{0}s^{2}+(p_{0}-r_{0})s+q_{0}\;=\;0$ (7)

must be satisfied. Because $r_{0}\neq 0$, this quadratic equation determines two values of $s$, which in special cases may coincide.
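The indicial roots are just the roots of the quadratic (7). A minimal sketch (the function name and the Bessel example are illustrative, not from the original text):

```python
import cmath

def indicial_roots(r0, p0, q0):
    """Roots of the indicial equation (7):
    f_0(s) = r0*s*(s-1) + p0*s + q0 = r0*s^2 + (p0 - r0)*s + q0 = 0."""
    a, b, c = r0, p0 - r0, q0
    disc = cmath.sqrt(b * b - 4 * a * c)  # complex sqrt handles all cases
    return ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

# Bessel's equation of order 2: r(x) = 1, p(x) = 1, q(x) = x^2 - 4,
# so r0 = 1, p0 = 1, q0 = -4 and the indicial equation is s^2 - 4 = 0.
print(indicial_roots(1, 1, -4))  # roots s = +2 and s = -2
```

Here the roots differ by the integer 4, so (as discussed below) only the root $s=2$ is guaranteed to yield a series of the form (4).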

The first of the equations (6) leaves $a_{0}\,(\neq 0)$ arbitrary. The subsequent equations, linear in the $a_{n}$, allow one to solve successively for the constants $a_{1},\,a_{2},\,\ldots$, provided that the leading coefficients $f_{0}(s\!+\!1)$, $f_{0}(s\!+\!2),$ $\ldots$ do not vanish; this is evidently the case when the roots of the indicial equation do not differ by an integer (e.g. when the roots are complex conjugates, or when $s$ is the root with the greater real part). In any case, for at least one of the roots of the indicial equation one obtains definite values of the coefficients $a_{n}$ in the series (4). It is not hard to show that this series then converges in a neighbourhood of the origin.
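The successive solution of the system (6) can be sketched as follows (a hypothetical helper, assuming exact rational arithmetic and the normalization $a_0=1$; the Bessel example is an added illustration):

```python
from fractions import Fraction

def frobenius_coeffs(r, p, q, s, N):
    """Coefficients a_0, ..., a_N of the Frobenius series y = x^s * sum a_n x^n
    for x^2 r(x) y'' + x p(x) y' + q(x) y = 0, given the power-series
    coefficient lists r, p, q (index = power of x) and an indicial root s.
    Implements f_nu(t) = r_nu*t*(t-1) + p_nu*t + q_nu and the recurrence (6)."""
    def f(nu, t):
        rn = r[nu] if nu < len(r) else 0
        pn = p[nu] if nu < len(p) else 0
        qn = q[nu] if nu < len(q) else 0
        return rn * t * (t - 1) + pn * t + qn

    a = [Fraction(1)]  # a_0 is arbitrary; normalize a_0 = 1
    for m in range(1, N + 1):
        # m-th equation of (6):  f_0(s+m) a_m + sum_{nu=1}^m f_nu(s+m-nu) a_{m-nu} = 0
        acc = sum(f(nu, s + m - nu) * a[m - nu] for nu in range(1, m + 1))
        a.append(-acc / f(0, s + m))  # requires f_0(s+m) != 0
    return a

# Bessel's equation of order 0: r = 1, p = 1, q(x) = x^2; indicial root s = 0.
print(frobenius_coeffs([1], [1], [0, 0, 1], 0, 6))
# [1, 0, -1/4, 0, 1/64, 0, -1/2304] -- the Maclaurin coefficients of J_0(x)
```

The guard `f_0(s+m) != 0` is exactly the condition discussed above: it can only fail when the indicial roots differ by the integer $m$.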

To obtain the complete solution of the differential equation (2) it suffices to have only one solution $y_{1}(x)$ of the form (4), because another solution $y_{2}(x)$, linearly independent of $y_{1}(x)$, can be obtained by mere integrations; in the cases $s_{1}\!-\!s_{2}\in\mathbb{Z}$ it is then possible that $y_{2}(x)$ has no expansion of the form (4).
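The "mere integrations" can be made explicit via the standard reduction-of-order formula (added here for completeness, not part of the original text). Writing (2) in the normal form $y''+P(x)y'+Q(x)y=0$, one has:

```latex
P(x) \;=\; \frac{p(x)}{x\,r(x)},
\qquad
y_2(x) \;=\; y_1(x)\int\frac{1}{y_1(x)^2}
      \exp\!\Bigl(-\!\int P(x)\,dx\Bigr)dx.
```

The integrand's expansion about the origin may contain a term in $1/x$, which upon integration produces a logarithm; this is how $y_2(x)$ can fail to have an expansion of the form (4) when $s_1-s_2\in\mathbb{Z}$.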


## Mathematics Subject Classification

15A06, 34A05
