
The Existence and Uniqueness Theorem for Solutions to ODEs

A slecture by Yijia Wen

2.0 Abstract

Before starting this tutorial, you should be able to:

· Find an explicit solution for $ \frac{dy}{dt}=f(t) $. This is the same thing as finding the integral of $ f(t) $ with respect to $ t $.

· Know the difference between a general solution and a solution satisfying given initial conditions.

· Check whether a given function is a solution to an ODE (see the sketch after this list).

· Distinguish between ODEs and PDEs, and know the usual notations.

· Know the basic concepts of ODEs (order, linearity, homogeneity, etc.).
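
As a quick refresher on the first and third items, here is a minimal SymPy sketch; the right-hand side $ f(t)=2t $ and the candidate function are illustrative choices, not taken from the slecture.

 import sympy as sp
 
 t = sp.symbols('t')
 y = sp.Function('y')
 
 # Solving dy/dt = 2t explicitly amounts to integrating f(t) = 2t with respect to t.
 ode = sp.Eq(y(t).diff(t), 2*t)
 print(sp.dsolve(ode, y(t)))                       # general solution: y(t) = C1 + t**2
 print(sp.dsolve(ode, y(t), ics={y(0): 1}))        # solution with y(0) = 1: y(t) = t**2 + 1
 
 # Checking that a given function, here y(t) = t**2 + 1, is a solution to the ODE:
 candidate = t**2 + 1
 print(sp.simplify(candidate.diff(t) - 2*t) == 0)  # True, so it solves dy/dt = 2t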




2.1 Concept

As in the first example from 1.1, suppose again that we have a linear equation $ ax+b=0 $ with respect to $ x $ (a short sketch of the case analysis follows the list below).

· When $ a=0 $ and $ b\neq 0 $, there is no solution.

· When $ a\neq 0 $, there is exactly one solution, $ x=-\frac{b}{a} $.

· When $ a=b=0 $, every $ x $ satisfies the equation, so there are infinitely many solutions.
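
A minimal code sketch of this case analysis (the function name classify_linear is just an illustrative choice):

 def classify_linear(a, b):
     # Classify the solutions of a*x + b = 0 with respect to x.
     if a == 0 and b != 0:
         return "no solution"
     if a != 0:
         return "unique solution x = " + str(-b / a)
     return "infinitely many solutions (every x works)"
 
 print(classify_linear(0, 3))   # no solution
 print(classify_linear(2, -4))  # unique solution x = 2.0
 print(classify_linear(0, 0))   # infinitely many solutions (every x works)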

Similarly, an ODE may have no solution, a unique solution, or infinitely many solutions.
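
The following example is not taken from the slecture, but it is a standard illustration of non-uniqueness: the initial value problem $ \frac{dy}{dt}=3y^{\frac{2}{3}} $, $ y(0)=0 $ is satisfied by both $ y=0 $ and $ y=t^3 $. A minimal SymPy check (restricted to $ t>0 $ so the fractional power stays real):

 import sympy as sp
 
 t = sp.symbols('t', positive=True)
 
 # Both y = 0 and y = t**3 satisfy dy/dt = 3*y**(2/3), and both take the value 0 at t = 0.
 for candidate in (sp.Integer(0), t**3):
     residual = candidate.diff(t) - 3*candidate**sp.Rational(2, 3)
     print(candidate, sp.simplify(residual) == 0)  # both print True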
