
The Existence and Uniqueness Theorem for Solutions to ODEs

A slecture by Yijia Wen

2.0 Abstract

Before starting this tutorial, you should be able to:

· Find an explicit solution for $ \frac{dy}{dt}=f(t) $. This is the same as finding the integral of $ f(t) $ with respect to $ t $.

· Know the difference between a general solution and a solution satisfying given initial conditions (a short worked example follows this list).

· Check whether a given function is a solution to an ODE.

· Distinguish ODEs from PDEs, and know the usual notation.

· Know the basic concepts of ODEs (order, linearity, homogeneity, etc.).
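As a quick refresher (this worked example is added for illustration), consider $ \frac{dy}{dt}=2t $. Integrating both sides with respect to $ t $ gives the general solution $ y(t)=t^2+C $, where $ C $ is an arbitrary constant. Imposing an initial condition such as $ y(0)=3 $ singles out the particular solution $ y(t)=t^2+3 $.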




2.1 Concept

Recall the first example from Section 1.1: suppose we have a linear equation $ ax+b=0 $ with respect to $ x $.

· When $ a=0 $ and $ b\neq 0 $, there is no solution.

· When $ a≠0 $, there is one solution $ x=-\frac{b}{a} $.

· When $ a=b=0 $, there are infinitely many solutions to this linear equation.

Similarly, an ODE may have no solution, a unique solution, or infinitely many solutions. The existence theorem is used to check whether a solution to an ODE exists at all, while the uniqueness theorem is used to check whether that solution is the only one or whether there are infinitely many.
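As a quick illustration of why uniqueness can fail (a standard example, not taken from Section 1.1), consider the initial value problem $ \frac{dy}{dt}=3y^{2/3} $ with $ y(0)=0 $. Both $ y(t)=0 $ and $ y(t)=t^3 $ satisfy the equation and the initial condition, so this problem has more than one solution; the uniqueness theorem gives conditions that rule out this kind of behaviour.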


2.2 Existence Theorem

First we define continuity for a function of two variables. Similar to what we learned in Calculus 1, but now for a 2-variable function: $ f(t,y) $ is continuous at the point $ (t_0, y_0) $ if it is defined there, its limit as $ (t,y)\rightarrow(t_0,y_0) $ exists, and that limit equals $ f(t_0,y_0) $.
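For example (added for illustration), the function $ f(t,y)=\frac{y}{t} $ is continuous at every point $ (t_0,y_0) $ with $ t_0\neq 0 $, but it is not continuous at any point with $ t_0=0 $, since it is not even defined there.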
