Any two nonzero integers $ a $ and $ b $ have a greatest common divisor, which can be expressed as the smallest positive linear combination of $ a $ and $ b $. Moreover, an integer is a linear combination of $ a $ and $ b $ if and only if it is a multiple of their greatest common divisor.
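For example, with $ a=12 $ and $ b=18 $ the greatest common divisor is $ 6 $, and it is the smallest positive linear combination:

$ 6 = (-1)\cdot 12 + 1\cdot 18 $.

Every combination $ 12x+18y $ is a multiple of $ 6 $, so no smaller positive value can occur.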
The greatest common divisor of two numbers can be computed by a procedure known as the Euclidean algorithm. First, note that if $ a \neq 0 $ and $ b|a $, then gcd$ (a,b)=|b| $. The next observation provides the basis for the Euclidean algorithm: if $ a=bq+r $, then gcd$ (a,b) $ = gcd$ (b,r) $. Thus, given integers $ a>b>0 $, the Euclidean algorithm applies the division algorithm repeatedly to obtain
$ a=bq_1+r_1 $, with $ 0\leq r_1<b $
$ b=r_1q_2+r_2 $, with $ 0\leq r_2<r_1 $
etc.
Since $ b>r_1>r_2>\ldots\geq 0 $, the remainders form a strictly decreasing sequence of nonnegative integers, so after a finite number of steps we obtain a remainder $ r_{n+1}=0 $. Thus gcd$ (a,b) $ = gcd$ (b,r_1) $ = $ \ldots $ = gcd$ (r_{n-1},r_n) $ = $ r_n $, the last nonzero remainder.
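The procedure translates directly into a short computation. The following Python sketch (an illustration, not part of the original article; the function name is chosen here for convenience) repeatedly replaces the pair $ (a,b) $ by $ (b,r) $, where $ a=bq+r $, until the remainder is zero, and returns the last nonzero remainder.

<pre>
def gcd(a, b):
    """Greatest common divisor of a and b via the Euclidean algorithm."""
    a, b = abs(a), abs(b)
    while b != 0:
        # Replace (a, b) by (b, r), where a = b*q + r and 0 <= r < b.
        a, b = b, a % b
    return a

# Example: 252 = 198*1 + 54, 198 = 54*3 + 36, 54 = 36*1 + 18, 36 = 18*2 + 0,
# so gcd(252, 198) = 18, the last nonzero remainder.
print(gcd(252, 198))  # 18
</pre>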