
Any two nonzero integers $a$ and $b$ have a greatest common divisor, which can be expressed as the smallest positive integer of the form $ax+by$ with $x$ and $y$ integers, that is, the smallest positive linear combination of $a$ and $b$. Moreover, an integer is a linear combination of $a$ and $b$ if and only if it is a multiple of their greatest common divisor.
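For example, $\gcd(12,18)=6$, and $6=12\cdot(-1)+18\cdot 1$ is the smallest positive value of $12x+18y$; the integers expressible as $12x+18y$ are exactly the multiples of $6$.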

The greatest common divisor of two numbers can be computed using a procedure known as the Euclidean algorithm. First, note that if $a\neq 0$ and $b\mid a$, then $\gcd(a,b)=|b|$. The next observation provides the basis for the Euclidean algorithm: if $a=bq+r$, then $\gcd(a,b)=\gcd(b,r)$, since any common divisor of $b$ and $r$ also divides $a=bq+r$, and any common divisor of $a$ and $b$ also divides $r=a-bq$. Thus, given integers $a>b>0$, the Euclidean algorithm applies the division algorithm repeatedly to obtain

$a=bq_{1}+r_{1}$, with $0\leq r_{1}<b$
$b=r_{1}q_{2}+r_{2}$, with $0\leq r_{2}<r_{1}$
$r_{1}=r_{2}q_{3}+r_{3}$, with $0\leq r_{3}<r_{2}$
and so on.

Since $r_{1}>r_{2}>\cdots\geq 0$ is a strictly decreasing sequence of nonnegative integers, after a finite number of steps we obtain a remainder $r_{n+1}=0$. Thus, $\gcd(a,b)=\gcd(b,r_{1})=\gcd(r_{1},r_{2})=\cdots=\gcd(r_{n-1},r_{n})=r_{n}$.
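For concreteness, here is a brief Python sketch of the procedure (not part of the original page); the function name gcd and the sample values 252 and 198 are chosen only for illustration.

def gcd(a, b):
    """Greatest common divisor computed by the Euclidean algorithm."""
    a, b = abs(a), abs(b)
    while b != 0:
        # Replace (a, b) by (b, r), where a = b*q + r and 0 <= r < b.
        a, b = b, a % b
    return a

# Example trace for gcd(252, 198):
#   252 = 198*1 + 54
#   198 =  54*3 + 36
#    54 =  36*1 + 18
#    36 =  18*2 +  0   so gcd(252, 198) = 18
print(gcd(252, 198))  # prints 18

Note that the sketch also handles $b>a$, since the first pass through the loop simply swaps the two arguments.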
