Any two nonzero integers $ a $ and $ b $ have a greatest common divisor, which can be expressed as the smallest positive linear combination of $ a $ and $ b $. Moreover, an integer is a linear combination of $ a $ and $ b $ if and only if it is a multiple of their greatest common divisor.
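The claim that $\gcd(a,b)$ is the smallest positive linear combination can be made concrete with the extended Euclidean algorithm, which produces the coefficients of such a combination. Here is a minimal Python sketch (the function name `extended_gcd` and the sample values are illustrative, not from the text):

```python
def extended_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) and g == a*x + b*y."""
    # Maintain invariants: a == orig_a*x0 + orig_b*y0 and
    #                      b == orig_a*x1 + orig_b*y1.
    x0, y0, x1, y1 = 1, 0, 0, 1
    while b != 0:
        q, r = divmod(a, b)       # a = b*q + r, the division algorithm
        a, b = b, r               # gcd(a, b) = gcd(b, r)
        x0, x1 = x1, x0 - q * x1  # update coefficients in lockstep
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0

g, x, y = extended_gcd(252, 198)
print(g, x, y)  # 18 = 252*4 + 198*(-5)
```

The returned pair $(x, y)$ witnesses that the gcd is a linear combination of $a$ and $b$; every other linear combination of $a$ and $b$ is a multiple of it.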
The greatest common divisor of two numbers can be computed using a procedure known as the Euclidean algorithm. First, note that if $ a \neq 0 $ and $ b \mid a $, then $ \gcd(a,b) = |b| $. The next observation provides the basis for the Euclidean algorithm: if $ a = bq + r $, then $ \gcd(a,b) = \gcd(b,r) $. Thus, given integers $ a > b > 0 $, the Euclidean algorithm applies the division algorithm repeatedly to obtain
$ a = bq_1 + r_1 $, with $ 0 \leq r_1 < b $,
$ b = r_1 q_2 + r_2 $, with $ 0 \leq r_2 < r_1 $,
$ r_1 = r_2 q_3 + r_3 $, with $ 0 \leq r_3 < r_2 $,
etc.
Since $ r_1 > r_2 > \ldots \geq 0 $, the remainders form a strictly decreasing sequence of nonnegative integers, so after a finite number of steps we obtain a remainder $ r_{n+1} = 0 $. Thus $ \gcd(a,b) = \gcd(b, r_1) = \ldots = \gcd(r_n, 0) = r_n $, the last nonzero remainder.
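The repeated-division procedure above translates directly into a short loop. A Python sketch (the function name `gcd` is illustrative; Python's `%` already yields the nonnegative remainder required by the division algorithm):

```python
def gcd(a, b):
    """Greatest common divisor of two nonzero integers via the
    Euclidean algorithm."""
    a, b = abs(a), abs(b)
    while b != 0:
        # If a = b*q + r, then gcd(a, b) = gcd(b, r);
        # the remainders strictly decrease, so the loop terminates.
        a, b = b, a % b
    return a  # the last nonzero remainder

print(gcd(252, 198))  # prints 18
```

Each pass of the loop performs one line of the display above: the old divisor becomes the new dividend, and the remainder becomes the new divisor, until the remainder $ r_{n+1} = 0 $ is reached.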