To multiply in terms of adding and shifting, you want to decompose one of the numbers into powers of two, like so:
21 * 5 = 10101_2 * 101_2 (Initial step)
= 10101_2 * (1 * 2^2 + 0 * 2^1 + 1 * 2^0)
= 10101_2 * 2^2 + 10101_2 * 2^0
= (10101_2 << 2) + (10101_2 << 0) (Decomposed)
= 10101_2 * 4 + 10101_2 * 1
= 10101_2 * 5
= 21 * 5 (Same as initial expression)
(_2 means base 2)
As you can see, multiplication decomposes into adding and shifting and back again. This is also why multiplication takes longer than bit shifts or adding: schoolbook multiplication is O(n^2) in the number of bits, while shifting and adding are O(n). Real computer systems (as opposed to theoretical ones) operate on a fixed number of bits, so multiplication takes a constant multiple of the time that addition and shifting do. If I recall correctly, modern processors, if pipelined properly, can do multiplication just about as fast as addition, by keeping the ALUs (arithmetic logic units) in the processor well utilized.
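To make the decomposition concrete, here is a minimal sketch of shift-and-add multiplication in C. The function name shift_add_mul and the choice of unsigned operands are mine, just for illustration; the loop walks the bits of one operand exactly as in the derivation above.

```c
#include <stdio.h>

/* Multiply a by b using only shifts and adds: for each set bit of b,
 * add the correspondingly shifted copy of a to the result. */
static unsigned shift_add_mul(unsigned a, unsigned b)
{
    unsigned result = 0;
    while (b != 0) {
        if (b & 1u)      /* low bit of b is set: this power of two contributes */
            result += a; /* add a shifted left by the current bit position */
        a <<= 1;         /* a now holds the multiplicand times the next power of two */
        b >>= 1;         /* move on to the next bit of b */
    }
    return result;
}

int main(void)
{
    printf("%u\n", shift_add_mul(21u, 5u)); /* prints 105, same as 21 * 5 */
    return 0;
}
```

Each loop iteration does one shift, one test, and at most one add, so for n-bit operands you perform O(n) additions of n-bit numbers, which is where the O(n^2) bound on schoolbook multiplication comes from.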