Here, we describe a remarkable algorithm for matrix multiplication discovered by Strassen [62].
Question:
Let A = [ A1  A2 ; A3  A4 ] and B = [ B1  B2 ; B3  B4 ] be block matrices of size n = 2m, where all blocks are of size m × m, and let C = AB = [ C1  C2 ; C3  C4 ] denote their product in the same block form.
(a) Let
D1 = (A1 + A4)(B1 + B4),
D2 = (A1 - A3)(B1 + B2),
D3 = (A2 - A4)(B3 + B4),
D4 = (A1 + A2) B4,
D5 = (A3 + A4) B1,
D6 = A4 (B1 - B3),
D7 = A1 (B2 - B4).
Show that
C1 = D1 + D3 - D4 - D6,
C2 = D4 + D7,
C3 = D5 - D6,
C4 = D1 - D2 - D5 + D7.
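The identities in part (a) are easy to sanity-check numerically. The following sketch (not part of the exercise) assumes NumPy is available; it builds random m × m blocks, forms C = AB directly, and compares its blocks against the seven Strassen products.

    import numpy as np

    rng = np.random.default_rng(0)
    m = 3                                    # block size, so n = 2m
    A1, A2, A3, A4 = (rng.standard_normal((m, m)) for _ in range(4))
    B1, B2, B3, B4 = (rng.standard_normal((m, m)) for _ in range(4))

    # Assemble the full matrices and the exact product C = AB.
    A = np.block([[A1, A2], [A3, A4]])
    B = np.block([[B1, B2], [B3, B4]])
    C = A @ B
    C1, C2, C3, C4 = C[:m, :m], C[:m, m:], C[m:, :m], C[m:, m:]

    # The seven Strassen products from part (a).
    D1 = (A1 + A4) @ (B1 + B4)
    D2 = (A1 - A3) @ (B1 + B2)
    D3 = (A2 - A4) @ (B3 + B4)
    D4 = (A1 + A2) @ B4
    D5 = (A3 + A4) @ B1
    D6 = A4 @ (B1 - B3)
    D7 = A1 @ (B2 - B4)

    # The four recombination identities.
    assert np.allclose(C1, D1 + D3 - D4 - D6)
    assert np.allclose(C2, D4 + D7)
    assert np.allclose(C3, D5 - D6)
    assert np.allclose(C4, D1 - D2 - D5 + D7)
    print("All four block identities hold.")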
(b) How many arithmetic operations are required when A and B are 2 × 2 matrices? How does this compare with the usual method of multiplying 2 × 2 matrices?
(c) In the general case, suppose we use standard matrix multiplication for the matrix products in D1, ..., D7. Prove that, for large n, Strassen's method requires only about 7/8 as many arithmetic operations as the direct algorithm for computing AB.
(d) When A and B have size n × n with n = 2^r, we can recursively apply Strassen's method to multiply the 2^(r-1) × 2^(r-1) blocks of A and B. Prove that the resulting algorithm requires a total of 7^r = n^(log2 7) ≈ n^2.80735 multiplications and 6(7^r - 4^r) additions/subtractions, versus n^3 multiplications and n^3 - n^2 ≈ n^3 additions for the ordinary matrix multiplication algorithm. How much faster is Strassen's method when n = 2^10? 2^25? 2^100?
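The counts in part (d) can be evaluated exactly with integer arithmetic. The short script below (a sketch, not part of the exercise) compares the total operation counts stated above for the three requested sizes.

    def strassen_ops(r: int) -> int:
        # 7**r multiplications plus 6*(7**r - 4**r) additions/subtractions.
        return 7**r + 6 * (7**r - 4**r)

    def classical_ops(r: int) -> int:
        # n**3 multiplications plus n**3 - n**2 additions, with n = 2**r.
        n = 2**r
        return n**3 + (n**3 - n**2)

    for r in (10, 25, 100):
        ratio = classical_ops(r) / strassen_ops(r)
        print(f"n = 2^{r}: ordinary method uses about {ratio:.3g} times as many operations")

Counting multiplications alone, the ratio is n^3 / n^(log2 7) = n^(3 - log2 7) ≈ n^0.19265, which is what makes the asymptotic advantage so large for big n.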
(e) How might you proceed if the size of the matrices does not happen to be a power of 2? Further developments of these ideas can be found in [11, 32].
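For part (e), one natural possibility (a sketch of a single idea, not the only answer) is to zero-pad both matrices up to the next power of two, multiply, and discard the padding: the leading n × n block of the padded product is the desired product. In the sketch below, np.matmul stands in for the recursive method of part (d), and the helper names are ours, not from the text.

    import numpy as np

    def pad_to_power_of_two(M: np.ndarray) -> np.ndarray:
        # Embed an n-by-n matrix in the upper-left corner of a p-by-p zero
        # matrix, where p is the smallest power of two with p >= n.
        n = M.shape[0]
        p = 1 << (n - 1).bit_length()
        P = np.zeros((p, p))
        P[:n, :n] = M
        return P

    def multiply_via_padding(A: np.ndarray, B: np.ndarray) -> np.ndarray:
        n = A.shape[0]
        Cp = pad_to_power_of_two(A) @ pad_to_power_of_two(B)   # recursive Strassen would go here
        return Cp[:n, :n]

    rng = np.random.default_rng(1)
    A, B = rng.standard_normal((5, 5)), rng.standard_normal((5, 5))
    assert np.allclose(multiply_via_padding(A, B), A @ B)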