Rules for Matrix Algebra
MA 2071
1. Matrix multiplication is, in general, not commutative.
That is, in general, AB ≠ BA.
What this means is that one should not change the order of matrices in a product, unlike in ordinary algebra. (See examples from class 3/17.)
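A quick way to see this for yourself is to multiply two small matrices both ways on a computer. The sketch below assumes Python with NumPy (not part of the course materials); the matrices are made-up examples:

import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

print(A @ B)  # [[2 1]
              #  [4 3]]  -- multiplying by B on the right swaps the columns of A
print(B @ A)  # [[3 4]
              #  [1 2]]  -- multiplying by B on the left swaps the rows of A
print(np.array_equal(A @ B, B @ A))  # False: AB != BA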
2. Matrix multiplication is linear. This is a very important concept! It amounts to the following: if c and k are scalars, then

A(cX + kY) = cAX + kAY

In other words, you can distribute products over sums and you can factor constants out.
You have seen this concept in other places:

In Calculus I, derivatives are linear:

d/dx ( c f(x) + k g(x) ) = c df/dx + k dg/dx

In Calculus II, integrals are linear.

Also Fourier Transforms, important to Signal Processing, are linear:

F( c f(t) + k g(t) ) = c F(f(t)) + k F(g(t))

and Laplace Transforms, used in systems analysis, are linear:

L( c f(x) + k g(x) ) = c L(f(x)) + k L(g(x))
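Here is a small numerical check of the linearity rule A(cX + kY) = cAX + kAY. Again this is a sketch assuming NumPy, with A, X, Y, c, k chosen arbitrarily for the demo:

import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
X = np.array([1., -1.])
Y = np.array([2., 5.])
c, k = 3.0, -2.0

left  = A @ (c * X + k * Y)        # A(cX + kY)
right = c * (A @ X) + k * (A @ Y)  # cAX + kAY
print(np.allclose(left, right))    # True: products distribute, constants factor out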
3. Matrix multiplication is a binary operation. This means that at any time, no matter how many matrices are involved in an expression, you are only multiplying or adding two at a time. You have your choice of which two, as long as you do not reverse the order of multiplication. This is called the associative property:

(AB)C = A(BC) = ABC
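The associative property is easy to check numerically (a sketch assuming NumPy; the shapes are arbitrary but chosen so the products are defined):

import numpy as np

rng = np.random.default_rng(0)
A = rng.random((2, 3))
B = rng.random((3, 4))
C = rng.random((4, 2))

# Multiply two at a time, in either grouping -- the order of A, B, C
# is never reversed, and the results agree.
print(np.allclose((A @ B) @ C, A @ (B @ C)))  # True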
4. The identity matrix, In (1s on the diagonal, 0s elsewhere), serves as the multiplicative identity:

AIn = A and InA = A

(so this is an example of a rare time when order does not matter).
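For example (a sketch assuming NumPy, where np.eye(n) builds In):

import numpy as np

A = np.array([[5., 1.],
              [2., 7.]])
I = np.eye(2)  # I_2: 1s on the diagonal, 0s elsewhere

print(np.array_equal(A @ I, A))  # True
print(np.array_equal(I @ A, A))  # True -- here order does not matter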
5. Some, but not all, square matrices have multiplicative inverses. This means there is another matrix B which satisfies

AB = BA = In

In such a case, B is denoted by A^-1. If a matrix has an inverse, it is called nonsingular in the textbook. Lots of matrices do not have inverses, unlike ordinary numbers, where only 0 has no multiplicative inverse. Those that do not are called singular matrices.
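Both cases show up immediately on a computer (a sketch assuming NumPy; the matrices are made-up examples):

import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])  # det(A) = -2, not 0, so A is nonsingular
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True: A_inv plays the role of A^-1

S = np.array([[1., 2.],
              [2., 4.]])  # det(S) = 0, so S is singular
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("S is singular -- it has no inverse")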