Suppose you wish to make all vectors in ℝ² longer or shorter by some factor *s* ≠ 0. You can represent this by a function *f*(**v**) = *s***v**. With a moment's work, we can verify that this is a linear function because of the distributive law. Thus, we can represent *f* by a matrix. To do so, remember that we calculate *f* for each element of a basis. For simplicity, we will use the elementary basis {**x**, **y**}. Then, *f*(**x**) = *s***x** and *f*(**y**) = *s***y**. By using coordinates, we can write this as *f*([1; 0]) = [*s*; 0] and *f*([0; 1]) = [0; *s*]. The matrix representation of *f* then becomes:

[*s* 0; 0 *s*]
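As a quick sketch (not from the original post), we can build this matrix column by column in NumPy, with each column being *f* applied to a basis vector, and check that multiplying by it really does scale a vector:

```python
import numpy as np

s = 3.0  # example scaling factor; any nonzero value works

# Columns are f([1; 0]) = [s; 0] and f([0; 1]) = [0; s].
scale = np.array([[s, 0.0],
                  [0.0, s]])

v = np.array([2.0, -1.0])
print(scale @ v)  # [ 6. -3.], i.e. s * v
```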

Note that if *s* = 1, the function *f* doesn't do anything. Representing *f*(**v**) = **v** as a matrix, we get the very special matrix called the *identity matrix*, written as *I*, 𝟙 or 𝕀:

𝟙 = [1 0; 0 1]

The identity matrix has the property that for any matrix **M**, **M**𝟙 = 𝟙**M** = **M**, much like the number 1 acts.
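A short sketch of this property (my example, with an arbitrary matrix **M**):

```python
import numpy as np

I = np.eye(2)  # the 2x2 identity matrix

# Multiplying by the identity on either side leaves M unchanged,
# just as multiplying a number by 1 does.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.allclose(M @ I, M))  # True
print(np.allclose(I @ M, M))  # True
```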

Of course, there's no requirement that we stretch **x** and **y** by the same amount. The matrix [*a* 0; 0 *b*], for instance, stretches **x** by *a* and **y** by *b*. If one or both of *a* and *b* is negative, then we flip the direction of **x** or **y**, respectively, since -**v** is the vector of the same length as **v** but pointing in the opposite direction.

A more complicated example shows how matrices can "mix up" the different parts of a vector by *rotating* one into the other. Consider, for instance, a rotation of the 2D plane by some angle *θ* (counterclockwise, of course). This is more difficult to write down as a function, and so a picture may be useful:

By referencing this picture, we see that *f*(**x**) = cos *θ* **x** + sin *θ* **y**, while *f*(**y**) = -sin *θ* **x** + cos *θ* **y**. Thus, we can obtain the famous *rotation matrix*:

**R**_θ = [cos *θ* -sin *θ*; sin *θ* cos *θ*]

As a sanity check, note that if *θ* = 0, then **R**_θ = 𝟙, as we would expect for a matrix that "does nothing."
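The sanity check is easy to run numerically; here is a small sketch (my code, not the post's) that also rotates [1; 0] by 90° to confirm it lands on [0; 1]:

```python
import numpy as np

def rotation(theta):
    """Counterclockwise rotation of the plane by angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# theta = 0 gives the identity matrix.
print(np.allclose(rotation(0.0), np.eye(2)))  # True

# Rotating [1; 0] by 90 degrees sends it to [0; 1].
print(np.round(rotation(np.pi / 2) @ np.array([1.0, 0.0]), 10))  # [0. 1.]
```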

One very important note that needs to be made about matrices is that multiplication of matrices is not always (or even often) commutative. To see this, we let the matrix **S** swap the roles of **x** and **y**; that is, **S** = [0 1; 1 0]. Then, consider **A** = **S****R**_θ and **B** = **S**. Since applying **S** twice does nothing (that is, **S**² = 𝟙), we have that **BA** = **S**²**R**_θ = **R**_θ. On the other hand, if we calculate **AB** = **S****R**_θ**S**, we find that **AB** = **R**_-θ:

**S****R**_θ**S** = [0 1; 1 0][cos *θ* -sin *θ*; sin *θ* cos *θ*][0 1; 1 0] = [cos *θ* sin *θ*; -sin *θ* cos *θ*] = **R**_-θ

We conclude that **AB** ≠ **BA** unless sin *θ* = 0, neatly demonstrating that not all the typical rules of multiplication carry over to matrices.
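The whole argument can be checked numerically; here is a sketch (my code, with an arbitrary example angle) verifying both products:

```python
import numpy as np

def rotation(theta):
    """Counterclockwise rotation of the plane by angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

S = np.array([[0.0, 1.0],   # swaps the roles of x and y
              [1.0, 0.0]])

theta = 0.7  # example angle with sin(theta) != 0
A = S @ rotation(theta)
B = S

# BA = S S R_theta = R_theta, but AB = S R_theta S = R_{-theta}.
print(np.allclose(B @ A, rotation(theta)))   # True
print(np.allclose(A @ B, rotation(-theta)))  # True
print(np.allclose(A @ B, B @ A))             # False
```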

I'll leave it here for now, but hopefully seeing a few useful matrices makes them seem less mysterious. Until next time!
