Transformations

A transformation $T$ from a set $X$ (domain) to a set $Y$ (codomain) is a rule that for each argument $x \in X$ assigns a value $y = T(x) \in Y$.
We write $T : X \to Y$ to say that $T$ is a transformation with domain $X$ and codomain $Y$.


Linear transformations

Let $V$, $W$ be vector spaces over the same field $\mathbb{F}$. A transformation $T : V \to W$ is called linear if

  1. $T(u + v) = T(u) + T(v)$ for all $u, v \in V$;
  2. $T(\alpha v) = \alpha T(v)$ for all $v \in V$ and all scalars $\alpha \in \mathbb{F}$.

Properties 1 and 2 together are equivalent to $T(\alpha u + \beta v) = \alpha T(u) + \beta T(v)$.
In other words, $T$ applied to a linear combination of $u$ and $v$ should equal the same linear combination of the transformed versions of $u$ and $v$.
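As a sanity check, the combined property can be tested numerically for a candidate map. A minimal sketch, where the matrix $A$ and the test vectors are arbitrary illustrative choices, not from the text:

```python
import numpy as np

# A candidate transformation: multiplication by a fixed 2x2 matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def T(v):
    return A @ v

u = np.array([1.0, -2.0])
v = np.array([4.0, 0.5])
alpha, beta = 3.0, -1.5

# T(alpha*u + beta*v) should equal alpha*T(u) + beta*T(v).
lhs = T(alpha * u + beta * v)
rhs = alpha * T(u) + beta * T(v)
print(np.allclose(lhs, rhs))  # True
```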

Properties of linear transformations

  • $T(0) = 0$.
  • $\operatorname{Im} T = \{ T(v) : v \in V \}$ is a vector space (subspace of $W$).
  • $\ker T = \{ v \in V : T(v) = 0 \}$ is a subspace of $V$.
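These properties are easy to observe on a concrete example. A small sketch (the matrix below is a hypothetical choice with an obvious kernel, since its second column is twice the first):

```python
import numpy as np

# T(v) = A v, where the columns of A are linearly dependent.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

def T(v):
    return A @ v

# T(0) = 0.
zero = np.zeros(2)
print(np.allclose(T(zero), 0))     # True

# (2, -1) is in the kernel: 2*(first column) - 1*(second column) = 0.
k = np.array([2.0, -1.0])
print(np.allclose(T(k), 0))        # True

# The kernel is closed under scaling: any multiple also maps to 0.
print(np.allclose(T(5.0 * k), 0))  # True
```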

Matrix-vector multiplication as a linear transformation

A linear transformation $T : \mathbb{F}^n \to \mathbb{F}^m$ can be represented as “multiplication” by an $m \times n$ matrix.

Claim: It is sufficient to know how $T$ acts on the standard basis $e_1, e_2, \dots, e_n$ of $\mathbb{F}^n$. Namely, it is sufficient to know the $n$ vectors in $\mathbb{F}^m$,

$$a_k = T(e_k), \qquad k = 1, 2, \dots, n.$$

This is easy to show. Let $x = (x_1, x_2, \dots, x_n)^T \in \mathbb{F}^n$. Then $x = \sum_{k=1}^n x_k e_k$, and

$$T(x) = T\Big(\sum_{k=1}^n x_k e_k\Big) = \sum_{k=1}^n x_k T(e_k) = \sum_{k=1}^n x_k a_k.$$

So, if we join the vectors (columns) $a_1, a_2, \dots, a_n$ together in a matrix $A = [a_1, a_2, \dots, a_n]$, this matrix contains all the information about $T$. We package $a_1, \dots, a_n$ together in this manner because it makes it easy to compute $T(x)$ when we define the product of a matrix and a vector like so:

$$Ax := \sum_{k=1}^n x_k a_k.$$

So, by this column-by-coordinate definition of matrix-by-vector multiplication, we have $T(x) = Ax$, where the columns of $A$ are the standard basis vectors of $\mathbb{F}^n$ transformed by $T$.
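The column-by-coordinate definition can be implemented directly and compared against numpy's built-in product. A sketch with an arbitrary example matrix and vector:

```python
import numpy as np

def mat_vec(A, x):
    """Column-by-coordinate definition: A x = sum_k x_k * (k-th column of A)."""
    m, n = A.shape
    result = np.zeros(m)
    for k in range(n):
        result += x[k] * A[:, k]
    return result

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 4.0]])
x = np.array([3.0, -1.0, 2.0])

print(mat_vec(A, x))                      # [1. 7.]
print(np.allclose(mat_vec(A, x), A @ x))  # True
```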

Things

  • A linear transformation is completely defined by its values on the standard basis of its domain. In fact, the basis need not be standard: one can consider any basis, or even a generating set (every generating set contains a basis, right?).
  • Matrix multiplication did not just “happen” to be a linear transformation; it was defined specifically to be one.
  • It naturally arises from this definition that the number of columns of the matrix must equal the number of entries in the vector.
  • If $T$ is a linear transformation from $\mathbb{F}^n$ to $\mathbb{F}^m$, $T$ is represented by an $m \times n$ matrix.

Note

Often no distinction is made between a linear transformation and its matrix. $Tx$ may mean applying $T$ to $x$ or multiplying $x$ by the matrix of $T$.

Important

We previously claimed that it is sufficient to know how a linear map acts on a basis of its domain to know how it acts on any member of its domain. Taking this further, it is also true that if we merely assign an element from a vector space $W$ to every element of a basis of a vector space $V$, we get a linear map $T : V \to W$.
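This extension from basis values to a full linear map can be sketched concretely: express $x$ in the chosen basis, then combine the assigned values with the same coordinates. The basis and assigned values below are arbitrary illustrative choices:

```python
import numpy as np

# Columns b1, b2 of B form a (non-standard) basis of R^2.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
# Columns w1, w2 of W: freely chosen values in R^3 assigned to b1, b2.
W = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [3.0, 1.0]])

def T(x):
    # Express x in the basis (solve B c = x), then map linearly.
    c = np.linalg.solve(B, x)
    return W @ c

# The resulting T is linear and satisfies T(b_i) = w_i.
print(np.allclose(T(B[:, 0]), W[:, 0]))  # True
print(np.allclose(T(B[:, 1]), W[:, 1]))  # True
```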

Linear transformations as a vector space

If we fix vector spaces $V$ and $W$ and consider the collection of all linear transformations from $V$ to $W$ (denoted by $\mathcal{L}(V, W)$), we can define two operations on $\mathcal{L}(V, W)$, multiplication by a scalar and addition, like so:

$$(\alpha T)(v) = \alpha\, T(v), \qquad (T + S)(v) = T(v) + S(v).$$

It can be easily shown that these operations satisfy the axioms of a vector space. Thus, $\mathcal{L}(V, W)$ is a vector space.
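For maps represented by matrices, these two operations correspond exactly to entrywise matrix addition and scaling. A quick numerical sketch with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.array([[0.0, -1.0],
              [3.0, 2.0]])

def T(v): return A @ v
def S(v): return B @ v

v = np.array([2.0, 5.0])
alpha = 4.0

# (T + S)(v) = T(v) + S(v) corresponds to the matrix A + B.
print(np.allclose(T(v) + S(v), (A + B) @ v))       # True
# (alpha T)(v) = alpha T(v) corresponds to the matrix alpha * A.
print(np.allclose(alpha * T(v), (alpha * A) @ v))  # True
```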

While we usually do not define a product for vector spaces, a useful definition of multiplication does exist for some vector spaces, like this one.

Definition 1.

If $T \in \mathcal{L}(U, V)$ and $S \in \mathcal{L}(V, W)$, then the product $ST \in \mathcal{L}(U, W)$ is defined by

$$(ST)(u) = S(T(u))$$

for $u \in U$.

$ST$ is just the usual composition of two functions.
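For maps given by matrices, composing the functions agrees with multiplying the matrices. A minimal sketch, with arbitrarily chosen matrices representing maps $\mathbb{R}^2 \to \mathbb{R}^3$ and $\mathbb{R}^3 \to \mathbb{R}^2$:

```python
import numpy as np

# T: R^2 -> R^3 and S: R^3 -> R^2, represented by matrices.
A_T = np.array([[1.0, 0.0],
                [2.0, 1.0],
                [0.0, 3.0]])
A_S = np.array([[1.0, 1.0, 0.0],
                [0.0, 2.0, 1.0]])

u = np.array([1.0, -1.0])

# (S T)(u) = S(T(u)) matches multiplication by the product matrix.
print(np.allclose(A_S @ (A_T @ u), (A_S @ A_T) @ u))  # True
```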

When the product is defined, the following properties can be easily proved:

  • Associativity: $(T_1 T_2) T_3 = T_1 (T_2 T_3)$.
  • Identity: $TI = T$ and $IT = T$, where $I$ is the identity transformation on the appropriate space.
  • Distributivity: $T(T_1 + T_2) = T T_1 + T T_2$, $(T_1 + T_2) T = T_1 T + T_2 T$.

Matrix multiplication

Knowing matrix-vector multiplication, a natural way arises to define the product of two matrices: multiply $A$ by each column of $B$, and join the resulting column vectors into a matrix.
If $b_1, b_2, \dots, b_r$ are the columns of $B$, then $Ab_1, Ab_2, \dots, Ab_r$ are the columns of $AB$.
It is clear that in order for the multiplication to be defined, the number of columns in $A$ must equal the number of rows in $B$ (LADW, p. 20 for a better explanation of why this is natural).
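This column-wise definition translates directly into code. A sketch, checked against numpy's built-in product on arbitrary example matrices:

```python
import numpy as np

def mat_mul(A, B):
    """Multiply A by each column of B and join the results into a matrix."""
    columns = [A @ B[:, j] for j in range(B.shape[1])]
    return np.column_stack(columns)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0]])

print(np.allclose(mat_mul(A, B), A @ B))  # True
```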

Matrix multiplication is associative and distributive. One can also take scalar multiples out: $A(\alpha B) = \alpha (AB)$. In general, it is not commutative.
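Non-commutativity is easy to exhibit with a concrete pair of matrices (the two shear matrices below are a standard illustrative choice):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# A @ B and B @ A differ, so matrix multiplication is not commutative.
print(np.allclose(A @ B, B @ A))  # False
```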