Mathematics
creation date: 2018-01-02,
latest update: 2018-12-02
Linear Algebra (Matrix/Vector)
- basis vectors
- think of a vector as an arrow on the map, with its tail at the origin [0,0]
- think of the basis vectors i and j as the unit steps along the x and y coordinates of the map
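A minimal numpy sketch of this idea (the names `i_hat`, `j_hat`, `v` are mine, just for illustration):

```python
import numpy as np

# standard basis vectors of the 2D "map"
i_hat = np.array([1.0, 0.0])
j_hat = np.array([0.0, 1.0])

# any vector is a recipe: how far along i_hat, how far along j_hat
v = 3 * i_hat + 2 * j_hat
print(v)  # [3. 2.] -- the arrow from the origin [0,0] to the point (3, 2)
```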
- linear transformation
- a linear transformation can be thought of as squishing or stretching the n-dimensional space
- it won't be linear if the origin moves or the spacing becomes unequal
- we can put the "new" basis vectors as columns 1, 2, ... of the transformation matrix; multiplying the matrix by a vector then applies the transformation (the result is that linear combination of the columns)
- 2x3 transformation matrix means we are squishing from 3D to 2D coordinates.
- the order of transformation is important -- "rotate then shear" is not the same as "shear then rotate"
- but it is associative -- you can pre-compute the product of several transformations (their composition) into a single matrix, as long as you keep them in the right order (see the sketch below)
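A minimal numpy sketch of the above (the rotation/shear matrices are made-up examples):

```python
import numpy as np

# columns are where i_hat and j_hat land after the transformation
rotate90 = np.array([[0.0, -1.0],
                     [1.0,  0.0]])   # 90-degree counterclockwise rotation
shear    = np.array([[1.0,  1.0],
                     [0.0,  1.0]])   # horizontal shear

v = np.array([1.0, 2.0])
print(rotate90 @ v)                  # [-2. 1.] -- apply one transformation

# order matters: "rotate then shear" != "shear then rotate"
print(shear @ rotate90 @ v)          # rotate first, then shear: [-1. 1.]
print(rotate90 @ shear @ v)          # shear first, then rotate: [-2. 3.]

# but it is associative: pre-compose into one matrix, same answer
composed = shear @ rotate90
print(composed @ v)                  # [-1. 1.]

# a 2x3 matrix squishes 3D input down to 2D output
flatten = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])
print(flatten @ np.array([1.0, 2.0, 3.0]))  # [1. 2.] -- z is dropped
```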
- determinant
- the det is the factor by which the transformation of that matrix scales area (2D) or volume (3D) -- either expanding or squishing
- this is why (later) when det is zero, we might not have an inverse matrix
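A quick numpy check of both cases (the matrices are made-up examples):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])          # scales x by 3 and y by 2
print(np.linalg.det(A))             # 6.0 -- the unit square's area becomes 6

squish = np.array([[1.0, 2.0],
                   [2.0, 4.0]])     # second column is 2x the first
print(np.linalg.det(squish))        # 0.0 -- everything collapses onto a line
```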
- inverse matrix
- think of doing the transformation in reverse -- that is the inverse matrix, and it undoes the original transformation
- when the det is zero, we effectively lose a dimension: many inputs land on the same output, so no inverse transformation exists
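A minimal sketch of both the invertible and the singular case, using numpy:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)            # the "reverse" transformation
v = np.array([3.0, 5.0])
print(A_inv @ (A @ v))              # [3. 5.] -- transform, then undo it

singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])   # det == 0, a dimension is lost
try:
    np.linalg.inv(singular)
except np.linalg.LinAlgError as e:
    print("no inverse:", e)         # numpy refuses: singular matrix
```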
- cross product
- (example in 3D) the cross product between 2 vectors is
- a unit vector orthogonal to both input vectors (which direction it points follows the right-hand rule)
- then we scale this unit vector by the area of the parallelogram spanned by the 2 input vectors (the determinant)
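A small numpy sketch checking both properties (the vectors are made-up examples):

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])

c = np.cross(a, b)
print(c)                    # [0. 0. 1.] -- orthogonal to both a and b
print(c @ a, c @ b)         # 0.0 0.0 -- confirms orthogonality

# its length equals the parallelogram area, i.e. the 2x2 determinant
area = np.linalg.det(np.array([a[:2], b[:2]]))
print(np.linalg.norm(c), area)      # 1.0 1.0
```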
- change of basis
- let A be a vector expressed in basis1, and B a transformation matrix expressed in basis2
- we can construct a change-of-basis matrix Q for basis1 --> basis2
- then the same transformation expressed in basis1 is C = invQ * B * Q
- applying it to A gives Ahat = C * A, still in basis1 coordinates
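A minimal numpy sketch, assuming Q translates basis1 coordinates into basis2 coordinates (Q, B, and A here are made-up examples):

```python
import numpy as np

Q = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # change of basis: basis1 --> basis2
B = np.array([[2.0, 0.0],
              [0.0, 3.0]])          # a transformation written in basis2

# the same transformation, expressed in basis1
C = np.linalg.inv(Q) @ B @ Q

A = np.array([1.0, 2.0])            # a vector in basis1
Ahat = C @ A                        # result, still in basis1 coordinates
print(Ahat)                         # [0. 6.]
```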
- eigenvectors and eigenvalues
- eigenvectors are the special vectors that don't change direction under some transformation Q (they only get scaled)
- the eigenvalue is the scaling factor (lengthen/shorten) associated with each eigenvector
- why do we need this? because we can use the eigenvectors as a "change of basis", which makes repeated transformations much easier to compute (in that basis the transformation is pure scaling -- a diagonal matrix), as in the sketch below
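A minimal numpy sketch of diagonalization (the matrix A is a made-up example; eigenvalue order in numpy's output may vary):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# columns of V are the eigenvectors, w holds the matching eigenvalues
w, V = np.linalg.eig(A)
print(w)                            # the eigenvalues 3 and 2

# change of basis into the eigenvector basis diagonalizes A
D = np.linalg.inv(V) @ A @ V
print(np.round(D, 6))               # diagonal matrix -- pure scaling

# repeated transformation is now cheap: A^10 = V * D^10 * invV
A10 = V @ np.diag(w ** 10) @ np.linalg.inv(V)
print(np.allclose(A10, np.linalg.matrix_power(A, 10)))  # True
```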