LinearAlgebra


Linear Algebra


Vector Space

Let u, v, and w be vectors in V and let k and l be scalars. We say V is a vector space if the following 10 axioms are satisfied:

  1. u+v is in V
  2. u+v = v+u
  3. u+(v+w) = (u+v)+w
  4. There is a zero vector, denoted 0, such that u+0=u for all u
  5. For each v, there is a negative of v, denoted -v, such that (-v)+v=0
  6. kv is in V
  7. k(u+v) = ku+kv
  8. (k+l)u = ku+lu
  9. k(lu) = (kl)u
  10. 1u = u
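
For instance, R^3 with the usual componentwise operations satisfies all ten axioms. Below is a small Python spot-check (using numpy, which is not part of these notes) of a few of the axioms on random vectors; a numerical check only illustrates the definition, it does not prove it.

  # Spot-check a few vector space axioms for V = R^3 with the usual operations.
  import numpy as np

  rng = np.random.default_rng(0)
  u, v, w = rng.standard_normal((3, 3))   # three random vectors in R^3
  k, l = 2.0, -3.0                        # two scalars

  assert np.allclose(u + v, v + u)                 # axiom 2
  assert np.allclose(u + (v + w), (u + v) + w)     # axiom 3
  assert np.allclose(k * (u + v), k * u + k * v)   # axiom 7
  assert np.allclose((k + l) * u, k * u + l * u)   # axiom 8
  assert np.allclose(k * (l * u), (k * l) * u)     # axiom 9
  assert np.allclose(1 * u, u)                     # axiom 10
  print("spot-check passed")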

Subspace

If a collection of one or more vectors in a vector space is itself a vector space (use the same meaning of "addition" and "scaling"), then it is a subspace.

There is a simple test for whether a set is a subspace or not:

If V is a vector space, and W is a collection of one or more vectors from V, then W is a subspace of V if
  • W is closed under addition, i.e., if u and v are vectors in W, then u+v is also in W.
  • W is closed under scaling, i.e., if u is a vector in W and k is a scalar, then ku is also in W.
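
As a sketch of the test, take the (made-up) example W = {(x,y,z) in R^3 : x+y+z = 0}. The Python snippet below (numpy assumed) checks the two closure conditions on concrete vectors; W passes both, so it is a subspace of R^3.

  # Check the two closure conditions for W = {(x, y, z) in R^3 : x + y + z = 0}.
  import numpy as np

  def in_W(v, tol=1e-9):
      return abs(v.sum()) < tol       # membership test for W

  u = np.array([1.0, 2.0, -3.0])      # in W by construction
  v = np.array([-4.0, 1.5, 2.5])      # in W by construction
  k = -7.0

  assert in_W(u) and in_W(v)
  assert in_W(u + v)                  # closed under addition
  assert in_W(k * u)                  # closed under scaling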

Linear Combination

Let V be a vector space, and let v1,v2,...,vr be vectors in V. Then a linear combination of v1,v2,...,vr is any vector w of the form w = c1v1 + c2v2 + ... + crvr, where c1,...,cr are scalars.
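
Deciding whether a given w actually is a linear combination of v1,...,vr amounts to solving a linear system for the coefficients. A sketch with sympy and made-up vectors:

  # Is w a linear combination of v1 and v2?  Solve for c1, c2.
  from sympy import Matrix, linsolve, symbols

  v1 = Matrix([1, 0, 2])
  v2 = Matrix([0, 1, -1])
  w  = Matrix([3, -2, 8])

  c1, c2 = symbols("c1 c2")
  # coefficient matrix has v1 and v2 as columns; right-hand side is w
  sol = linsolve((Matrix.hstack(v1, v2), w), c1, c2)
  print(sol)   # {(3, -2)}, i.e. w = 3*v1 - 2*v2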

Span

Given the set S={v1,v2,...,vr} of vectors in V, the subspace W of all linear combinations of S is called the subspace spanned by S, or the span of S, and we say:

  • S spans W, or
  • S is a spanning set for W, or
  • W is the span of S.

Theorem

If S={v1,v2,...,vr} and S'={w1,w2,...,wk} are two sets of vectors in a vector space V, then span(S)=span(S') if and only if every vector in S' is a linear combination of the vectors in S and every vector in S is a linear combination of the vectors in S'.

Linearly (In)Dependent

Definition

We say S={v1,v2,...,vn} is linearly dependent if and only if c1v1+c2v2+...+cnvn=0 for some scalars c1,...,cn that are not all zero (i.e., at least one is non-zero).

S is linearly independent in the opposite case; that is, whenever c1v1+c2v2+...+cnvn=0, it must be that c1=c2=...=cn=0.

Proving

Proving linear independence or dependence usually involves deciding whether a certain homogeneous linear system has a non-zero solution.
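
A sketch of that decision in sympy (vectors made up for illustration): put the vectors in the columns of A and look for non-zero solutions of Ac = 0.

  # Linear (in)dependence test: does A c = 0 have a non-zero solution?
  from sympy import Matrix

  v1 = Matrix([1, 2, 3])
  v2 = Matrix([0, 1, 4])
  v3 = Matrix([1, 3, 7])          # v3 = v1 + v2, so we expect dependence

  A = Matrix.hstack(v1, v2, v3)
  null = A.nullspace()            # basis of the solution set of A c = 0
  if null:
      print("linearly dependent, e.g. c =", null[0].T)
  else:
      print("linearly independent")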

Theorem

Let V be a vector space and let S={v1,v2,...,vr} be a set of vectors in V.

  • If any of the vectors vj is 0, then S is linearly dependent.
  • If r=1, i.e., if S only contains one vector, then S is linearly independent if and only if v1 is not the zero vector.
  • If S contains exactly two vectors, then S is linearly independent if and only if neither vector is a multiple of the other.

Let A be an m x n matrix, and let B be its RREF.

  • The columns of A are linearly independent if every column of B has a pivot.
  • The columns of A are linearly dependent if some column of B does not have a pivot.
  • So, the columns of A are linearly independent if and only if its row echelon form has n pivots, in other words, if and only if the rank of A equals the number of columns.
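
A sketch of reading off the pivots with sympy (matrix made up):

  # Pivot columns of the RREF decide independence of the columns of A.
  from sympy import Matrix

  A = Matrix([[1, 2, 1],
              [2, 4, 0],
              [3, 6, 0]])
  B, pivot_cols = A.rref()        # B is the RREF, pivot_cols the pivot column indices
  print(B)
  print(pivot_cols)               # (0, 2): the middle column has no pivot,
                                  # so the columns of A are linearly dependent
  print("rank =", len(pivot_cols), "out of n =", A.cols, "columns")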

Basis

Definition

In a vector space V, a set S={v1,v2,...,vn} is a basis of V if:

  • S spans V
  • S is linearly independent

For an m x n matrix A, the set of solutions to Ax=0 is a subspace of R^n, called the nullspace of A.
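
A sketch of computing a basis of the nullspace with sympy (matrix made up); note that each basis vector lives in R^n, here R^4.

  # Basis of Null(A) = {x : A x = 0} for a 2 x 4 matrix A.
  from sympy import Matrix

  A = Matrix([[1, 2, 0, -1],
              [0, 0, 1,  3]])
  for v in A.nullspace():
      print(v.T)                  # each basis vector is in R^4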

Theorems

  • The columns of an n x n matrix A form a basis of R^n if and only if A is invertible.

Change of Basis

An example: suppose B={1-2x+x^2, x+x^2, 1-x^2}, B'={1+x, x, 1+x+x^2}, and [p]B=(3,2,-1). To find the B'-coordinates, we first compute p=3b1+2b2-1b3=2-4x+6x^2. We then write p=c1b1'+c2b2'+c3b3', that is, 2-4x+6x^2=c1(1+x)+c2(x)+c3(1+x+x^2), and solve for c1, c2, c3.

Another example: V=R^3, B={(−3,0,−3),(−3,2,−1),(1,6,−1)} and B'={(−6,−6,0),(−2,−6,4),(−2,−3,7)}. Find P_{B->B'}. The method is to write the two bases side by side as (B'|B), row reduce the left block B' to the identity I, and the right-hand block is then the answer (see the sketch below).
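
A sketch of the second example in sympy (sympy is not part of these notes). The columns of the matrices below are the given basis vectors; the final check uses the fact that P_{B->B'} = (B')^{-1} B.

  # Compute P_{B->B'} by row reducing (B' | B) to (I | P).
  from sympy import Matrix

  B  = Matrix([[-3, -3,  1],
               [ 0,  2,  6],
               [-3, -1, -1]])      # columns are the vectors of B
  Bp = Matrix([[-6, -2, -2],
               [-6, -6, -3],
               [ 0,  4,  7]])      # columns are the vectors of B'

  aug = Matrix.hstack(Bp, B).rref()[0]
  P = aug[:, 3:]                   # right-hand block after reduction
  print(P)                         # P_{B->B'}
  assert P == Bp.inv() * B         # sanity check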

Coordinate Vector

Suppose B={v1,v2,...,vn} is a basis of V and w=c1v1+c2v2+...+cnvn. Then the n-vector [w]B = (c1,c2,...,cn) is called the coordinate vector of w with respect to B.

Rank (pivot version)

The rank of a matrix is the number of pivots in its RREF.

Let A be an m x n matrix, and let A' be its reduced row-echelon form. Let r be the rank of A, the number of pivots in A'.

  • If r=n, then the columns of A are linearly independent.
  • If r=m, then the columns of A span R^m.
  • If r=m=n, then the columns of A are a basis of R^m.

Dimension

If $V$ is a vector space, $B=\{v_1,v_2,\ldots,v_n\}$ is a basis of $V$, and $B'=\{w_1,w_2,\ldots,w_m\}$ is another basis of $V$, then $m=n$.

If $\{v_1,v_2,\ldots,v_n\}$ is a basis of $V$, then the number $n$ is the dimension of $V$, and $V$ is called an n-dimensional vector space.

Transition Matrix

P is the transition matrix from B to B': P=[[b1]B',[b2]B',...,[bn]B'], so that [v]B' = P[v]B for every v in V.

Row / Column / Null Spaces

Let A be an m x n matrix.

  • The row space of A is the subspace of R^n spanned by the rows of A, written Row(A).
  • The column space of A is the subspace of R^m spanned by the columns of A, written Col(A).
  • The null space of A is the subspace of R^n consisting of all solutions to the homogeneous system Ax=0, written Null(A).

Theorems

  • If A' is in row-echelon form, then the non-zero rows of A' are a basis of the row space of A'.
  • If A' is in RREF, then the pivot columns of A' form a basis of Col(A').
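
A sketch with sympy (matrix made up). It also uses two standard facts that go slightly beyond the statements above: row operations do not change the row space, so the non-zero rows of the RREF are a basis of Row(A) as well, and the columns of A in the pivot positions form a basis of Col(A).

  # Bases of Row(A) and Col(A) from the RREF of A.
  from sympy import Matrix

  A = Matrix([[1, 2, 1, 1],
              [2, 4, 0, 6],
              [1, 2, 0, 3]])
  R, pivots = A.rref()

  row_basis = [R.row(i) for i in range(R.rows) if any(R.row(i))]   # non-zero rows of RREF
  col_basis = [A.col(j) for j in pivots]                           # pivot columns of A
  print(row_basis)
  print(col_basis)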

Rank (dimension version), Nullity

  • The rank of a matrix A is the common dimension of Row(A) and Col(A). It is denoted rank(A).
  • The nullity of a matrix A is the dimension of Null(A). It is denoted nullity(A).

  • (Rank-Nullity Theorem) For any m x n matrix A, rank(A)+nullity(A)=n.
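
A quick sympy check of the rank-nullity relation on a made-up 3 x 5 matrix:

  # rank(A) + nullity(A) = n
  from sympy import Matrix

  A = Matrix([[1, 2, 0, 1, 3],
              [0, 0, 1, 4, 1],
              [1, 2, 1, 5, 4]])    # third row = first row + second row
  rank = A.rank()
  nullity = len(A.nullspace())
  print(rank, nullity, rank + nullity == A.cols)   # 2 3 True, and n = 5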

Linear Transformation

Given vector spaces $U$ and $V$, a linear transformation from $U$ to $V$ is a function $T$ with domain $U$ and range in $V$ such that for all $x$ and $y$ in $U$ and any scalar $k$,

  • $T(x+y)=T(x)+T(y)$
  • $T(kx)=kT(x)$

For a given vector $x$ in $U$, the vector $T(x)$ is called the image of $x$ under $T$, or the transform of $x$.

If $V=U$, then $T$ is called a linear operator on $V$.

Properties

Suppose U and V are vector spaces, and T is a linear transformation from $U$ to $V$.

  • T(0)=0
  • T(-x)=-T(x)
  • T(x-y)=T(x)-T(y)
  • T(c_1x_1+c_2x_2+\cdots+c_rx_r)=c_1T(x_1)+c_2T(x_2)+\cdots+c_rT(x_r)

Matrix Transformation

For an m x n matrix $A$, the linear transformation $T_A$ defined by $T_A(x)=Ax$ is called a matrix transformation, and $A$ is called the standard matrix of $T=T_A$. We will often just call it "the matrix of T".

Every linear transformation from $R^n$ to $R^m$ is a matrix transformation.

Finding the Standard Matrix

An example: given $T(a,b,c,d)=(2a-3b+d,a-b-c-d,-6a+4b+8c-3d)$, we compute

  • T(e_1)=T(1,0,0,0)=(2,1,−6)
  • T(e_2)=T(0,1,0,0)=(−3,−1,4)
  • T(e_3)=T(0,0,1,0)=(0,−1,8)
  • T(e_4)=T(0,0,0,1)=(1,−1,−3)

Then we obtain A=((2,-3,0,1),(1,-1,-1,-1),(-6,4,8,-3)), written row by row; its columns are the images T(e1), T(e2), T(e3), T(e4). A sketch of this computation follows below.
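
A sketch of the same computation in numpy: apply T to the standard basis vectors and use the images as the columns of A.

  # Build the standard matrix of T column by column.
  import numpy as np

  def T(x):
      a, b, c, d = x
      return np.array([2*a - 3*b + d,
                       a - b - c - d,
                       -6*a + 4*b + 8*c - 3*d])

  E = np.eye(4)
  A = np.column_stack([T(E[:, j]) for j in range(4)])
  print(A)                                   # agrees with the matrix above

  x = np.array([1.0, -2.0, 0.5, 3.0])
  assert np.allclose(T(x), A @ x)            # T(x) = A x for every x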

Geometric Operators

Scalar Operators

A scalar operator is the linear transformation Tc(x)=cx.

  • Dilation: c>1
  • Contraction: 0<c<1

Compression / Expansion

These are operators on R^n that only scale one axis.

Given a scalar c, define Tc,j by Tc,j(x)=(x1,x2,...,cx_j,...,x_n).

  • compression in the x_j direction: 0<c<1
  • expansion: c>1

Only one direction is expanded or compressed.

Rotation

Rotation by an angle \theta.

  • T(e1)=(cos\theta,sin\theta)
  • T(e2)=(-sin\theta,cos\theta)
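
So the standard matrix of the rotation has columns T(e1) and T(e2); a minimal numpy sketch:

  # Rotation operator on R^2.
  import numpy as np

  def rotation_matrix(theta):
      return np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])

  R = rotation_matrix(np.pi / 2)          # rotate by 90 degrees
  print(R @ np.array([1.0, 0.0]))         # approximately (0, 1)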

Orthogonal Projection

Decompose a vector into a sum of two vectors and keep one of them: the operation proj_a(u), where u is the original vector and a is the reference vector. proj_a(u) = ((a.u)/(a.a))a.

This operator respects linear combinations, i.e., it is a linear transformation.
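
A small numpy sketch of the projection formula with made-up vectors:

  # proj_a(u) = ((a . u) / (a . a)) a
  import numpy as np

  def proj(a, u):
      return (np.dot(a, u) / np.dot(a, a)) * a

  a = np.array([1.0, 2.0, 2.0])
  u = np.array([3.0, 0.0, 4.0])
  p = proj(a, u)
  print(p)                         # the component of u along a
  print(np.dot(u - p, a))          # approximately 0: the remainder is orthogonal to a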

/!\ The operators above are all "simple" operators; that is, they do not take lengths into account.

Reflection

Reflection about a line through the origin (or, in R^3, a plane through the origin).

Composition

(T2 ◦ T1)(x) = T2(T1(x)).
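
For matrix transformations, composition corresponds to multiplying the standard matrices: the matrix of T2 ◦ T1 is A2 A1 (a standard fact, not stated above). A minimal numpy check with made-up matrices:

  # (T2 ◦ T1)(x) = A2 (A1 x) = (A2 A1) x
  import numpy as np

  A1 = np.array([[1.0, 2.0],
                 [0.0, 1.0]])      # matrix of T1 (a shear)
  A2 = np.array([[0.0, -1.0],
                 [1.0,  0.0]])     # matrix of T2 (rotation by 90 degrees)

  x = np.array([3.0, -1.0])
  assert np.allclose(A2 @ (A1 @ x), (A2 @ A1) @ x)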

One-to-one / Onto / Inverse

Let T be a linear transformation from U to V.

  • T is one-to-one if T transforms distinct vectors into distinct vectors. i.e., if x != y, then T(x) != T(y).
  • T is onto V if every vector in V is the transform of some vector in U. i.e., if y \in V, then there is some x \in U such that T(x)=y.
  • Suppose T' is a linear transformation from V to U. Then T' is the inverse of T if
    • for every x in U, T'(T(x))=x
    • for every y in V, T(T'(y))=y
In this case, T is said to be invertible, or an isomorphism, and we use the notation T^{-1}=T'.
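
If T = T_A is an invertible matrix transformation, then its inverse is the matrix transformation given by A^{-1} (a standard fact). A small numpy sketch with a made-up invertible matrix:

  # T'(T(x)) = x and T(T'(y)) = y for T = T_A, T' = T_{A^{-1}}.
  import numpy as np

  A = np.array([[2.0, 1.0],
                [1.0, 1.0]])
  A_inv = np.linalg.inv(A)

  x = np.array([5.0, -3.0])
  y = A @ x
  assert np.allclose(A_inv @ y, x)        # T'(T(x)) = x
  assert np.allclose(A @ (A_inv @ y), y)  # T(T'(y)) = y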