Chapter 1 Vector Spaces
1.1 Introduction
In this chapter we introduce another algebraic system known as a vector space.
If \(u\) is a vector then \({u}+{u}=2{u}\) is evidently a vector in the same direction as \({u},\) but of twice its magnitude. This introduces a new concept of multiplication of a vector by a scalar, the resulting product being a vector. Thus given any real number \(\alpha \) and a vector \(u\) passing through \(O\), \(\alpha {u}\) is the vector whose direction is either the same as that of \(u\) or opposite to that of \(u\) according as \(\alpha >0\) or \(\alpha <0\), and whose magnitude is \(|\alpha |\) times the magnitude of \(u\). This association gives rise to a map from \(\mathbb {R}\times V\) to \(V\), where \(V\) denotes the set of all such vectors, given by \((\alpha ,\ {u})\rightarrow \alpha {u}\). It can easily be verified that \((\alpha +\beta ){u}=\alpha {u}+\beta {u}\) and \(\alpha ({u}+{v})=\alpha {u}+\alpha {v}\) where \({u}, {v}\in V\) and \(\alpha , \beta \in \mathbb {R}\). These ideas motivate the following abstract definition of a vector space \(V\) over a field \(F.\)
1.2 Definition
Definition 1.2.1. A non-empty set \(V\) is said to be a vector space over a field \(F\) if
(i) \(V\) is an abelian group under an operation called addition, which we denote by \(+\).
(ii) For every \(\alpha \in F\) and \(v\in V\), there is defined an element \(\alpha v\) in \(V\) subject to the following conditions.
(a) \(\alpha (u+v)=\alpha u+\alpha v\) for all \(u, v\in V\) and \(\alpha \in F.\)
(b) \((\alpha +\beta )u=\alpha u+\beta u\) for all \(u\in V\) and \(\alpha , \beta \in F.\)
(c) \(\alpha (\beta u)=(\alpha \beta )u\) for all \(u\in V\) and \(\alpha , \beta \in F.\)
(d) \(1u=u\) for all \(u\in V.\)
1. The elements of \(F\) are called scalars and the elements of \(V\) are called vectors.
2. The rule which associates with each scalar \(\alpha \in F\) and each vector \(v\in V\) a vector \(\alpha v\) is called scalar multiplication. Thus scalar multiplication gives rise to a function from \(F\times V\) to \(V\) defined by \((\alpha ,\ v)\rightarrow \alpha v.\)
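The axioms above lend themselves to a quick numerical spot-check. The following Python sketch is purely illustrative and not part of the text: the names check_axioms, add and scale are assumptions made here, vectors of \(\mathbb {R}^{2}\) are modelled as tuples of floats, and verifying the identities on finitely many samples is evidence only, not a proof.
\begin{verbatim}
# A minimal sketch: spot-check vector-space axioms (a)-(d) on sample data.
# 'add' and 'scale' are the candidate operations on tuples of floats.

def add(u, v):
    return tuple(x + y for x, y in zip(u, v))

def scale(a, u):
    return tuple(a * x for x in u)

def close(u, v, eps=1e-9):
    return all(abs(x - y) < eps for x, y in zip(u, v))

def check_axioms(vectors, scalars):
    ok = True
    for u in vectors:
        for v in vectors:
            for a in scalars:
                for b in scalars:
                    ok &= close(scale(a, add(u, v)), add(scale(a, u), scale(a, v)))  # (a)
                    ok &= close(scale(a + b, u), add(scale(a, u), scale(b, u)))      # (b)
                    ok &= close(scale(a, scale(b, u)), scale(a * b, u))              # (c)
        ok &= close(scale(1, u), u)                                                  # (d)
    return ok

print(check_axioms([(1.0, 2.0), (-3.0, 0.5)], [0.0, 1.0, -2.0, 0.75]))  # True
\end{verbatim}
Examples 14-16 below describe operations for which such a spot-check fails.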
1.3 Examples
1. \(\mathbb {R}\times \mathbb {R}\) is a vector space over the field \(\mathbb {R}\) under the addition and scalar multiplication defined by \((x_{1},\ x_{2})+(y_{1},\ y_{2})=(x_{1}+y_{1},\ x_{2}+y_{2})\) and \(\alpha (x_{1},\ x_{2})=(\alpha x_{1},\ \alpha x_{2})\).
Proof : Clearly the binary operation \(+\) is commutative and associative and \((0,0)\) is the zero element. The inverse of \((x_{1},\ x_{2})\) is \((-x_{1},\ -x_{2})\). Hence \((\mathbb {R}\times \mathbb {R},\ +)\) is an abelian group. Now, let \(u=(x_{1},\ x_{2})\) and \(v=(y_{1},\ y_{2})\) and let \(\alpha , \beta \in \mathbb {R}\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (u+v)& =\alpha [(x_{1},\ x_{2})+(y_{1},\ y_{2})]\\ & =\alpha (x_{1}+y_{1},\ x_{2}+y_{2})\\ & =(\alpha x_{1}+\alpha y_{1},\ \alpha x_{2}+\alpha y_{2}) \\ & =(\alpha x_{1},\ \alpha x_{2})+(\alpha y_{1},\ \alpha y_{2})\\ & =\alpha (x_{1},\ x_{2})+\alpha (y_{1},\ y_{2})\\ & =\alpha u+\alpha v. \end{align*} Now,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (\alpha +\beta )u& =(\alpha +\beta )(x_{1},\ x_{2})\\ & =((\alpha +\beta )x_{1},\ (\alpha +\beta )x_{2})\\ &=(\alpha x_{1}+\beta x_{1},\ \alpha x_{2}+\beta x_{2}) \\ &=(\alpha x_{1},\ \alpha x_{2})+(\beta x_{1},\ \beta x_{2})\\ &=\alpha (x_{1},\ x_{2})+\beta (x_{1},\ x_{2})\\ &=\alpha u+\beta u. \end{align*} Also
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (\beta u)&=\alpha (\beta (x_{1},\ x_{2}))\\ &=\alpha (\beta x_{1},\ \beta x_{2})\\ &=(\alpha \beta x_{1},\ \alpha \beta x_{2})\\ &=(\alpha \beta )(x_{1},\ x_{2})\\ &=(\alpha \beta )u \end{align*} Obviously \(1u=u.\) \(\therefore \quad \mathbb {R}\times \mathbb {R}\) is a vector space over \(\mathbb {R}.\) □
2. \(\mathbb {R}^{n}=\{(x_{1},\ x_{2},\ \ldots ,\ x_{n})\ :\ x_{i}\in \mathbb {R},\ 1\leq i\leq n\}\). Then \(\mathbb {R}^{n}\) is a vector space over \(\mathbb {R}\) under addition and scalar multiplication defined by
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (x_{1},\ x_{2},\ \ldots ,\ x_{n})& +(y_{1},\ y_{2},\ \ldots ,\ y_{n})\\ &= (x_{1}+y_{1},\ x_{2}+y_{2},\ \ldots ,\ x_{n}+y_{n}) \end{align*} and
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (x_{1},\ x_{2},\ \ldots ,\ x_{n})&=(\alpha x_{1},\ \alpha x_{2},\ \ldots ,\ \alpha x_{n}) \end{align*}
Proof : Clearly the binary operation \(+\) is commutative and associative.
\((0,0,\ \ldots ,\ 0)\) is the zero element.
The inverse of \((x_{1},\ x_{2},\ \ldots ,\ x_{n})\) is \((-x_{1},\ -x_{2},\ \ldots ,\ -x_{n})\).
Hence \((\mathbb {R}^{n},\ +)\) is an abelian group.
Now, let \(u=(x_{1},\ x_{2},\ \ldots ,\ x_{n})\) and \(v=(y_{1},\ y_{2},\ \ldots ,\ y_{n})\) and let \(\alpha , \beta \in \mathbb {R}\). Then
\begin{align*} \alpha (u+v)&=\alpha [(x_{1},\ x_{2},\ \cdots ,\ x_{n})+(y_{1},\ y_{2},\ \cdots ,\ y_{n})]\\ & =\alpha (x_{1}+y_{1},\ x_{2}+y_{2},\ \ldots ,\ x_{n}+y_{n})\\ &=(\alpha x_{1}+\alpha y_{1},\ \alpha x_{2}+\alpha y_{2},\ \ldots ,\ \alpha x_{n}+\alpha y_{n})\\ &=(\alpha x_{1},\ \alpha x_{2},\ \ldots ,\ \alpha x_{n})+(\alpha y_{1},\ \alpha y_{2},\ \ldots ,\ \alpha y_{n})\\ & =\alpha (x_{1},\ x_{2},\ \ldots ,\ x_{n})+\alpha (y_{1},\ y_{2},\ \ldots ,\ y_{n})\\ &=\alpha u+\alpha v. \end{align*} Similarly
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (\alpha +\beta )u&=\alpha u+\beta u \end{align*} and
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (\beta u)&=(\alpha \beta )u.\\ 1\cdot u&=u \end{align*} Hence \(\mathbb {R}^{n}\) is a vector space over \(\mathbb {R}.\) □
3. Let \(F\) be any field. Let \(F^{n}=\{(x_{1},\ x_{2},\ \ldots ,\ x_{n})\ :\ x_{i}\in F\}\). In \(F^{n}\) we define addition and scalar multiplication as in the above example. Then \(F^{n}\) is a vector space over \(F\) and we denote this vector space by \(V_{n}(F)\).
4. \(\mathbb {C}\) is a vector space over the field \(\mathbb {R}\). Here addition is the usual addition in \(\mathbb {C}\) and the scalar multiplication is the usual multiplication of a real number and a complex number. (ie.,) \((x_{1}+ix_{2})+(y_{1}+iy_{2})=(x_{1}+y_{1})+i(x_{2}+y_{2})\) and \(\alpha (x_{1}+ix_{2})=\alpha x_{1}+i\alpha x_{2}.\)
Proof : Clearly \((\mathbb {C},\ +)\) is an abelian group. Also the remaining axioms of a vector space are true since the scalars and vectors involved are complex numbers and further the operations are usual addition and multiplication. Hence \(\mathbb {C}\) is a vector space over \(\mathbb {R}.\) □
5. Let \(V=\{a+b\sqrt {2}\ :\ a,\ b\in \mathbb {Q}\}\). Then \(V\) is a vector space over \(\mathbb {Q}\) under the usual addition and scalar multiplication.
Proof : Obviously \(V\) is an abelian group under usual addition. The remaining axioms of a vector space are true since the scalars and vectors are real numbers and the operations are usual addition and multiplication. Hence \(V\) is a vector space over \(\mathbb {Q}. \) □
6. Let \(F\) be a field. Then \(F[x]\), the set of all polynomials over \(F\), is a vector space over \(F\) under the addition of polynomials and scalar multiplication defined by \(\alpha (a_{0}+a_{1}x+\cdots +a_{n}x^{n})=\alpha a_{0}+\alpha a_{1}x+\cdots +\alpha a_{n}x^{n}\)
7. The set \(V\) of all polynomials of degree \(\leq n\) including the zero polynomial in \(F[x]\) is a vector space over the field \(F\) under the addition and scalar multiplication defined as in previous example.
Proof : Let \(f, g\in V\). Then \(f\) and \(g\) are polynomials of degree \(\leq n\) (or zero), so \(f+g\) and \(\alpha f\) are also of degree \(\leq n\) (or zero) and hence \(f+g, \alpha f\in V\). The other axioms of a vector space can easily be verified. Hence \(V\) is a vector space over \(F.\) □
8. The set \(M_{2}(\mathbb {R})\) of all \(2 \times 2\) matrices is a vector space over \(\mathbb {R}\) under matrix addition and scalar multiplication defined by
\[ \alpha \left [\begin {array}{ll} a & b\\ c & d \end {array}\right ]=\left [\begin {array}{ll} \alpha a & \alpha b\\ \alpha c & \alpha d \end {array}\right ] \]
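As an illustration only (not from the text), \(2\times 2\) matrices can be modelled in Python as lists of rows, with entrywise addition and the scalar multiplication displayed above; the names mat_add and mat_scale are assumptions of this sketch.
\begin{verbatim}
# Illustrative sketch: 2x2 real matrices as lists of rows.

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def mat_scale(alpha, A):
    return [[alpha * A[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, -1], [5, 2]]
print(mat_add(A, B))    # [[1, 1], [8, 6]]
print(mat_scale(3, A))  # [[3, 6], [9, 12]]
\end{verbatim}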
9. Let \(V\) be the set of all functions from \(\mathbb {R}\) to \(\mathbb {R}\). Let \(f, g\in V\). We define
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (f+g)(x)&= f(x)+g(x) \end{align*} and
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (\alpha f)(x)&=\alpha [f(x)] \end{align*} Then \(V\) is a vector space over \(\mathbb {R}\) under these operations.
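A hedged Python sketch of Example 9: functions are represented by callables and the pointwise operations by closures (the names f_add and f_scale are illustrative, not from the text).
\begin{verbatim}
# Sketch: functions R -> R with pointwise addition and scalar multiplication.

def f_add(f, g):
    return lambda x: f(x) + g(x)

def f_scale(alpha, f):
    return lambda x: alpha * f(x)

f = lambda x: x * x
g = lambda x: 3 * x + 1

h = f_add(f_scale(2.0, f), g)  # h(x) = 2x^2 + 3x + 1
print(h(2.0))                  # 15.0
\end{verbatim}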
10. Let \(V\) denote the set of all solutions of the differential equation \(2\displaystyle \frac {d^{2}y}{dx^{2}}- 7\displaystyle \frac {dy}{dx}+3y=0\). Then \(V\) is a vector space over \(\mathbb {R}.\)
Proof : Let \(f, g\in V\) and \(\alpha \in \mathbb {R}\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} 2\displaystyle \frac {d^{2}f}{dx^{2}}-7\frac {df}{dx}+3f&=0 \text { and }2\displaystyle \frac {d^{2}g}{dx^{2}}-7\frac {dg}{dx}+3g=0 \end{align*} Adding these equations, \begin{align*} 2\left (\frac {d^{2}f}{dx^{2}}+\frac {d^{2}g}{dx^{2}}\right )-7\left (\frac {df}{dx}+\frac {dg}{dx}\right )+3(f+g)&=0 \\ 2\frac {d^{2}}{dx^{2}}(f+g)-7\frac {d}{dx}(f+g)+3(f+g)&=0 \end{align*} Hence \(f+g\in V\). Also
\[2\displaystyle \frac {d^{2}}{dx^{2}}(\alpha f)-7\frac {d}{dx}(\alpha f)+3(\alpha f)=0\]
Hence \(\alpha f\in V\). Since the operations are usual addition and scalar multiplication, the axioms of vector space are true. Hence \(V\) is a vector space over \(\mathbb {R}.\) □
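The closure argument above can also be checked numerically. The sketch below is illustrative and assumes the two particular solutions \(e^{3x}\) and \(e^{x/2}\) (the roots of \(2r^{2}-7r+3=0\) are \(3\) and \(\tfrac {1}{2}\)); it estimates the derivatives of a linear combination by central differences and confirms that the residual of the differential equation is negligibly small.
\begin{verbatim}
# Sketch: exp(3x) and exp(x/2) solve 2y'' - 7y' + 3y = 0.  Check numerically
# (by central differences) that a linear combination is again a solution.
import math

def residual(y, x, h=1e-4):
    d1 = (y(x + h) - y(x - h)) / (2 * h)
    d2 = (y(x + h) - 2 * y(x) + y(x - h)) / (h * h)
    return 2 * d2 - 7 * d1 + 3 * y(x)

f = lambda x: math.exp(3 * x)
g = lambda x: math.exp(0.5 * x)
h_comb = lambda x: 4.0 * f(x) - 2.5 * g(x)   # alpha*f + beta*g

print(abs(residual(h_comb, 1.0)) < 1e-3)     # True (up to discretisation error)
\end{verbatim}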
11. Any sequence of real numbers \(a_{1}, a_{2}, \ldots , a_{n}, \ldots \) is usually denoted by the symbol \((a_{n})\). Let \(V\) denote the set of all sequences of real numbers. Then \(V\) is a vector space over the field of real numbers under the addition and scalar multiplication defined by
\[(a_{n})+(b_{n})=(a_{n}+b_{n}) \text { and }\alpha (a_{n})=(\alpha a_{n}).\]
12. Let \(V=\{0\}\). Then \(V\) is a vector space over any field \(F\) under the obvious operations of addition and scalar multiplication.
13. \(\mathbb {R}\) is not a vector space over \(\mathbb {C}\). Clearly \((\mathbb {R},\ +)\) is an abelian group. But the usual multiplication does not define a scalar multiplication on \(\mathbb {R}\), for if \(\alpha =a+ib\in \mathbb {C}\) with \(b\neq 0\) and \(u\in \mathbb {R}\) with \(u\neq 0\), then \(\alpha u=au+ibu\not \in \mathbb {R}.\) Therefore \(\mathbb {R}\) is not a vector space over \(\mathbb {C}.\)
14. Consider \(\mathbb {R}\times \mathbb {R}\) with usual addition. We define scalar multiplication by \(\alpha (x,\ y)= (\alpha x,\ \alpha ^{2}y)\). Then \(\mathbb {R}\times \mathbb {R}\) is not a vector space over \(\mathbb {R}\). Clearly \(\mathbb {R}\times \mathbb {R}\) with usual addition is an abelian group.
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (\alpha +\beta )(x,\ y)&=((\alpha +\beta )x,\ (\alpha +\beta )^{2}y)\\ &=(\alpha x+\beta x,\ \alpha ^{2}y+\beta ^{2}y+2\alpha \beta y) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (x,\ y)+\beta (x,\ y)&=(\alpha x,\ \alpha ^{2}y)+(\beta x,\ \beta ^{2}y)\\ &=(\alpha x+\beta x,\ \alpha ^{2}y+\beta ^{2}y) \end{align*} Hence
\[(\alpha + \beta )(x,\ y)\neq \alpha (x,\ y)+\beta (x,\ y)\] whenever \(2\alpha \beta y\neq 0\), for instance when \(\alpha =\beta =1\) and \(y=1\). Hence \(\mathbb {R}\times \mathbb {R}\) is not a vector space over \(\mathbb {R}.\)
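A concrete numerical witness for this failure, as a hedged Python sketch (the function names are illustrative, not from the text):
\begin{verbatim}
# Sketch: with alpha(x, y) = (alpha*x, alpha^2*y), axiom (b) fails.

def scale(alpha, v):
    x, y = v
    return (alpha * x, alpha ** 2 * y)

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

u = (1.0, 1.0)
alpha, beta = 1.0, 1.0
lhs = scale(alpha + beta, u)                # (2.0, 4.0)
rhs = add(scale(alpha, u), scale(beta, u))  # (2.0, 2.0)
print(lhs == rhs)                           # False
\end{verbatim}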
15. Consider \(\mathbb {R}\times \mathbb {R}\) with usual addition. Define the scalar multiplication as \(\alpha (a,\ b)= (0,0)\). Clearly \(\mathbb {R}\times \mathbb {R}\) is an abelian group. Also,
(i) \(\alpha (u+v)=0\) and \(\alpha u+\alpha v=0+0=0\);
so that \(\alpha (u+v)=\alpha u+\alpha v.\)
(ii) Similarly \((\alpha +\beta )u=\alpha u+\beta u=0.\)
(iii) \(\alpha (\beta u)=(\alpha \beta )u=0.\)
However \(1(a,\ b)=(0,0)\neq (a,\ b)\) whenever \((a,\ b)\neq (0,0)\). Hence \(\mathbb {R}\times \mathbb {R}\) with these operations is not a vector space.
16. Let \(V\) be the set of all ordered pairs of real numbers. Addition and scalar multiplication are defined by
\[(x,\ y)+(x_{1},\ y_{1})=(x+x_{1},\ y+y_{1})\]
and
\[\alpha (x,\ y)=(x,\ \alpha y)\]
where \(x, y, x_{1}, y_{1}\) and \(\alpha \) are real numbers. Then \(V\) is not a vector space over \(\mathbb {R}\). Clearly \(V\) is an abelian group under the operation \(+\) defined above.
Let \(\alpha , \beta \in \mathbb {R}\) and \((x,\ y)\in V\). Now,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (\alpha +\beta )(x,\ y)&=(x,\ (\alpha +\beta )y)\\ & =(x,\ \alpha y+\beta y) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (x,\ y)+\beta (x,\ y)&=(x,\ \alpha y)+(x,\ \beta y)\\ & =(2x,\ \alpha y+\beta y) \end{align*} Hence, whenever \(x\neq 0\), \[(\alpha +\beta )(x,\ y) \neq \alpha (x,\ y)+\beta (x,\ y).\] Hence \(V\) is not a vector space over \(\mathbb {R}.\)
17. Let \(\mathbb {R}^{+}\) be the set of all positive real numbers. Define addition and scalar multiplication as follows
\[u+v=uv \text { for all }u, v\in \mathbb {R}^{+}\]
\[\alpha u=u^{\alpha } \text { for all }u\in \mathbb {R}^{+}\]
and \(\alpha \in \mathbb {R}.\) Then \(\mathbb {R}^{+}\) is a real vector space.
Proof : Clearly \((\mathbb {R}^{+},\ +)\) is an abelian group with identity \(1.\) Now,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha (u+v)& =\alpha (uv)=(uv)^{\alpha }=u^{\alpha }v^{\alpha }=\alpha u+\alpha v. \\ (\alpha +\beta )u& =u^{\alpha +\beta }=u^{\alpha }u^{\beta }=\alpha u+\beta u.\\ \alpha (\beta u)& =\alpha u^{\beta }=(u^{\beta })^{\alpha }=u^{\beta \alpha }=u^{\alpha \beta }=(\alpha \beta )u. \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} 1u& =u^{1}=u. \end{align*} Hence \(\mathbb {R}^{+}\) is a vector space over \(\mathbb {R}.\) □
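Since the operations of Example 17 are just multiplication and exponentiation of positive reals, the identities used in the proof are easy to spot-check; the sketch below is illustrative only.
\begin{verbatim}
# Sketch: R^+ with u "+" v = uv and alpha.u = u**alpha (Example 17).
import math

def add(u, v):
    return u * v

def scale(alpha, u):
    return u ** alpha

u, v = 2.0, 5.0
alpha, beta = 3.0, -0.5

print(math.isclose(scale(alpha, add(u, v)),
                   add(scale(alpha, u), scale(alpha, v))))   # True
print(math.isclose(scale(alpha + beta, u),
                   add(scale(alpha, u), scale(beta, u))))    # True
print(math.isclose(scale(alpha, scale(beta, u)),
                   scale(alpha * beta, u)))                  # True
print(scale(1.0, u) == u)                                    # True
\end{verbatim}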
Remark 1.3.4. Commutativity of addition in a vector space can be derived from the other axioms of the vector space (ie.,) the axiom of commutativity of addition in a vector space is redundant, for,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (1+1)(u+v)& =1(u+v)+1(u+v)\\ & =1u+1v+1u+1v\\ & =u+v+u+v \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (1+1)(u+v)&=(1+1)u+(1+1)v\\ &=u+u+v+v \end{align*} Hence \(u+v+u+v=u+u+v+v\), and cancelling \(u\) on the left and \(v\) on the right gives \(v+u=u+v.\)
The following properties hold in any vector space \(V\) over a field \(F\): for all \(\alpha \in F\) and \(v\in V\), (i) \(\alpha 0=0\); (ii) \(0v=0\); (iii) \((-\alpha )v=\alpha (-v)=-(\alpha v)\); (iv) \(\alpha v=0\Rightarrow \alpha =0\) or \(v=0.\)
Proof :
(i) \(\alpha 0=\alpha (0+0)=\alpha 0+\alpha 0\). Hence \(\alpha 0=0.\)
(ii) \({0}v=({0}+{0})v=0v+0v\). Hence \(0v=0.\)
(iii) \(0=[\alpha +(-\alpha )]v=\alpha v+(-\alpha )v\).
Hence \((-\alpha )v=-(\alpha v)\).
Similarly \(\alpha (-v)=-(\alpha v)\).
Hence \((-\alpha )v=\alpha (-v)=-(\alpha v)\).
(iv) Let \(\alpha v=0\). If \(\alpha =0\), there is nothing to prove.
Let \(\alpha \neq 0\). Then \(\alpha ^{-1}\in F\).
Now, \(v=1v=(\alpha ^{-1}\alpha )v=\alpha ^{-1}(\alpha v)=\alpha ^{-1}0=0.\)
□
1.4 Subspaces
Theorem. Let \(V\) be a vector space over a field \(F\). A non-empty subset \(W\) of \(V\) is a subspace of \(V\) (that is, \(W\) is itself a vector space over \(F\) under the operations of \(V\)) if and only if \(W\) is closed under vector addition and scalar multiplication.
Proof : Let \(W\) be a subspace of \(V\). Then \(W\) itself is a vector space and hence \(W\) is closed with respect to vector addition and scalar multiplication.
Conversely, let \(W\) be a non-empty subset of \(V\) such that \(u, v\in W\Rightarrow u+v\in W\) and \(u\in W\) and \(\alpha \in F\Rightarrow \alpha u\in W\).
We prove that \(W\) is a subspace of \(V\).
Since \(W\) is non-empty, there exists an element \(u\in W\). Then \(0u=0\in W\). Also \(v\in W\Rightarrow (-1)v= -v\in W\).
Thus \(W\) contains \(0\) and the additive inverse of each of its elements. Hence \(W\) is an additive subgroup of \(V\).
Also \(u\in W\) and \(\alpha \in F\Rightarrow \alpha u\in W\).
Since the elements of \(W\) are the elements of \(V\) the other axioms of a vector space are true in \(W\).
Hence \(W\) is a subspace of \(V.\) □
Theorem. A non-empty subset \(W\) of \(V\) is a subspace of \(V\) if and only if \(u, v\in W\) and \(\alpha , \beta \in F\Rightarrow \alpha u+\beta v\in W.\)
Proof : Let \(W\) be a subspace of \(V\). Let \(u, v\in W\) and \(\alpha , \beta \in F\). Then \(\alpha u\) and \(\beta v\in W\) and hence \(\alpha u+\beta v\in W.\)
Conversely, let \(u, v\in W\) and \(\alpha , \beta \in F\Rightarrow \alpha u+\beta v\in W\).
Taking \(\alpha =\beta =1\), we get \(u, v\in W\Rightarrow u+v\in W\).
Taking \(\beta =0\) and \(v=u\), we get \(\alpha \in F\) and \(u\in W\Rightarrow \alpha u\in W\). Hence \(W\) is a subspace of \(V.\) □
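The criterion just proved suggests a simple numerical spot-check when a candidate subset is given by a membership predicate. The Python sketch below is illustrative (the names in_W and looks_like_subspace are assumptions, and the plane \(x+2y-3z=0\) is just a sample choice); passing the check on a few samples is necessary evidence only, not a proof.
\begin{verbatim}
# Sketch: sample-check closure under alpha*u + beta*v for a set W in R^3
# given by a membership predicate (here the plane x + 2y - 3z = 0).

def in_W(v, eps=1e-9):
    x, y, z = v
    return abs(x + 2 * y - 3 * z) < eps

def combo(alpha, u, beta, v):
    return tuple(alpha * a + beta * b for a, b in zip(u, v))

def looks_like_subspace(predicate, vectors, scalars):
    # Evidence only: closure is tested on the given samples alone.
    return all(predicate(combo(a, u, b, v))
               for u in vectors for v in vectors
               for a in scalars for b in scalars)

sample = [(1.0, 1.0, 1.0), (3.0, 0.0, 1.0), (0.0, 3.0, 2.0)]
print(all(in_W(v) for v in sample))                         # True
print(looks_like_subspace(in_W, sample, [0.0, 1.0, -2.0]))  # True
\end{verbatim}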
Examples
1. \(\{0\}\) and \(V\) are subspaces of any vector space \(V\).
They are called the trivial subspaces of \(V.\)
2. \(W=\{(a,\ 0,0)\ :\ a\in \mathbb {R}\}\) is a subspace of \(\mathbb {R}^{3}\).
Let \(u=(a,\ 0,0), v=(b,\ 0,0)\in W\) and \(\alpha , \beta \in \mathbb {R}\).
Then \(\alpha u+\beta v=\alpha (a,\ 0,0)+\beta (b,\ 0,0)=(\alpha a+\beta b,\ 0,0)\in W\). Hence \(W\) is a subspace of \(\mathbb {R}^{3}\)
3. In \(\mathbb {R}^{3}, W=\{(ka,\ kb,\ kc)\ :\ k\in \mathbb {R}\}\) is a subspace of \(\mathbb {R}^{3}\).
For, if \(u=(k_{1}a,\ k_{1}b,\ k_{1}c)\) and \(v=(k_{2}a,\ k_{2}b,\ k_{2}c)\in W\) and \(\alpha , \beta \in \mathbb {R}\) then
\begin{align*} \alpha u+\beta v& =\alpha (k_{1}a,\ k_{1}b,\ k_{1}c)+\beta (k_{2}a,\ k_{2}b,\ k_{2}c) \\ & =((\alpha k_{1}+\beta k_{2})a,\ (\alpha k_{1}+\beta k_{2})b,\ (\alpha k_{1}+\beta k_{2})c)\in W \end{align*} Hence \(W\) is a subspace of \(\mathbb {R}^{3}\)
Note 1.4.5. Geometrically \(W\) consists of all points of the line \(\displaystyle \frac {x}{a}=\frac {y}{b}=\frac {z}{c}\) provided \(a, b, c\) are not all zero. Thus the set of all points on a line through the origin is a subspace of \(\mathbb {R}^{3}\). However a line not passing through the origin is not a subspace of \(\mathbb {R}^{3},\) since the additive identity \((0,0,0)\) does not lie on the line.
4. \(W=\{(a,\ b,\ 0)\ :\ a,\ b\in \mathbb {R}\}\) is a subspace of \(\mathbb {R}^{3}\). \(W\) consists of all points of the \(xy\)-plane.
5. Let \(W\) be the set of all points in \(\mathbb {R}^{3}\) satisfying the equation \(lx+my+nz=0\). \(W\) is a subspace of \(\mathbb {R}^{3}\).
Let \(u=(a_{1},\ b_{1},\ c_{1})\) and \(v=(a_{2},\ b_{2},\ c_{2})\in W\) and \(\alpha , \beta \in \mathbb {R}\). Then we have \(la_{1}+mb_{1}+nc_{1}=0=la_{2}+mb_{2}+nc_{2}\). Hence
\begin{align*} \alpha (la_{1}+mb_{1}+nc_{1})+\beta (la_{2}+mb_{2}+nc_{2})& =0. \\ (ie.,) l(\alpha a_{1}+\beta a_{2})+m(\alpha b_{1}+\beta b_{2})+n(\alpha c_{1}+\beta c_{2})& =0.\\ (ie.,) \alpha u+\beta v & \in W \end{align*} So that \(W\) is a subspace of \(\mathbb {R}^{3}\).
6. Let \(W=\{f\) : \(f\in F[x]\) and \(f(a)=0\}\).
(ie.,) \(W\) is the set of all polynomials in \(F[x]\) having \(a\) as a root where \(a\in F\).
Then \(W\) is a subspace of \(F[x]\).
Clearly \(x-a\in W\) and hence \(W\) is non-empty.
Let \(f, g\in W\) and \(\alpha , \beta \in F\).
To prove that \(\alpha f+\beta g\in W\).
We have to show that \(a\) is a root of \(\alpha f+\beta g\). Now,
\[(\alpha f+\beta g)(a)= \alpha f(a)+\beta g(a)=\alpha 0+\beta 0=0.\]
Hence \(a\) is a root of \(\alpha f+\beta g\), so \(\alpha f+\beta g\in W\) and \(W\) is a subspace of \(F[x].\)
7. \(W=\left \{\left (\begin {array}{ll} a & 0\\ 0 & b \end {array}\right ) : a, b\in \mathbb {R}\right \}\) is a subspace of \(M_{2}(\mathbb {R})\).
Solved problems
Problem : Prove that the intersection of two subspaces of a vector space \(V\) over a field \(F\) is a subspace of \(V\).
Solution : Let \(A\) and \(B\) be two subspaces of a vector space \(V\) over a field \(F\).
We claim that \(A\cap B\) is a subspace of \(V\).
Clearly \(0\in A\cap B\) and hence \(A\cap B\) is non-empty.
Now, let \(u, v\in A\cap B\) and \(\alpha , \beta \in F\). Then \(u, v\in A\) and \(u, v\in B\). Since \(A\) and \(B\) are subspaces, \(\alpha u+\beta v\in A\) and \(\alpha u+\beta v\in B\), so \(\alpha u+\beta v\in A\cap B\). Hence \(A\cap B\) is a subspace of \(V.\)
Problem : Show that the union of two subspaces of a vector space need not be a subspace.
Solution : Let \(A=\{(a,\ 0,0)\ :\ a\in \mathbb {R}\}, B=\{(0,\ b,\ 0)\ :\ b\in \mathbb {R}\}\).
Clearly \(A\) and \(B\) are subspaces of \(\mathbb {R}^{3}\) (cf. Example 2 of Section 1.4). However \(A\cup B\) is not a subspace of \(\mathbb {R}^{3}\).
For, \((1,\ 0,0)\) and \((0,1,0)\in A\cup B\). But
\[(1,\ 0,0)+(0,1,0)=(1,1,0)\not \in A\cup B.\]
Problem 1.4.9. If \(A\) and \(B\) are subspaces of \(V\) prove that \(A+B=\{v\in V:v= a+b, a\in A, b\in B\}\) is a subspace of \( V\). Further show that \(A+B\) is the smallest subspace containing \(A\) and \( B\). (ie.,) If \(W\) is any subspace of \(V\) containing \(A\) and \(B\) then \(W\) contains \(A+B.\)
Solution : Let \(v_{1}, v_{2}\in A+B\) and \(\alpha \in F\). Then
\[v_{1}=a_{1}+b_{1}, v_{2}=a_{2}+b_{2}\]
where \(a_{1}, a_{2}\in A\) and \(b_{1}, b_{2}\in B\). Now,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} v_{1}+v_{2}& =(a_{1}+b_{1})+(a_{2}+b_{2})\\ & =(a_{1}+a_{2})+(b_{1}+b_{2})\in A+B \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)
\begin{align*}
\alpha v_{1}=\alpha (a_{1}+b_{1})& =\alpha a_{1}+\alpha b_{1}\in A+B
\end{align*}
Hence \(A+B\) is a subspace of \(V\).
Clearly \(A\subseteq A+B\) and \(B\subseteq A+B\).
Now, let \(W\) be any subspace of \(V\) containing \(A\) and \(B\).
We shall prove that \(A+B\subseteq W\).
Let \(v\in A+B\). Then \(v=a+b\) where \(a\in A\) and \(b\in B\). Since \(A\subseteq W, a\in W\).
Similarly \(b\in W\) and \(a+b=v\in W\).
Therefore \(A+B\subseteq W\) so that \(A+B\) is the smallest subspace of \(V\) containing \(A\) and \(B.\)
Problem : If \(A\) and \(B\) are subspaces of \(V\), prove that every element of \(A+B\) can be expressed uniquely in the form \(a+b\), where \(a\in A\) and \(b\in B\), if and only if \(A\cap B=\{0\}\). (In this case \(A+B\) is called the direct sum of \(A\) and \(B\) and is denoted by \(A\oplus B\).)
Solution : Let \(A\cap B=\{0\}\) and let \(v\in A+B\).
Let \(v=a_{1}+b_{1}=a_{2}+b_{2}\) where \(a_{1}, a_{2}\in A\) and \(b_{1}, b_{2}\in B\).
Then \(a_{1}-a_{2}=b_{2}-b_{1}\).
But \(a_{1}-a_{2}\in A\) and \(b_{2}-b_{1}\in B\).
Hence \(a_{1}-a_{2}, b_{2}-b_{1}\in A\cap B\). Since \(A\cap B=\{0\}, a_{1}-a_{2}=0\) and \(b_{2}-b_{1}=0\) so that \(a_{1}=a_{2}\) and \(b_{1}=b_{2}\).
Hence the expression of \(v\) in the form \(a+b\) where \(a\in A\) and \(b\in B\) is unique.
Conversely suppose that any element in \(A+B\) can be uniquely expressed in the form \(a+b\) where \(a\in A\) and \(b\in B\).
We claim that \(A\cap B=\{0\}\).
If \(A\cap B\neq \{0\}\), let \(v\in A\cap B\) with \(v\neq 0\). Then \(0=v+(-v)\), where \(v\in A\) and \(-v\in B\), and also \(0=0+0\). Thus \(0\) has been expressed in the form \(a+b\) in two different ways, which is a contradiction. Hence \(A\cap B=\{0\}\).
1. In \(V_{3}(\mathbb {R})\) let \(A=\{(a,\ b,\ 0)\ :\ a,\ b\in \mathbb {R}\}\) and \(B=\{(0,0,\ c)\ :\ c\in \mathbb {R}\}\).
Clearly \(A\) and \(B\) are subspaces of \(V\) and \(A\cap B=\{0\}\).
Also let \(v=(a,\ b,\ c)\in V_{3}(\mathbb {R})\).
Then \(v=(a,\ b,\ 0)+(0,0,\ c)\) so that \(A+B=V_{3}(\mathbb {R})\).
Hence \(V_{3} (\mathbb {R}) =A\oplus B.\)
2. In \(M_{2}(\mathbb {R})\), let \(A\) be the set of all matrices of the form \(\left (\begin {smallmatrix} a & b\\ 0 & 0 \end {smallmatrix}\right )\) and \(B\) be the set of all matrices of the form \(\left (\begin {smallmatrix} 0 & 0\\ c
& d \end {smallmatrix}\right )\).
Clearly \(A\) and \(B\) are subspaces of \(M_{2} (\mathbb {R})\) and
\(A\cap B=\left \{ \left ( \begin {smallmatrix} 0 & 0\\ 0 & 0 \end {smallmatrix} \right ) \right \} \) and \(A+B=M_{2}(\mathbb {R})\).
Hence \(M_{2}(\mathbb {R})=A\oplus B.\)
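As an illustration of the direct sum in Example 2, the sketch below (the name split is an assumption, not from the text) decomposes a \(2\times 2\) matrix into its top-row part in \(A\) and bottom-row part in \(B\); the decomposition is clearly unique.
\begin{verbatim}
# Sketch: unique decomposition of a 2x2 matrix into a top-row part (in A)
# and a bottom-row part (in B), so that M_2(R) = A (+) B.

def split(M):
    top = [[M[0][0], M[0][1]], [0, 0]]       # component in A
    bottom = [[0, 0], [M[1][0], M[1][1]]]    # component in B
    return top, bottom

M = [[1, 2], [3, 4]]
top, bottom = split(M)
print(top, bottom)
print([[top[i][j] + bottom[i][j] for j in range(2)]
       for i in range(2)] == M)              # True: the parts add back to M
\end{verbatim}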
Theorem. Let \(W\) be a subspace of a vector space \(V\) over a field \(F\), and let \(V/W=\{W+v\ :\ v\in V\}\) be the set of all cosets of \(W\) in \(V\). Then \(V/W\) is a vector space over \(F\) under the operations (i) \((W+v_{1})+(W+v_{2})=W+(v_{1}+v_{2})\) and (ii) \(\alpha (W+v)=W+\alpha v\).
Proof : Since \(W\) is a subspace of \(V\) it is a subgroup of \((V,\ +)\). Since \((V,\ +)\) is abelian, \(W\) is a normal subgroup of \((V,\ +)\) so that (i) is a well defined operation.
Now we shall prove that (ii) is a well defined operation.
\[ W+v_{1}=W+v_{2}\Rightarrow v_{1}-v_{2}\in W\Rightarrow \alpha (v_{1}-v_{2})\in W\]
since \(W\) is a subspace
\[\Rightarrow \alpha v_{1}-\alpha v_{2}\in W\Rightarrow \alpha v_{1}\in W+\alpha v_{2}\Rightarrow W+\alpha v_{1}=W+\alpha v_{2}\]
Hence (ii) is a well defined operation.
Now, let \(W+v_{1}, W+v_{2}, W+v_{3}\in V/W.\) Then
\(\seteqnumber{0}{1.}{0}\)
\begin{align*}
(W+v_{1})+[(W+v_{2})& +(W+v_{3})]\\ & =(W+v_{1})+(W+v_{2}+v_{3})\\ & =W+v_{1}+v_{2}+v_{3}\\ & = (W+v_{1}+v_{2})+(W+v_{3})\\ & =[(W+v_{1})+(W+v_{2})]+(W+v_{3})
\end{align*}
Hence \(+ \) is associative.
\(W+0=W\in V/W\) is the additive identity element. For
\[(W+v_{1})+(W+{0})= W+v_{1}=(W+0)+(W+v_{1})\]
for all \(v_{1}\in V\).
Also \(W-v_{1}\) is the additive inverse of \(W+v_{1}\).
Hence \(V/W\) is a group under \(+\).
Further,
\begin{align*}
(W+v_{1})+(W+v_{2})& =W+v_{1}+v_{2}\\ & =W+v_{2}+v_{1}\\ & =(W+v_{2})+(W+v_{1}).
\end{align*}
Hence \(V/W\) is an abelian group.
Now, let \(\alpha , \beta \in F.\)
\begin{align*} \alpha [(W+v_{1})+(W+v_{2})] & =\alpha (W+v_{1}+v_{2})\\ & =W+\alpha (v_{1}+v_{2})\\ & = W+\alpha v_{1}+\alpha v_{2}\\ & =(W+\alpha v_{1})+(W+\alpha v_{2})\\ & =\alpha (W+v_{1})+\alpha (W+v_{2}) \end{align*} Also,
\begin{align*} (\alpha +\beta )(W+v_{1})&= W+(\alpha +\beta )v_{1}\\ &=W+\alpha v_{1}+\beta v_{1}\\ &=(W+\alpha v_{1})+(W+\beta v_{1})\\ &=\alpha (W+v_{1})+\beta (W+ v_{1}), \end{align*}
\begin{align*} \alpha [\beta (W+v_{1})]&=\alpha (W+\beta v_{1})\\ &=W+\alpha \beta v_{1}\\ &=(\alpha \beta )(W+v_{1}) \end{align*} and
\begin{align*} 1(W+v_{1})&=W+1v_{1}=W+v_{1}. \end{align*} Hence \(V/W\) is a vector space over \(F\). The vector space \(V/W\) is called the quotient space of \(V\) by \(W.\) □
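For a concrete feel for the quotient construction, take \(V=\mathbb {R}^{3}\) and \(W=\{(a,\ b,\ 0)\ :\ a,\ b\in \mathbb {R}\}\); a coset \(W+v\) is then determined by the last coordinate of \(v\). The Python sketch below is illustrative only (the names coset, coset_add and coset_scale are assumptions): it stores the canonical representative \((0,0,z)\) and implements the operations (i) and (ii) of the theorem.
\begin{verbatim}
# Sketch: the quotient R^3 / W with W = {(a, b, 0)}.  A coset W + v is
# determined by the third coordinate of v.

def coset(v):
    return (0.0, 0.0, v[2])             # canonical representative of W + v

def coset_add(c1, c2):
    return (0.0, 0.0, c1[2] + c2[2])    # (W + v1) + (W + v2) = W + (v1 + v2)

def coset_scale(alpha, c):
    return (0.0, 0.0, alpha * c[2])     # alpha(W + v) = W + alpha v

v1, v2 = (1.0, 2.0, 3.0), (-4.0, 0.0, 5.0)
print(coset_add(coset(v1), coset(v2)))       # (0.0, 0.0, 8.0)
print(coset_scale(2.0, coset(v1)))           # (0.0, 0.0, 6.0)
print(coset(v1) == coset((7.0, -1.0, 3.0)))  # True: two representatives, one coset
\end{verbatim}
The last line illustrates the well-definedness argument of the proof: different representatives of the same coset lead to the same results.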
1.5 Linear Transformation
Definition 1.5.1. Let \(V\) and \(W\) be vector spaces over a field \(F\). A mapping \(T:V\rightarrow W\) is called a homomorphism if
(i) \(T(u+v)=T(u)+T(v)\) and
(ii) \(T(\alpha u)=\alpha T(u)\) where \(\alpha \in F\) and \(u, v\in V.\)
A homomorphism \(T\) of vector space is also called a linear transformation.
(a) If \(T\) is 1-1 then \(T\) is called a monomorphism.
(b) If \(T\) is onto then \(T\) is called an epimorphism.
(c) If \(T\) is 1-1 and onto then \(T\) is called an isomorphism.
(d) Two vector spaces \(V\) and \(W\) are said to be isomorphic if there exists an isomorphism \(T\) from \(V\) to \(W\) and we write \(V\cong W.\)
(e) A linear transformation \(T:V\rightarrow F\) is called a linear functional.
1. \(T:V\rightarrow W\) defined by \(T(v)=0\) for all \(v\in V\) is a trivial linear transformation.
2. \(T:V\rightarrow V\) defined by \(T(v)=v\) for all \(v\in V\) is the identity linear transformation.
3. Let \(V\) be a vector space over a field \(F\) and \(W\) a subspace of \(V\). Then \(T:V\rightarrow V/W\) defined by \(T(v)=W+v\) is a linear transformation.
\(\seteqnumber{0}{1.}{0}\)\begin{align*} T(v_{1}+v_{2})& =W+v_{1}+v_{2}\\ &=(W+v_{1})+(W+v_{2})\\ &=T(v_{1})+T(v_{2}) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} T(\alpha v_{1})&=W+\alpha v_{1}\\ &=\alpha (W+v_{1})\\ &=\alpha T(v_{1}) \end{align*}
This is called the natural homomorphism from \(V\) to \(V/W\). Clearly \(T\) is onto and hence \(T\) is an epimorphism.
4. \(T\) : \(V_{3}(\mathbb {R})\rightarrow V_{3}(\mathbb {R})\) defined by \(T(a,\ b,\ c)=(a,\ 0,0)\) is a linear transformation.
5. Let \(V\) be the set of all polynomials of degree \(\leq n\) in \(\mathbb {R}[x]\) including the zero polynomial. \(T:V\rightarrow V\) defined by \(T(f)=\displaystyle \frac {{d}f}{{d}x}\) is a linear transformation.
For, \(T(f+g)=\displaystyle \frac {{d}(f+g)}{{d}x}=\frac {{d}f}{{d}x}+\frac {{d}g}{{d}x}=T(f)+T(g)\).
Also \(T(\displaystyle \alpha f)=\frac {{d}(\alpha f)}{{d}x}=\alpha \frac {{d}f}{{d}x}=\alpha T(f)\).
6. Let \(V\) be as in Example 5. Then \(T : V\rightarrow V_{n+1}( \mathbb {R}) \) defined by \(T(a_{0}+a_{1}x+\cdots +a_{n}x^{n})= (a_{0},\ a_{1},\ \ldots ,\ a_{n})\) is a linear transformation.
Let \(f=a_{0}+a_{1}x+\cdots +a_{n}x^{n}\) and \(g=b_{0}+b_{1}x+\cdots +b_{n}x^{n}\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} f+g& =(a_{0}+b_{0})+(a_{1}+b_{1})x+\cdots +(a_{n}+b_{n})x^{n}\\ T(f+g)& =((a_{0}+b_{0}),\ (a_{1}+b_{1}),\ \ldots ,\ (a_{n}+b_{n}))\\ & =(a_{0},\ a_{1},\ \ldots ,\ a_{n})+(b_{0},\ b_{1},\ \ldots ,\ b_{n})\\ &=T(f)+T(g) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} T(\alpha f)&=(\alpha a_{0},\ \alpha a_{1},\ \ldots ,\ \alpha a_{n})\\ &=\alpha (a_{0},\ a_{1},\ \ldots ,\ a_{n})\\ & =\alpha T(f). \end{align*}
Clearly \(T\) is 1-1 and onto and hence \(T\) is an isomorphism.
7. Let \(V\) denote the set of all sequences of real numbers. Then \(T:V\rightarrow V\) defined by \(T(a_{1},\ a_{2},\ \ldots ,\ a_{n},\ \ldots )= (0,\ a_{1},\ a_{2},\ \ldots ,\ a_{n},\ \ldots )\) is a linear transformation.
8. \(T : \mathbb {R}^{2}\rightarrow \mathbb {R}^{2}\) defined by \(T(a,\ b)=(2a-3b,\ a+4b)\) is a linear transformation.
Let \(u=(a,\ b)\) and \(v=(c,\ d)\) and \(\alpha \in \mathbb {R}.\)
\(\seteqnumber{0}{1.}{0}\)\begin{align*} T(u+v)& =T((a,\ b)+(c,\ d))\\ &=T(a+c,\ b+d)\\ &=(2(a+c)-3(b+d),\ (a+c)+4(b+d))\\ &=(2a+2c-3b-3d,\ a+c+4b+4d)\\ &=(2a-3b+2c-3d,\ a+4b+c+4d)\\ &=(2a-3b,\ a+4b)+(2c-3d,\ c+4d)\\ &=T(a,\ b)+T(c,\ d)\\ & =T(u)+T(v) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} T(\alpha u)&=T(\alpha (a,\ b))=T(\alpha a,\ \alpha b)\\ &=(2\alpha a-3\alpha b,\ \alpha a+4\alpha b)\\ &=\alpha (2a-3b,\ a+4b)\\ &=\alpha T(a,\ b)\\ &=\alpha T(u) \end{align*} \(T\) is a linear transformation.
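The computations of Example 8 can also be spot-checked numerically; the following Python sketch is illustrative only and verifies the two defining properties of a linear transformation for one choice of \(u\), \(v\) and \(\alpha \).
\begin{verbatim}
# Sketch: T(a, b) = (2a - 3b, a + 4b) preserves addition and scalar
# multiplication (checked here on one sample only).

def T(v):
    a, b = v
    return (2 * a - 3 * b, a + 4 * b)

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(alpha, v):
    return (alpha * v[0], alpha * v[1])

u, v, alpha = (1.0, 2.0), (-3.0, 5.0), 4.0
print(T(add(u, v)) == add(T(u), T(v)))           # True
print(T(scale(alpha, u)) == scale(alpha, T(u)))  # True
\end{verbatim}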
Theorem. Let \(V\) and \(W\) be vector spaces over a field \(F\) and let \(T:V\rightarrow W\) be a linear transformation. Then \(T(V)=\{T(v)\ :\ v\in V\}\) is a subspace of \(W\).
Proof : Let \(w_{1}\) and \(w_{2}\in T(V)\) and \(\alpha \in F\).
Then there exist \(v_{1}, v_{2}\in V\) such that \(T(v_{1})=w_{1}\) and \(T(v_{2})=w_{2}\). Hence
\begin{align*} w_{1}+w_{2}& =T(v_{1})+T(v_{2})\\ &=T(v_{1}+v_{2})\in T(V) \end{align*} Similarly,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \alpha w_{1}&=\alpha T(v_{1})\\ &=T(\alpha v_{1})\in T(V) \end{align*} Hence \(T(V)\) is a subspace of \(W.\) □
Theorem. Let \(T:V\rightarrow W\) be a linear transformation and let \(V_{1}=ker \, T=\{v\ :\ v\in V\) and \(T(v)=0\}\). Then (1) \(V_{1}\) is a subspace of \(V\), and (2) if \(T\) is onto, then \(\displaystyle \frac {V}{V_{1}}\cong W.\)
Proof :
1. Given \(V_{1}=ker \, T=\{v\) : \(v\in V\) and \(T(v)=0\}\). Clearly \(T({0})=0\).
Hence \(0\in kerT=V_{1}\). \(\therefore \, V_{1}\) is a non-empty subset of \(V\).
Let \(u, v\in kerT\) and \(\alpha , \beta \in F\).
\(T(u)=0\) and \(T(v)=0\). Now
\begin{align*} T(\alpha u+\beta v)& =T(\alpha u)+T(\beta v)\\ & =\alpha T(u)+\beta T(v)\\ & =\alpha 0+\beta 0\\ & =0\\ \therefore \, \alpha u+\beta v & \in kerT \end{align*} Hence \(kerT\) is a subspace of \(V.\)
2. We define a map \(\varphi : \displaystyle \frac {V}{V_{1}}\rightarrow W\) by \(\varphi (V_{1}+v)=T(v)\).
\(\varphi \) is well defined.
Let \(V_{1}+v =V_{1}+w\)
\begin{align*} v & \in V_{1}+w\\ v& =v_{1}+w \quad \text { where } v_{1}\in V_{1} \\ T(v)& =T(v_{1}+ w) \\ & =T(v_{1})+T(w)\\ & =0+T(w) \quad \text {(since } v_{1}\in V_{1}=kerT)\\ & =T(w)\\ \therefore \quad \varphi (V_{1}+v)& =\varphi (V_{1}+w) \end{align*}
\(\varphi \) is 1-1.
\[\begin {array}{lll} & \varphi (V_{1}+v)& =\varphi (V_{1}+w) \\ \Rightarrow & T(v)& =T(w)\\ \Rightarrow &T(v)-T(w)& =0\\ \Rightarrow &T(v)+T(-w) & =0 \\ \Rightarrow &T(v-w) &=0\\ \Rightarrow &v-w & \in kerT=V_{1} \\ \Rightarrow &v& \in V_{1}+w\\\Rightarrow &V_{1}+v& =V_{1}+w \end {array}\]
\(\varphi \) is onto.
Let \(w\in W\). Since \(T\) is onto, there exists \(v\in V\) such that \(T(v)=w\) and so \(\varphi (V_{1}+v)=w\).
\(\varphi \) is a homomorphism.
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \varphi [(V_{1}+v)+(V_{1}+w)]& =\varphi [(V_{1}+(v+w) ] \\ & =T(v+w)\\ & =T(v)+T(w)\\ & = \varphi (V_{1}+v)+\varphi (V_{1}+w) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} \varphi [\alpha (V_{1}+v)]& =\varphi [(V_{1}+\alpha v)]\\ & =T(\alpha v)=\alpha T(v)\\ & =\alpha \varphi (V_{1}+v) \end{align*} Hence \(\varphi \) is an isomorphism from \(\displaystyle \frac {V}{V_{1}}\) onto \(W\) and so \(\displaystyle \frac {V}{V_{1}}\cong W.\)
□
Theorem. Let \(A\) and \(B\) be subspaces of a vector space \(V\) over a field \(F\). Then \(\displaystyle \frac {B}{A\cap B}\cong \frac {A+B}{A}.\)
Proof : We know that \(A+B\) is a subspace of \(V\) containing \(A\).
Hence \(\displaystyle \frac {A+B}{A}\) is also a vector space over \(F\).
An element of \(\displaystyle \frac {A+B}{A}\) is of the form \(A+(a+b)\) where \(a\in A\) and \(b\in B.\) But \(A+a=A\).
Hence an element of \(\displaystyle \frac {A+B}{A}\) is of the form \(A+b\).
Now, consider \(f : B\displaystyle \rightarrow \frac {A+B}{A}\) defined by \(f(b)=A+b\).
Clearly \(f\) is onto. Also
\begin{align*} f(b_{1}+b_{2})& =A+(b_{1}+b_{2})\\ & = (A+b_{1})+(A+b_{2})\\ & =f(b_{1})+f(b_{2}) \end{align*} and
\(\seteqnumber{0}{1.}{0}\)
\begin{align*}
f(\alpha b_{1})&= A+\alpha b_{1}\\ & =\alpha (A+b_{1})\\ & =\alpha f(b_{1})
\end{align*}
Hence \(f\) is a linear transformation.
Let \(K\) be the kernel of \(f\). Then
\[K=\{b\ :\ b\in B,\ A+b=A\}.\]
Now, \(A+b=A\) if and only if \(b\in A\).
Hence \(K=A\cap B\) and so \(\displaystyle \frac {B}{A\cap B}\cong \frac {A+B}{A}.\) □
Theorem 1.5.9. Let \(V\) and \(W\) be vector spaces over a field \(F\). Let \(L(V,\ W)\) represent the set of all linear transformations from \(V\) to \(W\). Then \(L(V,\ W)\) itself is a vector space over \(F\) under addition and scalar multiplication defined by \((f+g)(v)=f(v)+g(v)\) and \((\alpha f)(v)=\alpha f(v)\).
Proof : Let \(f, g\in L(V,\ W)\) and \(v_{1}, v_{2}\in V\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (f+g)(v_{1}+v_{2})& =f(v_{1}+v_{2})+g(v_{1}+v_{2})\\ & =f(v_{1})+f(v_{2})+g(v_{1})+g(v_{2})\\ & =f(v_{1})+g(v_{1})+f(v_{2})+g(v_{2})\\ &=(f+g)(v_{1})+(f+g)(v_{2}) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (f+g)(\alpha v)& =f(\alpha v)+g(\alpha v)\\ & =\alpha f(v)+\alpha g(v)\\ & =\alpha [f(v)+g(v)]\\ & =\alpha (f+g)(v) \end{align*} Hence \((f+g)\in L(V,\ W)\). Now,
\(\seteqnumber{0}{1.}{0}\)\begin{align*} (\alpha f)(v_{1}+v_{2})& =\alpha f(v_{1}+v_{2})\\ & =\alpha [f(v_{1})+f(v_{2})]\\ & =\alpha f(v_{1})+\alpha f(v_{2})\\ & =(\alpha f)(v_{1})+(\alpha f)(v_{2}) \end{align*} Also,
\(\seteqnumber{0}{1.}{0}\)
\begin{align*}
(\alpha f)(\beta v)&=\alpha [f(\beta v)]\\ & =\alpha [\beta f(v)]\\ & =\beta [\alpha f(v)]\\ &=\beta [(\alpha f)(v)]
\end{align*}
Hence \(\alpha f\in L(V,\ W)\).
Addition defined on \(L(V,\ W)\) is obviously commutative and associative.
The function \(f : V\rightarrow W\) defined by \(f(v)=0\) for all \(v\in V\) is clearly a linear transformation and is the additive identity of \(L(V,\ W)\).
Further \((-f) : V\rightarrow W\) defined by \((-f)(v)=-f(v)\) is the additive inverse of \(f\).
Thus \(L(V,\ W)\) is an abelian group under addition.
The remaining axioms for a vector space are obviously true.
Hence \(L(V,\ W)\) is a vector space over \(F.\) □
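The operations on \(L(V,\ W)\) can be realised directly with Python closures; the sketch below is illustrative (the maps \(f\), \(g\) and the names lt_add, lt_scale are assumptions) and checks that a combination such as \(2f+g\) is again additive.
\begin{verbatim}
# Sketch: pointwise operations on linear maps R^2 -> R^2 (Theorem 1.5.9).

def f(v):
    return (2 * v[0] - 3 * v[1], v[0] + 4 * v[1])

def g(v):
    return (v[1], -v[0])

def lt_add(s, t):
    return lambda v: tuple(a + b for a, b in zip(s(v), t(v)))

def lt_scale(alpha, s):
    return lambda v: tuple(alpha * a for a in s(v))

h = lt_add(lt_scale(2.0, f), g)   # the element 2f + g of L(V, W)
u, w = (1.0, 2.0), (3.0, -1.0)
uw = (u[0] + w[0], u[1] + w[1])
print(h(uw) == tuple(a + b for a, b in zip(h(u), h(w))))  # True: h is additive
\end{verbatim}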
1.6 Span of a set
Definition 1.6.1. Let \(V\) be a vector space over a field \(F\). Let \(v_{1}, v_{2}, \ldots , v_{n}\in V.\) Then an element of the form \(\alpha _{1}v_{1}+\alpha _{2}v_{2}+\cdots +\alpha _{n}v_{n}\) where \(\alpha _{i}\in F\) is called a linear combination of the vectors \(v_{1}, v_{2}, \ldots , v_{n}.\)
Definition. Let \(S\) be a non-empty subset of \(V\). The set of all linear combinations of finite sets of elements of \(S\) is called the linear span of \(S\) and is denoted by \(L(S)\).
Theorem 1.6.4. Let \(V\) be a vector space over a field \(F\) and \(S\) be a non-empty subset of \(V\). Then
(i) \(L(S)\) is a subspace of \(V.\)
(ii) \(S\subseteq L(S)\).
(iii) If \(W\) is any subspace of \(V\) such that \(S\subseteq W\), then \(L(S)\subseteq W\) (ie.,) \(L(S)\) is the smallest subspace of \(V\) containing \(S.\)
Proof :
1. Let \(v, w\in L(S)\) and \(\alpha , \beta \in F\).
Then \(v=\alpha _{1}v_{1}+\alpha _{2}v_{2}+\cdots +\alpha _{n}v_{n}\) where \(v_{i}\in S\) and \(\alpha _{i}\in F\).
Also, \(w=\beta _{1}w_{1}+\beta _{2}w_{2}+\cdots +\beta _{m}w_{m}\) where \(w_{j}\in S\) and \(\beta _{j}\in F.\) Now,
\begin{align*}
\alpha v+\beta w& =\alpha (\alpha _{1}v_{1}+\alpha _{2}v_{2}+\cdots +\alpha _{n}v_{n})\\ & \quad +\beta (\beta _{1}w_{1}+\beta _{2}w_{2}+\cdots +\beta _{m}w_{m})\\ &=(\alpha \alpha _{1})v_{1}+\cdots +(\alpha \alpha
_{n})v_{n}\\ & \quad +(\beta \beta _{1})w_{1}+\cdots +(\beta \beta _{m})w_{m}
\end{align*}
\(\therefore \) \(\alpha v+\beta w\) is also a linear combination of a finite number of elements of \(S\).
Hence \(\alpha v+\beta w\in L(S)\) and so \(L(S)\) is a subspace of \(V.\)
2. Let \(u\in S\). Then \(u=1u\in L(S)\). Hence \(S\subseteq L(S)\).
3. Let \(W\) be any subspace of \(V\) such that \(S\subseteq W\).
We claim that \(L(S) \subseteq W \).
Let \(u\in L(S)\).
Then \(u=\alpha _{1}u_{1}+ \alpha _{2}u_{2}+\cdots +\alpha _{n}u_{n}\) where \(u_{i}\in S\) and \(\alpha _{i}\in F\).
Since \(S\subseteq W\), we have \(u_{1}, u_{2}, \ldots , u_{n}\in W\) and so \(u\in W\).
Hence \(L(S)\subseteq W.\)
□
1. In \(V_{3}(\mathbb {R})\) let \(e_{1}=(1,0,0);e_{2}=(0,1,0)\) and \(e_{3}=(0,0,1)\)
(a) Let \(S=\{e_{1},\ e_{2}\}\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} L(S)& =\{\alpha e_{1}+\beta e_{2}\ :\ \alpha ,\ \beta \in \mathbb {R}\}\\ & =\{(\alpha ,\ \beta ,\ 0)\ :\ \alpha ,\ \beta \in \mathbb {R}\} \end{align*}
(b) Let \(S=\{e_{1},\ e_{2},\ e_{3}\}\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} L(S)& =\{\alpha e_{1}+\beta e_{2}+\gamma e_{3}\ :\ \alpha ,\ \beta ,\ \gamma \in \mathbb {R}\}\\ & =\{(\alpha ,\ \beta ,\ \gamma ) : \alpha , \beta , \gamma \in \mathbb {R}\}\\ & =V_{3} (\mathbb {R}) \end{align*} Thus \(V_{3} (\mathbb {R})\) is spanned by \(\{e_{1},\ e_{2},\ e_{3}\}.\)
2. In \(V_{n}(\mathbb {R})\) let \(e_{1}=(1,0,\ \ldots ,\ 0)\); \(e_{2}=(0,1,0,\ \ldots ,\ 0)\), \(\ldots \), \(e_{n}=(0,0,\ \ldots ,\ 1)\). Let \(S=\{e_{1},\ e_{2},\ \ldots ,\ e_{n}\}\). Then
\(\seteqnumber{0}{1.}{0}\)\begin{align*} L(S) & =\{\alpha _{1}e_{1}+\alpha _{2}e_{2}+\cdots +\alpha _{n}e_{n}\ :\ \alpha _{i}\in \mathbb {R}\}\\ & = \{(\alpha _{1},\ \alpha _{2},\ \ldots ,\ \alpha _{n})\ :\ \alpha _{i}\in \mathbb {R}\}\\ & =V_{n}(\mathbb {R}) \end{align*} Thus \(V_{n}(\mathbb {R})\) is spanned by \(\{e_{1},\ e_{2},\ \ldots ,\ e_{n}\}.\)
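Deciding whether a given vector of \(\mathbb {R}^{n}\) lies in \(L(S)\) for a finite set \(S\) amounts to testing the consistency of a linear system. The following Python sketch is an illustrative implementation (not from the text): it row-reduces the augmented matrix whose columns are the vectors of \(S\) together with the target vector, using exact rational arithmetic.
\begin{verbatim}
# Sketch: is the target vector t in the span L(S) of a finite S in R^n?
# Gaussian elimination on the augmented matrix [S | t] over the rationals.
from fractions import Fraction

def in_span(S, t):
    n = len(t)
    # rows of the augmented matrix; column j holds the j-th vector of S
    M = [[Fraction(v[i]) for v in S] + [Fraction(t[i])] for i in range(n)]
    row = 0
    for col in range(len(S)):
        pivot = next((r for r in range(row, n) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[row], M[pivot] = M[pivot], M[row]
        for r in range(n):
            if r != row and M[r][col] != 0:
                factor = M[r][col] / M[row][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[row])]
        row += 1
    # the system is inconsistent iff some remaining row reads (0 ... 0 | nonzero)
    return all(M[r][-1] == 0 for r in range(row, n))

e1, e2 = (1, 0, 0), (0, 1, 0)
print(in_span([e1, e2], (2, -5, 0)))  # True:  (2, -5, 0) = 2e1 - 5e2
print(in_span([e1, e2], (2, -5, 1)))  # False: not in L({e1, e2})
\end{verbatim}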
Theorem. Let \(S\) and \(T\) be non-empty subsets of a vector space \(V\) over a field \(F\). Then (a) \(S\subseteq T\Rightarrow L(S)\subseteq L(T)\); (b) \(L(S\cup T)=L(S)+L(T)\); (c) \(L(S)=S\) if and only if \(S\) is a subspace of \(V.\)
Proof :
(a) Let \(S\subseteq T\). Let \(s\in L(S)\). Then
\[s=\alpha _{1}s_{1}+\alpha _{2}s_{2}+\cdots +\alpha _{n}s_{n} \text { where } s_{i}\in S \text { and } \alpha _{i}\in F.\]
Now, since \(S\subseteq T, s_{i}\in T\).
Hence \(\alpha _{1}s_{1}+\alpha _{2}s_{2}+\cdots +\alpha _{n}s_{n}\in L(T)\).
Thus \(L(S) \subseteq L(T).\)
(b) Let \(s\in L(S\cup T)\).
Then \(s=\alpha _{1}s_{1}+\alpha _{2}s_{2}+\cdots +\alpha _{n}s_{n}\) where \(s_{i}\in S\cup T\) and \(\alpha _{i}\in F.\)
Without loss of generality we can assume that \(s_{1}, s_{2}, \ldots , s_{m}\in S\) and \(s_{m+1}, \ldots , s_{n}\in T.\) Hence
\begin{align*} \alpha _{1}s_{1}+\alpha _{2}s_{2}+\cdots +\alpha _{m}s_{m} & \in L(S) \text { and } \\ \alpha _{m+1}s_{m+1}+\cdots +\alpha _{n}s_{n}& \in L(T) \end{align*}
\(\seteqnumber{0}{1.}{0}\)
\begin{align*}
s& = (\alpha _{1}s_{1}+\alpha _{2}s_{2}+\cdots +\alpha _{m}s_{m})+(\alpha _{m+1}s_{m+1}+\cdots +\alpha _{n}s_{n})\\ & \in L(S)+L(T)
\end{align*} Hence \(L(S\cup T)\subseteq L(S)+L(T)\).
Also by (a) \(L(S)\subseteq L(S\cup T)\) and \(L(T)\subseteq L(S\cup T)\).
Hence \(L(S)+L(T)\subseteq L(S\cup T)\).
Hence \(L(S)+L(T)=L(S\cup T)\).
(c) Let \(L(S)=S\). Since \(L(S)\) is a subspace of \(V\), \(S=L(S)\) is a subspace of \(V\).
Conversely, let \(S\) be a subspace of \(V\). Then the smallest subspace containing \(S\) is \(S\) itself.
Hence \(L(S)=S.\)
□