MATRICES
1. INTRODUCTION:
A rectangular array of \(mn\) numbers in the form of \(m\) horizontal lines (called rows) and \(n\) vertical lines (called columns), is called a matrix of order \(m\) by \(n\), written as \(m \times n\) matrix. In compact form, the matrix is represented by \(A=\left[a_{ij}\right]_{m \times n}\)
2. SPECIAL TYPE OF MATRICES :
(a) Row Matrix (Row vector) : \(A=\left[\begin{array}{llll}a_{11} & a_{12} & \ldots & a_{1n}\end{array}\right]\) i.e. a row matrix has exactly one row.
(b) Column Matrix (Column vector) : \(A=\left[\begin{array}{c}a_{11} \\ a_{21} \\ \vdots \\ a_{m1}\end{array}\right]\) i.e. a column matrix has exactly one column.
(c) Zero or Null Matrix : \(\left(A=O_{m \times n}\right)\), An \(m \times n\) matrix whose all entries are zero.
(d) Horizontal Matrix : A matrix of order \(\mathrm{m} \times \mathrm{n}\) is a horizontal matrix if \(\mathrm{n}>\mathrm{m}\).
(e) Vertical Matrix : A matrix of order \(m \times n\) is a vertical matrix if \(\mathrm{m}>\mathrm{n}\)
(f) Square Matrix : (Order n) If number of rows \(=\) number of column, then matrix is a square matrix.
(i) The pair of elements \(a_{ij}\) & \(a_{ji}\) are called Conjugate Elements.
(ii) The elements \(a_{11}, a_{22}, a_{33}, \ldots, a_{nn}\) are called Diagonal Elements. The line along which the diagonal elements lie is called "Principal or Leading diagonal". The quantity \(\sum a_{ii}=\) trace of the matrix, written as \(\operatorname{tr}(A)\)
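A quick NumPy check of the diagonal elements and the trace (the matrix entries below are illustrative, not from the text):

```python
import numpy as np

# A sample 3x3 matrix to illustrate diagonal elements and trace.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

diagonal = np.diag(A)   # the principal diagonal: a11, a22, a33
trace = np.trace(A)     # tr(A) = sum of diagonal elements
print(diagonal)         # [1 5 9]
print(trace)            # 15
```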
3. SQUARE MATRICES :
(i) Minimum number of zeros in triangular matrix of order \(\mathrm{n}\) \(=\mathrm{n}(\mathrm{n}-1) / 2\)
(ii) Minimum number of zeros in a diagonal matrix of order \(\mathrm{n}\) \(=\mathrm{n}(\mathrm{n}-1)\)
(iii) Null square matrix is also a diagonal matrix.
4. EQUALITY OF MATRICES :
Two matrices \(A=\left[a_{ij}\right]\) & \(B=\left[b_{ij}\right]\) are equal if
(a) both have the same order.
(b) \(a_{ij}=b_{ij}\) for each pair of \(i\) & \(j\).
5. ALGEBRA OF MATRICES :
(I) Addition : \(A+B=\left[a_{i j}+b_{i j}\right]\) where \(A \) & \(B\) are of the same order.
(a) Matrix addition is commutative : \(A+B=B+A\)
(b) Matrix addition is associative : \((A+B)+C=A+(B+C)\)
(c) \(A+O=O+A=A\) (Additive identity)
(d) \(A+(-A)=(-A)+A=O\) (Additive inverse)
(II) Multiplication of A Matrix By A Scalar:
If \(A=\left[\begin{array}{lll}a & b & c \\ b & c & a \\ c & a & b\end{array}\right]\), then \(k A=\left[\begin{array}{lll}k a & k b & k c \\ k b & k c & k a \\ k c & k a & k b\end{array}\right]\)
(III) Multiplication of matrices (Row by Column):
Let A be a matrix of order \(m \times n\) and \(B\) be a matrix of order \(p\) \(\times \mathrm{q}\) then the matrix multiplication \(\mathrm{AB}\) is possible if and only if \(\mathrm{n}=\mathrm{p}\)
Let \(A_{m \times n}=\left[a_{i j}\right]\) and \(B_{n \times p}=\left[b_{i j}\right]\), then order of \(A B\) is \(m \times p\) & \(\boxed{(\mathrm{AB})_{\mathrm{ij}}=\sum_{\mathrm{r}=1}^{\mathrm{n}} \mathrm{a}_{\mathrm{ir}} \mathrm{b}_{\mathrm{rj}}}\)
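The boxed entrywise formula can be checked directly against NumPy's built-in product (a sketch; the matrices are illustrative):

```python
import numpy as np

def matmul(A, B):
    """Multiply matrices entrywise via (AB)_ij = sum_r a_ir * b_rj."""
    m, n = A.shape
    p, q = B.shape
    assert n == p, "AB is defined only when cols(A) == rows(B)"
    C = np.zeros((m, q))
    for i in range(m):
        for j in range(q):
            C[i, j] = sum(A[i, r] * B[r, j] for r in range(n))
    return C

A = np.array([[1, 2], [3, 4], [5, 6]])    # order 3 x 2
B = np.array([[7, 8, 9], [10, 11, 12]])   # order 2 x 3
print(matmul(A, B))                       # order 3 x 3, agrees with A @ B
```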
(IV)Properties of Matrix Multiplication:
Note :
If \(A\) and \(B\) are two non-zero matrices such that \(A B=O\), then \(A\) and \(B\) are called the divisors of zero. If \(A\) and \(B\) are two matrices such that
(i) \(A B=B A\) then \(A\) and \(B\) are said to commute
(ii) \(\mathrm{AB}=-\mathrm{BA}\) then \(\mathrm{A}\) and \(\mathrm{B}\) are said to anticommute
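Divisors of zero are easy to exhibit; the same pair also shows that \(AB=BA\) can fail (illustrative matrices, not from the text):

```python
import numpy as np

# Two non-zero matrices whose product is the null matrix (divisors of zero).
A = np.array([[0, 1],
              [0, 0]])
B = np.array([[1, 0],
              [0, 0]])
print(A @ B)   # [[0 0], [0 0]]  -> AB = O although A != O and B != O
print(B @ A)   # [[0 1], [0 0]]  -> BA != O, so A and B do not commute
```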
(b) Associativity : If \(A, B\) & \(C\) are conformable for the products \(AB\) & \(BC\), then \((AB)C=A(BC)\)
(c) Distributivity :
\(A(B+C)=AB+AC\) & \((A+B)C=AC+BC\), provided \(A, B\) & \(C\) are conformable for the respective products.
(V) Positive Integral Powers of A square matrix :
(a) \(A^{m} A^{n}=A^{m+n}\)
(b) \(\left(A^{m}\right)^{n}=A^{mn}=\left(A^{n}\right)^{m}\)
(c) \(I^{m}=I\), for all \(m, n \in N\)
6. CHARACTERISTIC EQUATION:
Let \(A\) be a square matrix. Then the polynomial in \(x,|A-x I|\) is called as characteristic polynomial of \(A \) \(\&\) the equation \(|A-x I|=0\) is called characteristic equation of \(A\)
7. CAYLEY - HAMILTON THEOREM :
Every square matrix \(A\) satisfies its characteristic equation, i.e. if \(a_{0} x^{n}+a_{1} x^{n-1}+\ldots+a_{n-1} x+a_{n}=0\) is the characteristic equation of matrix \(A\), then \(a_{0} A^{n}+a_{1} A^{n-1}+\ldots+a_{n-1} A+a_{n} I=O\)
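The theorem can be verified numerically: `np.poly` returns the characteristic polynomial coefficients of a square matrix, and substituting the matrix itself gives the null matrix (a sketch; the sample matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Characteristic polynomial coefficients a0, a1, ..., an (a0 = 1).
# For this A the polynomial is x^2 - 5x + 5.
coeffs = np.poly(A)

# Substitute the matrix for x: a0 A^n + a1 A^(n-1) + ... + an I
n = A.shape[0]
result = sum(c * np.linalg.matrix_power(A, n - k) for k, c in enumerate(coeffs))
print(np.allclose(result, np.zeros((n, n))))   # True: A satisfies its equation
```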
8. TRANSPOSE OF A MATRIX : (Changing rows & columns)
Let \(A\) be any matrix of order \(m \times n\). Then transpose of \(A\) is \(A^{T}\) or \(A^{\prime}\) of order \(\mathrm{n} \times \mathrm{m}\) and \(\left(\mathrm{A}^{\mathrm{T}}\right)_{\mathrm{ij}}=(\mathrm{A}_{ji})\).
Properties of transpose :
(a) \((\mathrm{A}+\mathrm{B})^{\mathrm{T}}=\mathrm{A}^{\mathrm{T}}+\mathrm{B}^{\mathrm{T}} ;\) note that \(\mathrm{A} \) & \(\mathrm{~B}\) have the same order.
(b) \((\mathrm{A} \mathrm{B})^{\mathrm{T}}=\mathrm{B}^{\mathrm{T}} \mathrm{A}^{\mathrm{T}}\) (Reversal law) \(\mathrm{A} \) & \(\mathrm{~B}\) are conformable for matrix product \(A B\)
(c) \(\left(\mathrm{A}^{\mathrm{T}}\right)^{\mathrm{T}}=\mathrm{A}\)
(d) \((\mathrm{kA})^{\mathrm{T}}=\mathrm{kA}^{\mathrm{T}}\), where \(\mathrm{k}\) is a scalar.
General : \(\left(A_{1} A_{2} \ldots A_{n}\right)^{T}=A_{n}^{T} \ldots A_{2}^{T} A_{1}^{T}\) (reversal law for transpose)
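The reversal law is easy to check on random conformable matrices (a sketch; sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 9, (2, 3))   # 2 x 3
B = rng.integers(0, 9, (3, 4))   # 3 x 4, so AB is defined

# Reversal law: (AB)^T = B^T A^T (note the order swaps)
print(np.array_equal((A @ B).T, B.T @ A.T))   # True
```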
9. ORTHOGONAL MATRIX
A square matrix \(A\) is said to be an orthogonal matrix if \(AA^{T}=I\)
(i) The determinant of an orthogonal matrix is either 1 or \(-1\). Hence an orthogonal matrix is always invertible.
(ii) \(\mathrm{AA}^{\mathrm{T}}=\mathrm{I}=\mathrm{A}^{\mathrm{T}} \mathrm{A}\). Hence \(\mathrm{A}^{-1}=\mathrm{A}^{\mathrm{T}}\).
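A rotation matrix is a standard example of an orthogonal matrix; both properties (i) and (ii) can be checked (the angle is illustrative):

```python
import numpy as np

# A 2D rotation matrix is orthogonal: A A^T = I and A^{-1} = A^T.
t = np.pi / 6
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

print(np.allclose(A @ A.T, np.eye(2)))      # True: A A^T = I
print(np.allclose(np.linalg.inv(A), A.T))   # True: A^{-1} = A^T
print(round(np.linalg.det(A), 10))          # 1.0 (determinant is +1 or -1)
```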
10. SOME SPECIAL SQUARE MATRICES :
(a) Idempotent Matrix : A square matrix is idempotent provided \(\mathrm{A}^{2}=\mathrm{A} .\)
(i) \(A^{n}=A\,\, \forall \,\, n \in N\)
(ii) determinant value of idempotent matrix is either 0 or 1
(iii) If idempotent matrix is invertible then it will be an identity matrix i.e. I.
Note that period of an idempotent matrix is 1 .
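A projection matrix is a standard idempotent example; properties (i) and (ii) can be checked (the matrix is illustrative):

```python
import numpy as np

# A projection matrix is idempotent: P^2 = P.
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])

print(np.allclose(P @ P, P))                         # True: P^2 = P
print(round(np.linalg.det(P), 10))                   # 0.0 (det is 0 or 1)
print(np.allclose(np.linalg.matrix_power(P, 5), P))  # True: P^n = P for all n
```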
(b) Nilpotent Matrix : A square matrix is nilpotent if \(A^{p}=O\) for some \(p \in N\) (the least such \(p\) is called the index of the nilpotent matrix).
Note that a nilpotent matrix will not be invertible.
(c) Involutary Matrix : A square matrix is involutary if \(A^{2}=I\).
Note that \(A=A^{-1}\) for an involutary matrix.
If \(A\) & \(B\) commute, i.e. \(AB=BA\), then \((A+B)^{n}={}^{n}C_{0}A^{n}+{}^{n}C_{1}A^{n-1}B+{}^{n}C_{2}A^{n-2}B^{2}+\ldots+{}^{n}C_{n}B^{n}\)
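The binomial expansion above requires \(AB=BA\). A scalar matrix such as \(2I\) commutes with every matrix, so the identity can be verified on that case (the second matrix is illustrative):

```python
import numpy as np
from math import comb

# A = 2I commutes with every B, so the binomial expansion applies.
A = 2 * np.eye(2)
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = 3

lhs = np.linalg.matrix_power(A + B, n)
# Sum of nCk * A^(n-k) * B^k, valid because AB = BA here.
rhs = sum(comb(n, k)
          * np.linalg.matrix_power(A, n - k) @ np.linalg.matrix_power(B, k)
          for k in range(n + 1))
print(np.allclose(lhs, rhs))   # True
```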
11. SYMMETRIC & SKEW SYMMETRIC MATRIX :
(a) Symmetric matrix : A square matrix \(A=\left[a_{ij}\right]\) is symmetric if \(A^{T}=A\), i.e. \(a_{ij}=a_{ji}\) for all \(i\) & \(j\).
Note : Maximum number of distinct entries in any symmetric matrix of order \(\mathrm{n}\) is \(\dfrac{\mathrm{n}(\mathrm{n}+1)}{2}\).
(b) Skew symmetric matrix : A square matrix \(A=\left[a_{ij}\right]\) is skew symmetric if \(A^{T}=-A\), i.e. \(a_{ij}=-a_{ji}\) for all \(i\) & \(j\) (so \(a_{ii}=-a_{ii} \Rightarrow a_{ii}=0\)).
Thus the diagonal elements of a skew symmetric matrix are all zero, but the converse need not be true.
(c) Properties of symmetric & skew symmetric matrix :
(ii) The sum of two symmetric matrices is a symmetric matrix and the sum of two skew symmetric matrices is a skew symmetric matrix.
(iii) If \(A\) & \(B\) are symmetric matrices, then :
(1) \(AB+BA\) is a symmetric matrix
(2) \(\mathrm{AB}-\mathrm{BA}\) is a skew symmetric matrix.
Every square matrix \(A\) can be uniquely expressed as the sum of a symmetric and a skew symmetric matrix :
\(\begin{aligned}A &=\underbrace{\frac{1}{2}\left(A+A^{T}\right)}_{\text{symmetric}}+\underbrace{\frac{1}{2}\left(A-A^{T}\right)}_{\text{skew symmetric}} \\ \text{and} \quad A &=\frac{1}{2}\left(A^{T}+A\right)-\frac{1}{2}\left(A^{T}-A\right)\end{aligned}\)
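The symmetric/skew-symmetric split can be computed directly (the matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 3.0]])

S = 0.5 * (A + A.T)   # symmetric part:       S^T =  S
K = 0.5 * (A - A.T)   # skew symmetric part:  K^T = -K

print(np.array_equal(S, S.T))     # True
print(np.array_equal(K, -K.T))    # True
print(np.array_equal(S + K, A))   # True: the parts sum back to A
```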
12. ADJOINT OF A SQUARE MATRIX :
Let \(A=\left[a_{i j}\right]=\left(\begin{array}{lll}a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33}\end{array}\right)\) be a square matrix and let the matrix formed by the cofactors of \(\left[a_{i j}\right]\) in determinant \(|A|\) is \(\left(\begin{array}{lll}\mathrm{C}_{11} & \mathrm{C}_{12} & \mathrm{C}_{13} \\ \mathrm{C}_{21} & \mathrm{C}_{22} & \mathrm{C}_{23} \\ \mathrm{C}_{31} & \mathrm{C}_{32} & \mathrm{C}_{33}\end{array}\right) .\) Then \((\operatorname{adj} \mathrm{A})=\left(\begin{array}{ccc}\mathrm{C}_{11} & \mathrm{C}_{21} & \mathrm{C}_{31} \\ \mathrm{C}_{12} & \mathrm{C}_{22} & \mathrm{C}_{32} \\ \mathrm{C}_{13} & \mathrm{C}_{23} & \mathrm{C}_{33}\end{array}\right)\)= Transpose of cofactor matrix.
If \(A\) be a square matrix of order \(n\), then
(i) \(\quad A(\operatorname{adj} A)=|A| I_{n}=(\operatorname{adj} A) \cdot A\)
(ii) \(\quad|\operatorname{adj} A|=|A|^{n-1}, n \geq 2\)
(iii) \(\operatorname{adj}(\operatorname{adj} A)=|A|^{n-2} A, \quad|A| \neq 0\).
(iv) \(\operatorname{adj}(\mathrm{AB})=(\operatorname{adj} B)(\operatorname{adj} \mathrm{A})\)
(v) \(\operatorname{adj}(\mathrm{KA})=\mathrm{K}^{\mathrm{n}-1}(\operatorname{adj} \mathrm{A})\), where \(\mathrm{K}\) is a scalar
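The adjoint can be built from cofactors exactly as defined above, and properties (i) and (ii) checked numerically (the matrix is illustrative; `adj` is a helper written for this sketch, not a NumPy function):

```python
import numpy as np

def adj(A):
    """Adjoint = transpose of the cofactor matrix."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # Cofactor C_ij = (-1)^(i+j) * minor obtained by deleting row i, col j
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 2.0],
              [0.0, 1.0, 1.0]])
d = np.linalg.det(A)

print(np.allclose(A @ adj(A), d * np.eye(3)))      # (i)  A (adj A) = |A| I
print(np.isclose(np.linalg.det(adj(A)), d ** 2))   # (ii) |adj A| = |A|^(n-1)
```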
13. INVERSE OF A MATRIX (Reciprocal Matrix) :
We have, \(A \cdot(\operatorname{adj} A)=|A| I_{n}\)
\(\Rightarrow \mathrm{A}^{-1} \cdot \mathrm{A}(\operatorname{adj} \mathrm{A})=\mathrm{A}^{-1} \mathrm{I}_{\mathrm{n}}|\mathrm{A}|\)
\(\Rightarrow \mathrm{I}_{\mathrm{n}}(\operatorname{adj} \mathrm{A})=\mathrm{A}^{-1}|\mathrm{~A}| \mathrm{I}_{\mathrm{n}}\)
\(\therefore \quad \mathrm{A}^{-1}=\frac{(\operatorname{adj} \mathrm{A})}{|\mathrm{A}|}\)
Note : The necessary and sufficient condition for a square matrix \(\mathrm{A}\) to be invertible is that \(|\mathrm{A}| \neq 0\)
Theorem : If \(A \) & \( B\) are invertible matrices of the same order, then \((\mathrm{AB})^{-1}=\mathrm{B}^{-1} \mathrm{~A}^{-1}\)
(i) If \(A\) be an invertible matrix, then \(A^{T}\) is also invertible & \(\left(\mathrm{A}^{\mathrm{T}}\right)^{-1}=\left(\mathrm{A}^{-1}\right)^{\mathrm{T}}\)
(a) \(\left(\mathrm{A}^{-1}\right)^{-1}=\mathrm{A}\)
(b) \(\left(A^{k}\right)^{-1}=\left(A^{-1}\right)^{k}=A^{-k} ; k \in N\)
(iii) \(\left|A^{-1}\right|=\dfrac{1}{|A|}\).
14. SYSTEM OF EQUATIONS & CRITERIA FOR CONSISTENCY
Gauss - Jordan method :
\(a_{1} x+b_{1} y+c_{1} z=d_{1}\)
\(a_{2} x+b_{2} y+c_{2} z=d_{2}\)
\(a_{3} x+b_{3} y+c_{3} z=d_{3}\)
\(\Rightarrow\left[\begin{array}{l}a_{1} x+b_{1} y+c_{1} z \\ a_{2} x+b_{2} y+c_{2} z \\ a_{3} x+b_{3} y+c_{3} z\end{array}\right]=\left[\begin{array}{l}d_{1} \\ d_{2} \\ d_{3}\end{array}\right]\)\( \Rightarrow\left[\begin{array}{lll}a_{1} & b_{1} & c_{1} \\ a_{2} & b_{2} & c_{2} \\ a_{3} & b_{3} & c_{3}\end{array}\right]\left[\begin{array}{l}x \\ y \\ z\end{array}\right]=\left[\begin{array}{l}d_{1} \\ d_{2} \\ d_{3}\end{array}\right]\)
\(\Rightarrow A X=B \quad \Rightarrow A^{-1} A X=A^{-1} B\), if \(|A| \neq 0\)
\(\Rightarrow X=A^{-1} B=\dfrac{\operatorname{Adj} A}{|A|} \cdot B\)
(i) If \(|A| \neq 0\), system is consistent having unique solution
(ii) If \(|A| \neq 0 \) & \((\operatorname{adj} A) \cdot B \neq O\) (Null matrix), system is consistent having unique non-trivial solution.
(iii) If \(|A| \neq 0 \) & \((\operatorname{adj} A) \cdot B=O\) (Null matrix), system is consistent having trivial solution.
(iv) If \(|A|=0\), then
(1) If \((\operatorname{adj} A) \cdot B \neq O\) (Null matrix), the system is inconsistent, i.e. it has no solution.
(2) If \((\operatorname{adj} A) \cdot B=O\) (Null matrix), the system has either infinitely many solutions (consistent) or no solution (inconsistent).
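The matrix method \(X=A^{-1}B\) can be carried out directly when \(|A| \neq 0\) (the system below is a made-up example, not from the text):

```python
import numpy as np

# Sample system:  x + y + z = 6,  2x + y - z = 1,  x - y + z = 2
A = np.array([[1.0,  1.0,  1.0],
              [2.0,  1.0, -1.0],
              [1.0, -1.0,  1.0]])
B = np.array([6.0, 1.0, 2.0])

assert np.linalg.det(A) != 0   # |A| != 0  ->  unique solution exists

X = np.linalg.inv(A) @ B       # X = A^{-1} B
print(np.round(X, 6))          # solution (x, y, z) = (1, 2, 3)
print(np.allclose(A @ X, B))   # True: X satisfies the system
```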