I am sometimes a teaching assistant for MATH 133 at McGill, an introductory linear algebra course that covers linear systems, diagonalization, geometry in two- and three-dimensional Euclidean space, and the vector space $ \mathbb{R}^n$. I've collected here a few theoretical questions that I like to use, in the hope that they may be useful to people studying this kind of material. I made up all of these questions, although many of them naturally resemble questions found elsewhere. Some of the questions are a bit unusual and curious, but none of them need special tricks to solve, just an understanding of the concepts in the course. They focus mostly on understanding the theory; there are very few straight computational questions here.
Note. None of these questions are official, in the sense that I do not write the final exams. The exact syllabus of the course should always be taken to be the class material and the official material on the course website. These questions are for extra practice, but they do follow the material of the course fairly closely.
Conventions. All matrices are matrices over the real numbers, and all vector spaces are real vector spaces. In fact, every vector space considered will be $ \mathbb{R}^n$ for some $ n$, or a subspace of one, since the class does not cover abstract vector spaces.
- Homogeneous Systems
- Homogeneous Systems II
- Homogeneous Systems III
- Parameters
- The system $ AX = B$ has infinitely many solutions.
- The system $ AX = B$ has at least one solution.
- The system $ AX = 0$ has infinitely many solutions.
- The system $ AX = 0$ has at least one solution.
- The system $ AX = B$ has no solutions.
- Different Kinds of Parameters
- Different Kinds of Parameters II
- Zero or Nonzero?
- Well-Defined
- $ AB - C$
- $ AB - BA$
- $ ABC$
- $ CBA$
- $ BAC$
- $ A^tC$
- Real or Fake
- Real or Fake II
- Symmetric Matrices
- Symmetric Matrices II
- Symmetric Matrices III**
- Commuting Matrices
- Determinants and Products
- Invertibility?
- Inverses Inverses
- Inverses and Solutions
- The homogeneous system $ AX = 0$ has infinitely many solutions.
- The system $ AX = B$ has infinitely many solutions for any column $ B$.
- Elementary Matrices
- Computing Determinants
- $ \det(A^2B)$
- $ \det(A^tB)$
- $ \det(AB^4)$
- $ \det(\mathrm{adj}(A)B)$
- $ \det(\mathrm{adj}(AB))$
- $ \det(5\mathrm{adj}(-6B))$
- $ \det(3A^t(A^tB^4))$
- Computing Determinants II**
- Characteristic Polynomial
- Characteristic Polynomial II
- Characteristic Polynomial III
- Rank and Diagonalization
- $ A - 3I$
- $ A + 3I$
- $ A - 2I$
- $ A + 2I$
- $ A - 10I$
- Rank and Diagonalization II
- $ A - I$
- $ A + 3I$
- $ A - 200I$
- Lonely Eigenvalue
- The matrix $ A + \lambda I_n$ can be reduced to the identity using elementary row operations.
- The matrix $ A - \lambda I_n$ is invertible.
- If $ A$ is diagonalizable, then $ A - \lambda I_n$ has rank $ 1$.
- Parameter Dependence
- When Can You Diagonalize?
- If $ A$ has $ n$ distinct eigenvalues, then $ A$ is diagonalizable.
- If $ A$ has fewer than $ n$ distinct eigenvalues, then $ A$ is not diagonalizable.
- If $ A$ has $ 0$ as an eigenvalue, then $ A$ is not diagonalizable.
- If $ A$ has $ 0$ as an eigenvalue, then $ A$ is not invertible.
- If $ \det(A) = 0$ then $ A$ is not diagonalizable.
- If $ \det(A) = 0$ then $ A$ has $ 0$ as its only eigenvalue.
- Matrix Operations and Diagonalization
- If $ A$ and $ B$ are diagonalizable, then $ A + B$ is diagonalizable.
- If $ A$ and $ B$ are diagonalizable, then $ AB$ is diagonalizable.
- If $ A$ is diagonalizable then $ A^k$ is diagonalizable.
- Either $ A$ or $ A + I$ is diagonalizable.
- $ A + kI$ is diagonalizable for some real number $ k$.
- Similarity
- $ B$ is diagonalizable
- $ P$ is diagonalizable
- Computation
- Lengths
- Distance Between a Line and a Point
- Distance Between a Line and a Point II**
- Points, Points
- Lots of Points
- Defined Things
- $ u\times v$
- $ u\cdot v$
- $ u\times(u\cdot v)$
- $ u\cdot (u\times v)$
- $ \tfrac{1}{u\times v}u$
- $ \tfrac{1}{u\cdot v}v$
- $ u\times(v\times w)$
- $ u\cdot(v\cdot w)$
- $ (u\times w)\cdot (v\times w)$
- $ (u\cdot v)\times (v\cdot u)$
- $ v\times ( (u\cdot u)v)$
- $ v\cdot ( (u\times u)\times v)$
- Examples
- Rotation by an angle of $ \theta = \tfrac{\pi}{6}$ counterclockwise.
- Rotation by an angle of $ \theta = \tfrac{\pi}{12}$ counterclockwise (your answer should be expressed using only roots and numbers, with no $ \sin(x)$ or $ \cos(x)$ appearing in your formula).
- Reflection about the line $ y = 10x$.
- Rotation by an angle of $ \theta = \tfrac{\pi}{4}$ clockwise followed by a reflection in the line $ y = -x$. How does this compare to the matrix of rotation counterclockwise by an angle of $ \tfrac{\pi}{4}$?
- Rotation, Reflection
- Tricky Examples
- $ f:\mathbb{R}\to \mathbb{R}$ given by $ f(x) = 3x + 4$
- $ f:\mathbb{R}\to \mathbb{R}$ given by $ f(x) = |x|$
- $ f:\mathbb{R}\to \mathbb{R}^2$ given by $ f(x) = [4x~~~0]^t$
- Let $ v\in\mathbb{R}^2$ be a fixed vector. Define $ f:\mathbb{R}^2\to \mathbb{R}$ by $ f(u) = u\cdot v$ where $ u\cdot v$ is the dot product between $ u$ and $ v$.
- Let $ v\in \mathbb{R}^3$ be a fixed vector. Define $ f:\mathbb{R}^3\to\mathbb{R}^3$ by $ f(u) = u\times v$ where $ u\times v$ is the cross product between $ u$ and $ v$.
- Let $ v\in \mathbb{R}^2$ be a fixed vector. Define $ f:\mathbb{R}^2\to \mathbb{R}$ by $ f(u) = \det[u~~~v]$ where $ [u~~~v]$ is the matrix with $ u$ as the first column and $ v$ as the second column (hint: do an example).
- Let $ v\in\mathbb{R}^2$ be a fixed vector. Define $ f:\mathbb{R}^2\to \mathbb{R}$ by $ f(u) = \mathrm{trace}[u~~~v]$.
- Effects
- Spanning Matrices
- Spanning Matrices II
- Spanning Matrices III
- Matrix Transformations
- The system $ AX = 0$ has infinitely many solutions.
- The matrix $ A$ cannot be reduced to the identity matrix using elementary row operations.
- $ \det(A) \not= 0$.
- The matrix $ A$ does not have $ 0$ as an eigenvalue.
- Matrix Transformations II
- No such linear transformation can exist, because a linear transformation cannot rotate one basis vector and reflect the other. Madness!
- The matrix $ A$ can be written as a product of elementary matrices.
- The matrix $ A$ has eigenvalue $ 0$.
- $ \det(A)\not = 0$.
- Spanning and Counting
- Spanning and Counting II**
- Spanning
- Spanning II
- Linear Transformation Dimensions
- $ \mathrm{dim}(\mathrm{im}(A)) = 5$
- $ \mathrm{dim}(\mathrm{im}(A)) < 5$
- The system $ AX = 0$ has infinitely many solutions.
- Null space
- Null Space II
- $ \mathrm{dim}(\mathrm{null}(A)) = 0$
- $ \mathrm{dim}(\mathrm{null}(B)) = 0$
- $ \mathrm{dim}(\mathrm{im}(A)) = n$
- $ \mathrm{dim}(\mathrm{im}(B)) = n$.
- Image
- $ \mathrm{dim}(\mathrm{im}(C)) = n$
- $ \mathrm{dim}(\mathrm{im}(B)) = n$
- $ \mathrm{dim}(\mathrm{im}(A)) = n$
- $ \mathrm{dim}(\mathrm{null}(C)) = 0$
- $ \mathrm{dim}(\mathrm{null}(B)) = 0$
- $ \mathrm{dim}(\mathrm{null}(A)) = 0$
- Linear Independence
- Linear Independence II
- For every linear transformation $ f:\mathbb{R}^n\to \mathbb{R}^n$, the vectors $ f(v_1),f(v_2),f(v_3)$ are linearly independent.
- There exists a linear transformation $ f:\mathbb{R}^n\to\mathbb{R}^2$ such that the vectors $ f(v_1),f(v_2),f(v_3)$ are linearly independent.
- There exist infinitely many linear transformations $ f:\mathbb{R}^n\to\mathbb{R}^3$ such that $ f(v_1),f(v_2),f(v_3)$ are linearly independent.
- There is a unique linear transformation $ f:\mathbb{R}^n\to\mathbb{R}^3$ such that the vectors $ f(v_1),f(v_2),f(v_3)$ are linearly independent.
- Subspaces
- Let $ c\in \mathbb{R}$ be a constant. $ \{ [m~~~n~~~0]^t : m - n = c^2\} $ .
- $ \{ [m~~~n~~~2n]^t : mn = 0\}$.
- $ \{ [m~~~0~~~m^2]^t : m^4 = 0\}$.
- $ \{ [m~~~0~~~0]^t : \sin(m) = 1\}$.
- $ \{ [m~~~n~~~p]^t : m^3 = n^3\}$.
- $ \{ [m~~~0~~~0]^t : \mathrm{det}(\begin{pmatrix}m & 2\\ 0 & 11\end{pmatrix}) = 0 \}$
- $ \{ [m~~~n~~~p]^t : 4m + 2n - 10p = 0\}$
- $ \{ v\in \mathbb{R}^3 : v\cdot [2~~~4~~~-9]^t = 0\}$
- $ \{ v\in \mathbb{R}^3 : v\cdot [1~~~2~~~50]^t \not=0\}$
- $ \{ v\in \mathbb{R}^3 : v\times [-9~~~-1~~~5]^t = 0\}$
- $ \{ v\in \mathbb{R}^3 : v\times [1~~~10~~~1]^t = 1\}$
- $ \{ v\in \mathbb{R}^3 : \{ v, [1~~~1~~~1]^t\}\text{ are linearly independent}\}$
- $ \{ [m~~~n~~~p]^t : m^2 + n^2 + p^2 \geq 0\}$
- Subspaces II
- Let $ f:\mathbb{R}^n\to\mathbb{R}^n, g:\mathbb{R}^n\to \mathbb{R}^n$ be two linear transformations and define $ S$ to be the set of vectors $ v\in \mathbb{R}^n$ such that $ f(v), g(v)$ are linearly independent.
- Let $ f:\mathbb{R}^n\to\mathbb{R}^n$ and $ g:\mathbb{R}^n\to\mathbb{R}$ be two linear transformations, and define $ S$ to be the set of vectors $ v\in \mathbb{R}^n$ such that $ f(v) = g(v)f(v)$.
- Let $ f:\mathbb{R}^n\to\mathbb{R}^n$ and $ g:\mathbb{R}^n\to\mathbb{R}$ be two linear transformations, and define the set $ S$ to be the set of vectors $ v\in \mathbb{R}^n$ such that $ g(f(v)) = g(v)$.
- Let $ f:\mathbb{R}^n\to \mathbb{R}$, $ g:\mathbb{R}^n\to\mathbb{R}$, and $ h:\mathbb{R}\to\mathbb{R}^n$ be three linear transformations and define $ S$ to be the set of $ v\in \mathbb{R}^n$ such that $ h(f(v) + g(v)) = v$.
- Domains and Codomains
- $ f$ is the zero transformation
- With respect to any basis of $ \mathbb{R}^3$, the matrix of $ f$ has rank $ 1$.
- With respect to any basis of $ \mathbb{R}^3$, the matrix of $ f$ has rank at most $ 1$.
- With respect to any basis of $ \mathbb{R}^3$, the matrix of $ f$ has zero determinant.
- If $ A$ is the matrix of $ f$ with respect to the standard basis of $ \mathbb{R}^3$, then the system $ AX = 0$ has a unique solution.
Linear Systems
Suppose the homogeneous system $ AX = 0$ has a unique solution and the rank of $ A$ is $ 5$. What are the dimensions of $ A$?
True or false: Suppose the system $ AX = B$ has infinitely many solutions. Then $ AX = 0$ has infinitely many solutions.
True or false: if $ AX = 0$ has infinitely many solutions then $ A^tX = 0$ has infinitely many solutions.
Let $ A$ be a $ 3\times 3$ matrix and suppose its row-echelon form has two leading $ 1$'s (i.e. it has rank two). Which of the following must be true?
For which values of $ a$ and $ b$ does the system
have infinitely many solutions? Are there any values of $ a$ and $ b$ such that the system has \emph{no} solutions?
For which values of $ a$ and $ b$ does the system
have a solution for all possible values of $ m$?
Matrices
For which numbers $ n=1,2,3,4,\dots$ is the following statement true: if $ A$ and $ B$ are any $ n\times n$ matrices such that $ AB = 0$ then $ BA = 0$.
Suppose $ A$ is a $ 3\times 2$ matrix, $ B$ is a $ 2\times 3$ matrix and $ C$ is a $ 3\times 8$ matrix. Which of the following expressions are well-defined?
Suppose that $ A$ and $ J$ are $ 2\times 2$ matrices such that $ AJ = A$. Must $ J$ be the identity matrix?
Suppose that $ J$ is an $ n\times n$ matrix such that $ AJ = A$ for every $ n\times n$ matrix $ A$. Must $ J$ be the identity matrix?
A matrix $ A$ is called symmetric if $ A = A^t$. What can you say about the dimensions of $ A$?
If $ A$ is an $ n\times n$ matrix, show that $ AA^t$ is a symmetric matrix.
Suppose that $ B$ is an $ n\times n$ symmetric matrix. Can we conclude that there is some matrix $ A$ such that $ B = AA^t$?
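Not a substitute for the proof, but here is a quick numerical sanity check (a sketch using numpy, with a randomly chosen sample matrix) that $ AA^t$ is symmetric:

```python
import numpy as np

# Sanity check (not a proof): for a randomly chosen square matrix A,
# the product A A^t should equal its own transpose.
rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)
S = A @ A.T
print(np.allclose(S, S.T))  # True: A A^t is symmetric
```

The same check works for non-square $ A$ as well; $ AA^t$ is always square and symmetric.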
Suppose that $ A$ is a $ 2\times 2$ matrix and $ AB = BA$ for all $ 2\times 2$ matrices $ B$. Show that $ A$ is a constant multiple of the identity matrix.
Determinants
If $ A$ and $ B$ are $ n\times n$ matrices, is it true that $ \det(AB) = \det(BA)$? Try some small matrices first!
Suppose $ A,B,C,$ and $ D$ are $ n\times n$ matrices and $ AB^2C^tABC^5D^9AD^t$ is invertible. Can we conclude that $ A$ is invertible?
Suppose that $ A$ is an invertible matrix. Is it true that $ (A^{-1})^{-1} = A$ (in words: the inverse of the inverse of $ A$ is $ A$)? Explain.
Suppose $ A$ is an $ n\times n$ matrix that is not invertible. Which of the following statements are true?
Find a matrix that cannot be written as the product of elementary matrices, and justify your answer.
Suppose $ A$ is a $ 3\times 3$ matrix such that $ \det(A) = 4$ and $ B$ is a $ 3\times 3$ matrix with $ \det(B) = -3$. Compute the following determinants:
Suppose $ A$ is an $ n\times n$ matrix. Find the determinant of
in terms of $ \det(A)$ and $ n$. Hint: your formula should have an $ n$ and $ \det(A)$ in it. First find a formula for $ \det(\mathrm{adj}(A))$ in terms of $ n$ and $ \det(A)$ and then apply this formula multiple times.
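If you want to check the first step of the hint numerically, you can use the identity $ A\,\mathrm{adj}(A) = \det(A)I$ to compute the adjugate of an invertible sample matrix (a sketch in numpy, using a small matrix I chose for illustration):

```python
import numpy as np

# Sanity check (not a proof) of the hint's first step: for an invertible
# n x n matrix A we have A * adj(A) = det(A) * I, so adj(A) = det(A) * A^{-1},
# and one can then compare det(adj(A)) against powers of det(A).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 4.0],
              [5.0, 0.0, 6.0]])
n = A.shape[0]
d = np.linalg.det(A)              # det(A) = 58 for this sample matrix
adjA = d * np.linalg.inv(A)       # adjugate via A * adj(A) = det(A) * I
print(np.isclose(np.linalg.det(adjA), d ** (n - 1)))  # True
```

Trying a few different sizes $ n$ should suggest the general pattern the hint is pointing at.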
Diagonalization
Suppose $ A$ and $ B$ are square matrices with characteristic polynomials $ p(x)$ and $ q(x)$ respectively. True or false: $ AB$ has characteristic polynomial $ p(x)q(x)$.
Suppose a matrix $ A$ has characteristic polynomial
Is $ A$ necessarily diagonalizable?
Suppose a matrix $ A$ has characteristic polynomial
Is $ A$ necessarily diagonalizable?
A square matrix $ A$ has characteristic polynomial
Suppose $ A$ is diagonalizable. Determine the rank of the following matrices:
Suppose that a square matrix $ A$ has characteristic polynomial
Suppose further that $ A$ is not diagonalizable. Determine the rank of the following matrices:
Let $ A$ be an $ n\times n$ matrix with \emph{exactly one} eigenvalue $ \lambda$. Find the one true statement:
For which values of $ m$ is the matrix
diagonalizable?
Which of the following statements are true for any $ n\times n$ matrix $ A$? For any false statements you find, provide a counterexample.
Which of the following statements are true for any $ n\times n$ matrices $ A$ and $ B$? For any false statements you find, provide a counterexample.
Suppose $ A$ is diagonalizable and $ B = P^{-1}AP$ for some invertible matrix $ P$. Which of the following statements are necessarily true?
Consider the matrix
Find a formula for $ A^n$ and use it to compute $ A^4$.
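The matrix from the exercise is what you should work with, but the method is worth sketching: if $ A = PDP^{-1}$ with $ D$ diagonal, then $ A^n = PD^nP^{-1}$, and powering $ D$ just powers its diagonal entries. A numpy sketch with a hypothetical diagonalizable matrix (not the one in the exercise):

```python
import numpy as np

# Hypothetical example of computing A^k by diagonalization:
# A = P D P^{-1}  implies  A^k = P D^k P^{-1}.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])    # eigenvalues 2 and 3 (distinct => diagonalizable)
evals, P = np.linalg.eig(A)   # columns of P are eigenvectors

def matrix_power_via_diag(k):
    # Powering the diagonal matrix D = diag(evals) powers each eigenvalue.
    return P @ np.diag(evals ** k) @ np.linalg.inv(P)

A4 = matrix_power_via_diag(4)
print(np.allclose(A4, np.linalg.matrix_power(A, 4)))  # True
```

For this sample matrix the closed form works out to $ A^n = \begin{pmatrix}2^n & 3^n - 2^n\\ 0 & 3^n\end{pmatrix}$, which you can verify against the code.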
Geometry
Let $ v\in \mathbb{R}^n$. True or false:
is a vector that has length one.
Find the distance in $ \mathbb{R}^2$ between the line $ y = 3x + 8$ and the point $ P(10,5)$.
Suppose that $ L$ is a line in $ \mathbb{R}^3$, $ Q$ a point on $ L$, and $ P$ a point not on $ L$. Suppose that $ \vec{n}$ is a (nonzero) vector perpendicular to the direction vector of $ L$.
Is $ \lVert\mathrm{proj}_{\vec{n}}(\vec{PQ})\rVert$ the distance between $ L$ and $ P$? Explain.
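For cross-checking point-to-line distance computations numerically, one standard formula in $ \mathbb{R}^3$ uses the cross product: if $ Q$ is on $ L$ with direction vector $ d$, then $ \mathrm{dist}(P, L) = \lVert\vec{QP}\times d\rVert/\lVert d\rVert$. A numpy sketch with hypothetical data (not the points from these exercises):

```python
import numpy as np

# Distance from a point P to a line L in R^3, via the cross product:
# dist(P, L) = ||QP x d|| / ||d||, where Q is any point on L and d is
# the direction vector of L. Hypothetical data for illustration.
Q = np.array([1.0, 0.0, 2.0])   # a point on the line
d = np.array([1.0, 1.0, 0.0])   # direction vector of the line
P = np.array([0.0, 2.0, 2.0])   # the point off the line

QP = P - Q
dist = np.linalg.norm(np.cross(QP, d)) / np.linalg.norm(d)
print(dist)  # 3/sqrt(2), approximately 2.1213
```

This is useful for checking an answer obtained by projections, but note it does not by itself settle the question above about projecting onto an arbitrary perpendicular vector $ \vec{n}$.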
Suppose that $ P,Q,R,S$ are distinct points in $ \mathbb{R}^2$ such that $ \vec{PQ}$ and $ \vec{RS}$ are parallel. Is it necessarily true that $ \vec{PR}$ and $ \vec{QS}$ are parallel?
Suppose $ P_1,\dots,P_n$ are $ n$ distinct points in $ \mathbb{R}^2$ such that $ \vec{P_kP_{k+1}}$ is parallel to $ \vec{P_{k+1}P_{k+2}}$ for $ k = 1,\dots,n-2$. Is $ \vec{P_1P_n}$ parallel to $ \vec{P_1P_2}$?
Which of the following expressions are well-defined for all $ u,v,w\in\mathbb{R}^3$? Note that $ u\times v$ denotes the cross product and $ u\cdot v$ denotes the dot product.
Linear Transformations and The Vector Space $ \mathbb{R}^n$
Find the matrix of the linear transformation in $ \mathbb{R}^2$ with respect to the standard basis:
Is it possible to write the composition of a reflection and a rotation just as a rotation?
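For checking answers about rotation matrices numerically: the matrix of counterclockwise rotation by $ \theta$ with respect to the standard basis is $ \begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}$. A small numpy sketch (a sanity check, not a full solution to any of the parts above):

```python
import numpy as np

# Counterclockwise rotation by theta in R^2, standard basis.
t = np.pi / 6
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# Rotating e1 = (1, 0) should land on (cos t, sin t) = (sqrt(3)/2, 1/2).
print(R @ np.array([1.0, 0.0]))
```

The same template, with a reflection matrix multiplied in, lets you test compositions of rotations and reflections by checking where they send the standard basis vectors.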
Which of the following are linear transformations?
Suppose $ T:\mathbb{R}^3\to\mathbb{R}^2$ is a linear transformation and we had the vectors
Could we then determine $ T[x~~~y~~~z]^t$ for any $ x,y,z\in \mathbb{R}$? In other words, could we compute what $ T$ does to any vector?
Suppose $ A$ and $ B$ are $ n\times n$ matrices, that the columns of $ A$ span $ \mathbb{R}^n$ and the rows of $ B$ span $ \mathbb{R}^n$. Is it necessarily true that the columns of $ AB$ span $ \mathbb{R}^n$?
Suppose $ A$ is a $ 2\times 2$ matrix:
whose columns span $ \mathbb{R}^2$. If we put $ A$ in a bigger matrix:
Do the rows of $ B$ span $ \mathbb{R}^3$? Do the columns of $ B$ span $ \mathbb{R}^3$?
Suppose $ A$ and $ B$ are $ 3\times 3$ matrices, and the columns of $ A$ span $ \mathbb{R}^3$, but the columns of $ B$ do \emph{not} span $ \mathbb{R}^3$. Can the columns of $ AB$ span $ \mathbb{R}^3$?
Suppose that $ A$ is a $ 2\times 2$ matrix such that $ AX = B$ has a solution for every $ B\in\mathbb{R}^2$. Which of the following statements is true?
Suppose $ A$ is the matrix in the standard basis of the linear transformation $ T$ that does the following: $ T([1~~~0]^t)$ is the vector obtained by rotating the vector $ [1~~~0]^t$ by an angle of $ \tfrac{\pi}{6}$ and $ T([0~~~1]^t)$ is the vector obtained by reflecting the vector $ [0~~~1]^t$ in the line $ y = -x$. Which of the following statements are true? Explain.
Find the number of $ 3\times 3$ matrices that are invertible and whose only entries are $ 0$ and $ 1$.
Find the number of $ n\times n$ matrices that are invertible and whose only entries are $ 0$ and $ 1$. (Your formula should be a function of $ n$.)
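A brute-force count is feasible for small $ n$ and is a good way to check your reasoning in the $ 3\times 3$ case (a Python sketch; it enumerates all $ 2^{n^2}$ matrices, so only small $ n$ is practical, and it does not give away a formula):

```python
from itertools import product

import numpy as np

# Count n x n matrices with entries in {0, 1} that are invertible over
# the reals, by checking every one of the 2^(n^2) candidates.
def count_invertible_01(n):
    count = 0
    for entries in product((0, 1), repeat=n * n):
        M = np.array(entries, dtype=float).reshape(n, n)
        # The determinant of an integer matrix is an integer, so any
        # nonzero value has absolute value >= 1; 0.5 is a safe threshold.
        if abs(np.linalg.det(M)) > 0.5:
            count += 1
    return count

print(count_invertible_01(2))  # 6
```

For $ 2\times 2$ matrices the six invertible ones can also be listed by hand, which makes a nice check of the code.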
For what values of $ m$ are the following vectors linearly independent?
For what values of $ m$ are the following vectors linearly independent?
Suppose $ A$ is a $ 5\times 5$ matrix with $ \det(A) = 0$. Which of the following statements are true?
Suppose $ A$ is an $ n\times n$ matrix. Show that $ AX = 0$ has a unique solution if and only if $ \mathrm{dim}(\mathrm{im}(A)) = n$.
Suppose $ A$ and $ B$ are $ n\times n$ matrices and $ \mathrm{dim}(\mathrm{null}(AB)) = 0$. Which of the following statements are necessarily true? For the statements that are not necessarily true, provide a counterexample.
Suppose $ A,B$ and $ C$ are $ n\times n$ matrices and $ \mathrm{dim}(\mathrm{im}(ABC)) = n$. Which of the following statements are necessarily true? For the statements that are not necessarily true, provide a counterexample.
If $ v_1,v_2,v_3\in\mathbb{R}^n$ are linearly dependent and $ f:\mathbb{R}^n\to \mathbb{R}^n$ is a linear transformation, show that $ f(v_1),f(v_2),f(v_3)$ are linearly dependent.
If $ v_1,v_2,v_3\in\mathbb{R}^n$ are linearly independent, which of the following statements are true?
Which of the following subsets of $ \mathbb{R}^3$ are subspaces of $ \mathbb{R}^3$?
Which of the following subsets $ S$ of $ \mathbb{R}^n$ are necessarily subspaces of $ \mathbb{R}^n$?
Let $ v\in\mathbb{R}^3$ and suppose that $ f:\mathbb{R}^3\to \mathbb{R}^3$ is a linear transformation such that $ f(w) = 0$ for all $ w\in \mathbb{R}^3$ such that $ w\not= v$. Which of the following statements are true?