Linear Algebra II
An introduction to intermediate-level topics in Linear Algebra through a series of probing questions and answers. Specifically, the following topics are covered:
- Eigenvalues & Eigenvectors
- Norms, Inner Products and Projections
- Spectral & Singular Value Decomposition Theorems
- Applications of SVD and Best Fit Subspaces
All materials, including videos, notes, and assignments, are provided here. The textbooks used are also openly available. The course was designed for computer science and electronics undergraduate engineering students at IIIT Hyderabad.
reviews! not so bad i guess. amazed by students who took time to solve hard problems even during these times.
— Girish Varma (@girishvarma) May 3, 2021
Prerequisites
Assumes that Linear Algebra I has been covered. Specifically, knowledge of the following topics is assumed:
- Solutions to Linear Equations, Echelon Forms, Gaussian Elimination
- Field and Vector Space Definitions; Real, Complex, and Finite-Field Vector Spaces
- Linear Transformation, Matrix, Rank, Inverse, Transpose
- Change of Basis and effect on Matrix of the Linear Transformation
- Determinants
Schedule
Meetings
Live Lectures
From 30th March to 4th May, 2021, Tuesdays, Thursdays and Saturdays
- 10-11 AM for Batch 2
- 12-01 PM for Batch 1
Tutorials
Weekly tutorials conducted by TAs in smaller groups, at the scheduled times.
Office Hours
I will be available on Teams during the timings given below for clearing any doubts. You can send me a direct message to join.
- Monday, 5:30-7:30 PM
- Thursday, 3:30-5:30 PM
Expected Workload
Students are expected to spend at least 12 hrs per week. Roughly:
- 3+1 hrs attending lectures and tutorials
- 4 hrs reading textbooks, references, etc.
- 4 hrs solving assignments, quizzes, etc.
Evaluations
- 4 Light Quizzes (Weekly)
- 3 Assignments
- 2 Deep Quizzes (17th April, 4th May)
Your intelligence cannot be measured by just a number. It is defined by your willingness to learn, solve problems and try new things.
— Prof. Feynman (@ProfFeynman) April 14, 2021
You are more than just a number. Develop your skills wherever they may lead. Share your ideas. Your skills are more valuable than your grades.
Textbook and References
Textbook
The resources provided are licensed under Creative Commons/open licenses and hence downloadable.
- [CDTW] Linear Algebra, by David Cherney, Tom Denton, Rohit Thomas, and Andrew Waldron
- [WKN] Linear Algebra with Applications, by W. Keith Nicholson
- [MR] Interactive Linear Algebra, by Dan Margalit and Joseph Rabinoff
- [DA] Understanding Linear Algebra, by David Austin
Extra Reading
- Down with Determinants, by Sheldon Axler
- Linear Algebra Done Right Videos, by Sheldon Axler
- Markov Chains and Google’s PageRank Algorithm, from Understanding Linear Algebra by David Austin
- The Fast Fourier Transform and Polynomial Multiplication, by Mordecai Golin
- [GS] Linear Algebra MIT OCW Course Materials, by Gilbert Strang
Interactive Videos
On Solving Problems & Understanding Proofs
- Richard Feynman, His Life, and the Art of Solving Important Problems (Blog Post)
- How to Solve It, by G. Polya (Shorter Version)
Lecture Topics
1. Eigenvalues & Diagonalization
1.1 Linear Algebra & Random Walks
Recalling Basics | Function Spaces | Random Walk on Graphs
- Notes
- Reading
- Section 2.9 in [WKN]
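To make the random-walk picture concrete, here is a minimal sketch (my own example, not from the notes) of a walk on a 4-cycle, simulated by repeatedly multiplying the distribution vector by the transition matrix:

```python
import numpy as np

# Random walk on a 4-cycle: P[j, i] is the probability of stepping i -> j.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)   # adjacency matrix of the 4-cycle
P = A / A.sum(axis=0)                       # normalize columns: uniform step to a neighbor

p = np.array([1.0, 0.0, 0.0, 0.0])          # start at vertex 0
for _ in range(10):
    p = P @ p                               # one step of the walk
print(p)  # [0.5 0.  0.5 0. ] -- the 4-cycle is bipartite, so the walk oscillates
```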
1.2 Eigenvectors and Eigenvalues
Definitions | Characteristic Polynomial | Examples
- Notes
- Reading
- Section 3.3 for Eigenvalues, Section 2.9 for Random Walks on Graphs in [WKN].
- Chapter 12 in [CDTW].
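As a quick illustration (an assumed example, not taken from [WKN] or [CDTW]), the eigenvalues of a 2x2 matrix can be computed as roots of its characteristic polynomial and cross-checked with numpy:

```python
import numpy as np

# Characteristic polynomial of a 2x2 matrix: det(M - xI) = x^2 - tr(M) x + det(M).
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
coeffs = [1.0, -np.trace(M), np.linalg.det(M)]
print(np.roots(coeffs))        # [3. 1.]

vals, vecs = np.linalg.eig(M)  # same eigenvalues, plus eigenvectors as columns
print(vals)                    # [3. 1.]
print(vecs[:, 0])              # an eigenvector for eigenvalue 3
```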
1.3 Diagonalization
Eigenvector Basis and Powering | Multiplicities
- Notes
- Reading
- Section 3.3 in [WKN]
- Chapter 13 in [CDTW]
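The payoff of diagonalization for powering can be seen in a few lines (a sketch with an assumed example matrix): if $M = PDP^{-1}$ with $D$ diagonal, then $M^k = PD^kP^{-1}$.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, P = np.linalg.eig(M)    # columns of P are eigenvectors: M P = P diag(vals)
k = 5
Mk = P @ np.diag(vals ** k) @ np.linalg.inv(P)        # M^k via the eigenbasis
print(np.allclose(Mk, np.linalg.matrix_power(M, k)))  # True
```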
Assignment 1
Submit by 13th April
- Notes
- Practice Problems:
- Show that for any matrix $M \in \mathbb R^{n \times n}$ with eigenvalue $\lambda \in \mathbb R$, $$ \text{geometric\_multiplicity}(\lambda, M) = \text{geometric\_multiplicity}(\lambda, M^T).$$
- Let $M$ be block diagonal with blocks $M_1,\ldots,M_k$ (all square matrices). Show that: $$ \text{geometric\_multiplicity}(\lambda, M) = \sum_{i=1}^k \text{geometric\_multiplicity}(\lambda, M_i).$$
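A quick numerical sanity check of the first practice problem (my own sketch): the geometric multiplicity is the dimension of the null space of $M - \lambda I$, i.e. $n - \text{rank}(M - \lambda I)$, and a matrix and its transpose have equal rank.

```python
import numpy as np

def geometric_multiplicity(lam, M):
    # dim null(M - lam I) = n - rank(M - lam I)
    n = M.shape[0]
    return n - np.linalg.matrix_rank(M - lam * np.eye(n))

M = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])   # eigenvalue 2 has geometric multiplicity 1
print(geometric_multiplicity(2.0, M), geometric_multiplicity(2.0, M.T))  # 1 1
```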
2. Norms & Inner Products
2.1 Norms & Inner Products
Jordan Form | Norms | Distance | Inner Product | Complex Case
- Notes
- Reading
- Section 10.1 in [WKN]
- Explore
- For Jordan Form, Generalized Eigenvectors, see Down with Determinants.
- Geometric Multiplicity $\leq$ Algebraic Multiplicity.
- Rotation Matrices have complex eigenvalues.
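A two-line check of the last point (my own example): a rotation by 90 degrees fixes no real direction, so its eigenvalues are the complex pair $\pm i$.

```python
import numpy as np

t = np.pi / 2
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])   # rotation by 90 degrees
print(np.linalg.eigvals(R))               # [0.+1.j 0.-1.j]
```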
2.2 Orthonormal Vectors
Orthogonal & Orthonormal Vectors | Gram-Schmidt Orthogonalization
- Notes
- Reading
- Section 10.2 in [WKN]
- Chapter 14 in [CDTW]
- Explore
- For bilinear forms, see the slides of Prof. P. Karageorgis.
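A minimal sketch of classical Gram-Schmidt (my own code; it assumes the inputs are linearly independent, and in floating point a QR factorization is the more stable choice):

```python
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        # subtract the components of v along the already-built orthonormal basis
        w = v - sum(np.dot(v, u) * u for u in basis)
        basis.append(w / np.linalg.norm(w))   # normalize what remains
    return basis

u1, u2 = gram_schmidt([np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])])
print(np.dot(u1, u2))                          # ~0: orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))  # 1.0 1.0: unit length
```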
2.3 Projection and Orthogonal Complement
Subspace Projections | Orthogonal Complements | Fitting with Errors
- Notes
- Reading
- Section 14.6 in [CDTW]
- Section 8.1 in [WKN]
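For concreteness, a sketch (assumed example) of projecting onto the column space of a matrix $A$ using the formula $P = A(A^TA)^{-1}A^T$; the residual $v - Pv$ lands in the orthogonal complement:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])                 # columns span a 2D subspace of R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T       # orthogonal projection onto col(A)
v = np.array([1.0, 2.0, 3.0])
print(np.allclose(A.T @ (v - P @ v), 0))   # residual orthogonal to the subspace: True
```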
2.4 Least Squares Fitting
Best fit vector on a subspace | Least Squares Fitting Equation
- Notes
- Reading
- [MR] Section 6.5
- Chapter 7 (up to page 307) in [CDTW]
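A small worked example (with made-up data, not from the text): fitting a line $y = ax + b$ by least squares, i.e. solving the normal equations $A^TA x = A^Ty$ (done here via numpy's lstsq, which is numerically safer):

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.1, 1.9, 3.2, 3.8])          # noisy samples of y = x + 1
A = np.column_stack([xs, np.ones_like(xs)])  # design matrix with columns [x, 1]
(a, b), *_ = np.linalg.lstsq(A, ys, rcond=None)
print(a, b)                                  # both close to 1
```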
Deep Quiz I and Discussion
3. Advanced Topics
Assignment 2
Submit by 26th April
3.1 Symmetric Matrices and Properties
Eigenvalues and eigenvectors of Symmetric Matrices | Spectral Theorem
- Notes
- Reading
- Chapter 15 in [CDTW]
- Solve
- Questions 3 and 5 in Review Problems 15.1 in [CDTW]
- If $P$ is the change of basis matrix from the standard basis to another orthonormal basis, then the columns of $P$ are orthonormal.
- If the columns of $P$ are orthonormal, then the rows of $P$ are also orthonormal.
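A numerical check of these facts (my own example): for a real symmetric matrix, numpy's eigh returns real eigenvalues and an eigenvector matrix with orthonormal columns.

```python
import numpy as np

S = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])         # real symmetric
vals, P = np.linalg.eigh(S)
print(vals)                             # real eigenvalues
print(np.allclose(P.T @ P, np.eye(3)))  # P^T P = I: columns (hence rows) orthonormal
```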
3.2 Spectral Decomposition Theorem
Spectral Theorem | Spectral Decomposition
- Notes
- Reading
- Section 10.3 in [WKN]
- Chapter 15 in [CDTW]
- Solve
- Let $v_1,\ldots, v_n$ be a basis for an $n$ dimensional vector space $V$ over some field $\mathbb F$ and let $M, M' \in \mathbb F^{n\times n}$ be matrices. Show that if $$\forall i \in \{1,\ldots, n\},\qquad Mv_i = M'v_i$$ then $M=M'$.
- Consider the $n$ dimensional complex vector space $\mathbb C^n$. Is $\mathbb R^n$ a subspace of $\mathbb C^n$?
- We know that $\mathbb R^n$ is a subset of $\mathbb C^n$. Suppose $v_1,\ldots, v_n \in \mathbb R^n$ are orthonormal vectors. Can they form a basis for $\mathbb C^n$?
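The decomposition itself is easy to verify numerically (a sketch with an assumed matrix): a real symmetric $S$ equals the sum of rank-one pieces $\lambda_i v_iv_i^T$ over an orthonormal eigenbasis.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, P = np.linalg.eigh(S)   # orthonormal eigenvectors in the columns of P
recon = sum(vals[i] * np.outer(P[:, i], P[:, i]) for i in range(2))
print(np.allclose(recon, S))  # True: S = sum_i lambda_i v_i v_i^T
```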
3.3 Singular Value Decomposition
Spectral Theorem for Complex Spaces | Singular Value Decomposition
- Notes
- Reading
- Section 17.2 in [CDTW]
- Section 8.6 in [WKN]
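As a quick illustration (my own example matrix), numpy's SVD factors any rectangular $M$ as $U\,\mathrm{diag}(s)\,V^T$, with orthonormal columns in $U$ and orthonormal rows in $V^T$:

```python
import numpy as np

M = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])           # a 2x3 matrix
U, s, Vt = np.linalg.svd(M, full_matrices=False)
print(s)                                   # singular values, in decreasing order
print(np.allclose(U @ np.diag(s) @ Vt, M)) # True
```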
4. Applications
4.1 SVD & Applications
Principal Component Analysis | Applications in Data Science
- Notes
- Read
- Section 8.11 in [WKN]
- Code
- Explore
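A minimal PCA-via-SVD sketch (my own, with synthetic data): center the data matrix, take its SVD, and read the top principal direction off the first right singular vector.

```python
import numpy as np

rng = np.random.default_rng(0)
# ~100 points spread along the direction (2, 1), plus a little noise
X = rng.normal(size=(100, 1)) @ np.array([[2.0, 1.0]]) + 0.1 * rng.normal(size=(100, 2))
Xc = X - X.mean(axis=0)      # center each coordinate
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
print(Vt[0])                 # top principal direction, ~ +/-(2, 1)/sqrt(5)
Z = Xc @ Vt[0]               # 1D coordinates of the points along that direction
print(Z.shape)               # (100,)
```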
Assignment 3
See Section 6 in Problems. Submit by 7th May.
4.2 SVD & Best Fit Subspaces
- Notes
- Read
- Problems:
- Suppose $W$ is a $k$ dimensional subspace of $\mathbb R^n$ and $v_1,\ldots, v_{k-1} \in \mathbb R^n$ are orthonormal vectors (which may or may not be in $W$). Show that there exists a nonzero vector $w\in W$ such that $w \in \text{span}(v_1,\ldots, v_{k-1})^\perp$. That is, $$ \exists w \in W \cap \text{span}(v_1,\ldots, v_{k-1})^\perp \text{ such that } w \neq 0. $$
- Suppose $M \in \mathbb R^{n \times n}$ is invertible with singular value decomposition: $$ M = \sum_{i=1}^n s_i u_iv_i^T \qquad \text{ where } s_i \in \mathbb R^{+}, u_i,v_i \in \mathbb{R}^{n \times 1}.$$ Let $M' = \sum_{i=1}^n s^{-1}_i v_iu_i^T$. Show that $M' = M^{-1}$.
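The second problem can be sanity-checked numerically (my sketch; a random Gaussian matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))
U, s, Vt = np.linalg.svd(M)   # M = sum_i s[i] * outer(U[:, i], Vt[i])
M_prime = sum((1.0 / s[i]) * np.outer(Vt[i], U[:, i]) for i in range(4))
print(np.allclose(M_prime @ M, np.eye(4)))   # True: M' is the inverse of M
```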
5. Course Summary, More Applications & Closing Notes.
- Notes
- Reading
- Applications
- Algorithms: Discrete Fourier Transform and Fast Multiplication (see the sketch after this list).
- Spectral Graph Drawing: Spielman, Book Chapter
- Robotics, VR, AR uses 3D, 6D data.
- Pooled Testing for COVID: currently being used in IIIT to test 400 campus residents with only 40 tests.
- Compressed Sensing.
- Machine Learning and Data Science. See Foundations of Data Science by Blum, Hopcroft, and Kannan
- Communications Systems: Error Correcting Codes (See Section 8.8 in [WKN]).
- Quantum Computing
- Videos by M. Nielsen.
- QED by Feynman.
- Lecture Notes by Vazirani.
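As promised above, a sketch of fast polynomial multiplication via the DFT (my own minimal example): evaluate both polynomials at roots of unity with the FFT, multiply pointwise, and interpolate back with the inverse FFT.

```python
import numpy as np

p = [1, 2, 3]                 # 1 + 2x + 3x^2
q = [4, 5]                    # 4 + 5x
n = len(p) + len(q) - 1       # number of coefficients in the product
fp = np.fft.fft(p, n)         # evaluations at the n-th roots of unity
fq = np.fft.fft(q, n)
prod = np.fft.ifft(fp * fq).real.round().astype(int)
print(prod)                   # [ 4 13 22 15] -> 4 + 13x + 22x^2 + 15x^3
```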
Deep Quiz II
On 4th May