This is an essential branch of mathematics for understanding how machine learning algorithms operate on a stream of data to produce insight. Everything from friend suggestions on Facebook, to song recommendations on Spotify, to turning your selfie into a Salvador Dalí-style portrait with neural style transfer involves matrices and matrix algebra. Here are the essential topics to learn (illustrative NumPy sketches for each follow the list):
- Basic properties of matrices and vectors: scalar multiplication, linear transformation, transpose, conjugate, rank, determinant
- Inner and outer products, the matrix multiplication rule and its various algorithms, matrix inverse
- Special matrices: square, identity, and triangular matrices; the idea of sparse and dense matrices; unit vectors; symmetric, Hermitian, skew-Hermitian, and unitary matrices
- The concept of matrix factorization: LU decomposition, Gaussian/Gauss-Jordan elimination, solving the linear system of equations \(\mathrm{Ax}=\mathrm{b}\)
- Vector spaces, basis, span, orthogonality, orthonormality, linear least squares
- Eigenvalues, eigenvectors, diagonalization, singular value decomposition
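To make the first two bullets concrete, here is a minimal NumPy sketch; the matrix `A` and the vectors `u` and `v` are arbitrary illustrative values, not anything prescribed above:

```python
import numpy as np

# Arbitrary example matrix and vectors (hypothetical values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])

print(3 * A)                      # scalar multiplication
print(A.T)                        # transpose
print(A.conj().T)                 # conjugate transpose (equals A.T for a real matrix)
print(np.linalg.matrix_rank(A))   # rank
print(np.linalg.det(A))           # determinant

print(np.inner(u, v))             # inner product: a scalar
print(np.outer(u, v))             # outer product: a 2x2 matrix
print(A @ A)                      # matrix multiplication
print(np.linalg.inv(A))           # inverse (A is nonsingular here)
```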
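A quick look at a few of the special matrices, again with made-up values; the sparse-versus-dense comparison assumes SciPy is available:

```python
import numpy as np
from scipy import sparse

I = np.eye(3)                                     # identity matrix
L = np.tril(np.arange(1.0, 10.0).reshape(3, 3))   # lower-triangular matrix
S = np.array([[1.0, 2.0],
              [2.0, 5.0]])
print(np.allclose(S, S.T))                        # symmetric: S equals its transpose

# A sparse format stores only the nonzero entries; compare memory use
# for a 1000x1000 matrix with a single nonzero element.
D = np.zeros((1000, 1000))
D[0, 0] = 1.0
Sp = sparse.csr_matrix(D)
print(D.nbytes, Sp.data.nbytes)
```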
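For the factorization bullet, a sketch of LU decomposition and of solving \(\mathrm{Ax}=\mathrm{b}\); the 2×2 system is a made-up example:

```python
import numpy as np
from scipy.linalg import lu, solve

# Hypothetical linear system Ax = b.
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

# LU decomposition with partial pivoting: A = P @ L @ U.
P, L, U = lu(A)
print(np.allclose(P @ L @ U, A))

# In practice you solve Ax = b directly (the solver is itself LU-based)
# rather than forming the inverse, which is slower and less stable.
x = solve(A, b)
print(np.allclose(A @ x, b))
```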
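For orthonormality and linear least squares, a sketch that fits a line to synthetic noisy data and checks that QR factorization yields an orthonormal basis; the true slope, intercept, and noise level are arbitrary choices:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(x.size)

# Design matrix with a column of ones for the intercept.
X = np.column_stack([x, np.ones_like(x)])
(m, c), *_ = np.linalg.lstsq(X, y, rcond=None)
print(m, c)                       # close to the true slope 2 and intercept 1

# QR factorization gives an orthonormal basis Q for the column space of X.
Q, R = np.linalg.qr(X)
print(np.allclose(Q.T @ Q, np.eye(2)))   # columns of Q are orthonormal
```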
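Finally, a sketch of eigendecomposition, diagonalization, and the singular value decomposition on small made-up matrices:

```python
import numpy as np

# Eigendecomposition of a symmetric matrix (real eigenvalues guaranteed).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eigh(A)
print(w)                                      # eigenvalues
print(np.allclose(V @ np.diag(w) @ V.T, A))   # diagonalization: A = V diag(w) V^T

# The SVD exists for any matrix, square or not.
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])
U, s, Vt = np.linalg.svd(B, full_matrices=False)
print(np.allclose(U @ np.diag(s) @ Vt, B))    # B = U diag(s) V^T
```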
Where You Might Use It
If you have used the dimensionality reduction technique principal component analysis (PCA), then you have likely used the singular value decomposition to obtain a compact, lower-dimensional representation of your data set with fewer parameters. All neural network algorithms use linear algebra techniques to represent and process network structures and learning operations.
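To illustrate that PCA connection, here is a minimal sketch of PCA implemented directly with the SVD on a synthetic data matrix; this is not the API of any particular library, and the data, random seed, and choice of two components are arbitrary:

```python
import numpy as np

# Synthetic data set: 100 samples, 5 correlated features.
rng = np.random.default_rng(42)
X = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 5))

Xc = X - X.mean(axis=0)                   # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                     # keep the top-2 principal components
scores = Xc @ Vt[:k].T                    # compact 100x2 representation
explained = s[:k] ** 2 / np.sum(s ** 2)   # fraction of variance each component explains
print(scores.shape, explained)
```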