Welcome survey - please fill this out if you plan to attend my discussions and/or would like to receive occasional recap emails!
Discussion 12: Principal Components Analysis (PCA)
Rayleigh quotients and their connection to the spectral norm and related optimization problems. Derivations of PCA through various methods: Gaussian MLE, maximizing variance, and minimizing projection error. Relationship between the SVD and PCA.
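To make the SVD-PCA relationship concrete, here's a small NumPy sketch (my own illustration, not part of the discussion worksheet): it recovers the principal directions both from the covariance eigendecomposition and from the SVD of the centered data, and also checks the Rayleigh-quotient characterization of the top eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))    # toy data: 200 points in 5 dimensions
Xc = X - X.mean(axis=0)          # PCA operates on centered data
n = Xc.shape[0]

# Route 1: eigendecomposition of the sample covariance matrix.
cov = Xc.T @ Xc / n
eigvals, eigvecs = np.linalg.eigh(cov)              # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # flip to descending

# Route 2: SVD of the centered data matrix, Xc = U S V^T.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Right singular vectors = principal directions, and sigma_i^2 / n
# equals the i-th covariance eigenvalue.
assert np.allclose(S**2 / n, eigvals)
for i in range(5):
    # eigenvectors are only determined up to sign
    assert np.allclose(np.abs(Vt[i]), np.abs(eigvecs[:, i]))

# Rayleigh quotient check: the top principal direction maximizes
# v^T cov v / v^T v, and the maximum value is the top eigenvalue.
v = eigvecs[:, 0]
assert np.isclose(v @ cov @ v / (v @ v), eigvals[0])
```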
Kernel methods and their twofold motivation: enabling efficient high-dimensional featurization, and allowing custom notions of similarity between data points. Conditions for a kernel function to be valid.
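For the validity conditions, one finite-sample consequence of Mercer's theorem is easy to check numerically: a valid kernel's Gram matrix must be symmetric positive semidefinite on any data set. Here's a minimal sketch using the standard RBF kernel (just one illustrative choice of kernel, not necessarily the one from discussion):

```python
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    # Gaussian/RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2).
    # Its feature map is infinite-dimensional, so the featurization
    # is only ever computed implicitly through k.
    return np.exp(-gamma * np.sum((x - z) ** 2))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))

# Gram matrix K with K[i, j] = k(x_i, x_j).
K = np.array([[rbf_kernel(xi, xj) for xj in X] for xi in X])

# A valid kernel yields a symmetric PSD Gram matrix on any data set.
assert np.allclose(K, K.T)
assert np.linalg.eigvalsh(K).min() >= -1e-10  # nonnegative up to roundoff
```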
Least-squares linear regression and motivation for the min-norm solution when there are infinitely many solutions. The SVD, the Moore-Penrose pseudoinverse, and its application to the min-norm least-squares problem.
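Here's a short NumPy sketch (my own, under the usual setup of a wide, full-row-rank matrix) showing how the pseudoinverse is built from the SVD and why x = A^+ b is the minimum-norm solution:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 6))   # wide system: infinitely many solutions
b = rng.normal(size=3)

# Moore-Penrose pseudoinverse via the SVD: if A = U S V^T, then
# A^+ = V S^+ U^T, where S^+ inverts the nonzero singular values.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / S) @ U.T
assert np.allclose(A_pinv, np.linalg.pinv(A))

# x_min = A^+ b solves the least-squares problem (exactly here,
# since A has full row rank).
x_min = A_pinv @ b
assert np.allclose(A @ x_min, b)

# Any other solution differs by a null-space vector and is longer:
# x_min lies in the row space, so it is orthogonal to the null space.
_, _, Vt_full = np.linalg.svd(A)   # full SVD to expose the null space
v = Vt_full[-1]                    # a unit null-space direction
assert np.allclose(A @ v, 0)
x_other = x_min + v
assert np.allclose(A @ x_other, b)
assert np.linalg.norm(x_other) > np.linalg.norm(x_min)
```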
Discussion 5: Anisotropic Gaussians, Transformations, Quadratic Forms
Overview of anisotropic Gaussians, including properties of the covariance matrix and the elliptical isocontours of the quadratic form. Change of basis as a way to understand various data transformations (sphering, whitening, etc.).
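As a quick illustration of the change-of-basis view (my own sketch, using the symmetric Sigma^{-1/2} convention for whitening, sometimes called ZCA): whitening maps the elliptical isocontours to circles, so the transformed data has identity covariance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample from an anisotropic Gaussian with a chosen covariance.
Sigma = np.array([[4.0, 1.5],
                  [1.5, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=10_000)

# Eigendecomposition emp_cov = V diag(lam) V^T: the isocontours of the
# quadratic form are ellipses with axes along the columns of V and
# semi-axis lengths proportional to sqrt(lam).
emp_cov = np.cov(X, rowvar=False)
lam, V = np.linalg.eigh(emp_cov)

# Whitening: change of basis by Sigma^{-1/2} = V diag(lam^{-1/2}) V^T.
W = V @ np.diag(lam ** -0.5) @ V.T
Z = X @ W.T

# The whitened data has identity covariance (ellipse -> circle).
assert np.allclose(np.cov(Z, rowvar=False), np.eye(2))
```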