Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/22646

| Title: | RECENT APPLICATIONS OF PCA AND SVD |
| Authors: | AHUJA, ADITI |
| Keywords: | DIMENSIONALITY REDUCTION; COVARIANCE MATRIX; DIRECTIONAL VECTOR; MATRIX FACTORIZATION; DATA COMPRESSION |
| Issue Date: | Dec-2025 |
| Series/Report no.: | TD-8588; |
| Abstract: | Principal Component Analysis (PCA) is a powerful dimensionality reduction technique that transforms high-dimensional data into a lower-dimensional space while preserving variance. By computing the covariance matrix and its eigenvectors, PCA finds the principal components that best represent the data. It is widely used in image compression, face recognition, and feature extraction, simplifying complex datasets without losing critical information. Singular Value Decomposition (SVD) is a matrix factorization method that decomposes any matrix into three distinct matrices: A = UΣVᵀ. This decomposition reveals hidden patterns in data and has applications in data compression, noise reduction, and recommendation systems. Unlike PCA, which relies on the eigenvectors of the covariance matrix, SVD works directly on the data matrix, making it more versatile. PCA and SVD are two fundamental techniques in linear algebra that have revolutionized data science, machine learning, and image processing. This thesis explores their mathematical foundations, geometric interpretations, and real-world applications. |
| URI: | http://dspace.dtu.ac.in:8080/jspui/handle/repository/22646 |
| Appears in Collections: | M Sc Applied Maths |
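As a quick illustration of the relationship the abstract describes (a minimal NumPy sketch, not taken from the thesis itself): PCA via the covariance matrix's eigenvectors and SVD applied directly to the centered data matrix recover the same principal directions, with the singular values tied to the eigenvalues by S² / (n − 1).

```python
import numpy as np

# Illustrative sketch on synthetic data: compare the PCA route
# (eigenvectors of the covariance matrix) with the SVD route
# (factorizing the centered data matrix directly).
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))          # 100 samples, 5 features
X = A - A.mean(axis=0)                 # center each feature

# PCA route: eigendecomposition of the covariance matrix.
C = X.T @ X / (X.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(C)   # eigh returns ascending order
eigvals = eigvals[::-1]                # sort descending to match SVD
eigvecs = eigvecs[:, ::-1]

# SVD route: X = U @ diag(S) @ Vt, no covariance matrix needed.
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Singular values relate to covariance eigenvalues by S**2 / (n - 1),
# and the rows of Vt match the eigenvectors up to sign.
print(np.allclose(S**2 / (X.shape[0] - 1), eigvals))        # True
print(np.allclose(np.abs(Vt), np.abs(eigvecs.T)))           # True
```

This is why SVD is often preferred in practice: it avoids forming the covariance matrix explicitly, which can lose precision for ill-conditioned data.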
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| Aditi Ahuja M.Sc..pdf | | 2.1 MB | Adobe PDF | View/Open |
| Aditi Ahuja plag.pdf | | 2.08 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.