Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/22646
Title: RECENT APPLICATIONS OF PCA AND SVD
Authors: AHUJA, ADITI
Keywords: DIMENSIONALITY REDUCTION
COVARIANCE MATRIX
DIRECTIONAL VECTOR
MATRIX FACTORIZATION
DATA COMPRESSION
Issue Date: Dec-2025
Series/Report no.: TD-8588;
Abstract: Principal Component Analysis (PCA) is a powerful dimensionality reduction technique that transforms high-dimensional data into a lower-dimensional space while preserving variance. By computing the covariance matrix and its eigenvectors, PCA finds the principal components that best represent the data. It is widely used in image compression, face recognition, and feature extraction, simplifying complex datasets without losing critical information. Singular Value Decomposition (SVD) is a matrix factorization method that decomposes any matrix into three matrices: A = UΣVᵀ. This decomposition reveals hidden patterns in data and has applications in data compression, noise reduction, and recommendation systems. Unlike PCA, which relies on the eigenvectors of the covariance matrix, SVD works directly on the data matrix, making it more versatile. PCA and SVD are two fundamental techniques in linear algebra that have revolutionized data science, machine learning, and image processing. This thesis explores their mathematical foundations, geometric interpretations, and real-world applications.
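The abstract describes PCA as an eigendecomposition of the covariance matrix and SVD as the factorization A = UΣVᵀ applied directly to the data matrix. The following is a minimal illustrative sketch of both computations in NumPy; it is not part of the thesis, and all variable names are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))            # toy data matrix: 100 samples, 5 features

# --- PCA: eigendecomposition of the covariance matrix ---
X = A - A.mean(axis=0)                   # center the data
cov = np.cov(X, rowvar=False)            # 5 x 5 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvectors are the principal directions
order = np.argsort(eigvals)[::-1]        # sort by explained variance, descending
components = eigvecs[:, order[:2]]       # keep the top 2 principal components
X_reduced = X @ components               # project onto the lower-dimensional space

# --- SVD: factorize the (centered) data matrix directly ---
U, S, Vt = np.linalg.svd(X, full_matrices=False)   # X = U @ np.diag(S) @ Vt

The rows of Vt match the PCA components up to sign, and the singular values relate to the covariance eigenvalues by eigval_i = S_i**2 / (n_samples - 1), which is why SVD can replace the explicit covariance computation.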
URI: http://dspace.dtu.ac.in:8080/jspui/handle/repository/22646
Appears in Collections:M Sc Applied Maths

Files in This Item:
File                     Size      Format
Aditi Ahuja M.Sc..pdf    2.1 MB    Adobe PDF
Aditi Ahuja plag.pdf     2.08 MB   Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.