Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/13997
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | CHITTORA, ASHISH | - |
dc.date.accessioned | 2012-06-28T09:54:33Z | - |
dc.date.available | 2012-06-28T09:54:33Z | - |
dc.date.issued | 2011 | - |
dc.identifier.uri | http://dspace.dtu.ac.in:8080/jspui/handle/repository/13997 | - |
dc.description.abstract | The project entitled “UNCONSTRAINED FACE RECOGNITION” aims to develop a face recognition system. The novelty of the proposed method is twofold: first, a fuzzy-based Gaussian-like edge detector is developed to extract features; second, a support vector machine is used for multi-class classification. The main idea behind the face recognition approach is to extract the important features (the edges) using a fuzzy edge detection method. The dimensionality of the extracted feature space must then be reduced to identify the most significant features, for which Principal Component Analysis is used. After this dimension-reduction step, classification is performed with a Support Vector Machine (SVM)-based classifier, which is inherently binary; a “one-vs-all” scheme is applied to make the SVM work as a multi-class classifier. The method is fast and can be used in real time. It can be applied to robotic systems, access control systems, security systems, human-machine interaction systems, and so on. | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartofseries | TD 850;62 | - |
dc.subject | UNCONSTRAINED FACE RECOGNITION | en_US |
dc.subject | FUZZY EDGE DETECTION | en_US |
dc.title | UNCONSTRAINED FACE RECOGNITION SYSTEM | en_US |
dc.type | Thesis | en_US |
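The pipeline described in the abstract (fuzzy Gaussian-like edge features, PCA dimension reduction, then one-vs-all linear classification) can be sketched as follows. This is a minimal illustrative sketch, not the thesis's actual implementation: the fuzzy membership function is a generic Gaussian-like mapping of gradient magnitude, and the one-vs-all stage uses regularised least squares as a simple stand-in for the SVM classifiers described in the abstract. All function names and parameters are invented for illustration.

```python
import numpy as np

def fuzzy_edge_map(img, sigma=0.5):
    # Gaussian-like fuzzy edge membership: pixels with strong local
    # gradients get membership near 1 ("edge"), flat regions near 0.
    # (Illustrative stand-in for the thesis's fuzzy edge detector.)
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return 1.0 - np.exp(-(mag ** 2) / (2.0 * sigma ** 2))

def pca_fit(X, n_components):
    # Principal Component Analysis via SVD on mean-centred data.
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_transform(X, mean, components):
    # Project feature vectors onto the leading principal components.
    return (X - mean) @ components.T

def train_one_vs_all(X, y, n_classes, reg=1e-3):
    # One-vs-all: one binary linear classifier per class, each trained
    # with targets +1 (this class) / -1 (all others). Fit here by
    # regularised least squares; the thesis uses SVMs for this step.
    Xb = np.hstack([X, np.ones((len(X), 1))])          # append bias
    T = np.where(y[:, None] == np.arange(n_classes), 1.0, -1.0)
    W = np.linalg.solve(Xb.T @ Xb + reg * np.eye(Xb.shape[1]), Xb.T @ T)
    return W                                           # (n_feat+1, n_classes)

def predict(X, W):
    # Class with the largest classifier score wins.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.argmax(Xb @ W, axis=1)
```

A toy usage: build edge-membership feature vectors from small synthetic images, reduce them with PCA, then train and apply the one-vs-all classifiers; a real system would use face images and SVMs in place of the least-squares stand-in.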
Appears in Collections: | M.E./M.Tech. Electronics & Communication Engineering |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
thesis3 final thesis report1.doc | | 1.63 MB | Microsoft Word | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.