Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/15328
Title: TRACKING THE OBJECTS IN MOTION
Authors: SANGWAN, VARUN
Keywords: VISUAL SURVEILLANCE
OBJECTS IN MOTION
TRACKING
DETER
FAST FEATURES
Issue Date: Nov-2016
Series/Report no.: TD NO.1765;
Abstract: Visual surveillance, especially of humans, has recently been one of the most active research topics in machine vision because of its applications in the deterrence of and response to crime, suspicious activities and terrorism, and in human behaviour recognition. In this dissertation, we propose a model that combines features and appearance to make the system robust for multiple-object tracking. The system works on the principle of background subtraction, for which the background is learned using a Gaussian Mixture Model. A Kalman filter is used for prediction and tracking, so the system can predict the position of an object during full occlusion. To deal with partial occlusion, the system performs feature matching: color, edge, texture and FAST features are extracted and compared with the previous frame. The Bhattacharyya distance is evaluated to match the color, edge and texture features, where the texture feature is extracted using a Gabor filter with multiple scales and orientations. Key points in the objects are extracted with the FAST detector and matched to make the system more robust. Experimental results on several real video sequences captured under different conditions show the effectiveness of our approach.
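Note (illustrative, not part of the thesis): the abstract describes a pipeline of Gaussian-Mixture-Model background subtraction, Kalman-filter prediction for full occlusion, and Bhattacharyya-distance appearance matching for partial occlusion. The sketch below is a minimal Python/OpenCV approximation of such a pipeline for a single tracked object, assuming OpenCV 4.x and a hypothetical input file "input_video.avi"; the MOG2 subtractor stands in for the GMM background model, and the edge, Gabor-texture and FAST key-point matching steps are omitted for brevity. Names and thresholds are assumptions, not the author's implementation.

# Illustrative sketch only: GMM background subtraction, Kalman prediction,
# and Bhattacharyya color-histogram matching with OpenCV.
import cv2
import numpy as np

# GMM-based background model (MOG2 is OpenCV's Gaussian-mixture subtractor).
bg_model = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

# Constant-velocity Kalman filter: state = [x, y, vx, vy], measurement = [x, y].
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

def color_hist(frame, mask):
    """Hue histogram of the masked foreground, for Bhattacharyya comparison."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], mask, [32], [0, 180])
    return cv2.normalize(hist, hist).flatten()

cap = cv2.VideoCapture("input_video.avi")   # hypothetical input file
prev_hist = None
while True:
    ok, frame = cap.read()
    if not ok:
        break

    fg = bg_model.apply(frame)                       # foreground mask from the GMM
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    prediction = kf.predict()                        # predicted position (used under full occlusion)

    if contours:
        c = max(contours, key=cv2.contourArea)       # track the largest blob for simplicity
        x, y, w, h = cv2.boundingRect(c)
        cx, cy = x + w / 2.0, y + h / 2.0
        kf.correct(np.array([[np.float32(cx)], [np.float32(cy)]]))

        hist = color_hist(frame, fg)
        if prev_hist is not None:
            # Bhattacharyya distance between current and previous appearance;
            # a large value would flag partial occlusion / appearance change.
            d = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA)
            occluded = d > 0.4                       # illustrative threshold
        prev_hist = hist
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    else:
        # No detection: fall back on the Kalman prediction alone.
        px, py = int(prediction[0, 0]), int(prediction[1, 0])
        cv2.circle(frame, (px, py), 4, (0, 0, 255), -1)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:                 # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()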
URI: http://dspace.dtu.ac.in:8080/jspui/handle/repository/15328
Appears in Collections:M.E./M.Tech. Electronics & Communication Engineering

Files in This Item:
File                                   Size       Format
VARUN COVER PAGE.pdf                   24.18 kB   Adobe PDF
DECLARATION BY THE CANDIDATE (1).pdf   463.04 kB  Adobe PDF
CHAPTER 1.pdf                          2.17 MB    Adobe PDF

