Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/19157
Title: OBJECT DETECTION VIA DEEP LEARNING AND NEURAL NETWORK
Authors: BAKAWLE, ARCHI
Keywords: OBJECT DETECTION
DEEP LEARNING
NEURAL NETWORK
Issue Date: May-2022
Series/Report no.: TD-5745
Abstract: Deep learning (DL) approaches to Object Detection (OD) have attracted considerable attention from researchers because of their strength in overcoming the drawbacks of traditional approaches that rely on handcrafted features. DL algorithms have made major advances in object detection over the past few years, and this paper discusses the most recent and effective DL frameworks for the task. Visual recognition systems, which encompass image classification, localization, and detection, are at the heart of these applications and have attracted considerable research attention. These visual recognition algorithms have achieved extraordinary performance thanks to substantial advances in neural networks, particularly deep learning, and OD is one of the areas in which computer vision has been most successful. This research proposes a DL method for OD based on YoloWingNet. In computer vision, classifying and localizing the various objects in an image is a crucial capability, and robust, efficient object detection is essential for interacting with one's surroundings. Humans use a mechanism known as visual attention to quickly determine which regions of an image require detailed processing and which can be ignored; for a machine, however, identifying an object and its precise location in an image remains a challenging problem. This paper studies the features and methods of object detection and the deep-learning algorithms related to it.
URI: http://dspace.dtu.ac.in:8080/jspui/handle/repository/19157
Appears in Collections: M.E./M.Tech. Computer Engineering

Files in This Item:
File: ARCHI BAKAWLE M.Tech.pdf
Size: 1.6 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.