Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/20180
Title: | DETECTION OF ATTRIBUTE MANIPULATION IN DEEPFAKE IMAGES USING A HYBRID LEARNING TECHNIQUE TO CLASSIFY INDIGENOUS AND FORGED IMAGES |
Authors: | GAHLOT, KANWARDEEP SINGH |
Keywords: | DEEPFAKE; ARTIFICIAL INTELLIGENCE; HYBRID LEARNING; GENERATIVE ADVERSARIAL NETWORKS |
Issue Date: | May-2023 |
Series/Report no.: | TD-6714; |
Abstract: | Deepfake technology employs artificial intelligence (AI) algorithms to create manipulated photographs and videos that are indistinguishable from authentic ones. Generative adversarial networks (GANs), a type of deep-learning algorithm, are driving the development of deepfakes, which have the potential to compromise individual privacy, as they can be used to create pornographic content by superimposing images. As a result, digital media, including news broadcasts, online video clips, and live streams, are experiencing trust issues. Recently, synthetic image creation and manipulation methods have improved, enabling the creation of realistic fake face images. Despite the emergence of certain deep learning-based image forensic techniques, it is still challenging to differentiate between manipulated and genuine photos generated by modern techniques such as Face2Face, deepfake, and face swap. Current methods require substantial data input and high computational power, and are time-consuming. To overcome these challenges, we propose a novel Hybrid Learning model that ranks features to detect fake and genuine images using less computational power and time, improving accuracy to 99% and the MCC score to 98.87 while reducing computing time by 40 seconds. |
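The abstract's core idea, ranking features by discriminative power and classifying on the reduced set, can be illustrated with a minimal, hypothetical sketch. The scoring rule (a Fisher ratio), the nearest-centroid classifier, and the toy data below are all illustrative assumptions, not the thesis's actual implementation:

```python
# Hypothetical illustration of feature ranking for real-vs-forged classification.
# Rank features by a Fisher-ratio score, keep the top-k, then classify with a
# nearest-centroid rule on the reduced feature set.
import numpy as np

def fisher_scores(X, y):
    """Score each feature: between-class separation over within-class variance."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-9
    return num / den

def rank_and_classify(X_train, y_train, X_test, k=2):
    # Keep only the k most discriminative features.
    top = np.argsort(fisher_scores(X_train, y_train))[::-1][:k]
    Xtr, Xte = X_train[:, top], X_test[:, top]
    # Nearest-centroid decision on the reduced feature set.
    c0 = Xtr[y_train == 0].mean(axis=0)
    c1 = Xtr[y_train == 1].mean(axis=0)
    d0 = np.linalg.norm(Xte - c0, axis=1)
    d1 = np.linalg.norm(Xte - c1, axis=1)
    return (d1 < d0).astype(int)  # 1 = forged, 0 = indigenous

# Toy data: 2 informative features plus 3 pure-noise features.
rng = np.random.default_rng(0)
genuine = np.hstack([rng.normal(0, 1, (50, 2)), rng.normal(0, 1, (50, 3))])
forged = np.hstack([rng.normal(3, 1, (50, 2)), rng.normal(0, 1, (50, 3))])
X = np.vstack([genuine, forged])
y = np.array([0] * 50 + [1] * 50)
pred = rank_and_classify(X, y, X)
accuracy = (pred == y).mean()
```

Ranking before classification is what keeps the computational cost low: the classifier only ever sees the few features that actually separate the classes, which is the general strategy the abstract attributes to the proposed Hybrid Learning model.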
URI: | http://dspace.dtu.ac.in:8080/jspui/handle/repository/20180 |
Appears in Collections: | M.E./M.Tech. Electronics & Communication Engineering |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Kanwardeep Singh Gahlot M.Tech..pdf | | 2.02 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.