Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/18822
Title: IMPROVING IMAGE RESOLUTION USING GENERATIVE ADVERSARIAL NETWORKS
Authors: DHAWAN, SUMIT
Keywords: IMAGE SUPER RESOLUTION
DEEP LEARNING
GENERATIVE ADVERSARIAL NETWORKS
SUPER RESOLUTION MODELS
Issue Date: Oct-2020
Publisher: DELHI TECHNOLOGICAL UNIVERSITY
Series/Report no.: TD - 5354;
Abstract: Despite advances in the accuracy and speed of image super-resolution models, such as increasingly accurate Convolutional Neural Networks (CNNs), the results remain unsatisfactory: the high-resolution images they produce generally lack fine, high-frequency texture detail. Most models in this area optimize objective functions that minimize the Mean Square Error (MSE). Although this yields images with a higher Peak Signal to Noise Ratio (PSNR), such images are perceptually unsatisfying and lack fidelity and high-frequency detail when viewed at high resolution. Generative Adversarial Networks (GANs), a class of deep learning models, can be applied to this problem. In this work, we show how a GAN can produce perceptually convincing images with a competitive PSNR score as well as a good Perceptual Index (PI) compared to other models. In contrast to the existing Super Resolution GAN model, we introduce several modifications to improve image quality: replacing the batch normalization layers with weight normalization layers, using a modified dense residual block, extracting the features used for comparison before the activation layer, using a relativistic discriminator instead of the standard discriminator of the vanilla GAN, and using the Mean Absolute Error (MAE) in the model.
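The abstract names several architectural and loss-function changes (weight normalization in place of batch normalization, a relativistic discriminator, an MAE pixel loss). Below is a minimal PyTorch sketch of two of these ideas; the layer sizes, class names, and example shapes are illustrative assumptions and are not taken from the thesis itself.

import torch
import torch.nn as nn
from torch.nn.utils import weight_norm


class WNResidualBlock(nn.Module):
    """Residual block using weight normalization instead of batch normalization."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            weight_norm(nn.Conv2d(channels, channels, 3, padding=1)),
            nn.PReLU(),
            weight_norm(nn.Conv2d(channels, channels, 3, padding=1)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity skip connection around the weight-normalized convolutions.
        return x + self.body(x)


def relativistic_gan_losses(d_real: torch.Tensor, d_fake: torch.Tensor):
    """Relativistic average GAN losses.

    The discriminator estimates whether a real image is more realistic than
    the average fake image (and vice versa), rather than classifying each
    image in isolation as the vanilla GAN discriminator does.
    """
    bce = nn.BCEWithLogitsLoss()
    ones, zeros = torch.ones_like(d_real), torch.zeros_like(d_real)

    d_loss = (bce(d_real - d_fake.mean(), ones) +
              bce(d_fake - d_real.mean(), zeros)) / 2
    g_loss = (bce(d_real - d_fake.mean(), zeros) +
              bce(d_fake - d_real.mean(), ones)) / 2
    return d_loss, g_loss


if __name__ == "__main__":
    block = WNResidualBlock()
    sr = block(torch.randn(1, 64, 32, 32))               # super-resolved features
    pixel_loss = nn.L1Loss()(sr, torch.randn_like(sr))   # MAE pixel loss
    d_loss, g_loss = relativistic_gan_losses(torch.randn(4, 1), torch.randn(4, 1))
    print(pixel_loss.item(), d_loss.item(), g_loss.item())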
URI: http://dspace.dtu.ac.in:8080/jspui/handle/repository/18822
Appears in Collections:M.E./M.Tech. Computer Engineering

Files in This Item:
File: Thesis_Sumit.pdf (2.1 MB, Adobe PDF)

