Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/20109
Title: CONVOLUTIONAL NETWORK FEATURE HIERARCHY FOR HYPER SPECTRAL IMAGE CLASSIFICATION
Authors: THAPLIYAL, ANKITA
Keywords: HYPER SPECTRAL IMAGE CLASSIFICATION
CONVOLUTIONAL NEURAL NETWORK
REMOTE SENSING
SUPERVISED LEARNING
Issue Date: Jul-2021
Series/Report no.: TD-6666;
Abstract: Hyperspectral image classification is a recent technique that has attracted wide research interest. It is an application of remote sensing: a satellite provides several images of a particular land area or vegetation cover, which scientists then study to analyse the surface and extract target information, and from this target information further conclusions about the area can be drawn. These images can be obtained by either active or passive remote sensing, depending on the user's choice. Once the images are received by the sensors, they are analysed not only in the visible spectrum but also in the ultraviolet and infrared regions of the electromagnetic spectrum; this technique is known as hyperspectral imaging and is performed with dedicated hyperspectral sensors. It has many advantages over multispectral imaging, in which the number of spectral bands is comparatively small: because hyperspectral imaging records many more bands than multispectral imaging, targets and images are recognised with greater specificity and accuracy, and more significant information is obtained. Once data have been acquired through hyperspectral imaging, further processing is needed to study the target area in greater depth. The input arrives as three-dimensional image data (two spatial dimensions and one spectral dimension), and these images must be classified into the categories they contain. For instance, given hyperspectral data of a vegetation area, this three-dimensional image data must be classified into the different categories of vegetation present in that particular portion of land.
This whole process is known as image classification, which is currently a major topic in machine learning. Deep neural networks now make it possible to classify a large number of images at a time with much higher accuracy and reduced complexity. In the past few decades many researchers have proposed their own supervised models for image classification over large datasets, but owing to drawbacks such as lower accuracy and higher complexity, these models have been overtaken by convolutional neural networks. Supervised learning is a type of machine learning task in which the model learns from the inputs and corresponding outputs provided at training time; methods such as SVM and CNN, which we use for classification, are supervised methods. Hence, in this project, instead of using multispectral data, we have discussed the use of hyperspectral data. This thesis consists of six chapters. Chapter 1 discusses the basics of remote sensing and its types. Chapter 2 describes the imaging methods along with their advantages and disadvantages, so that a suitable one can be chosen for the objective. Chapter 3 examines multiple supervised methods, such as SVM, CNN and ANN, that help in the classification of hyperspectral image data. Chapters 4 and 5 discuss the latest model, with increased accuracy and reduced complexity.
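The supervised, per-pixel nature of hyperspectral classification described above can be illustrated with a minimal sketch. This is not the thesis's CNN model: it uses a simple nearest-centroid rule on a tiny synthetic data cube, and all class names, spectra, and sizes below are illustrative assumptions. A real hyperspectral cube has hundreds of bands and would be classified with a trained SVM or CNN.

```python
# Minimal sketch (NOT the thesis's CNN): supervised per-pixel classification
# of a synthetic hyperspectral cube using a nearest-centroid rule.
# All class names, spectra, and dimensions are illustrative assumptions.
import math

BANDS = 8  # real hyperspectral sensors record hundreds of bands; 8 keeps the demo small

# Hypothetical class "signatures": mean reflectance spectra that would be
# estimated from labelled training pixels in a real supervised workflow.
centroids = {
    "vegetation": [0.10, 0.20, 0.40, 0.80, 0.90, 0.85, 0.80, 0.75],
    "bare_soil":  [0.30, 0.35, 0.40, 0.45, 0.50, 0.50, 0.50, 0.50],
}

def classify_pixel(spectrum):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    def dist(centroid):
        return math.sqrt(sum((s - m) ** 2 for s, m in zip(spectrum, centroid)))
    return min(centroids, key=lambda name: dist(centroids[name]))

def classify_cube(cube):
    """cube: H x W x BANDS nested lists -> H x W map of class labels."""
    return [[classify_pixel(pixel) for pixel in row] for row in cube]

# Synthetic 1 x 2 cube: one vegetation-like pixel, one soil-like pixel.
cube = [[
    [0.12, 0.18, 0.42, 0.78, 0.88, 0.84, 0.79, 0.74],
    [0.31, 0.34, 0.41, 0.46, 0.49, 0.51, 0.52, 0.48],
]]
print(classify_cube(cube))  # -> [['vegetation', 'bare_soil']]
```

A CNN replaces the fixed centroid distance with learned convolutional features over a spatial-spectral neighbourhood, which is what gives it the accuracy advantage the abstract describes.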
Appears in Collections:M.E./M.Tech. Electronics & Communication Engineering

Files in This Item:
File: Ankita Thapliyal M.Tech..pdf
Size: 3.06 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.