Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/15617
Title: DYNAMIC GROWTH OF HIDDEN-LAYER NEURONS USING THE NON-EXTENSIVE ENTROPY
Authors: DWIVEDI, MAYANK
Keywords: NON-EXTENSIVE ENTROPY
NEURONS
NEURAL NETWORK
EXTREME LEARNING MACHINE
DATASETS
COMPLEXITY
HIDDEN LAYER
Issue Date: Jul-2014
Series/Report no.: TD NO.1447;
Abstract: The dynamic growth of hidden units and the pruning strategy have been investigated extensively for the Radial Basis Function (RBF) neural network, but relatively little for the feedforward multilayer perceptron (MLP), owing to the similarity of the hidden units in the MLP. This study therefore presents a dynamic neural network that grows the number of hidden-layer neurons during training whenever the entropy of the network weights increases. Before the entropy is computed, the weights are normalized to probability values. The entropy used is the non-extensive entropy recently proposed by Susan and Hanmandlu for the representation of structured data. Along with a description of the dynamic growth of hidden-layer neurons using the Susan-Hanmandlu non-extensive entropy, results are compared against the Shannon, Pal and Pal, and Tsallis entropies, and against various static neural network configurations, in terms of execution time on the set of training samples, growth of hidden-layer neurons, and testing accuracy. Experiments are performed on three standard machine-learning datasets and one synthetic dataset. Incrementally growing the hidden layer as required leads to better tuning of the network weights and high classification performance, as the empirical results show.
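As a rough illustration of the growth rule described in the abstract, the following Python/NumPy sketch shows weights normalized to a probability distribution, an entropy value computed over them, and a hidden neuron added whenever the entropy rises between epochs. The Shannon, Pal and Pal, and Tsallis formulas below are the standard ones; the Susan-Hanmandlu non-extensive entropy used in the thesis would plug in as another entropy_fn (its exact form is given in the thesis and is not reproduced here). The normalization choice, the per-epoch growth trigger, and the network API names (hidden_weights, train_one_epoch, add_hidden_neuron) are illustrative assumptions, not the thesis's actual implementation.

    import numpy as np

    def normalize_to_probabilities(weights):
        # Assumed normalization: absolute weight values scaled to sum to 1.
        p = np.abs(weights).ravel()
        p = p / p.sum()
        return p[p > 0]  # drop zeros so the log terms are defined

    # --- candidate entropy measures compared in the thesis ---
    def shannon_entropy(p):
        return -np.sum(p * np.log(p))

    def pal_pal_entropy(p):
        # Pal and Pal's exponential entropy: sum_i p_i * e^(1 - p_i)
        return np.sum(p * np.exp(1.0 - p))

    def tsallis_entropy(p, q=2.0):
        # Tsallis non-extensive entropy: (1 - sum_i p_i^q) / (q - 1)
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def train_with_dynamic_growth(net, data, entropy_fn, epochs=100):
        # Grow one hidden neuron whenever the weight entropy increases
        # between epochs (the trigger described in the abstract).
        # net is a hypothetical MLP object with the methods used below.
        prev = entropy_fn(normalize_to_probabilities(net.hidden_weights))
        for _ in range(epochs):
            net.train_one_epoch(data)
            h = entropy_fn(normalize_to_probabilities(net.hidden_weights))
            if h > prev:               # entropy rose: add capacity
                net.add_hidden_neuron()
            prev = h
        return net

    # Self-contained example: compare the measures on a random weight matrix.
    rng = np.random.default_rng(0)
    w = rng.normal(size=(8, 5))        # 8 inputs x 5 hidden neurons
    p = normalize_to_probabilities(w)
    print(shannon_entropy(p), pal_pal_entropy(p), tsallis_entropy(p))

Any monotone entropy of the normalized weights can serve as the trigger; the thesis's point of comparison is which measure (Shannon, Pal and Pal, Tsallis, or Susan-Hanmandlu) yields the best trade-off between network size, training time, and test accuracy.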
URI: http://dspace.dtu.ac.in:8080/jspui/handle/repository/15617
Appears in Collections: M.E./M.Tech. Information Technology

Files in This Item:
File: mayank thesis.pdf
Size: 1.34 MB
Format: Adobe PDF

