Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/15617
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | DWIVEDI, MAYANK | - |
dc.date.accessioned | 2017-02-17T06:30:04Z | - |
dc.date.available | 2017-02-17T06:30:04Z | - |
dc.date.issued | 2014-07 | - |
dc.identifier.uri | http://dspace.dtu.ac.in:8080/jspui/handle/repository/15617 | - |
dc.description.abstract | The dynamic growth and pruning of hidden units have been investigated extensively for Radial Basis Function (RBF) neural networks, but relatively little for the feedforward multilayer perceptron (MLP), owing to the similarity of the hidden units in the MLP. This study therefore presents a dynamic neural network that grows the number of hidden-layer neurons whenever the entropy of the weights increases during training. Before the entropy is computed, the weights are normalized to probability values. The entropy used is the non-extensive entropy recently proposed by Susan and Hanmandlu for the representation of structured data. Along with a description of the dynamic growth of hidden-layer neurons using the Susan and Hanmandlu non-extensive entropy, the results are compared with those obtained using the Shannon, Pal and Pal, and Tsallis entropies, and with various static neural network configurations, in terms of execution time on the training samples, growth of hidden-layer neurons, and testing accuracy. Experiments are performed on three standard machine learning datasets and one synthetic dataset. The empirical results show that incrementally growing the hidden layer as required leads to better tuning of the network weights and high classification performance. | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartofseries | TD NO.1447; | - |
dc.subject | NON-EXTENSIVE ENTROPY | en_US |
dc.subject | NEURONS NEURAL NETWORK | en_US |
dc.subject | EXTREME LEARNING MACHINE | en_US |
dc.subject | DATASETS | en_US |
dc.subject | COMPLEXITY | en_US |
dc.subject | HIDDEN LAYER | en_US |
dc.title | DYNAMIC GROWTH OF HIDDEN-LAYER NEURONS USING THE NON-EXTENSIVE ENTROPY | en_US |
dc.type | Thesis | en_US |
Appears in Collections: | M.E./M.Tech. Information Technology |
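The growth rule summarized in the abstract (normalize the weights to a probability distribution, compute an entropy over them, and add a hidden neuron when the entropy rises between epochs) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: Shannon entropy is used here as a stand-in for the Susan and Hanmandlu non-extensive entropy, and the function names, the initialization scale, and the one-neuron-per-check growth step are assumptions for the example.

```python
import numpy as np

def weight_entropy(w):
    # Normalize the weight magnitudes to a probability distribution,
    # then compute Shannon entropy. (The thesis uses the Susan and
    # Hanmandlu non-extensive entropy; Shannon is a stand-in here.)
    p = np.abs(w).ravel()
    p = p / p.sum()
    p = p[p > 0]  # drop zero-probability terms (0 * log 0 := 0)
    return float(-(p * np.log(p)).sum())

def grow_if_entropy_rises(weights, prev_entropy, rng):
    # Hypothetical growth step: append one hidden unit (a new weight
    # column) whenever the entropy of the current hidden-layer weights
    # exceeds the value recorded at the previous training epoch.
    h = weight_entropy(weights)
    if prev_entropy is not None and h > prev_entropy:
        new_col = 0.1 * rng.standard_normal((weights.shape[0], 1))
        weights = np.hstack([weights, new_col])
    return weights, h
```

In a training loop, `grow_if_entropy_rises` would be called once per epoch with the entropy from the previous epoch, so the hidden layer expands only while the weight distribution is still spreading out.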
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
mayank thesis.pdf | | 1.34 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.