Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/15617
Full metadata record
DC Field | Value | Language
dc.contributor.author | DWIVEDI, MAYANK | -
dc.date.accessioned | 2017-02-17T06:30:04Z | -
dc.date.available | 2017-02-17T06:30:04Z | -
dc.date.issued | 2014-07 | -
dc.identifier.uri | http://dspace.dtu.ac.in:8080/jspui/handle/repository/15617 | -
dc.description.abstract | The dynamic growth of hidden units and the pruning strategy have been investigated extensively for the Radial Basis Function (RBF) neural network, but relatively little for the feedforward multilayer perceptron (MLP), owing to the similarity of the hidden units in the MLP. This study therefore presents a dynamic neural network that grows the number of hidden-layer neurons during training whenever the entropy of the weights increases. Before the entropy is computed, the weights are normalized to probability values. The entropy used is the non-extensive entropy recently proposed by Susan and Hanmandlu for the representation of structured data. Along with the description of the dynamic growth of hidden-layer neurons using the Susan-Hanmandlu non-extensive entropy, the results are compared with those of the Shannon, Pal and Pal, and Tsallis entropies and of various static neural network configurations, in terms of training time over the set of training samples, growth of hidden-layer neurons, and testing accuracy. Experiments are performed on three standard machine learning datasets and on a synthetic dataset. Incrementally growing the hidden layer as required leads to better tuning of the network weights and high classification performance, as the empirical results demonstrate. | en_US
dc.language.iso | en | en_US
dc.relation.ispartofseries | TD NO.1447; | -
dc.subject | NON-EXTENSIVE ENTROPY | en_US
dc.subject | NEURONS NEURAL NETWORK | en_US
dc.subject | EXTREME LEARNING MACHINE | en_US
dc.subject | DATASETS | en_US
dc.subject | COMPLEXITY | en_US
dc.subject | HIDDEN LAYER | en_US
dc.title | DYNAMIC GROWTH OF HIDDEN-LAYER NEURONS USING THE NON-EXTENSIVE ENTROPY | en_US
dc.type | Thesis | en_US
Appears in Collections:M.E./M.Tech. Information Technology

Files in This Item:
File | Description | Size | Format
mayank thesis.pdf | | 1.34 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
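The abstract describes a growth rule: normalize the network weights to probability values, compute an entropy over them, and add a hidden neuron when that entropy increases during training. The sketch below illustrates only this generic scheme, using Shannon entropy as a stand-in because the exact form of the Susan-Hanmandlu non-extensive entropy is not given in this record; the function names and the small-random-initialization of the new neuron are my own assumptions, not the thesis's implementation.

```python
import numpy as np

def weight_entropy(weights):
    """Normalize |weights| to a probability distribution (as the abstract
    describes), then compute Shannon entropy. In the thesis, the
    Susan-Hanmandlu non-extensive entropy would be used here instead."""
    p = np.abs(np.asarray(weights, dtype=float)).ravel()
    p = p / p.sum()
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return -np.sum(p * np.log(p))

def maybe_grow_hidden(W, prev_entropy, rng):
    """Grow one hidden neuron (one extra row of the input-to-hidden weight
    matrix W) when the weight entropy has increased since the last check.
    Returns the (possibly enlarged) matrix and the current entropy."""
    h = weight_entropy(W)
    if h > prev_entropy:
        new_row = 0.01 * rng.standard_normal((1, W.shape[1]))  # small random init (assumption)
        W = np.vstack([W, new_row])
    return W, h
```

A training loop would call `maybe_grow_hidden` once per epoch, carrying the returned entropy forward as `prev_entropy`, so the hidden layer expands only while the weight distribution keeps becoming more disordered.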