Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/19882

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | PASBOLA, HRITICK | - |
| dc.date.accessioned | 2023-06-14T05:40:41Z | - |
| dc.date.available | 2023-06-14T05:40:41Z | - |
| dc.date.issued | 2023-06 | - |
| dc.identifier.uri | http://dspace.dtu.ac.in:8080/jspui/handle/repository/19882 | - |
| dc.description.abstract | Text classification is the task of assigning documents to predefined categories. With the number of documents growing at an exponential rate, automatic classification is valuable for text retrieval, news categorization, spam detection, and many other applications. We combine BERT with different deep learning techniques (BiGRU, 1-D CNN, GRU-CNN, and TCN-CNN) and compare their performance with some of the popular methods used in text classification. We first convert each document into a BERT embedding using a pre-trained BERT model and then feed it into each model. When an ensemble method is used, we merge the outputs of the models by stacking to obtain the final result. Comparing the models on accuracy, we observe that BERT+GRU-CNN and BERT+TCN-CNN perform best among the models used in this thesis and are on par with some of the popular methods. | en_US |
| dc.language.iso | en | en_US |
| dc.relation.ispartofseries | TD-6443; | - |
| dc.subject | TEXT CLASSIFICATION | en_US |
| dc.subject | DEEP LEARNING METHODS | en_US |
| dc.subject | TCN-CNN | en_US |
| dc.subject | GRU-CNN | en_US |
| dc.title | TEXT CLASSIFICATION USING DEEP LEARNING METHODS | en_US |
| dc.type | Thesis | en_US |
| Appears in Collections: | M.E./M.Tech. Computer Engineering | |
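The abstract describes a pipeline in which pre-trained BERT embeddings are fed to several base models and their outputs are merged by stacking. The sketch below illustrates only the stacking step, under toy assumptions: random vectors stand in for BERT document embeddings (a real pipeline would obtain 768-dimensional vectors from a pre-trained BERT model), and tiny NumPy softmax classifiers stand in for the thesis's base architectures (BiGRU, GRU-CNN, TCN-CNN); all names and dimensions here are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for BERT document embeddings: 200 "documents",
# 16-dimensional vectors, 2 classes (hypothetical sizes).
n, d, k = 200, 16, 2
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(int)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_linear(X, y, steps=300, lr=0.5):
    """Tiny softmax classifier standing in for a base model
    (BiGRU, GRU-CNN, TCN-CNN, ... in the thesis)."""
    W = np.zeros((X.shape[1], k))
    Y = np.eye(k)[y]                      # one-hot labels
    for _ in range(steps):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / len(X)  # gradient of cross-entropy
    return W

# Two base models trained on different feature subsets so their
# predictions differ, mimicking distinct architectures on the
# same BERT embeddings.
W1 = train_linear(X[:, :8], y)
W2 = train_linear(X[:, 8:], y)
P1 = softmax(X[:, :8] @ W1)
P2 = softmax(X[:, 8:] @ W2)

# Stacking: concatenate the base models' class probabilities and
# train a meta-classifier on them to produce the final prediction.
Z = np.hstack([P1, P2])
Wm = train_linear(Z, y, steps=500, lr=1.0)
pred = softmax(Z @ Wm).argmax(axis=1)
acc = (pred == y).mean()
print(f"stacked accuracy: {acc:.2f}")
```

The key design point stacking illustrates is that the meta-classifier learns *how much to trust each base model* from their predicted probabilities, rather than merging them with a fixed rule such as averaging or majority voting.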
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| HritickPasbola MTech.pdf | | 1.16 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.