Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/20038
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | AGARWAL, SHUBHAM | - |
dc.date.accessioned | 2023-07-11T06:07:11Z | - |
dc.date.available | 2023-07-11T06:07:11Z | - |
dc.date.issued | 2023-06 | - |
dc.identifier.uri | http://dspace.dtu.ac.in:8080/jspui/handle/repository/20038 | - |
dc.description.abstract | The Transformer is an innovative NLP architecture that tackles sequence-to-sequence problems while skillfully handling long-range dependencies. It uses neither convolution nor sequence-aligned RNNs; it relies solely on self-attention to compute representations of its input and output. The encoder-decoder design is the foundation of the Transformer concept. Building on in-depth study, researchers proposed the transformer-based BERT and GPT models, which significantly improved the bulk of NLP tasks, including text generation, text summarization, and question answering, among others. Over time, however, a number of these models' drawbacks became apparent, and PERT was proposed as a way to overcome one of them. In this project work, we fine-tune the pre-trained model on the similarity and paraphrasing task and analyze how the model performs in comparison with other previously introduced methods. | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartofseries | TD-6578; | - |
dc.subject | PROGRAM EVALUATION AND REVIEW TECHNIQUE | en_US |
dc.subject | QUORA | en_US |
dc.subject | NATURAL LANGUAGE PROCESSING | en_US |
dc.title | QUORA QUESTION PAIRS ANALYSIS USING PERT | en_US |
dc.type | Thesis | en_US |
Appears in Collections: | M.E./M.Tech. Computer Engineering |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Shubham Agarwal M.Tech.pdf | | 2.85 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.