Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/20038

| Title: | QUORA QUESTION PAIRS ANALYSIS USING PERT |
| Authors: | AGARWAL, SHUBHAM |
| Keywords: | PROGRAM EVALUATION AND REVIEW TECHNIQUE; QUORA; NATURAL LANGUAGE PROCESSING |
| Issue Date: | Jun-2023 |
| Series/Report no.: | TD-6578; |
| Abstract: | An innovative NLP architecture called the Transformer tackles sequence-to-sequence problems while skillfully managing long-range dependencies. It uses no convolution or sequence-aligned RNNs; it relies solely on self-attention to compute representations of its input and output. The encoder-decoder design is the foundation of the Transformer concept. After conducting in-depth study, researchers put forward the transformer-based BERT and GPT models, which significantly improved the bulk of NLP tasks, including text generation, text summarization, and question answering, among others. As time went on, however, a number of these models' drawbacks became apparent, and PERT was proposed as a way around one of them. In this project work, we fine-tune the pre-trained model on the similarity and paraphrasing task and analyze how it performs in comparison to previously introduced methods. |
| URI: | http://dspace.dtu.ac.in:8080/jspui/handle/repository/20038 |
| Appears in Collections: | M.E./M.Tech. Computer Engineering |
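The abstract describes fine-tuning a pre-trained PERT model for the similarity and paraphrasing task on Quora question pairs. As a rough illustration of that setup (not taken from the thesis), the sketch below fine-tunes a sequence-pair classifier with the Hugging Face `transformers` and `datasets` libraries. The checkpoint name `hfl/chinese-pert-base` is the publicly released PERT checkpoint, and the dataset id and hyperparameters are placeholders, not values stated in the record.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Quora Question Pairs: each example holds two questions and a duplicate label.
dataset = load_dataset("quora", split="train")

# Assumption: the publicly released PERT checkpoint; the thesis does not
# state which exact checkpoint was fine-tuned.
checkpoint = "hfl/chinese-pert-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

def preprocess(batch):
    # Encode the two questions of each pair as one sequence-pair input.
    first = [q["text"][0] for q in batch["questions"]]
    second = [q["text"][1] for q in batch["questions"]]
    encoded = tokenizer(first, second, truncation=True, max_length=128)
    encoded["labels"] = [int(label) for label in batch["is_duplicate"]]
    return encoded

train_data = dataset.map(preprocess, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="pert-qqp",           # placeholder output directory
    learning_rate=2e-5,              # placeholder hyperparameters
    per_device_train_batch_size=32,
    num_train_epochs=3,
)

Trainer(model=model, args=args, train_dataset=train_data, tokenizer=tokenizer).train()
```

Because PERT changes only the pre-training objective (a permuted language model in place of masked language modeling), downstream fine-tuning follows the same sequence-pair classification recipe as BERT, which is what the sketch assumes.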
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| Shubham Agarwal M.Tech.pdf | | 2.85 MB | Adobe PDF | View/Open |