Please use this identifier to cite or link to this item: http://dspace.dtu.ac.in:8080/jspui/handle/repository/20038
Full metadata record
DC Field | Value | Language
dc.contributor.author | AGARWAL, SHUBHAM | -
dc.date.accessioned | 2023-07-11T06:07:11Z | -
dc.date.available | 2023-07-11T06:07:11Z | -
dc.date.issued | 2023-06 | -
dc.identifier.uri | http://dspace.dtu.ac.in:8080/jspui/handle/repository/20038 | -
dc.description.abstract | The Transformer is an innovative architecture in NLP that tackles sequence-to-sequence problems while effectively handling long-range dependencies. It uses neither convolution nor sequence-aligned RNNs, relying solely on self-attention to compute representations of its input and output. The encoder-decoder design is the foundation of the Transformer concept. Building on in-depth research, researchers put forward the transformer-based BERT and GPT models, which significantly improved performance on most NLP tasks, including text generation, text summarization, and question answering, among others. Over time, however, a number of these models' drawbacks became apparent, and PERT was proposed to overcome one of them. In this project work, we fine-tune the pre-trained model on the similarity and paraphrasing task and analyze how it performs in comparison to previously introduced methods. | en_US
dc.language.iso | en | en_US
dc.relation.ispartofseries | TD-6578; | -
dc.subject | PROGRAM EVALUATION AND REVIEW TECHNIQUE | en_US
dc.subject | QUORA | en_US
dc.subject | NATURAL LANGUAGE PROCESSING | en_US
dc.title | QUORA QUESTION PAIRS ANALYSIS USING PERT | en_US
dc.type | Thesis | en_US
Appears in Collections: M.E./M.Tech. Computer Engineering

Files in This Item:
File | Description | Size | Format
Shubham Agarwal M.Tech.pdf | | 2.85 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
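
The abstract above describes fine-tuning a pre-trained Transformer on the question similarity and paraphrasing task. The record itself contains no code; the following is only a minimal sketch of that kind of fine-tuning, assuming the Hugging Face transformers and datasets libraries, the public GLUE "qqp" split as a stand-in for the Quora Question Pairs data, and bert-base-uncased as a stand-in checkpoint, since the actual PERT weights and training configuration used in the thesis are not part of this record.

    # Illustrative sketch only; checkpoint, hyperparameters, and dataset split
    # are assumptions, not taken from the thesis record.
    from datasets import load_dataset
    from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                              Trainer, TrainingArguments)

    checkpoint = "bert-base-uncased"  # placeholder for the PERT checkpoint
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

    # GLUE's QQP task: label 1 means the two questions are paraphrases.
    raw = load_dataset("glue", "qqp")

    def tokenize(batch):
        # Encode both questions as a single paired input for the sequence classifier.
        return tokenizer(batch["question1"], batch["question2"],
                         truncation=True, max_length=128)

    encoded = raw.map(tokenize, batched=True)

    args = TrainingArguments(output_dir="pert-qqp",
                             per_device_train_batch_size=32,
                             num_train_epochs=3,
                             learning_rate=2e-5)

    trainer = Trainer(model=model, args=args,
                      train_dataset=encoded["train"],
                      eval_dataset=encoded["validation"],
                      tokenizer=tokenizer)
    trainer.train()
    print(trainer.evaluate())

With these defaults, the Trainer pads batches dynamically through the tokenizer and reports validation metrics from trainer.evaluate(); swapping the checkpoint string for an actual PERT checkpoint would only approximate, in outline, the setup the abstract describes.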