Please use this identifier to cite or link to this item:
http://dspace.dtu.ac.in:8080/jspui/handle/repository/21778
Title: | MEASURING AND MITIGATING GENDER BIAS IN LEGAL CONTEXTUALIZED LANGUAGE MODELS |
Authors: | NAYAK, ANANYA |
Keywords: | MITIGATING GENDER BIAS; NLP APPLICATION; AI MODELS; LEGAL CONTEXTUALIZED LANGUAGE MODELS |
Issue Date: | May-2025 |
Series/Report no.: | TD-7988; |
Abstract: | Large language models (LLMs) in NLP can now summarize, translate, and generate text with remarkable success. Nonetheless, these models can encode biases, including gender bias, which has drawn growing criticism. The issue arises because training data often reflects cultural preconceptions that are then reinforced as the model is refined. These biases can entrench stereotypes, distort societal attitudes, and cause applications such as automated hiring, educational tools, and customer support to treat people unequally. This research examines why gender bias arises in LLMs, the impact it has on NLP applications, and how it can be mitigated to make AI more equitable. Long-standing social inequality has left gender bias embedded in legal texts, court cases, and established practice. It appears in the way certain crimes or positions are more often associated with one gender, and in how legal documents word descriptions of roles and rules. For example, if AI systems carry gender bias into the courtroom, they could skew both sentencing and the way models predict case outcomes. The problem is especially hard in legal language processing because legal texts are complex and rely on subtle contextual cues. Well-structured mitigation efforts must make legal information fairer while preserving its accuracy. |
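One common way to quantify the kind of bias the abstract describes is to compare a masked language model's fill-in probabilities for gendered words in a legal-domain template. The sketch below is illustrative only: the template sentence and the probability values are hypothetical placeholders standing in for real model outputs (e.g. from a fill-mask pipeline), and the log-ratio score is one simple bias metric, not necessarily the one used in this thesis.

```python
import math

def bias_score(p_male: float, p_female: float) -> float:
    """Log probability ratio: 0 means no preference, positive values
    mean the model favours the male term in this context."""
    return math.log(p_male / p_female)

# Hypothetical fill-mask probabilities for the template
# "The [MASK] sentenced the defendant to five years."
# (placeholder numbers, not outputs of any real model)
probs = {"he": 0.62, "she": 0.11}

score = bias_score(probs["he"], probs["she"])
print(f"log-ratio bias score: {score:.3f}")
```

Averaging such scores over many templates gives a corpus-level bias estimate; mitigation methods then aim to push the average toward zero without degrading the model's accuracy on legal text.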
URI: | http://dspace.dtu.ac.in:8080/jspui/handle/repository/21778 |
Appears in Collections: | M.E./M.Tech. Computer Engineering |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Ananya Nayak M.Tech..pdf | | 2.11 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.