
CS SEMINAR: NLP with Transformer Architectures

Time: July 18, 2024, 3:30, FENS L027 (Hybrid)


Please find the abstract of the talk and the short bio of the speaker below.


Abstract: In recent years, transformer architectures have revolutionized the field of Natural Language Processing (NLP), enabling significant advancements in various applications. This seminar will delve into the use of transformers, with a particular focus on BERT (Bidirectional Encoder Representations from Transformers), for detecting hate speech in Turkish tweets. Our research shows how a fine-tuned BERT model can effectively identify hate speech. The seminar also covers the definition of hate speech, the datasets collected for the study, the annotation process, the model architecture, the training process, and the evaluation metrics used in our study. Finally, future directions for enhancing hate speech detection and the broader implications of transformer architectures in NLP will be discussed.
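The abstract mentions the evaluation metrics used in the study without naming them; for binary hate-speech classification, precision, recall, and F1 score are the standard choices. As an illustrative sketch only (not the speakers' actual evaluation code), the metrics can be computed from gold and predicted labels like this, where label 1 stands for "hate speech":

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute precision, recall, and F1 for the positive class.

    Illustrative implementation; label conventions (1 = hate speech)
    are an assumption, not taken from the study.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1


# Toy example: 6 tweets, gold labels vs. model predictions.
y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
p, r, f = precision_recall_f1(y_true, y_pred)
```

In practice, class imbalance (hate speech is usually the minority class) makes F1 on the positive class far more informative than plain accuracy, which is why these metrics dominate in this line of work.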

Bio: He received his bachelor's degree (2010), master's degree (2012), and Ph.D. (2017) from the Computer Science and Engineering department of Sabancı University. Since September 2017, he has been serving as an instructor at Sabancı University. In addition to undergraduate courses, he teaches the Introduction to Data Analytics and Machine Learning courses in the Professional Master's Program in Data Analytics at Sabancı University, while also providing consultancy and training under the umbrella of the Center of Excellence in Data Analytics (VERİM). His research areas are data mining and machine learning, with a specific focus on natural language processing.