Dates
Wednesday, May 11, 2022 - 12:00pm to Wednesday, May 11, 2022 - 01:30pm
Location
NCS 115
Event Description

Abstract: Language is one of the primary ways humans interact with each other and express themselves, and it thus encodes rich psychological traits of the speaker. With the help of natural language processing (NLP), human-generated text such as social media posts is a valuable source for researchers seeking a better understanding of human psychology and behavior. Recently, the advent of the transformer architecture has pushed state-of-the-art performance across many NLP tasks, and these improvements have cascaded to interdisciplinary research involving language. In this thesis, we explore how to take advantage of this powerful deep learning framework to tackle problems in psychology by modifying and adapting transformers to fit the characteristics of psychological studies. We examine and build methods upon two main strengths of transformers: the ability to capture rich contextualized embeddings and the ability to learn from and generate human-like text. We then propose how these methods can be applied to gain deeper insights into the relationship between language and underlying psychological constructs.
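
As context for the first capability mentioned above, the following is a minimal sketch (assuming the Hugging Face transformers library and a generic pretrained BERT model, not code or data from the proposal itself) of how contextualized embeddings might be extracted from short texts such as social media posts and pooled into one vector per post for downstream modeling of psychological constructs.

```python
# Minimal sketch: extracting contextualized embeddings from a pretrained
# transformer. Model choice and pooling strategy are illustrative assumptions,
# not the method presented in the defense.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

posts = ["I had a great day with friends.", "Feeling anxious about tomorrow."]
inputs = tokenizer(posts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings (ignoring padding) to get one vector per post,
# which could then feed a model of an underlying psychological construct.
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # e.g., (2, 768) for a BERT-base model
```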

Event Title
Ph.D. Proposal Defense: Huy Vu, 'Adapting Transformers for Psychology Research'