Fine-tuning bidirectional encoder representations from transformers for the X social media personality detection

International Journal of Artificial Intelligence


Abstract

Understanding personality traits can help individuals reach their full potential and has applications in various fields such as recruitment, advertising, and marketing. A widely used tool for assessing personality is the Myers-Briggs type indicator (MBTI). Recent advances in technology have enabled research into detecting personality from social media use. Previous research has progressed from machine learning methods to deep learning methods and, more recently, transformer-based methods. However, these approaches require extensive data and impose a high computational load. Although transformer-based methods such as bidirectional encoder representations from transformers (BERT) excel at understanding context, they still have limitations in capturing word order and stylistic variations. Therefore, this study proposes integrating fine-tuned BERT with recurrent neural networks (RNNs), namely the vanilla RNN, long short-term memory (LSTM), and gated recurrent unit (GRU). A BERT base model with a fully connected layer is used as a comparison. The results show that the BERT base fully connected layer strategy achieves the best evaluation results on the extraversion/introversion (E/I) class at 0.562 and the feeling/thinking (F/T) class at 0.538, while the BERT+LSTM strategy achieves the highest accuracy for the intuition/sensing (N/S) class at 0.543 and the judging/perceiving (J/P) class at 0.532.
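To illustrate the two head variants compared in the abstract, the following is a minimal sketch (not the authors' released code) of a fine-tuned BERT encoder topped with either a plain fully connected layer or an LSTM layer, predicting one binary MBTI dimension (e.g., E/I). The pretrained checkpoint name, hidden sizes, and bidirectional LSTM are assumptions for illustration only.

```python
# Sketch of BERT + fully connected vs. BERT + LSTM classification heads
# for a single binary MBTI dimension. Assumes PyTorch and Hugging Face
# transformers; "bert-base-uncased" and the hidden size are assumed values.
import torch
import torch.nn as nn
from transformers import BertModel


class BertMBTIClassifier(nn.Module):
    def __init__(self, head: str = "lstm", lstm_hidden: int = 128):
        super().__init__()
        # BERT encoder is fine-tuned end to end together with the head.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.head = head
        if head == "lstm":
            # The LSTM reads the full token sequence, adding sequential
            # (word-order) modeling on top of BERT's contextual embeddings.
            self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                                batch_first=True, bidirectional=True)
            self.classifier = nn.Linear(2 * lstm_hidden, 2)  # 2 classes per MBTI axis
        else:
            # Baseline: a single fully connected layer on the pooled [CLS] vector.
            self.classifier = nn.Linear(self.bert.config.hidden_size, 2)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        if self.head == "lstm":
            _, (h_n, _) = self.lstm(out.last_hidden_state)
            # Concatenate the last forward and backward hidden states.
            feat = torch.cat([h_n[-2], h_n[-1]], dim=-1)
        else:
            feat = out.pooler_output
        return self.classifier(feat)
```

In this setup, one such classifier would be trained per MBTI axis (E/I, N/S, F/T, J/P), with a GRU or vanilla RNN substituted for the LSTM to reproduce the other recurrent variants.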
