Evaluating Transformer Models for Social Media Text-Based Personality Profiling

Dublin Core

Title

Evaluating Transformer Models for Social Media Text-Based Personality Profiling

Subject

Profiling Analysis; Transformer; BERT Variants

Description

This research evaluates the performance of various Transformer models on social media-based classification tasks, focusing specifically on applications in personality profiling. With growing interest in leveraging social media as a data source for understanding individual personality traits, selecting an appropriate model is crucial for enhancing accuracy and efficiency in large-scale data processing. Accurate personality profiling can provide valuable insights for applications in psychology, marketing, and personalized recommendations. In this context, the study examines BERT, RoBERTa, DistilBERT, TinyBERT, MobileBERT, and ALBERT to understand their performance differences under varying configurations and dataset conditions, and to assess their suitability for nuanced personality profiling tasks.

The research methodology comprises four experimental scenarios with a structured process that includes data acquisition, preprocessing, tokenization, model fine-tuning, and evaluation. In Scenarios 1 and 2, the full dataset of 9,920 data points was used with standard fine-tuning parameters for all models; in Scenario 2, however, ALBERT was optimized using a customized batch size, learning rate, and weight decay. Scenarios 3 and 4 used 30% of the total dataset, again with additional adjustments for ALBERT to examine its performance under specific conditions. Each scenario is designed to test model robustness against variations in parameters and dataset size.

The experimental results underscore the importance of tailoring fine-tuning parameters to optimize model performance, particularly for parameter-efficient models such as ALBERT. ALBERT and MobileBERT demonstrated strong performance across conditions, excelling in scenarios requiring both accuracy and efficiency. BERT proved to be a robust and reliable choice, maintaining high performance even with reduced data, while RoBERTa and DistilBERT may require further adjustments to adapt to data-limited conditions. Although efficient, TinyBERT may fall short on tasks demanding high accuracy due to its limited representational capacity. Selecting the right model requires balancing computational efficiency, task-specific requirements, and data complexity.

Creator

Anggit Dwi Hartanto, Ema Utami, Arief Setyanto, Kusrini

Source

https://jurnal.iaii.or.id/index.php/RESTI/article/view/6157/1005

Publisher

Faculty of Computer Science, Universitas Amikom Yogyakarta, Yogyakarta, Indonesia

Date

21-01-2025

Contributor

FAJAR BAGUS W

Format

PDF

Language

ENGLISH

Type

TEXT

Files

Collection

Citation

Anggit Dwi Hartanto, Ema Utami, Arief Setyanto, and Kusrini, “Evaluating Transformer Models for Social Media Text-Based Personality Profiling,” Repository Horizon University Indonesia, accessed January 26, 2026, https://repository.horizon.ac.id/items/show/10474.