Bayes at FigLang 2022 Euphemism detection shared task: Cost-sensitive Bayesian fine-tuning and Venn-Abers predictors for robust training under class skewed distributions

Files
2022.flp-1.13.pdf (178.17 KB)
Published Version
Date
2022-12
Authors
Trust, Paul
Provia, Kadusabe
Omala, Kizito
Publisher
Association for Computational Linguistics
Abstract
Transformers have achieved state-of-the-art performance across most natural language processing tasks. However, the performance of these models degrades when they are trained on skewed class distributions (class imbalance), because training tends to be biased towards the head classes that hold most of the data points. Classical methods proposed to handle this problem (re-sampling and re-weighting) often suffer from unstable performance, poor applicability and poor calibration. In this paper, we propose to use Bayesian methods and Venn-Abers predictors for well-calibrated and robust training against class imbalance. Our proposed approach improves the F1-score of the baseline RoBERTa (A Robustly Optimized BERT Pretraining Approach) model by about 6 points (79.0% against 72.6%) when training with class-imbalanced data.
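To make the abstract's two ingredients concrete, the sketch below illustrates (a) a cost-sensitive cross-entropy loss for fine-tuning a RoBERTa classifier, with class weights set to inverse class frequency, and (b) an inductive Venn-Abers calibrator in the style of Vovk and Petej, which refits isotonic regression with the test point tentatively labelled 0 and then 1. This is a minimal sketch, not the authors' implementation: the class counts are hypothetical, the Bayesian (weight-uncertainty) component of the paper's fine-tuning is omitted, and a binary euphemism/non-euphemism label set is assumed.

```python
import numpy as np
import torch
from sklearn.isotonic import IsotonicRegression
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# --- (a) Cost-sensitive fine-tuning: weight the loss by inverse class frequency ---
n_per_class = torch.tensor([1200.0, 400.0])      # hypothetical [majority, minority] counts
class_weights = n_per_class.sum() / (2.0 * n_per_class)
loss_fn = torch.nn.CrossEntropyLoss(weight=class_weights)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

batch = tokenizer(["an example sentence with a euphemism"],
                  return_tensors="pt", truncation=True)
labels = torch.tensor([1])
logits = model(**batch).logits
loss = loss_fn(logits, labels)                   # back-propagate this during training
loss.backward()

# --- (b) Inductive Venn-Abers calibration of a single test score ---
def venn_abers(cal_scores, cal_labels, test_score):
    """Return the multiprobability pair (p0, p1) and a single merged probability."""
    p = np.zeros(2)
    for y in (0, 1):
        # Refit isotonic regression on the calibration set augmented with
        # the test point tentatively labelled y.
        iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
        iso.fit(np.append(cal_scores, test_score), np.append(cal_labels, y))
        p[y] = iso.predict([test_score])[0]
    p0, p1 = p
    return p0, p1, p1 / (1.0 - p0 + p1)          # standard merging rule

# Toy usage with synthetic calibration scores:
rng = np.random.default_rng(0)
cal_scores = rng.random(200)
cal_labels = (cal_scores + 0.2 * rng.standard_normal(200) > 0.5).astype(int)
print(venn_abers(cal_scores, cal_labels, test_score=0.7))
```

The width of the interval [p0, p1] reflects how well calibrated the underlying score is near the test point; the merged value p1 / (1 - p0 + p1) is the usual single-probability summary when one number is needed.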
Keywords
Natural language processing, Transformers, Bayesian methods, Venn-Abers predictors
Citation
Trust, P., Provia, K. and Omala, K. (2022) 'Bayes at FigLang 2022 Euphemism detection shared task: Cost-sensitive Bayesian fine-tuning and Venn-Abers predictors for robust training under class skewed distributions', Proceedings of the 3rd Workshop on Figurative Language Processing (FLP), pp. 94-99. Available at: https://aclanthology.org/2022.flp-1.13/ (Accessed: 22 February 2023)