grafzahl
fine-tuning Transformers for text data from within R
Abstract
This paper introduces `grafzahl`, an R package for fine-tuning Transformers for text data from within R. The package combines the ease of use of the `quanteda` R ecosystem and the state-of-the-art `Transformers` Python library. In this paper, the package is used to reproduce the analyses of published communication papers and of non-Germanic benchmark datasets. A very significant improvement in model accuracy over traditional machine learning approaches such as Convolutional Neural Networks is observed. `grafzahl` might play a role in the mainstreaming of Transformer-based machine learning methods for communication research and beyond.
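To illustrate the workflow the abstract describes, the sketch below shows how fine-tuning and prediction might look from within R. It is a minimal, hedged example: it assumes the package's exported `grafzahl()` and `predict_grafzahl()` functions, and the exact argument names and defaults may differ across package versions; the toy corpus, the docvar name `sentiment`, and the choice of the `distilbert-base-uncased` checkpoint are illustrative assumptions, not taken from the paper.

```r
## Minimal sketch, not the paper's own analysis: fine-tune a pretrained
## Transformer on a labelled quanteda corpus, then classify new documents.
library(quanteda)
library(grafzahl)

## toy corpus with a document-level label to learn (illustrative data)
corp <- corpus(c("great product, would buy again", "terrible service"),
               docvars = data.frame(sentiment = c("pos", "neg")))

## fine-tune a Hugging Face checkpoint on the labelled corpus
## (model_name is an assumption; any suitable checkpoint could be used)
model <- grafzahl(x = corp, y = "sentiment",
                  model_name = "distilbert-base-uncased")

## predict labels for unseen documents with the fine-tuned model
predict_grafzahl(model, newdata = c("an awful experience"))
```

The point of the sketch is the division of labour the abstract claims: `quanteda` objects supply the data interface on the R side, while the fine-tuning itself is delegated to the Python `Transformers` stack behind a single function call.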
Copyright (c) 2023 Chung-hong Chan

This work is licensed under a Creative Commons Attribution 4.0 International License.