Users are encouraged to cite not only pangoling but also the Python package `transformers` (and the specific LLM they are using):

Nicenboim B (2025). _pangoling: Access to large language model predictions in R_. R package version 1.0.1, doi:10.5281/zenodo.7637526 <https://doi.org/10.5281/zenodo.7637526>, <https://github.com/ropensci/pangoling>.

A BibTeX entry for LaTeX users is

  @Manual{,
    title = {{pangoling}: {Access} to large language model predictions in {R}},
    author = {Bruno Nicenboim},
    year = {2025},
    note = {R package version 1.0.1},
    doi = {10.5281/zenodo.7637526},
    url = {https://github.com/ropensci/pangoling},
  }

Wolf T, Debut L, Sanh V, Chaumond J, Delangue C, Moi A, Cistac P, Rault T, Louf R, Funtowicz M, Davison J, Shleifer S, von Platen P, Ma C, Jernite Y, Plu J, Xu C, Le Scao T, Gugger S, Drame M, Lhoest Q, Rush AM (2020). “HuggingFace's Transformers: State-of-the-art Natural Language Processing.” arXiv:1910.03771, <https://arxiv.org/abs/1910.03771>.

A BibTeX entry for LaTeX users is

  @Misc{,
    title = {{HuggingFace's Transformers}: State-of-the-art Natural Language Processing},
    author = {Thomas Wolf and Lysandre Debut and Victor Sanh and Julien Chaumond and Clement Delangue and Anthony Moi and Pierric Cistac and Tim Rault and Rémi Louf and Morgan Funtowicz and Joe Davison and Sam Shleifer and Patrick {von Platen} and Clara Ma and Yacine Jernite and Julien Plu and Canwen Xu and Teven {Le Scao} and Sylvain Gugger and Mariama Drame and Quentin Lhoest and Alexander M. Rush},
    year = {2020},
    eprint = {1910.03771},
    archiveprefix = {arXiv},
    primaryclass = {cs.CL},
    url = {https://arxiv.org/abs/1910.03771},
  }