Publications

Teaching Language Models to Faithfully Express their Uncertainty
Bryan Eikema, Evgenia Ilia, José G. C. de Souza, Chrysoula Zerva, Wilker Aziz, arXiv preprint, 2026

Structure-Conditional Minimum Bayes Risk Decoding
Bryan Eikema, Anna Rutkiewicz, Mario Giulianelli in Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing (EMNLP 2025), 2025

Proceedings of the 2nd Workshop on Uncertainty-Aware NLP (UncertaiNLP 2025)
Bryan Eikema, Raúl Vázquez, Jonathan Berant, Marie-Catherine de Marneffe, Barbara Plank, Artem Shelmanov, Swabha Swayamdipta, Jörg Tiedemann, Chrysoula Zerva, Wilker Aziz, 2025

Findings of the WMT 2024 Shared Task on Chat Translation
Wafaa Mohammed, Sweta Agrawal, Amin Farajian, Vera Cabarrão, Bryan Eikema, Ana C Farinha, José G. C. de Souza in Proceedings of the Ninth Conference on Machine Translation (WMT 2024), 2024

The Effect of Generalisation on the Inadequacy of the Mode
Bryan Eikema in Proceedings of the 1st Workshop on Uncertainty-Aware NLP (UncertaiNLP 2024), 2024

An Approximate Sampler for Energy-based Models with Divergence Diagnostics
Bryan Eikema, Germán Kruszewski, Cristopher R Dance, Hady Elsahar, Marc Dymetman in Transactions on Machine Learning Research (TMLR), 2022

Sampling-Based Approximations to Minimum Bayes Risk Decoding for Neural Machine Translation
Bryan Eikema and Wilker Aziz in Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), 2022

Is MAP Decoding All You Need? The Inadequacy of the Mode in Neural Machine Translation
Bryan Eikema and Wilker Aziz in Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020), 2020. Best Paper Award.

Auto-Encoding Variational Neural Machine Translation
Bryan Eikema and Wilker Aziz in Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP 2019), 2019