Title
deepQuest-py: Large and Distilled Models for Quality Estimation
Date Issued
01 January 2021
Access level
metadata-only access
Resource Type
conference paper
Author(s)
University of Sheffield
Publisher(s)
Association for Computational Linguistics (ACL)
Abstract
We introduce deepQuest-py, a framework for training and evaluating large and lightweight models for Quality Estimation (QE). deepQuest-py provides access to (1) state-of-the-art models based on pre-trained Transformers for sentence-level and word-level QE; (2) lightweight and efficient sentence-level models implemented via knowledge distillation; and (3) a web interface for testing models and visualising their predictions. deepQuest-py is available at https://github.com/sheffieldnlp/deepQuest-py under a CC BY-NC-SA licence.
Start page
382
End page
389
Language
English
OECD Knowledge area
Information Sciences
Systems and Communications Engineering
Communication Media, Socio-cultural Communication
Scopus EID
2-s2.0-85127267431
Resource of which it is part
EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
ISBN of the container
978-195591711-7
Conference
2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021
Sponsor(s)
This work was supported by funding from the Bergamot project (EU H2020 Grant No. 825303).
Sources of information
Directorio de Producción Científica
Scopus