Title
An open source tool for crowd-sourcing the manual annotation of texts
Date Issued
01 January 2014
Access level
open access
Resource Type
conference paper
Author(s)
Drury B.
Cardoso P.C.F.
Valejo A.
Pereira F.
Lopes A.d.A.
University of São Paulo
Publisher(s)
Springer Verlag
Abstract
Manually annotated data underpins a large number of tasks in natural language processing, serving as either evaluation or training data. Annotating large amounts of data with dedicated full-time annotators can be expensive, and may be beyond the budgets of many research projects. An alternative is crowd-sourcing, in which the annotation work is split among many part-time annotators. This paper presents a freely available open-source platform for crowd-sourcing manual annotation tasks, and describes its application to annotating causative relations.
Start page
268
End page
273
Volume
8775
Language
English
OECD Knowledge area
Bioinformatics
Scopus EID
2-s2.0-84908583889
Source
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Resource of which it is part
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN of the container
0302-9743
ISBN of the container
978-3-319-09760-2
Conference
11th International Conference on Computational Processing of Portuguese (PROPOR 2014), São Carlos/SP, 6–8 October 2014
Sources of information: Directorio de Producción Científica, Scopus