
Proceedings of ISP RAS, 2015 Volume 27, Issue 3, Pages 351–364 (Mi tisp157)


A crowdsourcing engine for mechanized labor

D. A. Ustalov

Institute of Mathematics and Mechanics, Ural Branch of the Russian Academy of Sciences

Abstract: Microtask crowdsourcing implies decomposing a difficult problem into smaller pieces. These pieces are submitted as tasks to a special human-computer platform such as CrowdFlower or Amazon Mechanical Turk, where they are solved by human workers motivated by either micropayments or altruism. Examples of successful crowdsourcing applications are food nutrition estimation, natural language processing, criminal invasion detection, and other so-called “AI-hard” problems. However, these platforms are proprietary and require additional software for maintaining output quality. This paper presents the design, architecture, and implementation details of an open source engine for executing microtask-based crowdsourcing annotation stages. The engine controls the entire crowdsourcing process, including such elements as task allocation, worker ranking, answer aggregation, agreement assessment, and other means of quality control. The present version of the software is implemented as a three-tier system composed of the application level for the end-user worker interface, the engine level for the Web service controlling the annotation process, and the database level for data persistence. A RESTful API is used for interacting with the engine. The methods for controlling the annotation are implemented as processors that are initialized using the dependency injection mechanism to achieve loose coupling. The functionality of the engine has been evaluated both by unit tests and by replicating a semantic similarity assessment experiment.
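The processor-based design described in the abstract can be illustrated with a minimal sketch. The code below is not the engine's actual API; every class and method name is an assumption made for illustration only. It shows the general idea of dependency injection: interchangeable answer-aggregation and worker-ranking strategies are passed into the engine object rather than hard-coded, keeping them loosely coupled from the control flow.

    # Hypothetical sketch of processor-style dependency injection (assumed names,
    # not the engine's actual API): aggregation and ranking strategies are injected.
    from collections import Counter
    from typing import Dict, Protocol


    class AggregationProcessor(Protocol):
        def aggregate(self, answers: Dict[str, str]) -> str: ...


    class RankingProcessor(Protocol):
        def rank(self, answers: Dict[str, str], truth: str) -> Dict[str, float]: ...


    class MajorityVoting:
        """Aggregates per-task answers by taking the most frequent one."""
        def aggregate(self, answers: Dict[str, str]) -> str:
            return Counter(answers.values()).most_common(1)[0][0]


    class AgreementRanking:
        """Scores each worker by agreement with the aggregated answer."""
        def rank(self, answers: Dict[str, str], truth: str) -> Dict[str, float]:
            return {worker: float(answer == truth) for worker, answer in answers.items()}


    class Engine:
        """Processors are injected at construction time, so new quality-control
        strategies can be swapped in without changing the engine itself."""
        def __init__(self, aggregator: AggregationProcessor, ranker: RankingProcessor):
            self.aggregator = aggregator
            self.ranker = ranker

        def process_task(self, answers: Dict[str, str]):
            answer = self.aggregator.aggregate(answers)
            scores = self.ranker.rank(answers, answer)
            return answer, scores


    if __name__ == "__main__":
        engine = Engine(MajorityVoting(), AgreementRanking())
        answers = {"worker1": "similar", "worker2": "similar", "worker3": "not similar"}
        print(engine.process_task(answers))

In this sketch, running the example aggregates three workers' answers by majority vote and ranks each worker by agreement with the aggregated answer, mirroring (in simplified form) the answer aggregation and worker ranking elements mentioned above.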

Keywords: crowdsourcing engine, mechanized labor, human-assisted computation, task allocation, worker ranking, answer aggregation.

Language: English

DOI: 10.15514/ISPRAS-2015-27(3)-25


