Matching crowd workers to suitable tasks is highly desirable as it can enhance task performance, reduce costs for requesters, and increase worker satisfaction. In this paper, we propose a method that considers workers’ cognitive ability to predict their suitability for a wide range of crowdsourcing tasks. We measure cognitive ability via fast-paced online cognitive tests with a combined average duration of 6.2 minutes. We then demonstrate that our proposed method can effectively assign or recommend workers to five popular crowd tasks: Classification, Counting, Proofreading, Sentiment Analysis, and Transcription. Using our approach, we demonstrate a significant improvement in expected overall task accuracy. While previous methods require access to worker history or demographics, our work offers a quick and accurate way to determine which workers are more suitable for which tasks.
Hettiachchi Danula, van Berkel Niels, Hosio Simo, Kostakos Vassilis, Goncalves Jorge
A4 Article in conference proceedings
Place of publication:
Human-Computer Interaction – INTERACT 2019: 17th IFIP TC 13 International Conference, Paphos, Cyprus, September 2–6, 2019, Proceedings, Part I
Hettiachchi D., van Berkel N., Hosio S., Kostakos V., Goncalves J. (2019) Effect of Cognitive Abilities on Crowdsourcing Task Performance. In: Lamas D., Loizides F., Nacke L., Petrie H., Winckler M., Zaphiris P. (eds) Human-Computer Interaction – INTERACT 2019. INTERACT 2019. Lecture Notes in Computer Science, vol 11746. Springer, Cham. https://doi.org/10.1007/978-3-030-29381-9_28