Publication detail

Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study

ZACH, O.

Original Title

Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study

English Title

Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study

Type

conference paper

Language

en

Original Abstract

Crowdsourcing-based studies in the field of Quality of Experience (QoE) usually focus on only one result: the impact of the tested conditions on the final recipient of the multimedia content (the viewer). However, much more information than just the quality scores is often gathered. In this paper, we analyze the crowdworkers’ feedback based on the comments they left after finishing the given task. Based on these analyses, we make recommendations for improving the tasks.

English Abstract

Crowdsourcing-based studies in the field of Quality of Experience (QoE) usually focus on only one result: the impact of the tested conditions on the final recipient of the multimedia content (the viewer). However, much more information than just the quality scores is often gathered. In this paper, we analyze the crowdworkers’ feedback based on the comments they left after finishing the given task. Based on these analyses, we make recommendations for improving the tasks.

Keywords

crowdsourcing, comments, quality of experience

Released

28.08.2017

Publisher

VUT v Brně

Location

Brno

ISBN

978-80-214-5526-9

Book

Proceedings of IEEE Student Branch Conference Mikulov 2017

Pages from

71

Pages to

74

Pages count

4

BibTeX


@inproceedings{BUT138801,
  author="Ondřej {Zach}",
  title="Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study",
  annote="Crowdsourcing-based studies in the field of Quality of Experience (QoE)
usually focus on only one result: the impact of the tested conditions
on the final recipient of the multimedia content (the viewer). However,
much more information than just the quality scores is often gathered.
In this paper, we analyze the crowdworkers’ feedback based on the
comments they left after finishing the given task. Based on these
analyses, we make recommendations for improving the tasks.",
  address="Brno",
  booktitle="Proceedings of IEEE Student Branch Conference Mikulov 2017",
  chapter="138801",
  howpublished="online",
  institution="VUT v Brně",
  isbn="978-80-214-5526-9",
  year="2017",
  month="August",
  pages="71--74",
  publisher="VUT v Brně",
  type="conference paper"
}