Publication Detail

Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study

Original Title

Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study

English Title

Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study

Language

en

Original Abstract

Crowdsourcing-based studies in the field of Quality of Experience usually focus on a single result: the impact of the tested conditions on the final recipient of the multimedia content (the viewer). However, such studies often gather much more information than just the quality scores. In this paper, we analyze the crowdworkers’ feedback based on the comments they left after finishing the given task. Based on these analyses, we make recommendations for improving the tasks.

English Abstract

Crowdsourcing-based studies in the field of Quality of Experience usually focus on a single result: the impact of the tested conditions on the final recipient of the multimedia content (the viewer). However, such studies often gather much more information than just the quality scores. In this paper, we analyze the crowdworkers’ feedback based on the comments they left after finishing the given task. Based on these analyses, we make recommendations for improving the tasks.

BibTeX

@inproceedings{BUT138801,
  author="Ondřej {Zach}",
  title="Listen to Your Crowd - Analysis of User Comments of a Crowdsourced QoE Study",
  annote="Crowdsourcing-based studies in the field of Quality
of Experience usually focus on a single result: the impact
of the tested conditions on the final recipient of the multimedia
content (the viewer). However, such studies often gather much
more information than just the quality scores. In this paper,
we analyze the crowdworkers’ feedback based on the comments they
left after finishing the given task. Based on these analyses,
we make recommendations for improving the tasks.",
  address="VUT v Brně",
  booktitle="Proceedings of IEEE Student Branch Conference Mikulov 2017",
  chapter="138801",
  howpublished="online",
  institution="VUT v Brně",
  year="2017",
  month="aug",
  pages="71--74",
  publisher="VUT v Brně",
  type="conference paper"
}