From Crowdsourcing to Crowdcoding: An Alternative Approach to Annotate Big Data in Communication Research


This paper introduces and evaluates an emerging content analysis approach for big social data communication research—crowdcoding, which outsources the coding tasks to numerous people on the Internet and reaches decisions by majority voting. Specifically, the study ran the crowdcoding algorithm on Amazon Mechanical Turk: 239 crowdworkers were recruited to code a sample of tweets collected during the 2016 U.S. presidential primaries, and two communication researchers systematically evaluated the crowdcoding results. Overall, we conclude that the crowdcoding approach has great potential to assist communication researchers in analyzing big social data.
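The majority-voting step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name and the example labels are hypothetical.

```python
from collections import Counter

def majority_vote(labels):
    """Return the label chosen by most crowdworkers for one item.

    `labels` is a list of annotations (e.g. from several workers on
    Amazon Mechanical Turk) for a single tweet. Ties are broken by
    whichever label Counter encounters first.
    """
    counts = Counter(labels)
    top_label, _ = counts.most_common(1)[0]
    return top_label

# Hypothetical example: three workers annotate each tweet
annotations = {
    "tweet_1": ["positive", "negative", "positive"],
    "tweet_2": ["neutral", "neutral", "positive"],
}
decisions = {tweet: majority_vote(votes) for tweet, votes in annotations.items()}
```

In practice, crowdcoding pipelines often collect an odd number of annotations per item so that a strict majority exists for binary coding decisions.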

Cite the paper:

Guo, L., Mays, K., Sameki, M., & Betke, M. (2017). From crowdsourcing to crowdcoding: An alternative approach to annotate big data in communication research. Paper presented at the ICA annual conference, San Diego, May 2017.