Jacob Rohde and Denis Wu received a top paper award in the Communication Theory and Methodology Division at the 2016 AEJMC Convention. Their paper was titled “Agreement between Humans and Machines? — A Reliability Check among Computational Content Analysis Programs.”
As the volume of data generated by social networking sites grows, so does the need for computer aids in content analysis research. The paper outlined the growing use of supervised machine learning for document topic classification and sentiment analysis. A series of tweets was collected, coded by humans, and subsequently fed into six popular computer applications: Aylien, DiscoverText, MeaningCloud, Semantria, Sentiment 140, and SentiStrength. Reliability results between the human and machine coders were presented as a matrix of Krippendorff’s alpha and percentage agreement values. Ultimately, the paper showed that, while computer-aided coding may lessen the burden on researchers and speed up the coding of social media content, these programs exhibited low reliability when analyzing political content.
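The paper’s own analysis is not reproduced here, but as a rough illustration of the two reliability measures it reports, the sketch below shows one way to compute percentage agreement and Krippendorff’s alpha (nominal data, two coders, no missing values) between a human coder’s labels and a machine coder’s labels. The label lists and category names are hypothetical, not the study’s data.

```python
from collections import Counter
from itertools import permutations

def percent_agreement(labels_a, labels_b):
    """Share of units on which the two coders assigned the same label."""
    return sum(a == b for a, b in zip(labels_a, labels_b)) / len(labels_a)

def krippendorff_alpha_nominal(labels_a, labels_b):
    """Krippendorff's alpha for two coders, nominal categories, no missing data."""
    # Coincidence matrix: each unit contributes both ordered pairs (a, b) and (b, a).
    coincidences = Counter()
    for a, b in zip(labels_a, labels_b):
        coincidences[(a, b)] += 1
        coincidences[(b, a)] += 1
    n = 2 * len(labels_a)  # total number of pairable values
    # Marginal totals n_c for each category.
    marginals = Counter()
    for (c, _), count in coincidences.items():
        marginals[c] += count
    # For nominal data, disagreement is simply any off-diagonal coincidence.
    observed_disagreement = sum(v for (c, k), v in coincidences.items() if c != k)
    expected_disagreement = sum(marginals[c] * marginals[k]
                                for c, k in permutations(marginals, 2))
    return 1 - (n - 1) * observed_disagreement / expected_disagreement

# Hypothetical sentiment labels for six tweets from a human and a machine coder.
human   = ["pos", "neg", "neu", "pos", "neg", "pos"]
machine = ["pos", "neg", "pos", "pos", "neu", "neg"]
print(percent_agreement(human, machine))        # 0.5
print(krippendorff_alpha_nominal(human, machine))
```

In practice one would pair each machine application’s output against the human-coded labels in turn, yielding the kind of reliability matrix the paper describes.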