Project Abstract: Huynh / Moore

Final Project Slides:

Final Project Code:

Given the contemporary ubiquity of charged terms such as ‘fake news’ and ‘post-truth politics,’ this project works toward a more useful quantitative metric for measuring the veracity of allegedly factual news stories spread through social media platforms. We acknowledge the inherent politicization of inaccuracies posing as legitimate reporting in a landscape where value judgments too often shape factual exposure rather than the reverse; yet rather than treating this as an excuse to dismiss ideological components from our metric, we see an opportunity to fold them into our process of identification and observation. Further, we speculate that in making ‘fake news’ more tangible we will find additional commonalities, both across the social media users who proliferate these stories and within the content of the stories themselves, that could reinforce the accuracy of an identification metric.

This project therefore asks which characteristics of purportedly factual news stories, and of the users who share them, might serve as reliable indicators of ‘fake news.’ By examining available data from social media sites such as Twitter, alongside previous research and scholarship on online credibility detection, we hope to propose both the most and least effective indicators of inaccurate news stories. Additionally, observations of how stories spread through user activity will inform a metric for estimating any given user’s propensity to proliferate fake news. From this research we hope to produce an algorithm that could begin to evaluate news stories and social media users for these concerns in real time.
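The two-part metric described above (story-level indicators plus a user-level propensity score) could be sketched roughly as follows. This is only a hypothetical illustration: the feature names, weights, and scoring functions are assumptions for the sake of example, not indicators or results from the project itself.

```python
# Hypothetical sketch of a two-part 'fake news' metric:
# (1) a weighted score over story-level indicators, and
# (2) a user propensity score derived from the stories a user shares.
# All feature names and weights below are illustrative assumptions.

def story_score(features, weights):
    """Weighted sum of story indicators, clipped to the range [0, 1]."""
    raw = sum(weights.get(name, 0.0) * value for name, value in features.items())
    return max(0.0, min(1.0, raw))

def user_propensity(shared_story_scores):
    """A user's propensity: the mean score of stories they have shared."""
    if not shared_story_scores:
        return 0.0
    return sum(shared_story_scores) / len(shared_story_scores)

# Example with made-up indicator weights and a made-up story.
weights = {"clickbait_headline": 0.4, "no_named_sources": 0.35, "new_domain": 0.25}
story = {"clickbait_headline": 1, "no_named_sources": 1, "new_domain": 0}

print(story_score(story, weights))
print(user_propensity([0.75, 0.9, 0.3]))
```

A real system would of course learn such weights from labeled data rather than fix them by hand; the sketch only shows how story-level and user-level signals might compose into a single pipeline.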


Week 2: Laying the Groundwork

In the foundational readings for the first week of our course proper, I found both a solid overview and a wonderful progression of ideas that highlight the key problems we seek to address as scholars blending technology and the humanities. Having taken a course somewhat overlapping with INT200 with Professor Sakr last year, I recall how the apparent simplicity of the topic at hand can at first obscure deeper consideration of the politics and pitfalls of ‘algorithmic culture.’ I therefore appreciate that we begin on a trajectory that highlights not just current definitions and historiographical approaches to the topic, but also many of the topical crises that underscore the importance of this academic intersection.

Professor El Abbadi’s “What are Algorithms?” provides a useful overview, especially for a humanities-based student such as myself. The variety of algorithms he lists in evolving order offers real insight into how, far from being abstract theoretical ideas, these programs serve first and foremost as tools to solve real problems in our digital era. Lest we consider algorithms and data to be exclusively contemporary issues, Robertson and Travaglia helpfully remind us that the field stretches back at least two hundred years, demonstrating that issues of data management predate the modern computer by a significant margin. Their article, I believe, crucially detaches the larger importance of data management from twenty-first-century concerns to emphasize the universality of these concepts.

Manovich and Striphas, however, each look forward, digging into the contemporary challenges and politics facing the algorithm; Striphas explicitly cites Amazon’s LGBT book gaffe as an example of the real consequences of human manipulation of the algorithms now so entrenched in the contemporary world. Both also remind their readers that algorithms and data are inextricable from modern culture: these tools are no longer contained as mere tools but permeate the most mundane and the most crucial aspects of our everyday digital lives. Taken together, these readings provide an excellent starting point for our own research and interventions into the nature of the algorithm and culture.