Late one Saturday night / Sunday morning, I’m scrolling through my Facebook feed when I come across instructions for finding out what Facebook’s algorithms think your political views are. Intrigued, I track it down. It’s buried six or seven links deep in your Account Settings: Menu to Account Settings to Ads to Your Information to Your Categories to Review and Manage Your Categories to US Politics.
As I’m scrolling through, I begin to worry. As a Computer Science major, I’m more aware than most of just what kind of data can be collected, and despite that knowledge, I have maintained a pretty steady online presence where Facebook is concerned. There are many reasons for this, not least of which is that I like to read, and Facebook does a good job of giving me delicious content to consume.
But in the past few years, the content I consume has turned more political than it was in my 20s (I’m 30 now). I have over 2,400 different groups liked on my profile, plus thousands upon thousands of viewed articles, comments, and replies. Many would treat Facebook’s interpretation of its users’ political stances like a “What kind of toaster are you? Take our 5-question quiz to find out!” gimmick and take it with a grain of salt. I, on the other hand, know that the sheer volume of information I have interacted with online makes the result statistically meaningful. What Facebook’s algorithms think is more than likely a true representation of my political inclination.
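To put some numbers behind that intuition: here is a toy sketch (my own illustration, not Facebook's actual algorithm) of why thousands of signals make such an inference reliable. Suppose each liked page carried a hypothetical leaning score from -1 (left) to +1 (right); averaging over 2,400 likes shrinks the uncertainty of the estimate to a sliver.

```python
import random
import statistics

def estimate_leaning(scores):
    """Return the average leaning and the standard error of that average."""
    mean = statistics.fmean(scores)
    stderr = statistics.stdev(scores) / len(scores) ** 0.5
    return mean, stderr

random.seed(42)

# A hypothetical moderate user: page scores are noisy (spread 0.5)
# but cluster just right of center (0.05).
likes = [random.gauss(0.05, 0.5) for _ in range(2400)]

mean, stderr = estimate_leaning(likes)
# The standard error is roughly 0.5 / sqrt(2400), about 0.01, so even
# a very noisy per-page signal yields a confident overall label.
```

With 2,400 samples the averaging washes out the per-page noise, which is the same reason a label drawn from years of activity is not a five-question quiz result.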
What if I’m one of those right-wingers that everybody on campus hates? What if people summarize my political views as “God, guns, and gin” or “King James, jacked up trucks, and Uncle Sam”? What if I get pegged at the opposite end and live out the rest of my days trying to figure out how to weave hemp undergarments with locally sourced, gender neutral, cruelty-free knitting needles? I have read news from all sources, everything from Huffington Post to Fox News. Doubts assail me from all sides.
Imagine my relief when I see, in parentheses, the word “Moderate.” That’s like getting a ‘C’ on a final when you’re expecting an ‘F.’ But further down, there’s a little box. It says “Sorry, we don’t have an example of this kind of ad to show you right now.” This entire thing, Facebook and its affiliates, is nothing more than an ad recommendation engine. Two billion people, all “turned on, tuned in, and dropping out.” And not a single recommendation for “Moderate” US politics? There are 170 million people who didn’t vote in the last Presidential election, out of approximately 340 million total living in the US. And there is no sizable subset who might be interested in Moderate political ads? That’s a scary thought.