There are thousands of news stories every day. More than any news publication or news hour could possibly cover.
Car crashes, robberies, heroic feats, international skirmishes, sporting events, political battles, traffic jams, fires, weather reports, human-interest stories. The list is endless.
Every day, editors curate which stories are worthy of front-page coverage, which should be included in the news hour, or even which are worth investigating in the first place.
We like to believe that the news is curated based on what we need to know. But we know that’s not quite true.
Do we really need to know about the fatal car accident that happened six hours ago or what Kim Kardashian did earlier today?
Instead, the news is often curated based on what draws interest, what brings in viewers and readers, what builds an audience for advertisers.
Most of us understand that the news is designed to grab our attention.
What grabs our attention?
In Trust Me, I’m Lying, Ryan Holiday summarized the findings of a research study that looked at the attributes of the most-shared articles from The New York Times. The stronger the positive or negative emotions in an article, the more often it was shared.
We respond strongly to emotions.
News organizations know this. It’s why the news hour is filled with stories that evoke emotion. Car crashes, robberies, and fires evoke fear. Heroic stories evoke hope. Both move us to keep watching, to keep reading.
Emotions arouse us. The stronger the arousal, the more we watch, the more likely we are to share the experience with someone else, and the larger the news audience grows.
Headlines Are Designed to Get Us to Read, to Click
Upworthy is one of the fastest-growing internet sites of all time. They grew to 8.7 million monthly unique visitors in their first six months.
They did it through what’s now commonly referred to as link-bait: compelling headlines designed to evoke emotion.
See: How Upworthy Used Emotional Data to Become The Fastest Growing Media Site Of All Time
In the copywriting world, there are “tricks of the trade” when it comes to headline writing, including:
- Evoke emotion.
- Evoke curiosity.
- Surprise people.
News organizations don’t just select stories that evoke emotion; they write headlines meant to do the same thing.
They know that if they get us to react emotionally, we are more likely to read.
This happens every day. With every news publication.
Are They Manipulating Us?
The most relevant definition for “manipulate” from the Oxford English Dictionary is:
To manage, control, or influence in a subtle, devious, or underhand manner.
I think you could argue that, yes, news organizations are manipulating us to click on their headlines, to read their articles, and to remain glued to our televisions.
They do this when they tease us right before a commercial break. They do this when they overpromise in a headline. They do this when they overreport uncommon events.
But is this wrong?
That’s a tough question to answer. Most of us don’t get angry when we hear about this. We might get frustrated. We might choose to ignore the news. But we don’t vilify the news organizations.
Instead, we expect this behavior. The more conscientious among us choose the news sources that best reflect our interests and our values. And we ignore the rest.
We understand the game being played, and we choose how we want to participate.
So What’s Different About the Facebook Social Contagion Study?
I’ve been asking myself this question all week. A lot of people are upset about this study. I’m not questioning or judging those reactions. Emotions are a good indicator that there’s something important to investigate. And this study definitely struck an emotional chord.
But I can’t quite figure out why. What makes it different from what these news organizations do every day?
Is it that people don’t know that Facebook has always filtered the stories you see in your newsfeed?
Just as news organizations have too many news stories to publish and editors choose what to print, your friends create too many posts and Facebook uses an algorithm to show you the “most relevant” ones.
Newspapers want you to click, read, and share. Facebook wants you to like, comment, and share. Doing so not only encourages you to use Facebook more often, it encourages others to visit Facebook more often.
In this context, it seems only natural that Facebook would run experiments to optimize this algorithm. Including ones that evoke emotion. Especially ones that evoke emotion.
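To make that concrete, here is a minimal sketch, in Python, of what “rank the posts, then experiment on the ranking” could look like. Everything in it is hypothetical: the field names, the weights, and the sentiment tweak are invented for illustration, not taken from Facebook’s actual system, which is proprietary and far more complex.

```python
# A purely hypothetical sketch of newsfeed filtering. The fields, weights,
# and the sentiment tweak are invented; this is not Facebook's algorithm.

def relevance_score(post):
    """Predicted engagement: the feed's equivalent of an editor's news judgment."""
    return (2.0 * post["likes"]
            + 3.0 * post["comments"]
            + 1.5 * post["shares"])

def build_feed(candidate_posts, in_test_group=False, limit=20):
    """Show only the top-scoring posts, optionally with an experimental tweak."""
    scored = []
    for post in candidate_posts:
        score = relevance_score(post)
        # The kind of change the contagion study made: downweight posts with
        # negative sentiment for users assigned to the test group.
        if in_test_group and post["sentiment"] < 0:
            score *= 0.5
        scored.append((score, post))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored[:limit]]
```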
Or is it that people don’t know that internet companies are constantly running experiments to better understand user behavior?
Most internet companies are running countless experiments at any given time. Google tests search algorithms to get you to click on more search results. Amazon is known for its extensive testing designed to get you to buy more. Odds are, if you use the internet, you are participating in dozens of experiments every day.
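If the surprise is the experimentation itself, it may help to see how mundane the mechanics usually are. Below is a sketch of a standard assignment technique; the function name and experiment name are made up, but hashing a user ID into a stable bucket is a common industry pattern.

```python
import hashlib

# Hash the user ID together with the experiment name to give every user a
# stable, effectively random bucket. There is no opt-in, and nothing visible
# changes unless the variant itself is visible.
def assign_variant(user_id, experiment, variants=("control", "treatment")):
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same bucket for a given experiment,
# so their experience stays consistent across visits:
print(assign_variant("user-42", "emotional-headline-test"))
```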
Or maybe it’s the fault of bad journalism. There was a lot of misreporting on this study.
Many articles made it sound like Facebook is manipulating the content of your friends’ posts. They aren’t. They are merely selecting which posts to show you.
Many articles claimed that the study wasn’t approved by an Institutional Review Board. It was. Then they shifted their focus to the fact that the IRB approval covered only the data set, not the data collection itself.
But all of this confuses me.
I can understand why academics are worried about the IRB process. And PNAS absolutely should be worried about the IRB process. And I even question whether this research should have been published in the first place. Is it really generalizable research?
But I don’t understand why people who work in tech are up in arms about this data collection. We all collect this type of data.
Should we be collecting this type of data? Are there ethical concerns with collecting it? I don’t know. These are good questions to ask. Should news organizations manipulate our emotions to get us to click on headlines and read their articles? Let’s have this conversation. I’m all for it.
But this isn’t the conversation we are having. Instead, we are vilifying Facebook for doing what everyone in the industry does. I suggest we stop doing that and start talking about what we, as an industry, should be collectively doing and not doing.