Google, Facebook and Information Junk Food

Post from the 2011 Conference in Boston


by Lucas Bernays Held, The Wallace Foundation

For a long time, experts blamed the problem of obesity on personal factors – perhaps a poor self-image, or lack of control. But in recent years scholars studying the problem increasingly point to environmental factors that influence people’s choices.

A prime example is the fast-food restaurant, where a large Coke costs only 10 cents more than a medium, making it a financially sound decision (given the lower cost per fluid ounce) to order the large. In other words, it’s still possible to make a healthy choice (ordering the small Coke), but the external cues and incentives set up by the food industry are nudging us in the wrong direction.
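(To make the arithmetic concrete, with purely hypothetical prices: if a 21-ounce medium costs $1.89 and a 32-ounce large costs $1.99, the medium works out to about 9 cents per ounce and the large to about 6 cents per ounce. Judged strictly on unit price, the large is the “smarter” buy; judged on health, it isn’t.)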

(The power of external cues can work in a positive direction, too: witness the role of excise taxes on cigarettes in reducing smoking, or the rise in 401(k) enrollment when employees must check a box to opt out, rather than check a box to opt in. These issues are explored in Nudge by Richard Thaler and Cass Sunstein, though their conclusions have not gone uncontested by critics, including members of the House of Lords.)

We’ve all thought of the World Wide Web as a blank slate, free of any such cues.

But in his address at the Fall 2011 Communications Network Conference in Boston on Sept. 22, MoveOn.org board president Eli Pariser warned us that new steps taken by Google, Facebook and other technology firms may actually be nudging us toward becoming consumers of “information junk food.”

Here’s what the author of the book The Filter Bubble: What the Internet Is Hiding from You told a surprised audience:

-That if you take two people who use search engines regularly and have them do a Google search on the same term, about 60 percent of the search results will differ. What’s happening, Pariser said, is that Google shapes its results according to what you’ve clicked on in the past. He offered the example of two friends who searched the term “Egypt”: one got news of the protests; the other got information on travel. The algorithm had automatically aligned each user’s search results with his or her prior behavior.

-That on Facebook, he was surprised one day to find that all of his conservative friends had disappeared from his news feed. Why? Because he had been clicking on news from liberal friends more frequently.

What’s behind this, Pariser explained, is the goal of both technology companies to offer content that is relevant to users, with relevance defined by past behavior. He quoted Google Executive Chairman Eric Schmidt as saying: “It will be very hard for people to watch or consume something that has not in some sense been tailored to them.”
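To see how simple the underlying mechanism can be, consider a minimal sketch in Python, with invented names and data; the real systems weigh many proprietary signals this toy ignores. It reduces “relevance defined by past behavior” to its crudest form: sort candidate stories by how often the user has clicked on their topic before.

```python
from collections import Counter

def personalize(click_history, candidates):
    """Toy ranker: 'relevance' is nothing but past click behavior.

    click_history: topic tags the user has clicked on before.
    candidates: (headline, topic) pairs returned for a query.
    """
    clicks = Counter(click_history)  # a Counter returns 0 for unseen topics
    # Stories on frequently clicked topics float to the top.
    return sorted(candidates, key=lambda story: clicks[story[1]], reverse=True)

# Two users issue the same query, "Egypt", over identical candidates.
stories = [
    ("Protests fill Tahrir Square", "politics"),
    ("Top 10 Nile cruises", "travel"),
]

print(personalize(["politics", "politics", "travel"], stories)[0][0])
print(personalize(["travel", "travel"], stories)[0][0])
# The first user sees the protests first; the second sees the cruises first.
```

The same query yields two different front pages, and nothing in the output signals that any filtering took place, which is exactly the invisibility Pariser goes on to describe.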

In many ways, this is just an extension of what people have been doing for decades. Some folks who pick up a newspaper will go first to the sports section, others to the local news, and some to the obituaries.

The difference, though, he noted, is that this is done automatically for you. It’s “invisible and passive.” In effect, Pariser argued, you have been placed in a “filter bubble” – and you have no way of knowing “who Google or Yahoo! think you are.”

This kind of automated narrowing of perspectives could lead to big problems:

-Personal biases, or distortions, as he put it, that get reinforced. People experience pleasure when their pre-existing opinions are confirmed, and irritation when those opinions are contradicted. So instead of being exposed to facts that might change your opinions, you’ll be exposed mainly to items that reinforce them.

-The “like button” problem. These buttons are important, as they determine which items rise to the top of news feeds. But “like” has specific connotations, such as pleasure or amusement. That may make it less probable that people will hit the “like” button for stories that are important but not amusing, like the revolution in Iraq, or bank defaults. Those stories could wind up in the background (a dynamic sketched in code after this list).

-The psychological equivalent of obesity. If you’re only fed a diet of what others find amusing, and that is likely to be aligned with your current interests, you won’t get the “nutrition” that new perspectives could provide. As he put it, “you could end up in a world surrounded by information junk food.”
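Here is a deliberately crude sketch, again hypothetical Python rather than any real platform’s algorithm, of what happens when engagement is a feed’s only ranking signal:

```python
def rank_by_likes(feed):
    """Toy feed ranker that uses engagement (likes) as its only signal."""
    return sorted(feed, key=lambda story: story[1], reverse=True)

feed = [
    ("Cat reenacts famous movie scenes", 3200),   # amusing, widely liked
    ("Bank defaults ripple through Europe", 45),  # important, rarely liked
    ("Friend's vacation photos", 880),
]

for headline, likes in rank_by_likes(feed):
    print(f"{likes:>5}  {headline}")
```

By construction, amusing items win and the grim-but-important sink, no matter how consequential they are.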

Pariser didn’t mention that this shift toward “personalization” of the Internet is really part of the wider shift of power from the few to the many documented by Clay Shirky. Four decades ago, there were three nightly TV newscasts, which you could supplement with your local newspaper and several national ones. A relatively small number of editors at least made you aware of what was, in their usually thoughtful but hardly infallible judgment (and I can say this as a former newspaper editor), relevant, important and interesting. You may have skimmed the news from overseas, but at least you knew it existed.

To my mind, the strongest part of Pariser’s argument is that this “filter bubble” (a clever metaphor) is not visible. In an age when the Pew Research Center for the People & the Press charts worrisomely low levels of trust in institutions, that argument should resonate with those on both the right and the left. (If there was any doubt about the findings, the Center’s latest report was called Distrust, Discontent, Anger and Partisan Rancor.)

But it’s also hard not to feel that Pariser’s target is really human nature. The argument that news is like spinach – and that we need a kind of Web-based “fairness doctrine” – seems like a tough one to make.

A different argument was advanced by actress (“Ugly Betty”) and activist America Ferrera, who recounted how she helped build a school in Mali and helped rescue a young woman in Kolkata, profiled by Nicholas Kristof, who would otherwise have been sold to a brothel and denied the chance of schooling. In Ferrera’s view, it’s all about telling a compelling story.

“You can’t expect that a PSA is going to work in this day and age,” she said. “How you tell the story is as important as the story itself. Creative thinking in the storytelling is really essential to reaching an audience.”

In other words, all attention is ultimately voluntary, which is why Dickens, through compelling storytelling, did as much as anyone to shine a light on the appalling conditions of Victorian workhouses. And that’s especially true in a world that has moved from three nightly newscasts plus a local paper to hundreds of channels, thousands of online newspapers, and thousands more bloggers.

But Ferrera also used posts on her Facebook page and Tweets to help tell that young woman’s story. And, in fact, 45 people hit the “like” button on the Kristof story, despite Pariser’s worry about the semantic implications of the word.

In a separate and fascinating Communications Network session, Nasser Weddady, outreach coordinator for Hands Across the Mideast Support Alliance, and Mona Eltahawy, a commentator, both emphasized the role of social media in the protests that led to the overthrow of the Mubarak regime in Egypt. As Eltahawy said, “We found each other online. A revolution needs that kind of criss-crossing of people.”

So, where does this leave us?

Pariser has a point that “personalization” of online search could amplify our worst tendencies to roll around in the mud puddle of our own prejudices – in the way that the pricing of the large Coke just a hair above the medium seduces us into gluttony.

It’s also the case that in an age of institutional distrust, people put more stock in information from other people they know, usually those who are like themselves. That has a big downside. But it also opens up the potential that a woman of conscience like Ferrera, who has beauty, fame and fluency in digital media, is showing us how to tap.

“We have met the enemy and he is us,” said Pogo.

In the battle to improve the conditions of thousands of oppressed and disadvantaged people, our own fragile humanity may be our worst feature, but also our best ally.


1 Comment

  1. Deena Leventer, 09-29-2011

    Ah, but there is something even more insidious hiding in Eli Pariser’s revelations about personalization of the Internet. In the past, the reality created for me by the biases of the National Review or The New Republic or Pravda was visible to, and could be discussed with, others. What I think I heard him say is that no one else sees the personalized reality that I am presented with by Google, meaning that we have no common ground for accepting, rejecting or critiquing it. Now that’s scary.
