Facebook's News Feed uses algorithms to choose which stories we see and in what order, based on who posted them, who among our "friends" reacted to them, and how much they mesh with the "preferences" we signal by our own clicks. Is this what we want?
Technology columnist Farhad Manjoo's "Social Insecurity: Can Facebook Fix Its Own Worst Bug?" (New York Times Magazine, 4/30/17) begins with this kicker:
Mark Zuckerberg now acknowledges the dangerous side of the social
revolution he helped start. But is the most powerful tool for connection
in human history capable of adapting to the world it created?
Manjoo interviewed Mark Zuckerberg at Facebook's headquarters in early January 2017 and again a month later. The article's primary concern is the effect of Facebook on national and global politics—especially its disruptive "echo chamber" distortion of public discourse on all sides during and since the November 2016 elections.
[Facebook has] become the largest
and most influential entity in the news business.... It is
also the most powerful mobilizing force in politics, and it is fast replacing
television as the most consequential entertainment medium....
But over the course of 2016, Facebook’s
gargantuan influence became its biggest liability. During the U.S. election,
propagandists...used the service to turn fake stories into viral sensations....
And fake news was only part of a larger conundrum. With its huge reach,
Facebook has begun to act as the great disseminator of the larger cloud of
misinformation and half-truths swirling about the rest of media. It sucks up
lies from cable news and Twitter, then precisely targets each lie to the
partisan bubble most receptive to it. (40)
The News Feed
What most catches my attention as a retired reference librarian is Manjoo's discussion of Facebook's News Feed, the key algorithmic engine defining each user's unique experience of information access. It's an easy guess that most FB users don't even know about this "default ON" feature—any more than they know that Google's default ON is its personalized search.
Personalization...may influence your search results...based on sites you
have showed a past interest in through your browser or search history,
[and] whether you are signed in or out of a Google account that houses more
extensive information about yourself (including on your Google+
profile).
It's deeper than simple date, time, location. It tries to get
at the heart of who you are, the kinds of sources you gravitate to, and
the content that will most satisfy you as a searcher.
[Note: To turn off personalized search in Google, log into your Google account, go to Search Settings at https://www.google.com/preferences, scroll down to Private Results, and select "Do not use private results."]
Here's how Manjoo describes the analogous personalization mechanics of News Feed:
Every time you open Facebook, [News Feed] hunts through
the network, collecting every post from every connection—information that,
for most Facebook users, would be too overwhelming to process themselves. Then
it weighs the merits of each post before presenting you with a feed sorted in order
of importance: a hyperpersonalized front page designed just for you....
For the typical user...News Feed is computing the relative merits of about 2,000 potential posts in
your network every time you open the app. In sorting these posts, Facebook does
not optimize for any single metric: not clicks or reading time or likes. (43)
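To make Manjoo's description concrete, here is a purely illustrative sketch of that kind of ranking step. The signals and weights below are invented for demonstration; this is not Facebook's actual formula, which combines many more signals and is not public.

```python
# Toy feed-ranking sketch. Signal names and weights are invented;
# this is NOT Facebook's actual News Feed algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float  # how often the user interacts with this poster (0-1)
    hours_old: float        # age of the post in hours
    engagement: int         # reactions/comments from the user's network

def score(post: Post) -> float:
    """Combine several signals into a single relevance score."""
    recency = 1.0 / (1.0 + post.hours_old)  # newer posts score higher
    return (0.5 * post.author_affinity
            + 0.3 * recency
            + 0.2 * min(post.engagement / 100.0, 1.0))

def build_feed(posts: list[Post], limit: int = 20) -> list[Post]:
    """Sort the ~2,000 candidate posts by descending score; show the top few."""
    return sorted(posts, key=score, reverse=True)[:limit]
```

The point of the sketch is simply that every post gets a number and the feed is the sorted result: whatever signals go into `score`, the user only ever sees the ordering it produces.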
Zuckerberg's aim for Facebook is to do global news distribution run by machines, ruled by engineering rather than editing, user preference rather than public good.
The people who work on News Feed aren’t making decisions
that turn on fuzzy human ideas like ethics, judgment, intuition or seniority.
They are concerned only with quantifiable outcomes about people’s actions on
the site. That data, at Facebook, is the only real truth....
But it is precisely this ideal that conflicts with attempts
to wrangle the feed in the way press critics have called for. The whole purpose
of editorial guidelines and ethics is often to suppress individual instincts in
favor of some larger social goal. Facebook finds it very hard to suppress
anything that its users’ actions say they want. (43)
[Note: You cannot turn News Feed's algorithms off, but you can—within some annoying limits—narrow their operations. See below for more details.]
But is News Feed "free access to information"?
Technically it is...sort of. Facebook users make the choices that feed the algorithms that drive what information the users see. Unfortunately, few of us realize that clicking on something—anything—triggers a cascading sequence of other machine-based choices. Choices defined by Facebook's aims, not our own.
In a sense Facebook's aims do match those of most contemporary users. We all want to have "news" and "information" and "opinion" and "entertainment" from people we "agree with." If I go to a library or bookstore, I decide what I want to read, right? If I don't want to see opposing views, I don't read them. Just like I pick PBS or FOX News or whatever, based on which slant on reality I want to have reinforced by broadcasters.
So Facebook's News Feed just automates the process of making sure I see mostly the web-based stuff I want to see. That's its whole point, right?
As a professional librarian I have to agree...unhappily. When I was still a public librarian, I always wanted to take those three-plus shelves of Ann Coulter books into the courtyard and burn them. But I didn't. It wasn't my choice what my customers read.
Given that ethic, I also have to accept Zuckerberg's business model: "Facebook finds it very hard to suppress anything that its users’ actions say they want."
Okay, but is News Feed an "information service"?
I don't think so. At least not in the library professional's sense of helping users find and evaluate authentic information. As with most other popular "news media," Facebook is basically in the entertainment business, not the business of keeping the public well informed.
Facebook is primarily an advertising agency. It gets its revenue by showing us what we want to watch, so that we keep watching...and see ads. And here's the key point: Facebook's vast market share (1.94 billion monthly users as of the first quarter of 2017) hinges on its mastery of user data mining—and especially its sheer genius in persuading us to give it the data that tells it what we want.
This isn't about what they
ask us to tell them. Every click, every search is a data point, allowing the machines to index, compile, and analyze our choices, and to redirect us to stuff we imagine we want to see...with ads attached that match closely the analysis of our unique, ever-changing data sets.
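As a toy illustration of that indexing and compiling, here is a minimal sketch of how click events might accumulate into a targeting profile. The topic names and logic are invented; no real platform's code is this simple.

```python
# Purely illustrative: clicks accumulating into an interest profile.
# Topic names and logic are invented, not any real platform's code.

from collections import Counter

def update_profile(profile: Counter, click_topics: list[str]) -> Counter:
    """Each click on a topic-tagged story increments that topic's count."""
    profile.update(click_topics)
    return profile

def top_interests(profile: Counter, n: int = 3) -> list[str]:
    """The topics a targeting system would favor for this user."""
    return [topic for topic, _ in profile.most_common(n)]

# Example: three political clicks and one gardening click...
profile = Counter()
update_profile(profile, ["politics", "politics", "gardening", "politics"])
```

Every click nudges the counts, the counts steer what is shown next, and what is shown next shapes the following clicks: the feedback loop behind the "filter bubble."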
By the way, all those intriguing "personality profile" quizzes on Facebook? Whenever we do any of these, we are giving Facebook and their advertisers—and who knows who else—a vast store of deep, intimate psychological profiling data about ourselves. Even better for precisely targeting machine choices of both stories and ads.
A change of heart at Facebook?
Manjoo shares a concern over the way algorithm-driven filtering can create blind spots in public discourse.
Scholars and critics have been warning of the solipsistic possibilities of algorithmic
news at least since 2001, when the constitutional-law professor Cass R.
Sunstein warned, in his book Republic.com, of the urgent risks posed to
democracy “by any situation in which thousands or perhaps millions or even tens
of millions of people are mainly listening to louder echoes of their own
voices....
In 2011, the digital activist and entrepreneur Eli
Pariser, looking at similar issues, gave this phenomenon a memorable name in
the title of his own book: The Filter Bubble. (41)
Zuckerberg's own level of concern has shifted since the 2016 American elections. Manjoo tells of the manifesto Zuckerberg wrote in 2012, as part of Facebook’s application to sell its stock to the public. It explained Facebook’s philosophy...and sketched an unorthodox path for the soon-to-be-public company.
“Facebook was not originally created to be a company.... It was built
to accomplish a social mission: to make the world more open and connected."
What’s striking about that 2012 letter...is its certainty that a more “open and connected” world is by definition a better one.
“When I started Facebook, the idea of connecting the world was not
controversial...” [Zuckerberg told Manjoo]. “The default assumption was that the world
was incrementally just moving in that direction. So I thought we can connect
some people and do our part in helping move in that direction.” But now, he
said, whether it was wise to connect the world was “actually a real question.” (42)
In February, Facebook staff gave Manjoo a draft of the 2017 manifesto, Building Global Community. The new manifesto, Manjoo writes,
is remarkable for the way it concedes
that the company’s chief goal—wiring the globe—is controversial. “There are
questions about whether we can make a global community that works for
everyone..., and whether the path ahead is to connect more or
reverse course.”
[Zuckerberg] also confesses misgivings about Facebook’s role in the
news. “Giving everyone a voice has historically been a very positive force for
public discourse because it increases the diversity of ideas shared.... But the past year has also shown it may fragment our shared sense of
reality.” (42)
Meanwhile, can I control my own News Feed?
What Facebook can and will do about these concerns, especially given its "prime directive" of avoiding human editing of user choices, is yet to be seen.
Meanwhile, here are some steps you can take to manage how your personal News Feed works, taken from Control What You See in News Feed (as of 5/20/2017).
- On your Facebook home page, click the white triangle to open this menu
- Select News Feed Preferences
- Use the options provided to manage how your personal News Feed sorts and displays information.
Keep paying attention. Add your human brain to what machine algorithms do.
Blessings,
Mike Shell
Addendum
Of course, one way to counterbalance the "filter bubble" effect in your own News Feed is to "like" and "follow" commentators and news sources with whom you usually disagree.
Image & Author Notes
Image: "Mark Zuckerberg." Credit: Spencer Lowell for The New York Times. Illustration by Mike McQuade.
Image: "Is This Story Relevant to You? How We Evaluate," from Facebook's introduction to its News Feed.
Image: "Which FORMER PRESIDENT Would You Have Been?" screenshot of Facebook "click bait" quiz from Quizony.com.
Image: "News Feed Preferences" made up of screenshots compiled on 5/20/17. See Control What You See in News Feed for more options.
Technology columnist Farhad Manjoo is working on a book about what he calls the Frightful Five: Apple, Amazon, Google, Facebook and Microsoft (see "Tech’s Frightful Five: They’ve Got Us," NYT, 5/11/2017).