“It’s not that we control NewsFeed, you control NewsFeed…” Facebook: please stop with this.

Of course Facebook doesn’t “edit” NewsFeed in the same way that a newspaper editor once edited the front page. It’s a very different way. That’s why we’re asking about it!

I’ve met some of the people at Facebook whose job it is to work with journalists and media companies. They’re good people, smart people. They seem to care about the future of news. Some of my students, now graduated, work with them. I like that.

What I have to say in this post isn’t personal. It’s professional. Please stop doing this. Here’s what I mean:

Last week, at the International Journalism Festival in Perugia, Italy, Facebook’s Andy Mitchell, director of news and media partnerships, was asked how the company sees its role as a new kind of editorial filter or influence on the news— a pressing question, now that Facebook has become such an important part of the news ecosystem. He was also asked what kind of accountability Facebook felt it had as a player in that system. Mitchell had three answers to these questions.

1. “It’s not that we control NewsFeed, you control NewsFeed by what you tell us that you’re interested in.” You send us signals. We respond.

2. Facebook should not be anyone’s primary news source or experience. It should be a supplement to seeking out news yourself from direct suppliers. “Complementary” was the word he used several times. Meaning: complement to, not substitute for.

3. Facebook is accountable to its users for creating a great experience. That describes the kind of accountability it has. End of story.

To find these answers, go to 45:50 in the video clip and watch to the end.

George Brock, a journalism professor in the UK, was the one who asked about accountability. He comments:

Facebook is not, and knows quite well it is not, a neutral machine passing on news. Its algorithm chooses what people see, it has ‘community standards’ that material must meet and it has to operate within the laws of many countries.

The claim that Facebook doesn’t think about journalism has to be false. And, at least in the long run, it won’t work; in the end these issues have to be faced. Facebook is a private company which has grown and made billions by very successfully keeping more people on its site for longer and longer. I can imagine that any suggestion that there are responsibilities which distract from that mission must seem like a nuisance.

Google once claimed something similar. Its executives would sit in newspaper offices and claim, with perfectly straight faces, that Google was not a media company. As this stance gradually looked more and more absurd, Google grew up and began to discuss its own power in the media.

I would put it differently: Facebook has to start recognizing that our questions are real— not error messages. We are not suggesting that it “edits” NewsFeed in the same way that a newspaper editor once edited the front page. It’s a very different way. That’s why we’re asking about it! We are not suggesting that algorithms work in the same way that elites deciding what’s news once operated. It’s a different way. That’s why we’re asking about it!

No one is being simple-minded here and demanding that Facebook describe editorial criteria it clearly does not have— like reaching for a nice mix of foreign and domestic news. We get it. You want not to be making those decisions. You want user interest to drive those decisions. We’re capable of understanding the basics of machine learning, collaborative filtering and algorithmic authority. We know that to reveal all would encourage gaming of the system. We’re capable of accepting: this is what the users are choosing to use now. We’re not platform idiots. Stop treating us like children at a Passover seder who don’t know enough to ask a good question.
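To take just one of those basics: here is a toy sketch of collaborative filtering, the idea that a system can recommend items to a user based on what similar users engaged with. Everything in it (the names, the data, the similarity measure) is hypothetical and chosen for illustration; it is not a claim about how NewsFeed is built.

```python
# A toy illustration of collaborative filtering: recommend stories to a user
# based on what similar users engaged with. All data is made up; Facebook's
# actual system is far more complex and is not public.
from math import sqrt

# Rows: users. Columns: stories. 1 = clicked/liked, 0 = ignored.
engagement = {
    "alice": {"story_a": 1, "story_b": 1, "story_c": 0},
    "bob":   {"story_a": 1, "story_b": 1, "story_c": 1},
    "carol": {"story_a": 0, "story_b": 0, "story_c": 1},
}

def cosine(u, v):
    """Cosine similarity between two users' engagement vectors."""
    dot = sum(u[s] * v[s] for s in u)
    norms = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norms if norms else 0.0

def recommend(user, data):
    """Score each story the user hasn't engaged with, weighting other
    users' engagement by how similar those users are to this one."""
    scores = {}
    for other, row in data.items():
        if other == user:
            continue
        sim = cosine(data[user], row)
        for story, engaged in row.items():
            if engaged and not data[user][story]:
                scores[story] = scores.get(story, 0.0) + sim
    return sorted(scores.items(), key=lambda kv: -kv[1])

# story_c surfaces for alice because bob, who behaves like alice, engaged with it.
print(recommend("alice", engagement))
```

That is the sense in which user interest “drives” the output: the signals come from users, but which signals count and how they combine are design choices.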

But precisely because we do “get it” — at least at a basic level — we want to know: what are you optimizing for, along with user interest? How do you see your role within a news ecosystem where you are more and more the dominant player? In news, you have power now. It is growing. Help us understand how you intend to use it. What kind of filter will you be? What kind of player… playing for what?

These are not outrageous or ignorant questions. They do not misstate how Facebook works. They are not attempts to turn the clock back to a time when editors chose and readers read. We don’t need your answers to babysit us. We’re awake and alive in the algorithmic age and exercising our critical faculties just fine. If you can’t answer, then say that: We are not here to answer your questions because we can’t.

Andy Mitchell’s three replies are not adequate— for us or for Facebook.

Q. What are you optimizing for, along with user interest? A. “It’s not that we control NewsFeed, you control NewsFeed.” No, sorry. As I wrote before: It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation. The assertion melts on contact.
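To make that concrete, consider a deliberately simple, hypothetical feed ranker. The signal list, the weights, and the combination rule below are all inventions for illustration, not Facebook’s; the point is that each of them is a designer’s decision, no matter how “user-driven” the inputs are.

```python
# Hypothetical feed ranker, to show that "users control it" still leaves
# every consequential decision with the designers. None of these signals
# or weights are Facebook's; they are stand-ins.

# Designer decision 1: which user signals count at all.
SIGNALS = ("clicks", "likes", "comments", "dwell_seconds")

# Designer decision 2: how much each signal is worth.
WEIGHTS = {"clicks": 1.0, "likes": 2.0, "comments": 4.0, "dwell_seconds": 0.1}

def score(story_signals: dict) -> float:
    """Designer decision 3: how the signals combine (here, a weighted sum)."""
    return sum(WEIGHTS[s] * story_signals.get(s, 0) for s in SIGNALS)

stories = {
    "hard_news":  {"clicks": 50, "likes": 10, "comments": 5,  "dwell_seconds": 400},
    "baby_photo": {"clicks": 30, "likes": 80, "comments": 20, "dwell_seconds": 100},
}

# Users supplied the signals; the designers decided what they mean.
ranked = sorted(stories, key=lambda s: score(stories[s]), reverse=True)
print(ranked)  # ['baby_photo', 'hard_news'] -- change WEIGHTS and it flips
```

Nothing in that sketch removes the designer. Swap the weights and the “user-controlled” ranking inverts, with no user doing anything differently.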

Q. How do you see your role in the news ecosystem where you are more and more the dominant player? A. Facebook should not be anyone’s primary news source or news experience. No, sorry. On mobile, especially, “primary” is exactly what’s happening. And everyone who pays attention knows how strenuously Facebook tries to keep users engaged with Facebook. So “we don’t want to be primary” is… I’m trying to be nice here… a little insulting.

Q. In news you have a lot of power now. How do you intend to use that power? A. We just want to create a great experience for users. No, sorry, that’s not an answer. You just said the users have the power, not Facebook. So what you’re really saying is: power? Us? Whatever do you mean?

Facebook’s smart, capable and caring-about-news people should be disappointed that this is as far as the company has gotten in being real with itself and with us.

(This started as a Facebook post. If you want to see it spread on that platform, I’m confident you know what to do.)

After Matter: Notes, Reactions & Links

Now here’s a good example of what I mean. In an update on a company blog, Facebook tells us:

Facebook is constantly evaluating what’s the right mix of content in News Feed and we want to let you know about a change that may affect referral traffic for publishers…

Stop the tape! Notice how Facebook is the one evaluating. Facebook is the one changing things up. This is not a scandal or a surprise. But it’s also not: “you control NewsFeed, we don’t control NewsFeed.” They control NewsFeed too. User choice is real. But code is destiny.

Mathew Ingram of Fortune magazine writes about the same announcement. Facebook, he says, “wants to have its cake and eat it too: it wants to tweak the news-feed in order to promote content that serves its purposes—whether that’s news content or baby pictures—but it also wants to pretend that it isn’t a gatekeeper, because then media companies might not play ball. So it tries to portray the algorithm as just a harmless extension of its users’ interests, when in fact it is anything but.”

David Holmes at Pando.com comments, as well:

I don’t blame Facebook for wanting to squeeze ever-increasing amounts of money from publishers and the content they produce. Facebook is a for-profit corporation and that’s what corporations do: make money. And it certainly doesn’t owe journalists or their organizations anything.

But it’s phenomenally disingenuous of the company to insist that its every strategic decision is part of some “user-first” mentality. Users don’t even pay to use Facebook — so how could they be its core constituency?

Good question.

A distinction I have tried to import into this debate is between “thick” and “thin” legitimacy. From my piece in the Washington Post about Facebook’s mood manipulation study.

Thin legitimacy is when the experiments conducted on human beings are fully legal and completely normal, as in common practice across the industry, but there is no way to know if they are minimally ethical, because companies have no duty to think such matters through or share their methods with us.

Thick legitimacy: when experiments conducted on human beings are not only legal under U.S. law and common in practice but also attuned to the dark history of abuse in experimental situations and thus able to meet certain standards for transparency and ethical conduct— like, say, the American Psychological Association’s “informed consent” provision.

For purposes of establishing at least some legitimacy, Facebook relies on its “terms of service”: 9,000 words of legalese that users have no choice but to accept. That’s thin.

Facebook thinks “thin” legitimacy will work just fine. That is why it can give journalists and academics the royal runaround at conferences. But what if that assessment is wrong, not from some moral perspective but as a business case? The question turns on this: to what degree does Facebook’s success depend on trust — user trust, social trust, partner trust — versus power: market power, monopoly power, the power of an overwhelming mind share? I don’t know the answer, but I don’t trust anyone who says the answer is obvious. It’s not obvious. The more the company’s fortunes turn on trust, the greater the business case for “thick” legitimacy.

I wrote about the same issue last year. This is the description I recommended if Facebook ever decided to (I know it sounds crazy) optimize for truth.

The algorithm isn’t picking stories the way a home page or front page editor would. It’s not mimicking the trained judgment of experienced journalists. Instead, it’s processing a great variety of signals from users and recommending stories based on Facebook’s overriding decision rule for the design of an editorial filter: maximizing time on site, minimizing the effort required to “get” a constant flow of personal and public news. The end-in-view isn’t an informed public or an entertained audience but a user base in constant contact with Facebook. As programmers we have to use our judgment — and a rigorous testing regime — to make that happen. We think it results in a satisfying experience.
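What would “a rigorous testing regime” in service of that decision rule look like? Something like the following toy A/B test, offered as a sketch only; the numbers, variant names, and shipping threshold are all invented:

```python
# Toy A/B test against the decision rule "maximize time on site."
# Entirely hypothetical data, shown only to make the rule concrete.
from statistics import mean

# Minutes on site per session, under two ranking variants.
variant_a = [12.1, 9.4, 15.2, 8.8, 11.0]   # current ranker
variant_b = [14.3, 10.9, 16.8, 9.5, 13.2]  # candidate ranker

lift = mean(variant_b) / mean(variant_a) - 1
print(f"time-on-site lift: {lift:.1%}")  # ~14.5% in this made-up sample

# Under this decision rule, the candidate ships if the lift holds up,
# regardless of what it does to the mix of news people see.
if lift > 0.02:  # invented shipping threshold
    print("ship variant_b")
```

Note what the test measures and what it doesn’t: time on site is in the objective; an informed public is not.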

Ben Thompson at his invaluable site, Stratechery: “It is increasingly clear that it is Facebook — not iOS or Android — that is the most important mobile platform.”

Andy Mitchell’s answers at Perugia insulted a lot of people. Here’s an account in Italian by a student, Enrico Bergamini, who asked Mitchell about the NewsFeed algorithm. It includes an interview with George Brock. On my Facebook page Bergamini writes: “I was at the conference, I’m the student asking the question at 45:42, and I was obviously disappointed with the empty answer Mr Mitchell gave me.” Other comments at my Facebook page from people who were there:

Mindy McAdams: “The answers Andy Mitchell gave to questions asked after his talk in Perugia were pure spin and obfuscation… The mood was sullen as he continued answering questions with non-answers.”

Eric Sherer: “I attended this conference, Jay. It was a shame. And yes, he treated [us] like children!”

