
Facebook’s phony claim that “you’re in charge.”

It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation. That assertion melts on contact, and a New York Times reporter who receives such a claim from a Facebook engineer should somehow signal to us that he knows how bogus it is.

In today’s New York Times, media reporter Ravi Somaiya visits with Facebook to talk about the company’s growing influence over the news industry, especially with News Feed’s dominance on mobile devices. Greg Marra, a 26-year-old Facebook engineer who heads the team that writes the code for News Feed, was interviewed. Marra is “fast becoming one of the most influential people in the news business,” Somaiya writes.

Mr. Marra said he did not think too much about his impact on journalism.

“We try to explicitly view ourselves as not editors,” he said. “We don’t want to have editorial judgment over the content that’s in your feed. You’ve made your friends, you’ve connected to the pages that you want to connect to and you’re the best decider for the things that you care about.”

In Facebook’s work on its users’ news feeds, Mr. Marra said, “we’re saying, ‘We think that of all the stuff you’ve connected yourself to, this is the stuff you’d be most interested in reading.’ ”

It’s not us exercising judgment, it’s you. We’re not the editors, you are. If this is what Facebook is saying — and I think it’s a fair summary of Marra’s comments to the New York Times — the statement is a lie.

I say a lie, not just an untruth, because anyone who works day-to-day on the code for News Feed knows how much judgment goes into it. It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation. It’s an assertion that melts on contact. No one smart enough to work at Facebook could believe it. And I’m not sure why it’s sitting there unchallenged in a New York Times story. For that doesn’t even rise to the level of “he said, she said.” It’s just: he said, poof!

Now, if Greg Marra and his team want to make the point that in perfecting their algorithm they’re not trying to pick the day’s most important stories and feature them in the News Feed, the way an old-fashioned front page or home page editor would, and so in that sense they are not really “editors” and don’t think in journalistic terms, fine, okay, that’s a defensible point. But don’t try to suggest that the power has thereby shifted to the users, and that the designers are just channeling your choices. (If I’m the editor of my News Feed, where are my controls?)

A more plausible description would go something like this:

The algorithm isn’t picking stories the way a home page or front page editor would. It’s not mimicking the trained judgment of experienced journalists. Instead, it’s processing a great variety of signals from users and recommending stories based on Facebook’s overriding decision rule for the design of an editorial filter: maximizing time on site, minimizing the effort required to “get” a constant flow of personal and public news. The end-in-view isn’t an informed public or an entertained audience but a user base in constant contact with Facebook. As programmers we have to use our judgment — and a rigorous testing regime — to make that happen. We think it results in a satisfying experience.
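To make that concrete, here is a toy sketch of mine, not Facebook’s actual code; every signal name and weight below is invented for illustration. The point it shows is where the designers’ judgment lives inside an “automated” filter: someone chose the signals, and someone chose how much each one counts.

```python
# Illustrative sketch only. The features and weights are invented; they
# stand in for whatever signals a real feed-ranking system consumes.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    affinity: float         # how often this user interacts with the source
    predicted_dwell: float  # model's guess at seconds the user will spend
    recency_hours: float    # hours since the story was posted

# These weights ARE the editorial judgment: here, someone has decided that
# predicted time-on-site matters far more than social affinity or freshness.
WEIGHTS = {"affinity": 1.0, "predicted_dwell": 2.0, "recency": -0.1}

def score(story: Story) -> float:
    return (WEIGHTS["affinity"] * story.affinity
            + WEIGHTS["predicted_dwell"] * story.predicted_dwell
            + WEIGHTS["recency"] * story.recency_hours)

def rank_feed(candidates: list[Story], limit: int = 10) -> list[Story]:
    # "You're in charge" ends here: the user never sees these weights,
    # and changing any one of them reorders everyone's feed.
    return sorted(candidates, key=score, reverse=True)[:limit]

if __name__ == "__main__":
    for s in rank_feed([
        Story("Friend's vacation photos", 0.9, 12.0, 2.0),
        Story("City council budget vote", 0.2, 45.0, 8.0),
    ]):
        print(s.headline)
```

Run it and the budget story outranks the vacation photos, not because the user asked for civic news, but because the dwell-time weight dominates. The weights are where the editing happens; a user who can’t see or change them is not the editor.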

That description would be a more truthful way of putting it. But it doesn’t sound as good as “you’re in charge, treasured user.” And here is where journalists have to do their job better. It’s not just calling out BS statements like “you’re the best decider.” It’s recognizing that Facebook has chosen to go with “thin” legitimacy as its operating style, in contrast with “thicker” forms. (For more on this distinction go here.)

By “thin” I mean Facebook is operating within the law. The users are not completely powerless or kept wholly in the dark. They have to check the box on Facebook’s terms of service and that provides some cover. The company has pages like this one on data use that at least gesture toward some transparency. But as this summer’s controversy over the “mood manipulation” study showed, Facebook experiments on people without them knowing about it. That’s thin.

Jeff Hancock, the Cornell researcher who worked on the mood manipulation study, said this last week: One of his big discoveries was that most users don’t grasp the basic fact that the Facebook algorithm is a filter. They think it’s just an “objective window into their social world.” That’s thin too. (See my post about Hancock and his choices, Why do they give us tenure?) The company doesn’t level with users about the intensity of its drive to maximize time on site. Thin.

Thick legitimacy is where informed consent, active choice and clear communication prevail between a platform and its public, the coders and the users. Facebook simply does not operate that way. Many would argue that it can’t operate with thick legitimacy and run a successful business at scale. Exactly! As I said, the business model incorporates “thin” legitimacy as the normal operating style. For better or for worse, that’s how Facebook works. Reporters should know that, and learn how to handle attempts by Facebook speakers to evade this basic fact — especially from “one of the most influential people in the news business.”

