Despite censorship, Facebook denies it’s a media company
Facebook makes value judgments about what can appear in the News Feed and what gets censored, but insists it's a tech company, not a media editor.
At TechCrunch Disrupt SF, writer Josh Constine sat down with Adam Mosseri, a VP at Facebook and head of News Feed, to hear more about how the company's policies control what you see.
The talk started with Constine asking Mosseri how much content people consume daily in their News Feed. Mosseri shared a new statistic: the average Facebook user reads a little over 200 stories a day in their feed, about 10 percent of the roughly 2,000 possible stories Facebook has to show them each day.
The average user spends over 45 minutes a day consuming this News Feed content. That number is still growing, which Mosseri says the company interprets as a signal that it is making News Feed better every day.
The conversation then moved to what kind of content is shared on Facebook most: original sharing (like photos of your friends) or a publisher sharing a story (like CNN). Mosseri noted that both types of sharing are still growing, but publisher sharing is growing at a much faster rate, which could explain why the average user may feel like their News Feed is dominated by content from big news publications.
But Mosseri also said the company understands that friends and family come first, and that seeing content from loved ones is why many people come to Facebook in the first place. So Facebook is going to ensure there is a good mix and that content from your friends remains in your feed.
Constine then asked about internet addiction among users and whether it is something Facebook is concerned about. Mosseri replied that while the company doesn't track addiction, it does track user sentiment and tries to understand whether people think their News Feed experience is time well spent. Essentially, Facebook isn't worried about someone using the site too much (and getting addicted) as long as that person is having a meaningful experience.
When asked about Facebook firing its team of description-writing curators and the impact on its Trending Topics product, Mosseri said, "I think it's better." But he conceded that the product needs to improve its ability to block fake news, and said the tech Facebook built to squash hoaxes in News Feed is being rolled out to Trending Topics now.
The talk ended with the two discussing the Philando Castile shooting video, which Facebook at first temporarily removed from the site, then restored, saying a "technical glitch" was to blame for its short removal. Mosseri clarified that this glitch was Facebook's algorithms miscategorizing the content and accidentally flagging it as something else, not a technical glitch like a server going down.
“We have a lot of systems in place that try and automatically detect content that violates our standards. And we actually had a, sort of a miscategorization, essentially, which is really a bug. And it’s really, really unfortunate…about such an important story at such an important moment.
I’m sure whatever system was in place didn’t perform as intended, and we shouldn’t have taken it down; we didn’t mean to.”
This raises the question: what place does Facebook have censoring content? While the company insists it's not a media company, it effectively fills the shoes of an editor, saying "yes or no" to each specific piece of content. If Facebook thinks something is important enough to see, even if it violates standard News Feed guidelines, it will still allow it, placing the company in a position that is pretty damn close to being a media company.
Source: TechCrunch