Facebook can't hide behind algorithms

If Facebook’s algorithms were executives, the public would be demanding their heads on a stick, such was the ugly incompetence on display this week.

First, the company admitted a “fail” when its advertising algorithm allowed advertisers to target users based on anti-Semitic terms.

Then on Thursday, Mark Zuckerberg said he was handing over details of more than 3,000 advertisements bought by groups with links to the Kremlin, purchases made possible by the very advertising algorithms that have made Mr Zuckerberg a multi-billionaire.

Gross misconduct, you might say - but of course you can’t sack the algorithm. And besides, it was only doing what it was told.

“The algorithms are working exactly as they were designed to work,” says Siva Vaidhyanathan, professor of media studies at the University of Virginia.

Which is what makes this controversy so difficult to solve - a crisis that strikes directly at the core business of the world’s biggest social network.

Fundamentally flawed

Facebook didn’t build its huge advertising business by winning contracts with big corporations.

No, its success lies with the little people: the florist who wants to spend a few pounds targeting local teens in the run-up to the school prom, or the plumber who has just moved to a new area and needs to drum up work.

Facebook’s wild profits - $3.9bn (£2.9bn) between April and June this year - are due to that automated process. It finds out what users like, it finds advertisers that want to hit those interests, and it marries the two and takes the money. No humans necessary.

But unfortunately, that lack of oversight has left the company open to the kinds of abuse laid bare in ProPublica’s investigation into anti-Semitic targeting.

“Facebook’s algorithms created these categories of anti-Semitic terms,” says Prof Vaidhyanathan, author of Antisocial Media, a book about Facebook due out later this year.

“It’s a sign of how absurd a human-free system can be, and how dangerous a human-free system can be.”

That system will be slightly less human-free in future. In his nine-minute address, a visibly uncomfortable Mark Zuckerberg said his company would be bringing in more human beings to help prevent political abuse. The day before, chief operating officer Sheryl Sandberg said more humans would help solve the anti-Semitism issue as well.

“But Facebook can’t hire enough people to sell ads to other people at that scale,” Prof Vaidhyanathan argues.

“It’s the very idea of Facebook that is the problem.”

'Crazy idea'

Mark Zuckerberg is in choppy, uncharted waters. And as the “leader” (as he sometimes likes to say) of the largest community ever created, he has nowhere to turn for advice or precedent.

This was most evident on 10 November 2016, two days after Donald Trump was elected president of the United States.

When asked whether fake news had affected voting, Mr Zuckerberg, quick as a flash, dismissed the suggestion as a “crazy idea”.

That turn of phrase has proven to be Mr Zuckerberg’s biggest blunder to date as chief executive.
