Instead, it discovered that connecting people led to its systems being exploited to spread dissent, outrage, and -- of course -- fake news for profit.
Unless you're in the armaments business, discovering that your products are being horrendously misused will trigger an effort to fix the problem. With high-profile companies, there's often also a PR effort to insist that nothing really happened, that there's nothing to see here.
In Facebook's case, as a New York Times investigative piece revealed, part of the reaction to discovering Russian attempts to interfere in the 2016 elections involved hiring a right-wing opposition research company, Definers.
Definers went on to discredit activists in part by alleging links to the billionaire financier George Soros, while suggesting that some criticism of the company was anti-Semitic.
Quite a coup to use the charge of anti-Semitism (into which any accusation involving Soros inevitably turns) both for offense and defense. In responding to the NYT piece, Facebook didn't deny either accusation. It said only that Definers was not asked to "pay for or write articles on Facebook's behalf -- or to spread misinformation."
Absolutely. After all, spreading misinformation is Facebook's job. Well, not its job, but certainly something it is accidentally very effective at doing. The Facebook pages and feeds you see are decided by an algorithm that watches what you pay attention to and tries to feed you more of it, so you'll keep paying attention and it can show you more ads.
It turns out that outrage captures attention far more effectively than pretty much anything else. In general, a headline along the lines of "Discover the Appalling Things These Disgusting People Are Doing" will pull in clicks far better than "There's A Group of People Doing Justifiable Things Lawfully."
By this stage, it's becoming clear that Facebook and social networks including YouTube -- which tries to rope together ad-hoc networks of subscribers -- aren't altogether good for our well-being.
A Pew Research study in March found that around 70% of US adults use Facebook and YouTube, and that three-quarters of Facebook's users visit it every day. But by September, the Pew Center found 42% of Facebook users saying they'd taken multi-week breaks from the site, and an amazing 26% saying they had deleted the app from their phone -- though most of those were aged 18 to 29.
Why? People are starting to realize that being exposed to the sometimes-noxious outputs of fellow citizens isn't that beneficial. What if it's time to treat Facebook and Twitter and Instagram and Snapchat and YouTube as akin to cigarettes -- addictive, pervasive, noxious and harmful?
After all, cigarette smoking used to be everywhere. In the UK, a key rollback came after 31 people died in the King's Cross Underground fire in November 1987, caused by a smoker's discarded match. Smoking, which had been allowed on the Underground, was banned five days later. If you want a comparator, look at the United Nations report on the killing of Rohingya Muslims in Myanmar, published last August, in which Facebook was cited as a key conduit for hate speech and misinformation.
We banned cigarettes from more and more public places. We acknowledged their harmful effects. Maybe we're approaching a similar tipping point with social networks. Already, Apple and Google have built screen-time monitoring into their newest phone software.
The habit is hard to break. I'll acknowledge a Twitter addiction (alongside a Facebook, Instagram and YouTube indifference). They say the hard part about giving up smoking isn't stopping; it's not starting again. To some extent, that power is in Facebook's hands: it can tune its algorithms to tone down the outrage and the addiction. But is it likely to? Only Mark Zuckerberg, in his smoke-free office in California, knows.