Facebook groups spread conspiracy theories and shield abuse. Here are ways to make them less toxic.
This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.
The QAnon conspiracy theory, promotions of bogus health treatments and calls for violence based on false claims of election fraud have a common thread: Facebook groups.
These forums for people with a shared interest can be wonderful communities for avid gardeners in the same neighborhood or parents whose children have a rare disease. But for years, it’s also been clear that the groups turbocharge some people’s inclinations to get into heated online fights, spread engrossing information whether it’s true or not and scapegoat others.
I don’t want to oversimplify and blame Facebook groups for every bad thing in the world. (Read my colleague Kevin Roose’s latest column on suggestions for how to target polarization and engage people in purposeful activities.) And mitigating the harms of Facebook is not as simple as the company’s critics believe.
But many of the toxic side effects of Facebook groups are a result of the company’s choices. I asked several experts in online communications what they would do to reduce the downsides of the groups. Here are some of their suggestions.
Stop automated recommendations. Facebook has said it would extend a temporary pause on computerized recommendations for people to join groups related to politics. Some experts said that Facebook should go further and stop computer-aided group suggestions entirely.
It’s nice that Facebook suggests a forum about growing roses to someone who posts about gardening. But for years, Facebook’s group recommendations have proved to be easily manipulated and to have pushed people toward increasingly fringe ideas.
In 2016, according to a Wall Street Journal report, Facebook’s research found that two-thirds of people who joined extremist groups did so at Facebook’s recommendation. Automated group recommendations were one of the ways that the QAnon conspiracy theory spread, my colleague Sheera Frenkel has said.
Ending these computerized suggestions isn’t a silver bullet. But it’s nuts how often activists and academics have screamed about the harm of these recommendations while Facebook has only tinkered at the margins.
Provide more oversight of private groups. The social media researchers Nina Jankowicz and Cindy Otis have proposed not allowing groups above a certain number of members to be private — meaning newcomers must be invited and outsiders can’t see what’s being discussed — without regular human review of their content.
“A lot of truly toxic groups are unsearchable and invite-only, and that’s hugely problematic,” Jankowicz told me.
Jankowicz and Otis have also pushed for more-consistent descriptions of groups and more transparency into who manages them. Political discussion groups are sometimes intentionally mislabeled by their hosts as “personal blogs” to avoid the extra attention that Facebook applies to political forums.
Target the habitual group offenders. Renée DiResta, a disinformation researcher at the Stanford Internet Observatory, said that Facebook needs to “take more decisive action” against groups that repeatedly harass people or otherwise break Facebook’s rules. Facebook did take some steps in this direction last year.
Jade Magnus Ogunnaike, a senior director at the racial justice organization Color of Change, also said that Facebook should stop using contractors to review material on the site. Converting those workers to employees would be fairer, she said, and could improve the quality of oversight of what’s happening in groups.
Add some … librarians? Joan Donovan, the research director of Harvard University’s Shorenstein Center on Media, Politics and Public Policy, has suggested that big internet companies should hire thousands of librarians to provide people with vetted information to counter groups wallowing in falsehoods.
Superstars are not brilliant at everything
Jeff Bezos is fond of saying that failure is healthy because people and companies learn from it. But sometimes failure is a result of a company’s weaknesses, and it’s not a good thing.
There have been news articles in the last few days about both Amazon’s and Google’s utter inability to create their own successful video games despite having infinite cash and smart people at their disposal.
The roots of their failures are complex, but two problems come to mind: cultural soft spots and hubris. (And in Amazon’s case, an overreliance on Bezos’s distilled wisdom in “Jeff-isms,” like the one above.)
Here’s what happened: Google this week said it was closing down its group devoted to making video games. And Bloomberg News detailed the reasons behind Amazon’s repeated flops in making its own high-powered video games.
Bloomberg described Amazon’s struggles as a reflection of its Amazon-ness: An obsession with data made people lose focus on making games fun, and executives confident in the company’s expertise forced staff to use Amazon’s homegrown game development technology rather than industry-standard tools.
Google, too, for all of its successes, has ingrained habits that sometimes make it hard to break into unfamiliar areas. The technology news publication The Information reported this week on the struggles of Google’s cloud computing business, which sells technology services to other companies.
Google engineers are treated like kings, and it has been hard to convince them to come up with rigid three-year product road maps that corporations tend to like. The Google Cloud business has struggled for years with the same basic problem — shoehorning Google’s ways into the prosaic habits of its business clients.
The magic (or annoying) thing about cash-rich superstar companies is that they can often turn failures into success. But Amazon’s and Google’s difficulties in businesses outside their core expertise are a reminder that being rich and smart sometimes blinds companies to their weaknesses.
Before we go …
- Robinhood needed money fast: In the eye of the stock trading frenzy, the stockbroker app Robinhood was forced to raise billions of dollars to steady itself and meet legal requirements to maintain a cash cushion, my colleagues reported. (Related: My colleague Andrew Ross Sorkin has six ideas to make the stock market more trustworthy and fair.)
- India versus Twitter: Twitter temporarily blocked people in India from viewing numerous accounts critical of the prime minister after receiving a government order, BuzzFeed News reported. These tugs of war between local laws and social media companies’ standards of free expression typically receive far less notice when they happen outside the United States.
- “I give it five sinks.” Sink Reviews is a TikTok account that — you guessed it — opines about sinks in public places in New York City, like museums, restaurants and stores, Time Out New York reported.
Hugs to this
Apparently baby octopuses sometimes ride on top of jellyfish, and they look marvelous.
We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.
If you don’t already get this newsletter in your inbox, please sign up here.
"how" - Google News
February 03, 2021 at 01:11AM
https://ift.tt/3aw3hS9
How to Fix Facebook Groups - The New York Times
"how" - Google News
https://ift.tt/2MfXd3I
Bagikan Berita Ini
0 Response to "How to Fix Facebook Groups - The New York Times"
Post a Comment