Thursday, September 1, 2022

№ 647. Waking Up to Binary Dreams 4

Digital tech doesn’t only erode our attention. It also divides and redirects our attention into separate information ecosystems, so that the news you see is different from, say, the news your grandmother sees. And that has profound effects on what each of us ends up viewing as morally salient.

 

To make this concrete, think about the recent US election. As former President Donald Trump racked up millions of votes, many liberals wondered incredulously how nearly half of the electorate could possibly vote for a man who had put kids in cages, enabled a pandemic that had killed many thousands of Americans, and so much more. How was all this not a dealbreaker?

“You look over at the other side and you say, ‘Oh, my god, how can they be so stupid? Aren’t they seeing the same information I’m seeing?’” Harris said. “And the answer is, they’re not.”

Trump voters saw a very different version of reality from other Americans over the past four years. Their Facebook, Twitter, YouTube, and other accounts fed them countless stories about how the Democrats are “crooked,” “crazy,” or straight-up “Satanic” (see under: QAnon). These platforms helped ensure that a user who clicked on one such story would be led down a rabbit hole where they’d be met by more and more similar stories.

Say you could choose between two types of Facebook feeds: one that constantly gives you a more complex and more challenging view of reality, and one that constantly gives you more reasons why you’re right and the other side is wrong. Which would you prefer?

Most people would prefer the second feed (which technologists call an “affirmation feed”), making that option more successful for the company’s business model than the first (the “confronting feed”), Harris explained. Social media companies give users more of what they’ve already indicated they like, so as to keep their attention for longer. The longer they can keep users’ eyes glued to the platform, the more they get paid by their advertisers. That means the companies profit by putting each of us into our own ideological bubble.

Think about how this plays out when a platform has 2.7 billion users, as Facebook does. The business model shifts our collective attention onto certain stories to the exclusion of others. As a result, we become increasingly convinced that we’re good and the other side is evil. We lose empathy for what the other side might have experienced.

In other words, by narrowing our attention, the business model also ends up narrowing our moral attention — our ability to see that there may be other perspectives that matter morally.

The consequences can be catastrophic.

Myanmar offers a tragic example. A few years ago, Facebook users there used the platform to incite violence against the Rohingya, a mostly Muslim minority group in the Buddhist-majority country. The memes, messages, and “news” that Facebook allowed to be posted and shared on its platform vilified the Rohingya, casting them as illegal immigrants who harmed local Buddhists. Thanks to the Facebook algorithm, these emotion-arousing posts were shared countless times, directing users’ attention to an ever narrower and darker view of the Rohingya. The platform, by its own admission, did not do enough to redirect users’ attention to sources that would call this view into question. Empathy dwindled; hate grew.

In 2017, thousands of Rohingya were killed, hundreds of villages were burned to the ground, and hundreds of thousands were forced to flee. It was, the United Nations said, “a textbook example of ethnic cleansing.” 
