AToxicPlatformToReasonedDiscourse

ThoughtStorms Wiki

Context : TheProblemWithFacebook

FaceBook is a toxic platform to reasoned discourse :

Quora Answer : Do you think Facebook is a toxic platform to reasoned discourse?

Jun 21, 2019

Yes.

I left FB back in 2013 so I don't know it today, but back then I thought it was deliberately optimising to increase flow and addictiveness at the cost of careful reading and reasoned discourse.

The main issue was collapsing any post longer than about two lines of text so that it needed to be clicked on and opened up to be read.

That meant FB could fit more posts on the screen, and amplify the impression of a huge amount of content that needed to be dealt with. I.e. it ramped up the sense of urgency and information overload for the user, but made it harder for users to stay concentrated on the content and argument of any particular post.

Even in discussion groups this was going on. The result was that the users were always distracted by "the next thing" to deal with, rather than given space to think about the current thing.

Even then I knew that this prejudice against text was making Facebook a dud medium for any serious discourse. I had been reading Marshall McLuhan and realized "Facebook is reinventing TV" : the overwhelming flow reduces you to being pulled on to the next item, with no more response than repeatedly stabbing a like or forward button.

Furthermore, this collapse of text to a trivialized placeholder made for the rise of visual memes. Because structured text is so devalued on Facebook, anyone who wanted to say anything moved to pasting that text onto a picture. The emotional charge of the underlying picture, the shock value or the humour or the cuteness, swamped everything else.

Again Marshall McLuhan is relevant here. The medium is the message. When you move from communicating in words to communicating in pictures you lose an entire framework for thinking.

Finally, Facebook has turned our informal and vaguely defined friendships and acquaintanceships into hard machine-managed links. It has "instrumentalized" informal human social connections. Which was great for Facebook. It got loads of data it could mine for information about you, and sell on to advertisers. But actually ruinous to those relationships themselves.

The guy you would chat with in the lift on the way to work? The old school friend you would catch up with once every five years? Your aunt you only see at Christmas?

At that level of conviviality and intimacy, these "weak ties" enriched your life and made the world a friendly place. Full of familiar faces.

NOW ... those fuzzy informal connections have become stark. A harsh light illuminates every one. Each is now a media channel blasting you with more information. Perhaps only semi-interesting. Perhaps depressing. (Why can't my life be more like hers?) Perhaps infuriating : a flood of fake news denigrating the politicians you support and the values you hold dear.

Instead of these weak ties enriching your world and making it feel friendlier, you now feel surrounded by enemies. You're in constant "fight or flight" mode. Your enemies are even in your pocket! Ready to blast you with yet another outrageous lie or example of their grotesque prejudice.

You didn't need to know the voting intentions of your aunt. Or your old school friend's views on transsexuals. The topics never came up in real life. Now the horrible truth is out there for you all to see, and you no longer like your aunt or old school friend as much.

Vague liking has been replaced by vague dislike and sullen annoyance, because you now know too much about people you shouldn't have known so much about.

The world feels less full of friends than of enemies. And, of course, everyone is on edge, and every disagreement spirals into an argument, then a shouting match, and further enmity.

Quoted Twitter thread on the Facebook whistleblower revelations :

No one is surprised Facebook knew it was amplifying disinformation and incitement. But I am very excited to have this much internal evidence upon which we may consider proper regulatory steps to correct course. I'll be live tweeting my thoughts here about @60Minutes 1/

The Facebook whistleblower says the platform constantly chooses growth over safety. No one is malevolent but the incentives are misaligned for profit. 2/

Employees have had to choose growth so they win in the marketplace of social media companies. This means the platform continues amplifying content that incites violence and hate and this affects our trust and connectivity with each other. 3/

Now we must dig into the details of these 8 complaints with the Securities and Exchange Commission, an agency charged with investigating and enforcing the law to fight financial market manipulation. This will be the story to watch for and I hope the media covers it properly. 4/

The whistleblower estimates 60,000 Facebook employees could have accessed the docs she came forward with. But they didn’t. This is why we need strong tech worker & whistleblower legal protections. The stakes are too high to wait for another Frances Haugen. 5/