Disinformation


Read with FakeNews and TheEndOfConsensus and DisinformationVsScience

How big a threat is it to democracy? https://www.niemanlab.org/2023/11/how-big-a-threat-does-misinformation-pose-to-democracy/

How they made us doubt everything : https://www.bbc.co.uk/programmes/m000l7q0

This is good. MikeCaulfield on recalibrating our approach to it :

https://www.edsurge.com/amp/news/2018-12-19-recalibrating-our-approach-to-misinformation

https://www.nytimes.com/2021/02/18/opinion/fake-news-media-attention.html

Transcluded from ThirteenWaysOfLookingAtDisinformation

A very good critique by JacobSiegel of the concept of Disinformation / FakeNews / FactChecking etc., and how it plays into the hands of a TechnoCrat elite trying to censor and delegitimize any challenges to their power.

https://www.tabletmag.com/sections/news/articles/guide-understanding-hoax-century-thirteen-ways-looking-disinformation

Worth reading.

A somewhat sceptical article in Nature : https://www.nature.com/articles/s41586-024-07417-w.epdf?sharing_token=-H6ynotZUtnX9hqj5LZOfdRgN0jAjWel9jnR3ZoTv0PHjAsYY0GLhz9RBvciXVgSoZhMWcZmjAriBVc5wcGgxyPRW3cif8KFcfw5gbjfv51-ZGi0HTE487XUK8ujAV6e3ruze_axOif8S74p7QuvML01X1IhQ0I7V7D8M5cppWQ=

The issue is whether it's as widespread as the scare stories assume, or still fringe.

AzeemAzhar says :

Key findings:

  • Exposure to false and inflammatory content is remarkably low, with just 1% of Twitter users accounting for 80% of exposure to dubious websites during the 2016 U.S. election. It is heavily concentrated among a small fringe of users actively seeking it out. Examples: 6.3% of YouTube users were responsible for 79.8% of exposure to extremist channels from July to December 2020; 85% of vaccine-sceptical content was consumed by less than 1% of US citizens in the 2016–2019 period.
  • Conventional wisdom blames platform algorithms for spreading misinformation. However, evidence suggests user preferences play an outsized role. For instance, a mere 0.04% of YouTube's algorithmic recommendations directed users to extremist content.
  • It's tempting to draw a straight line between social media usage and societal ills. But studies rigorously designed to untangle cause and effect often come up short.

The paper argues these misunderstandings arise from a tendency to cite impressive-sounding statistics without appropriate context, to focus on engaging but unrepresentative content, and to confuse correlation and causation.

As a result, the scale and impact of misinformation on social media tend to be overstated in public discourse relative to the empirical evidence. Facebook's own research on Russian troll farms suggested 126 million Americans were exposed to the content, even though it represented just 0.004% of what Americans saw in the Facebook newsfeed.

How do we move forward?

Shine a light on the fringe: Zoom in on the small minority driving the bulk of misinformation exposure. Map their networks, decode their motivations, and use precision-guided strategies to stem the tide of false content.

Pursue causality with rigour: Ditch the correlations and dig deeper for causal gold. Randomized trials, natural experiments, and longitudinal studies can untangle the web of variables. Partnering with academia matters.

Think globally: Data coverage and research are deeper in the US and Europe. More investment is needed globally to understand country-to-country differences.

Iterate, test, adapt: With data about causality, we should turn to evidence-based interventions. These could be active, like counter-messaging; regulatory, like mandating algorithmic adjustments or investments in safety teams; or policy-based, like media literacy. Because the landscape is shifting, these interventions, like other institutional adaptations of the exponential age, need to be dynamic and iterative.

We must adopt a more nuanced, systems-level perspective rather than getting caught up in moral panics or one-size-fits-all solutions. This is how we build our epistemic integrity.

Just as the exponential technologies that underpin our world continue to evolve at breakneck speed, so too must our responses to the challenges they present. A culture of experimentation, learning, and collaboration delivers this agility.

The prize is a healthier, more resilient information ecosystem supporting informed citizenship, democratic deliberation, and social cohesion in the face of that accelerating change.

I'm not entirely sure what these findings actually prove.

Maybe a small number are exposed to disinfo websites directly. But what about exposure to that disinfo circulating in, say, Facebook groups? Is that also only 1%?

That 85% of vaccine-sceptical content was consumed by less than 1% of US citizens in 2016–2019 seems to me to be looking at the wrong time frame. What about after COVID?

Users actively seek out disinformation. Maybe, but they had to be turned on to it in the first place, before they started looking for it. How did that happen? Could it be that the exposure which started them down that path hasn't been measured?

ThreadView

ThePowerOfLies is a documentary about lying in politics.

BranderOnLLMsAndSigningEverything