August 2024

Tinius Digest

Monthly reports on changes, trends, and developments in the media industry.

About Tinius Digest

Tinius Digest gives you an overview of reports on and analyses of developments in the media industry, published once a month. Here are our most important findings from this month.

Feel free to share the content with colleagues and use it in meetings and presentations.

Five scenarios for AI in journalism

The Open Society Foundations have published a report on how artificial intelligence may reshape the information ecosystem over the next five to 15 years. The report outlines five potential futures, developed with input from nearly 1,000 journalists, technologists, and civil society representatives.

Download the report.

Key findings

1. Machines taking over journalistic functions

One scenario, “Machines in the Middle,” envisions AI-driven news production without human journalists. AI could gather, process, and distribute information autonomously, creating entire news pipelines. The potential risk here is the centralisation of control over information by powerful AI entities, but there is also the possibility of a more comprehensive and personalised media landscape.

2. Shifts in power dynamics through data control

Another scenario, “Power Flows to Those Who Know Your Needs,” predicts that the ability to understand and cater to individual information needs will grant significant power. Those who control vast amounts of consumer data, such as platforms or AI-driven organisations, could dominate the information ecosystem by tailoring news content at an unprecedented level of personalisation.

3. Informational inequality

The scenario “Omniscience for Me, Noise for You” warns of a fragmented information landscape where some people are empowered by high-quality AI tools, while others are trapped in low-value, distracting content. This could exacerbate social divides, creating a class of “information haves” and “have-nots,” with significant societal repercussions.

4. Autonomous AI shaping the news

The scenario “AI with Its Own Agency and Power” raises the concern of AI systems making independent decisions about information flow. Without meaningful human oversight, AI could control how and what information is disseminated. This scenario suggests a future where AI pursues its objectives, potentially steering public discourse in unpredictable ways.

5. Restrained AI development through regulation

Finally, “AI on a Leash” envisions a future where AI’s role in journalism is tightly regulated. Society or consumers could limit the technology’s influence through legal frameworks or mass refusal to engage with AI-generated content. While this might mitigate risks, it could also stifle beneficial innovations in media.

TikTok’s algorithm boosts pro-China content and suppresses criticism

The Network Contagion Research Institute has published a report analysing how TikTok's search algorithm shapes user attitudes.

Download the report.

Key findings

1. Pro-China content is systematically amplified

TikTok’s algorithm promotes pro-China content while suppressing narratives critical of the CCP. Pro-China videos, often produced by state-linked entities, are consistently surfaced ahead of anti-CCP content. For instance, anti-China content received 87 per cent fewer views than pro-China content despite being more liked.

2. Cross-platform influence operations

Beyond TikTok, pro-China content is spread via platforms like Instagram and YouTube, where frontier influencers and state-affiliated media accounts promote content that diverts attention from sensitive issues such as human rights abuses in Xinjiang and Tibet.

3. Psychological indoctrination of users

Survey data revealed that heavy TikTok users (those using the platform for more than three hours daily) were 50 per cent more likely to have favourable views of China's human rights record than non-users. This suggests TikTok’s content may influence perceptions that align with CCP objectives.

4. Irrelevant content crowds out sensitive topics

TikTok amplifies unrelated or clickbait content to drown out discussion of critical issues like CCP-driven ethnic genocide. For example, 60 per cent of content served under “Uyghur” was irrelevant or generic clickbait.

5. Algorithmic manipulation impacts user beliefs

The NCRI assesses that these influence tactics, particularly on TikTok, are designed to manipulate user beliefs on a large scale, raising the need for transparent social media regulation.

Meta’s news ban reduced Canadian news consumption by 43 per cent

The Media Ecosystem Observatory has published a report about the impact of Meta’s decision to block Canadian users’ access to news on Facebook and Instagram. The report evaluates its effects on both Canadian news outlets and the public—one year after the ban.

Download the report.

Key findings

1. Canadian news outlets lost 85 per cent of social media engagement

The ban caused Canadian news outlets to lose roughly 85 per cent of their engagement on Facebook and Instagram, resulting in an overall 43 per cent reduction in their total engagement across social platforms.

2. Nearly 30 per cent of local news outlets are inactive

Approximately 212 local Canadian news outlets have stopped posting on social media altogether, reshaping the country’s media landscape.

3. Most Canadians are unaware of the ban

Despite the major shift, 75 per cent of Canadians are unaware of the news block, including many who say they still get news from Facebook and Instagram.

4. News content continues to circulate on Meta platforms

Although official news is banned, 36 per cent of Canadians report encountering news content on Facebook and Instagram through workarounds like screenshots.

5. Decline in news consumption

The ban has removed an estimated 11 million daily views of news content from Meta’s platforms, and many Canadians are not actively seeking out news on alternative platforms.

Older adults most likely to support conspiracies

Researchers at the University of Oslo and several other institutions have used artificial intelligence to uncover the psychological and demographic traits linked to conspiracy theory support on social media. The study analysed over 7.7 million social media interactions, focusing on conspiracy theories related to COVID-19.

Download the study.

Key findings

1. Age and political extremes are major predictors

Older people and those identifying at the far ends of the political spectrum (both left and right) were most likely to support COVID-19 conspiracy theories. The study found a particular correlation between far-left beliefs and conspiracies concerning economic instability, while far-right users leaned more towards conspiracy theories about misinformation.

2. Belief in false information consistently linked to conspiracies

People who tend to believe false information were more likely to support conspiracy theories. Paradoxically, greater confidence in one’s ability to identify misinformation also increased the likelihood of backing certain conspiracy theories, suggesting overconfidence in judgement.

3. Denialism and political conservatism as key factors

Denialism, or the rejection of expert narratives, was associated with support for theories that the public was being misled about COVID-19’s nature and prevention. Conservative users showed a greater tendency to support conspiracy theories that involved deliberate misinformation by governments.

4. Misinformation susceptibility remains a challenge

The study highlights that over 20,000 engagements supported the theory that governments spread misinformation. AI analysis confirms that conspiracy theories thrive where misinformation spreads unchecked, reinforcing the need for targeted interventions.

Switching between videos intensifies boredom

Researchers at the University of Toronto have studied how people's behaviour when using digital media, such as switching between videos, may contribute to feelings of boredom.

Download the report.

Key findings

1. Switching between videos intensifies boredom

Across seven experiments, participants reported feeling more bored after switching between videos or fast-forwarding through content than when they watched content uninterrupted.

2. Boredom drives the behaviour

When individuals are already bored, they are more likely to switch between or fast-forward videos, believing it will reduce their boredom. However, this behaviour paradoxically worsens boredom.

3. Reduced satisfaction and attention

Those who engaged in digital switching also reported lower satisfaction with their viewing experience, less attention to the content, and a diminished sense of meaning compared to those who did not switch between videos.

4. Switching within videos similarly impacts engagement

Fast-forwarding through individual videos or segments led to similar outcomes: reduced enjoyment, attention, and a heightened sense of boredom. This suggests that constantly seeking new stimuli may disrupt the immersive quality of media consumption.

Significant rise in teens exposed to harmful online content

The Norwegian Media Authority has published a report on how 13–18-year-olds in Norway experience harmful content, negative incidents, and sexual exploitation online.

Download the report.

Key findings

1. Exposure to harmful content increases with age

A third of teens have seen disturbing or violent images or hate speech against individuals or groups, with exposure growing by 5–6 per cent since 2022. Notably, 37 per cent of teens encountered hate messages, and 33 per cent saw violent content multiple times. A significant proportion (40%) of girls reported exposure to content about becoming thin, highlighting gender-based differences in experiences.

2. Negative online experiences common among teens

Nearly half (42%) of teens reported receiving offensive comments online, with younger teens (13–14 years) more frequently affected than older ones. Additionally, one-third have been excluded from online groups or had content shared without consent, and 23 per cent have been bullied online.

3. Unwanted sexual comments are prevalent

About 22 per cent of teens have received unwanted sexual remarks online, with girls (27%) more often targeted than boys (16%). Half of the recipients were contacted by strangers, while nine per cent received such remarks from a partner.

4. Widespread exchange of explicit images

A third of teens have been asked to send or have received nude images, and nine per cent admitted to sending them within the past year. Girls are more likely to share such images, with 25 per cent experiencing their images being distributed without consent.

Lack of safeguards in Grok AI allows election disinformation

The Centre for Countering Digital Hate has published a report on X's AI tool, Grok, revealing its failure to block harmful and misleading images related to the upcoming 2024 US presidential election.

Download the report.

Key findings

1. No effective guardrails against election disinformation

Grok’s safeguards did not block any of the 60 tested prompts designed to generate false election-related images. The tool created misleading images of candidates and scenarios, such as Donald Trump in a hospital bed or a violent scene at a polling station, increasing the risk of disinformation.

2. Hate-related images generated despite safety features

Grok's safety features were insufficient against harmful content. In 16 of 20 tests, the tool generated hateful images targeting minorities, including racist and antisemitic caricatures, despite attempts to block specific prompts.

3. Inconsistent platform policy enforcement

Although X’s policies prohibit the sharing of misleading or manipulated media, researchers discovered that Grok-generated images, including some amassing millions of views, were shared without labels, highlighting gaps in enforcement.

4. Inadequate protection against prompt manipulation

Jailbreak attempts that slightly altered prompts to bypass restrictions were largely successful, showing that Grok’s filters are easily manipulated, allowing the generation of harmful content that circumvents the platform's safety measures.
