Tinius Digest, October 2022


Tinius Digest reports on changes, trends and developments in the media business at large. These are our key findings from last month.

Share the report with colleagues and friends, and use the content in presentations or meetings.

Download Tinius Digest (PDF).

Insight October 2022

Norwegian commission open to banning targeted advertising

The Norwegian Privacy Commission has handed over its final report to the Norwegian government. The 245-page report spans a wide range of topics.

Download the report.

Five interesting proposals:


Ban on targeted advertising

The commission proposes that the Norwegian government consider a ban on targeted advertising. The proposal is controversial: it has strong support from the Norwegian Data Protection Authority and the Norwegian Consumer Council, while the Norwegian Media Businesses’ Association views it as a threat to the advertising revenue that finances journalism.


New advisory body

To ensure uniform regulation and practice in the use of personal data, the commission proposes establishing a new advisory body with a comprehensive overview of personal data usage in Norway.


Stricter penalties and bans

The commission proposes stricter penalties for breaches of the Personal Data Act. It also proposes a ban on remote biometric identification in public spaces, including the use of facial recognition technology.


Money talks

The commission proposes instructing The Government Pension Fund Global to impose privacy requirements on all companies it invests in. The fund holds investments in over 9,000 companies across 70 countries, worth more than 12.4 trillion NOK (1.2 trillion euros).


Greater transparency

To strengthen trust in the government, the commission proposes greater transparency about the authorities’ use of personal data and technology to fight crime. It also suggests creating a committee to assess the privacy consequences of police methods.

Social media and TV dominate youths’ news consumption

The Norwegian Media Authority has surveyed Norwegian children and youths aged 9 to 18 about their news consumption.

Download the report.

Four main findings:


News consumption widespread

97 percent consume news often or occasionally. The younger the children, the lower the news consumption. Among 17-18-year-olds, nearly everyone consumes news (99 percent).


Social media and TV

Social media and television are the most common channels for news consumption for 9-18-year-olds: 89 percent get news often or occasionally via television, while 88 percent get news from social media.


Snapchat, TikTok and YouTube

The social media platforms with the highest proportion of news consumption among 9-18-year-olds are Snapchat (66%), TikTok (65%) and YouTube (62%). For primary school children, YouTube and TikTok are the most used for news consumption. 


Family important

Family plays an essential role in children’s and youths’ news consumption. 93 percent of 9-18-year-olds get news from their parents, 91 percent from friends and 87 percent from teachers/school.

Seven barriers to youths' digital participation

Save the Children Norway (Redd Barna) has examined the minority of youths who lack access to, and the opportunity to participate in, digital arenas.

The report identifies seven barriers that can stand in the way of young people participating on an equal footing with their peers. 

Download the report.

Seven barriers:


Poor access to the internet

Beyond poor internet access, some digital services do not meet the requirements for universal design, which means that young people with disabilities cannot participate in school and/or use public services.


Poor ICT equipment

Four percent of secondary and upper secondary school youths do not have access to a school PC or tablet. An additional four percent do not have access to a PC at home.


Digital services are not available

Lack of availability is also a barrier to participation, whether because services are not user-friendly or because some young people cannot use public services without access to BankID.


Bullying and hate speech

Bullying and hate speech, negative social control, and parents’ lack of the prerequisites to support and guide young people digitally can also prevent young people from participating.


Lack of parental support

A lack of digital competence among parents may hinder youth participation. Some parents are hostile to all things digital, while others lack basic knowledge of digital functions and tools.


Negative social control

Parents, siblings, boyfriends, extended family and other carers can exercise negative social control, which can prevent young people from being social online and using digital services on an equal basis with their peers. This includes overprotective parents who monitor their children online, checking logs and location, and in some instances requiring passwords and access to their online accounts.


Living conditions

Living conditions also determine whether young people can use public services and live an everyday digital youth life.

Facebook and Norwegian voluntary organizations

The Norwegian Institute for Social Research has studied the consequences of Facebook presence for voluntary organizations. 

Download the study.

Four main findings:


Changing communications

Social media in general, and Facebook in particular, have changed how voluntary organizations communicate with their stakeholders. By offering low-cost tools for communication and coordination, platforms such as Facebook may substitute for coordination through hierarchical organization.


Size may discriminate

Looking at the factors that influence the adoption of Facebook, the only discriminating factor explaining non-adoption is the organization’s size, measured in terms of operating costs.


Comprehensive implementation

Usually, the organizational barriers to adopting social media are lack of expertise and resources, inappropriateness for the target population, and internal institutional policies. In Norway, most of these barriers, especially those related to resources and capacity, no longer apply, given the pervasiveness of Facebook.


The battle for attention

The main challenge for voluntary organizations is no longer the adoption of these means of communication but the battle for attention and reach.

No such thing as an unbiased AI system

Researchers at the University of Cambridge have published a critical study of artificial intelligence's impact on objectivity in recruitment.

Download the study.

Four main findings:


Race and gender

Attempts to remove information about gender and race from AI systems often misunderstand what gender and race are, casting them as isolatable features rather than broader aspects of identity.


Diversity and discrimination

Outsourcing diversity work to AI-powered hiring tools may unintentionally entrench cultures of inequality and discrimination by failing to address systemic problems within organizations.


Ideal candidates

Recruitment AI tools try to produce the ‘ideal candidate’, identified by constructing associations between words and people’s bodies. This may discriminate against candidates who would be more valuable and useful to an organization.


Historical inequality

AI hiring tools’ supposedly neutral assessment of candidates’ traits misrepresents the need for change. Specifically, the historical demands of the job market are hardwired into AI-powered hiring tools, perpetuating historical legacies rather than assessing candidates neutrally.

Recommender systems may manipulate users

Researchers from Berkeley and MIT show how recommender systems influence users.

Download the report.

Three main findings:


Represents values

By deploying a specific recommender system, companies also choose to induce specific internal states and attitudes in users regarding the available content.


Incentive for manipulation

Systems trained via long-horizon optimization have direct incentives to manipulate users, for example by shifting their preferences so they are easier to satisfy.


Human interference needed

The researchers argue that recommender systems need system designers who continuously evaluate the preferences the system induces and estimate the shifts it may cause. This makes it possible to implement the necessary changes to the systems.
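The manipulation incentive described above can be sketched in a few lines. This is a minimal simulation of our own, not code from the paper: it assumes a two-topic user whose preferences drift toward whatever is recommended, with topic 1 arbitrarily made more "malleable". A myopic policy serves current preferences; a long-horizon policy sacrifices early engagement to reshape the user.

```python
# Toy illustration of the manipulation incentive: a recommender rewarded over a
# long horizon can profit from shifting user preferences. The two-topic model,
# drift rate, and preference caps are illustrative assumptions.

def simulate(policy, steps=20, alpha=0.3):
    """Run a recommendation loop; return (total engagement, final preferences)."""
    pref = [0.6, 0.4]    # the user starts out preferring topic 0
    caps = [0.6, 0.95]   # assumption: topic 1 is more malleable / easier to satisfy
    total = 0.0
    for _ in range(steps):
        item = policy(pref)
        total += pref[item]                               # engagement this step
        pref[item] += alpha * (caps[item] - pref[item])   # exposure shifts preference
    return total, pref

def myopic(pref):
    """Greedy policy: recommend whatever the user currently prefers."""
    return 0 if pref[0] >= pref[1] else 1

def long_horizon(pref):
    """Long-horizon policy: keep pushing the malleable topic to reshape the user."""
    return 1

r_myopic, _ = simulate(myopic)              # flat 0.6 engagement per step
r_long, pref_after = simulate(long_horizon) # lower at first, higher in total
```

In this toy model the long-horizon policy earns roughly 40 percent more cumulative engagement over 20 steps, while pushing the user's preference for topic 1 from 0.4 to above 0.9: the system "wins" by changing the user, which is precisely the incentive the researchers warn about.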


© 2021 Tinius Trust. All rights reserved.