Question for written answer
by Mr Emmanouil FRAGKOS
to the European Commission
Subject: Tackling internet addiction
Around 52 % of Greece’s population is digitally literate (the EU average is 56 %) and 68 % of Greeks use social media (the EU average is 59 %) (1). A significant proportion of internet and social media users are susceptible to addiction, experiencing physical and mental symptoms such as sadness and fatigue, and even cyberbullying and depression.
The negative aspects of social media include hostility from other users, comparison with unrealistic ideals, strong expectations regarding friendships on different online platforms, difficulties integrating into the online environment, and sleep and work problems. Furthermore, AI appears to be driving an exponential rise in internet scams.
According to the Greek Safer Internet Centre (2019-2020), 83 % of children and young people aged 10-17 have a social media profile. As many as 70 % of children start using social media under the age of 13 (illegally), and 43 % of child social media users say they neglect their activities; this percentage rises to 58 % among young people in upper secondary school. Research has found that internet addiction affects children’s cognitive function, slows their linguistic and emotional development, causes problems with socialisation and emotional intelligence, high stress levels, sleep disturbances and isolation, and hinders the development of social skills.
In view of this:
1. How does the Commission assess the initiatives it has taken so far to reduce internet addiction?
2. What initiatives does it plan to take to reduce digital addiction?
Submitted: 2.4.2024
(1) 56 % of people in the EU have basic digital skills, https://t.ly/qO-c_.
Answer given by Mr Breton on behalf of the European Commission
(28 May 2024)
With the Digital Services Act (DSA) (2), the EU already has in place rules that oblige all online platforms to take appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, including restrictions on screen time.
For platforms with more than 45 million users in the EU (3), the DSA recognises the systemic risks related to negative effects on physical and mental health and on children’s rights. These risks may arise from the design of interfaces that exploit vulnerable users, including by causing addictive behaviour.
In November 2023, the Commission launched an inquiry into the protection of minors under the DSA by sending information requests to all very large online platforms with a significant underage user base (TikTok, YouTube, Instagram, Facebook and Snapchat), with a particular focus on effects on mental and physical health, such as addiction (4). On this basis, owing to suspected infringements of the DSA, the Commission has opened two non-compliance cases against TikTok.
The first, launched in February 2024, concerns TikTok’s algorithmic systems, which may stimulate behavioural addictions (5). The investigation is ongoing.
The second, launched in April 2024, concerns TikTok Lite and is based on the suspicion that some features of this new application may have negative effects on mental health, including addiction. The Commission communicated to TikTok its intention to suspend the relevant features in the EU pending an assessment of their safety.
As a result, TikTok unilaterally announced that it would withdraw the relevant features (6). The non-compliance case nonetheless remains open, and the investigation is ongoing.
The Commission is also working on guidelines on child protection under the DSA to make it clearer for platforms of all sizes what they are required to do to comply with it.

(2) Regulation (EU) 2022/2065, https://eur-lex.europa.eu/legal-content/EN/TXT/?toc=OJ%3AL%3A2022%3A277%3ATOC&uri=uriserv%3AOJ.L_.2022.277.01.0001.01.ENG
(3) So-called very large online platforms and search engines, or VLOPSEs.
(4) https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses
(5) https://digital-strategy.ec.europa.eu/en/news/commission-opens-formal-proceedings-against-tiktok-under-digital-services-act
(6) https://ec.europa.eu/commission/presscorner/detail/en/STATEMENT_24_2290