
Brussels, Belgium – The European Union has escalated its scrutiny of social media platforms, formally accusing TikTok of violating the bloc's Digital Services Act (DSA) through "addictive design" features that reportedly endanger the mental and physical well-being of children and vulnerable adults. The preliminary finding, announced on Friday, February 6, 2026, signals a potentially transformative moment for how major tech companies operate within the EU: the Commission is demanding fundamental redesigns of the popular video-sharing application and threatening substantial financial penalties if TikTok fails to comply.
The European Commission, the EU's executive arm, concluded a two-year investigation by stating that TikTok has not adequately assessed the potential harm caused by core features such as infinite scroll, autoplay, push notifications, and its highly personalized recommender system. According to the Commission, these elements encourage compulsive use, particularly among young users, by constantly "rewarding" them with new content, effectively shifting their brains into an "autopilot mode" that diminishes self-control.
At the heart of the EU's concerns are the specific design choices TikTok has implemented, which regulators argue are intentionally crafted to maximize engagement, often at the expense of user welfare. Commission spokesperson Thomas Regnier highlighted these features, stating they "lead to the compulsive use of the app, especially for our kids, and this poses major risks to their mental health and wellbeing." The "infinite scroll" mechanism, for instance, continuously delivers new videos without requiring a conscious decision from the user to seek more content, making disengagement difficult and promoting extended viewing sessions. Similarly, autoplay features ensure a seamless, uninterrupted flow of content, while push notifications act as constant prompts, drawing users back into the application. The highly personalized recommender system, often praised for its ability to curate content to individual tastes, is simultaneously identified as a key driver of this compulsive behavior, creating a tailored stream that can be difficult to resist.
The EU's investigation found that TikTok's existing safeguards, such as daily screen time limits and parental control tools, are largely insufficient. Regulators noted that screen time alerts are often too easy to dismiss, even for younger users, and parental controls require significant effort, technical knowledge, and ongoing involvement to be effective. This inadequacy suggests that the measures currently in place do not sufficiently mitigate the risks associated with the platform's engaging, yet potentially harmful, design.
This action against TikTok is a direct application of the EU's landmark Digital Services Act (DSA), a comprehensive regulatory framework designed to mandate accountability for large online platforms and protect users across the 27-nation bloc. Enacted to ensure a safer and more transparent digital space, the DSA places significant responsibility on social media companies to identify, assess, and mitigate systemic risks posed by their services.
Henna Virkkunen, the Commission's Executive Vice-President for Tech Sovereignty, Security, and Democracy, emphasized the core principle behind the legislation: "The Digital Services Act makes platforms responsible for the effects they can have on their users." This preliminary finding is the culmination of a thorough inquiry and sends a clear message that the EU is prepared to enforce its digital rules rigorously. Should TikTok fail to address the Commission's concerns satisfactorily, it could face fines of up to 6% of its total worldwide annual turnover. Such a penalty would be a significant deterrent, underscoring how seriously the EU treats compliance with the DSA. TikTok now has the opportunity to formally defend itself against these accusations and respond to the Commission's findings.
The EU's move against TikTok is not an isolated incident but rather a prominent example of a rapidly intensifying global pushback against the unchecked influence of social media on younger generations. Governments worldwide are increasingly recognizing and responding to mounting evidence that digital platforms, with their engagement-driven algorithms, are contributing to a crisis in youth mental health.
Research consistently links excessive social media use to negative mental health outcomes in children and adolescents, including heightened levels of anxiety, depression, and poor sleep quality. A meta-analysis indicated that children spending over two hours daily on social media face a significantly higher risk of experiencing symptoms of anxiety and depression. This vulnerability is exacerbated by the critical phase of brain development occurring in children and adolescents, making them more susceptible to the neurological impacts of constant digital stimulation. Screen time has been shown to affect the prefrontal cortex, vital for self-control and decision-making, and the amygdala, which processes emotions and anxiety. Long-term exposure and incessant notifications may even lead to structural changes in the amygdala over time.
Statistics reveal the scale of the challenge: over 80% of young people in Europe use social media daily, with those aged 14 to 16 spending nearly twice as much time online as 9-to-10-year-olds. Data from the World Health Organization's Regional Office for Europe further underscores this trend, reporting a rise in problematic social media use among adolescents from 7% in 2018 to 11% in 2022. Beyond the EU, countries like Australia have already implemented social media bans for under-16s, and similar initiatives are being explored or enacted in Spain, France, and Denmark. Even the EU Parliament passed a non-legislative report outlining a potential minimum age of 16 for social media access (with parental consent for 13-to-15-year-olds) and proposing bans on features like infinite scrolling and autoplay. Reports also suggest that internal TikTok documents, uncovered in separate legal proceedings, indicate the company's awareness of its platform's addictive nature and its targeting of children.
In response to the EU's preliminary findings, TikTok has vehemently denied the accusations. The company issued a statement calling the Commission's findings "categorically false and entirely meritless," and pledged to challenge the assessment through every available means. TikTok maintains that it already provides a suite of tools designed to help users manage their time on the app, including features like sleep reminders and various well-being functionalities.
However, the European Commission is pressing for more fundamental changes. Potential modifications could involve disabling key addictive features like infinite scroll, implementing more effective "screen time breaks" (including mandatory pauses during nighttime hours), and adapting the platform's powerful recommender systems to prioritize user well-being over continuous engagement.
While no specific deadline has been set for a final decision and no penalties have yet been imposed, the ongoing investigation underscores a pivotal moment for digital regulation. This unprecedented legal challenge to a platform's core design elements signifies a robust effort by the EU to assert its authority under the DSA, potentially setting a global precedent for how technology companies are held accountable for the impact of their products, particularly on their most vulnerable users.
The coming months will reveal whether TikTok can successfully defend its current operating model or if it will be compelled to undertake significant redesigns in response to the EU's demands. The outcome of this high-stakes confrontation could redefine the landscape of social media design and user protection for children worldwide.

New York – The United Nations has issued a stark warning, revealing that an estimated 4.5 million girls are at risk of undergoing female genital mutilation (FGM) in 2026 alone. This alarming projection underscores a persistent global crisis, with millions more already living with the profound and often lifelong physical and psychological scars of the practice.

Almaty, Kazakhstan, has been officially designated as the new host city for the 10th Asian Winter Games in 2029, a pivotal announcement that shifts the highly anticipated event from Saudi Arabia's ambitious Neom project. The decision, formalized by the Olympic Council of Asia (OCA) in Milan, comes after reports of significant construction delays at Neom's planned Trojena ski resort, rendering it unprepared to host the quadrennial multi-sport event as initially scheduled.

LAHORE, Pakistan – After nearly two decades of silence, the vibrant skies of Lahore are once again set to burst into a kaleidoscope of colors as the Basant kite-flying festival makes its highly anticipated return. The Punjab government's decision to revive the cherished cultural event, scheduled for February 6-8, 2026, has ignited a wave of excitement and nostalgia across the city, though it comes with stringent safety measures designed to prevent a recurrence of past tragedies.