Digital Rights & Technology · 16 April 2026 · 10 min read

Government Calls Social Media Companies to Downing Street Over Child Safety Concerns

The Prime Minister and Technology Secretary summoned executives from Meta, Snap, Google, TikTok and X as part of a concerted push to tighten online protections for children, with new legislative powers already in place that allow ministers to act quickly.

✍️ By UKPoliticsDecoded Editorial Team
[Image: Social media platforms and Downing Street, children's online safety meeting, April 2026]

Senior executives from Meta, Snap, Google (YouTube), TikTok and X were brought to Downing Street on 15 April 2026 for a meeting focused on children's online safety and platform responsibility. The government framed the meeting as a direct intervention at a time when concerns about harmful content, addictive design features and unregulated AI tools are growing.

The meeting coincides with the midpoint of the government's consultation Growing Up in the Online World, which has already received over 45,000 responses from parents, young people, schools and civil society groups. Ministers are using this moment to increase pressure on platforms ahead of potential regulatory changes later in the year.

Key Points at a Glance

  • Executives from Meta, Snap, Google, TikTok and X were summoned to Downing Street on 15 April 2026
  • The government's consultation Growing Up in the Online World has received over 45,000 responses and remains open until 26 May 2026
  • Almost 6,000 young people have participated directly in the consultation
  • The government has already legislated for rapid response powers to introduce new rules within months
  • Potential measures include a minimum age for social media, restrictions on addictive design features, and stronger AI safeguards for children
  • The Prime Minister has stated that action is no longer a question of if, but of how and how quickly

Why the Government Intervened

Ministers say they have heard consistent concerns from parents about the impact of social media on children's wellbeing. These include exposure to harmful or age-inappropriate content, the influence of algorithms on behaviour and self-image, challenges in managing screen time, and unclear accountability when harm occurs.

The government argues that rapid technological change, including the rise of AI-driven features, means existing protections are no longer sufficient. The Prime Minister has stated that the question is no longer whether the government will act, but how and how quickly.

What Powers the Government Has Already Taken

Ahead of the consultation's conclusion, the government has legislated to give itself rapid response powers. These are intended to allow ministers to introduce new rules without long delays, respond to emerging risks as technology evolves, and implement protections "within months, not years".

This is designed to avoid the long implementation gaps that have historically followed major digital regulation efforts. The government says these powers will allow it to move quickly once the consultation's findings are finalised.

What Social Media Companies Have Done So Far

Some platforms have introduced new safety features aimed at younger users, including autoplay disabled by default for children, parent-controlled screen-time limits, curfews or time-restricted access, and expanded parental dashboards. These measures vary by platform and are at different stages of rollout.

The government acknowledges these steps but argues that they do not yet address the full range of risks identified by parents, researchers and child safety organisations.

What the Government Asked Companies at the Meeting

During the meeting, the Prime Minister and Technology Secretary set out the government's expectations for the next phase of online safety policy. They asked companies to provide clear answers on:

  • What concrete steps they are taking to reduce harm
  • How they are responding to concerns raised by parents and young people
  • How they plan to adapt to potential new rules
  • What safeguards they have in place for AI-driven features used by children

The government emphasised that platforms should prepare for regulatory change and demonstrate that they are taking responsibility for the design choices that affect young users.

The Consultation: What Is Being Considered

The consultation Growing Up in the Online World is described by the government as the most ambitious of its kind globally. It is examining a wide range of potential measures, including:

  • A minimum age for social media use
  • Restrictions on addictive design features such as infinite scroll or algorithmic nudges
  • Stronger safeguards for AI chatbots used by children
  • New duties on platforms to prevent foreseeable harm

Engagement so far includes over 45,000 responses from the public, almost 6,000 young people participating directly, and more than 80 organisations including schools, charities and community groups taking part in workshops and roundtables.

The consultation remains open until 26 May 2026, and the government is encouraging further submissions.

Why This Matters

The government frames this as a long-term societal issue. Social media plays a significant role in shaping children's identity and self-esteem, their social relationships, and their exposure to information and online communities. The government argues that without stronger protections, children may face risks that neither parents nor regulators are currently equipped to manage.

The meeting is therefore positioned as part of a broader shift toward more proactive oversight of digital platforms.

What Happens Next

Once the consultation closes, officials will analyse responses and develop policy proposals. Ministers will then use the new legislative powers to introduce measures more quickly than previous regulatory cycles. Further engagement with platforms is expected as proposals are refined. The government has signalled that it intends to move at pace, with the aim of delivering changes "within months".

A Note on the Wider Picture

While the government's current focus is on platform responsibility and regulatory powers, child safety researchers, digital literacy specialists and education bodies continue to stress that education, parental support and digital skills training remain core components of effective online safety strategies. These perspectives form part of the wider policy debate surrounding the consultation and are worth keeping in mind as proposals develop.

What Experts Say About Effective Online Safety

A wide range of child safety researchers, digital literacy specialists and education bodies have argued that supporting parents and improving digital education are among the most effective long-term ways to protect children online. They often highlight that technical restrictions alone cannot address the full spectrum of risks children face in digital environments.

Supporting Parents With Practical Skills

Experts frequently emphasise the importance of equipping parents with clear, accessible guidance on talking to children about online harms, setting boundaries around screen time, recognising early signs of harmful online behaviour, and using age-appropriate supervision tools. Many research groups describe confident, well-informed parents as one of the strongest protective factors in a child's online life.

Education on Technical Protections at Home

Digital literacy specialists often highlight the value of teaching families how to use router-level filters to block adult content, safe search and device-level restrictions, and privacy and security settings on apps and platforms. These measures apply across devices and can be tailored to a child's age and maturity.
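As a concrete illustration of the kind of household-level filtering experts describe, the sketch below shows one way to apply family-friendly DNS filtering on a Linux machine using systemd-resolved. This is an example only, not an endorsement of any provider: it assumes Cloudflare's publicly documented "1.1.1.1 for Families" resolvers (1.1.1.3 and 1.0.0.3), which block known malware and adult-content domains at the DNS level. Router admin pages differ by vendor, but the same resolver addresses can usually be entered in a router's DHCP/DNS settings so the filter covers every device on the home network.

```shell
# Illustrative config fragment only — run on a Linux system with
# systemd-resolved; adapt for your own router or operating system.

# Point this machine at Cloudflare's family-filtering resolvers
# (1.1.1.3 / 1.0.0.3 block known malware and adult-content domains).
sudo mkdir -p /etc/systemd/resolved.conf.d
sudo tee /etc/systemd/resolved.conf.d/family-dns.conf <<'EOF'
[Resolve]
DNS=1.1.1.3 1.0.0.3
EOF
sudo systemctl restart systemd-resolved
```

Google also documents a mechanism for enforcing SafeSearch network-wide by mapping its search domains to forcesafesearch.google.com, which many router filtering products use under the hood; the details vary by router, so families are generally advised to consult their device's own documentation.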

Digital Literacy and Online Safety in Schools

Education bodies and child protection organisations frequently argue that schools play a central role in preparing children for digital life. This includes teaching children to recognise harmful content, understand how algorithms and recommendations work, identify manipulation, scams and impersonation, develop critical thinking and resilience skills, and learn secure behaviour such as strong passwords and privacy awareness. Experts often describe this as essential for helping children navigate online spaces independently as they grow older.

Summary

This intervention signals a move toward faster, more assertive digital regulation, with a focus on design level risks rather than just content moderation. It reflects increasing expectations on platforms to demonstrate responsibility and transparency, and a policy agenda centred on children's wellbeing and parental confidence. The Downing Street meeting is part of a coordinated effort to shape the regulatory environment ahead of the consultation's final recommendations.

AI Use: AI tools were used to support source discovery and to structure the article for clarity. All research, verification, drafting, and final editorial decisions are fully human-led.