Are you a parent in the UAE? You now have a legal duty to supervise your children’s digital lives
[Editor’s Note: This article is part of Khaleej Times’ Schools and Parents, a dedicated section designed to support families in the UAE as they explore educational choices. The section offers explainers, guidance from education leaders, expert advice and insights from parents to help readers make informed decisions about schools, curricula and communities.]
When children in the UAE open a game, scroll through videos, or join a social media app, the law now expects the digital world around them to behave differently.
A new Child Digital Safety Law shifts the focus from reacting after harm occurs to preventing it before it happens — placing responsibility on platforms while clearly outlining what parents are expected to do.
For families, this raises two big questions: What exactly is my legal obligation? And how can I realistically meet it in daily life?
1. Does this law mean parents can be fined or punished for their child’s online behaviour?
No. Parents are not fined or penalised under this law for everyday parenting decisions. The legal consequences apply to platforms and service providers, not families. However, the law does create a clear expectation: parents must exercise reasonable supervision over their child’s digital use. This means staying informed, using available safety tools, and guiding children toward safer online behaviour — not monitoring every click or message.
Sarah Greenstreet, a technology and data protection lawyer at Addleshaw Goddard, explains that the law reflects a wider shift in how children are protected online.
She says, “This new law in the UAE signals a clear commitment by the government to taking children’s online safety seriously. The Child Digital Safety Law reflects a growing recognition that children need the same level of protection online as they do offline.”
2. What practical steps should parents take to meet the law’s supervision requirement?
In practical terms, reasonable supervision means knowing which apps, games and platforms your child uses, ensuring those apps are age-appropriate, using basic parental controls and privacy settings, talking to your child about online safety and behaviour, and reporting serious harmful content if you become aware of it. It does not mean constant surveillance, reading every message, or banning the internet entirely.
Security experts emphasise that supervision does not require invading a child’s privacy to be effective. Syed Aizad, a lead security researcher at Acronis TRU, highlights that systems should protect children while limiting unnecessary data exposure:
“Only the data required to deliver safety controls should be processed, and, wherever possible, it should be anonymised, encrypted, and retained for limited periods. Transparency toward parents and guardians is critical, as is giving them clear control over settings.”
3. Is giving my child access to my own account still okay?
In most cases, no. Sharing an adult account with a child is one of the biggest risks under the new law: it can bypass age protections, content filters and safety settings, and may be seen as failing to exercise reasonable supervision. Children should always have their own age-appropriate accounts, even if that means setting them up together.
For younger children, some platforms should remain under direct parental control. Pramod Kadavil Pushkaran, CEO of Techbee IT & Designs, explains why.
He says, “Messaging platforms like WhatsApp should be accessed only through a parent’s device, especially for younger children. This allows parents to control contacts, prevent communication with unknown individuals, and guide children on safe online interactions.”
4. What kind of online content is now considered “harmful”?
Harm is no longer limited to explicit or illegal material. The law defines harmful content broadly — anything that negatively affects a child’s moral, psychological or social wellbeing. This can include content that promotes unhealthy body standards, encourages excessive screen use, subtly pressures children to share personal information, or normalises risky behaviour. Importantly, harm depends on age. What may be acceptable for a 17-year-old may be inappropriate for a 7-year-old.
As Greenstreet explains, the law also targets harmful behaviours, not just content.
“One of the law’s key aims is to reduce children’s exposure to harmful digital content, as well as harmful online behaviours such as harassment, grooming, or exploitation. Platforms are expected to take proactive steps to identify and filter inappropriate content and to provide clear, accessible reporting tools.”
5. What changes will my child actually notice online?
Children may not notice one dramatic shift, but their experience should feel consistently safer. Explicit or disturbing videos should appear less often, filters on violent or adult content should be stronger, contact with unknown adults may be limited, and privacy settings should default to safer options. Teenagers will still be able to use social media and games, but with built-in safety limits rather than unrestricted access.
Behind the scenes, platforms will also be handling children’s data differently. As Greenstreet adds,
“Protecting children’s personal data is another central focus. The law places strict limits on the collection and use of personal data relating to children under the age of 13, including restrictions on targeted advertising.”
6. What is the law really asking parents to do differently?
The biggest shift is recognising that digital access is not a one-time decision — it is an ongoing responsibility. Rather than asking, “Is this app allowed?” once, parents are encouraged to review settings regularly, revisit age suitability as children grow, adjust screen time limits, and stay engaged with what their children are seeing and doing online.
Many parents worry this sounds technically overwhelming, but experts stress that most tools already exist.
Morey Haber, Chief Security Advisor at BeyondTrust, says,
“Parents have a myriad of tools to help comply with the new laws — operating system parental controls, browser-based content filtering, antivirus solutions, and even router-level filters. Regularly reviewing installed apps, setting screen time limits, and engaging in open dialogue about online risks further enhances digital safety.”
7. How can parents supervise effectively without becoming digital police?
Supervision works best when it feels supportive, not controlling. It begins with conversation rather than control — asking children what they enjoy online, who they interact with, and what makes them uncomfortable.
Rema Menon Vellat, director of Counselling Point Training and Development, says, “Some of the connections made on the internet are not always genuine, age-appropriate, or legitimate. Children and young adults can become victims of scams, forced to share information that is not age-appropriate or can be introduced and influenced by bad company.”
Experts reiterate that children are more likely to share concerns when they do not fear punishment or device confiscation. Effective supervision also means “teaching judgment”, not just rules.
8. What this law means for families
This law does not ban screens, criminalise parenting, or place impossible burdens on families. Instead, it reshapes the digital environment children grow up in and positions parents as guides, not enforcers. Safety is no longer just about reacting to harm. It is about prevention, age-appropriate design, ongoing supervision, and open communication.