Protecting kids online needs to be a non-negotiable for the industry

22 September 2023

James Rose, Managing Director, Australia at Channel Factory

Some of my closest friends started their new lives as parents with lofty ambitions of keeping their kids off screens entirely, or limiting them to "educational content". I’m sure you wouldn’t be surprised to learn that most, if not all, of them were unable to stick to that plan.

I’d also hazard a guess that most people reading this grew up watching TV, and some of you even got a mobile phone in your early teens. The younger among you were digital natives who have always been online, but you probably still remember having to use the PC in the lounge room to do it.

That paradigm has shifted dramatically. It’s now far more common for kids to have access to devices like smartphones and tablets. In fact, recent data from the Australian Bureau of Statistics (ABS) shows 90% of children participate in screen-based activities. Of those, 24% spent 20 hours or more per week on screen-based activities, up from 16% in 2017-18.1

This isn’t a judgement piece about parenting; there is a lot of value kids can get from spending time with the right content online. It can be educational, informative and interactive, and digital proficiency will obviously be a necessity throughout their lives. Rather, it’s a call to arms for us as an industry to ensure they’re not being exposed to things they aren’t meant to be.

This isn’t just the responsibility of the platforms or content makers, but of advertisers as well. Where we put our ad dollars and the tools we use to maintain child safety online are vital. Getting it right is non-negotiable; the problem is that many of the existing methods are also damaging for business in ways beyond brand safety or suitability.

Including children in the privacy conversation

Privacy is undoubtedly an issue that all advertisers are keeping at the forefront of their minds. But in so many of the conversations that have been dominating the industry, the interests of children have mostly been relegated to the sidelines. 

We know that with stringent data privacy laws, advertisers can no longer rely on cookies for targeting potential customers with ads; instead, they must genuinely understand their audience's needs and deliver tailored advertisements in suitable environments. Now we also need to understand what consumers are consenting to online, secure their permission, and obtain first-party data.

But when it’s a child navigating the internet, this conversation changes. It’s hard enough for the average adult to fully understand what it means to click “accept cookies” on a webpage, so how can we expect the same of a kid?

The truth is, most of us are looking at the privacy and targeting issue from an adult mindset, with all the media literacy that comes with that. But it is not just adults occupying the internet, and all advertisers need to be more cognisant of this secondary audience when developing privacy-compliant campaigns.

Interestingly, Channel Factory’s own research showed that in a meta-analysis of over 130 campaigns, 17% of impressions on YouTube were served on kids-skewed content. In contrast, campaigns using brand suitability and contextual targeting strategies via Channel Factory only ran on kids’ content 3% of the time.2

Recent laws such as Australia's Online Safety Act 2021 have certainly ushered in a new era of accountability for online service providers, emphasising the importance of safeguarding users. Under this act, industries are compelled to establish codes to regulate illegal and restricted content unsuitable for children.3

But the online world changes so rapidly that laws often can't keep up. Even since 2021, there have been many changes to the online world and how we live and work with it. So while these laws should be the foundation of any business's approach, companies should implement a more proactive strategy to ensure they’re not falling behind on online safety.

For example, we should reconsider the effectiveness of solely relying on audience-based targeting in digital advertising. This has long been considered a cornerstone of modern marketing strategies for delivering relevant and personalised content. But recent revelations from an Adalytics report suggest that audience targeting, while precise in its intentions, can inadvertently contribute to filter bubbles and echo chambers.4

This is where users are only exposed to content that aligns with their existing beliefs and interests, reinforcing existing perspectives. 

Whilst this may not sound too harmful, over time it can lead to a lack of diversity in the information users encounter, which for children can be extremely detrimental. It reduces their exposure to new ideas and perspectives.

Exclusions, block-lists and overreaching brand safety measures

The shift towards more brand-safe and transparent practices over recent years has led to the proliferation of exclusion or block-lists. These are designed to filter out words or content deemed unsafe or inappropriate. 

While that looks good at first glance, block-lists are the trawler net of the web: they certainly catch the places we don’t want to be, but they also snare a lot of genuinely good opportunities. In short, it’s a blunt instrument that doesn’t consider content nuance and context, and can inadvertently result in the exclusion of minority groups and communities.

This poses a problem for all brands. Being brand safe is undoubtedly a key imperative for all, but the current methods businesses are using (i.e. block-lists) mean that they’re simultaneously restricting their ability to engage with a more diverse audience. It’s therefore more challenging than ever for brands to reach audiences in an inclusive and brand safe manner.

This is where the adoption of ‘Inclusion lists’ becomes crucial. An ‘Inclusion list’ is a list of sites, domains and bundle IDs that an advertiser considers safe, acceptable and trustworthy environments in which to serve ads.

Rather than solely focusing on blocking specific content, advertisers can actively curate a list of safe, educational, and appropriate platforms and content that align with their brand values. This proactive approach not only safeguards brands but also contributes to a more positive and enriching online environment for children.

As an industry, we must support and educate advertisers on using inclusion lists effectively so we don’t just create another problem to solve in a few years’ time. It’s not just about fine-tuning algorithms; it’s about reevaluating the very foundations of digital advertising.

Embracing a social conscience

Online safety should top the priority list for every platform and advertiser. Brands must identify their guiding principles - such as honesty, transparency, positivity and societal welfare - and use these to generate more meaningful engagement with their customers.

We all need to trust in the system, to a certain extent. But this is a two-way street, and businesses need to pick up the slack where current laws and tools fall short. We can't stop kids from going online, but we can make sure they have a safe and enriching journey through the web. This is possible through responsible advertising and by creating a better online world for the next generation.



1. Cultural and creative activities study, Australian Bureau of Statistics - April 2023

2. Supercharge your YouTube buying with Channel Factory - March 2023

3. The Online Safety Act 2021

4. Are YouTube ads complying with the Children, Adalytics - August 2023
