
Let us protect our children not by coercion, but by education, ethics, and example.

Rock The Vote NZ Party Submission to the Inquiry into the harm young New Zealanders encounter online

Submitted on the 30th of July 2025

 

Summary

Rock The Vote NZ Party presents this submission in recognition of the significant and growing harms facing young New Zealanders in the digital sphere. The modern internet has transformed into an economy of attention, where content is engineered not for education or inspiration, but to capture and monetise the user’s gaze. This reality is particularly dangerous for children and adolescents, whose developmental vulnerabilities are directly exploited through addictive design, algorithmic targeting, and increasingly AI-generated content.


The submission begins by outlining the battle for attention that defines digital engagement in 2025. Platforms such as YouTube prioritise overstimulating and psychologically manipulative content, including shows like Cocomelon, which imitate the addictive mechanics of gambling interfaces. This saturation of engineered content reshapes how young people think, feel, and relate to the world, often impairing critical thinking and interpersonal development.


The second section highlights exploitative microtransactions and financial manipulation, where children are enticed into spending through hidden costs, in-game currencies, and “surprise mechanics” like loot boxes. These mimic gambling in structure and psychology while escaping its regulation. Academic studies show that such monetisation systems are correlated with increased psychological distress and create pathways into habitual spending behaviours from a young age.


The final section challenges the prevailing governmental impulse toward digital surveillance, centralised identification systems, and coercive control. RTVNZ contends that these systems are not only ineffective due to circumvention via VPNs and decentralised platforms but also dangerous in that they erode privacy, alienate users, and inadvertently funnel them into unregulated and potentially predatory digital environments.


Instead, we advocate for a principled, proportionate approach that respects both freedom and safety. The role of Government must be to empower families and inform citizens, not to impose technocratic control. This means equipping parents, teachers, and children with the tools to navigate digital life critically and safely.


In all cases, our framework remains simple: safeguard the minds of our young people, protect their autonomy, and build a digital society that uplifts rather than exploits.


 

The Internet – A Battle for Attention

RTVNZ will cut to the heart of the matter and explain the core logic that corporations and commercial actors follow in the modern age.  The modern internet, by which we mean the internet of 2025, shaped by the rise of generative AI, has become more of a battle for the viewer’s attention than it has ever been.  Every platform and every content producer competes to capture and hold the user’s gaze and keep them behind a screen.  Nowhere is this battle more intense than in the realm of online entertainment consumed by children and adolescents.  With the advent of AI, the very nature of online content has been transformed.  Recent studies have estimated that as much as 90% of online content will be AI-generated by 2026[1], a harrowing prospect for the average user who wants genuine content.  This saturation of content not only distorts the authenticity of information but also overwhelms users, adults and young people alike, who have not had the experience or education to differentiate between real and artificial experiences.


Children’s online engagement is not passive.  This AI-generated content has but one objective: to capture your attention so that you view it.  Platforms actively design addictive feedback loops that leverage psychological vulnerabilities. The American Psychological Association has warned that features like infinite scrolling and autoplay trigger "dopamine loops," making it difficult for young users to disengage. And of course, every view flatters the commercial metrics: viewership secures advertising and sponsorship deals, and platforms can place whatever ads they like before and after the content.  While adults are also targets, children are easier to manipulate; to these companies it is all a game of funnelling users to where they want them to go, using whatever legal means are at their disposal.


While AI-generated content has made this worse, the problem did not start with AI; many companies have long researched and refined ways to reach your child and get them hooked.  A clear example of this is the global popularity of shows like Cocomelon on YouTube. While marketed as educational, studies have shown that such programming employs overstimulating audio-visual patterns (rapid scene changes, high-pitched voices, bright colours) which can dysregulate children's attention spans and mirror the addictive design seen in gambling interfaces[2].  Reports have linked excessive exposure to Cocomelon and similar content to behavioural issues, reduced cognitive flexibility, and dependency-like symptoms in toddlers and pre-schoolers. The algorithms behind platforms like YouTube are finely tuned to promote such content repeatedly, ensuring that young viewers are kept in a continuous cycle of consumption.


This must be treated as a core harm to our young people, because it affects the development of our children.  Parents must not let the screen be the nanny.  The commodification of children’s attention will reshape their entire lives, from their critical thinking development to their family dynamics.  Our children must yearn for the finer things in life, not for the screen.  Parents must be involved and educated so they can get children off the screen at a young age; they must be involved in their lives. Shows for children should be genuine and there to empower the child, to inspire and grow them.  The current state of affairs is a violation of the trust we have placed in the free market, and ultimately these companies must be taken to task and ethical guardrails put in place.

 

Exploitative Microtransactions and Financial Manipulation

One of the most underacknowledged yet deeply consequential harms to young people online is the rise of exploitative microtransactions within games and digital platforms.  Since the purchase widely cited as the first microtransaction, the downloadable horse armour in The Elder Scrolls IV: Oblivion, the practice has grown steadily worse.  For this inquiry, we will focus on the most harmful of the lot: engineered experiences designed to extract money from their users, in particular the younger generation, by leveraging behavioural psychology, impulse systems and, in many cases, simulated gambling, or as the industry calls it, “surprise mechanics”.


Loot boxes, gacha mechanics, and randomised digital rewards blur the line between gaming and gambling. Gacha games feature items that can only be obtained through these randomised systems. These features often replicate the key emotional dynamics of poker machines: variable reward schedules, dopamine-reinforcing anticipation, and false perceptions of near-misses. Unlike traditional gambling, however, they are embedded in games rated for children as young as 10 or 12.  Examples of loot box mechanics abound, from Counter-Strike 2 and Team Fortress 2 to Overwatch and League of Legends; we could spend an entire submission listing them all.  Many of these systems are paired with cosmetic incentives, such as skins, characters, and emotes, that are highly valued in adolescent social environments.  While commentary in the media that young people consume sometimes pushes back against microtransactions, the psychological hooks remain highly enticing.


The tactics used to achieve this fall under a broader category of “dark patterns”—user interface designs meant to trick or coerce users into taking actions against their best interests. In the gaming context, this includes:

  • Intentionally vague in-game currencies (e.g. “gems” or “credits”) that obscure real-money costs; these are usually unattainable through gameplay and must be bought.

  • Pay-to-progress models where grind levels are adjusted to incentivise purchases.

  • “Confirmation bias traps” that pressure users into recurring purchases after one initial spend.  The initial spend is usually a low-cost, seemingly high-value offer, starting from as low as $1, to get someone used to spending money and to normalise it.
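The financial pull of these randomised systems can be made concrete with a little arithmetic. The sketch below is purely illustrative and uses hypothetical figures (a 1% drop rate and a NZ$3.50 box price; real titles rarely publish their odds): it simulates how much a player spends, on average, before a rare item finally drops. Because the drop is a repeated independent chance, the average number of boxes needed is 1/p, so a 1% item costs roughly 100 boxes’ worth of real money.

```python
import random

def boxes_until_drop(p, rng):
    """Open loot boxes until the rare item drops; return how many were opened."""
    n = 0
    while True:
        n += 1
        if rng.random() < p:
            return n

def average_spend(p, price, trials=20_000, seed=0):
    """Estimate the average real-money cost to obtain one rare item at drop rate p."""
    rng = random.Random(seed)
    total_boxes = sum(boxes_until_drop(p, rng) for _ in range(trials))
    return price * total_boxes / trials

if __name__ == "__main__":
    # Hypothetical figures: 1% drop rate, NZ$3.50 per box.
    # The geometric-distribution mean is 1/p = 100 boxes, i.e. about NZ$350.
    print(f"Average cost to obtain the item: NZ${average_spend(0.01, 3.50):.2f}")
```

The point of the sketch is not the exact number but the asymmetry: the per-box price looks trivial, while the expected total cost of the advertised item is two orders of magnitude larger, and the variable schedule hides that gap from the player.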

 

Academic research supports these concerns. A 2022 Scientific Reports study found that purchasers of loot boxes had a 1.87× higher risk of severe psychological distress[3], independent of age, gender, or gambling symptoms. The same individuals showed no comparable mental-health risk from non-randomised game purchases.  This is evidence of harm to our children, and it must be taken into account by the committee.


It is not acceptable for games to be rated suitable for 12-year-olds while embedding mechanics that mirror those used in casinos. The industry has, for too long, relied on self-regulation and vague disclaimers. We believe New Zealand should adopt a principled stance: if it looks like gambling and feels like gambling, it must be rated as such.


Children deserve games that entertain, inspire, and challenge them, not software designed to harvest their attention and money. We must draw a clear ethical boundary around monetisation practices targeting our youth. This is not about banning games or limiting creativity; it is about ensuring that the commercial interests of developers never override the developmental needs of our young people.


The Role the Government Should Play

When faced with the complex reality of online harm to young people, the instinct of Governments worldwide has been to reach for control. History and technological evolution have consistently demonstrated that there are always alternatives. Attempts to regulate the internet through surveillance, digital identification systems, and platform gatekeeping, such as those currently pursued and in some cases already enacted in the United Kingdom, Australia, and the European Union, often fail to account for the adaptive, decentralised, and borderless nature of the internet itself.


RTVNZ holds that the role of Government in this space must not be to dominate, but to educate, empower, and inform. The idea that digital IDs and facial recognition systems will protect children from online harm is fundamentally flawed, not least because the very tools designed to shield them can be easily circumvented.  As shown by the explosion in VPN usage in response to the UK’s Online Safety Act, which in some cases surged by as much as 1,800% in a single day[4], adults and children alike will seek and find workarounds, entirely legal ones, to get what they desire.  These tools are no longer confined to the “tech-savvy”; VPNs are now bundled with mainstream browsers like Brave and Opera, and decentralised platforms are becoming increasingly accessible.


The danger lies not merely in the futility of such control-based systems, but in their consequences. When young users are alienated by surveillance and constant ID checks, they do not disappear from the internet. Instead, they migrate to unregulated, often darker, corners of the digital world. These are not merely forums for offensive slurs or crude language, but spaces where predators, disinformation, and harmful content flourish beyond the reach of public accountability.  Attempting to “declare war” on the internet by further outlawing these tools is a losing battle that risks driving vulnerable users away from legitimate and lawful places and into these rabbit holes.  We urge the Government to avoid mirroring authoritarian models of digital surveillance and instead uphold democratic values of freedom and trust.


Instead of trying to become digital parents, Governments should not only let kids be kids but also let parents be parents.  Authorities should support actual parents and guardians in navigating this new terrain. The focus must be on fostering digital literacy, promoting mental health awareness, and equipping families with the knowledge and tools to critically assess online content. In parallel, investment should be made in school-based programmes that teach media discernment, privacy awareness, and resilience to online manipulation.


The role of the state must include honest communication and a foundational trust in its citizens, rather than treating the public as potential wrongdoers by default.  Scare tactics and state paternalism only distance young people further.  RTVNZ holds empathy as one of its key principles: understanding what people actually want and need.  Governments must level with citizens: harmful content exists, but so too do the tools to counter it, and these tools begin with education, not control.  The misconduct of a minority should never justify imposing restrictions on the law-abiding majority. We do not believe in reactive policy, only correct policy.


The idea of protecting our children can seemingly compel people to give up anything and everything, and to justify almost any action.  Digital identity schemes may appear well-intentioned, but their technical, ethical, and practical implications ultimately make them unviable.  The United Kingdom’s Online Safety Act has sparked significant backlash and circumvention efforts, while Australia’s push to require age verification even for search engines[5] risks driving users toward alternative, unregulated platforms.


Let us not follow the path of authoritarianism cloaked in child protection.  The notion that those who value privacy are indifferent to children’s safety must be firmly rejected.  We do care about the safety of our young people, but it must be guided with integrity, proportionality and ultimately trust.  Above all, a democratic Government must place trust in its citizens as the foundation of any effective and respectful online safety policy.  The Government’s role is not to control the internet, but to equip our people, especially our young, with the capacity to navigate it wisely.


Recommendations

Rock The Vote NZ calls on the Committee to adopt a rights-respecting, evidence-based, and proportionate framework for addressing online harm to young New Zealanders. The principle here is to hold companies to account and to place trust in our citizens, not to restrict their freedoms. Our key recommendations are:


  1. Mandatory Odds Disclosure for Digital Purchases: Any platform or game offering randomised rewards (e.g. loot boxes) must be required to disclose the odds and mechanics involved. If it resembles gambling, it must be classified and regulated as such.

  2. Ban Dark Patterns in Children's Apps and Games: Prohibit manipulative interface designs such as obfuscated pricing, forced continuity, or pay-to-win models targeted at minors.

  3. Reject Digital ID Schemes and Stand Against the Path Other Countries Are Taking: Avoid implementing facial recognition, ID upload systems, or biometric age verification. These measures are easily circumvented, often disproportionately intrusive, and risk pushing young users into unsafe corners of the internet.

  4. Invest in Digital Literacy and Parental Support: Fund education programmes that teach children media discernment, digital ethics, and financial awareness. Keep up with the latest trends that young people follow. Provide parents with resources that allow them to engage meaningfully in their children’s digital lives.

  5. Support, Do Not Take Over: The Government’s role is to support families, not replace them. Rather than paternalistic control, empower a generation of informed, resilient, and critically thinking digital citizens.
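As an illustration of Recommendation 1, a mandatory odds disclosure need not be burdensome: it could be as simple as a machine-readable table that regulators, storefronts, or parents can verify automatically. The sketch below is hypothetical (all names, prices, and drop rates are invented for the example) and shows the minimum sanity check a disclosure regime would apply, namely that the published drop rates account for every possible outcome.

```python
# A hypothetical machine-readable odds disclosure for one loot box.
LOOT_BOX_DISCLOSURE = {
    "box": "Starter Crate",          # invented name
    "price_nzd": 3.50,               # invented price
    "drop_rates": {                  # invented odds; must cover all outcomes
        "common skin": 0.79,
        "rare skin": 0.15,
        "epic skin": 0.05,
        "legendary skin": 0.01,
    },
}

def validate_disclosure(disclosure):
    """Reject disclosures whose published drop rates do not sum to 100%."""
    total = sum(disclosure["drop_rates"].values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"drop rates sum to {total:.4f}, not 1.0")
    return True
```

A format along these lines would let the odds mandated by Recommendation 1 be published once and checked everywhere, rather than buried in fine print inside each game.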


Let us protect our children not by coercion, but by education, ethics, and example.


Rock The Vote NZ


[2] “Cocomelon - The Most Evil Channel On YouTube”, https://www.youtube.com/watch?v=YEFptHp0AmM

 
 
 
