NZ’s social media crackdown
New Zealand is poised to follow France and Australia down a regulatory cul-de-sac – one paved with good intentions and riddled with structural failure. The proposed Social Media (Age-Restricted Users) Bill, which seeks to bar under-16s from accessing social media without verified age checks, is not just ill-considered; it is a textbook example of enforcement-first governance that will generate new harms, empower criminal actors and criminalise the very youth it claims to protect.
Of course, the problem is real: social media platforms are designed to be addictive, manipulative and often hostile to adolescent development. But the solution – mandatory age verification via government-certified tools – is a blunt instrument wielded without strategic foresight. It assumes that technical enforcement will suppress demand. It won’t: it will redirect it, and there are many willing to take advantage of the opening – I’d say almost every gang in NZ.
France’s model, which New Zealand now seeks to emulate, mandates that platforms like TikTok and Instagram verify users’ ages using third-party tools. Parental consent is required for under-15s, and fines of up to one per cent of global revenue are threatened for non-compliance. New Zealand’s version is slightly softer in tone but no less misguided. Platforms must take “reasonable steps” to prevent under-16s from signing up, with penalties reaching NZ$2 million. The Office of the Privacy Commissioner has already flagged the obvious: this will require large-scale data collection and poses serious privacy risks. But the deeper problem is not just privacy – it’s the creation of a black market.
Once access is gated by ID, a bypass economy emerges. Adolescents – driven by social pressure, curiosity or desperation – will seek out fake credentials. Organised gangs will respond with scalable forgery operations, social engineering schemes and manipulative ‘verification services’ that harvest and store personal data. These minors, once onboarded via black-market IDs, become curated assets. Their fake identities are logged. Their behavioural data is tracked. Their real identities may be inferred or extracted over time. And then the long-tail exploitation begins.
This is not speculative. It is structurally inevitable – in fact it already happens through many ‘gaming websites’ – are you going to police those as well? The moment you criminalise access without addressing underlying demand, you create victims. Teens manipulated into buying fake IDs may be prosecuted for fraud or identity misuse. They may be saddled with police records for behaviour that is socially driven, not malicious. Worse, they may be extorted – ‘Pay or we expose your fake ID use’ – or stripped of assets because their credentials were harvested by the very actors who helped them bypass the system.
This is not protection. It is entrapment.
The cost burden compounds the failure. Even if platforms outsource verification to third-party tools, they still absorb licensing fees, backend integration costs and compliance audits. These costs will be passed on – either to users via monetisation shifts or to advertisers via intensified data extraction. Meanwhile, police involvement remains minimal unless criminal misuse triggers investigation. So, the state gets to claim protective intent without allocating enforcement resources, while platforms absorb the liability and youth absorb the risk.
This is governance by optics. It creates the illusion of action while externalising harm – and, in fact, expanding the harm.
New Zealand’s digital infrastructure is not equipped to handle this. Nor is its civic culture prepared for the fallout. The bill mirrors Australia’s approach without sufficient adaptation to local context. It treats access as a binary – legal or illegal – ignoring the spectrum of social need, especially for youth navigating identity, belonging and peer pressure.
It offers no restorative pathways, no educational alternatives and no federated trust systems. Just control.
And the control is brittle. It relies on the assumption that adolescents will comply, that platforms will self-police and that criminal actors will be deterred by regulation. None of these assumptions hold. In reality, enforcement will be patchy, circumvention will be rampant and the most vulnerable users – those already at risk of social isolation or exploitation – will be pushed further into informal, unsafe or alternative digital spaces (I have already mentioned gaming as an example but there are many, many others and some will be created in response to such legislation). The bill does not mitigate harm: it redistributes it.
The international context reinforces the critique. France’s model has already triggered concerns about surveillance creep and exclusion of undocumented youth. Australia’s approach has been criticised for its vagueness and lack of enforceability. New Zealand, rather than learning from these missteps, appears determined to replicate them. This is not leadership: it is mimicry.
This is growth-era logic masquerading as protection. It prioritises enforcement over civic capacity. It criminalises vulnerability. And it incentivises data hoarding by bad actors, while regulators remain blind to the informal ecosystems they’ve catalysed.
There is a better way. Platforms could adopt ambient automation – non-invasive, behaviour-based moderation that guides youth engagement without ID mandates. Civic-first digital ecosystems could be designed to support federated moderation, restorative justice and contextual nuance. But that requires imagination. And imagination is precisely what this bill lacks (sometimes I shake my head).
New Zealand has an opportunity to lead – not by copying France’s punitive model, but by rejecting it. By recognising that enforcement-first governance is not just ineffective – it is actively harmful. We should design infrastructure that protects without criminalising, moderates without surveilling and engages without coercing.
The current bill does none of that. It is a failure of understanding (of the system), a strategic failure and a failure of imagination. Unless it is radically rethought, it will create more problems than it solves.
Dr Michael John Schmidt left NZ after completing postgraduate studies at Otago University (BSc, MSc) in molecular biology, virology and immunology to work in research on human genetics in Australia. Since returning to NZ, he has worked in business development for biotech and pharmacy retail companies and became a member of the NZ Institute of Directors. This article was first published HERE
