
New plan for internet safety released

The Department of Internal Affairs (DIA) wants to make social media subject to regulation similar to that of traditional media platforms.

It has today released a discussion document for consultation which aims to make it less common for people to see harmful and illegal content.

The department said different standards and rules currently applied to different platforms, and that the regulations were decades old and predated social media.

The proposed reforms would bring them all into one framework, to be overseen by a new independent regulator. Its main focus would remain on the riskiest material, such as harm to children and the promotion of terrorism and violent extremism.

DIA said freedom of expression would be protected, and there would be no powers over editorial decision-making or individual users who shared legal content.

DIA said the existing child and consumer protections were not as strong as they should be, were difficult to navigate and had significant gaps.

People had to figure out which of five industry complaint bodies to go to if they felt content was unsafe or breached conditions, and not all forms of content were covered by those bodies.

It said the status quo was slow and reactive. Authorities could only take action after people had already been harmed. Because of this, New Zealanders were being exposed to more harmful content than ever before, it said.

DIA said the proposed new system would apply to social media some of the accountability mechanisms that traditional media were already subject to.

Platforms over a certain size (online or traditional media) would be required by law to comply with codes of practice to manage content and address complaints about specific harmful content.

"The system would keep powers of censorship for the most extreme types of harmful content," DIA said.

This included child sexual exploitation material and the promotion of terrorism ('objectionable material'), and the department was not proposing to change the definition of what was illegal.

"This material is already illegal, and it will remain illegal to produce, publish, possess, and share."

But other illegal content, such as harassment or threats to injure or kill, could be taken less seriously or even amplified online. The regulator would have powers to require illegal material to be removed quickly.

How it would work

Parliament would set expectations that platforms must achieve; there would be codes of practice, with more detailed minimum expectations. A new independent regulator would be responsible for approving the codes and making sure platforms complied with them.

"Government would only be involved with individual pieces of content if they are, or could be, illegal - this power already exists.

"We want to create safer platforms while preserving essential rights like freedom of expression, freedom of the press, and the benefits of media platforms.

"This is why public feedback on the review is essential."

Sector or industry organisations would help come up with enforceable codes of practice. DIA said the proposal was a deliberate shift away from the status quo of regulating content, towards regulating platforms.

Traditional media vs social media

The existing system has processes in place to ensure that television and radio broadcasters, and other traditional media, comply with existing codes.

These are a mixture of government and industry-led regulations. Social media does not have similar compliance requirements in New Zealand.

The proposed new regulator would make sure social media platforms followed codes to keep people safe. Media services like TV and radio broadcasters would also need to follow new codes tailored to their industries.

Child protection and consumer safety not effective - DIA

The DIA report released today said it had heard widespread concerns about the harm some content was causing children and young people. Many of these concerns were about social media and other online platforms, but it had also heard concerns about other types of platforms, such as broadcasters.

Risky content included:

  • age-inappropriate material
  • bullying and harassment
  • promotion of self-harming behaviours
  • instances of harmful content on mainstream social media sites, such as influencers promoting dangerous disordered eating to teenage girls

How would the codes of practice work?

Codes of practice would cover:

  • processes for platforms to remove content and reduce the distribution of unsafe content;
  • accessible processes for consumer complaints for particular content;
  • support for consumers to make informed choices about potentially harmful content;
  • how platforms would report on these measures, including transparency reporting; and
  • how they were reducing the harm caused by content, and their performance against the codes.

The codes would be developed by industry groups, with support from the regulator.

Who currently deals with harmful content?

Current legislation covering content comprises the Films, Videos, and Publications Classification Act 1993 and the Broadcasting Act 1989. Other parts of the wider media industry rely on voluntary compliance, such as the New Zealand Media Council and Netsafe's Code of Practice for Online Safety and Harms.

There are also a number of laws covering the online space, including the Harmful Digital Communications Act 2015 and the Unsolicited Electronic Messages Act 2007 (spam).

Organisations like DIA, police, the Broadcasting Standards Authority, the Media Council, the Classification Office, and the Advertising Standards Authority have responsibility for various parts of the media and digital environment.

Approach similar to overseas developments

Under the proposal, there would also be investment in education to build greater awareness and understanding of the risks associated with making, watching, and sharing harmful content.

DIA said targeted engagement with media groups, government agencies and specialist interest groups took place in 2021 and 2022. It said the approach aligned with what was being developed in other countries.

Consultation on the proposals runs until the end of July.