Assessing the privacy implications of new tech

Issue: Volume 103, Number 7

Posted: 6 June 2024
Reference #: 1HAgsq

Privacy Commissioner Michael Webster explores how kura and schools can protect student privacy when introducing a new system or technology.


Kura and schools should have existing systems in place to protect students’ personal information.

But what can be done to protect students’ privacy when a change is being suggested to a process or system? Education Gazette explores this question with the Office of the Privacy Commissioner (OPC).

Protecting privacy

Privacy Commissioner Michael Webster says that when school leaders are considering changing how personal information is collected, implementing new technology, or adding new software, they should know how to do so in a privacy-protective way and meet their obligations under the Privacy Act.

The OPC recommends factoring a privacy impact assessment (PIA) into existing processes as one way to achieve this.

“You don’t need to be a tech genius or privacy expert to do this as we’ve developed some tools and documents to help you succeed,” says Michael.

“It’s quite natural to think about the possibilities a new tool or tech can offer, especially if you’re at the end of a sales call, but as a teacher, or school staff member, you also need to dig further and really understand what it might mean for the children you look after.”

Privacy assessments

Doing a PIA will help kura and schools assess any new system, identify the privacy risks it presents, and work out the mitigations needed to protect privacy.

“A PIA will also help you understand whether you are meeting your obligations under the Privacy Act. You just need to follow some basic steps to help you assess and address privacy risks,” explains Michael.

Schools and kura can also refer to Safer Technologies for Schools (ST4S), an initiative recommended by the Ministry of Education to help take some of the guesswork out of choosing technology for schools and kura.

ST4S provides a catalogue of reports on commonly used school software, which have been assessed against a comprehensive set of security and privacy standards. The reports identify any risks associated with requirements that are not fully met and advise on how these might be mitigated.

A good example of new technology that is increasingly being used is artificial intelligence (AI). Michael says it can bring significant benefits, but it also introduces new risks that need to be managed to minimise privacy harm.

“It might be tempting to use AI for jobs like writing school report comments, and some schools are doing that. But let me caution that when you’re using AI (or AI systems like report writers) you still need to be complying with the Privacy Act.”

Policy and processes

There are several things kura and schools can do to make sure they are only collecting, using, and sharing personal information in a safe and transparent way. 

“They should make sure their basic privacy policy and processes are firmly in place. That will set them up to think about privacy issues with their school community,” says Michael.

In the case of AI, kura and schools should let parents know what they are planning to do, then give them an opportunity to talk through potential benefits and concerns. If a school decides to go ahead, it could let parents who remain uncomfortable opt out of using AI for their children’s reports.

“Thinking about privacy is vital for using any new tool or technology well, so before making a change, make sure you are going through the correct processes to limit any privacy risk to children and young people.”

Social media and privacy

Concern about social media use was one of three key themes raised by experts in an OPC survey on children and young people’s privacy in New Zealand.

The three key themes that emerged from the survey were:

  • social media is a major concern, and a combination of guidance and regulatory changes is needed to manage this risk to children’s privacy
  • more guidance is needed to help professionals, parents and children better understand privacy risks
  • some regulatory changes could better protect children’s privacy, including changing the Privacy Act to include a right to be forgotten, introducing a requirement to consider the best interests of the child, or creating a code of practice.

The OPC is working to develop privacy guidance for people who work with children and young people, including professionals in the education sector. Further engagement and next steps will be announced in late 2024.

For teachers and school leaders interested in finding out more, see details of the project at privacy.org.nz.

Privacy Week

The Office of the Privacy Commissioner marks Privacy Week each year to promote privacy awareness, inform people of their rights under the Privacy Act, and help educate agencies about their responsibilities. This year it was marked from 13 to 17 May and included presentations on protecting children in the digital age, debunking the myth that young people don’t care about privacy, and safeguarding children and young people’s privacy.

Learn more at privacy.org.nz/privacy-week-2024.

More information about the OPC’s privacy impact assessments can be found at privacy.org.nz.

For more guidance on artificial intelligence and ST4S from the Ministry of Education, visit education.govt.nz/school/digital-technology.

BY Education Gazette editors
Education Gazette | Tukutuku Kōrero, reporter@edgazette.govt.nz
