Transcript
Have you wrapped your head around IPP3A yet?
Have you thought about how, when the new IPP comes into force, it might affect one’s use of generative AI tools, in circumstances where you’re asking an AI tool to supplement personal information you include in a prompt? For example, you might include a bunch of factual information in a prompt about a person’s behaviour, and then ask the AI tool to explain that behaviour, resulting in some useful content that you download from the AI tool for inclusion in a broader report you’re writing. That’s what I want to touch on briefly today – the role that IPP3A may play in this kind of situation. And I should perhaps add that, for the purposes of these reflections, I’m assuming that a privacy impact assessment has been done on use of the AI tool.
Privacy Amendment Bill
As you probably know, there is a Privacy Amendment Bill currently before Parliament which, if enacted, will insert a new IPP3A into the Privacy Act. The purpose of this new IPP is to fill a perceived gap in our privacy law as to the circumstances in which we need to disclose certain things to people when we collect their personal information. At the moment, the Privacy Act’s transparency requirements only apply when we collect personal information about individuals directly from those individuals. If we collect personal information from a third party, such as another agency or organisation, the Act does not require us to tell the individuals about that collection. On this point, our privacy law is out of step with privacy laws in some other jurisdictions, most notably the European Union and the United Kingdom, which do require disclosure of certain things to individuals when information about them is collected from another source.
As currently drafted, IPP3A will state – in essence – that if you collect personal information other than from the individual concerned, you’ll need to take any steps that are reasonable in the circumstances to inform the individual of the same kinds of things listed in the current IPP3. There are a couple of differences but for the most part the information is the same.
A key point I want to make is that IPP3A does not refer to collecting information “from another person, agency or organisation”. The precise wording (currently) is: “If an agency collects personal information about an individual other than from the individual concerned…”. To my mind, that means any other source.
You’ll need to comply with IPP3A unless an exception applies
And so, if you’re in a situation of collecting personal information about an individual from an AI tool, and regardless of whether that information is opinionative, derived, or inferred, you will need to comply with IPP3A. This means that, unless an exception applies, you’ll need to tell the individual about your collection of information from the AI tool.
An interesting consequence
Now, there’s an interesting consequence of this view in situations where section 11 of the Act applies in relation to your use of the AI tool. Before I get to that, just to recap, section 11 applies where the AI tool provider is not using or disclosing personal information in your prompts for its own purposes. Where that’s the case, section 11 says there’s no disclosure of personal information from you to the AI tool provider, and no disclosure of personal information, including derived information, from the AI tool provider to you. You’re deemed to hold the information throughout.
Right, now we get to the interesting consequence I mentioned. When section 11 applies, there’s a good argument that, when you collect personal information directly from individuals, you don’t need to tell them that you may include their information in prompts to the AI tool. However, and this is the interesting point, if you are proposing to use their personal information in an AI tool in circumstances where you will be asking the AI tool to generate additional information about the person based on its pre-existing knowledge, IPP3A will require you to tell the individual that you are obtaining additional information about them through the use of an AI tool. If you have to tell them that, then it will be implicit that you will be using their personal information in prompts to the AI tool, and so you might as well be completely transparent about that anyway from the outset. In other words, when section 11 applies, you might think that IPP3 does not require you to mention the AI tool provider as a recipient, but in this situation where you are seeking additional personal information from the tool, IPP3A will require you to be transparent about your use of the AI tool.
Practical implications
What are the practical implications? Well, if you know when collecting personal information from people that you will or may be seeking additional information about them through your use of a generative AI tool, then you might want to amend your privacy statement or, if you’re a health practitioner for example, your consent forms, so that you’re clear upfront about your use of the tool. If you didn’t know, when collecting personal information from people, about your subsequent use of an AI tool to seek additional information about them, you’ll need to take other steps to tell them (unless of course an exception applies).
Now I should add that there’s no immediate need to make changes to accommodate IPP3A, because the Bill is still before Parliament and, on its current wording, will not come into force until 1 June 2025. But if you’re an agency or organisation using or thinking of using an AI tool to obtain additional information about individuals, this is something – I think – you should be planning to address.
If you need any help with that, please feel free to get in touch. You can find my contact details on my website at richardbestlaw.com/contact.