
Carly Kind

Privacy Commissioner


Next week, we’ll see the release of the Productivity Commission’s interim report of its inquiry into Harnessing data and digital technology – the third of the 5 productivity inquiries commissioned by the Government. It’s an inquiry of close interest to the OAIC. Data and the digital world are intrinsic to our work across privacy, access to government information, and government information management. The inquiry is exploring 4 reform areas, including how to embrace reform to enable AI’s “productivity potential”.

Privacy is often pitted against productivity, and submissions to the inquiry by large tech players have asserted that the enforcement of robust privacy laws may imperil their ability to develop AI tools that enable productivity across the Australian economy.

At the OAIC, we recognise that AI has the potential to benefit the Australian people and the economy by improving efficiency and productivity across a wide range of sectors, including the quality of government services.

However, the efficiency and productivity dividends of AI will not be realised if AI tools don’t enjoy the trust and confidence of the Australian public. Australians care a lot about their privacy, and want more choice in and control over how their personal information is handled. They are sceptical about the rollout of artificial intelligence without proper guardrails. Importantly, according to the ACCC’s digital platforms consumer survey, 83% of Australians agree that companies should seek user consent before using their data to train AI models.

This reinforces what we already know. Our Australian Community Attitudes to Privacy Survey found that 84% of Australians want more control and choice over the collection and use of their information. They also see privacy as important when they are looking to purchase products and services, rating it as the third most significant factor after quality and price.

As the nation’s privacy regulator, it is my job to ensure that new technologies can deliver social and economic benefits in a way that is in line with these community expectations. In our October 2024 guidance on developing and training generative AI models, we acknowledged the potential for AI to benefit the Australian economy and society, and outlined how entities could develop and train AI models consistently with the law. We also produced guidance to help businesses to comply with their privacy obligations when using commercially available AI products.

It is clear that existing laws apply to AI technologies, and that regulated entities must ensure that their use of AI adheres to their legal obligations. But what does that look like in practice?

Today we published our Report into preliminary inquiries of I-MED in order to provide a case study of what one company did to adhere to privacy laws when developing AI models. The study demonstrates how good governance, and planning for privacy at the start of a new initiative, can support an organisation to adopt new and innovative data-driven technologies in a way that protects the rights of individuals.

It’s just one example, and of course each business needs to take its own journey to ensure it appropriately considers privacy when it embarks on new projects, new technologies, or new or changed ways of handling personal information.

Strong privacy governance and safeguards are essential for businesses to gain advantage from AI and build trust and confidence in the community. Good privacy practices and clear laws will support innovation, and deliver benefits for all. A more productive economy is contingent upon Australian consumers confidently participating in the digital economy and businesses retaining the trust of their customers in deploying new innovations. Privacy is an important foundation to enable that to happen.

Note: The OAIC did not open an investigation into I-MED or compel the production of documents. We have undertaken preliminary inquiries to ascertain whether there may be an interference with the privacy of an individual, or a breach of APP 1, warranting an investigation. While on this occasion we have concluded that there is not, this case study should not be taken as an endorsement of I-MED’s acts or practices or an assurance of its broader compliance with the APPs.