23 June 2022

Opening address by Australian Information Commissioner and Privacy Commissioner Angelene Falk

Check against delivery

Thank you very much, Kate, Daniella and the Deloitte team, for the opportunity to speak today at the launch of the 2022 Australian Privacy Index.

I would like to begin by acknowledging the Gadigal people of the Eora Nation, the Traditional Custodians of the land from which I join you today.

I pay my respects to Elders past, present and emerging, and I extend that respect to First Nations peoples with us today.

This year’s index, with its theme – ‘Every Breath You Take’ – comes at a critical juncture.

We have seen over the past two and a half years the rapid acceleration of the digital world, which has brought with it a significant increase in data sharing, in tracking, and in monitoring.

If I take us back to those days in early 2020 – before the COVID-19 pandemic had taken hold in Australia – the OAIC was conducting the fieldwork for our Australian Community Attitudes to Privacy Survey.

And the top-line finding was that almost 9 in 10 Australians wanted more control and choice over the collection and use of their personal information. Australians saw the biggest privacy risks as identity theft and fraud, and data security and breaches.

After the pandemic was declared, we went on to test emerging community views on privacy and COVID-19.

And the majority of the Australians we surveyed agreed that some privacy concessions needed to be made for the greater good during the pandemic, as long as they were not permanent.

Half considered their privacy was more at risk in the COVID environment than usual, and interestingly, or perhaps in accordance with the circumstances at the time, location tracking was a particular concern.

And location tracking and surveillance ranked higher as privacy risks than they had before.

These concerns are still as relevant two and a half years later. Many of the themes that I’m sure will be discussed today – the sharing of information, surveillance, location tracking and personalisation – have really dominated the discourse. Not just for those of us who work in privacy, but for all Australians.

If there is a positive to come out of the pandemic in relation to privacy, it’s that the public is now more privacy-aware.

During the pandemic we saw the community recognising the benefits of providing their personal information to achieve other important societal objectives, such as public health – a clear value exchange. And it heightened public awareness of privacy as a critical issue.

Likewise, we see a fairly high level of awareness among the community that the practices that will be discussed today – location tracking, surveillance and personalisation – are taking place in our daily lives.

But I think there remains a clear gap between what organisations provide to individuals and what individuals need to feel in control of their personal information. And bridging this gap starts with offering fair privacy choices.

Our research tells us only half of Australians feel most organisations they deal with are in fact transparent about the way they use their personal information.

Perhaps that’s not surprising when we consider that the usual vehicle for addressing Australians’ concerns is the privacy policy. Recent research by CHOICE found the average privacy policy is 4,000 words long, takes 16 minutes to read and has poor readability.

So just to reflect on that for a moment: clearly it is neither realistic nor fair to expect individuals to absorb these policies, decipher complex practices, and give their meaningful consent. Nor does privacy law require them to do so. A privacy policy is a transparency vehicle.

The pendulum, I think, needs to swing so that the burden shifts from the shoulders of individuals to those of businesses that seek to derive commercial benefit from personal information.

One of the critical goals for my office is to influence the behaviour of organisations so that individuals are protected from harm, offered fair privacy choices, and have a greater degree of choice and control, and also that entities are incentivised to build the systems that proactively address privacy risks.

So, the regulatory activities from my office are aimed at preventing and addressing privacy harms, including those emerging online.

Among our focus areas are technologies and business practices that record, monitor and track, and we are taking action to hold businesses to account.

You’ll be aware that I recently made three determinations involving 7-Eleven, Clearview AI Inc and the Australian Federal Police that collectively address the use of facial recognition technology, whether the collection of personal information was necessary and fair, and failures around privacy governance.

In the case of Clearview AI, I determined the company breached the Privacy Act by scraping Australians’ biometric information from the web and disclosing it through a facial recognition tool.

The Clearview system includes a database of more than three billion images scraped from social media platforms and other websites.

My determination, which I should note is under appeal, highlighted the lack of transparency around Clearview AI’s collection practices, the monetisation of individuals’ data for a purpose entirely outside reasonable expectations, and the adversity and risk of harm to people whose images are included in their database.

In the 7-Eleven case, the company collected customers’ facial images as part of a customer feedback survey and used facial recognition technology to exclude survey responses that might not be genuine.

This is a clear example of using a sledgehammer to crack a nut, and the determination illustrates the importance of proportionality, and of community concern, as key factors in business decisions.

At this point, I should mention privacy law reform briefly.

Among the OAIC’s key recommendations to the Attorney-General’s Department’s review of the Privacy Act is that a new obligation be introduced for all personal information handling to be fair and reasonable.

We say this would help to prevent activities that do not meet community expectations, and it would require entities not just to collect information by fair and lawful means – as is the current legal test – but to collect, use and disclose it fairly and reasonably.

As well as fairness, we say the Privacy Act needs to have accountability at its centre.

We recommend the Privacy Act be amended to include enhanced accountability requirements, to ensure that entities regulated by the Act implement actions and controls that demonstrate their compliance with the privacy regulatory framework.

This includes requirements to implement a risk-based privacy management program and a privacy by design approach.

This would complement privacy self-management mechanisms, such as notification and consent requirements, with appropriate organisational accountability obligations to reduce the burden currently on individuals to understand and consent to complicated practices.

Instead, organisations would have to demonstrate upfront that they handle personal information fairly and reasonably.

There are also a number of developments internationally that will feed into the domestic consideration of privacy law reform.

Other countries, such as the UK, are reviewing their data protection laws. The EU is considering legislation that would require consent before certain entities can use the personal information of end users of their platforms to provide online advertising. It is also considering separate legislation to ban targeted advertising that uses sensitive data or the personal information of minors.

So, ahead of any regulatory changes, now is a good opportunity for organisations to take stock.

Both the digital environment and community expectations are constantly evolving. So, standing still is not an option, and nor is taking a narrow view of your privacy obligations.

When developing new products and technologies, organisations need to take account of privacy obligations and community expectations to gain the community’s trust and confidence.

Some of the strategies organisations should be implementing, which the Deloitte team will speak more about today, include:

  • going beyond notice and consent and being transparent about personal information handling practices at every stage of the consumer journey
  • using consistent, simple language to describe online tracking and monitoring activities, and providing options to opt in, to support individuals to make informed choices
  • making pro-privacy settings the default for products and services
  • optimising preference centres to provide customers with real choices
  • and clearly highlighting the ways in which you’re protecting personal information.

These practices all go to the issue of accountability and support privacy self-management in order to build trust.

They put the individual at the centre.

With that, I congratulate Deloitte for expanding our understanding of these issues today and thank you again for inviting me to participate in this event.

I look forward to ongoing discussion about the fast-moving and critical privacy issues facing Australia.

Thank you.