5 May 2022

Keynote address by Australian Information Commissioner and Privacy Commissioner Angelene Falk: Organisational accountability and building consumer trust and confidence through privacy by design

Good evening.

Thank you to Alastair MacGibbon for inviting me to speak here. I recall the many occasions on which our work has intersected: first in your role as eSafety Commissioner, then as special adviser to the Prime Minister on cyber security, and as head of the ACSC, where we developed joined-up advice with IDCARE on the first multi-party data breach response in June 2018.

I also wish to acknowledge the expertise of my fellow speakers, and we are particularly honoured to have Dr Ann Cavoukian, creator of the Privacy by Design Principles, leading the discussion tonight.

I wish to commend CyberCX for holding its inaugural Privacy by Design Awards program and celebrating those it recognises for applying privacy by design in their customer-facing digital interfaces.

The award recipients are determined by CyberCX. As the independent regulator, I support the concept of these awards: they shine a spotlight on the value of privacy by design and strengthen organisational accountability.

A concept that is very relevant as we mark this year’s Privacy Awareness Week.

I would note at this point that we are, of course, in the caretaker period, and my remarks will necessarily be broadly focused on the requirements of the Privacy Act and our previously announced expectations as regulator.

And as regulator, I have to emphasise that while tonight acknowledges better practices, there remains much work for organisations to do.

The need to achieve and sustain better privacy practice, and the building blocks required to do so against a backdrop of domestic and international regulatory activity and convergence, are the focus of my remarks this evening.

As you would have seen, our Privacy Awareness Week theme is Privacy: The foundation of trust.

Trust is essential to bringing the community along as the digital economy continues to play a major part in our lives.

It is essential that the public can trust organisations to handle their personal information responsibly: transparently, accountably and in ways that the community would regard as fair and reasonable.

That trust will not occur without privacy as the foundation.

So what do your customers say about privacy? Our most recent Australian Community Attitudes to Privacy Survey, in 2020, showed 70 per cent of people consider the protection of their personal information to be a major concern in their lives.

Our survey found people are most concerned about data security, breaches, identity theft and fraud. And my office’s regulatory experience shows that the consequences of poor security practices can cause both individual and collective privacy harms.

For individuals, we only need to think about the damage that can be caused by identity theft: the time spent re-establishing identity, the financial hardship and the anxiety that can flow.

And the information may persist on the dark web, to be combined and used for malicious purposes. Ultimately, the cost of data breaches is borne collectively by us all. Research from the Australian Institute of Criminology put the cost of identity crime at $3.1 billion in a single year, and found the most common impact on individuals was being refused credit.

The community is also concerned about social media and smart phone apps.

And trust in personal information handling generally has continued to decline.

It is a given, then, that making privacy a priority needs to be a central pillar of business success.

How do organisations build trust in their handling of personal information, and how do we ensure data flows with trust?

As I speak about privacy and trust this week, I recall an address I gave in Tokyo back in 2019 on the sidelines of the G20 Summit, where global data protection authorities traversed models to achieve that trust.

There I said ‘the individual needs to be at the centre of such a model – as the key measure of its success’ and referred to the role played by both regulators and the regulated in ensuring trust in the data economy. ‘And achieving public benefit, for individuals,’ I said, ‘is ultimately what we all seek to enhance and protect.’

That focus on individuals still stands true.

Australians trust health service providers the most with personal information (70 per cent), followed by federal government departments (51 per cent) and financial institutions (50 per cent).

Manufacturers of smart phones, smart toys and other hardware, search engines, apps and the social media industry are the least trusted. Seventy per cent of the community found the social media industry untrustworthy.

Yet these less trusted sectors are where we must all engage every day, and more than ever they intersect with increasingly digitised traditional sectors as products and services move online. The pandemic, of course, increased this interaction.

So there is a shared cross-sector imperative to raise the bottom line for trust in privacy.

Some might think: does it really matter, if millions of Australians engage online anyway? But is there real choice when people are faced with take-it-or-leave-it terms?

Only half of Australians feel that most of the organisations they deal with are transparent about the way they use their personal information.

And even if we achieved transparency, there are limits to an individual’s ability to exercise meaningful choice and control, to assess the risks and to avoid privacy harms. Information practices are often complex, options are limited, and so is our time to closely consider and decide.

In fact, CHOICE’s analysis of the privacy policies of 75 apps and websites found they average 4,000 words, take 16 minutes to read, and mostly have poor readability.

That is why the burden needs to shift from the shoulders of individuals to those of the organisations that seek to derive commercial and public benefits through collecting personal information. They must act as trusted custodians of personal information.

As Erin Turner, CEO of the Consumer Policy Research Centre commented, ‘Customers are increasingly going to start asking not how will I protect my data, but what is the company doing to make this process easier for me?’

To gain trust, organisations must build products and services that handle personal information in ways the community would rightly assess as fair and reasonable, and that protect privacy up front.

And that is where privacy by design has much work to do.

We see companies taking products to market with privacy implications and vulnerabilities, when a comprehensive privacy impact assessment would have revealed these risks.

Again, it is about putting the individual at the centre of the process.

I have already made clear my concern about the use of facial images without an adequate assessment of the privacy impact, in my determination that Clearview AI, Inc. breached the Privacy Act by scraping Australians’ biometric information from the web and disclosing it through a facial recognition tool.

The Clearview system includes a database of more than three billion images ‘scraped’ from various social media platforms and other websites.

My determination, which is under appeal, highlighted the lack of transparency around Clearview AI’s collection practices, the monetisation of individuals’ data for a purpose entirely outside reasonable expectations, and the risk of adverse consequences for people whose images are included in its database.

So when developing new products and technologies, organisations need to take account of their privacy obligations as well as community expectations around privacy. While new technologies can be used to enhance our lives, they can also be used in ways that cause harm.

There is a high level of interest around the globe in preventing online harms: privacy risks, the security of personal information, consumer protection, scams, competition and safety.

And a common theme across regulators and frameworks is the need to take a proactive approach to preventing harm upfront. You will see this reflected in my office’s submission to the review of the Privacy Act, in recent papers on online marketplaces from the ACCC as part of its Digital Platform Services Inquiry, and in the new Online Safety Act regulated by the eSafety Commissioner.

You will see continued collaboration between the OAIC and domestic regulators like the ACCC, ACMA and eSafety, as well as APRA and ASIC, alongside international cooperation.

The intersection of privacy with consumer protection law, online safety, security, technology and media will increase as we work in concert to achieve the best outcomes in the public interest.

With the ACMA, the ACCC and eSafety, we recently formed the Digital Platform Regulators Forum to support a streamlined and cohesive approach to the regulation of digital platforms.

The joint investigation between the OAIC and the Information Commissioner’s Office in the United Kingdom is an example of international cooperation. It built on a Resolution on Facial Recognition Technology presented by our offices to the Global Privacy Assembly, whose membership includes more than 130 data protection authorities.

There is much happening outside Australia.

The provisionally agreed Digital Markets Act and Digital Services Act in the European Union will introduce requirements aimed at addressing the market power of dominant platforms that act as gatekeepers online and at creating safer online environments, some of which will affect the handling of personal information.

This is likely to include banning targeted advertising based on sensitive data (such as sexual orientation, religion and ethnicity), and prohibiting targeted advertising on platforms accessible to children.

These measures demonstrate a shift towards proactive obligations to identify risks before they lead to harm, and clear prohibitions on harmful conduct.

And California’s proposed Age-Appropriate Design Code Act, which follows the UK’s Code, also contains design requirements, such as an obligation to consider the best interests of children likely to access a good or service when designing, developing or providing that good, service or product feature.

So what are the implications of this significant regulatory activity for privacy by design and cyber security?

It further emphasises that standing still is not an option, and taking a narrow view of your privacy and security responsibilities will not cut it.

The intersection between the privacy principles and the security principle, which requires reasonable steps to protect personal information, is clear.

Practitioners need to take a life-cycle approach to handling personal information, from creation to destruction. Ask yourself: do you need all the information you are collecting? Are you operating with a commitment to data minimisation? Are you continually seeking to reduce your contribution to cyber risk?

It would be remiss of me not to mention some basics.

The Notifiable Data Breaches scheme is well established after four years, and we expect organisations to have strong accountability measures in place to prevent and manage data breaches, in line with legal requirements and community expectations.

If organisations do not keep data secure, they risk losing both people’s confidence and business.

The human factor continues to play a role in many cyber security incidents. We see a significant number of breaches resulting from phishing, with emails that look to trick or persuade staff into sharing usernames and passwords.

Here are a few suggestions:

  • Anticipate: don’t wait for the damage to be done.
  • Train your staff: make sure they are aware of their privacy and security obligations.
  • Identify risks, and build systems and processes to prevent them.

This all goes to the issue of accountability.

Accountability is globally recognised as a key building block for effective privacy management. And our theme for this week is all about building trust, block by block.

We are urging organisations to embed strong accountability measures in their internal privacy management processes.

Our Privacy Awareness Week website sets out some core accountability fundamentals that will help businesses build a strong privacy foundation, including:

  • implementing a privacy management plan
  • appointing privacy champions
  • assessing privacy risks
  • adopting a ‘privacy by design’ approach
  • ensuring secure systems are in place to protect personal information, including preparing for data breaches
  • conducting regular staff training and ongoing reviews of privacy practices.

Good organisational accountability involves going beyond a check-box exercise of compliance with the principles.

It requires organisations to consider upfront how their information-handling activities will impact individuals, and to mitigate foreseeable privacy risks and harms.

Just as employers take steps to minimise risk and ensure a safe system of work, organisations should be taking upfront steps to mitigate privacy harms to individuals, without relying on reactive action by the regulator.

If you do, you will also be better able to anticipate and adapt to business and regulatory change, as well as to crisis situations.

One of the key takeaways from Privacy Awareness Week is that as important as it is to get the foundations right, we need to remember that our privacy platforms need ongoing upgrades.

Organisations need to consistently revisit and revise their privacy settings and anticipate changes in their environment. They also need to be aware of, and responsive to, the needs and concerns of their customers and the wider community.

One concern relates to the way in which gender can influence privacy outcomes. There is considerable evidence of how using artificial intelligence to make decisions can lead to bias against women and disadvantage others in the community.

As I said on International Women’s Day, the experience of privacy isn’t gender neutral. The use of personal information for location tracking and facial recognition can disproportionately impact and lead to privacy harms for women.

These privacy harms should be front and centre if we are to truly build privacy by design into new and existing technologies.

The best organisations will focus not only on what good privacy practices can prevent, but on what privacy can enable. I believe that organisations can make privacy part of their competitive advantage, and I urge you to reflect on this both tonight and in the coming days. As good corporate citizens, it is also the right thing to do.

As an example of the public benefits that can accrue from good privacy policy, I would point to the handling of health information in the pandemic.

At the outset of the pandemic, Australia’s privacy regulators placed a premium on ensuring that individuals’ privacy was respected: that if an individual had the virus, their identity was not made public.

This approach was necessary at that time, so that people got tested and accessed health care. Breaching an individual’s privacy had the potential to adversely impact collective health outcomes.

This takes me back to my earlier point.

When it comes to building trust, the individual needs to be at the centre of such a model.

Achieving public benefit, for individuals, should be what we all seek to enhance and protect.

And that is how we can make privacy the foundation of trust.

Thank you and have a very enjoyable evening.