Published 7 May 2024

Read the keynote address prepared for delivery by Privacy Commissioner Carly Kind for the Office of the Information Commissioner Queensland Privacy Awareness Week launch event on Tuesday 7 May 2024.

Introduction

I am so honoured to be here today. Not only because I am standing here as the newly appointed Privacy Commissioner commemorating my first Privacy Awareness Week, but also because I am doing so in the city where I first became interested in and indeed passionate about privacy.

I was a first-year law student at UQ when the planes flew into the Twin Towers in New York, and the early years of my study saw the end of the post-Cold War era and the beginning of the global war on terror. This shaped fundamentally how I came to understand abuses of state power and the importance of human rights law. It was through this lens that I came to be interested in the right to privacy, and its curtailment by expanding national security protections and growing surveillance powers.

I was then incredibly fortunate to commence my career as a young lawyer in a boutique law firm just across the bridge at 8 Petrie Terrace, which was then Boe Lawyers under principal Andrew Boe. At Boe Lawyers, I worked with passionate lawyers who showed me how law and policy could both advance and impede the enjoyment of social and racial justice. The culmination of these two experiences – my years studying law at UQ and my experience working as a young lawyer in the criminal courts – motivated me to want to pursue a career in international human rights, and later work at the intersection of technology and rights.

Over time, I came to understand that the right to privacy is a key means by which power is mediated, limited and expressed. In the early 2000s, we saw increasing national security intrusions into the personal realm, expanding surveillance powers and anti-democratic measures justified under the banner of counterterrorism. Infringements into privacy were one way in which power was exercised over individual journalists, activists and advocates.

It became clear to me that privacy is a fundamental enabling right for a range of rights, not only those related to procedural fairness but also expression, opinion, and self-determination, and that at its core it is about power.

Privacy is about power

Notions of power cut in every direction in the digital ecosystem – the power wielded by tech monopolies and duopolies; the power concealed in political microtargeting and misinformation campaigns; the lack of power and agency consumers feel when they’re using digital technologies.

Information about us is core to who we are, and it’s closely linked to our ability to determine who we are and what we do in life. When we have control of our personal information, we are empowered.

Equally, when others can access and use our information, they are wielding power.

It has been evident for a long time that government access to and use of personal information can operate as a means of exercising power, and can potentially disempower individuals. Indeed, the early founders and builders of the internet saw that technology was a means of ousting government control of information and placing it directly in the hands of individuals. John Perry Barlow, one of the founders of one of the earliest digital rights organisations, the Electronic Frontier Foundation, wrote something called the ‘Declaration of the Independence of Cyberspace’ arguing that governments should have no jurisdiction in the online realm at all, and that no laws should apply there.

That was in 1996, only a few short years before the first antitrust case of the digital realm would establish the harms of online monopolies, and about a decade before smartphones and social media companies would change the paradigm of the internet forever. Since then, we’ve come to understand more about how companies, too, can wield and exert power over individuals by collecting, using, selling or tracking their personal information, particularly in the digital realm.

Seminal moments have occurred: the sale of data from a small group of researchers to a political consultancy, Cambridge Analytica, and the use of that data to influence the US presidential election, or the case of a teenage girl who began receiving promotional material for baby-related products from Target before her parents knew she was pregnant. These moments began to bring home for individuals just how much power organisations that hold and use personal information really possess. In Australia, the major data breaches at Optus and Medibank in recent years were similarly seminal – they exposed to Australians not only just how much personal information companies collect and hold, but also what the real consequences of losing or compromising that data are – a tragic reality faced by the holders of the 100,000 compromised Australian passports in particular.

The result is that today we see increasingly high levels of interest in and value placed on personal and data privacy. This hasn’t always been the case, of course. Throughout history, the community’s privacy attitudes have been cyclical. In 1999, Scott McNealy, the CEO of tech company Sun Microsystems, said that consumer privacy issues were a ‘red herring’. ‘You have zero privacy anyway,’ he said. ‘Get over it.’ In the same year, Pew Research surveys showed that only 16% of online users were worried about privacy. In a quaint comparison, their research also found that 29% of those who ‘don’t go online’ were worried about privacy. Only 20% of internet users worried that their email might be read by someone other than the party they sent it to; 42% said they did not worry about this at all.

(This was a fun dataset to dig into, by the way. It included this gem: 41% of internet users were worried about the threat of computer failures related to the Y2K bug.)

If we compare that to today, a study also by Pew Research shows much, much higher levels of both privacy literacy and privacy concern. For example, most people (around 75%) believe they have little to no control over what companies or the government do with their data.

We have seen the same increase in concern about and value placed on privacy here in Australia, in our own research into Australian attitudes to privacy; for example, the proportion of Australian adults who care enough about protecting their personal information to do something about it increased from 75% in 2020 to 82% in 2023. Our research also showed us:

  • Nine in ten Australians have a clear understanding of why they should protect their personal information.
  • Only two in five people feel most organisations they deal with are transparent about how they handle their information, and 58% say they don’t understand how it is used.
  • 84% want more control and choice over the collection and use of their information.
  • There was a common view that businesses and government agencies should do more to protect Australians’ personal information.

This trend towards valuing privacy more and more is reflected around the world. If we just look at trends in regulation, from February 2021 to March 2023, 17 new countries enacted data privacy laws, bringing the total to 162 globally. Of course, there is even now draft privacy legislation under contemplation in the US, a jurisdiction historically averse to federal privacy legislation, and it seems possible that the country will enact a privacy law before the end of the year.

Privacy Awareness Week

It is against this backdrop, then, that we celebrate Privacy Awareness Week. This year, awareness of privacy is arguably higher than ever before. Expectations for better privacy practices are stronger than ever before. Recognition by business of the ethical imperative to be good privacy players is more widespread than ever before.

And yet – privacy harms are still widespread, data breaches occur weekly, data-driven business models are still pervasive and individuals still feel a lack of agency and control when it comes to their personal data.

For that reason, this year we’re calling on people, organisations and government to power up privacy – to take control and to step things up. We really want to see entities inject some power into their approach to privacy, rather than simply being in responsive mode or dealing with privacy issues late in the day. We would also like to see government power up privacy Australia-wide by introducing the reforms to the Privacy Act that are so overdue. In doing so, we believe that we can restore some power to individuals to feel in control of their personal information.

Law reform

It is an ideal time for businesses and government agencies covered by the Commonwealth Privacy Act, and for Queensland public sector agencies, to power up existing privacy practices and culture in advance of privacy law reform.

Early last year, the Attorney-General’s Department completed its review of the Privacy Act. The Australian Government responded in September, agreeing or agreeing in principle to all but 10 of the 116 proposals for reform.

The federal Attorney-General shared last week that at the request of the Prime Minister, he will bring forward legislation in August to overhaul the Privacy Act.

There are many elements to the reforms, but taken together they will power up privacy protections by answering a need in the community for the proper protection of their information and increasing private and public sector organisations’ accountability for their privacy practices.

We see the positive obligation that personal information handling be fair and reasonable as a new keystone of the Australian privacy framework.

This is a fundamental shift in approach that will require organisations to ensure their practices are fair and reasonable in the first place. It will help the Australian community to be confident that, like a safety standard, privacy must be built into products and services from the start.

The reforms will also provide a greater range of powers to the OAIC, reflecting the Australian community’s growing expectation that its regulators take a more enforcement-focused approach.

Other important developments include enabling individuals to exercise new privacy rights and take direct action in the courts if their privacy is breached. These initiatives reflect the baseline privacy rights expected by our community.

Another key proposal relevant to our discussions today around privacy and technology is a requirement, agreed in principle, for organisations to conduct a privacy impact assessment for activities with high privacy risks, something that is already required of Australian Government agencies.

Queensland, too, is powering up the privacy protections afforded to Queenslanders by your public sector agencies.

The reforms introduced by the Information Privacy and Other Legislation Amendment Act increase consistency in privacy rights and obligations across the Queensland and Commonwealth jurisdictions.

Alignment between frameworks is important for regulators, for regulated entities and the individuals we look to empower.

For regulated entities, harmonisation reduces compliance costs and provides clarity and simplicity.

For individuals, it helps to ensure their data is protected, wherever it flows, and assists with a clearer understanding of and ability to exercise privacy rights.

Privacy and technology

Privacy law reform poses real opportunities for Australia:

  • There are substantial gains to be made for the Australian community through the proposed reforms, particularly those that address gaps in protections for children and vulnerable groups.
  • There is clear community concern around data breaches following the events of recent years, and the reforms to the breach notification regime, together with the stronger enforcement powers available to us as the regulator, will enable much more decisive action to be taken with respect to data breaches.
  • There is also the potential for the Australian regime to leapfrog equivalent frameworks overseas and take some novel approaches, including the new fair and reasonable test, which should aid in dismantling the practice of organisations using consent as a gateway to problematic privacy practices.
  • There is also an increasingly pressing need for reform given the immense technological changes that are staring us in the face.

I come into this role having spent the past five years working on AI and data governance and policy as Director of the London-based research institute, the Ada Lovelace Institute, which has a remit to ensure that data and AI work for people and society. In that role, I thought a lot about the role of data privacy regulation and regulators in grappling with new and emerging technologies, particularly AI.

This, I know, is probably the biggest issue on many of your minds at the moment. How is AI going to change our societies, and how will important societal values such as privacy be protected in a world in which AI is ever more pervasive? Many of these technologies are incredibly powerful. Some of them decentralise power in interesting ways; for example, generative AI tools are disrupting industries where power has historically been centralised. But they can also be a means for powerful institutions to amass power, including market power.

The OAIC is working hard to understand the implications of AI for privacy, including with our colleagues in other regulators through the Digital Platform Regulators Forum, which is made up of the OAIC, the Australian Competition and Consumer Commission, the Australian Communications and Media Authority and the eSafety Commissioner. We’ve published working papers on algorithms and large language models, and a literature summary on the harms and risks of algorithms.

Online privacy and high privacy impact technologies, including practices involving the use of generative AI, facial recognition and the use of other biometric information, are also high on our regulatory priorities. The Australian Information Commissioner has made determinations concerning the collection of biometric information by Clearview AI and 7-Eleven to match facial biometric templates. The OAIC also has ongoing investigations into the use of facial recognition technology by Bunnings Group Limited and Kmart Australia Limited. These technologies typically rely on artificial intelligence through the use of machine learning algorithms to match biometric templates, and constitute some of the most concerning technological developments from the perspective of the Australian community.

We’ve also begun scoping what other new and emerging technologies might create privacy risks and harms that warrant our intervention. We are opening preliminary inquiries into the use of personal information in connected cars, for example, and the sharing of personal information between car manufacturers and other entities such as insurance companies. We are also looking closely at and taking enforcement action around:

  • practices that impact individuals’ choice and control, through opaque information practices or terms and conditions of service
  • technologies and business practices that record, monitor, track and enable surveillance
  • the use of algorithms to profile individuals in ways they may not understand or expect, with adverse consequences
  • practices involving the use of generative AI, facial recognition and the use of other biometric information.

Finally, I think it’s incumbent on regulators everywhere to think about how new technologies should inform our regulatory practice, whether by necessitating new investigative techniques such as algorithmic audits, or through the in-house deployment of AI technologies to streamline complaints handling and provide more efficient access to information for citizens. We’ve begun looking at this at the OAIC.

Although many of these new technologies are still on the horizon, I encourage all organisations, especially when considering new technologies, to put some practices into action now:

  • privacy by design across the information lifecycle
  • privacy impact assessments
  • asking whether the community would consider what you’re doing to be fair and reasonable.

These all go to accountability – and there’s good reason to do them and show privacy leadership.

Getting privacy right provides social licence for new initiatives. Organisations and agencies that get it wrong face real consequences in terms of the community’s trust and confidence.

We need to put the individual at the heart, and at the starting point, of our thinking about many of these technologies.

Data breaches and security

Data breaches are an area where many Australians recently have had a tangible experience of privacy issues and the sense of disempowerment they can bring.

Since the Commonwealth’s Notifiable Data Breaches scheme began in 2018, the OAIC has been notified of around 5,800 data breaches.

The OAIC’s community privacy attitudes research last year found almost one in two people surveyed had experienced a data breach, and three-quarters had experienced harm as a result.

It also told us that Australians consider data breaches the single biggest privacy risk they face, with the proportion who reported this concern increasing by 13 percentage points since 2020.

There are high levels of public concern about data security as a result of the number and scale of recent breaches, and a strong appetite in the community for organisations and agencies to be held accountable.

Mandatory reporting of breaches strengthens the protections afforded to everyone’s personal information and improves accountability and transparency in the way organisations respond to serious data breaches.

The introduction of mandatory reporting in Queensland is an opportunity to build public confidence and trust in government’s handling of personal information and empower individuals to take action to manage risks and mitigate harm should a data breach occur.

Data breaches have the potential to cause serious harm to individuals, and organisations and agencies need to step up their security.

Around 40% of data breaches notified to the OAIC have been the result of cyber security incidents. Organisations need to put in place measures that guard against threats.

Recently, we have been seeing the downsides of the increasingly interconnected economy in data breaches that involve multiple entities. An example of this is where a cloud or software provider that holds personal data on behalf of many entities is breached. Organisations and agencies need to be thinking about this risk, and how to mitigate it, when engaging and working with third parties.

Organisations and agencies covered by the Privacy Act should be aware that the security of personal information is a regulatory priority for the OAIC, and we will take enforcement action where there are serious failures.

The office recently commenced civil penalty proceedings in the Federal Court against Australian Clinical Labs Limited resulting from an investigation of its privacy practices that arose from a data breach in February 2022.

The case took only 11 months to reach the Federal Court and should serve as a warning to businesses: lift your game or risk enforcement action.

Conclusion

I know I won’t be the only one in the room who reads the news every day and sees privacy as central to so many of the issues that challenge politicians, policymakers, organisations and ordinary people:

  • There are the repeated data breaches, of course – just last week there were two high profile incidents with Qantas and NSW clubs.
  • There is the continued concentration of immense power in the hands of a few large tech companies, which causes problematic dynamics in digital markets, as evidenced most recently by the reluctance of one tech boss who shall not be named to comply with regulatory restrictions in the safety realm, and by the ongoing battle between tech companies and security services when it comes to encryption.
  • There is the seemingly unstoppable growth of artificial intelligence, and in turn the continued concentration of power in those same few companies who also control much of the infrastructure, compute, data and expertise likely to fuel the AI revolution.
  • There is even the continued scourge of misogyny online, which was recently identified as one of the key contributors to the persistent and egregious levels of domestic violence in this country, and which is in part fuelled by the incentives of the online attention economy.

All of these issues, and many more, relate to privacy, and in my view could be tempered or mitigated through stronger, better privacy protections. Which is not to say that privacy alone is the solution – I think the experience of Europe in recent years has been that privacy regulators have to work hand in hand with competition regulators and others to tackle the most complex dynamics of online markets – but that at the very least privacy may be the starting point.

Now this is unsurprisingly the view of someone who works in privacy and believes wholeheartedly in the right to privacy and in privacy regulation – when you’re a hammer, everything looks like a nail. But if the dinner tables I’m at are anything to judge by, it is also, instinctively, the view of many of our fellow citizens and consumers. Privacy is on everyone’s lips these days.

As we stand at the precipice of a new era of technology and markets, we are urging Australian businesses, agencies and other organisations to meet this challenge – to ‘power up’ privacy and make a real difference for the community.

‘Powered up’ privacy practices are good for everyone: for consumers, who feel more confident participating in the digital economy; for businesses, which can boldly innovate knowing that guardrails are in place to protect customers; and for government, which can realise the benefits of new technologies with the trust of its citizens.