Published 29 May 2024

Read the keynote address prepared for delivery by Privacy Commissioner Carly Kind for the Innovate Australia Showcase on Wednesday 29 May 2024. A recording of this keynote address is available on The Public Sector Podcast by the Public Sector Network.


I would like to begin by acknowledging the Ngunnawal people as Traditional Custodians of the ACT and recognising any other people or families with connection to the lands of the ACT and region. I thank them for their continuous stewardship of this land. I pay my respects to Elders past and present and extend that respect to any First Nations peoples with us today.

I spoke recently at an event in Brisbane where Uncle Billy Cummings, who gave the Welcome to Country, spoke about the importance of ensuring that progress – in this case he was talking about changes to the land, environment and climate – happens slowly and with the input and buy-in of all people affected.

I think it’s important that we consider what that approach means in the context of technology policy, particularly as we’re all here today in support of ‘innovation’.

Approaches to innovation

What does innovation really mean? In a definitional sense, innovation is about novelty: the fact that something is new or different makes it, semantically speaking, innovative.

However, I think we all know that for innovation to be considered a public good, it has to be about more than just novelty. Daily we encounter ‘new’ things – cars, or apps, or clothing items – that are worse quality, badly designed, less sustainable or less effective than older things. Rather, for innovation to be something we value – for us all to gather here at a conference and worship at the altar of innovation – it needs to be about creating value, improving effectiveness or creating efficiencies, or addressing unmet needs.

That is the kind of innovation we want entities and governments to invest in. Innovation that will deliver public benefit back to the Australian community. Chiefly, innovation that has the input and buy-in of the people it should benefit.

After all, we’ve seen a great many innovations in the technology arena that have not delivered value in terms of the protection and promotion of individual and collective privacy rights.

Yesterday I authored an op-ed in The Australian, which concerned one such innovation: tracking pixels.

Pixels are one of many tracking tools, including cookies, that permit granular user surveillance across the internet and social media platforms. By deploying pixels – tiny pieces of code – in their websites, brands can share the details of visitors to their websites with social media platforms to ensure those same visitors are later targeted with related advertising. The range of data that websites share differs greatly across sites, but can span from the basic fact of a site visit to personal details such as email addresses and mobile numbers.

These tools are born of innovation devised to support a digital ecosystem driven by the business model of advertising – what Professor Shoshana Zuboff called ‘surveillance capitalism’ – wherein brands, keen for our attention, pay a premium to platforms that know enough about us to deploy the right ad at the right time. Major social media platforms are key drivers of this economy. But so are the shopping outlets, news media, health providers, educational institutions and other online services that embed pixels and other tracking tools in their websites.

Now, we may decide that there is a place for such innovations; that tracking pixels may permit mental health services to be targeted to young people in need, for example. But there are undeniably negative effects of this kind of innovation, just as we see many other innovative practices in the online space that similarly have negative effects. Our daily interactions with online and offline attempts to acquire our personal information are like death by a thousand cuts, wearing down our ability to meaningfully engage with privacy policies, terms and conditions and consent notices. Our website browsing preferences have become currency in a data economy that incentivises the collection of more and more personal data; the creation of more and more outrageous, misogynistic and bombastic content; and the engineering of more and more addictive features to keep us scrolling.

In my op-ed yesterday, I indicated that, despite the concerns I’ve just surfaced, I’ve decided to close the preliminary inquiries my office commenced into the TikTok pixel earlier this year. I have concluded that, while the practice is harmful, invasive and corrosive of online privacy, there has been no obvious and clear contravention by TikTok of Australian privacy law as it is written.

For this reason, I’m compelled to point out the need for another form of innovation: regulatory innovation. Now, there are those in the room who will say that is a contradiction in terms, but I would urge you to update your thinking. When it comes to new technologies, against a backdrop of widespread community concern about invasions of privacy and control of personal information, modern, technologically neutral, robust regulatory protections are key to ensuring businesses and government have the social licence to truly innovate with technology.

Community privacy attitudes

Australians are becoming increasingly savvy about the value of their personal information and have made it clear that they have high privacy expectations.

Around this time last year, my office conducted research into Australian attitudes to privacy.

It revealed:

  • 84% of Australians want more control and choice over the collection and use of their information.
  • 89% want to see government pass more legislation that protects their personal information.

A positive finding for this audience was that, after health service providers, federal government agencies were the second most trusted sector. Two-thirds (67%) of the Australians we surveyed said they trust agencies when it comes to how they protect and use their personal information.

There is work to be done by government agencies, though: 89% of the Australians we surveyed told us they would like government agencies to do more to protect their personal information.

Privacy law reform

Regulatory innovation could be key to unlocking greater trust for government agencies. There is now a suite of reforms to the Privacy Act on the table, and the government has committed to bringing forward these legislative changes in the second half of this year. The Privacy Act reforms include proposals that would shift the existing onus on individuals, who are currently required to safeguard their privacy by navigating complex privacy policies and consent requirements. Instead, more responsibility would be placed on entities to be transparent, to consider the impacts on individuals of their handling of personal information, and to support individuals to exercise their privacy rights.

Some examples of particular reforms that would represent an innovative approach to privacy regulation here in Australia and at the same time lay the groundwork for more innovation in public sector policy deployment include:

  • A proposal to introduce a fair and reasonable test, which would provide a baseline level of privacy protection and allow individuals to engage with products and services with confidence that – like a safety standard – privacy protection is a given. Entities won’t be able to ‘consent out’ of the requirement to justify their activities, which will ensure that the choices presented to individuals are inherently fair and reasonable.
  • A proposal that would introduce a right for individuals to request meaningful information about how substantially automated decisions with legal or similarly significant effect are made. Entities would also be required to include information in privacy policies about the use of personal information to make substantially automated decisions with legal or similarly significant effect.
  • A proposal to change the word ‘about’ in the definition of personal information to ‘related to’, to clarify that personal information is an expansive concept that includes technical and inferred information where this information can reasonably be used to identify individuals.
  • A proposal to amend the definition of ‘collection’ to expressly cover information obtained from any source and by any means, including inferred or generated information.

The Privacy Act reforms will also extend existing obligations that public sector agencies already comply with to the private sector – these include the requirement to appoint a privacy officer and conduct a privacy impact assessment for high privacy risk projects. This means that, as the private sector looks to uplift their compliance in anticipation of privacy reform, Australian Government agencies will be able to capitalise on their existing arrangements to ensure government is an exemplar in best practice personal information handling.

These reforms will put Australian Government agencies on a strong footing to embrace the digital innovation agenda. Stronger privacy laws will build trust with the Australian community. In the case of public sector digital transformation, that will in some cases mean rebuilding trust, given a few key misfires in digital innovation in the public sector to date.

Digital ID

While we await Privacy Act reform, I’m also optimistic about the innovation we’re seeing in the space of digital identity, which similarly reinforces and strengthens Australians’ privacy protections.

While this area has historically been very fraught, I believe we are ready to take this on, and success will be delivered by a more privacy-centric approach. The Digital ID system will let us prove who we are online more easily and securely – and crucially, without sharing identity documents with every organisation. This will be good for individuals, and also good for businesses, which generally don’t like storing this sort of information.

Sharing these documents with fewer organisations – and making them less central to proving who we all are – can play a significant role in reducing the potential for data breaches.

Privacy will be at the core of the Digital ID system. As the system’s privacy regulator, the OAIC has developed and is enforcing privacy safeguards to protect data and promote best practice. We are committed to ensuring the entities accredited under the system have the privacy-related guidance needed to confidently meet privacy obligations.

The ACCC and the OAIC have separate but complementary roles in regulating the Digital ID system. We will be working closely with the ACCC, which, as the accrediting entity, has powers to ensure Digital ID providers and services comply with the legislation. We have the power to investigate privacy complaints about Digital ID conduct and, as the system expands, we will use a range of enforcement powers to ensure that individuals’ privacy is protected.

Digital ID is a great example of innovation in the pursuit of public good. This is the type of innovation we should value and encourage throughout the public sector.

Many new technologies offer us the opportunity to innovate in ways that benefit the Australian public. Regulation should help us build the foundations for that innovation: by regulating to innovate, we can ensure there are strong protections in place for the Australian public and build a licence to operate for both the public and private sectors. I am looking forward to working to support public sector agencies to this end.