Privacy implications of the Digital Platforms Inquiry
Presentation by Australian Information Commissioner and Privacy Commissioner Angelene Falk to the Law Council of Australia Media and Communications Seminar 2019: “Digital Platforms ― The Future”
Thank you, I’m very pleased to be here today with you and with ACCC Chairman Rod Sims.
We’re living in a time when cooperation between regulators, both domestic and international, is more important than ever. The intersection between privacy and consumer law, and their complementary ability to address consumer harm, is increasingly recognised.
Rod and I share many areas of mutual concern. One shared objective ― securing better data protection outcomes for all Australians ― is vital to the community, and we continue to work together on this issue.
This convergence is not unique to Australia. The Global Privacy Assembly is a network of more than 120 data protection and privacy authorities, where I serve on the executive committee. It passed a resolution in 2017 on the need for collaboration between data and consumer protection authorities. Alongside Canada, my Office is leading global work to harness this convergence.
That’s why my first request to staff, when I was appointed acting Information Commissioner and Privacy Commissioner back in March last year, was to set up a meeting with the ACCC Chair.
That week we were in the throes of finalising our submission to the ACCC’s Digital Platforms Inquiry Issues Paper. The following week, I opened a formal investigation into Facebook after learning that the data of Australians may have been acquired and used without authorisation in what has become known as the Cambridge Analytica matter.
So… it was a busy time. There’s nothing quite like opening an investigation into a global tech giant in your second week on the job. And the pace hasn’t stopped.
State of play in privacy
Digital platforms, and their handling of our personal data, are under close scrutiny by both privacy and consumer protection regulators around the world. To name a few:
- Competition regulators in the US and Germany have issued orders against the ‘Social Network’, including the US Federal Trade Commission’s $5 billion penalty.
- In the UK, the Information Commissioner has fined Facebook £500,000 over Cambridge Analytica.
- French data protection authority the CNIL has imposed a €50 million penalty against Google under the EU's GDPR.
- Here in Australia, the ACCC recently instituted proceedings against Google alleging misleading conduct and representations.
Against this backdrop, the ACCC’s Digital Platforms Inquiry is far reaching. It has brought privacy and data protection into sharper focus in Australia, perhaps the most focus since the Australian Law Reform Commission’s review of privacy and its 2008 report: For your information.
There’s no doubt that privacy laws, and our concept of how they should best protect privacy, continue to evolve. The Privacy Act has been amended almost 90 times since it commenced in January 1989. At the time of their introduction, many of these reforms were world leading.
The ALRC report led to the reforms that came into operation in 2014 and enacted the Australian Privacy Principles. These elevated privacy as a key consideration for business. And in 2018, the introduction of the Notifiable Data Breaches scheme changed the way business mitigates security risks.
The recommendations from the Digital Platforms Inquiry now give us both challenge and opportunity: to ensure Australia’s privacy protection framework effectively protects personal information given the increasing volume and scope of data collection in the digital economy. We must ensure that it has the right regulatory settings in place, protects the rights of individuals, holds business and government agencies to account, and supports the data economy.
Digital Platforms Inquiry
Digital platforms have brought about fundamental changes in the way we connect, campaign, trade information and do business. The ACCC’s inquiry has helped to unveil the extent of their data use, as well as the information asymmetry between these platforms and the individuals whose data they harvest.
These asymmetries present significant challenges for people in making an informed decision about how their personal information is handled online.
As New Zealand Privacy Commissioner John Edwards recently noted, we now have digital platforms with populations bigger than nation states. Facebook has a ‘population’ of more than two billion people, none of whom has voting rights there ― and many of whom may be reluctant to vote with their feet for fear of missing out.
The ACCC’s report seeks to redress this power imbalance and increase transparency, choice and control for consumers. Its recommendations aim to clarify and fortify our privacy framework. It also suggests new rights for individuals and obligations for organisations.
This is a recent trend we are seeing across the domestic privacy landscape: enhanced protections that apply in specific policy contexts being expressed in legislation as obligations that also confer individual rights.
To give you an example, the right to have your My Health Record permanently deleted, and a corresponding obligation on the System Operator to do so. A similar right applies under the new Consumer Data Right scheme.
In effect, these changes place a greater emphasis in our privacy frameworks on individual rights. It is timely to consider this approach across the privacy spectrum, and the OAIC and ACCC have both flagged the need for a broader review of our privacy framework.
The types of questions that need to be asked are: does the Privacy Act contain the kinds of rights and obligations that are needed to protect personal information over the next decade? Do we need ― as we’ve seen under the GDPR ― rights relating to profiling and automated decision making, or express obligations for business like compulsory impact assessments for high risk data processing?
Privacy into the next decade
So going back to my first week on the job, and my request to meet Rod. Rod is generous with his time. It helps that our offices are in the same building. And we duly met a couple of days later.
And so, Rod asks me, what do you data protection authority types talk about when you get together? What does “ideal” look like? Well, I say ― that’s a big question. And my response goes something like this.
The online environment, with all its innovation and benefits, has created unimagined regulatory challenges. These issues are complex, and the solutions depend on national values, norms and legal systems. They are ultimately questions for Governments. As you know, the Government is consulting on the final Digital Platforms report before releasing its response.
In addressing these regulatory challenges, our data protection experience indicates that there are four key elements to support effective privacy regulation over the next decade:
- Global interoperability ― put simply, making sure our laws continue to connect around the world, so our data is protected wherever it flows.
- Enabling privacy self-management ― so individuals can exercise meaningful choice and control.
- Organisational accountability ― ensuring there are sufficient obligations built into the system, and
- A contemporary approach to regulation ― having the right tools to regulate in line with community expectations.
This is the lens that we have applied to our engagement with the Digital Platforms Inquiry and to privacy reforms generally.
The first element ― global interoperability ― doesn’t mean that all data protection laws need to be the same. Around the globe, there is greater convergence of privacy principles and standards as law makers and regulators increasingly see things alike, learn, and borrow concepts from one another.
There’s now a ‘global toolbox’ to draw from, with standards and approaches like the GDPR and APEC Privacy Framework Principles, among others. We are seeing the GDPR’s influence in Australia, in our region, and around the world; most recently, in the California Consumer Privacy Act.
So we can learn from this experience and minimise regulatory friction where that fits our domestic context. There is also an opportunity to be ambitious and look for new solutions to suit our local needs.
Enabling privacy self-management
The second element, privacy self-management, is a feature of privacy frameworks globally. Done well, it allows individuals to exercise choice and control by understanding how their personal information is being handled through notice and consent.
This relies on organisations making this information accessible and understandable, using consent where it matters most and is meaningful. This is the consent challenge.
Because life is short, attention spans are shorter, and the ability of individuals to fully and freely engage with options is often fragmented. At the recent privacy professionals’ summit in Sydney we heard from Professor Woodrow Hartzog from Northeastern University about the trend to overload people with information and call it transparency, to give us “endless toggles” and call it choice.
Individuals may be disengaged or unable to engage. They may just want the service and feel they have little option but to agree to complex terms and conditions.
There’s no one solution to this, but requiring default settings for data practices that rely on consent to be switched to ‘off’, as the ACCC has recommended, is part of the picture. So is the ACCC’s recommendation to make clear in law that different purposes of data collection, use, or disclosure should not be bundled.
Other ACCC proposals would also strengthen notification and consent requirements. This includes aligning the definition of consent more closely in law with the GDPR to require a clear affirmative act that is freely given, specific, unambiguous and informed.
But in a complex environment of information flows, there is a risk that consent is elevated such that anything can be agreed to by ticking a box. The limitations of the consent model have been recognised by the UK Information Commissioner and the European Data Protection Board. Both stated that consent is only appropriate where individuals can be offered real choice and control over how their personal information is used.
So there is a need to balance stronger notification and consent requirements with the potential for consumer fatigue. As the ACCC report notes, one option to manage this fatigue is to not require consent when personal information is processed in accordance with a contract to which the consumer is a party. The use of standardised icons or phrases can also strengthen notice and consent, by facilitating consumers’ comprehension and decision-making.
We could also say, as a society, that some data practices are simply not ok. The Office of the Privacy Commissioner of Canada calls these ‘no-go zones’. They provide base-level protections, regardless of consent, by identifying and prohibiting collection, use and disclosure practices that are generally considered inappropriate.
For example, those that would result in profiling or categorisation that leads to unfair, unethical or discriminatory treatment, or might cause significant harm to an individual.
An illustration of a possible ‘no-go zone’ is collecting information to use in targeted advertising to children online. We saw how this can play out earlier this year when, following complaints and an investigation by the FTC, YouTube announced that it plans to end targeted advertising for uploaded videos that children are likely to watch.
Getting organisational accountability right
As well as constraining practices which are contrary to consumers’ expectations, additional accountability measures can redress the power and information imbalance. This is the third element to support privacy into the next decade: ensuring sufficient obligations are built into the system to hold organisations accountable.
Requirements to embed privacy into the design of technologies, architecture and systems from the start play an important role in supporting Australians to self-manage their privacy and make entities more accountable for their use of personal information. Mandatory privacy impact assessments for high risk data practices also build in accountability.
We can learn from several models that aim to strike a balance between consent, data rights and organisational accountability, including the GDPR, Californian law and proposed Canadian reforms.
Consumer Data Right as a model
Closer to home, we can also learn from the Consumer Data Right as an example of a reform aimed at balancing individuals’ right to control and utilise their data with strong accountability measures, to enable greater competition, consumer benefits and economic growth.
Consumer consent for the collection and use of their data is the bedrock of the CDR regime.
There are also strict rules around what can and cannot be done with CDR data. This enables the consumer to be the decision maker, directing where their data goes so they can get the most value from it.
This self-management approach is complemented by a range of privacy protections and accountability measures to ensure CDR data is transferred safely and securely. That includes accrediting businesses before they are entrusted with consumers’ data. To be accredited by the ACCC, participants must be able to meet and maintain clear privacy and information security criteria.
The introduction of a third-party accreditation scheme that applies more broadly across the economy could give consumers evidence-based information about the privacy credentials of entities, before they engage with them.
Other Consumer Data Right features that could be considered more broadly include:
- the concept of a ‘dashboard’ where consumers can track where their data goes, and
- the right of direct access to the courts over a privacy breach, in addition to regulator remedies.
Contemporary approach to regulation
The fourth element in ensuring our privacy framework is effective into the next decade is a contemporary approach to regulation. This includes having the right regulatory tools to take a proactive approach to enforcement.
The OAIC has regulatory oversight right across the economy, from the local GP clinic to big banks, telcos, government, and online platforms the size of nation states.
As a welcome step, in March this year the Government announced it will legislate to strengthen privacy protections and regulatory tools, including increased penalties and infringement notices. The OAIC has received additional funding for the next three years to assist us to regulate the online environment and to address the timely resolution of complaints.
We are building capacity within the Office, strengthening our investigative and enforcement teams alongside our strong policy and alternative dispute resolution teams. This will help us meet the ever-increasing demand for privacy advice and guidance, and complaint investigation and resolution.
A contemporary approach to regulation also requires collaboration, using the full range of laws and regulators in a coordinated way both domestically and internationally ― and we are doing this now. Internationally, we continue to develop and participate in arrangements that support cooperation in investigation and the enforcement of privacy and data protection laws. At home, the Consumer Data Right ― co-regulated by the ACCC and the OAIC ― is a prime example of collaboration to protect the community’s rights.
Privacy Code for digital platforms
In March, the Government also announced that it would require my Office to develop a privacy code for social media and online platforms which trade in personal information.
The Code will require platforms to stop using or disclosing an individual's personal information upon request, and to be more transparent about any data sharing. It will also require more specific consent of users to the handling of their personal information, and stronger protections for a user who is a child or other vulnerable Australian.
We will be working in consultation with industry and other stakeholders to develop the code, and learning from the experience of the UK, where Information Commissioner Elizabeth Denham is preparing to release a final version of an age appropriate design code of practice for online services.
The UK Code is based on 16 principles that embed the notion of harm minimisation, and require tech, gaming and online entertainment companies to act in the best interests of the child. Settings must be high privacy by default, including switching off geolocation and profiling options, with the goal not of protecting children from the digital world, but within it.
In conclusion, Australia is at a pivotal point in the regulation of privacy. The Digital Platforms Inquiry gives us an evidence base that supports the need for strengthened privacy protections.
Through the development of the Code for digital platforms, and in contributing our regulatory experience of privacy frameworks, we will seek to apply these four essential elements to protect privacy in the digital age ― interoperability, privacy self-management, organisational accountability, and contemporary regulation.
Our ultimate goal is to be at the forefront of privacy and data protection, with laws and practices that increase consumer trust and confidence in the protection of personal information and enable innovation and economic growth.