Introduction

1 The Office of the Australian Information Commissioner (OAIC) welcomes the opportunity to make a submission in response to the Issues Paper for the Committee’s Inquiry into international digital platforms operated by Big Tech companies.

2 The OAIC is an independent Commonwealth regulator within the Attorney-General’s portfolio, established to bring together three functions: privacy functions (protecting the privacy of individuals under the Privacy Act 1988 (Cth) (Privacy Act) and other legislation), freedom of information functions (access to information held by the Commonwealth Government in accordance with the Freedom of Information Act 1982 (Cth)), and information management functions (as set out in the Australian Information Commissioner Act 2010 (Cth)).

3 We understand the purpose of the inquiry is to consider the nature and extent of the power and influence that international digital platforms operated by large overseas-based multinational technology companies (Big Tech) exert over markets and public debate, to the detriment of Australian democracy and users. The Issues Paper canvasses a variety of topics relevant to the regulation of Big Tech, including data and privacy.

4 Many aspects of the daily lives of Australians have been transformed by the products and services offered by Big Tech and their innovations in technology and service delivery. The scale and scope of technological change has altered the way that individuals interact socially, conduct business and receive services. It has also given rise to new risks, many of which arise from the increase in the amount of data and personal information collected, used and shared, both in Australia and globally. The OAIC considers that strong data protection and privacy rights are therefore an essential piece in the ring of defence that is being built to address the risks faced by Australians in the online environment.

5 The Privacy Act 1988 (Cth) provides an important framework for minimising the privacy risks associated with personal information handling and providing individuals with choice and control in how their information is used. The Privacy Act applies to international digital platforms carrying on a business in Australia.[1]

6 However, the dynamic digital market and constantly evolving technologies are challenging the parameters of existing legislative frameworks that apply to the digital environment. The Australian Government has recently released the report of the Attorney-General’s Department’s review of the Privacy Act (Privacy Act Review report), which contains various proposals that aim to ensure the Act and its enforcement mechanisms are fit for purpose in the digital economy.

7 This includes proposals to reduce the burden on consumers to manage their privacy by increasing organisational accountability, to better protect children from privacy harms, to provide additional mechanisms to pursue compensation, and to increase transparency in relation to algorithms.

8 The different harms that can arise in the online environment have resulted in intersections between regulatory spheres in regulating digital platforms, which highlights the importance of regulatory cooperation and coordination. The OAIC has effective, collaborative and longstanding working relationships with the Australian Competition and Consumer Commission (ACCC), the Australian Communications and Media Authority (ACMA) and the Office of the eSafety Commissioner, including through our participation in the Digital Platform Regulators Forum (DP-REG). More information about DP-REG is set out in a separate joint letter submitted by DP-REG to the Committee’s Inquiry.

9 The purpose of this submission is to provide the Committee with further information about Australia’s existing privacy framework, the key proposals in the Privacy Act Review report that have direct relevance to the issues being considered by the Committee, and current domestic initiatives designed to ensure a multi-faceted regulatory approach to the regulation of digital platforms.

Protections in the Privacy Act

10 The Privacy Act contains a set of technologically neutral and flexible principles, the Australian Privacy Principles (APPs). The APPs apply to ‘APP entities’, which relevantly for the purposes of this inquiry includes organisations with an annual turnover of more than $3 million and certain other organisations below this threshold, such as those trading in personal information.[2]

11 The APPs outline how organisations are permitted to handle personal information and are structured to reflect the information lifecycle. This includes obligations in relation to the collection, use and disclosure of personal information, the adequate protection of that information, and transparency about how it is handled.

12 The technology neutral construction of the APPs allows them to apply to the handling of personal information within a diverse range of technologies including those considered in the Issues Paper such as cloud services, algorithms, the metaverse and Central Bank Digital Currencies. The flexible principles-based nature of the Privacy Act also means it is adaptable to changing technology, and able to complement other legislation or regulatory frameworks that deal with related issues.

13 The principles-based framing of these obligations enables APP entities to scale their responsibilities proportionally to the volume and type of personal information that they hold. Where the volume or sensitivity of personal information held by an entity increases, so too will the expectations placed upon the entity to protect that information.

Uplifting privacy protections for the 21st century

14 The increased volume and granularity of the collection of personal information combined with other practices such as data sharing, tracking and monitoring have the potential to amplify privacy risks and create new privacy harms. The Privacy Act Review report considers what changes are needed to Australia’s privacy framework to respond to these harms. It includes detailed examination of many of the data and privacy issues the Committee explores in its Issues Paper. To assist the Committee, we highlight some of the key proposals in the Privacy Act Review report which relate to these issues.

Increased accountability for APP entities

15 The Issues Paper discusses privacy concerns arising from the tracking of customers, the collection of large amounts of information and the targeting of advertisements. This data handling is increasingly complex, making it difficult for individuals to understand and control the ways in which their personal information is being handled.

16 Notice and choice are foundational principles in privacy law across the world, including in the Privacy Act. However, our 2020 Australian Community Attitudes to Privacy Survey found that 69% of individuals do not read privacy policies attached to any internet site, largely due to their length and complexity.[3] In an environment where there has been an exponential increase in the collection, use and disclosure of personal information as part of standard business models, and where consumer information about those practices is long, complex and difficult to navigate, these mechanisms are limited in their ability to restrain harmful activities.

17 Even where individuals do read privacy policies and APP 5 notices, they may feel resigned to consenting to the use of their information in order to access online services, because they do not feel there is an alternative. As digital products and services become more entrenched in individuals’ lives as the way in which they work, study and engage socially, it is increasingly difficult for individuals to avoid pervasive tracking and data handling practices that do not align with their preferences.

18 In these circumstances it is inappropriate for businesses to place the full responsibility on individuals to protect themselves from harm. Additional protections for Australians are required that shift corporate attitudes away from relying on purported consent to the handling of individuals’ personal information.

19 The Privacy Act Review report proposes establishing a positive obligation on organisations to handle personal information fairly and reasonably.[4] The OAIC views this proposed reform as a new keystone for the Privacy Act and an important mechanism to change corporate attitudes regarding data collection and management.

20 The introduction of a central obligation to collect, use and disclose personal information fairly and reasonably would provide a new baseline for privacy practice that meets community expectations, and would help to restore and build trust. A practice that may be unfair or unreasonable under the new test is, for example, where social media companies infer highly personal or sensitive information about individuals, particularly children, including about their moods or socio-economic status, and target them with inappropriate content, such as gambling advertising or gambling-related content.

21 In our view, a positive obligation for organisations to handle data fairly and reasonably would give individuals greater confidence that they will be treated fairly when they choose to engage with a service. This would prevent consent being used to legitimise handling of personal information in a manner that, objectively, is unfair or unreasonable.

22 These changes will also remove the privacy burden from individuals, by providing the same assurances to people who share their personal data as those provided through well-established workplace and consumer safeguards. This will allow individuals to engage with products and services with confidence that—like a safety standard—privacy protection is a given. It also provides the flexibility needed by entities to use personal information to innovate and contribute to a thriving digital economy.

23 These measures will need to be supported by an effective enforcement regime. The right regulatory toolbox to deliver a timely and proportionate response will help to build community confidence and deter non-compliant behaviour.

Protecting children from harm online

24 The Issues Paper recognises that online safety is linked to many different aspects of online engagement. Research commissioned by the OAIC found the risks and harms that children face online arise primarily from the monetisation of their personal information, from the social impacts of sharing personal information on their reputation and life opportunities, and from ‘e-safety risks’ such as cyberbullying, contact risks and scams.[5]

25 The technologically neutral APPs apply to this environment to respond to harms flowing from the collection, use, disclosure and holding of personal information. The Privacy Act operates in tandem with cybersecurity protections and the Online Safety Act, which is administered by the eSafety Commissioner, to protect Australians from online harms. Privacy and online safety each have different but important roles to play in keeping children safe online.

26 Children’s privacy is an area of concern for parents. Our 2020 Australian Community Attitudes to Privacy Survey found that 91% of parents are concerned about the protection of their child’s personal information. At the same time, it is important to protect children within the online environment and not from it: the same survey found that 82% of parents believe children must be empowered to use the internet and online services, but that their data privacy must also be protected in this environment.

27 The Privacy Act Review report considers child-specific privacy protections and puts forward proposals to strengthen protections for children. To encourage entities to consider the impacts of their information handling on children it proposes including ‘whether the collection, use or disclosure of personal information is in the best interests of the child’ as a factor in whether an act or practice is fair and reasonable.[6] It also proposes prohibitions on direct marketing to a child, targeting to a child and trading in the personal information of children with limited exceptions.[7]

Legal mechanisms to pursue compensation

28 Under the Privacy Act, individuals may make a complaint about an interference with their privacy to the Australian Information Commissioner. Many complaints are resolved through alternative dispute resolution, and the outcome may include an agreement by the respondent to pay compensation. Where a complaint is investigated, the Commissioner may make a determination as to whether the complaint is substantiated. If it is, the determination may include a declaration that the individual is entitled to compensation for loss or damage suffered due to the interference with privacy. However, there is currently no avenue for an individual to take a respondent directly to court to seek compensation for an interference with their privacy under the Privacy Act (a direct right of action), and no tortious right of action for invasion of privacy.

29 The Privacy Act Review report proposes the introduction of a direct right of action for any individual or group of individuals whose privacy has been interfered with by an APP entity.[8] Under this right, they would be able to take their case to court after first having their complaint assessed for conciliation by the OAIC or a recognised external dispute resolution scheme. This would give individuals greater control over their personal information by providing an additional avenue of redress under the Privacy Act, create additional incentives for APP entities to comply with their obligations, and provide increased opportunities for the courts to interpret the Privacy Act.

30 The Privacy Act Review report also proposed the introduction of a statutory tort of privacy in the same form as was recommended by ALRC Report 123.[9] ALRC Report 123 recommended a statutory tort where a plaintiff could prove their privacy was invaded in one of two ways: intrusion upon seclusion, or misuse of private information.[10] To establish the tort, the plaintiff would also need to prove that:

  • the public interest in privacy outweighed any countervailing public interest
  • the breach of privacy satisfied a seriousness threshold, and
  • they had a reasonable expectation of privacy in all the circumstances.[11]

31 The scope of the statutory tort would be broader than a direct right of action under the Privacy Act, as the Privacy Act only applies to APP entities and is limited to the protection of personal information rather than other types of privacy, such as bodily or territorial privacy. Specifically, the tort would extend to actions by entities not currently covered by the Privacy Act, such as individuals acting in a personal capacity, most small business operators, media organisations acting in the course of journalism, and registered political parties and political representatives. For example, the Privacy Act does not allow individuals to seek compensation where their intimate images have been shared by other individuals without their consent. A statutory tort would enable individuals to pursue legal redress for serious invasions of privacy in a broader range of circumstances.

Increasing transparency in algorithms

32 Algorithms operate in an existing regulatory landscape, which includes privacy law. The APPs provide a foundation for addressing many of the challenges algorithms present, including through requirements for transparency, data quality and access rights. In relation to transparency, the APPs include requirements to provide information in privacy policies and APP 5 notices about the purposes for which personal information will be used, including where it will be used in algorithms. In addition, APP 12 requires APP entities holding personal information about an individual to give the individual access to that information on request, unless an exception applies. These mechanisms facilitate privacy self-management by individuals.

33 The OAIC has developed a Guide to Data Analytics and the Australian Privacy Principles, which highlights strategies to manage potential privacy risks for analytics processes, including through the use of algorithms.[12] Strategies include embedding good privacy governance through a privacy-by-design approach, conducting privacy impact assessments, and maintaining awareness that data analytics can create additional personal information, which will also be subject to the APPs.

34 As noted in the Issues Paper, the processes algorithms use to produce their outputs are often complex and difficult to comprehend. This issue has been considered in other government inquiries, including the Australian Human Rights Commission’s Human Rights and Technology project and the Privacy Act Review.[13]

35 The Privacy Act Review report proposed a requirement for privacy policies to set out the types of personal information that will be used in substantially automated decisions which have a legal or similarly significant effect on an individual’s rights.[14] It also proposed a right for individuals to request meaningful information about how substantially automated decisions with legal or similarly significant effects are made.[15] This information would help individuals understand the decisions being made about them, fulfilling a similar function to the notice requirements, overseen by the Federal Trade Commission, in proposed US legislation.[16]
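
To make these two proposals more concrete, the following minimal Python sketch shows one way an entity might record the information they contemplate. It is an illustration only: the class and field names are hypothetical assumptions, not drawn from the Privacy Act Review report or any OAIC guidance.

```python
# Hypothetical record of an automated-decision disclosure. All names are
# illustrative assumptions; the report does not prescribe any format.
from dataclasses import dataclass, field


@dataclass
class AutomatedDecisionDisclosure:
    decision: str                # the substantially automated decision
    significant_effect: str      # why the legal/significant-effect threshold is met
    personal_information_types: list[str] = field(default_factory=list)
    logic_summary: str = ""      # 'meaningful information' available on request


disclosure = AutomatedDecisionDisclosure(
    decision="credit limit assessment",
    significant_effect="affects an individual's access to credit",
    personal_information_types=["income", "repayment history"],
    logic_summary=(
        "Repayment history is scored against income bands; scores below "
        "a threshold are referred for manual review."
    ),
)

# The first proposal concerns what a privacy policy sets out ...
print("Types used:", ", ".join(disclosure.personal_information_types))
# ... and the second, what an individual can request about how decisions are made.
print("How decisions are made:", disclosure.logic_summary)
```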

36 Given the work already taking place to regulate algorithms and the range of existing regulatory bodies operating in relation to digital platforms, careful consideration should be given to whether establishing a new oversight body would be the most appropriate model. We suggest consideration be given to whether existing bodies could be resourced to take on any new regulatory activities for algorithms.

Responding to data breaches

37 As set out in the Issues Paper, recent data breaches have highlighted the importance of securing personal information. We note the high level of community concern about the protection of personal information and whether it needs to be collected and retained by organisations.

38 The Privacy Act includes well-established security requirements that play an important part in reducing the likelihood of data breaches occurring and in mitigating their impacts on individuals.

39 Under APP 1, entities must take steps beyond technical security measures to protect and ensure the integrity of personal information throughout the information lifecycle, including by implementing strategies in relation to governance, internal practices, processes and systems, and dealing with third party providers. This ‘privacy by design’ approach under APP 1 supports strong data security amongst regulated entities by establishing measures that protect personal information from misuse, interference and loss, and from unauthorised access, modification or disclosure. This approach also ensures entities detect privacy breaches promptly and are ready to respond to potential privacy breaches in a timely and appropriate manner.

40 In complying with APP 11, businesses are required to take reasonable steps to protect the personal information they hold, which includes actively monitoring their risk environment for emerging threats and implementing appropriate mitigation strategies.

41 The Notifiable Data Breaches (NDB) scheme requires APP entities to notify the OAIC and affected individuals about data breaches that are likely to result in serious harm to an individual whose personal information is involved. This is designed to enable individuals whose personal information has been compromised in a data breach to take remedial steps to lessen the adverse impact that might arise from the breach.
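
As a minimal illustration of the notification trigger described above, the following Python sketch models the decision in simplified form. The names are hypothetical, and the sketch deliberately reduces ‘likely to result in serious harm’ to a pre-made boolean assessment; in practice this is a contextual judgement the scheme requires entities to make.

```python
# Simplified, hypothetical model of the NDB scheme's notification trigger.
from dataclasses import dataclass


@dataclass
class BreachAssessment:
    personal_information_involved: bool  # was personal information compromised?
    likely_serious_harm: bool            # entity's assessment, after any remedial action


def notification_required(assessment: BreachAssessment) -> bool:
    """Return True if the breach must be notified to the OAIC and affected individuals."""
    return assessment.personal_information_involved and assessment.likely_serious_harm


breach = BreachAssessment(personal_information_involved=True, likely_serious_harm=True)
if notification_required(breach):
    print("Notify the OAIC and affected individuals")
```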

42 Failure to comply with the NDB obligations is an interference with privacy and can result in civil penalty proceedings. It is important for these penalties to be sufficiently large to deter non-compliant behaviour. Recent amendments through the Privacy Amendment (Enforcement and Other Measures) Act 2022 (Cth) increased the maximum penalty available under the Privacy Act to the greater of the following (a simple sketch of this calculation follows the list):

  • $50 million
  • if it can be determined, three times the value of the benefit the body corporate and related bodies corporate obtained from the contravention
  • if the value of the benefit obtained cannot be determined, 30% of the adjusted turnover of the body corporate during the breach turnover period for the contravention.
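
A minimal Python sketch of how these three limbs combine is set out below. The function and variable names are hypothetical; the figures and structure follow the description above.

```python
# Hypothetical sketch: maximum civil penalty as the greater of the three
# limbs described above. Not an official calculator.
from typing import Optional


def max_civil_penalty(benefit_value: Optional[float],
                      adjusted_turnover: Optional[float]) -> float:
    """Return the maximum penalty in AUD.

    benefit_value: value of the benefit obtained from the contravention,
        or None if it cannot be determined.
    adjusted_turnover: adjusted turnover during the breach turnover period,
        relevant only where the benefit cannot be determined.
    """
    limbs = [50_000_000.0]                      # limb 1: $50 million
    if benefit_value is not None:
        limbs.append(3 * benefit_value)         # limb 2: three times the benefit
    elif adjusted_turnover is not None:
        limbs.append(0.30 * adjusted_turnover)  # limb 3: 30% of adjusted turnover
    return max(limbs)


# Example: the benefit cannot be determined and adjusted turnover is $400m,
# so the maximum penalty is max($50m, 30% of $400m) = $120m.
print(max_civil_penalty(benefit_value=None, adjusted_turnover=400_000_000))
```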

43 The OAIC also encourages strong security protection and effective responses to data breaches through monitoring compliance with the NDB scheme and providing guidance on obligations under the Privacy Act.

44 The Privacy Act Review report makes several proposals to amend the NDB scheme, including further changes to the matters that must be included in a notification, changes to notification timeframes, and an express requirement for entities to take reasonable steps to implement practices, procedures and systems to enable them to respond to a data breach.[17] While we support these proposals, we note they would be strengthened by an express obligation on entities to take reasonable steps to prevent or reduce the harm that is likely to arise for individuals as a result of a data breach.

45 Under the current NDB scheme, the burden is largely placed on individuals to take steps to reduce the harm that may result from a data breach suffered by an entity. We have observed that best practice entities take responsibility for the costs and impacts of data breaches when they occur, and support individuals to mitigate the impact of a data breach. This aligns with community expectations that entities need to take responsibility for supporting individuals during a large-scale data breach. As such, we consider this should be a mandatory obligation on breached entities.

46 The Privacy Act Review report suggests further consideration be given to a requirement that entities take reasonable steps to prevent or reduce the harm that is likely to arise for individuals as a result of a data breach.[18] As the obligation is qualified by the well-established ‘reasonable steps’ test, the steps each entity is required to take will depend on the circumstances, including the nature of the entity, the sensitivity of the personal information involved, the adverse consequences for the individual, and the time and cost involved. It also recognises that, depending on the circumstances, it may be reasonable to take no steps.

Social Media (Anti-Trolling) Bill 2021

47 The Issues Paper asks whether the former government’s Social Media (Anti-Trolling) Bill 2022 adequately addressed online safety and, if so, whether it should be reintroduced into the Parliament.

48 The OAIC made submissions to the Attorney-General’s Department’s consultation on the exposure draft of the Anti-Trolling Bill and to the Senate Legal and Constitutional Affairs Committee when the Bill was referred for inquiry.[19] As set out in those submissions, we consider that the Bill creates privacy risks and impacts for Australian users of social media, which we have set out in more detail below.

49 The Anti-Trolling Bill proposed to introduce a new framework for Australians to ascertain the contact details of individuals who post anonymous defamatory comments on social media, for the purposes of commencing defamation proceedings.[20] It deemed social media services to be ‘publishers’ for the purposes of defamation law, but included a conditional defence from liability where the social media service disclosed the contact details of a commenter in certain circumstances. We understand the intention was that social media services would be unable to rely on the defence if they provided fake or inaccurate details.

50 As set out in our submissions, a likely outcome of the framework in the Anti-Trolling Bill is that social media services would seek to collect additional categories of personal information that they do not already hold, and then take steps to verify the authenticity and accuracy of their information holdings in order to rely on the defence to defamation liability. If social media services were required to provide a person’s actual or legal name, this may result in them seeking to collect identity information, such as government-issued credentials, to verify the authenticity of an individual’s name.

51 When an entity collects more personal information than is necessary, this may increase the risk of harm to an individual in the event of a data breach. As recent incidents have made apparent, data breaches can have significant impacts on the community. Collecting government-issued credentials carries an increased risk, as they contain significantly more personal information than may already be collected and held by social media services (such as address, date of birth and other identifiers) and can be used in identity theft.

52 The incentives in the Anti-Trolling Bill would also significantly impact individuals’ ability to remain anonymous. There are many reasons why an individual may prefer to transact online anonymously or pseudonymously, including a preference not to be identified or to be ‘left alone’; to avoid subsequent contact (such as direct marketing) from an entity; to keep their whereabouts secret from others, including where they fear harm or harassment; to access services (such as counselling or health services) without this becoming known to others; or to express views in the public arena without fear of reprisal.

53 In its submission, Reset commented that 'anonymity is important for many different communities and user groups, including for [example] women, LGBTIQ+ and communities of colour'.[21]

54 The Privacy Act recognises that the right to privacy is not absolute, and privacy rights will necessarily give way where there is a compelling public interest reason to do so. Whether this is appropriate will depend on whether any privacy impacts are reasonable, necessary and proportionate to achieving a legitimate objective.

55 We understand a key objective of the Anti-Trolling Bill was to empower individuals who are the victim of reputational and emotional harm to make an informed decision when deciding whether to institute defamation proceedings and, if they decide to do so, to effect service (or seek an order for substituted service) in relation to legal proceedings.[22]

56 An important threshold issue that continues to warrant further consideration is whether the privacy impact on all Australian social media users resulting from the collection and verification of their contact details is a reasonable, necessary and proportionate response to achieving the objective of the Bill.

57 In particular, evidence of the number of individuals who are currently unable to pursue a claim for defamation due to an inability to identify anonymous users should be appropriately balanced against the privacy impacts that may be experienced by all Australian social media users.[23]

58 As recommended in our submissions, further consideration should be given to whether less privacy-invasive options could achieve the objective.

Regulatory coordination and collaboration

59 The OAIC has observed growing intersections between domestic frameworks relating to data and digital technologies, including privacy, competition and consumer law, and online safety and online content regulation. While there are synergies between these frameworks, there are also variances given each regulatory framework is designed to address different economic and societal issues.

60 Where different regulators exercise different functions under various laws it is important for regulators to work together to avoid any unnecessary or inadvertent overlap and uncertainty for consumers and industry. At the same time, we do not consider that regulatory overlap is necessarily a negative outcome, particularly where it is well managed. It is more problematic if regulatory gaps expose individuals to harm or lead to inconsistent and inefficient regulatory approaches.

61 An effective approach must address the importance of institutional coordination between different regulatory bodies in different areas, given the need for complementary expertise. Regulatory cooperation can range from informal actions, such as engaging with networks like the ACCC’s Scams Awareness Network, to more formal actions, such as collaboration on compliance activities.

62 To this end, the OAIC has entered into MOUs with other regulators including the ACCC, the ACMA, the Australian Digital Health Agency and the Inspector-General of Intelligence and Security. In addition, the OAIC is a member of DP-REG, together with the ACCC, the ACMA and the Office of the eSafety Commissioner. DP-REG is an initiative of Australian independent regulators to share information about, and collaborate on, cross-cutting issues and activities in the regulation of digital platforms. This includes considering how competition, consumer protection, privacy, online safety and data issues intersect, and provides members with an opportunity to promote proportionate, cohesive, well-designed and efficiently implemented digital platform regulation.

Conclusion

63 As is apparent from the Committee’s Issues Paper, the regulation of international digital platforms intersects with a range of areas. Within this, privacy regulation plays an important role in promoting transparent and secure information handling, with technology neutral principles that can adapt as technology evolves.

64 The Committee’s inquiry comes at a time of significant privacy reform. This is an important opportunity to raise the standard of data handling in line with community expectations and build trust and confidence in the digital environment.

65 Alongside regulatory reform, the OAIC continues to work with other regulators to make the digital economy a safe, trusted, fair, innovative and competitive space.

Footnotes

[1] Privacy Act 1988 (Cth) s 5B.

[2] Privacy Act 1988 (Cth) ss 6 (definition of ‘APP entity’), 6C and 6D.

[3] Lonergan Research, Australian Community Attitudes to Privacy Survey 2020, report to OAIC, September 2020, p 69.

[4] Attorney-General’s Department (AGD), Privacy Act Review – Report, AGD, 2022, p 116.

[5] N Witzleb, M Paterson, J Wilson-Otto, G Tolkin-Rosen and M Marks, Privacy risks and harms for children and other vulnerable groups in the online environment, report to OAIC, Monash University and elevenM Consulting, 2020, p 29.

[6] AGD, Privacy Act Review – Report, AGD, 2022, pp 152-153.

[7] AGD, Privacy Act Review – Report, AGD, 2022, pp 216-217.

[8] AGD, Privacy Act Review – Report, AGD, 2022, p 279.

[9] AGD, Privacy Act Review – Report, AGD, 2022, p 287.

[10] ALRC, Serious Invasions of Privacy in the Digital Era – Final Report, ALRC, June 2014, accessed 3 February 2023, p 9.

[11] ALRC, Serious Invasions of Privacy in the Digital Era – Final Report, ALRC, June 2014, accessed 3 February 2023, pp 9-10.

[12] OAIC, Guide to data analytics and the Australian Privacy Principles, OAIC website, March 2018, accessed 6 February 2023.

[13] Australian Human Rights Commission (AHRC), Human Rights and Technology Final Report, AHRC, 2021; AGD, Privacy Act Review – Discussion Paper, AGD, October 2021.

[14] AGD, Privacy Act Review – Report, AGD, 2022, p 191.

[15] AGD, Privacy Act Review – Report, AGD, 2022, p 193.

[16] See Algorithmic Justice and Online Platform Transparency Act, HR 3611, 117th Congress (2021); Filter Bubble Transparency Act, S 2024, 117th Congress (2021).

[17] AGD, Privacy Act Review – Report, AGD, 2022, pp 294-298.

[18] AGD, Privacy Act Review – Report, AGD, 2022, p 297.

[20] AGD, Social Media (Anti-Trolling) Bill 2021, AGD website, n.d., accessed 16 February 2023.

[22] Explanatory Memorandum, Social Media (Anti-Trolling) Bill 2022 (Cth), p 8.

[23] We note that the Department’s ‘Frequently Asked Questions’ indicate that around 1 in 7 (around 14.3%) Australians have been subjected to hate speech online, and that many report a negative impact as a result. In about 10% of cases, the negative impact is reputational damage, and this proportion appears to be higher amongst the LGBTIQ+, Indigenous, CALD and disability communities. See https://www.ag.gov.au/legal-system/social-media-anti-trolling-bill/frequently-asked-questions.