11 December 2020

40. Should there be some acts or practices that are prohibited regardless of consent?

42. Should reforms be considered to restrict uses and disclosures of personal information? If so, how should any reforms be balanced to ensure that they do not have an undue impact on the legitimate uses of personal information by entities?

6.1 As discussed in the Objectives section of this submission, this review represents an opportunity to recognise that strong data protection and privacy rights are necessary both to prevent erosion of our human right to dignity in the digital age and as a precondition for consumer confidence, economic growth and the achievement of other societal objectives such as the protection of health, safety and security. A greater emphasis on the rights of individuals and the obligations of entities to protect those rights is necessary to ensure the public interest is served by privacy law into the next decade. This includes creating a more central focus on protecting individuals from harms associated with the collection, use or disclosure of their personal information.

6.2 As noted in Part 3 above, to remain fit for purpose into the next decade, it is essential that the Privacy Act contains flexible protections that are able to evolve and respond as technologies shift, while creating legal obligations that can address current and evolving privacy risks and harms. In some circumstances, this principles-based framework will need to be supplemented with codes, legally-binding rules and Commissioner-issued guidelines to address high-risk activities or sectors.[124]

6.3 To achieve these goals, the OAIC recommends a number of measures aimed at achieving greater fairness and more accountability in the personal information handling activities of APP entities:

  • introducing specific obligations around the fair and reasonable handling of personal information
  • restraining broad collections of personal information
  • prohibiting certain information handling through no-go zones
  • introducing an independent third-party certification scheme.

6.4 These measures will assist in protecting individuals’ privacy rights, thereby helping APP entities to build trust and confidence in their personal information handling practices. They will not impose undue compliance burdens on APP entities that are already committed to good privacy practices. These measures are discussed in more detail below.

Introducing fairness and reasonableness standards for the collection, use and disclosure of personal information

6.5 Fairness and reasonableness have always been key concepts underlying the protections in the Privacy Act. An objective of the legislation is to ensure the fair handling of personal information.[125] Similarly, a reasonableness standard is used throughout the APPs and is a widely understood legal threshold.[126] The fairness concept is given limited recognition in APP 3.5, which requires personal information to be collected only by lawful and fair means. However, the OAIC’s regulatory experience indicates that this protection does not go far enough.

6.6 Because it applies only to the means of collection, the requirement that personal information may only be collected by lawful and fair means may not prevent other inappropriate practices. While a collection may not reach an unfair ‘means’ threshold, such as collection through deception, there are collection practices that are nonetheless unfair in that they adversely affect rights and interests. Examples may include the targeted collection of personal information from children or vulnerable people. In addition, determining whether conduct is unlawful may be a complex task for the OAIC without the assistance of a decision of a court or finding of a relevant decision-making body.

6.7 Crucially, this protection does not apply to the uses and disclosures of personal information.[127]

6.8 The practical implication of this limited protection is that the APPs will not prevent all unfair or unreasonable collections, uses or disclosures of personal information, even where these practices do not meet community expectations and may cause harms to individuals. Accordingly, the OAIC considers that a new approach should be taken to replace APP 3.5 with expanded obligations that have a greater focus on protecting individuals.

Examples of unfair or unreasonable conduct that may be permitted under the APPs

The consequence of the current formulation of the APPs is that acts or practices may be permissible under the APPs even if they are unfair or unreasonable. For example:

  • APP entities can breach APP 3 when collecting personal information but comply with APP 6 when using or disclosing this information: An APP entity providing a mobile phone application collects personal information from mobile phones for the purpose of on-selling it to advertisers. While this may breach APP 3, it may not clearly breach APP 6, as the personal information is being disclosed for the primary purpose for which it was collected. An example of this was a flashlight mobile application which secretly collected personal information, including location information, and shared it with advertisers.
  • APP entities may broadly define and obscure their purposes for disclosure: An APP entity may define its primary purpose for collection to include profiling of individuals. This conduct may result in uses or disclosures that go far beyond the individual’s expectations and may not be in the individual’s best interests. This has included information being used for political or research purposes without the individual’s knowledge.
  • APP entities can make privacy controls difficult to locate: Although an APP entity’s website may provide notifications about its privacy practices, the controls to opt out of information sharing activities may be difficult to locate. While this practice may be unfair or unreasonable and may cause harms to consumers, it may not be prohibited under APP 3 (for example, where the type of information collected is not ‘sensitive’ information and therefore does not require consent for collection).
  • APP entities can directly or indirectly use personal information to show vulnerable individuals inappropriate content: An APP entity can use personal information to target individuals, including vulnerable people, with inappropriate or adult content such as gambling advertisements. This may cause financial or other harms to individuals.
  • APP entities can define their purposes for handling information to include unfair or unreasonable purposes: An APP entity can define its purpose to include publishing personal information, including addresses and photos, to facilitate targeting of individuals. Publication for this purpose will likely comply with APP 6 (use and disclosure) even though it may lead to harms such as stalking or targeting of individuals at their homes.

6.9 The OAIC recommends introducing fairness and reasonableness obligations into APP 3 and APP 6 as follows:

APP 3 - The collection of personal information by an APP entity under Australian Privacy Principle 3 must be fair and reasonable in the circumstances, even if an individual consents to the collection.

and

APP 6 - The use or disclosure of personal information by an APP entity under Australian Privacy Principle 6 must be fair and reasonable in the circumstances, even if an individual consents to the use or disclosure.

6.10 This would create a proactive requirement for APP entities that is aimed at preventing unfair and unreasonable activities that may result in harms to individuals. These obligations should be overarching requirements that qualify other requirements in the APPs, including whether an individual has consented to the act or practice. Although consent would likely have constrained unfair or unreasonable personal information handling in the past, this protection has been eroded in the online environment.[128]

6.11 The proposed amendments would supplement the proposed Online Platforms code by enabling it to target the unfair or unreasonable behaviours of entities covered by the code more effectively. They will also capture entities across the economy that may be operating online but are not covered by the Online Platforms code.

6.12 The OAIC recommends that the legislation set out a non-exhaustive list of factors that the Commissioner will consider when determining whether acts or practices are fair and reasonable in the circumstances. The OAIC considers it important that these factors be defined to promote regulatory certainty and guide interpretation of these terms in a privacy context. These considerations could be drawn from similar requirements in other jurisdictions and could, for example, include:

  • Whether the primary purpose or reasonably anticipated secondary purposes for which the personal information was collected, used or disclosed will have unjustified adverse impacts on any individual.[129]
  • Whether the primary purpose or reasonably anticipated secondary purposes for which the personal information was collected, used or disclosed are reasonable, necessary and proportionate.[130]
  • Whether the collection, use or disclosure of personal information will intrude to an unreasonable extent on the personal affairs of any individual.[131]
  • Whether the collection, use or disclosure of personal information is within the reasonable expectations of the individual to whom the information relates.[132]

6.13 We also recommend that APP 1 be amended to require APP entities to take such steps as are reasonable in the circumstances to implement practices, procedures and systems which will mitigate the risk of unfair and unreasonable information handling practices as a result of the entity’s handling of personal information.[133]

6.14 These provisions will be flexible and will apply to APP entities depending on the particular facts and circumstances.[134] They will also only have significant impacts on the small number of APP entities whose acts or practices do not meet these standards of fairness and reasonableness.

6.15 While the OAIC considers a fairness and reasonableness obligation to be the preferable approach, an alternative model that could be considered is imposing a statutory duty of care on APP entities to protect individuals from harms stemming from the use of their personal information.

Recommendation 37 Introduce fairness and reasonableness obligations into APPs 3 and 6:

APP 3 - The collection of personal information by an APP entity under Australian Privacy Principle 3 must be fair and reasonable in the circumstances, even if an individual consents to the collection. 
and
APP 6 - The use or disclosure of personal information by an APP entity under Australian Privacy Principle 6 must be fair and reasonable in the circumstances, even if an individual consents to the use or disclosure.

Recommendation 38 Introduce a non-exhaustive list of factors that the Commissioner will consider when determining whether acts or practices are fair and reasonable.

Recommendation 39 Amend APP 1 to require APP entities to take such steps as are reasonable in the circumstances to implement practices, procedures and systems which will mitigate the risk of unfair and unreasonable information handling practices as a result of the entity’s handling of personal information.

Interactions with the consumer protection regime

6.16 As noted in the Objectives section of this submission, a foundation in human rights is a key reason why privacy protections in Australia and internationally exist as a separate but complementary legal framework to other Australian laws that protect the rights of individuals. The Australian Consumer Law (ACL) is a key example of a separate but complementary legal regime to the Privacy Act.

6.17 Introducing obligations of fairness and reasonableness in the Privacy Act would strengthen and complement existing ACL protections, as well as the restrictions on unfair trading practices proposed in Recommendation 21 of the ACCC’s DPI final report.

6.18 Fairness and reasonableness are important concepts in both the Privacy Act and the ACL, and the OAIC anticipates that APP entities will be able to be guided by existing precedents on these principles.

6.19 However, the proposed non-exhaustive list of factors that the Commissioner will consider when determining whether acts or practices are fair and reasonable draws on foundational privacy concepts. In effect, this means that these obligations will respond to unfair or unreasonable practices through a data protection lens which seeks to uphold the right to privacy where personal information is used by business, health practitioners and government. This is separate from the objectives of unfair consumer law protections, which seek to safeguard consumers’ ability to make free and informed choices that further their own interests.[135]

6.20 Similarly, the Privacy Act applies to personal information wherever it flows, meaning that it will provide important protections in instances where consumer protections may not apply:

  • While some personal information handling activities are pursuant to a contract, this will not always be the case. In the OAIC’s view, privacy policies and information collection statements, in and of themselves, are not intended to constitute binding legal contracts. This limits the application of the unfair contract provisions under the ACL.
  • In an information economy, personal information is increasingly being shared between third parties. In this business-to-business context, these third parties may not have any direct relationship with the data subject. This may limit the application of several consumer protections.
  • While fairness and reasonableness incorporate transparency, conduct may be unfair or unreasonable, even if an APP entity is transparent. This suggests that prohibitions on misleading, deceptive or false conduct will not always apply in a personal information handling context.

6.21 The OAIC considers that the best result for Australians and the regulated community is for the privacy framework to apply to all issues of personal information handling. Introducing fairness and reasonableness obligations will be an important factor in allowing the OAIC to approach personal information handling issues holistically.

6.22 To the extent that consumer law and privacy law operate concurrently, the OAIC and the ACCC will continue to work together on issues that fall under both regimes, building on the memorandum of understanding on exchanges of information between these two agencies.[136]

Restraining broad collections of personal information

6.23 The digital age has seen the rise of business models built around monetising the collection, use and disclosure of personal information. This has incentivised increasingly extensive collections of personal information by a wide range of private entities, led by the major online platforms, data brokers and the adtech industry. This increase in the collection of personal information has naturally resulted in increased privacy risk.

6.24 Information collection is addressed in APP 3 of the Privacy Act. This principle limits collection to what is reasonably necessary (or directly related for Australian Government agencies) for one or more of the entity’s functions or activities. Many privacy frameworks globally contain substantially similar data minimisation principles.[137]

6.25 For more traditional businesses, where the handling of personal information is incidental to their functions or activities, APP 3 is an effective constraint on information collection.

6.26 However, where an entity’s functions or activities focus on the collection, use and disclosure of personal information, APP 3 will have a more limited effect. This is particularly because the Privacy Act permits private organisations to define their own functions or activities and provides only a limited mechanism for this to be challenged.

6.27 The OAIC’s Recommendation 37 to introduce fairness and reasonableness obligations on APP entities may help prevent some inappropriate information collection practices. However, it is unlikely to address the root cause of this issue, which is whether these data-driven business models and the associated expansive information collection activities themselves meet Australian community expectations.

81% consider it a misuse for an organisation to ask for information that does not seem relevant to the purpose of the transaction, up 7% since 2017.[138]

6.28 The Privacy Act provides the Commissioner with little ability to challenge the legitimacy of an APP entity’s stated business model. In any event, it would be difficult for the Commissioner to make this assessment, as the relevant considerations go beyond privacy issues. Many of these data-driven companies are highly innovative and provide benefits for society, even if they carry potentially substantial privacy risks.

6.29 To address increasingly expansive personal information activities, the review provides an opportunity for the Government to consider whether some of these practices, and the associated data-driven business models, remain appropriate for Australia. This assessment should have regard to the need to protect individuals, the legitimate interests of private industry and the public interest in privacy.

6.30 If there is a public interest in specifically regulating these practices or business models, this could be implemented through full or partial prohibitions in the Privacy Act, as per the OAIC’s Recommendation 40. In some instances, however, these activities and business models may be more appropriately regulated through other legislative frameworks.

Prohibiting certain information handling: No-go zones

6.31 Some types of information handling practices simply do not meet the expectations of the Australian community.

6.32 The OAIC considers that the worst of these practices should be prohibited, even if an entity purports to have obtained consent to the collection, use or disclosure. APP entities engaging in other high-risk activities should be subject to additional organisational accountability obligations that require them to ‘proceed with caution’ to ensure that individuals are protected from harms arising from those practices.

6.33 To this end, the OAIC recommends the creation of no-go zones and ‘proceed with caution’ zones under the general privacy framework. These full and partial prohibitions could be introduced directly into the Privacy Act or be implemented through codes, legally-binding rules and Commissioner-issued guidelines on fairness and reasonableness requirements.[139]

6.34 Prohibitions on information handling activities are common features of other Australian privacy-related laws. The specialist regimes under the My Health Records Act 2012,[140] the credit reporting provisions in Part IIIA of the Privacy Act[141] and the CDR scheme[142] all contain restrictions of this kind.

6.35 The Privacy Act review process could be an effective forum to consult with the community on the types of acts or practices that Australians think should be prohibited or where additional restrictions are warranted.

6.36 Based on the OAIC’s regulatory experience, the following types of acts or practices should be considered for full or partial prohibitions:

  • Profiling, tracking or behavioural monitoring of, or directing targeted advertising at, children. The majority of parents consider that children should have the right to grow up without being profiled and targeted (84% agree, 59% strongly agree).[143]
  • An APP entity undertaking inappropriate surveillance or monitoring of an individual through the audio or video functionality of the individual’s mobile phone or other personal devices. This prohibition would have to be carefully drafted to ensure it does not place blanket prohibitions on personal devices such as smart speakers. The majority of Australians (83%) feel it would be a misuse of personal information for their personal devices to listen to their conversations and share data with other organisations without their knowledge.[144]
  • The scraping of personal information from online platforms. Online platforms should also be required to proactively take reasonable steps to prevent scraping and the risks flowing from this conduct. The community considers the social media industry the most untrustworthy in how it protects or uses personal information (70% consider this industry untrustworthy).[145]
  • The collection, use and disclosure of location information about individuals. Location information can be used to profile individuals and is difficult to anonymise. The community often considers this information particularly invasive where its collection, use or disclosure is not reasonably necessary for the operation of the relevant service or product or is not reasonably expected by the user. Around 72% of older Australians were uncomfortable with digital platforms/online businesses tracking their location through their mobile or web browser.[146]
  • Certain uses of AI technology to make decisions about individuals. This is discussed in more detail below.

6.37 For ‘proceed with caution’ zones, we recommend the Privacy Act review consider whether to specifically define these zones or whether it is more appropriate to adopt a risk-based assessment. The latter option has the benefit of ensuring that emerging high-risk activities can still be identified as requiring additional care.

Recommendation 40 Introduce full or partial prohibitions of specified information handling activities into the general privacy framework. These could apply to the following practices:

  • profiling, tracking or behavioural monitoring of, or directing targeted advertising at, children
  • inappropriate surveillance or monitoring of an individual through audio or video functionality of the individual’s mobile phone or other personal devices
  • scraping of personal information from online platforms
  • handling location information about individuals, and
  • certain uses of AI technology to make decisions about individuals.

Restrictions on use or disclosure in relation to the use of artificial intelligence

6.38 Artificial Intelligence (AI) technologies are increasingly being used by private and public entities. This has the potential to generate significant opportunities and efficiencies for business, government and the community. However, the use of these technologies also creates risks, including to privacy.

6.39 Privacy frameworks around the world have highlighted the processing of personal information using AI tools as an area where enhanced privacy protections are warranted. This will allow the benefits of AI technologies to be realised while managing the risks.

6.40 The Privacy Act applies to AI technologies that use personal information. The current APPs address some of the risks posed by this technology, for example, through requirements to have practices, procedures and systems in place to ensure compliance with the APPs, notice and consent requirements, and obligations to take reasonable steps to ensure the accuracy and quality of personal information.

6.41 The enhancements to the current Privacy Act framework recommended by the OAIC in this submission will therefore also be relevant to information processing by way of AI. These include recommendations around notice and consent requirements,[147] mandating ‘privacy by design’ and ‘privacy by default’,[148] providing individuals with new rights, such as a right to object to information handling and to erasure of personal information,[149] and promoting provable accountability, including through a certification scheme.[150]

6.42 Given that AI is a modern technology that is being deployed by APP entities to handle personal information in increasingly innovative ways, the OAIC also considers that introducing obligations requiring the fair and reasonable handling of information will ensure that the Privacy Act is able to apply flexibly to address this evolving risk.[151] The OAIC’s Recommendation 40 to introduce full or partial prohibitions in relation to the profiling, tracking or behavioural monitoring of, or directing targeted advertising at, children will also be relevant.[152]

Overwhelmingly, Australians are seeking more rights in relation to the use of AI technologies. The OAIC’s 2020 ACAPS results found that:

  • 84% of Australians think that individuals should have a right to know if a decision affecting them is made using AI technology.
  • 78% of Australians believe that when AI technology is used to make or assist in making decisions, people should be told what factors and personal information are considered by the algorithm and how these factors are weighted.

6.43 The OAIC recommends introducing additional rights that apply specifically to the processing of personal information by AI technologies. These could apply as a partial prohibition or ‘proceed with caution’ zone as discussed in paragraphs 6.31-6.37 above.

6.44 Several jurisdictions have legislated or are considering introducing a specific protection in relation to automated decision-making. These protections are often modelled on Article 22 of the GDPR.[153] While this could be an appropriate starting point, the review should closely consider whether all aspects of this clause are appropriate in an Australian context.

6.45 These AI-specific rights must be drafted with care to ensure that the interests of individuals are appropriately protected while allowing APP entities to deploy this technology. The OAIC suggests the Privacy Act review have regard to the experiences of other international jurisdictions, as well as the other Commonwealth projects and reviews in respect of AI that are currently underway.[154]

6.46 These domestic and international reviews have highlighted areas where Article 22 of the GDPR could be potentially improved.

6.47 For example, there has been uncertainty around the use of the word ‘solely’ in this provision,[155] and the Office of the Privacy Commissioner of Canada has recently recommended an AI-specific right that does not include this term, or similar terms such as ‘exclusively’, due to concerns they would narrowly circumscribe this protection.[156]

6.48 The AHRC’s Human Rights and Technology discussion paper has recently considered this issue and proposed the definition ‘AI-informed decision-making’. This refers to decisions that have a legal or similarly significant effect and in which AI has materially assisted in the decision-making process.[157]

6.49 The OAIC is generally supportive of this definition as a positive adaptation of the threshold proposed in Article 22 of the GDPR. However, the OAIC also notes that there has been some uncertainty around the meaning of ‘similarly significant effects’ in the GDPR.[158] Some draft privacy legislation in the United States has sought to provide additional clarification for this term, proposing a non-exhaustive list of significant effects which includes, but is not limited to, denial of consequential services or support, such as financial and lending services, housing, insurance, education enrolment, criminal justice, employment opportunities and health care services.[159] This additional clarification could provide a useful model in the Australian context.

6.50 The OAIC also notes that this threshold is not likely to capture many instances of targeting of online content, including advertisements, job postings, media articles or political content. In order to apply to algorithms that use personal information in those contexts, a broader definition will be necessary.

6.51 AI-specific protections for individuals must also be supported by appropriate transparency measures to require APP entities to provide further information and an explanation of AI-informed decision-making.[160]

6.52 This information should be sufficiently meaningful to enable an individual to understand the nature of the decision being made about them. It may also include the types of personal information involved and the weighting of this information.[161]

6.53 A requirement for entities to provide more technical information as part of their notification obligations could provide a basis for individuals (with expert assistance where required) to contest decisions. However, consideration will need to be given to how such an obligation will address issues of commercial confidence. For example, the Privacy Act currently provides an exception to entities where an individual requests access to their personal information, and its provision would reveal evaluative information generated within the entity in connection with a commercially sensitive decision‑making process.

6.54 In considering appropriate notice obligations in relation to AI-informed decision-making, the OAIC suggests that the review draw on the work done by other regulators including the UK ICO and the Office of the Victorian Information Commissioner.[162]

Recommendation 41 Introduce additional rights that apply specifically to the processing of personal information by AI technologies.

Footnotes

[124] See the OAIC’s Recommendations 14, 15 and 16 to enhance the existing code-making powers in Part IIIC of the Privacy Act and introduce new powers to make legally-binding rules and a requirement for entities to take Commissioner-issued guidance into account when undertaking their functions or activities under the Privacy Act.

[125] See for example page 16 of the explanatory memorandum to the Privacy Amendment (Private Sector) Bill 2000, which extended the operation of the Privacy Act to the private sector. This document highlighted that an objective of the legislation was to develop a scheme for the fair handling of consumers’ personal information.

[126] A reasonableness standard is used regularly in relation to the collection, use and disclosure of personal information (see for example APP 3, APP 6.2 and ss 16A and 16B), in the eligible data breach scheme in Part IIIC and in relation to the security of personal information (APP 11).

[127] This builds on recommendation 17(c) in the Australian Competition and Consumer Commission’s Digital Platforms Inquiry, p. 478.

[128] A similar requirement in s 5(3) of the Canadian Personal Information Protection and Electronic Documents Act 2000 has been found by the Canadian Federal Court to be an overarching requirement that is superimposed on an organisation’s other obligations (see Office of the Privacy Commissioner of Canada (2018) Guidance on inappropriate data practices: Interpretation and application of subsection 5(3) [online document], Office of the Privacy Commissioner of Canada website, accessed 6 November 2020).

[129] Guidance from the UK Information Commissioner’s Office (Principle (a): Lawfulness, fairness and transparency [online document], UK ICO website, accessed 12 November 2020) states that entities must not use information in ways that will have unjustified adverse impacts on individuals. This requires entities to consider not just how they can use personal data, but whether they should use it in these ways. Similarly, under s 18 of the Personal Data Protection Act 2012 (Singapore), a purpose that is harmful to an individual concerned is unlikely to be considered appropriate by a reasonable person (see Personal Data Protection Commission Singapore (2013) Advisory Guidelines on Key Concepts in the Personal Data Protection Act, PDPC, Singaporean Government, accessed 26 November 2020, p. 58).

[130] While privacy is a human right, it is not an absolute right. As set out in the Parliamentary Joint Committee on Human Rights (2015) Guide to Human Rights, Australian Government, reasonableness, necessity and proportionality are key concepts when determining whether limitations on non-absolute human rights are justifiable (see also Attorney-General’s Department, Permissible limitations: Public sector guidance sheet [online document], Attorney-General’s Department website, accessed 12 November 2020). Accordingly, the OAIC generally recommends that Government agencies ensure that their collection, use or disclosure is reasonable, necessary and proportionate to achieve a legitimate policy aim when designing legislation that may infringe on privacy rights (see for example OAIC (2019) Data Sharing and Release legislative reforms discussion paper — submission to Prime Minister and Cabinet [online document], OAIC website, accessed 7 November 2020). These principles have also been recognised by European regulators as fundamental to the interpretation of the GDPR. Principles of necessity and proportionality have been recognised as part of the principles for processing under Article 5 by the European Data Protection Board (see European Data Protection Board (2020) Guidelines 08/2020 on the targeting of social media users, EDPB). These concepts are also important elements of the balancing test required by the UK ICO to determine whether processing fits within the legitimate interests basis for processing (UK ICO, Legitimate interests [online document], UK ICO website, accessed 7 November 2020). These principles have also been found to be important elements in interpreting a similar right in Canada’s privacy legislation (see Office of the Privacy Commissioner of Canada (2018) Guidance on inappropriate data practices: Interpretation and application of subsection 5(3) [online document], Office of the Privacy Commissioner of Canada website, accessed 7 November 2020, which draws on several judgments setting out the key considerations when evaluating an organisation’s purpose under s 5(3) of the Personal Information Protection and Electronic Documents Act 2000 (PIPEDA)).

[131] This consideration is modelled on Information Privacy Principle 4 of the New Zealand Privacy Act 2020. See also the Office of the Privacy Commissioner of Canada’s interpretation of Wansink v. Telus Communications Inc. 2007 FCA 21, in which the Canadian Federal Court of Appeal held that evaluating whether a purpose is an appropriate purpose under s 5 of the PIPEDA requires an assessment of whether there are less invasive means of achieving the same ends at comparable cost and with comparable benefits.

[132] The UK ICO states that fairness obligations under the GDPR require entities not to use information in ways that an individual would not reasonably expect (UK Information Commissioner’s Office, Principle (a): Lawfulness, fairness and transparency [online document], UK ICO website, accessed 12 November 2020).

[133] A similar requirement to consider the risks of harms to individuals is contained in Article 24 of the General Data Protection Regulation. This requires controllers and processors to implement appropriate technical and organisational measures to ensure, and to be able to demonstrate, that processing is performed in accordance with the Regulation, having regard to risks of varying likelihood and severity for the rights and freedoms of natural persons.

[134] Taylor M and Paterson J (in press) ‘Protecting privacy in India: The role of consent and fairness in data protection’, Indian Journal of Law and Technology, p. 12.

[135] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 139.

[136] OAIC (August 2020) MOU with ACCC — exchange of information [online document], OAIC website, accessed 20 November 2020.

[137] See for example GDPR, Article 5(1)(c); Privacy Act 2020 (New Zealand), s. 22, Information Privacy Principle 1; Personal Data (Privacy) Ordinance (Cap. 486) (Hong Kong), Schedule 1, Principle 1; Personal Information Protection and Electronic Documents Act, SC 2000 (Canada), Fair Information Principle 4.

[138] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 7.

[139] See the OAIC’s Recommendations 14, 15 and 16. The no-go zones in the Canadian framework are implemented through guidance in Office of the Privacy Commissioner of Canada (2018) Guidance on inappropriate data practices: Interpretation and application of subsection 5(3) [online document], Office of the Privacy Commissioner of Canada website, accessed 12 November 2020.

[140] My Health Records Act 2012 (Cth), s 70A, which defines prohibited purposes for the use of My Health Records, including underwriting a contract of insurance for a healthcare recipient, determining whether a contract of insurance covers a healthcare recipient and determining the employment of a healthcare recipient.

[141] See for example s 20E, which prohibits all uses of credit reporting information except in certain circumstances, such as where the disclosure is a permitted CRB disclosure under s 20F. Similarly, certain types of entities are prohibited from being an access seeker under Part IIIA (see s 6L(2)).

[142] Competition and Consumer (Consumer Data Right) Rules 2020 (Cth), rr. 4.12(3) and 7.5(2), which prohibit an accredited person from selling CDR data unless it is de-identified in accordance with the rules, or from using CDR data to identify, compile insights about or build a profile about a person who is not the consumer, unless this is required to provide the consumer with the requested goods or services and the consumer has consented.

[143] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 90.

[144] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 36.

[145] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 55.

[146] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 79.

[147] See Recommendations 32 and 34.

[148] See Recommendation 42.

[149] See Recommendations 23 and 25.

[150] See Recommendations 42, 43, 44 and 45.

[151] See Recommendation 37.

[152] See discussion of full and partial prohibitions of certain information handling activities at paragraphs 6.31-6.37.

[153] Article 22 of the GDPR states:

1.   The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

2.   Paragraph 1 shall not apply if the decision:

(a)    is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b)    is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or

(c)    is based on the data subject's explicit consent.

3.   In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

4.   Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.

A similar right is contained at s 71 of South Africa’s Protection of Personal Information Act. A right modelled on the GDPR is also being considered in the United States of America, where draft legislation has looked to clarify this language from the GDPR. See for example Consumer Rights to Personal Data Processing Bill SF 2912 (Minnesota) (proposed legislation); New York Privacy Bill SB 5642 (New York) (proposed legislation); Protecting Consumer Data Bill SB 5376 – 2019-20 (Washington State) (proposed legislation which is confined to profiling based on facial recognition).

[154] For example, the Department of Industry has published an AI Ethics Framework and is currently developing an AI Action Plan. The Australian Human Rights Commission is undertaking a Human Rights and Technology project which has focused on AI and human rights.

[155] See Castan Centre for Human Rights Law, Faculty of Law, Monash University (2020), Submission to the Human Rights Commission Discussion Paper on Human Rights and Technology, submission to the AHRC, pp. 1-2.

[156] Office of the Privacy Commissioner of Canada (2018) A Regulatory Framework for AI: Recommendations for PIPEDA Reform [online document], Office of the Privacy Commissioner of Canada website, accessed 1 December 2020.

[157] See AHRC (December 2019) Human Rights and Technology: Discussion Paper, AHRC, Australian Government, pp. 61-71.

[158] See Castan Centre for Human Rights Law, Faculty of Law, Monash University (2020), Submission to the Human Rights Commission Discussion Paper on Human Rights and Technology, submission to the AHRC, pp. 4-5.

[159] Consumer Rights to Personal Data Processing Bill SF 2912 (Minnesota); New York Privacy Bill SB 5642 (New York); Protecting Consumer Data Bill SB 5376 – 2019-20 (Washington State).

[160] Such a right could be modelled on Article 13(2)(f) of the GDPR.

[161] The UK ICO states that similar information should be provided under Article 13 of the GDPR. See UK ICO (n.d.) What else do we need to consider if Article 22 applies? [online document], ICO website, accessed 16 November 2020. See also the discussion of a meaningful explanation in Office of the Privacy Commissioner of Canada (2018) A Regulatory Framework for AI: Recommendations for PIPEDA Reform [online document], Office of the Privacy Commissioner of Canada website, accessed 1 December 2020.

[162] For example, UK ICO and the Turing Institute (n.d.) Explaining decisions made with AI [online document], ICO website, accessed 16 November 2020, as well as Office of the Victorian Information Commissioner (2019) Closer to the Machine: AI e-book, OVIC, Victorian Government.