Part 3: Flexibility of the APPs in regulating and protecting privacy

6.       Is the framework of the Act effective in appropriately balancing flexibility to cater for a wide variety of entities, acts and practices, while ensuring sufficient clarity about protections and obligations?

3.1          While the definition of personal information sets out the scope of what is regulated by the Privacy Act, the APPs form the cornerstone of the privacy protection framework.[37] The APPs are legally binding principles, which provide entities with the flexibility to take a risk-based approach to compliance, based on their particular circumstances, including size, resources and business model, while ensuring the protection of individuals’ privacy.

3.2          The principles-based approach therefore enables the APPs to be scalable for entities of various sizes and capabilities across the economy and to be adapted to different acts and practices of those entities. The APPs are also technology neutral, applying equally to paper-based (offline) and digital environments. This allows for greater ‘future-proofing’, which is intended to preserve the relevance and applicability of the APPs, in a context of continually changing and emerging technologies.[38]

3.3          As outlined in the Objectives section of this submission, a key object of the Privacy Act is to recognise that the protection of the privacy of individuals is balanced with the interests of entities in carrying out their functions or activities.[39] The principles-based approach of the APPs sets overall objectives that must be met to enable APP entities to achieve this balance. By contrast, rules-based regulation is comparatively rigid. Detailed rules impose requirements that are not always appropriate for all entities regulated by the relevant scheme and, further, they do not always cover all of the entities that are intended to be regulated.[40]

3.4          The GDPR similarly enables a flexible approach to compliance based on key principles. Guidance from the United Kingdom Information Commissioner’s Office (UK ICO) states:

Every organisation is different and there is no one-size-fits-all answer. Data protection law doesn’t set many absolute rules. Instead it takes a risk-based approach, based on some key principles. This means it’s flexible and can be applied to a huge range of organisations and situations, and it doesn’t act as a barrier to doing new things in new ways.[41]

3.5          While the principles-based approach is the foundation of Australia’s privacy protection framework, the Privacy Act also contains mechanisms that allow the APPs to be supplemented by more specific rules in regulations or other legislative instruments in appropriate circumstances.

3.6          The OAIC considers that the principles-based, risk-based framework of the Privacy Act continues to be the most effective regulatory model for the protection of personal information in Australia. This approach is also consistent with other data protection laws around the world, including the GDPR, as outlined above.

3.7          However, as noted in the Objectives section, the OAIC considers that the review presents an opportunity to place greater emphasis in the Privacy Act on the rights of individuals and the obligations of entities to protect those rights. The recommended enhancements outlined in this section are designed to achieve this and to resolve ambiguities in the existing APPs.

3.8          The OAIC’s recommendations are also aimed at maintaining the flexibility and scalability of the existing principles-based privacy framework, while providing the Commissioner with enhanced abilities to make legally binding instruments to address areas that either require further certainty or specificity in the law, or that merit specific privacy protections.

Legislative flexibility to adapt the APPs

3.9          While the OAIC considers the principles-based approach to the APPs should be retained, we acknowledge that there may be areas that require further certainty or specificity in the law, or that merit specific privacy protections.

3.10       The Issues Paper outlines two existing mechanisms that may be used to prescribe specific requirements or treatments in relation to certain classes of entities, information, or acts and practices. Exempt entities (or classes of entities) or acts and practices can be brought within the regulatory remit of the APPs through delegated legislation, where there is a public interest in doing so.[42] Additionally, Part IIIB of the Act creates a framework for the development, registration and variation of codes of practice about information privacy, called APP codes.

3.11       The Issues Paper notes that the Commissioner may develop an APP code if the Commissioner considers that it is in the public interest to do so. However, this power can only be exercised if the Commissioner has requested a code developer to develop an APP code and the request has not been complied with, or the Commissioner has decided not to register the APP code that was developed as requested. The Commissioner may then develop and register an APP code only after these procedural steps have been followed.[43]

3.12       The factors that will be taken into account by the Commissioner in identifying an appropriate code developer include whether an entity, group of entities, or association or body:

  • has the capacity to develop a code, including whether it has the resources and expertise, and
  • is generally representative of the entities in the sector or industry to which the code will apply.[44]

3.13       In certain circumstances, it would be challenging to identify an appropriate entity or group of entities that meet the above criteria. For example, it may be necessary to develop a code to cover a particular activity that is being engaged in across a broad sector of the economy, such as the online sector, that is made up of a diverse range of entities. In these circumstances, it may be difficult to identify a code developer (or developers) that is generally representative of the entities that are intended to be captured. It may also be challenging to identify a developer (or developers) with adequate resources and expertise to develop an APP code that is intended to capture a wide range of entities of various sizes and with different personal-information handling practices.

3.14       Further, a situation may arise where an APP code needs to be developed as a matter of urgency. However, under the existing APP code provisions in the Privacy Act, there are prescribed minimum timeframes that must be complied with during the development of a code. For instance, the Commissioner must provide a code developer a minimum period of 120 days to develop the code.[45] If the Commissioner is required to develop an APP code in the circumstances described above, a further consultation period of at least 28 days must occur.[46]

3.15       The OAIC considers that, in urgent circumstances, it would be beneficial if the Commissioner had the ability to expeditiously issue a temporary APP code where there is a clear public interest in doing so. Importantly, a temporary APP code would be in force for a limited period of time. For example, the response to the pandemic has necessarily required regulated entities to quickly implement new or changed information-handling practices. In these circumstances, a temporary APP code issued quickly in response to changing circumstances could assist affected entities by providing greater clarity and certainty around their privacy obligations. The OAIC notes that this approach aligns with recent amendments to New Zealand’s privacy law, which enable the Privacy Commissioner to temporarily issue, amend or revoke a privacy code of practice in urgent circumstances where it is impracticable to follow the regular code-making procedures.[47]

3.16       Considering the above, the OAIC recommends that the existing APP code framework be amended to provide the Commissioner with greater flexibility and discretion to develop APP codes. Specifically, the framework should:

  • enable the Commissioner to develop an APP code in the first instance (i.e. without having to first request a code developer to develop an APP code), and
  • enable the Commissioner to issue a temporary APP code if it is urgently required and where it is in the public interest to do so, and
  • retain the existing power which enables the Commissioner to request that a code developer develop a code, and
  • enable the Commissioner to intervene at any point in the code development process where an APP code is being developed by a code developer if satisfied it would be preferable for the Commissioner to develop the code.

3.17       We consider the proposed amendments would provide the Commissioner with greater flexibility to develop codes, thereby ensuring that greater specificity can be given to the APPs where required, emerging privacy risks can be addressed, and additional clarity and certainty can be provided to regulated entities in exigent circumstances. Enabling the Commissioner to develop a code in the first instance would also address the challenges associated with identifying a code developer where a code is intended to apply to a wide range of entities and personal-information handling activities. The additional discretionary power for the Commissioner to intervene in circumstances where a code developer is developing a code will enable the OAIC to retain leadership over the code development process and ensure any APP code meets its intended objectives.

3.18       In addition, a general rule-making power would provide the Commissioner with the ability to provide the regulated community with further certainty in how to address certain privacy risks and concerns, by providing greater specificity and particularisation around the application of the APPs where necessary. This is similar to the model under the Consumer Data Right regime, which enables the ACCC to issue legally binding rules to provide greater detail around the application of the CDR regime.[48] Accordingly, the OAIC recommends that the code-making powers in the Privacy Act are supplemented by a general power for the Commissioner to issue legally binding rules about the application of the APPs.

3.19       The Commissioner may also currently make guidelines for the avoidance of acts or practices that may or might be interferences with the privacy of individuals, or which may otherwise have any adverse effects on the privacy of individuals.[49] The OAIC has developed APP guidelines, which set out the Commissioner’s interpretation of the APPs, including the matters that may be taken into account when exercising functions and powers relating to the APPs.

3.20       However, these guidelines are not legally binding nor is an entity required to have regard to them when considering how to comply with the Act. By contrast, under s 93A of the Freedom of Information Act 1982 (FOI Act) the Commissioner may, by instrument in writing, issue guidelines for the purposes of the Act, which an agency must have regard to when performing a function or exercising a power under the Act.[50]

3.21       The OAIC recommends providing further certainty for entities around their compliance obligations by elevating the status of the APP guidelines through a new provision that would require entities to have regard to any guidelines issued by the Commissioner when carrying out their functions and activities under the Privacy Act.

Recommendation 14 – Amend the APP code framework in Part IIIB of the Privacy Act to provide the Commissioner with greater flexibility and discretion to develop APP codes. The framework should:

  • enable the Commissioner to develop an APP code in the first instance (i.e. without having to first request a code developer to develop an APP code), and
  • enable the Commissioner to issue a temporary APP code if it is urgently required and where it is in the public interest to do so, and
  • retain the existing power which enables the Commissioner to request that a code developer develop a code, and
  • enable the Commissioner to intervene at any point in the code development process where an APP code is being developed by a code developer if satisfied it would be preferable for the Commissioner to develop the code.

Recommendation 15 – Supplement the code-making powers in Part IIIB of the Privacy Act with a general power for the Commissioner to issue legally binding rules about the application of the APPs.

Recommendation 16 – Include a new provision in the Privacy Act that would require entities to have regard to any guidelines issued by the Commissioner when carrying out their functions and activities under the Privacy Act.

Safeguards in the APPs to prevent the misuse of sensitive information

35. Does the Act adequately protect sensitive information? If not, what safeguards should be put in place to protect against the misuse of sensitive information?

3.22       In accordance with the OAIC’s APP guidelines, APP entities need to consider the sensitivity of information that they are handling when determining the reasonable steps that they should take to comply with many of the APPs.[51] The OAIC will also consider whether a matter involves sensitive information as a factor in determining whether to take regulatory action.[52] 

3.23       The OAIC’s recommendation 6, to clarify that inferred information is captured under the definition of personal information (which includes sensitive information), will ensure that existing Privacy Act protections apply to inferred sensitive information. These include notice requirements under APP 5 and the requirements to seek consent when collecting this information under APP 3.

3.24       The OAIC is also considering whether there are categories of information which are considered sensitive by the community that deserve additional protections. An important example of this is location information, which can be used to profile individuals and is difficult to make anonymous.[53] Location information is particularly intrusive in that, beyond showing where an individual has been, it can also reveal sensitive information about them such as information about their health or religious beliefs. The OAIC recommends, however, that this issue be dealt with by way of a full or partial prohibition, which is a stronger protection than making this data sensitive information (see Recommendation 40).

3.25        In addition, this submission recommends the introduction of additional protections against the misuse of sensitive information. These include:

  • Recommendation 31 – Strengthen notice and consent requirements in the Privacy Act to address the limitations in these mechanisms, but preserve the use of consent for high privacy risk situations, rather than routine personal information handling.
  • Recommendation 37 – Introduce fairness and reasonableness obligations into APPs 3 and 6.
  • Recommendation 50 – Introduce several amendments to the enforcement mechanisms under the Privacy Act to ensure that the Commissioner has the correct regulatory tools that provide a credible deterrent against privacy infringements.

Almost two-thirds (62%) of Australians are uncomfortable with digital platforms/online businesses tracking their location through their mobile or web browser. This is higher among females, with two-thirds (65%) feeling uncomfortable compared to males (59%), and highest among older Australians, with almost three-quarters (72%) feeling uncomfortable compared to only 55% of those aged 18-49 years.[54]

Since the outbreak of COVID-19, Australians’ concerns around location information and privacy risks have also increased. Location tracking has become the third biggest privacy risk, where it was previously ranked fifth.[55]

Strengthening the APPs

APP 3 – additional obligations for collection from third parties

3.26       APP 3.6 requires personal information about an individual to be collected from that individual unless it is unreasonable or impracticable to do so.

3.27       However, as the Issues Paper observes, it does not place any express obligations on APP entities that rely on this exception to consider the circumstances of the initial collection and ensure that it was in compliance with the requirements in APP 3.

3.28       Personal information is increasingly being shared between third parties (for example, data brokers or through the ad-tech ecosystem) or scraped from social media. As individuals are increasingly excluded from these activities, they are not afforded the opportunity to refuse to participate in the collection (for example, if the individual would not have given up the information if asked directly), and there is an increased likelihood that the information collected will not be accurate, up-to-date, complete and relevant. In these circumstances, there is a need to encourage entities to give greater focus to the context of the original collection and to take reasonable steps to satisfy themselves that the information was collected in accordance with the requirements of the APPs.

The OAIC has identified instances where the collection of personal information from third parties or public sources has resulted in detriment for individuals in circumstances where it may be readily apparent, or the OAIC later identifies, that this information was collected by unfair or unlawful means:

Collection from third parties – An APP entity purchased personal information from a third party without making enquiries as to how the information had been initially collected. The information was later identified to have been initially collected by unfair or unlawful means.

Collection from public sources – An APP entity collects personal information, such as photographs of individuals, from a public internet website where it is reasonably apparent that this information was collected for publication by unfair or unlawful means. This information can then be used to target individuals.

Collection from public sources – An APP entity collects personal information from the dark web or from a public internet website where it appears reasonably apparent that this information was likely published by the perpetrator of a data breach.

3.29       Even with the proposed fairness and reasonableness requirements in Part 6 below, the law would benefit from greater clarity as to the extent these requirements would extend to the original collection in circumstances where personal information has not been collected directly from an individual.

3.30       The OAIC recommends that the Privacy Act be amended to introduce a due diligence requirement that, where personal information was not collected directly from an individual, an APP entity must take reasonable steps to satisfy itself that the information was originally collected in accordance with APP 3.

3.31       This provision would create additional safeguards for individuals where an APP entity collects personal information from a third party. This could be achieved by making a minor addition to APP 3.6:

(c)  If the APP entity does not collect the information from the individual, it must take reasonable steps to satisfy itself that the information was originally collected from the individual in accordance with this principle.

3.32       This obligation would support the OAIC’s recommendation 37 to require APP entities to only collect, use or disclose information fairly and reasonably.

3.33       The Commissioner’s guidance could then set out examples of reasonable steps that an APP entity can take to satisfy itself that the information was originally collected from individuals in accordance with APP 3. These could include making reasonable enquiries regarding the collecting entity’s notice and consent procedures or seeking contractual warranties that the information was collected in accordance with the APPs.

Recommendation 17 – Amend APP 3.6 to require an APP entity to take reasonable steps to satisfy itself that personal information that was not collected directly from an individual was originally collected in accordance with APP 3.

APP 7 – Direct marketing

37. Does the Act strike the right balance between the use of personal information in relation to direct marketing? If not, how could protections for individuals be improved?

3.34       APP 7 sets out specific requirements where personal information is used for direct marketing purposes. Direct marketing involves the use and/or disclosure of personal information to communicate directly with an individual to promote goods and services.[56]

3.35       The explanatory memorandum to the Privacy Amendment (Enhancing Privacy Protection) Bill 2012 notes that direct marketing is addressed separately within a discrete principle rather than as a kind of secondary purpose under APP 6 because of the significant community interest about the use and disclosure of personal information for the purposes of direct marketing.[57]

3.36       However, it should be noted that APP 7 only applies to certain methods of direct marketing. APP 7.8 states that the principle does not apply to the extent that the Interactive Gambling Act 2001, the Do Not Call Register Act 2006 or the Spam Act 2003 applies. In other words, APP 7 will only apply to direct marketing communications that are not covered by these Acts. This means, in practice, APP 7 will generally only apply to:

  • direct marketing calls or faxes where the number is not listed on the Do Not Call Register, or the call is made by a registered charity
  • direct marketing by mail (whether sent by post or hand delivered) and door-to-door direct marketing
  • targeted marketing online (including on websites and mobile apps), but only if personal information is used or disclosed to target that marketing.

3.37       While the OAIC acknowledges the policy objective behind APP 7, the privacy risks associated with direct marketing have changed significantly since 2012. Further, the OAIC considers that the current approach, which means entities must comply with different obligations for different channels, creates regulatory fragmentation and confusion. The Australian Communications and Media Authority (ACMA) has also called for broader reform of the regulatory framework for unsolicited communications.[58]

3.38       The protections contained in APP 7 apply to the use and disclosure of personal information for direct marketing. As noted in Part 2, new developments in the way that data is handled make it increasingly difficult to draw a bright line between personal and non-personal information in the online environment. APP 7 is also expected to apply to increasingly complex methods of targeted marketing involving multiple parties in the online environment using cookies and other online identifiers. This enables the individual user of a device to be targeted to receive a particular ad, offered personalised content or recommendations, sent political messaging, or subjected to automated decisions such as differential pricing.

3.39       The application of APP 7 in the circumstances described above is not clear, nor is it clear which entity or entities in the ad tech ecosystem are responsible for compliance with APP 7 requirements.

3.40       Accordingly, the OAIC considers that APP 7 is no longer fit for purpose and recommends that APP 7 should be repealed. If APP 7 was repealed, the use and disclosure of personal information for direct marketing purposes and related activities would then be subject to the existing requirements contained in APP 6. That is, an entity could only use or disclose personal information for direct marketing with consent, or if one of the exceptions in APP 6.2 applies. This approach would be enhanced by the OAIC’s recommendation 37 to introduce a requirement for APP entities to use and disclose personal information ‘fairly and reasonably’.

3.41       In addition, to ensure that individuals have the ability to request not to receive direct marketing as currently required by APP 7, the OAIC recommends that the right to object (discussed below) include an absolute right for individuals to object to the use and disclosure of their personal information for direct marketing purposes. That is, an entity would not be able to rely on any of the proposed exceptions to the right to object to continue to use and disclose an individual’s personal information for direct marketing. This is consistent with the approach taken under the GDPR, which equips data subjects with an absolute right to stop their data being processed for direct marketing purposes. To ensure the existing protections of APP 7 are preserved, the right to object should also include the ability for individuals to request an organisation to identify the source of the personal information that it uses or discloses for direct marketing. An entity should be required to notify the individual of its source, unless this is unreasonable or impracticable.

Recommendation 18 – Repeal APP 7 and rely on the existing use and disclosure requirements in APP 6 for direct marketing activities.

Recommendation 19 – Ensure that the proposed new right to object includes:

  • an absolute right for individuals to object to the use and disclosure of their personal information for direct marketing purposes, and
  • the ability for individuals to request an organisation to identify the source of the personal information it uses or discloses for direct marketing, with the organisation required to notify the individual of that source unless this is unreasonable or impracticable.

APP 11 – Security of personal information

43. Are the security requirements under the Act reasonable and appropriate to protect the personal information of individuals?

3.42     APP 11.1 requires APP entities to take reasonable steps to protect the personal information that they hold from misuse, interference and loss, as well as unauthorised access, modification or disclosure.

3.43       The security obligations in APP 11 encompass personal information held both online and offline and require entities to protect personal information against a wide range of threats, including cyber security threats, human error, theft, inadvertent loss of physical or electronic information, or information being improperly used or accessed by employees or external third parties.

3.44       The principles-based framing of APP 11 enables entities to scale their responsibilities proportionally to the volume and type of personal information that they hold. Where the volume or sensitivity of personal information held by an entity increases, so too will the expectations placed upon the entity to protect that information. In particular, there is an expectation that in complying with APP 11, entities will actively monitor their risk environment for emerging threats and take reasonable steps to protect personal information by mitigating those risks.

3.45       Similarly, when considering what steps are reasonable under the Privacy Act, APP 11 requires entities to take account of the broader security environment in which they operate and apply any security obligations imposed under other frameworks.

3.46       Under APP 11, entities must also take steps beyond technical security measures in order to protect and ensure the integrity of personal information throughout the information lifecycle, including implementing strategies in relation to governance, internal practices, processes and systems, and dealing with third party providers.

3.47       The framing of APP 11 acknowledges that security threats, and the responsibilities of entities to respond to those threats, are not static and require consideration of obligations that go beyond those set out in the Privacy Act. Requiring entities to take ‘reasonable steps’ to secure personal information can help to ensure that the security standards applied are commensurate with the risk of the data handling activity. A café that needs to secure hard copy contact details presents a different risk profile to a GP office or a multinational corporation.

3.48       The OAIC therefore considers that it is important to retain the principles-based approach in APP 11, to ensure that entities are able to apply their obligations flexibly to respond to emerging threats, new and broad obligations, and the specific risk environment that they operate in.

3.49       The need for flexibility in responding to security threats is noted in the Government’s 2020 Cyber Security Strategy, which says, in relation to a proposed new regulatory framework for critical infrastructure and systems of national significance:

One size does not fit all. The framework will balance objectives for cyber, physical, personnel and supply chain protections across all sectors, while recognising sector-specific differences. This is why the framework will be built around principles-based outcomes, underpinned by guidance and advice proportionate to the risks and circumstances in each sector.

3.50       As acknowledged in the ALRC Report, the principles-based approach of the APPs ‘does not foreclose the possibility of technology specific regulation or legislative instruments in certain circumstances’.

3.51       The OAIC supports opportunities to enhance the current privacy framework through the introduction of additional and specific measures in relation to information security. The OAIC considers that Recommendations 14 and 15, to enhance the Commissioner’s code-making powers and introduce a new general power for the Commissioner to issue legally binding rules, are the most appropriate way of achieving this outcome.

3.52       These powers could be used by the Commissioner to further enhance requirements that prevent information loss attributable to specific threats (such as cyber intrusion), in specific industries, or in relation to specific technologies. For example, as per Recommendation 15, the Commissioner could use a new legally binding rule-making power to develop rules for a specific industry to provide greater clarity around the ‘reasonable steps’ that it should take to meet its compliance obligations under APP 11.1. This could be modelled on the approach under Article 32 of the GDPR, which sets out specific measures to ensure a level of security appropriate to the risk, including (as appropriate): the pseudonymisation and encryption of personal data; the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services; and a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures.

3.53       There is already precedent for this approach, with existing sector-specific frameworks used to add specificity and clarity in relation to certain parts of the economy. For instance, there are particular security requirements relating to credit information and credit eligibility information,[59] and tax file number information.[60] There are specific personal information security requirements relating to My Health Records and retained data under the Telecommunications (Interception and Access) Act 1979 (Cth).[61] Further, there are specific information security requirements relating to ‘CDR data’ under the Consumer Data Right system. These may differ across sectors and classes of persons.[62]

3.54       Proposals to introduce a certification framework under the Privacy Act would also support entities to meet their APP 11 security obligations. This proposal is considered further in Part 7 below. For example, an accreditation scheme has been created under the CDR model, which provides a safe mechanism for individuals and businesses to direct data holders to share their data with accredited third parties.

Recommendation 20 – Introduce enhanced code-making powers and new powers for the Commissioner to issue legally binding rules to enable the Commissioner to make sector- or threat-specific legislative instruments that support the principles-based approach in APP 11.1.

44. Should there be greater requirements placed on entities to destroy or de-identify personal information that they hold?           

3.55       APP 11.2 requires entities to take such steps as are reasonable in the circumstances to destroy or de-identify personal information when it is no longer needed for any purpose for which it may be used or disclosed under the APPs (and if the information is not contained in a Commonwealth record or legally required to be retained by the entity).

3.56       Destroying and de-identifying personal information that is no longer needed is an important strategy to help mitigate security risks. For example, holding large amounts of personal information for longer than is needed may increase the risk of unauthorised access by staff or contractors. ‘Honey pots’ containing vast amounts of valuable data may increase the risk that an entity’s information systems may be hacked.[63]

3.57       The principles-based framing of APP 11.2 enables entities to scale and tailor their approach to destruction and de-identification based on their circumstances. Similar to the considerations outlined above for APP 11.1, more rigorous steps may be required by an entity to destroy or de-identify personal information based on the amount or sensitivity of its personal information holdings. 

3.58       The OAIC considers that the principles-based approach to APP 11.2 should be retained, rather than prescribing greater requirements for entities to destroy or de-identify personal information. Additional requirements may not be applicable or appropriate in all circumstances given entities hold personal information for a variety of different purposes. This necessarily means there is no ‘one size fits all approach’, as personal information may need to be retained longer by some entities than others.

3.59       As outlined at Recommendations 14 and 15, the proposed enhancements to the Commissioner’s code-making powers and the introduction of a new power to make legally-binding rules would enable the Commissioner the flexibility to place greater requirements around the destruction or de-identification of personal information where appropriate. For example, rules could be issued that set standards around the destruction or de-identification of personal information for a particular industry.

Recommendation 21 – Introduce enhanced code-making powers and new powers to make legally-binding rules under the Privacy Act to enable the Commissioner to set requirements or standards for destruction and de-identification by legislative instrument where appropriate.

Individual rights under the APPs

3.60       Individual privacy rights are an important component of privacy self-management and, more broadly, facilitate the enjoyment of other rights and freedoms including freedom of association, thought and expression, as well as freedom from discrimination.[64] As noted above, effective rights for individuals to exercise control over their personal information are also crucial to building confidence in the privacy framework; a necessary precondition for the trust and confidence in entities handling personal information that enables them to carry out their activities.

3.61       To this end, the existing rights of access (APP 12) and correction (APP 13) are well-established rights in Australia’s privacy framework. The proposed additional right of erasure and right to object are intended to complement the existing rights under the Act and are also tied to the broader obligations under the APPs. For example, entities already have an existing obligation under APP 11.2 to destroy or de-identify personal information when it is no longer necessary for any purpose for which it may be used or disclosed under the APPs. Accordingly, entities should already have practices, procedures and systems in place to give effect to this requirement. The right to erasure enables individuals to initiate this process on request.

3.62       This section makes recommendations to enhance the existing rights in the APPs and introduce new rights to further support privacy self-management.

Existing rights – access and correction 

45. Should amendments be made to the Act to enhance: a. transparency to individuals about what personal information is being collected and used by entities? b. the ability for personal information to be kept up to date or corrected?     

3.63       APP 12.1 provides that if an APP entity (including an agency) holds personal information about an individual, the entity must, on request by the individual, give the individual access to the information.

3.64       APP 13.1 provides that an APP entity must take reasonable steps to correct personal information it holds, to ensure it is accurate, up-to-date, complete, relevant and not misleading, having regard to the purpose for which it is held. The requirement to take reasonable steps applies in two circumstances:

  • where an APP entity is satisfied, independently of any request, that personal information it holds is incorrect, or
  • where an individual requests an APP entity to correct their personal information.

3.65       APPs 12 and 13 operate alongside, and do not replace, other informal and legal processes by which an individual can be provided access to personal information, or by which they may seek correction of their personal information. For example, the FOI Act provides a complementary procedure that gives individuals a legally enforceable right of access to documents (under Part III) and the right to request correction or update (Part V) of their personal information in agency records or the official documents of a minister. It is also open to individuals to seek access or correction of their personal information informally through administrative processes.

3.66       The intended benefits of providing individuals with an alternative means of accessing personal information held by Australian Government agencies under APP 12 include enabling agencies to process requests for personal information promptly and at the lowest reasonable cost, by allowing agencies to focus on personal information rather than documents (to which exemptions under the FOI Act might otherwise apply). The Privacy Act also provides the ability to seek remedies where access to personal information is denied in breach of the Act. In contrast, a denial of access under the FOI Act permits an application for internal and external review of the administrative decision.

3.67       As outlined above, an individual may request that an entity corrects personal information that it ‘holds’ about them. An entity ‘holds’ personal information if ‘the entity has possession or control of a record that contains the personal information’.[65] Accordingly, entities are not required to correct personal information that they do not have possession or control of (for example, personal information that has been published on social media).

3.68       The OAIC recommends that the right to request correction of personal information should extend to require further steps to be taken in relation to personal information that is no longer ‘held’ by the entity, such as publicly available information that has been posted online. In addition to the proposed right of erasure discussed further below, this will provide individuals with greater control over their personal information and mitigate the risk of harm that may arise from incorrect information being widely available online.

Recommendation 22 – Extend the right to request correction of personal information in APP 13 to personal information that is no longer ‘held’ by the entity.

New rights

46. Should a ‘right to erasure’ be introduced into the Act? If so, what should be the key features of such a right? What would be the financial impact on entities?            

47. What considerations are necessary to achieve greater consumer control through a ‘right to erasure’ without negatively impacting other public interests?               

Right to erasure

3.69       The OAIC supports the introduction into the Act of a right enabling individuals to request the erasure of their personal information, subject to some exceptions.

3.70       As recommended in the DPI report, under a right to erasure, APP entities would be required to comply with a request to erase personal information without undue delay, unless there is an overriding reason for the information to be retained.[66]

3.71       This would bring the Australian privacy framework into line with other international jurisdictions including the United Kingdom and European Union. It would also be consistent with the direction taken by other domestic legislative frameworks, such as the Consumer Data Right and My Health Records systems, which allow individuals to request the deletion of their data in certain circumstances.[67]

3.72       The OAIC’s 2020 ACAPS results demonstrated that there is also community support for the introduction of a right to erasure. The survey found that 84% of respondents would like to have increased rights around certain issues such as asking businesses to delete information, while 64% of respondents want the right to ask a government agency to delete their personal information.

3.73       The potential for implementation challenges and regulatory impact would need to be carefully considered when determining the most appropriate scope for a new right to erasure to ensure the protection of individuals’ privacy alongside the interests of entities in carrying out their functions and activities. These challenges have been highlighted in the GDPR context, for example, in relation to the requirement that an entity that is obliged to erase personal data must inform other organisations of the erasure of personal data where that data has been disclosed to others or has been made public in an online environment.[68] Although this is qualified by a reasonable steps test, and an entity may take into account implementation cost in discharging this obligation, we understand this requirement may pose technical, cost and other resource challenges for entities.[69]

3.74       The OAIC considers that a right to erasure would complement APP 11.2, which requires APP entities to destroy or de-identify personal information that they no longer need. The processes and procedures that APP entities have in place to meet this obligation would ease the burden of complying with a new right of erasure.

3.75       The OAIC recommends that the right to erasure should apply to personal information that is no longer ‘held’ by an entity (for example, publicly available information on social networks). This would extend the application of this right to the online environment. However, the obligation to erase personal information, and to notify others of an erasure request where the entity has made the personal information public, should be subject to appropriate exceptions.

3.76       The OAIC recommends that the right to erasure should include the following key features, as a minimum:

  • The exceptions recommended in the DPI report (‘unless the retention of information is necessary for the performance of a contract to which the consumer is a party, is required under law, or is otherwise necessary for an overriding public interest reason’).[70]
  • An exception for ‘frivolous or vexatious’ requests, consistent with APP 12,[71] or a similar threshold, for example ‘manifestly unfounded or excessive’ requests, consistent with the GDPR.[72]
  • An exception for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes where the right to erasure is likely to render impossible or seriously impair the achievement of the objectives of that processing, consistent with the GDPR.[73]
  • An appropriate timeframe within which APP entities must respond to erasure requests, for example consistent with APP 12[74] or the GDPR.[75]

3.77       In addition, the OAIC recommends that any right to erasure be complemented by:

  • A requirement for APP entities to notify individuals of their ability to request the erasure of their personal information. This could be modelled on similar requirements in Article 13 of the GDPR.
  • A right for individuals to object to the handling of their personal information for specific purposes. Under a right to object, an individual may stop or prevent certain types of data processing without requiring the erasure of their personal information, which is important where individuals wish to continue using a service. This right should apply prospectively, not retrospectively.

Right to object

3.78       While the ‘right to object’ is not explored in the Issues Paper, the OAIC considers that it is an important reform that supports the ability of individuals to have control over their personal information.

3.79       Under a right to object, individuals would be able to object to the handling of their personal information for certain purposes at any time. This would allow individuals to stop APP entities from handling their personal information, unless an exception applies.

3.80       Individuals are provided with a right to object under Article 21 of the GDPR. This includes an absolute right to object to processing for direct marketing purposes, including profiling to the extent that it is related to such direct marketing.[76] Upon receiving an objection to processing for direct marketing, the organisation must cease processing of the data for that purpose from that time onwards.[77] It also includes limited rights to object to processing on several other grounds, subject to exceptions.[78]

3.81       As outlined above, a right to object would also complement a right to erasure by allowing an individual to stop certain types of data processing without requiring the erasure of their personal information, which is important where individuals wish to continue using a service. The OAIC’s 2020 ACAPS results demonstrated that there is community support in this regard: 77% of respondents would like the right to object to certain data practices while still being able to access and use the service.

3.82       A right to object should also be considered as part of the OAIC’s Recommendation 18 for the review to consider whether APP 7 (direct marketing) is still fit for purpose. As outlined in the ‘APP 7 – Direct marketing’ section of this submission, APP 7 may not be sufficient to mitigate the privacy harms posed by increasingly complex methods of targeted marketing in the current online environment, which may include profiling. A right to object, if formulated on similar terms to the GDPR right, would achieve the same outcome as APP 7 but go further, by not only allowing an individual to in effect ‘opt out’ of receiving direct marketing communications, but also requiring the APP entity to stop handling the personal information for that purpose.

3.83       The OAIC therefore recommends that a right to object be introduced into the Privacy Act, modelled on Article 21 of the GDPR as a starting point. In a similar vein to the GDPR right, the OAIC considers it appropriate for individuals to have an absolute right to object in relation to direct marketing, but a limited right to object in relation to processing on other grounds.

Recommendation 23 – Introduce a right to erasure that includes, as a minimum:

  • the exceptions recommended in the DPI report
  • an exception for ‘frivolous or vexatious’ requests, consistent with APP 12, or a similar threshold, for example ‘manifestly unfounded or excessive’ requests, consistent with the GDPR
  • an appropriate timeframe within which APP entities must respond to erasure requests, for example consistent with APP 12 or the GDPR, and
  • application to personal information that is no longer ‘held’ by an entity, together with a requirement to notify others of the erasure request where personal information has been made public, subject to the exceptions outlined in the first point above.

Recommendation 24 – Introduce a requirement for APP entities to notify individuals of their ability to request the erasure of their personal information. This could be modelled on similar requirements in Article 13 of the GDPR.

Recommendation 25 – Introduce a right to object that includes:

  • an absolute right to object in relation to direct marketing
  • a limited right to object in relation to processing on other grounds.

Recommendation 26 – Introduce a requirement for APP entities to notify individuals of their ability to object to the handling of their personal information, including the absolute right for individuals to object to the use and disclosure of their personal information for direct marketing.

Emergency Declarations

41. Is an emergency declaration appropriately framed to facilitate the sharing of information in response to an emergency or disaster and protect the privacy of individuals?

3.84       APP entities are ordinarily required to comply with the APPs when handling personal information. When an emergency declaration is in force, Part VIA allows agencies and organisations to collect, use and disclose personal information about an individual impacted by an emergency for several purposes that may not otherwise be permitted under the APPs. Agencies and organisations will still need to comply with other obligations under the Privacy Act, including notice and information security requirements.

3.85       As noted in the Issues Paper, the explanatory memorandum to the Bill that introduced the emergency declaration provisions states:

Part VIA of the Privacy Act was introduced to provide a clear and certain legal basis for the collection, use and disclosure of personal information about deceased, injured and missing Australians in an emergency or disaster situation in Australia or overseas… [The provisions] place beyond doubt the capacity of the Australian Government and others to lawfully exchange personal information in an emergency or disaster situation.[79]

3.86       As these provisions override the ordinary purposes for which personal information may be collected, used or disclosed under the APPs, the OAIC considers that it is appropriate that Part VIA is only relied upon in limited circumstances. The OAIC notes that in many cases, exceptions to APPs 3 and 6 would be sufficient to enable APP entities to collect, use or disclose personal information in emergency situations.[80]

3.87       The OAIC is not making recommendations about changes to the emergency declaration provisions at this stage. The OAIC will consider relevant information submitted by stakeholders as part of the Issues Paper consultation before making final recommendations on this issue.

3.88       However, the OAIC suggests that the Privacy Act review ensure that any amendments to these provisions are aligned across Government. In particular, the OAIC notes that the Government has committed to creating a new law concerning the declaration of natural disasters[81] as a result of the recommendations made by the Royal Commission into National Natural Disaster Arrangements. The relevant recommendations include that:

  • Australian, State and Territory governments should restructure and reinvigorate ministerial forums with a view to enabling timely and informed strategic decision-making in respect of the response to, and recovery from, natural disasters of national scale or consequence[82]
  • Australian, state and territory governments should ensure that personal information of individuals affected by a natural disaster is able to be appropriately shared between all levels of government, agencies, insurers, charities and organisations delivering recovery services, taking account of all necessary safeguards to ensure the sharing is only for recovery purposes.[83]

3.89       Part VIA allows Commonwealth agencies to share information with State and Territory government bodies. However, information handled by State and Territory government bodies during an emergency must be handled in accordance with the applicable laws in that jurisdiction.

3.90       These recommendations by the Royal Commission will promote an object of the Privacy Act to provide the basis for the nationally consistent regulation of privacy and the handling of personal information. The OAIC’s Recommendation 3 suggests that the Council of Attorneys-General could establish a working group to consider amendments to State and Territory privacy laws to ensure alignment with the Privacy Act. These recommendations could be considered in this forum.


[37] Explanatory Memorandum, Privacy Amendment (Enhancing Privacy Protection) Bill 2012, 52.

[38] OAIC (July 2019) Australian Privacy Principles guidelines [online document], OAIC, accessed 26 November 2020.

[39] Privacy Act 1988 (Cth) s 2A.

[40] ALRC (2008) For Your Information: Australian Privacy Law and Practice (ALRC Report 108), ALRC, Australian Government, accessed 26 November 2020.

[41] UK ICO (n.d.) Guide to Data Protection [online document], accessed 26 November 2020.

[42] Privacy Act 1988 (Cth), Div 1, Pt II.

[43] Explanatory Memorandum, Privacy Amendment (Enhancing Privacy Protection) Bill 2012, 4.

[44] OAIC (September 2013) Guidelines for developing codes [online document], OAIC, accessed 26 November 2020.

[45] Privacy Act 1988 (Cth), s 26E(4)(a).

[46] Privacy Act 1988 (Cth), s 26G(3)(b).

[47] Privacy Act 2020 (NZ), s 34. See also Privacy Act 1988 (Cth), Div 2, Pt VI which sets out the process for making temporary public interest determinations.

[48] The OAIC notes that, in October 2020, draft legislation to amend Part IVD of the Competition and Consumer Act 2010, which will reallocate rulemaking functions for the CDR system to the Treasury, was released for public consultation. At the time of writing this submission, the consultation had concluded but the draft legislation had not been introduced to Parliament.

[49] Privacy Act 1988 (Cth), s 28(1)(a).

[50] Section 93A(3) of the FOI Act provides that guidelines are not legislative instruments.

[51] OAIC (July 2019) Australian Privacy Principles guidelines [online document], OAIC, accessed 26 November 2020.

[52] OAIC (May 2018) Privacy regulatory action policy [online document], OAIC, accessed 26 November 2020.

[53] Anna Johnston (12 November 2020) ‘Location, location, location: online or offline, privacy matters’, Salinger Privacy blog, accessed 26 November 2020.

[54] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 79.

[55] OAIC (2020) Australian Community Attitudes to Privacy Survey 2020, report prepared by Lonergan Research, p. 105.

[56] OAIC (July 2019) Australian Privacy Principles guidelines [online document], OAIC, accessed 26 November 2020.

[57] Explanatory Memorandum, Privacy Amendment (Enhancing Privacy Protection) Bill 2012, 81.

[59] Privacy Act 1988 (Cth), Part IIIA.

[60] Privacy (Tax File Number) Rule 2015, r 11.

[61] My Health Records Rules 2016, r 44; Telecommunications (Interception and Access) Act 1979 (Cth), s 187LA.

[62] Under s 56EO(1) of the Competition and Consumer Act 2010 (Cth) entities must take the steps specified in consumer data rules to protect CDR data. The consumer data rules may prescribe different information security requirements for different sectors and classes of persons in the CDR system (for example, see s 56BB(a) of the Competition and Consumer Act 2010).

[63] OAIC (March 2018) Guide to data analytics and the Australian Privacy Principles [online document], OAIC, accessed 26 November 2020.

[64] OAIC (n.d.) What is privacy?, OAIC website, accessed 26 November 2020.

[65] Privacy Act 1988 (Cth), s 6(1).

[66] Issues paper, p. [55], citing Australian Competition and Consumer Commission, Digital Platforms Inquiry (Final Report, June 2019) 470.

[67] Section 56BAA of the Competition and Consumer Act 2010 (Cth), which provides that the Consumer Data Right Rules must include a requirement on an accredited data recipient to delete all or part of the CDR data where requested by a consumer; and s 17(3) of the My Health Records Act 2012 (Cth), which requires the destruction of records containing health information in a My Health Record upon request by the individual.

[68] Article 17(2) and Recital 66 of the GDPR.

[70] Australian Competition and Consumer Commission (ACCC), Digital Platforms Inquiry (Final Report, June 2019) p. 470.

[71] Under APP 12.3(c), entities are not required to give individuals access to personal information to the extent that the request is frivolous or vexatious.

[72] Article 12(5) of the GDPR provides that a controller may refuse to act on an erasure request from a data subject where the request is ‘manifestly unfounded or excessive, in particular because of [its] repetitive character’.

[73] For this exception to apply under Article 17(3)(d) of the GDPR, such processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes must be done in accordance with Article 89(1), which requires such processing to be subject to appropriate safeguards.

[74] Under APP 12.4, an APP entity must respond to access requests within: 30 days (for an agency); or a ‘reasonable period after the request is made’ (for an organisation).

[75] Recital 59 of the GDPR provides that ‘[t]he controller should be obliged to respond to requests from the data subject without undue delay and at the latest within one month’. See also Article 17(1).

[76] Article 21(2) of the GDPR.

[77] Article 21(3) of the GDPR.

[78] There is a limited right to object to processing that is: necessary for the performance of a task carried out in the public interest or in the exercise of official authority (including profiling); necessary for the purposes of legitimate interests of the controller or a third party (including profiling). These limited rights do not apply where the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims: Article 21(1) of the GDPR.

There is a further limited right to object to processing for scientific or historical research purposes or statistical purposes, except where processing is necessary for the performance of a task carried out for reasons of public interest: Article 21(6) of the GDPR.

[79] Explanatory memorandum to the Privacy Legislation Amendment (Emergencies and Disasters) Bill 2006.

[80] The APPs allow for the collection, use or disclosure of personal information where a permitted general situation under s 16A or a permitted health situation under s 16B exists. For example, s 16A allows for the collection, use and disclosure of personal information where it is unreasonable or impracticable to obtain the individual’s consent, and the APP entity reasonably believes that the collection, use or disclosure is necessary to lessen or prevent a serious threat to the life, health or safety of an individual, or to public health and safety.

[81] Henderson A & Hitch G (13 November 2020) ‘Federal Government responds to bushfire royal commission, will create national state of emergency’ ABC News, accessed 16 November 2020.

[82] Royal Commission into National Natural Disaster Arrangements (2020) The Royal Commission into National Natural Disaster Arrangements Report, Australian Government, Recommendation 3.1.

[83] Royal Commission into National Natural Disaster Arrangements (2020) The Royal Commission into National Natural Disaster Arrangements Report, Australian Government, Recommendation 22.2.