16 August 2017

Our reference: D2017/006362

The Black Economy Taskforce Secretariat
The Treasury
Langton Crescent
Parkes ACT 2600

By email: blackeconomy@treasury.gov.au

Dear Secretariat

Black Economy Taskforce Interim Report and Consultation Paper

I welcome the opportunity to provide comments on the Black Economy Taskforce Interim Report (the interim report) and associated Black Economy Taskforce Consultation Paper (consultation paper). I also thank the Chairman, Mr Michael Andrew AO, and the Black Economy Taskforce (the Taskforce) for their early engagement with the Australian Information Commissioner in developing the interim report and consultation paper.

The interim report outlines the Taskforce’s initial findings on the scope, drivers and risks underpinning black economy activities, and sets out recommendations and policy options to address these. The consultation paper outlines a number of additional policy ideas drawing on the Taskforce’s broad consultation process.

I acknowledge the important policy objectives of the Taskforce as reflected in these consultation materials, which include developing an innovative, forward-looking whole-of-government policy response to combat the black economy in Australia.

A number of the measures outlined in the interim report and consultation paper raise privacy issues. At a high level, I have outlined some of the privacy risks that may arise in relation to the recommendations in the interim report and consultation paper, for your consideration. I have also provided information regarding the role of privacy impact assessments (PIAs), which are an important tool that can assist in identifying privacy impacts and mitigation strategies.

Given these potential privacy issues, I would appreciate the opportunity for the Office of the Australian Information Commissioner (OAIC) to be engaged further as these policy proposals are developed.

Role of the Office of the Australian Information Commissioner

The OAIC is an independent Commonwealth statutory agency, established by the Australian Parliament to bring together three functions:

  • privacy functions (protecting the privacy of individuals under the Privacy Act 1988 (Privacy Act) and other Acts)
  • freedom of information functions (access to information held by the Commonwealth Government in accordance with the Freedom of Information Act 1982)
  • information management functions (as set out in the Information Commissioner Act 2010).

The integration of these interrelated functions into one agency has made the OAIC well placed to strike an appropriate balance between promoting the right to privacy and broader information policy goals.

In the digital age, more information is being collected than ever before. While technology is allowing organisations to use and analyse data in innovative ways, often to great social and economic benefit, privacy must be integral to the equation. Establishing good privacy practices will help to engender public trust and give individuals choice and confidence that their privacy rights will be respected.

The Privacy Act contains 13 Australian Privacy Principles (APPs), which outline how Australian Government agencies, private sector organisations with an annual turnover of more than $3 million, all private health service providers and some small businesses (collectively, APP entities) must handle, use and manage personal information.

Sensitive information, which is defined in the Privacy Act,[1] is generally afforded a higher level of protection under the APPs than other personal information. Relevantly, sensitive information includes information about an individual’s criminal record, biometric templates and biometric information that is to be used for the purpose of automated biometric verification or biometric identification.

The APPs are designed to strike an appropriate balance between protecting the privacy of individuals and allowing appropriate uses and disclosures of personal information. As principles-based, technology-neutral law, the APPs provide flexibility for regulated entities to tailor their personal information handling practices to their particular needs. The APPs also contain a range of exceptions including, for example, where the individual consents to the use or disclosure of personal information,[2] or where use or disclosure of personal information is required or authorised by an Australian law or a court/tribunal order.[3]

The OAIC has additional, specific regulatory roles relevant to the Taskforce, including in relation to tax file numbers[4] and government data matching.[5]

Recommendations in the Interim Report that raise potential privacy risks

Individual identity and biometrics

The interim report states that the Taskforce intends to look more closely in the final report at ‘whether increased use of biometrics (subject to privacy protections) would assist to reduce black economy participation, along with a broader look at the issue of digital identities which some initial consultations have noted should be considered.’[6] The consultation paper also states that the Taskforce is interested in views on a national individual identity system, indicating that a ‘simple biometric marker is a possible solution’ to verifying individuals’ identities.[7] The paper also suggests that ‘[i]n the non-government arena, biometrics could be used on payment cards or non-biometric personal identifiers might be developed’.[8]

As noted above, biometric information that is to be used for the purpose of automated biometric verification or biometric identification is considered sensitive information under the Privacy Act. Sensitive information is generally afforded a higher level of protection under the APPs than other personal information. For example, APP entities may only collect sensitive information about an individual with consent, and for purposes that are reasonably necessary for, or directly related to, that entity’s functions, unless an exception applies (APP 3.3).[9]

In the OAIC’s experience, proposals that involve the handling of biometric information are likely to raise increased community concern, particularly with regard to ensuring that appropriate privacy safeguards are in place. For example, the OAIC’s Australian Community Attitudes to Privacy (ACAPS) Survey 2017 found that 58 percent of respondents are ‘somewhat’ or ‘very’ concerned about providing biometric information to access licensed venues (e.g. hotels), and that 56 percent are ‘somewhat’ or ‘very’ concerned about using biometric information for day-to-day banking.[10] Given this community concern, initiatives that involve biometrics would generally warrant conducting a privacy impact assessment at an early stage (discussed below).

More generally, to ensure the success and sustainability of any national individual identity system, public support will be essential. For example, nearly half of survey respondents in the OAIC’s ACAPS Survey 2017 reported that they were ‘somewhat’ or ‘very’ uncomfortable with government agencies sharing personal information with other agencies.[11] Building the social licence for such a system would require well-thought-out data-governance measures. Incorporating these measures at the design stage will give individuals confidence about how their personal information will be managed, and give system participants clear guidance about how they should handle personal information. This would also include building robust privacy safeguards into the design of any such system, to protect personal information used in the system from misuse. As privacy law, underpinned as it is by principles of transparency and accountability, would be an essential component of any such system, the OAIC would appreciate the opportunity to engage further with the Taskforce if this initiative is progressed.

Data analytics and smart technologies

The interim report highlights the role that emerging analytic tools may have in combatting the black economy. In particular, the interim report states that ‘greater application of data analytics may offer significant new information to government allowing better targeting of services as well as anti-black economy efforts’.[12]

While data analytics has the potential to bring about enormous social and economic benefits, it may also have significant privacy impacts.[13] For example, using analytics to generate data about an individual’s behavioural habits may result in the collection of personal, and potentially sensitive, information. As you may be aware, APP 10 generally requires APP entities to take reasonable steps to ensure the quality of personal information they collect, use and disclose. The nature of predictive algorithms may raise risks as to whether the new personal information will be accurate, up-to-date and complete. Data analytics can also challenge how other privacy principles, such as notice and consent, work in practice. A range of potential privacy risks associated with data analytics are outlined in the OAIC’s draft Guide to big data and the Australian Privacy Principles.[14]

Data matching

The interim report notes that the Taskforce intends to look more closely in its final report at the value in developing a dedicated black economy strategy for data collection, use and analytics. This recommendation notes the existing uses of data matching by government agencies, and calls for the expansion and modernisation of these matching programs.[15]

As noted above, the OAIC has regulatory responsibility for data matching, including under the Data-matching Program (Assistance and Tax) Act 1990 (Cth) (where data matching involves tax file numbers) and under the Privacy Act and the OAIC’s voluntary Guidelines on Data Matching in Australian Government Administration (for data matching that does not involve tax file numbers). Data matching involves bringing together at least two data sets that contain personal information, and that come from different sources, and comparing those data sets with the intention of producing a match.[16] It presents possible risks to the privacy of individuals, as it involves the sharing of large amounts of personal information, may result in new personal information being generated, and may result in adverse administrative actions against individuals. As noted in the Guidelines on Data Matching in Australian Government Administration, the OAIC considers that it is best privacy practice to only carry out data matching where there is a clear business case, having regard to:

  • the financial and non-financial costs and benefits
  • whether the desired outcome could practicably be achieved by other means that pose less risk to an individual’s privacy.[17]

In addition, for many data matching program initiatives, a privacy impact assessment (see below) would be required as part of complying with the requirement in APP 1.2 to ‘take reasonable steps to implement practices, procedures and systems to ensure compliance with the APPs (and any applicable registered APP code) and to enable complaints about compliance’.[18]

Privacy impacts of proposals in the Consultation Paper

In addition to the recommendations in the interim report, the consultation paper sets out a range of possible policies and programs developed during the Taskforce’s consultations. While I note that these are not formal recommendations, many of these policies or programs may have significant privacy impacts, such as:

  • reforms to Know Your Client tests that may involve linking to birth records and photo IDs[19]
  • reforms to Australian Business Number (ABN) arrangements,[20] particularly where an ABN is held by a sole trader or personal information may otherwise be dealt with
  • reforms to tax privacy provisions[21]
  • creation of an offence for possession of encrypted mobile phones or apps, regardless of what use has been made of them[22]
  • whistle-blower arrangements[23]
  • ‘naming and shaming’ of tax evaders[24]
  • creation of a national criminal database to facilitate data sharing and enable access to information in real time[25]
  • changes to the payment system, including reduced use of cash.[26]

The OAIC would welcome further engagement with the Taskforce, should these initiatives be developed further.

Privacy impact assessments

Where a policy or program may have an impact on the privacy of individuals, I recommend that a privacy impact assessment (PIA) be conducted at the earliest policy design stage.[27] A PIA is a systematic assessment of a project that identifies the impact the project might have on the privacy of individuals, and sets out recommendations for managing, minimising or eliminating that impact. Conducting a PIA early allows privacy safeguards to be built into the preferred regulatory model.

In particular, proposals that involve the handling of sensitive information (such as criminal records or biometric information) are likely to raise increased community concern about whether appropriate privacy safeguards are in place, and would generally warrant conducting a PIA at an early stage.

I look forward to further engagement with the Taskforce on these matters. If your staff would like to discuss these comments or have any questions, please contact Sophie Higgins, Director, Regulation & Strategy on [contact details removed].

Yours sincerely

Per

[Signature of Andrew Solomon]

On behalf of

Angelene Falk
Acting Australian Information Commissioner


Footnotes

[1] Privacy Act 1988 (Cth), s 6(1).

[2] Australian Privacy Principle 6.1(a).

[3] Australian Privacy Principle 6.2(b).

[4] Privacy Act 1988 (Cth), s 17.

[5] Where data matching involves tax file numbers, see the Data-matching Program (Assistance and Tax) Act 1990 (Cth). Where data matching does not involve tax file numbers, see the voluntary Guidelines on Data Matching in Australian Government Administration, <https://www.oaic.gov.au/agencies-and-organisations/advisory-guidelines/data-matching-guidelines-2014>.

[6] Interim report, p 53.

[7] Consultation paper, p 1.

[8] Consultation paper, p 1.

[9] More information about the higher standards that apply to the collection, use and disclosure of sensitive information is available in the OAIC’s APP Guidelines.

[10] Australian Community Attitudes to Privacy Survey 2017, p 21.

[11] Australian Community Attitudes to Privacy Survey 2017, p 12.

[12] Interim report, p 41.

[13] Interim report, p 41.

[14] This draft resource is available on the OAIC website at <www.oaic.gov.au>.

[15] Interim report, p 54.

[16] This definition appears in the Guidelines on Data Matching in Australian Government Administration.

[17] See OAIC Guidelines on Data Matching in Australian Government Administration, available at <www.oaic.gov.au>.

[18] See also OAIC Consultation draft: APS Privacy Governance Code, available at <www.oaic.gov.au>.

[19] Consultation paper, pp 1–2.

[20] Consultation paper, p 2.

[21] Consultation paper, p 4.

[22] Consultation paper, p 4.

[23] Consultation paper, p 5.

[24] Consultation paper, pp 6–7.

[25] Consultation paper, p 7.

[26] Consultation paper, pp 8–9.

[27] For more information about PIAs, see the OAIC’s Guide to undertaking privacy impact assessments and our new Undertaking a Privacy Impact Assessment e-learning course available on the OAIC website <www.oaic.gov.au>.