Big data and privacy: a regulator's perspective

Speech by Timothy Pilgrim to the International Conference on Big Data from a Privacy Perspective, Hong Kong

Big data is big business. And it is no wonder – it is estimated that the amount of data globally is growing by fifty percent each year, with ninety percent of the world's data generated in the last two years. Much of this data is generated through our online interactions: each time we post or like something on Facebook, each photo and caption we share on Instagram, each time we make a purchase online, and each time we search via Google.

Our smartphones mean we are continuously connected to the online world through apps that align with our interests. And as the Internet of Things becomes more embedded in our lives, our digital touch points will increase at an incredible rate: wearable computing, smart cars, smart electricity, smart fridges, even smart toys for our children. And all this data we are creating is increasingly valuable and sought after.

Big data analytics has great power to amass, aggregate and analyse this data. As a result, big data has the potential to bring about enormous social and economic benefits. It helps organisations target products, deliver services, develop policies and personalise people's online experience. But, as we regulators tend to say, alongside all these opportunities the technology also has the potential to significantly impinge on individual privacy. And we say this because it is true.

Data protection authorities have already signalled an intention through the Mauritius Resolution on Big Data and the Declaration on the Internet of Things to closely monitor these developments. In doing so, we recognise the benefits that can come from them in terms of health management and disease prevention, emergency responses, crime prevention and even traffic management.

When there is a technological advance we are used to hearing 'this will be the end of privacy' or 'the benefits of the new technology outweigh privacy'. But such trade-offs are unnecessary: technological advances can occur with privacy built in. In responding to such claims, we regulators also tend to say 'this exciting new technology presents great opportunities, but it does present some risks to privacy, risks that can be appropriately managed'.

And the thinking behind our response is right: organisations need to commit to getting privacy right when using new technology. And this makes sense for big data because, as we keep hearing, big data is big business. Big data has become affordable to store and affordable to analyse. What we also need to hear is that getting privacy right is part of the big data business plan.

Recently, during Privacy Awareness Week, our Office released a Privacy Management Framework to assist organisations to get privacy governance right. In doing this we looked at the work done by our colleagues in Hong Kong, New Zealand and Canada, to name a few, because privacy regulators globally recognise our role in helping organisations meet their privacy obligations.

Using the Framework, organisations will commit to privacy from the top down, embedding a privacy culture. They will establish robust and effective privacy processes, incorporating existing tools such as privacy by design and privacy impact assessments. This will enable them to actively identify and mitigate privacy risks through regular, ongoing monitoring and review.

Privacy governance should take place as part of the regular governance processes of organisations. Privacy is not an activity to be undertaken in isolation.

We need to counter the 'run hard, apologise later' attitude that we understand some big data marketing companies have. And we all need to learn from big data blunders.

One of the most well-known big data marketing blunders occurred when Target sent pregnancy-related advertising to a teenage mother-to-be. As we know, this was how her father found out she was pregnant, and a family storm ensued. It is a cautionary tale of what can go wrong with big data, yet at an Australian forum on big data analytics, marketers were still excited to share that Target had estimated the teenager's due date accurately!

Thankfully, the 'run hard, apologise later' attitude is not shared by all. The Australian government released the Australian Public Service Big Data Strategy in August 2013. The Strategy provides that government data is a national asset that should be used for the public good. Big data is poised to improve the delivery of government services, encourage innovation and advance policy formation. Privacy is a crucial plank of the Strategy, with the Australian government recognising that getting privacy right enables big data activities to be implemented successfully while maintaining citizen trust.

And maintaining citizen or consumer trust is key. In Australia, our Community Attitudes to Privacy survey found that the majority of Australians (60%) have decided not to deal with a private company due to concerns about how their personal information would be used.

Staying in Australia, in August 2013 the Association for Data-driven Marketing & Advertising (ADMA) released Best Practice Guidelines on big data. The guide suggests that to manage customer concerns about privacy, businesses should be transparent, explaining how customers' data is being collected and giving them choice. It also says businesses should put the customer first, be careful about personalisation and, as ADMA puts it, 'don't be creepy'.

The ADMA guide encourages marketers to consider using de-identified or anonymised data. As we know, de-identification is not a silver bullet for big data privacy concerns. But much can be achieved by using de-identified data. And it would seem that big data users are less interested in the identities of individuals than in their attributes.

Our Office released guidance on de-identification in February 2014. The guidance helps organisations assess when de-identification is appropriate, how to choose appropriate de-identification techniques, and how to assess and mitigate the risk of re-identification. That risk does exist, but again it can often be identified and mitigated.
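To give a sense of what those techniques involve, here is a minimal sketch, in Python, of one common approach: generalising quasi-identifiers and then checking a simple k-anonymity style measure of re-identification risk. The dataset, field names and threshold below are hypothetical illustrations, not drawn from our guidance.

```python
# Illustrative sketch only: generalise quasi-identifiers (age, postcode),
# then measure re-identification risk as the size of the smallest group
# of records sharing the same generalised values (k-anonymity style).
# The records and field names are hypothetical, not from the OAIC guidance.
from collections import Counter

records = [
    {"age": 34, "postcode": "2001", "condition": "asthma"},
    {"age": 36, "postcode": "2001", "condition": "diabetes"},
    {"age": 35, "postcode": "2004", "condition": "asthma"},
    {"age": 52, "postcode": "2001", "condition": "asthma"},
]

def generalise(record):
    """Coarsen quasi-identifiers: 10-year age bands, truncated postcodes."""
    band = record["age"] // 10 * 10
    return {
        "age_band": f"{band}-{band + 9}",
        "postcode": record["postcode"][:3] + "*",
        "condition": record["condition"],  # the attribute of analytic interest
    }

def smallest_group(dataset, quasi_identifiers):
    """A dataset is k-anonymous when every combination of quasi-identifier
    values is shared by at least k records; return that minimum group size."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in dataset)
    return min(groups.values())

deidentified = [generalise(r) for r in records]
k = smallest_group(deidentified, ["age_band", "postcode"])
print(f"Smallest group size: {k}")  # a small k signals residual risk:
# generalise further, suppress outliers, or restrict access before release
```

In this toy example the 52-year-old is the only record in their group (a k of 1), so the dataset would need further generalisation or suppression before it could be treated as reasonably de-identified.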

When organisations decide that using personal information for big data activities is necessary, a privacy framework exists for them to work within. The privacy principles are inherently flexible and technologically neutral. This is their strength.

When organisations get privacy right they retain the trust of consumers and citizens. When they get it wrong their consumer base responds, and so do we as regulators.

When it was reported at the end of 2012 that Instagram had the right to sell users' photographs without payment or notification, its users quickly responded. A day after the user uproar, Instagram wrote on its blog: 'we've heard loud and clear that many users are confused and upset about what the changes [to the Privacy Policy and Terms of Service] mean… we will be doing more to answer your questions, fix any mistakes, and eliminate the confusion'. It went on to say: 'We need to be clear about changes we make — this is our responsibility to you.'

Interestingly, when Twitter was reported to be selling users' tweets to companies (for a reported $32 million in the first half of 2013, and again in 2015), a similar reaction was not seen. Chris Moody, Twitter's data strategy chief, shared that they ask themselves 'how do we ensure that we are not being creepy?' The answer he shared was context. Because Twitter users know 'what you say on Twitter may be viewed all around the world instantly'.

Context was also shown to be crucial by the World Economic Forum and Microsoft through their work on individuals' attitudes to personal information and its use. Salinger Privacy considered that work to show that the single most important variable affecting the 'acceptability' of a scenario was not the type of data at issue, the way it was proposed to be used, or even the type of organisation or institution seeking to use it, but the method by which the personal information was collected.

Yet people are sharing more about themselves than ever before. What people share, however, is often carefully chosen. People share an 'image' of themselves on social media. If we asked the same people to post their last tax return, the details of their medical issues or their bank information, most would not.

Even as they share more information about themselves, Australians are thinking about privacy. Our latest Community Attitudes to Privacy survey told us that Australians believe the biggest privacy risks facing people are online services — including social media sites. Young Australians were most concerned about personal information and online services, with six in ten mentioning this as a privacy risk.

The survey also told us that the majority of Australians are annoyed when they receive unsolicited marketing. And 97% of Australians do not like their personal information to be used for a secondary purpose.

These statistics clearly point to the importance of getting privacy right for big data. People are saying they want to know why their information is being collected and how it will be used. When they don’t like the answer they are choosing not to deal with the organisation.

And Silicon Valley is listening. Just last week at the EPIC 2015 event in Washington, Apple CEO Tim Cook sought to position Apple as a company that can be trusted on privacy. He has been reported as saying 'Apple doesn't want your data… we don't think you should ever have to trade [your personal information] for a service you think is free but actually comes at a very high cost. This is especially true now that we're storing data about our health, our finances and our homes on our devices.' Cook went on to say 'We believe the customer should be in control of their own information'.

Getting privacy right is important for business. In Australia, Australian Privacy Principle 1 is the bedrock principle: it is all about openness and transparency. It requires the practices, procedures and systems that organisations need to have in place to meet their obligations. If organisations get APP 1 right, they are well on their way to getting privacy governance right. And this bedrock principle works with the other privacy principles to give people choice and control.

Often we boil choice and control down to notice and consent. This is because notice and consent represent the balance between organisational accountability and individual privacy self-management. This is the balancing act of privacy.

With the emergence of big data we have heard from commentators that this balance no longer works. They are concerned that notice and consent does not deliver meaningful privacy outcomes for individuals. They are concerned about the onslaught of notices that individuals endure, where in practice many people do not see them or choose to ignore them.

Making it work is about organisations telling people in innovative ways what they are doing with their information. Making it work is about making it easy for individuals to understand what their personal information is being used for and making choices based on that.

Innovative new approaches to notice and consent are starting to be suggested, like 'the New Deal on Data' by Professor Alex Pentland. The New Deal proposes, for example, a dashboard '…to give people the ability to see what's being collected and opt out or opt in.' Ideas like this could transform how notice and consent work in practice.
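As a rough illustration only (the data categories, names and interface below are invented for this example, not Professor Pentland's design), a dashboard of this kind might sit on top of a simple per-user consent register that services check before each use of the data:

```python
# Illustrative sketch of a per-user consent register that a 'New Deal on
# Data' style dashboard might expose. Categories and method names are
# invented for this example.
from dataclasses import dataclass, field

@dataclass
class ConsentRegister:
    """Tracks, per data category, whether the user has opted in."""
    choices: dict = field(default_factory=dict)  # category -> bool

    def opt_in(self, category: str) -> None:
        self.choices[category] = True

    def opt_out(self, category: str) -> None:
        self.choices[category] = False

    def allowed(self, category: str) -> bool:
        # Default deny: no recorded choice means no consent.
        return self.choices.get(category, False)

    def dashboard(self) -> dict:
        """What the user would see: each category and its current setting."""
        return {c: ("opted in" if ok else "opted out")
                for c, ok in self.choices.items()}

register = ConsentRegister()
register.opt_in("location")
register.opt_out("purchase_history")

# A service consults the register at the point of use, rather than
# relying on a one-off notice buried in terms and conditions.
if register.allowed("location"):
    print("personalise content by location")
print(register.dashboard())
```

The point of the sketch is the shift in where consent lives: as a living, per-category setting the individual can see and change at any time, rather than a single agreement given once at sign-up.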

Think for a moment: if the privacy framework were to move away from a notice and consent paradigm, what would we move to? An accountability model has been suggested, where transparency would be a key component. Yet what is transparency without choice? Imagine being told by a company 'we will use your personal information to decide how to better market our products to you' and not being able to say 'no, thanks'. Getting notice and consent right for the online world is a challenge. It calls for effort. It calls for innovation. And innovation is what business is good at.

At the same time, encouraging organisations to consider the existing privacy framework with a greater accountability lens is valuable.

In Australia, following a year of bedding down the most significant reforms to the Privacy Act in 20 years, we are now turning our attention to privacy accountability and governance. And as I mentioned before, to support organisations we have launched a Privacy Management Framework, because having a strong and effective privacy governance framework in place will help guide the development of new technologies and activities, like big data, in a privacy-friendly way.

Robust privacy governance also minimises the risk of data breaches occurring, or enables them to be managed appropriately if they do occur. This is particularly important in the era of big data and the creation of data honey pots. In fact, PricewaterhouseCoopers reported, as a key finding of its Global State of Information Security Survey 2015, that security incidents from hackers increased by 48 percent worldwide last year. We also see some jurisdictions, like the USA and Australia, committing to introducing mandatory data breach notification laws, partly in response to the rise of big data.

We live in a world where big data has become interesting enough to have its own television series: the National Geographic series 'The Big Picture'. Kal Penn, host of the series, explained that his interest in big data started when he noticed all the negative reporting about it, and that even with all that reporting, 'very few of us read those user agreements'.

We know that just over fifty percent of Australians do not read privacy policies. We also know they are sharing and engaging online more than ever. So why don’t they read privacy policies? They tell us that the policies are too long, too complex and just plain boring.

Yet, when they hear how their personal information, or their children's personal information, is being used, they do care.

When it was revealed in 2014 that Facebook had changed users' News Feeds 'to show either only happy or sad posts from friends', leading to a corresponding effect on mood, there was a user backlash. Users described the research as 'creepy' and 'terrifying'.

Facebook had relied on user consent for the research program, on the basis that research was included in their Terms and Conditions. Following user reaction, Chief Technology Officer Mike Schroepfer acknowledged in a blog post that the social network mishandled the study. Facebook has since instituted a new framework for handling research.

The question of how much consumers value their data was considered by product strategy and design firm Frog. Frog surveyed 900 people from the United States, the United Kingdom, Germany, China and India about their awareness of how their data was collected and used. Writing about their study in a Harvard Business Review article, Frog explained that they found ‘government identification, health, and credit card information tended to be the most highly valued personal information across countries, and location and demographic information among the least.’

They also looked at how consumers felt about their privacy and what they expected to receive in return for their data. Interestingly, Germans placed the most value on their personal data, Chinese and Indian respondents the least, with British and Americans in the middle.

Frog also found that how people felt about the use of their personal information depended largely on the value they got from the bargain, as well as how transparent the data practices were. For example, using Disney's MagicBand to access rides, pay for food and unlock hotel rooms was considered a fair bargain by users because, in trading data about their preferences, they received convenience and a 'sense of privileged access'.

However, where organisations' data collection practices affect more vulnerable people, like children or bystanders, attitudes can be different. On hearing about Hello Barbie (the new Barbie that can hear your children and reply to them, collecting the data in the process), the Campaign for a Commercial-Free Childhood started a petition to stop production, and parents are signing it. Will it stop production? Maybe not. Will it lead to better privacy protections? We can hope.

We interact with the online world differently to how we engaged with organisations in the past. The online world is more dynamic, multi-layered and user-centric. In that environment the way privacy information is presented could be so much better. It can be given in context, at the right time; it can be easy to read and understand; it can be meaningful. What we need is innovation. And what business is good at is innovation.

Thank you.