You are being tracked online, and you probably have no idea how much. Are company pledges to use data fairly enough to assuage concerns?

Personal data is money. The more of it that a company can amass, the greater the possibilities to “monetise” it through behavioural advertising and other techniques for categorising and targeting consumers.

And just like money, personal data can be used for good or bad. It can underpin legitimate advertising, leaving consumers to choose if they want to respond. But it can also be used to curtail free choice by presenting consumers with a skewed range of options that advertisers think are appropriate.

This is the concept of “digital predestination”, and it is starting to alarm data protection authorities. Each year, the world's privacy commissioners get together to highlight emerging trends; this year, their conference was held in Mauritius from 13 to 16 October. The proliferation of personal data and the ubiquity of tracking on the internet, speakers said, raised the possibility that companies will get to know consumers a little too intimately, and will start to predetermine their choices.

Speaking at the conference, Julie Brill, a commissioner at the United States Federal Trade Commission, highlighted the risks of online tracking and analysis of consumer data. She said there was a “clear potential for use of this information to be harmful and discriminatory” by, for example, segmenting consumers on the basis of race, even though race is not a permissible consideration in commercial transactions such as providing loans.

The risks increase when big data and the “internet of things” are factored into the equation. Big data is the use of extremely large datasets to spot behavioural patterns. The internet of things consists of connected devices with sensors, and the constant transmission and analysis of the data those sensors pick up. Internet-of-things devices can be used, for example, to monitor health conditions, or to provide constant regulation of the temperature in buildings.
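
To make the combination concrete, the short Python sketch below (entirely invented: the device, readings, thresholds and inferred pattern are hypothetical, not drawn from any real product) simulates a connected thermostat transmitting temperature readings and a simple analysis step that infers a behavioural pattern from them.

```python
import random
from statistics import mean

def stream_readings(hours=24):
    """Toy 'internet of things' device: a thermostat sampling a sensor once
    an hour and transmitting each reading (here, returning them as a list)."""
    readings = []
    for hour in range(hours):
        # Simulated household: the heating only runs in the evening.
        base = 21.0 if 18 <= hour <= 23 else 16.0
        readings.append({"hour": hour, "celsius": base + random.uniform(-0.5, 0.5)})
    return readings

def infer_daytime_absence(readings, delta=2.0):
    """Toy 'big data' step: spot a behavioural pattern in the transmitted data.
    A real system would run this across millions of households, which is what
    makes inferences about personal habits possible."""
    daytime = [r["celsius"] for r in readings if 9 <= r["hour"] <= 17]
    evening = [r["celsius"] for r in readings if 18 <= r["hour"] <= 23]
    return mean(evening) - mean(daytime) > delta

if __name__ == "__main__":
    data = stream_readings()
    print("Household probably out during working hours:", infer_daytime_absence(data))
```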

Data could identify potential criminals

Big data and the internet of things in combination are a privacy commissioner’s nightmare. Mass surveillance and analysis of health data, for example, could be used to identify those at risk of certain conditions and to steer them in particular directions. Analysis of crime and social data could see people labelled as likely future criminals, even if they have never committed a crime. Jacob Kohnstamm, chairman of the Dutch data protection authority, speaking at the data commissioners’ conference, said that such categorisation could be a “frightening manifestation of digital predestination”.

Paranoia?

The warnings might sound paranoid. But, says Todd Ruback, chief privacy officer at Ghostery, which produces privacy applications, online tracking of the behaviour of internet users is “out of control”. Even respectable companies “70% to 80% of the time have unauthorised trackers on their website that they don't even know about”.

Unauthorised tracking doesn't necessarily happen for dubious reasons, Ruback adds. The problem is that websites take advertising from third parties, and that advertising can carry tracking and analytics applications with which the site has no direct relationship, and of which it may have no knowledge. The internet has become a “very complex marketing cloud”, and it is a “systemic challenge” for companies to know who might be piggy-backing on their online presence and what might be done with the data those intruders collect, Ruback says.
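
To get a sense of how much third-party code a single page can pull in, the sketch below uses only the Python standard library to list the external hosts that a page's script tags point at. It is roughly the kind of audit that tools such as Ghostery automate, though without their catalogue of known trackers; the URL is only a placeholder, and appearing in the list says nothing about whether a given host is actually tracking anyone.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class ScriptSrcCollector(HTMLParser):
    """Collect the src attribute of every <script> tag on a page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def third_party_script_hosts(page_url):
    """Return hosts of scripts served from somewhere other than the page's own domain."""
    own_host = urlparse(page_url).netloc
    with urlopen(page_url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = ScriptSrcCollector()
    parser.feed(html)
    hosts = {urlparse(src).netloc for src in parser.sources}
    return sorted(h for h in hosts if h and h != own_host)

if __name__ == "__main__":
    # Any busy news or e-commerce page tends to produce a long list;
    # example.com is used here only as a placeholder.
    for host in third_party_script_hosts("https://www.example.com/"):
        print(host)
```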

Third party advertising may include tracking

One response to the prospect of intrusive tracking and data collection is legislation. The European Union, for example, is currently updating its data protection laws. The new legislation, which could be finalised in 2015, will make clear that the consent of users is required for data processing – the collection and use of personal information – and could threaten companies with large fines for violations.

But legislation can be cumbersome and have unforeseen consequences. An alternative is that companies demonstrate to consumers that they will safeguard privacy. In the UK, the Market Research Society (MRS) has established the Fair Data certification scheme. To obtain certification, companies must abide by 10 principles, ranging from ensuring that data is only collected on the basis of consent, to ensuring that staff are adequately trained in data processing that respects privacy.


The 10 principles of Fair Data

  1. We will ensure that all personal data is collected with customers’ consent.
  2. We will not use personal data for any purpose other than that for which consent was given, respecting customers' wishes about the use of their data.
  3. We will make sure that customers have access to their personal data that we hold, and that we tell them how we use it.
  4. We will protect personal data and keep it secure and confidential.
  5. We will ensure staff understand that personal data is just that – personal – and ensure that it is treated with respect.
  6. We will ensure that the vulnerable and under-age are properly protected by the processes we use for data collection.
  7. We will manage our data supply chain to the same ethical standards we expect from other suppliers.
  8. We will ensure that ethical best practice in personal data is integral to our procurement process.
  9. We will ensure that all staff who have access to personal data are properly trained in its use.
  10. We will not use personal data if there is uncertainty as to whether the Fair Data principles have been applied.

Source: http://www.fairdata.org.uk/10-principles/


A number of the 10 principles are in fact obligations laid down by legislation – for example, the consent principle, and the principle that data should only be used for the purpose for which consent was given. MRS chief executive Jane Frost says, however, that the 10 principles give a clear summary of a company's approach to privacy that a customer might only otherwise find in the legalese of its privacy policy. It's “quite invidious” to expect internet users to check the privacy policy of every website they visit, she says.

In some areas, the Fair Data principles go beyond legislation. For example, certified organisations commit to ensuring respect for privacy in their supply chains. Frost says the scheme is a work in progress, and the 31 certified companies so far are MRS members that collect huge amounts of data as part of their market research activities. “Many of our companies could identify most people's personal lives down to a very small margin,” Frost says. The Fair Data scheme involves an extensive audit process, and further certifications of non-MRS members will be forthcoming, she adds.

Privacy seals

The UK's data protection authority, the Information Commissioner’s Office (ICO), is also working on a privacy certification scheme. A public consultation on ICO-endorsed “privacy seals” closed in October. Iain Bourne, the ICO's group manager for policy delivery, says seals and trustmarks that convey clearly and simply a company’s commitment to privacy are “definitely the way to go” but schemes must be rigorous, “otherwise it's just people saying nice things”.

The ICO plans to delegate the authority to certify to third parties, so that a number of privacy seal schemes could emerge, run by different organisations that gain the ICO's approval. The aim is to have “the first round of ICO-endorsed schemes up and running in 2016”, Bourne says. The approach of online companies to the privacy of their customers is changing, he adds. “There was a period of 'let's just hoover stuff up' but now they are realising that the skill of the game is not to be in opposition to their customers' wishes.”

The internet is, in fact, dividing along privacy lines. The business model adopted by companies such as Facebook and Google has given them global dominance, but potentially intrudes on the privacy of users. Todd Ruback says: “The internet model we have is a free model, but it's not free, it's subsidised by marketing.” Personal data underpins that marketing.

But consumers are realising that use of some services is at the expense of their privacy, and are starting to defect to non-privacy-invading alternatives, or to adopt spoiling tactics. Frost of the MRS says: “There is evidence that younger people happily dirty their data. Just because you've got data it doesn't mean it is good data.”

New services are increasingly being introduced that incorporate privacy from the outset. The new social network Ello, for example, promises no tracking and no advertisements, as an alternative to Facebook. Another example is the search engine startpage.com, which is positioning itself as a privacy-respecting alternative to Google.

Startpage CEO Robert Beens says its traffic has grown hugely, especially since former US National Security Agency contractor Edward Snowden started revealing details of state surveillance of internet users. However, Beens concedes: “In absolute numbers – about 0.1% of the worldwide Google traffic – we are still a niche player.”

“Privacy is a fundamental civil right,” Beens says. “We protect people’s privacy as far as we can: we don’t store any personal data, don’t use tracking cookies, don’t pass on personal information.” Being able to search the internet anonymously is crucial, he adds, “because it reveals so much of your personal life, religion, hobbies, interests, politics, medical history”.

Microsoft accuses Google of “scroogling” Gmail users

Even the big boys are starting to trade blows on privacy. Microsoft has been running the scroogled.com campaign, which attacks Google for compromising privacy. Google, says Microsoft, monitors Gmail messages for keywords that are then used as the basis for targeted advertising, while the Microsoft alternative Outlook does not “go through your email to sell ads”.

However, Beens says that respecting privacy means lower profits. “We make money by advertising – but this is non-targeted. We show ads on top of the results page based on the search term and language only, not on any personal profile, like the other engines do. This reduces the commercial value but is still good enough to cover all cost and investment. Privacy-sensitive services can easily be profitable. But big tech is just hooked on big profits.”
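
The difference Beens describes is easy to see in code. The sketch below is purely illustrative (the ad inventory and the stored profile are invented), but it contrasts contextual selection, which looks only at the current search term and language, with behavioural selection, which reaches into a profile built up by tracking a user over time.

```python
# Toy ad inventory keyed by topic keywords; entirely hypothetical.
ADS = {
    "running shoes": "Ad: lightweight trainers, 20% off",
    "mortgage": "Ad: compare fixed-rate mortgage deals",
    "camping": "Ad: two-person tents from 49 euros",
}

def contextual_ad(query, language="en"):
    """Pick an ad from the current search term and language alone:
    nothing is stored and no user profile is consulted."""
    if language != "en":
        return None
    for keyword, ad in ADS.items():
        if keyword in query.lower():
            return ad
    return None

def behavioural_ad(profile):
    """Pick an ad from a stored interest profile, the approach that
    requires tracking users over time to build that profile."""
    for interest in profile.get("interests", []):
        if interest in ADS:
            return ADS[interest]
    return None

if __name__ == "__main__":
    print(contextual_ad("best running shoes for flat feet"))
    print(behavioural_ad({"interests": ["camping", "jazz"]}))
```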

But too much privacy could lead to a less rich internet, according to industry advocates. Townsend Feehan, chief executive of IAB Europe, which represents the online advertising sector, says internet users “can't expect to think that everything on the internet can be free. The strategic issue for the industry is to make people aware of what they get” – which is often funded through advertising.

 
Big data means big profits

Nevertheless, IAB Europe supports an “ecology of solutions” for people who are concerned about being tracked online, which Feehan says would consist of legislation backed up by privacy codes such as Fair Data and “added things that industry can do to lower the level of apprehension”. IAB Europe has its own website – youronlinechoices.com – which tells visitors the degree to which they are being targeted by behavioural advertising companies.

The growing wariness of consumers about the creepiness of online tracking – even if for relatively innocent purposes such as advertising – is a “medium to long-term strategic concern” for the industry, Feehan says. What is needed therefore is a “grown-up and informed conversation about what the value exchange is.”
