Is Google taking the pee out of data protection?

It’s no exaggeration to claim that when the history of data protection and privacy is written 10 years from now, one company will be credited with having had the most influence over the shape of data protection and privacy across the European Union (EU).

And it’s Google.

No week goes past without some reference to one of the most powerful digital companies on the planet. And this week just gone has been no exception.

On Thursday 9 July, Google was forced to revise its privacy policy after the Dutch Data Protection Authority (DPA) threatened to fine the company €15m. Google will now have to seek new users’ permission to combine their personal data across its services. The company had been using personal information and cookies drawn from videos watched, search and browsing history, and emails read to generate personalised ads without any explicit consent from its users.

Elsewhere, Google is under pressure from US civil-rights group Consumer Watchdog for failing to extend to US citizens the right to erasure enjoyed by citizens within the EU. Under EU rules, residents of EU countries can ask search engines to delete links to information about them that they find objectionable. The ruling, which went into effect in May 2014, doesn’t require that the information be deleted from the internet, only that search engines not link to it.

The Federal Trade Commission (FTC) is now being asked to investigate, and it’s highly likely that Google will be forced to observe the right to be forgotten in the US before too long.

What we are witnessing in all of these moves is a ‘correction’ in the laws and rules that govern data protection in the online environment.

As data protection rights gain more traction in the online world, regulators across the EU, as well as the European Commission, European Parliament and the Council of Ministers, are beginning to think much more carefully about how to reconcile data protection and browser-generated information (BGI) with the important right to freedom of expression (Article 10) as enshrined in the European Convention on Human Rights (ECHR).

According to many lawyers, this is a really difficult issue and one that we’re only now starting to grapple with.

The recent Court of Appeal landmark ruling in Vidal-Hall versus Google (2015) has influenced the final shape of the EU General Data Protection Regulation (GDPR) that’s due to be signed off later this year.

Vidal-Hall v Google Inc [2015] EWCA Civ 311

The background to Vidal-Hall was that, for a period of time, Google had been secretly tracking the internet browsing activities of its users by means of ‘cookies’, despite having promised users it wasn’t doing so, and it engaged in that tracking activity specifically so that targeted adverts could be sent to users.

The use of targeted advertising, as we all know, is absolutely central to Google’s business model.

Google had been subject to very significant regulatory fines in the US because of the breach of its promise to users, but privacy claims for damages foundered there.

A number of UK-based Google users, including Vidal-Hall, therefore decided to bring claims in the English High Court, both for the misuse of private information and for breach of the Data Protection Act 1998, including compensation for distress under Section 13, DPA 1998.

Because Google is a foreign-based corporation, located in California, a preliminary issue arose as to whether it could be served with the proceedings in the US.

There were three key issues that the Court of Appeal had to consider:

Issue 1: Was the misuse of private information to be classified as a tort or as an equitable claim for breach of confidence?

The Court of Appeal held that, whatever its origins, misuse of private information was now to be classified as a tort rather than as an equitable claim and therefore that claim could properly proceed against Google.

Issue 2: Did browser-generated information (BGI) amount to personal data?

Counsel for Google argued that BGI didn’t amount to personal data because it was drawn from individual devices – iPads, smartphones and so on – and those devices were anonymous so far as Google was concerned: Google didn’t know the identity of the person behind those individual devices, didn’t know their name, didn’t know what they looked like. It must therefore follow, Google argued, that BGI is anonymous, non-identifying, impersonal data.

The Court of Appeal took the view that there were strong arguments that a broad approach should be adopted to the definition of personal data.

The definition doesn’t require that the data controller know, for example, the name of the individual data subject or what they look like; the data controller doesn’t have to be able to recognise them walking down the street.

Instead, the definition requires that the individual is sufficiently ‘individuated’ or singled out from all others by the data controller such that they are in a meaningful sense identified.

Counsel for the respondents to this appeal successfully argued that BGI did amount to personal data based on the following cumulative analysis:

  • The data reflected the unique browsing histories of individual users. More than this, because of the way in which Google operated, Google was able to marry up those unique browsing histories with the IP address of individual users, i.e. their virtual address, and also – and this was very important – through the use of cookie technology, with information which told Google when those individual users were online.
  • It follows that Google doesn’t only know users’ virtual addresses; it also knows when users are at home, in a virtual sense. And it’s this combination of individuating factors which then enables adverts to be targeted at users on an individuated basis.

Issue 3: Was distress, rather than pecuniary loss, recoverable for a data breach under the DPA 1998?

Google argued the claims were barred from proceeding on an application of Section 13, DPA 1998.

This argument was comprehensively rejected by the Court of Appeal, as well as by the Information Commissioner’s Office (ICO).

The Court of Appeal held that compensation was available for mere distress under Section 13 of the DPA 1998, despite the clear contrary wording of that provision.

The Court of Appeal reached that conclusion based on a series of propositions, one of which was that Directive 95/46/EC, from which the DPA 1998 derives, is concerned fundamentally with privacy rights, not economic rights. It follows that the word ‘damage’, as it appears in Article 23 of the EU Directive, must be construed as referring both to pecuniary loss and to distress; the question is not simply whether the data subject has suffered financial loss.

Legal terms in EU law have an autonomous meaning which should be given effect harmoniously across the laws of Member States. The respondents to the appeal argued that Section 13(2) should be disapplied because it conflicts with Articles 7 and 8 of the EU Charter of Fundamental Rights (EUCFR), and the Court of Appeal accepted that argument.

Why this matters

The pendulum has swung in favour of the protection of fundamental rights, even in the face of national laws that would appear to be in contradiction of such a principle.

The EUCFR had horizontal direct effect in respect of the rights in issue, so the Court of Appeal had to strike down the offending exclusionary wording in Section 13, DPA 1998 to ensure that UK domestic law met the requirements of Article 8 of the EUCFR.

With the prospect of the GDPR becoming a reality around the corner, it’s highly likely that courts in the UK and across the EU will take it upon themselves to move ahead and tighten the rules on data protection and privacy, forcing organisations and companies across all sectors to re-assess their data compliance and business continuity risks.
