Personal Data Protection | George Papantoniou & Co LLC



THE RIGHT TO BE FORGOTTEN

Data protection is a subject of growing interest within the landscape of Internet law. More personal data than ever is collected about people, yet the processing of such data undergoes little rigorous attention or control. The danger of data remaining in the public domain indefinitely is evident, and too few measures have been taken to balance the gains and threats of such processing and to ensure adequate protection for individuals. In Europe, an initial step in the direction of protecting individuals’ personal data was taken with the adoption of the EC Data Protection Directive (“DPD”) on the processing of personal data and the free movement of such data.

Moreover, the ‘Google judgment’[1] has had a significant impact on the operation of the internet. The judgment affects internet operators in the way they handle personal data, and it also has a direct impact on the extent to which internet users can access data about specific individuals. The Court concluded that the activity of a search engine in finding information published or placed on the internet by other web pages, indexing it automatically, storing it temporarily and making it available to internet users must be classified as ‘processing of personal data’.

Furthermore, the judgment stated that an individual whose personal data has been processed may ask the controller to erase or block any processed data which does not comply with the provisions of Directive 95/46/EC (the DPD). However, the Directive permits member states to allow strictly necessary exceptions to these basic rules in order to reconcile the right to privacy with the freedom of expression and, particularly, with journalistic, artistic or literary expression. With regard to the controller’s liability, the CJEU explicitly put fundamental rights, and particularly rights relating to privacy, on the agenda.

The Google judgment has attracted much criticism on human rights grounds. One opinion argues that the judgment fails to balance the right to be forgotten against other fundamental rights, particularly freedom of expression and freedom of the press, and fails to address the responsibilities of private internet operators beyond Google which carry out the same activities. Another criticism concerns the judgment’s consistency with the ECHR, to which the EU is soon to accede. It has been argued that the CJEU’s reluctance to refer to the ECHR and to the Strasbourg Court’s case law is particularly problematic, both in light of the EU’s pending accession and, more broadly, because it risks upsetting the sensitive constitutional balance struck in the European fundamental rights landscape to date, in which the European Convention has played a crucial part.

On the other hand, the exponential growth of the material available on the internet and the expansion of search engines are relatively new phenomena that could not have been foreseen at the time of the Directive’s enactment. The Court further highlighted the significance of the fundamental rights argument in its ruling. Despite the absence of specific legislation regulating the flow of information, the Court was not faced with an absence of relevant rules altogether: the constitutional status of the provisions enshrined in the Charter allowed the Court to take them into account, because the higher normative validity of these rights affects the interpretation of all other legislation.

In Article 17, the Commission Proposal introduced a new right: the right to be forgotten and to erasure. It has been one of the most controversial changes, raising much debate about what exactly the ‘forgetting’ means, the potential conflict with the freedoms of expression, information and the conduct of business, and what it would entail for data controllers. Despite the controversy, the right survived the European Parliament’s (‘EP’) first reading with a wider scope and stronger implementation tools, albeit retitled the ‘right to erasure’.

There are two distinct visions of the right to be forgotten: the first is a right to have one’s data deleted when its retention or other processing continues for longer than authorised or is illegal for other reasons; the second is a right to a clean slate, an example of ‘forgive and forget’ thinking, meaning that ‘outdated negative information should not be used against people’.

However, this view of the right to be forgotten and the right to know is not globally shared. In the USA, the District Court for the Southern District of New York decided in the similar case of Zhang et al v Baidu.com Inc that freedom of speech, as set out in the First Amendment, protects the results produced by a search engine. This is because the First Amendment to the US Constitution protects freedom of speech as a paramount freedom. The decision also drew on several other similar US decisions.

The “right to be forgotten” as analysed in the Google case is a step forward in the field of privacy rights. Although it is criticised on freedom of speech grounds, it arguably still strikes a balance between fundamental rights.

 

THE ERA OF BIG DATA

Another major issue concerning privacy and data protection is ‘big data’ and the benefits deriving from it. Data protection and privacy have not always been easy to reconcile. The decision in British Gas Trading Ltd v Data Protection Registrar specified that the aim of the data protection principles is to protect individuals with respect to the processing of personal data. Furthermore, a large part of big data is produced about people by other parties, mainly public and private organisations gathering and storing data about individuals in databases. Google and Facebook literally know more about us than we can remember ourselves.

Mobile phones continuously generate location data that is stored by telecom providers for every communication. As pointed out more than ten years ago, when an internet user surfs, transactional data are created: wherever someone goes on the internet, they leave a digital trace. Most of this data is not personal data, but it can always be combined with other data, at which point it instantly becomes personal data. In addition, data can be used for profiling and can become relevant when such anonymous profile data are applied to an individual.
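To make this concrete, the following is a minimal illustrative sketch in Python, with entirely invented records, of how an ‘anonymous’ dataset becomes personal data once it is joined with auxiliary data from another context:

```python
# Minimal sketch: 'anonymous' data turning into personal data when
# combined with another dataset. All records here are hypothetical.

# Anonymised browsing log: no names, only quasi-identifiers
# (postcode, birth year) alongside the behavioural data.
browsing_log = [
    {"postcode": "1065", "birth_year": 1980, "visited": "clinic.example"},
    {"postcode": "2021", "birth_year": 1975, "visited": "news.example"},
]

# Auxiliary data from another context, e.g. a public register.
register = [
    {"name": "A. Person", "postcode": "1065", "birth_year": 1980},
]

# Joining on the shared quasi-identifiers re-identifies the record.
for row in browsing_log:
    for person in register:
        if (row["postcode"], row["birth_year"]) == (person["postcode"], person["birth_year"]):
            print(f"{person['name']} visited {row['visited']}")
```

Nothing in the browsing log names anyone, yet a single join against an ordinary public record is enough to attach a name to it.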

Legally speaking, however, the DPD is applicable only when personal data is processed. It is argued that if group profiles are applied to identified or identifiable persons, those persons can object to the application of group profiles based exclusively on automated decision-making. The irony is that people simply do not know whether certain profiles are being applied to them through web personalisation based on web mining. What makes such data useful are the algorithms that process it and extract meaningful value from it; a sketch of this follows below.

The problem is to find the right balance between privacy risks and big data rewards. Big data creates many opportunities in various aspects of life. Its importance lies not only in its size and how fast it is growing, but also in the fact that it comes from a remarkable array of sources. At the same time, these extraordinary benefits are tempered by concerns about privacy, fairness, equality and freedom of expression. Furthermore, data is routinely collected and traded with few controls to secure it. Nevertheless, even data analysts call for legislation to reclaim some of that privacy and to ensure that any data collected remains secure.
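The mechanics of group profiling described above can be sketched in a few lines; the categories and labels below are invented for illustration and do not reflect any real system:

```python
# Illustrative sketch of group profiling: a label mined from aggregate
# behaviour is silently applied to any individual who matches the group.
# All profiles and labels are invented.

group_profiles = {
    # (dominant site category, usage pattern) -> label mined from many users
    ("gambling", "nightly"): "high credit risk",
    ("finance", "daytime"): "low credit risk",
}

def apply_profile(site_category: str, usage_pattern: str) -> str:
    """Return the group label this user inherits, if any."""
    return group_profiles.get((site_category, usage_pattern), "no profile")

# The individual never provided this data and never learns the label.
print(apply_profile("gambling", "nightly"))   # -> high credit risk
```

The point made in the text is visible here: the person matched against the profile has no way of knowing that the label exists, let alone of objecting to it.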

Privacy in the EU is protected by law. Citizens have the right to object to the processing of their personal data without consent, on “compelling legitimate grounds”. Furthermore, collectors of personal data must protect it from misuse. Yet even where the legislature demonstrates awareness, large-scale data processing may escape practical scrutiny of its risks. In the information age, in which data represents individuals in ever more transactions, the accuracy and reliability of data may matter more than privacy. Cloud computing has proved this true: critical data and applications are stored remotely, which renders accessibility more important than privacy, and states sacrifice individuals’ privacy in the name of national security. Personal data is universally collected and shared across the world, while most data protection laws are old and have not been updated to protect individuals, institutions and society against the threats from big data shared globally.

 

NETWORK MONITORING SYSTEMS

Network monitoring systems have attracted increasing interest, raising both technical and legal issues. Monitoring can be limited to observing which systems communicate with each other, or it can go further and look into the content of the traffic exchanged on the network. The technologies used for such monitoring must record all network data in order to be practically useful, so privacy concerns are highly significant. An example is the “Cleanfeed” technology, which performs blanket filtering of the Web in order to protect ISPs’ clients from illegal content. This has prompted public criticism because a private body, the Internet Watch Foundation (IWF), could take censorship decisions for UK internet users by determining whether content might be illicit, with only partial procedural safeguards and no supervision. The IWF had shown little interest in civil liberties and had given no answers about its legal qualifications for carrying out this task, its lack of public accountability or its failure to implement due process.
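Cleanfeed’s internals are not public, but the basic shape of any URL-blocklist filter of this kind can be sketched as follows; the blocklist entries and URLs are invented:

```python
# Illustrative sketch of an ISP-side URL blocklist filter of the kind
# described above (not the actual Cleanfeed implementation, which is
# not public). Blocklist entries and URLs are invented.

BLOCKLIST = {
    "http://bad.example/illegal-page",
}

def filter_request(url: str) -> str:
    """Decide whether a customer's request is forwarded or blocked."""
    if url in BLOCKLIST:
        return "BLOCKED"      # the customer often just sees a generic error
    return "FORWARDED"        # the request proceeds to the origin server

print(filter_request("http://bad.example/illegal-page"))  # BLOCKED
print(filter_request("http://good.example/"))             # FORWARDED
```

Even in this toy version the civil-liberties concern is visible: whoever edits BLOCKLIST decides what customers can see, with no external check on those decisions.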

Cleanfeed was involved in an incident with Wikipedia which raised concerns about the legislative basis of any blocking system and about freedom of expression. In 2008 the IWF blocked a Wikipedia URL because it contained a pornographic image of a child: the cover image of a Scorpions album that had been legally sold many years earlier. Moreover, the IWF blocked not only the image but access to the entire Wikipedia entry. Although the IWF is a subject of criticism, there is no doubt that it is a serious body doing important work. It cannot be compared with the for-profit Google, which owns YouTube, arguably the world’s main video channel. Google has enormous influence over who can find an audience on the Web around the world. Google can monitor what controversial material appears on the local search engines it maintains in many countries around the globe, as well as on Google.com, which is accessible globally. Consequently, Google arguably has more influence over the contours of online expression than anyone else on the planet.

In 2008, an advertising company named Phorm triggered anger from privacy campaigners because it aimed to help advertisers better target consumers by monitoring their browsing history. The technique Phorm deployed for advertising involved equipment installed in an ISP’s network that intercepted all web traffic passing along every customer’s broadband connection and scanned it for keywords that could be used to serve targeted advertising reflecting customers’ communication flows. The main criticism of this practice was that it would have been an opt-out service, meaning that no prior consent was given by the users whose data was monitored. The Phorm scandal finally led the European Commission to launch infringement proceedings against the UK in April 2009 for failing to fully implement the DPD and the e-Privacy Directive.[2]
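The interception-and-keyword-matching step attributed to Phorm can be sketched roughly as follows; the keyword list and page text are invented, and the real system’s internals were never made public:

```python
# Rough sketch of keyword scanning over intercepted traffic, as
# attributed to Phorm above. Keywords and page content are invented.

AD_KEYWORDS = {"holiday", "mortgage", "car", "insurance"}

def interests_from_page(intercepted_text: str) -> set[str]:
    """Match one intercepted page against advertiser keywords."""
    words = set(intercepted_text.lower().split())
    return AD_KEYWORDS & words

# In the deployed system, every page on every customer's connection
# would pass through a step like this, opt-out rather than opt-in.
print(interests_from_page("Compare cheap holiday and car insurance deals"))
```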

Another similar case concerns music copyright over peer-to-peer protocols. In 2009, Virgin Media UK planned to deploy Detica’s CView technology, which could monitor 40 per cent of its customers without their awareness or prior consent. The company’s purpose was to establish an “index” of copyright infringement, and it emphasised that data derived from it would be aggregated and anonymised. The company also assured that no humans would see the data: each user’s IP address was to be replaced with a randomly generated unique identifier. Despite that anonymity, however, privacy campaigners argued that the technology could be used to identify those using torrents or to block content, and that this would require only “a slight tweak to the software”.
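The pseudonymisation step described above, and the reason campaigners thought “a slight tweak” would defeat it, can both be seen in a minimal sketch (addresses and events invented):

```python
# Minimal sketch of CView-style pseudonymisation as described above:
# each IP address is replaced by a randomly generated identifier.
import secrets

pseudonyms: dict[str, str] = {}

def pseudonymise(ip: str) -> str:
    """Replace an IP address with a stable random identifier."""
    if ip not in pseudonyms:
        pseudonyms[ip] = secrets.token_hex(8)
    return pseudonyms[ip]

record = {"user": pseudonymise("203.0.113.7"), "event": "torrent observed"}
print(record)

# The campaigners' point: the mapping table is the 'slight tweak' away
# from identification. Whoever holds it can reverse the anonymisation.
print({v: k for k, v in pseudonyms.items()})  # identifier -> IP address
```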

It is noteworthy that there are many perfectly legitimate uses of peer-to-peer applications such as BitTorrent. Hence it would be dangerous for an ISP to block BitTorrent as such; it would need to be sure that a particular file actually infringes copyright. Considering how the legal framework has developed, it is arguable that self-regulatory attempts to discipline copyright infringement over peer-to-peer networks would not easily withstand critical legal examination. Undeniably, on the one hand, the findings of the CJEU are vague; on the other hand, national judges have on numerous occasions declined to identify any considerable privacy implications in the use of DPI practices.

The case of Scarlet v Sabam[3] also concerns copyright infringement and privacy, involving technology similar to Detica’s CView. The CJEU’s decision attempted to assess the validity of DPI practices from different angles. The CJEU examined Article 15 of the e-commerce Directive:[4] the contested injunction would require “active observation of all electronic communications conducted on the network of the ISP”, yet Article 15 prohibits the imposition of such obligations on intermediary providers. Concerning the protection of personal data, the systematic analysis of all content and the collection and identification of users’ IP addresses involve protected personal data, because IP addresses permit those users to be identified. The CJEU concluded that the collection of IP addresses cannot be the real source of concern where only the packet payloads transmitted through suspicious ports are inspected.[5] The decision of the CJEU remains ambiguous. It does not mention Article 15 of the DPD, which could seem relevant to DPI practices; such practices violate the principle of confidentiality of communications. Besides that, on the grounds of the e-commerce Directive and the freedom to conduct one’s business, the Court could justify DPI practices if they are adopted voluntarily.

In the more recent case of UPC Telekabel Wien GmbH v Constantin Film Verleih GmbH and another, the CJEU had to judge whether an internet service provider could effectively prevent unauthorised access to copyrighted material while not interfering with internet users’ fundamental right to freedom of information. Unsurprisingly, the court did not mention the right to privacy when assessing the validity of a generic injunction requiring an internet access provider to block access to an infringing website. Apart from that, the addressee of such a generic injunction must select measures that comply both with the injunction and with internet users’ fundamental right to freedom of information.

The CJEU has required that, in order to obtain the exemption from liability under Article 14 of Directive 2000/31/EC, an intermediary must show that it has no control over third-party data. Article 14 thus offers a shield to hosting intermediaries where they have no actual knowledge of illicit activities and, upon obtaining such knowledge, act expeditiously to remove or disable access to the litigious data. It is therefore a “neutrality test” that is applied to the role played by the intermediary. The CJEU has given some guidance, but the neutrality test is to be applied by national courts.

Another data protection issue concerns “cookies”, a technology used to track internet users’ activity. The technology is well developed and cannot easily be avoided, even if a user sets his browser to refuse cookies. The data is analysed in order to construct a personal profile for each user, an analysis called data mining. In most cases, the data is not expressly provided by users, and the information collected can be very detailed. Moreover, unremarkable and non-sensitive personal data can take on an unexpected new significance when combined with data collected in other contexts. To control these advertising practices, the e-Privacy Directive originally adopted an opt-out system; with the development of information technology it has been amended, and websites must now flag their use of cookies on each visit and obtain users’ opt-in consent, which gives users control over what information about them is exposed. However, most activities are based on session cookies, which are stored only temporarily; other cookies are persistent and can record every mouse click. Unfortunately, this important distinction is not appreciated by everyone who rejected the proposal to move to an opt-in system. Cookies facilitate behavioural advertising, and regulating this particular technology must therefore take into account the inherent risks related to it. The law clearly needs to be backed by further tools, such as compulsory default browser settings that block tracking cookies. A minimal illustration of the session/persistent distinction follows below.
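The distinction is visible in the Set-Cookie headers themselves; here is a minimal sketch using Python’s standard http.cookies module, with invented cookie names and values:

```python
# Minimal sketch of the session/persistent cookie distinction using
# Python's standard http.cookies module.
from http.cookies import SimpleCookie

cookie = SimpleCookie()

# Session cookie: no expiry attribute, so the browser discards it
# when the browsing session ends.
cookie["session_id"] = "abc123"

# Persistent tracking cookie: Max-Age keeps it alive for a year,
# letting a tracker recognise the browser across visits.
cookie["tracker"] = "user-42"
cookie["tracker"]["max-age"] = 60 * 60 * 24 * 365

# The Set-Cookie headers a server would emit:
print(cookie.output())
```

The only difference between the two is the expiry attribute, which is why a consent regime that treats all cookies alike misses the risk profile of the persistent kind.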

[1] Case C-131/12 Google Spain SL and Google Inc v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, paragraphs 65–68

[2] European Commission, “Telecoms: Commission launches case against UK over privacy and personal data protection”, 14 April 2009, IP/09/570

[3] Case C-70/10 Scarlet Extended SA v SABAM [2012] E.C.D.R. 4

[4] Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce)
