Friday, September 28, 2012

Leak reveals EU's plans for Large-Scale Surveillance of Communications - kracktivist

This article is also available in:
Deutsch: CleanIT – Pläne zur Überwachung des Internets im großen Stil

A leaked document from the CleanIT project shows just how far internal discussions in that initiative have drifted away from its publicly stated aims, and from the most fundamental legal rules that underpin European democracy and the rule of law.

The European Commission-funded CleanIT project claims that it wants to fight terrorism through voluntary self-regulatory measures that defend the rule of law.

The initial meetings of the initiative, with their directionless and ill-informed discussions about doing "something" to solve unidentified online "terrorist" problems, were mainly attended by filtering companies, who saw an interesting business opportunity. Their work has paid off, with numerous proposals for filtering by companies and governments, proposals for liability in case sufficiently intrusive filtering is not used, and calls for increased funding by governments of new filtering technologies.

The leaked document contradicts a letter sent from CleanIT Coordinator But Klaasen to Dutch NGO Bits of Freedom in April of this year, which explained that the project would first identify problems before making policy proposals. The promise to defend the rule of law has been abandoned. There appears never to have been a plan to identify a specific problem to be solved; instead, the initiative has become little more than a protection racket (use filtering or be held liable for terrorist offences) for the online security industry.

The proposals urge Internet companies to ban unwelcome activity through their terms of service, but advise that these "should not be very detailed". This already widespread approach results, for example, in Microsoft (as a wholly typical example of current industry practice) having terms of service that would ban pictures of the always trouserless Donald Duck as potential pornography ("depicts nudity of any sort ... in non-human forms such as cartoons"). The leaked paper also contradicts the assertion in the letter that the project "does not aim to restrict behaviour that is not forbidden by law": the whole point of prohibiting content in terms of service that is theoretically prohibited by law is to permit extra-judicial vigilantism by private companies; otherwise the democratically justified law would be enough. Worse, the only way for a company to be sure of banning everything that is banned by law is to use terms that are broader, less well defined and less predictable than real law.

Moving still further into the realm of the absurd, the leaked document proposes the use of terms of service to remove content "which is fully legal"... although what gets removed is left to the "ethical or business" priorities of the company in question. In other words, if Donald Duck is displeasing to the police, they would welcome, but don't explicitly demand, ISPs banning his behaviour in their terms of service. Cooperative ISPs would then be rewarded by being prioritised in state-funded calls for tender.

CleanIT (terrorism), financed by DG Home Affairs of the European Commission, is duplicating much of the work of the CEO Coalition (child protection), which is financed by DG Communications Networks of the European Commission. Both are, independently and without coordination, developing policies on issues such as reporting buttons and flagging of possibly illegal material. Both CleanIT and the CEO Coalition are duplicating each other's work on creating "voluntary" rules for notification and removal of possibly illegal content, and are jointly duplicating the evidence-based policy work being done by DG Internal Market of the European Commission, which recently completed a consultation on this subject. Both have also been discussing upload filtering, to monitor all content being put online by European citizens.

CleanIT wants binding engagements from internet companies to carry out surveillance, to block and to filter (albeit only at "end user", meaning local network, level). It wants a network of trusted online informants and, contrary to everything it has ever said, it also wants new, stricter legislation from Member States.

Unsurprisingly, in EDRi's discussions with both law enforcement agencies and industry about CleanIT, the word that appears most frequently is "incompetence".

The document linked below is distributed to participants on a "need to know" basis; we are sharing the document because citizens need to know what is being proposed.

Key measures being proposed:

  • Removal of any legislation preventing filtering/surveillance of employees' Internet connections
  • Law enforcement authorities should be able to have content removed "without following the more labour-intensive and formal procedures for 'notice and action'"
  • "Knowingly" providing links to "terrorist content" (the draft does not refer to content which has been ruled to be illegal by a court, but undefined "terrorist content" in general) will be an offence "just like" the terrorist
  • Legal underpinning of "real name" rules to prevent anonymous use of online services
  • ISPs to be held liable for not making "reasonable" efforts to use technological surveillance to identify (undefined) "terrorist" use of the Internet
  • Companies providing end-user filtering systems and their customers should be liable for failing to report "illegal" activity identified by the filter
  • Customers should also be held liable for "knowingly" sending a report of content which is not illegal
  • Governments should use the helpfulness of ISPs as a criterion for awarding public contracts
  • The proposals on blocking lists contradict each other: on the one hand they call for comprehensive details for each piece of illegal content and judicial references, but then say that the owner can appeal (although if there were already a judicial ruling, the legal process would already have been at an end) and that filtering should be based on the "output" of the proposed content regulation body, the "European Advisory Foundation"
  • Blocking or "warning" systems should be implemented by social media platforms: somehow it will be both illegal to provide (undefined) "Internet services" to "terrorist persons" and legal to knowingly provide access to illegal content, while "warning" the end-user that they are accessing illegal content
  • The anonymity of individuals reporting (possibly) illegal content must be preserved... yet their IP address must be logged to permit them to be prosecuted if it is suspected that they are deliberately reporting legal content, and to permit reliable informants' reports to be processed more quickly
  • Companies should implement upload filters to monitor uploaded content and make sure that removed content, or content similar to what has been removed, is not re-uploaded (a minimal sketch of such a filter follows this list)
  • It proposes that content should not be removed in all cases but "blocked" (i.e. made inaccessible by the hosting provider, not "blocked" in the access-provider sense) and, in other cases, left available online but with the domain name removed.
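To make concrete what even the simplest version of the upload-filter proposal above would involve, here is a minimal, hypothetical sketch in Python. It is not taken from the leaked document: it merely compares a hash of each upload against a set of hashes of previously removed material and blocks exact re-uploads. The function name, the set name and the example hash are ours; catching content that is merely "similar" to removed material, as the proposal demands, would require far more invasive techniques such as perceptual hashing or machine-learning classifiers.

    import hashlib

    # Hypothetical illustration only: an exact-match "upload filter".
    # Hashes of previously removed content (assumed to be maintained elsewhere).
    REMOVED_CONTENT_HASHES = {
        # SHA-256 of b"test", used here purely as an example entry
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_reupload(uploaded_bytes: bytes) -> bool:
        """Return True if the upload is byte-identical to previously removed content."""
        digest = hashlib.sha256(uploaded_bytes).hexdigest()
        return digest in REMOVED_CONTENT_HASHES

    # Example: a platform would have to run a check like this on every single upload.
    if __name__ == "__main__":
        sample = b"test"
        print("blocked" if is_reupload(sample) else "allowed")

Even this trivial check has to inspect every upload by every user, and it is defeated by changing a single byte, which is why the proposal's reference to "similar" content points towards much broader monitoring.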

Leaked document: http://www.edri.org/files/cleanIT_sept2012.pdf

CleanIT Project website: http://www.cleanitproject.eu/

Microsoft "code of conduct": http://windows.microsoft.com/is-IS/windows-live/code-of-conduct

CleanIT's letter to Bits of Freedom about "factual inaccuracies" and their unfulfilled promise to produce a problem definition: http://95.211.138.23/wp-content/uploads/2012/07/20120106-Reaction-blog...

EDRigram article 29 August: http://edri.org/edrigram/number10.16/cleanit-safer-internet-for-terror...

EDRigram article 20 June: http://edri.org/edrigram/number10.12/the-rise-of-the-european-upload-f...

Source: http://kractivist.wordpress.com/2012/09/28/leak-reveals-eus-plans-for-large-scale-surveillance-of-communications/
