According to general principles of civil law, only those who have created a source of hazard are required to prevent harm. The exchange of information lies at the root of human nature. It is a fundamental right (Article 11 of the Charter of Fundamental Rights) and can therefore not be considered a "hazard source". The provision of telecommunications services does not create a greater risk of rights infringements than the provision of any other product or service. Typical, socially adequate and therefore legal risks do not make their originator responsible for intentional violations committed by other people.
 
Information society services are also often similar to off-line activities such as personal discussions, noticeboards, CD recorders, photocopying machines or lockers. Internet users must not be discriminated against in a way that puts them under permanent scrutiny and surveillance on-line where similar activities off-line are completely anonymous and confidential, simply because the Internet makes total control technically possible. The Internet must be kept as free as the rest of our lives. In the information society we must defend our freedoms if we do not want to gradually lose them.
 
  
 
Even prevention (or policing) technology that can reasonably be implemented or is industry standard must not be imposed on all service providers, because of its devastating effects on freedom of speech. Filtering technology, for example, will by its nature suppress legal content that is merely similar to illegal content. This leads to the suppression of controversial but politically very valuable content, for example critical comments on companies and products or "fair use" of intellectual property. Policing is not the job of private companies.
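
To illustrate the over-blocking problem, here is a minimal, hypothetical Python sketch of the kind of keyword filter courts have ordered providers to apply; the blocklist and sample posts are invented for illustration:

<pre>
# A naive keyword blocklist, applied to all user-generated content.
# Blocklist and posts are hypothetical examples.
BLOCKLIST = {"rolex", "replica"}

def is_blocked(post):
    """Flag a post if it contains any blocklisted keyword."""
    text = post.lower()
    return any(keyword in text for keyword in BLOCKLIST)

posts = [
    "Cheap replica Rolex watches, best prices!",           # infringing offer
    "Review: my Rolex broke after two weeks - avoid.",     # lawful criticism
    "Essay on trademark law, using Rolex as an example.",  # lawful commentary
]

for post in posts:
    print(is_blocked(post), "-", post)

# All three posts are flagged: the filter cannot tell an infringing offer
# apart from lawful criticism or commentary that merely mentions the mark.
</pre>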
  
 
A service provider that is notified of allegedly illegal content should therefore not be required to remove the content before its legality has been assessed by a judge in a preliminary procedure. Member States can design this procedure to be fast and effective. However, its cost must not be borne by the provider, as this would again have a chilling effect on free speech. Needy claimants can use legal aid. Claimants can recover legal expenses by suing the user who generated the content.
 
I quote the Joint Declaration of the OSCE Representative on Freedom of the Media and Reporters Sans Frontières on Guaranteeing Media Freedom on the Internet:<ref>https://www.osce.org/documents/rfm/2005/06/15239_en.pdf</ref>
 
 
'''A decision on whether a website is legal or illegal can only be taken by a judge, not by a service provider. Such proceedings should guarantee transparency, accountability and the right to appeal.'''
 
 
I quote the Joint Declaration on International Mechanisms for Promoting Freedom of Expression of the UN Special Rapporteur on Freedom of Opinion and Expression, the OSCE Representative on Freedom of the Media and the OAS Special Rapporteur on Freedom of Expression:<ref>http://www.article19.org/pdfs/igo-documents/three-mandates-dec-2005.pdf</ref>
 
 
'''No one should be liable for content on the Internet of which they are not the author, unless they have either adopted that content as their own or refused to obey a court order to remove that content.'''
 
  
 
Recommendation:
 
*The e-commerce directive should be amended as follows: "''Member States shall ensure the confidentiality of the usage of information society services and the related traffic and subscriber data, through national legislation. In particular, they shall prohibit listening, tapping, storage or other kinds of interception or surveillance of usage and the related traffic and subscriber data by persons other than users, without the consent of the users concerned, except when legally authorised to do so. This paragraph shall not prevent technical storage which is necessary for the provision of a service without prejudice to the principle of confidentiality.''"
  
==Question 52==
 
 
''Overall, have you had any difficulties with the interpretation of the provisions on the liability of the intermediary service providers? If so, which?''
 
 
Articles 12-14 shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to prevent "an infringement". It is unclear what is meant by "an infringement". Apparently the legislator is referring to a specific infringement and not to any infringement of a kind. But what means can the provider be required to employ to "prevent an infringement"? Can ineffective measures be required that can easily be circumvented?
 
 
Article 15 does not prevent Member States from imposing a monitoring obligation on service providers "in a specific case" or "to apply duties of care [...] in order to detect and prevent certain types of illegal activities". It is unclear what is meant by a "specific case" and by "certain types of illegal activities". How "specific" does a monitoring duty need to be? Can it extend to all users, to all content and to all times without becoming a "general obligation"? It is also unclear what means the provider can be required to employ in order to "detect and prevent certain types of illegal activities". Can ineffective measures be required that can easily be circumvented? Can measures be required that have severe side-effects on legal use and on users' rights to privacy and free speech? Can measures be required that are unthinkable for similar off-line services, simply because they are technically feasible on-line?
 
 
All in all, the idea of private policing and infringement detection should be abandoned, as explained above.
 
 
==Question 53==
 
 
''Have you had any difficulties with the interpretation of the term "actual knowledge" in Articles 13(1)(e) and 14(1)(a) with respect to the removal of problematic information? Are you aware of any situations where this criterion has proved counter-productive for providers voluntarily making efforts to detect illegal activities?''
 
 
The interpretation of the term "actual knowledge" is controversial because it is unclear whether knowledge of the illegal nature of the content is required. If a provider is notified of allegedly illegal content, it often has no knowledge of whether that content is legal or not. For this reason a court ruling on the matter should be required, as explained above.
 
 
The knowledge criterion does not prove counter-productive for providers voluntarily making efforts to detect illegal activities, because those practices can be regulated in the provider's terms of service. Obviously, those contract terms must be subject to a fairness test under Directive 93/13.
 
 
In my opinion, private efforts to detect illegal activities should not be facilitated but, on the contrary, made to comply with the rule of law. The removal of content without the consent of its author should be banned unless ordered by the judiciary, after hearing the user. The law may provide for interim orders issued by the judiciary. Those orders should expire if not confirmed in the ordinary procedure after a certain period of time. Providers must not be allowed to remove content at their own discretion, because this has proven to have disastrous effects on the freedom of speech.
 
 
I quote the Joint Declaration of the OSCE Representative on Freedom of the Media and Reporters Sans Frontières on Guaranteeing Media Freedom on the Internet:<ref>https://www.osce.org/documents/rfm/2005/06/15239_en.pdf</ref>
 
 
'''A decision on whether a website is legal or illegal can only be taken by a judge, not by a service provider. Such proceedings should guarantee transparency, accountability and the right to appeal.'''
 
 
I quote the Joint Declaration on International Mechanisms for Promoting Freedom of Expression of the UN Special Rapporteur on Freedom of Opinion and Expression, the OSCE Representative on Freedom of the Media and the OAS Special Rapporteur on Freedom of Expression:<ref>http://www.article19.org/pdfs/igo-documents/three-mandates-dec-2005.pdf</ref>
 
 
'''No one should be liable for content on the Internet of which they are not the author, unless they have either adopted that content as their own or refused to obey a court order to remove that content.'''
 
 
==Question 57==
 
 
''Do practices other than notice and take down appear to be more effective? ("notice and stay down", "notice and notice", etc)''
 
 
I regret that this question aims at "effectiveness" only. Proportionality and fundamental rights must be the point of departure when considering the procedure for the removal of content.
 
 
==Question 58==
 
 
''Are you aware of cases where national authorities or legal bodies have imposed general monitoring or filtering obligations?''
 
 
German courts impose general monitoring or filtering obligations all the time. In order to detect potentially illegal content of a certain kind (e.g. Rolex imitations), they require providers, for example, to require user registration and confirm users' identity, to log all user actions, to filter all user-generated content for keywords, and to manually review all content generated by certain users or relating to certain topics. These requirements are far-reaching enough to make providers delete any remotely suspicious content and accounts, or even move to a country outside the EU.
 
 
==Question 59==
 
 
''From a technical and technological point of view, are you aware of effective specific filtering methods? Do you think that it is possible to establish specific filtering?''
 
 
Filtering is not effective. Several methods such as checksums, keywords, manual reviews, user identity checks or external links have been discussed in detail and discarded for various reasons.<ref>Breyer, http://www.daten-speicherung.de/index.php/verkehrssicherungspflichten-von-internetdiensten-im-lichte-der-grundrechte/</ref> If rights holders believe filtering to be effective, they are free to search the Internet and use such technology themselves in order to give notice to the provider of any infringement. Policing and the enforcement of private titles are not the job of intermediaries, but of the police and the courts. Intermediaries must be neutral in conflicts of interest.
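
As a hypothetical illustration of why checksum-based filtering, for example, is trivially circumvented: changing a single byte of a file produces a completely different hash, so a blocklist of known checksums no longer matches. A minimal Python sketch (file contents invented):

<pre>
# Sketch: a checksum blocklist misses a file after a one-byte change.
import hashlib

def sha256(data):
    return hashlib.sha256(data).hexdigest()

original = b"contents of a known infringing file"
blocklist = {sha256(original)}

modified = original + b"\x00"  # append a single byte

print(sha256(original) in blocklist)  # True:  the exact copy is caught
print(sha256(modified) in blocklist)  # False: the altered copy slips through
</pre>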
 
 
==Question 60==
 
 
''Do you think that the introduction of technical standards for filtering would make a useful contribution to combating counterfeiting and piracy, or could it, on the contrary make matters worse?''
 
 
The introduction of technical standards for filtering, or indeed of filtering at all, would be a catastrophe from a freedom of speech point of view. Instead, filtering should be prohibited.
 
 
I quote the Joint Declaration of the OSCE Representative on Freedom of the Media and Reporters Sans Frontières on Guaranteeing Media Freedom on the Internet:<ref>https://www.osce.org/documents/rfm/2005/06/15239_en.pdf</ref>
 
 
'''In a democratic and open society it is up to the citizens to decide what they wish to access and view on the Internet. Filtering or rating of online content by governments is unacceptable. Filters should only be installed by Internet users themselves. Any policy of filtering, be it at a national or local level, conflicts with the principle of free flow of information.'''
 
 
I quote the Joint Declaration on International Mechanisms for Promoting Freedom of Expression of the UN Special Rapporteur on Freedom of Opinion and Expression, the OSCE Representative on Freedom of the Media and the OAS Special Rapporteur on Freedom of Expression:<ref>http://www.article19.org/pdfs/igo-documents/three-mandates-dec-2005.pdf</ref>
 
 
'''Filtering systems which are not end-user controlled – whether imposed by a government or commercial service provider – are a form of prior-censorship and cannot be justified.'''
 
 
==Question 62==
 
 
''What is your experience with the liability regimes for hyperlinks in the Member States?''
 
 
The ECD liability exceptions are not being applied to hyperlinks by German courts. This has, for example, resulted in an on-line press publication being ordered to remove a hyperlink to the AnyDVD website, although that website can be found in seconds by using a search engine. The liability exceptions should be extended to cover hyperlinks.
 
 
==Question 63==
 
 
''What is your experience of the liability regimes for search engines in the Member States?''
 
 
The ECD liability exceptions are not being applied to applications for injunctive relief by German courts. The liability exceptions should be extended to outlaw prevention orders.
 
 
==Question 64==
 
 
''Are you aware of specific problems with the application of the liability regime for Web 2.0 and "cloud computing"?''
 
 
The operator of a weblog that allows user comments, or the operator of a wiki where users can post information, should be considered to be hosting user-generated content. Yet it is unclear whether article 14 ECD covers these services. The article should be clarified, or an amendment inserted, to make sure that user-generated content is covered by article 14.
 
 
==Question 67==
 
 
''Do you think that the prohibition to impose a general obligation to monitor is challenged by the obligations placed by administrative or legal authorities to service providers, with the aim of preventing law infringements? If yes, why?''
 
 
"Preventive duties" imposed by courts in specific cases violate article 15 of the e-commerce directive. Injunctions lead to a de facto obligation to monitor user-generated content, and thus amount to a general monitoring obligation. Service providers should not be obliged to prevent infractions at all.
 
 
==Question 69==
 
 
''Do you think that a lack of investment in law enforcement with regard to the Internet is one reason for the counterfeiting and piracy problem? Please detail your answer.''
 
 
I think that a "war on counterfeiting and piracy" can no more be won than the "war on drugs" that has been waged for decades. Despite all efforts, illegal drugs are today available more easily and cheaply than ever before. The same applies to counterfeiting and copyright infringement. Beyond the traditional powers of law enforcement, no other measure to contain counterfeiting and copyright infringement can be demonstrated to ultimately reduce these practices.
 
 
==Footnotes==
 
  
<references />
 
  
 
([http://www.daten-speicherung.de/data/Forderungen_Telemedienrecht_26-02-2009_publ.pdf More information in German])
