UGC and the Law
From a legal perspective, User Generated Content (UGC) can cover blogs, forums, video, photographs, audio, software code and other postings. UGC can encompass any content which users upload or post online. With such a wide array of UGC, there is an equally wide range of ways in which content on a website you own might contravene the law.
This is a summary of a presentation by Paul Massey at Internet World. It should not be interpreted as legal advice. Paul also presented at NMK’s Better Moderation seminar on 27/9/07, and the interest his session generated leads us to believe that this earlier article will be of interest to readers.
Copyright infringement is one of the most frequently mentioned legal issues in relation to sites like YouTube. Broadly, copyright infringement involves the unlicensed use of others’ material and protected performance rights. Copyright can refer to original literary, dramatic, musical or artistic works; to sound recordings, films, broadcasts or cable programmes, and even to the typographical arrangement of published editions. A single video could therefore potentially infringe a number of copyrights in a number of ways.
Defamation is another potential headache for site owners. In broad terms, this involves "the publication of a statement which tends to lower the claimant in the estimation of reasonable people" — something you could find happening in blogs and forums. A site owner may be able to avoid liability for defamatory content posted to a website by claiming a defence that the site was acting as a "mere conduit", as opposed to being liable as the author, editor or publisher of submitted content. However, Massey recommended that site owners monitor submitted content as far as possible, in order to be able to offer a defence that the site owner took "reasonable care" to avoid defamation occurring. If questionable remarks are found or brought to the site's attention, the advice is to "delete first, ask questions later".
Other legal headaches involve the violation of individuals' image rights (people's right to privacy), obscenity, contempt of court and harassment. Social networks have been particularly prone to harassment, and Massey suggested that such sites include a provision in their terms and conditions ensuring that users can be barred for such behaviour.
The discrepancies between the laws of different countries are also an issue to consider. The fact that a certain action is legal in your own country doesn't mean that a website operator will not be pursued legally elsewhere. This is particularly applicable to multinational sites. For example, the selling of Nazi memorabilia on eBay and Yahoo! created legal issues in Germany and France, despite the fact that the companies are primarily based in the US, where freedom of expression is enshrined in the constitution.
Moderation is an especially thorny issue affecting hosts of user generated content. In Europe the law is governed by the European E-Commerce Directive, implemented in the UK by the Electronic Commerce (EC Directive) Regulations 2002. This provides so-called "safe harbour" provisions for the providers of internet sites and services. Similar provisions exist in the US through the Digital Millennium Copyright Act. The provisions may protect site owners whose users upload content which infringes the rights of third parties, provided that they have no knowledge of the infringing material. However, such service providers do have a legal obligation to remove or disable access to information or files if they receive a take-down notice from a copyright owner. While the legislation does not impose a duty upon website owners to monitor the activity of users, the Napster trial in the United States held that site owners have a duty to police their sites to the fullest extent possible. Website operators must also avoid any activities which could be deemed to encourage or authorise illegal activity.
The Viacom versus YouTube trial in the United States could prove very important for other UGC sites. Viacom are requesting a number of remedies from the court, including: (a) a declaration of infringement, viz. an admission of guilt; (b) a permanent injunction requiring YouTube to employ reasonable methodologies to prevent or limit infringement of Viacom's copyright; and (c) maximum damages for past and present infringement, taking account of YouTube's profits (estimated to be at least $1bn).
Viacom allege that YouTube is not simply a ‘mere conduit’ of infringing material and that tools such as video embedding, tags, thumbnails and the "friends" features provide encouragement to infringe copyright or hinder its prevention. Viacom also allege that YouTube failed to react in a timely manner to take-down notices issued by Viacom’s lawyers.
Safeguarding Your Site
Massey recommends that site owners provide both "click-wrap" and "browse-wrap" licences on their websites. This means that users explicitly agree to the terms and conditions by clicking a button and are also subject to the site’s general terms and conditions linked to from each webpage. Terms and conditions ought to include:
A royalty-free, worldwide licence to store, reproduce, distribute and perform the UGC online (and in other formats if necessary, for example, mobile).
A waiver of moral rights.
Warranties and undertakings in relation to content submitted (e.g. no infringing material).
No obscene or defamatory material or material constituting an invasion of privacy.
An indemnity to protect the distributor against potential loss or damage.
A minimum age threshold – with consent from adults for minors to upload content. This could include age verification procedures, although the target audience is often minors.
Privacy provisions - if necessary including consent to transfer personal information abroad.
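The "click-wrap" gate and age threshold described above can be illustrated with a short sketch. This is purely hypothetical — the names (`User`, `MIN_AGE`, `can_upload`) and the threshold value are invented for illustration, and a real implementation would depend on jurisdiction and the site's own terms:

```python
# Hypothetical sketch of a click-wrap gate: uploads are refused until the
# user has explicitly accepted the terms (the "click" in click-wrap) and
# meets the age threshold, directly or via parental consent.
# All names and values here are illustrative, not a real platform's API.
from dataclasses import dataclass

MIN_AGE = 13  # example threshold only; the right value depends on jurisdiction


@dataclass
class User:
    age: int
    accepted_terms: bool = False   # set True only when the user clicks "I agree"
    parental_consent: bool = False # adult consent for a minor to upload


def can_upload(user: User) -> bool:
    """Allow uploads only if the click-wrap licence has been accepted and
    the minimum-age requirement is satisfied."""
    if not user.accepted_terms:
        return False
    return user.age >= MIN_AGE or user.parental_consent
```

The point of the explicit `accepted_terms` check is that consent is affirmative: merely browsing the site (the "browse-wrap" licence) is supplemented by a recorded click.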
Sites need to respond quickly to take-down notices, either removing the material altogether or, depending on the context, approaching the copyright owner to try to reach a commercial agreement for the content to remain online. Many Web 2.0 sites, such as digg and YouTube, enable users themselves to moderate the content of the site, reporting inappropriate or illegal content to the site owners. Other sites, such as CurrentTV, moderate incoming content before it is posted. Owners should weigh these approaches to moderation against their business model.
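The "delete first, ask questions later" advice translates into a simple workflow: on receipt of a notice, disable access immediately, then review (and possibly negotiate with the rights holder) afterwards. The sketch below is a hypothetical illustration of that ordering — the class and method names are invented, not any real site's system:

```python
# Hypothetical take-down workflow: a notice immediately disables access to
# the content ("delete first"); the item is then held for human review
# ("ask questions later"), after which it is either deleted outright or
# restored, e.g. if a licence is agreed with the rights holder.
class TakedownQueue:
    def __init__(self):
        self.live = {}      # content_id -> content currently accessible
        self.disabled = {}  # content_id -> (content, notice) awaiting review

    def publish(self, content_id, content):
        self.live[content_id] = content

    def receive_notice(self, content_id, notice):
        """Disable access as soon as a take-down notice arrives."""
        if content_id in self.live:
            self.disabled[content_id] = (self.live.pop(content_id), notice)

    def resolve(self, content_id, restore=False):
        """After review: restore the content (e.g. a commercial agreement
        was reached) or drop it permanently."""
        content, _notice = self.disabled.pop(content_id)
        if restore:
            self.live[content_id] = content
```

The design choice worth noting is that `receive_notice` takes effect without any review step in between — review happens only in `resolve`, after access has already been disabled.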
It's a good idea to provide a code of conduct in plain, understandable language alongside the terms and conditions, which may necessarily contain a certain amount of legalese. If online communities clearly understand the legal framework within which they are participating, then they are less likely to create legal problems for website operators by submitting illegal content.
Flickr is an excellent example here. On the subject of copyright, the terms and conditions state: "You agree to not use the Service to upload, post, email, transmit or otherwise make available any Content that infringes any patent, trademark, trade secret, copyright or other proprietary rights ("Rights") of any party." Its code of conduct, on the other hand, says that members should "Respect the copyright of others. This means don't steal photographs that other people have taken and pass them off as your own. (That's what favourites are for)."
Paul Massey is an associate in the Intellectual Property practice area at K&L Gates. Paul’s experience includes advising on patent and domain name disputes, trade marks, copyright, and design rights with a focus on telecommunications, digital technologies and e-commerce. firstname.lastname@example.org