By: Julia Kosheleva, Kate Rushton, Ryan Baker, Corey Hyllested, and Christine Petrozzo
Megaupload, a Hong Kong-based online file storage service, was founded in 2005 by Kim Dotcom, a resident of New Zealand. Until recently, the company provided cloud storage, or “cyberlockers,” to its 66.6 million users and deployed hundreds of servers throughout the world, including in the U.S., the Netherlands, and France. The company made money by selling advertisements and premium subscriptions. The basic function of Megaupload allowed users to upload files to “lockers,” where each file was stored on a company server. Users could then publish the name of the file and the locker URL on public blogs, from which anyone could download or stream the content. In its Terms of Service, Megaupload required its users to agree not to upload copyright-protected materials. The company also claimed that it took down unauthorized content whenever its registered DMCA agent was notified of such material by copyright holders during its years of operation. It also created a so-called “abuse tool” which allowed copyright holders to remove files.
In January 2012, a federal grand jury in Alexandria, Va., charged Megaupload with abetting criminal copyright infringement, claiming illegal distribution of at least $500 million worth of copyrighted music, TV shows, movies, video games, software, books, and images. The indictment claims that 90 percent of the content uploaded to Megaupload was infringing, and that Megaupload’s employees were well aware that the site was used for uploading infringing materials, as evidenced by internal email exchanges and chat logs. Moreover, internal communications indicated that employees understood the importance of this content to the success of the business.
Megaupload’s case is similar to Viacom v. YouTube in many respects, and it will raise similar legal questions. Does general awareness of illegal materials disqualify a service provider from DMCA safe harbor protection, or only knowledge of specific files? Does the DMCA provision speak to “willful blindness”? Is it conceivable that a service provider which receives financial benefits directly attributable to infringing activity could qualify for safe harbor? On the other hand, this case will expose additional complex legal issues, as it expands beyond U.S. borders and deals with a substantially larger volume of known infringing content.
The DMCA’s 17 U.S.C. § 512(c) provision specifies a set of requirements that a service provider must meet in order to be eligible for safe harbor immunity. The first of these requirements stipulates that the provider must not have actual subjective knowledge that the material or activity on its network was infringing, and that it must not be aware of facts or circumstances that would lead an objective, reasonable person to find a likelihood of infringing activity (as interpreted by the Second Circuit in Viacom v. YouTube).
In Viacom, the Court of Appeals clarified that disqualifying knowledge must be of “specific and identifiable infringements,” and that “mere knowledge” of the presence of infringing materials on the site is not enough to disqualify a provider from DMCA safe harbor. That court found several internal YouTube communications that suggested that staff members did indeed have awareness of specific cases of infringement, and may not have acted to remove them expeditiously. The court found that YouTube may have been ineligible for safe harbor protection, and remanded to the District Court for fact-finding.
Similarly, the government’s case that Megaupload employees had actual knowledge of specific infringement cases is supported by looking at certain site features and internal communications. One contention is that the site’s “Top 100” download lists were doctored to remove mentions of infringing material, and did not “actually portray the most popular downloads” (Ars Technica). This suggests that Megaupload’s staff may have had direct knowledge that a file was infringing and affirmatively hid it from view. The government also points to internal emails in which employees mention infringing files by name, with no indications of concern or plans to remove them. There are also allegations that employees themselves uploaded a copyrighted “BBC Earth” episode in 2008.
If these facts were presented to the same Second Circuit Court as in Viacom, the court would be highly unlikely to find that Megaupload was entitled to DMCA safe harbor protection, based on the “actual knowledge” provisions set forth in 17 U.S.C. § 512(c)(1)(A).
Under § 512(c)(1)(B) of the DMCA, a website service provider loses safe harbor if it receives a financial benefit directly attributable to infringing activity that it has the “right and ability to control.” In Viacom, the court examined two interpretations of that statutory phrase advanced by the defendant and the plaintiff: 1) YouTube’s argument that it could not control infringing activity it did not know about, and 2) Viacom’s argument that YouTube’s right to remove or delete any video on its servers was itself the ability to control infringing activity. The same arguments can be made for Megaupload, even though it is evident the employees of the company had actual knowledge of the infringing material. The Viacom court remained vague on the phrase, but suggested there needs to be “something more” than merely removing or blocking access to the content. The “something more”…? The court never specified, and the question remains open today, as the case has been remanded to the trial court for more fact-finding.
Although there’s still room for interpretation regarding the “right and ability to control,” it can be argued that YouTube directly benefited from the infringing content through its display advertisements. Similarly, it’s been reported that Megaupload generated more than $175 million in illegal profits through advertising revenue (Ars Technica). Until the lower court clarifies the phrase “right and ability to control,” disqualification from safe harbor benefits remains unclear for Megaupload.
According to § 512(c)(1)(c) of the DMCA, a website hosting user-generated content is offered safe harbor as long as it “responds expeditiously to remove, or disable access to, the material that is claimed to be infringing or to be the subject of infringing activity.” YouTube complied with this portion of the DMCA, promptly deleting videos when it received notice of infringement. In fact, Viacom notes that all of the videos disputed in the case have since been taken down. YouTube also developed the ContentID system, which allows copyright holders to scan through audio and video and automatically detect and flag content similar to their own.
In contrast, one of the strikes against Megaupload is that its “Abuse Tool,” allegedly designed to handle DMCA takedown notices, only removed links to infringing content, rather than the content itself. This loose interpretation of the “remove, or disable access to” requirement may prove to be its downfall, because the architecture of Megaupload is such that there may be many links to the same file. The reason for this is technical: if a user wanted to upload a file that Megaupload already had, the service would simply provide a new link to the existing copy rather than store the file again. However, this also means that removing a single link does not remove access to the infringing content, because other links to the same file still work. As a result, courts will likely find this to be non-compliance with safe harbor provision § 512(c)(1)(C).
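The deduplication scheme described above can be sketched as follows. This is a hypothetical model (Megaupload’s actual implementation is not public), but it captures the legally relevant behavior: identical uploads share one stored copy, each upload gets its own link, and taking down one link leaves the file reachable through the others.

```python
import hashlib

class Locker:
    """Toy model of a deduplicating cyberlocker (illustrative only)."""

    def __init__(self):
        self.blobs = {}   # content hash -> file bytes, stored once
        self.links = {}   # public link id -> content hash
        self._next = 0

    def upload(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)   # re-upload adds no new copy
        link = f"link-{self._next}"
        self._next += 1
        self.links[link] = digest             # ...but does mint a new link
        return link

    def takedown_link(self, link: str) -> None:
        # Models the "Abuse Tool": only the reported link is removed.
        del self.links[link]

    def is_accessible(self, digest: str) -> bool:
        return digest in self.links.values()

locker = Locker()
a = locker.upload(b"movie bytes")
b = locker.upload(b"movie bytes")   # same file: one blob, two links
digest = locker.links[b]
locker.takedown_link(a)             # a DMCA notice removes one link...
print(locker.is_accessible(digest)) # ...yet the file stays reachable: True
```

The sketch shows why removing links is not the same as removing content: the stored blob survives any single takedown as long as one link to it remains.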
As far as new developments arising out of the Megaupload case, Kim Dotcom is working on a few projects: Mega, a newer digital locker with more protection for users, and MegaBox, a music-sharing service that allows artists to sell their art and reap 90 percent of the revenue.
Mega, after Gabon shut down its planned me.ga domain, is slated to go live in January 2013 on the anniversary of the raid on the Megaupload servers. Mega, like Megaupload before it, is a cloud-based service that allows users to upload, access, and share files. The new service will also provide one-click encryption of files. The decryption key will be given to the user, and not stored by Mega. This prevents Mega from being able to review, or even be aware of, what files are uploaded to its servers. Another key difference is that Mega will not remove duplicate copies. Thus, ten uploads of the film “The Big Lebowski” create ten different copies, each encrypted with a different key. Kim Dotcom maintains this will not be a “middle finger” to Hollywood or the U.S. Department of Justice. Mega will also give content creators, such as film studios, access to a tool to remove files. Prior to having access to the tool, content creators must agree not to sue Kim Dotcom or the Mega service. The EFF’s Julie Samuels hinted it’s just the next iteration of a cat-and-mouse game on the Internet.
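The client-side encryption model described above can be sketched as follows. This is a simplified illustration, not Mega’s actual design: a toy XOR cipher stands in for real encryption, and the server is just a dictionary. The point it demonstrates is the one in the text: the key never leaves the user, so the server holds only opaque bytes, cannot inspect uploads, and cannot even tell that two uploads are the same film (hence no deduplication).

```python
import secrets

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher standing in for real encryption; illustration only,
    # not a secure design.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def client_side_upload(plaintext: bytes, server: dict):
    """Encrypt locally; the server stores ciphertext and never sees the key."""
    key = secrets.token_bytes(32)     # generated and kept on the client
    ciphertext = xor_stream(plaintext, key)
    file_id = f"file-{len(server)}"
    server[file_id] = ciphertext      # server receives only opaque bytes
    return file_id, key

server = {}
id1, key1 = client_side_upload(b"The Big Lebowski", server)
id2, key2 = client_side_upload(b"The Big Lebowski", server)

# Different random keys yield different ciphertexts, so the server cannot
# match the two uploads to each other or to any known work.
print(server[id1] != server[id2])                            # True
print(xor_stream(server[id1], key1) == b"The Big Lebowski")  # True
```

Because decryption requires a key the service never stores, a takedown regime based on the provider scanning its own servers cannot work here; only someone holding the key, or the content-removal tool mentioned above, can act on a specific file.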
MegaBox, the not-yet-released music service, will allow users to purchase music, or to install a tool that will replace up to 15 percent of their Internet ads with ads provided by MegaBox.