1
Updates to copyright law are generally seen as a technical matter, followed in detail by only a handful of copyright experts and industry figures. By way of exception, the new Directive on Copyright in the Digital Single Market (DSM) [1] [2] has become one of the most controversial legislative pieces in the European Union’s (EU) history. The opposing petition was signed over 5 million times, [3] which gives a sense of the interest sparked by the DSM. In addition, dozens of demonstrations were held in numerous European cities on March 25, 2019 to protest against the DSM. Several European Wikipedia sites even shut down for an entire day in protest against “dangerous copyright laws”. [4] Despite ferocious campaigning - led primarily by US tech giants Google/YouTube and Facebook, by artists opposed to the reform, and by internet freedom activists - the controversial DSM was approved by the European Parliament on March 26, 2019 and endorsed by the Council of the EU on April 15, 2019. It was published in the Official Journal of the EU on May 17, 2019 and entered into force on June 7, 2019 (article 31 DSM). Since it is a directive (not a regulation), EU Member States have two years to transpose the DSM into their national legislation, that is, until June 7, 2021 (article 29[1] DSM). The DSM is the first major update to EU copyright rules since 2001, nearly two decades earlier.
2
The DSM’s most substantial and controversial articles are article 15 (formerly article 11) and article 17 (formerly article 13). While article 15 DSM focuses on online news content and grants press publishers a new right over the use of their press publications by information society service providers (the “link tax”), this paper focuses on article 17 DSM. Article 17 makes online platforms and aggregator sites liable for copyright infringements if they do not take proactive steps to enter into licensing agreements with rightholders and to remove non-licensed material from their platforms. Although it is mainly aimed at Google/YouTube and Facebook, article 17 DSM will also impact other online content-sharing service providers (CSSPs), in particular smaller CSSPs, and has the potential to affect them far more negatively than it would the big tech companies. The main reason for this is that article 17(4) DSM places great emphasis on the adoption of automatic filtering systems and introduces a financial risk for CSSPs if they do not successfully implement such filtering systems. In addition, article 17(9) DSM requires CSSPs to operate an effective complaint and redress mechanism for users in the event of disputes over the disabling of access to, or the removal of, works uploaded by users. Article 17 DSM therefore not only requires CSSPs to guarantee the unavailability of non-licensed content; it also requires them to guarantee users’ limitations and exceptions, such as parody.
3
This paper is structured as follows: Section 2 will explain the rationale of the DSM’s new liability regime, while Section 3 will briefly outline the relation between article 17 DSM and the E-Commerce Directive (ECD), [5] and allude to some inconsistencies between these directives. Section 4 will describe why access to a strong upload-filter is essential for CSSPs to remain competitive in the new DSM era. In Section 5, this paper will examine why small CSSPs will not have access to filtering technology - unless a provider is able to develop a sufficiently sophisticated and competitive filtering technology and is willing to license it to small CSSPs for a reasonable fee - thus creating the risk that the DSM’s filtering requirement may substantially harm competition amongst CSSPs in the EU. In Section 6, this paper will examine why today’s content filtering technologies are subject to significant inherent limitations with regard to their accuracy, efficiency and affordability. Section 7 will describe what a CSSP will have to do to meet the requirement of having an effective and expeditious complaint and redress mechanism. Last but not least, Section 8 will describe why, in the new DSM era, larger CSSPs may gain a new competitive advantage over small CSSPs, which may lead to more market concentration, and whether small CSSPs can do anything to remain competitive.
4
The music industry has suffered great losses in past decades from declining CD sales and, more recently, declining electronic downloads. New digital business models, such as subscription-based services, have not been able to make up for this loss - whether through paid subscriptions to Spotify or Apple Music, Internet radio from Pandora, or videos on YouTube. [6] Big media companies and collective management organizations (CMOs) are at odds with digital music providers - especially free, ad-supported music services such as YouTube - for allegedly not returning significant revenue to the music industry. According to a Google report, from October 2017 to September 2018, YouTube paid more than USD 1.8 billion in ad revenue to the music industry. [7] This deal is, however, not lucrative enough for the music industry. The International Federation of the Phonographic Industry (IFPI) claims that for every USD 20 Spotify returns to the music industry, YouTube returns only one dollar. [8] Thus, the worldwide music industry is fighting for higher returns, in particular from tech giants like YouTube and Facebook.
5
In addition, the public perception and reputation of big tech companies have suffered greatly from their insufficient actions to fight the spread of hate speech, violent videos and copyright infringements on their platforms. They have suffered further from recent data breaches, the Cambridge Analytica scandal, the role of tech companies and social media platforms in recent political elections and, more generally, their alarmingly increasing dominance in certain markets. Today, more than ever, tech companies are subject to intense scrutiny from regulators and policymakers alike.
6
Moreover, safe harbor regimes are in turmoil, especially, but not only, in Europe. There seems to be a much wider global trend against safe harbors, one that aims to impose proactive monitoring and filtering obligations on Internet service providers (ISPs). Some authors argue that the introduction of article 17 DSM is rooted in the discourse about the “Internet threat”, which reflects a gradual shift in the perception of ISPs from “mere conduits” to “active gate-keepers” of content uploaded and shared by users. [9]
7
On a political level, the question of “the fair remuneration of authors and performers and of the difference in bargaining power when they license or transfer their rights” was raised in a communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions. [10] According to a further communication from the Commission, online intermediaries have a duty to provide a safe online environment to users by ensuring that illegal content is removed promptly and proactively, and they should adopt effective proactive measures to detect and remove illegal content online rather than merely reacting to the takedown notices they receive. [11]
8
These developments eventually led to the introduction of article 17 DSM, which, in such a context, is not surprising and may even seem overdue. The rationale of article 17 DSM is based on the assumption that online intermediaries operating on an ad-funded business model (e.g. YouTube, Dailymotion, Vimeo) - as opposed to companies operating on a subscription-based business model (e.g. Spotify, Apple Music) - do not obtain licenses from rightholders for the works they store on their platforms. With the goal of closing this so-called “value gap”, article 17(4) DSM forces CSSPs to make best efforts to enter into licensing agreements with rightholders, CMOs and big media companies. This requirement is meant to strengthen the negotiation power of rightholders, which should eventually lead to more favorable licensing terms for rightholders and media companies.
9
It is important to note that the very existence of the value gap is disputed, as there is no robust empirical evidence for it. The idea of a value gap was developed by the music and entertainment industry, which used the term in global music industry reports without any empirical support. [12] Different trade associations representing the music industry created the term “value gap” sometime around 2015 as a slogan. Since then, they have concertedly and constantly used this term in numerous public and government relations campaigns. [13] The Draft Directive’s Impact Assessment confirms the lack of evidence and states that “the limited availability of data in this area […] did not allow to elaborate a quantitative analysis of the impacts of the different policy options”. [14] The European Copyright Society also pointed out that the DSM’s proposal is not founded on any solid scientific - in particular economic - evidence. [15] Different reports come to the same conclusion and indicate that there is no clear evidence on the effects of copyright infringement in the digital environment, its scale, or the effectiveness of more aggressive enforcement strategies. [16] A report commissioned by the European Commission, which was released only upon the filing of an access request by the Pirate Party’s MEP Julia Reda, [17] states that there is no “robust statistical evidence of a displacement of sales by online copyright infringements”. [18]
10
There is no doubt that the policy goal of redistributing resources from big platforms to creators for the use of their works in the platform economy is well-intended. After all, the policy goal of the copyright system should be to allow the European public to enjoy creative content in all the ways made possible by digital technology, while fairly compensating creators. [19] However, it remains to be seen whether the implementation of the DSM in the EU Member States’ national legislation will have the desired effect, since it is uncertain whether the value gap actually exists. Even assuming that it does, it is still not clear whether article 17 DSM will have the desired effects. It is therefore questionable whether, in introducing article 17 DSM, the EU has chosen an appropriate means to achieve this policy goal, given its potential negative side effects. These effects include harm to smaller companies and to European competition amongst CSSPs in general, as well as harm to freedom of expression in particular due to the over-blocking of content, as will be highlighted in this paper.
11
The DSM’s new liability regime shall apply to CSSPs, which are, according to article 2(6) DSM, defined as providers “of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organizes and promotes for profit-making purposes”. This broad definition and, in particular, the term “large amount” create considerable uncertainty, leaving it to the courts to define what a “large amount” means. [20]
12
The second part of article 2(6) DSM states that “providers of services, such as not-for-profit online encyclopedias, not-for-profit educational and scientific repositories, open source software-developing and-sharing platforms, electronic communication service providers as defined in Directive (EU) 2018/1972, online marketplaces, business-to-business cloud services and cloud services that allow users to upload content for their own use” shall not fall under the DSM’s definition of CSSP. This excludes Wikipedia, open access repositories, and open source sharing platforms because they do not operate for profit. Internet access providers and telecom service providers are not covered by the definition because it is not their main purpose to give the public access to a large amount of copyright protected works. Also, online marketplaces such as eBay, whose main activity is online retail, are not covered for the same reason. [21]
13
In theory, the new regime for CSSPs should be coherent with and complement the ECD, which introduced, on a horizontal level, a framework of conditional liability enabling the development and functioning of online services in various forms. However, the emphasis on the adoption of upload-filters under article 17 DSM and the encouragement to invest in such technologies may well result in a clash with articles 14 and 15 ECD. [22] These articles provide safe harbor protection and prohibit Member States from imposing a general monitoring obligation on ISPs falling under one of three categories, i.e. mere conduit, caching and hosting. [23]
14
Under the ECD, ISPs qualify as hosting services according to article 14(1) ECD and are granted safe harbor protection as long as they meet the so-called neutrality test developed by the Court of Justice of the European Union (CJEU). [24] However, according to article 17(3) DSM, when a CSSP performs an act of communication to the public or an act of making available to the public – which will most likely always be the case, because article 2(6) DSM’s definition of CSSP already entails these acts – it shall no longer be protected by article 14(1) ECD. Accordingly, the ECD’s safe harbor protection will no longer apply to CSSPs covered by article 2(6) DSM.
15
The DSM’s monitoring obligation through the use of filtering clashes with article 15 ECD and will blur the line between active and passive hosting providers. This distinction was developed in detail in a number of cases by the CJEU, which has gradually associated the requirement of knowledge and awareness of copyright-infringing content with the active role of an intermediary, a role that goes beyond the neutral provision of services and the merely technical, automatic processing of data provided by the intermediary’s users. [25] The CJEU has also specifically recognized that monitoring obligations to prevent copyright infringement would violate the ECD. [26]
16
Therefore, article 15 ECD’s prohibition on Member States imposing general monitoring obligations on ISPs will be seriously compromised by the new obligations under article 17 DSM – and this notwithstanding the fact that article 17(8) DSM states that “the application of the provisions in this article shall not lead to any general monitoring obligation”. Indeed, this statement only makes sense if one argues that the monitoring obligation for CSSPs does not apply to “any content”, but only to content for which the rightholders have provided the CSSP with the necessary information. However, as Frosio quite logically observes, the introduction of any filtering technology “de facto imposes a general monitoring obligation as in order to filter unwanted content, all content must be monitored”. [27] Thus, the statement set forth in article 17(8) DSM is only true if CSSPs can obtain a license for all copyright-protected works from all rightholders. This is an unlikely scenario, since CSSPs will always host some amount of unlicensed content and will therefore need to ensure that such non-licensed content is not available on their platform by monitoring all content. In addition, as Bridy describes, the most prevalent filtering technologies “work by screening every piece of user-uploaded content in real time against that universe of works. No file escapes the system’s surveillance. If such functionality does not amount to general monitoring, it is hard to imagine what would”. [28] Thus, article 17 DSM will introduce a systemic inconsistency into EU law and effectively repeals article 15 ECD for storage providers covered by article 14 ECD.
4.1. Article 17(4) DSM’s “best efforts” requirement and its incentive to rely on filtering technologies
17
As outlined above, under the DSM’s liability regime, CSSPs are no longer protected by the ECD’s safe harbor regime, and article 17(3) DSM, by denying that protection to CSSPs that perform an act of communication to the public, forces them to take on a more active role. More precisely, to avoid liability, CSSPs shall, as a general rule, obtain a license from the rightholders for any content available on their platform (article 17[1] DSM). [29] If no authorization is granted, CSSPs shall be liable for unauthorized acts of communication to the public, unless they can demonstrate that the following three conditions of article 17(4) DSM are met. First, CSSPs must have made best efforts to obtain an authorization for the work from the rightholder (article 17[4][a] DSM). Second, CSSPs must have made best efforts to ensure the unavailability of specific works and other subject matter for which the rightholder has provided the service provider with the relevant and necessary information to locate the infringing works (article 17[4][b] DSM). Third, to prevent future uploads, CSSPs must have effective “notice and take down” as well as “notice and stay down” mechanisms in place, based on information provided by the rightholder (article 17[4][c] DSM).
18
The second condition (article 17[4][b] DSM) will serve as a major incentive for CSSPs to increase their efforts and improve their filters. Recital 66 of the DSM specifies that when assessing whether a CSSP has met the second condition’s requirement to make best efforts in accordance with high industry standards of professional diligence (article 17[4][b] DSM), “account should be taken of whether the service provider has taken all the steps that would be taken by a diligent operator to achieve the result of preventing the availability of unauthorized works […] taking into account best industry practices and the effectiveness of the steps taken […] as well as the principle of proportionality”. There will be specific edge cases in which CSSPs will not be required to use filtering technologies to guarantee the non-availability of non-licensed content (see below, Section 4.2.). This may help smaller companies in such specific situations. However, companies such as YouTube, Facebook and SoundCloud already apply filtering technologies and will not only continue to use these filters but will likely also increase their efforts around, and investments in, these technologies in order to improve them. This follows from the potential threat of article 17(4) DSM and specifically the requirement to make best efforts to take all the steps needed to prevent the availability of unauthorized works on their platforms. One can expect their improved filters to become the new industry standard. For instance, YouTube has invested over USD 100 million in a sophisticated upload-filter called “Content ID” [30] and YouTube’s CEO Susan Wojcicki has made a number of statements indicating that the biggest video hosting platform is in favor of pre-filtering content before making it available to the public. [31] This shows that, from the perspective of a CSSP like YouTube, the use of filters is already effective and proportionate and already reflects the steps a diligent operator would take to ensure the unavailability of specific works. Similar measures will also be expected from other CSSPs.
19
In sum, article 17(4)(b) DSM can only be understood as an obligation to filter and block such specific works with the use of filtering technology, because without filters its preventive measures cannot realistically be achieved. Therefore, CSSPs will likely invest heavily in their filtering technologies to ensure the unavailability of specific content and to avoid liability under the DSM. In the coming years, filtering technologies will thus likely become more prevalent and sophisticated. Smaller companies with tighter budgets will not have the means to make such investments, which may have negative consequences for them, as outlined in Section 5.
20
Article 17(5) DSM mentions certain factors to be weighed when considering whether a CSSP has satisfied the requirements of article 17(4) DSM, including the filtering requirement. The requirement to make best efforts in accordance with high industry standards of professional diligence must be interpreted in light of (i) the principle of proportionality, (ii) the type, the audience and the size of the service, and the type of works uploaded by the users, and (iii) the availability of suitable and effective means and their cost for service providers. In other words, if there are no suitable and effective means, or simply not enough financial resources, CSSPs may not have to filter content. [32] As a result, there will be certain situations - assessed on a case-by-case basis - in which (presumably small and less dominant) CSSPs will not be required to use filtering technologies to guarantee the non-availability of non-licensed content. However, these situations seem to be exceptional and come at the cost of certainty for CSSPs.
21
In addition, to account for start-up companies that leverage user uploads to develop new business models (recital 67 DSM), article 17(6) DSM offers an exception for small and young companies, which, under certain circumstances, are not required to apply filtering technologies to avoid liability. If a CSSP has existed for less than three years and its annual turnover is below EUR 10 million (these conditions are cumulative), the liability regime under article 17(4) DSM is limited to compliance with the requirements of (i) making best efforts to enter into licensing agreements with rightholders, and (ii) providing an efficient “notice and take down” system. However, if the CSSP’s number of monthly unique visitors exceeds 5 million, it must additionally put a “notice and stay down” mechanism in place. [33]
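To make the cumulative logic of this carve-out concrete, the following minimal sketch (illustrative only; the field and obligation names are invented, and the thresholds are taken from the description above rather than from the Directive’s exact wording) shows which article 17(4) obligations would remain for a given provider.

```python
from dataclasses import dataclass

@dataclass
class CSSP:
    years_since_launch: float       # years since the service became available in the EU
    annual_turnover_eur: float      # annual turnover in EUR
    monthly_unique_visitors: float  # average monthly unique visitors

def obligations_under_art_17_6(p: CSSP) -> set[str]:
    """Return the article 17(4) obligations that remain applicable under the
    start-up carve-out described above (illustrative sketch only)."""
    obligations = {"best_efforts_to_license", "notice_and_take_down"}
    new_and_small = p.years_since_launch < 3 and p.annual_turnover_eur < 10_000_000
    if not new_and_small:
        # Outside the start-up exception: full regime, including filtering.
        obligations |= {"best_efforts_to_prevent_availability (filtering)",
                        "notice_and_stay_down"}
    elif p.monthly_unique_visitors > 5_000_000:
        # New and small, but with a large audience: stay-down is also required.
        obligations.add("notice_and_stay_down")
    return obligations
```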
22
However, this measure is merely a drop in the ocean. A company that falls below the cumulative thresholds of article 17(6) DSM (i.e. annual turnover below EUR 10 million and less than three years of existence) is presumably not a competitor of a big tech company or even of an established CSSP. For instance, SoundCloud, a subscription-based music and podcast streaming platform from Berlin that hosts approx. 200 million song files, had an estimated annual turnover of over EUR 100 million in 2017 [34] and currently has an estimated 75 million visitors per month. [35] YouTube had an estimated revenue of over USD 8 billion in 2015 [36] and currently has an estimated 1.9 billion visitors per month. [37] A small or young company offering services related to user-uploaded content and thereby trying to compete with established CSSPs or big tech companies will - at the latest after three years of its existence, even if its turnover remains below EUR 10 million - fall under the direct liability regime of article 17 DSM and have to apply filtering technologies. In practice, the exception regime of article 17(6) DSM will thus rarely apply and will generally neither help start-up companies nor increase competition amongst CSSPs in the EU market. The thresholds are too low and cumulative, which makes it very difficult for a CSSP to be exempted from the filtering requirement.
23
As we have seen, article 17 DSM provides several strong incentives for companies to invest in the development of copyright filtering technologies. However, developing and maintaining an upload-filter requires significant resources. As mentioned, YouTube has already invested over USD 100 million in its proprietary “Content ID” filter, which uses a filtering technology called fingerprinting. [38] According to a 2016 Google report, “Content ID” handles 98% of copyright management on YouTube. For the music industry, as much as 99.5% of reported sound recording copyright claims were automated through Content ID. [39] To give an idea of the scale of content: 500 hours of video material are uploaded to YouTube every minute (or 82.2 years every day). [40] YouTube describes its filter as a technology that blocks videos which match items identified by a small, trusted group of rightholders. [41] Since “Content ID” is a proprietary filter, we cannot analyze the tool in detail and must rely on the (scarce) publicly available information about it. At any rate, in view of the significant number of YouTube users, the massive amount of content uploaded to the platform, and YouTube’s efforts and investments in its filter, there are many reasons to believe that, to date, “Content ID” may well be the most sophisticated upload-filter.
24
However, YouTube is not the only player to have developed filtering technology. Audible Magic, a US-based private company, is currently the only third-party provider offering content recognition solutions, which it licenses to universities and social media platforms. It has a growing list of social media clients, including, for instance, Facebook, Vimeo, Viacom, DailyMotion, and Tumblr. [42] YouTube initially licensed Audible Magic’s digital fingerprinting technology, but ultimately decided to build its own proprietary system. Confusingly, both companies now refer to their systems as “Content ID”, [43] which has led to an ongoing trademark dispute over this name. [44]
25
Smaller platforms in the EU have also made significant investments in the development of such filters. For instance, SoundCloud spent over EUR 5 million to build its own filtering technology and dedicates seven full-time employees - out of approx. 300 employees [45] - to maintaining it. [46]
26
The above shows that, contrary to claims made by proponents of article 17 DSM, copyright filtering tools are expensive both to develop and to maintain, and such costs may be too much to bear for startups and small companies. The European Commission’s policy rationale for proposing a mandatory filtering obligation (outlined in a leaked impact assessment) reflects a misunderstanding of the technological and economic realities of content filtering. In this impact assessment, the European Commission claims that the cost of filtering tools would be negligible for startups: “it is estimated that a small-scale online service provider can obtain such services for less than 900 euros a month”. [47] This estimate is almost solely based on comments submitted by Audible Magic to the US Copyright Office in a study about the effectiveness of the US Digital Millennium Copyright Act’s safe harbor provisions. [48] However, as Engstrom and Feamster explain, this estimate is accurate only for a very small number of CSSPs. [49]
27
According to Audible Magic’s website and its current pricing, a CSSP hosting fewer than 10,000 song files per month has to pay a monthly fee of USD 1,000 in order to license Audible Magic’s filtering technology for audio files only (video files have to be licensed separately at the same rate). [50] However, 10,000 song files is a very low volume for a CSSP. To put this in perspective, when SoundCloud was only five years old, users were already uploading twelve hours of audio content every minute (or two years of material per day). [51] Assuming an average song length of three minutes, this amounts to roughly 345,000 files per day, or more than 10 million files per month.
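The following back-of-the-envelope calculation (a rough sketch; the three-minute average track length and the 30-day month are assumptions) makes the gap between this upload volume and Audible Magic’s lowest pricing tier explicit.

```python
# Back-of-the-envelope comparison of upload volume against the
# 10,000-files-per-month pricing tier mentioned above.
HOURS_UPLOADED_PER_MINUTE = 12   # SoundCloud at age five (figure cited above)
AVG_TRACK_LENGTH_MIN = 3         # assumed average track length
DAYS_PER_MONTH = 30              # assumed month length

minutes_of_audio_per_day = HOURS_UPLOADED_PER_MINUTE * 60 * 60 * 24
files_per_day = minutes_of_audio_per_day / AVG_TRACK_LENGTH_MIN
files_per_month = files_per_day * DAYS_PER_MONTH

print(f"{files_per_day:,.0f} files/day, {files_per_month:,.0f} files/month")
# -> roughly 345,600 files per day and over 10 million files per month,
#    i.e. around a thousand times Audible Magic's lowest pricing tier.
```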
28
As such, developing or licensing a content filter comes at a high cost. A survey of CSSPs reported that medium-sized platforms engaged in file-hosting services paid between USD 10,000 and USD 50,000 a month in licensing fees just to use Audible Magic’s filtering tool. [52] Another source states that, for mid-sized streaming companies, Audible Magic quotes on average USD 30,000 to 60,000 per month in licensing fees. [53] It is worth noting, however, that the licensing fees paid to the third-party provider amount to only a portion of the total costs associated with fingerprinting software. As noted by Urban et al., licensed content filtering systems are not a turnkey service. Filtering systems require integration with existing systems and additional operational work, such as tracking and managing user appeals. [54] The platform of any CSSP must be altered or augmented to perform the fingerprint lookups and comparisons against a fingerprint database, which is in itself a substantial software integration task. [55] Under article 17(4) DSM, this is what a CSSP will potentially need to do in order to meet the requirement to make best efforts to ensure the unavailability of specific works for which the rightholders have provided the CSSPs with the necessary information (i.e. fingerprints and other information, which then need to be compared against the CSSP’s database). Thus, the cost for CSSPs of sourcing filtering technologies from third parties and implementing them in their internal systems is likely to be much greater in absolute terms than the European Commission’s initial projection.
29
The fact that article 17 DSM applies to “copyright-protected works or other protected subject matter uploaded by its users” poses a further challenge, because it is not restricted to specific categories of works, such as audio or audio-visual works, for which most current filtering technologies are designed, but extends to any kind of work. [56] This makes it even more difficult for a CSSP to filter all uploaded content efficiently and will create challenges for any CSSP applying article 17 DSM. Indeed, because rightholders can provide CSSPs with the relevant information about any kind of copyrighted work, including images or text files, a DSM-compliant filter will have to be capable of recognizing any kind of copyrighted content. However, fingerprinting tools are narrowly tailored to particular media types (for instance, an audio fingerprinting tool cannot be used to match copyrighted text files), and such tools only exist for a small subset of the many types of copyrighted content available online. [57] This will imply additional costs, since text, image, audio and video content must be separated and will likely each require a separate tool or technology (where a CSSP’s platform allows users to upload these different kinds of works).
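To illustrate the point, the sketch below (purely hypothetical; the matcher functions are stubs standing in for separately licensed, media-type-specific recognition tools) shows the kind of routing layer a CSSP would have to build so that each upload reaches a tool that can actually handle its media type.

```python
import mimetypes

# Hypothetical per-media-type recognition back-ends. In practice each would be
# a separately licensed tool or service; here they are stubs reporting "no match".
def match_audio_fingerprint(data: bytes) -> bool:
    return False  # stub: a real audio fingerprinting lookup would go here

def match_video_fingerprint(data: bytes) -> bool:
    return False  # stub

def match_image(data: bytes) -> bool:
    return False  # stub

def match_text(data: bytes) -> bool:
    return False  # stub

MATCHERS = {
    "audio": match_audio_fingerprint,
    "video": match_video_fingerprint,
    "image": match_image,
    "text": match_text,
}

def screen_upload(filename: str, data: bytes) -> str:
    """Decide what happens to a single upload: 'blocked', 'published' or 'needs_review'."""
    mime, _ = mimetypes.guess_type(filename)
    media_type = (mime or "").split("/")[0]
    matcher = MATCHERS.get(media_type)
    if matcher is None:
        # No recognition tool covers this media type: the CSSP can only fall
        # back to notice-and-take-down or manual review.
        return "needs_review"
    return "blocked" if matcher(data) else "published"
```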
30
An additional aspect of the requirement set by article 17 DSM is that the cost of filtering systems also makes it harder for young companies to attract investors and compete with incumbents. This is confirmed by a survey in the US and the EU, which indicates that a majority of investors would be “uncomfortable investing in businesses that would be required by law to run a technological filter on user uploaded content”. [58]
31
The effects of economies of scale also weigh in favor of large tech companies when it comes to the filtering requirement. Once a technological product is built, the cost of offering an additional unit decreases as scale increases. Economies of scale are a key advantage for large businesses: such businesses can afford to invest in expensive and specialized capital machinery, whereas it may not be viable or cost-efficient for smaller businesses to buy, license or develop specific technologies, thus creating an entry barrier for smaller companies where having the technology in question is a legal requirement. Large companies are also more likely to have a large workforce that can be assigned to separate tasks in order to boost productivity. [59] These factors all contribute to the fact that it will likely be big companies with sufficient financial and human resources and a large user base that will be in a position to develop a sophisticated content filter, rather than smaller players without such resources and with a much smaller user base. For instance, it seems doubtful that any company other than YouTube would reasonably be in a position to invest USD 100 million in a proprietary content filtering technology. And even if another company did choose to invest in the development of its own filter, the end product would presumably not be as powerful as YouTube’s “Content ID” and might not even be compliant with the requirements of article 17(4) DSM (which requires CSSPs to make best efforts in accordance with high industry standards of professional diligence and to take all the steps that would be taken by a diligent operator to achieve the result of preventing the availability of unauthorized works, taking into account best industry practices and the effectiveness of the steps taken).
32
The German Federal Commissioner for Data Protection and Freedom of Information, Ulrich Kelber, also concludes that smaller CSSPs will not be able to develop their own filtering technology and will have to license it from a third party. He raises an interesting point in connection with data collection. According to Kelber, the filtering obligation will ultimately create an oligopoly of a few providers of filtering technologies, through which more or less all the Internet traffic of the relevant platforms and services will run. These providers would thus receive and collect extensive data. [60] This would allow them to collect data about the users of their clients’ (i.e. other CSSPs’) platforms in addition to the data already collected in connection with their own platforms. At this point, however, it seems more likely that large tech companies will take advantage of their technological, structural and financial edge over smaller CSSPs and keep their more advanced filtering technology for themselves. This allows large CSSPs to gain market power instead of supporting their competitors by voluntarily licensing out their filtering technology to smaller CSSPs. However, it remains to be seen how powerful third-party providers like Audible Magic will become and how they will use the data collected from their clients.
33
In sum, it seems unlikely that smaller companies will be able to develop sufficiently strong content filters on their own, primarily due to their financial and structural disadvantage compared to larger companies. Hence, in the new DSM world, smaller CSSPs will have no choice but to license their filters from a third party – provided they can even afford to do so, given the likely high licensing fees that may be imposed upon them. Adding to the challenge, there is no guarantee that a third-party provider such as Audible Magic will be able to develop a content filter that meets the requirements of the DSM and is offered at an affordable fee. Whether this happens will mostly depend on whether the third-party provider can attract investors willing to inject funds into the improvement and development of its filtering technology. This, in turn, will depend on whether there is a sufficient pool of potential clients, i.e. CSSPs willing to invest a substantial amount of money and internal resources in order to license a third-party filtering technology.
34
As mentioned, upload-filters already play an important role for CSSPs today, and their role will only gain in importance under article 17 DSM. The weight that article 17 DSM places on upload-filters may, however, be somewhat overstated and may stem from the false impression that filtering technologies are more developed than they actually are. For instance, in a video for the European Commission, Audible Magic advertised the benefits and affordability of filtering technologies, [61] giving the (false) impression that such filters were efficient, accurate and affordable. As part of its lobbying efforts, Audible Magic stated that its technology is accurate to about 99%. Even if this statement is true, an algorithm that misidentifies about one in every 100 pieces of audio content raises a number of issues. To put the range of acceptable false positive rates into perspective, e-mail service providers consider any false positive rate higher than about 0.1 percent too high for spam filters, due to the potential limitations on speech that could arise from legitimate e-mail messages being misidentified as spam. [62] Thus, an accuracy rate of 99% might sound high at first glance, but on closer inspection it may well prove insufficient for a tech company filtering millions of files on a daily basis.
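A simple calculation (illustrative only; the one-million-uploads-per-day figure is a hypothetical round number, not a statistic from the sources cited above) shows how a 1% misidentification rate scales compared with the 0.1% benchmark used for spam filters.

```python
# Illustrative only: how a 1% misidentification rate scales, assuming a
# platform that receives one million uploads per day (hypothetical figure).
uploads_per_day = 1_000_000

for error_rate, label in [(0.01, "99% accurate filter"),
                          (0.001, "0.1% spam-filter benchmark")]:
    wrongly_flagged = uploads_per_day * error_rate
    print(f"{label}: ~{wrongly_flagged:,.0f} misidentified uploads per day")

# -> ~10,000 per day for a 99% accurate filter versus ~1,000 per day at the
#    spam-filter benchmark; each misidentification is a potential complaint
#    that must then be reviewed by a human under article 17(9) DSM.
```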
35
In their report examining the current state of content filtering technology, Engstrom and Feamster find that all tools currently available to identify potentially infringing material – from hash-based filtering to fingerprinting – are limited in their capacity to accurately identify infringements. [63] The authors observe that “critically, all content filtering technologies are at best capable of simply identifying the contents of a file, not making the often complex determination as to whether the use of a particular file constitutes an infringement”. [64] Filtering technologies have a role to play in the online ecosystem in identifying and removing infringing material. However, due to these limitations, the range of infringing activity that filtering tools can effectively address is rather narrow. Even for media types for which filtering tools already exist, the tools are only capable of matching content; they are not capable of determining whether or not the use of a particular work constitutes an infringement. This determination generally requires human intervention - by courts and legal practitioners - since it entails, depending on the applicable law, correctly applying copyright concepts such as fair use and fair dealing, as well as specific copyright exceptions such as parody or criticism. In the EU, the list of exceptions is already quite rigid and comprises twenty-one exceptions, nearly all optional, which describe exhaustively when a copyright-protected work may be lawfully used without the rightholder’s approval. [65] The lack of EU-level harmonization in relation to copyright exceptions and limitations makes it even more challenging for filtering systems to be effective, as it would require them to ascertain on a case-by-case basis the infringing nature of content in a geographically sound manner, namely taking into account the diverse existing national exception regimes. [66] For these reasons, it seems rather unlikely that in the near future an algorithm will be able to accurately identify copyright infringements, in the EU or elsewhere. This would, however, be necessary to avoid false positives.
36
Even works belonging to the public domain present challenges for upload-filters, as evidenced for instance by YouTube’s Content ID, which struggles to distinguish copyrighted material from works belonging to the public domain. A German music professor tested Content ID by uploading public domain recordings of copyright-free music pieces by Bartok, Schubert, Puccini, and Wagner, recorded before 1963 (and therefore in the public domain under German law). All of these pieces were blocked by Content ID and the professor had to appeal numerous takedown requests. [67] Another professor even uploaded a ten-hour video of white noise, only to have it flagged five times for copyright infringement. [68]
37
The above shows that filtering technologies are still in a rudimentary state: they are far from being able to identify copyright infringements and are not even capable of accurately matching content. There will therefore be (too many) false positives or, in other words, non-infringing content that will be (over-)blocked. How article 17 DSM addresses this issue will be analyzed in the following Section 7.
38
The new liability regime established by article 17 DSM could lead to a “shoot-first-ask-questions-later” effect. In other words, CSSPs will be tempted to over-block uploaded content and err on the side of caution by filtering too much rather than too little. Article 17(7) DSM addresses the issue of over-blocking by providing that the cooperation between CSSPs and rightholders shall not result in the prevention of the availability of works uploaded by users which do not infringe another’s copyright, including where such works or other subject matter are covered by an exception or limitation. In particular, Member States must ensure that users are able to rely on (a) quotation, criticism and review, and (b) use for the purpose of caricature, parody or pastiche.
39
Recital 70 of the DSM specifies that article 17(7) DSM “is particularly important for the purposes of striking a balance between, in particular, the freedom of expression and the freedom of the arts, and the right to property, including intellectual property.” In order to ensure that users receive uniform protection across the EU, those exceptions and limitations should, therefore, be made mandatory.
40
There appears to be a conflict between taking into consideration limitations and exceptions on the one hand, and the application of the filtering requirement on the other hand. At a minimum, this will be a difficult undertaking, given the risk of direct liability that CSSPs run if they do not proactively monitor their platform’s content by pre-filtering uploaded works. A diligent CSSP will thus naturally be tempted to block content to avoid liability, and given the mentioned technical limitations of upload-filters (which are not capable of recognizing copyright infringements, as described above under Section 6) this will unavoidably lead to over-blocking of content.
41
Since a significant amount of non-infringing content will be blocked, article 17(9) DSM states that CSSPs shall “put in place an effective and expeditious complaint and redress mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them”. Recital 70 of the DSM further sets out that the complaint and redress mechanism shall allow users to complain about the steps taken with regard to their uploads, in particular where they could benefit from an exception or limitation to copyright in relation to an upload to which access has been disabled or that has been removed. Complaints filed under such mechanisms should be processed “without undue delay” and - since upload-filters will most likely not develop a sense of humor and since algorithms may not be able to recognize a parody - be subject to human review, as specified by recital 70 of the DSM. As we have seen, platforms deal with millions of uploads per day and filters still make a substantial number of mistakes, so it seems unavoidable that the number of complaints to be dealt with will be huge.
42
As having recourse to humans rather than to automatic systems is far more expensive, CSSPs will try to minimize human intervention in this context. They can do so, for instance, by filtering all of the works for which they receive the necessary information from rightholders, without making any effort to decide in advance whether the works are in fact protected and not covered by exceptions or limitations to copyright. CSSPs could counterbalance this by accepting or reinstating an upload as soon as the uploader shows evidence that it is not protected or is covered by an exception or limitation. [69] However, such a process would lead to over-blocking, since rightfully uploaded content would be blocked until it is reinstated.
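A minimal sketch of this block-first, reinstate-on-complaint workflow is given below (hypothetical data model; the status values, field names and helper functions are invented for illustration), showing where the human review required by article 17(9) DSM would sit.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    BLOCKED = "blocked"
    PUBLISHED = "published"
    UNDER_REVIEW = "under_review"

@dataclass
class Upload:
    upload_id: str
    matched_reference: str | None   # rightholder reference the filter matched, if any
    status: Status = Status.PUBLISHED
    complaints: list[str] = field(default_factory=list)

def on_filter_match(upload: Upload) -> None:
    # Block first: no advance assessment of exceptions or limitations.
    upload.status = Status.BLOCKED

def on_user_complaint(upload: Upload, grounds: str, review_queue: list[Upload]) -> None:
    # Article 17(9): complaints must be processed without undue delay and be
    # subject to human review; in this sketch the content stays blocked meanwhile.
    upload.complaints.append(grounds)
    upload.status = Status.UNDER_REVIEW
    review_queue.append(upload)

def on_human_decision(upload: Upload, exception_applies: bool) -> None:
    # Reinstate only once a human reviewer accepts the claimed exception
    # (e.g. parody or quotation); otherwise the block stands.
    upload.status = Status.PUBLISHED if exception_applies else Status.BLOCKED
```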
43
Another way to respond to this new threat is to allow uploaders to certify in advance that their upload is not protected by another’s copyright (e.g. public domain) or covered by an exception or limitation. This could be balanced by blocking the material as soon as a rightholder refuses to provide such certification. [70]
44
The CSSPs could argue that these are the only available and suitable solutions in view of the associated costs and the principle of proportionality. How CSSPs will implement the requirement to put a redress mechanism in place will also depend on how aggressively big media companies fight online creators who make use of exceptions and limitations. Under the current regime, record labels engage in fierce battles against online teachers and video creators over every use of the labels’ works, even if only a few seconds of copyrighted material are used. [71]
45
In any case, the determination as to whether uploaded content is covered by an exception or limitation will very likely have to be made by a human, and some kind of internal review mechanism will have to be implemented in order to guarantee, in particular, the freedom of expression and the freedom of the arts. This comes at a high cost, and small CSSPs with tight budgets will necessarily feel the blow more than bigger companies with substantial financial resources.
8.1. Article 17 DSM harms smaller CSSPs more than the big tech companies and creates a bigger market for third-party filtering technology services
46
It is possible that article 17 DSM will strengthen the negotiation power of rightholders and allow them to obtain more favorable licensing terms, which, to some extent, may close the “value gap”. The tech giants that operate on an ad-funded basis (e.g. YouTube/Google, Facebook), at which article 17 DSM is mainly aimed, will incur losses as a result of more “rightholder-friendly” licensing terms. At the same time, these tech companies will have to invest in the development of upload-filters, given the DSM’s emphasis on the adoption of automatic filtering systems and the financial risks they face if non-licensed works remain available on their platforms. It is therefore likely that we will see enhanced filtering systems over the coming years. Additional human resources will also be needed to implement an efficient and expeditious internal complaint and redress mechanism. Big tech companies have the financial power, technological knowledge, and internal structure to develop their own sophisticated and competitive upload-filters and to maintain an efficient internal complaint and redress mechanism. They will thus be able to implement these new requirements and - even though these new measures will harm them financially - continue to run a successful business model in the EU.
47
By contrast, smaller EU companies will feel the blow of article 17 DSM much more strongly, because these measures are, relatively speaking, more expensive for them than for the big tech companies. Also, as many of these companies do not currently use filtering technology, the costs associated with these measures will be new to them. In an open letter from a coalition of 240 Europe-based online businesses to the members of the European Parliament, the signatories noted that “most companies are neither equipped nor capable of implementing the automatic content filtering mechanisms”. [72]
48
Instead of developing their own filtering technology, which is, as we have seen, very costly, small and mid-sized CSSPs could also license the filtering technology from a third party. YouTube does not license its Content ID to third parties, and the only obvious alternative is Audible Magic, which touts its services as “the industry standard” that is most often recommended by “the biggest names in music”. [73] As the only serious third-party provider of filtering technology and the first mover, Audible Magic will have several competitive advantages over new entrants to the filtering technology services market. Audible Magic also has a database of 10 million reference files, which, as Bridy observes, has already won the trust of the world’s corporate rightholders, further strengthening its first-mover advantage. [74] In addition, Audible Magic has a large patent portfolio covering automated content recognition and fingerprinting technology. [75] New entrants will first have to deal with these patents before being able to compete with Audible Magic. [76] It remains to be seen whether Audible Magic will be able to offer a DSM-compliant filtering technology at a license fee that is “reasonable” from the perspective of its EU clients.
49
The European market for filtering technology will potentially be large, as CSSPs will have to comply with article 17 DSM. Unless they are exempted under article 17(5) DSM, even relatively young (i.e. from three years of existence onwards) and relatively small (i.e. with an annual turnover just above EUR 10 million) companies will have to meet the requirements of article 17(4) DSM to avoid liability. In all cases, CSSPs will have to offer an efficient “notice and take down” regime, and once their monthly unique visitors exceed five million, a “notice and stay down” mechanism as well. Both require some kind of filtering technology that will have to be in line with “high industry standards”. Hence, in practice, article 17 DSM leads to a situation where nearly any company that offers content-sharing services will, at some point, require strong and sophisticated filtering tools in order to avoid the DSM’s liability regime. In other words, for a business to compete in this market and to offer a broad range of interesting (and non-infringing) copyrighted content, it will be crucial to have access to sophisticated filtering systems. Since these filters are expensive and technically limited with regard to their efficiency, accuracy and range of applicability, many companies will struggle in this regard because, presumably, the cheaper the systems are, the less effective they will be. [77]
50
A possible scenario is that the most sophisticated upload-filters will be in the hands of a very select number of companies (e.g. YouTube/Google, Facebook, Audible Magic), the only ones with the financial power, know-how and structure to develop such filters. These companies will either not voluntarily share their technology with competitors (YouTube/Google, Facebook) or will charge unreasonably high fees for it (Audible Magic). As a consequence, the question of mandatory licensing will arise.
51
A CSSP’s refusal to grant a license could constitute a violation of article 102 of the Treaty on the Functioning of the European Union (TFEU), which prohibits the abuse of a dominant position. For article 102 TFEU issues to arise in the context of a refusal to license intellectual property rights (IPR), there must first be an undertaking that enjoys a dominant position in a relevant product and geographic market. The question of whether the refusal to grant a license to a third party constitutes an abuse under article 102 TFEU has been considered in detail by the EU courts. They have consistently held that the refusal by a dominant company to license IPR to a third party amounts to an abuse in the sense of article 102 TFEU only in exceptional circumstances. We have seen such cases in Volvo v Veng, [78] Magill, [79] Bronner [80] and Microsoft. [81] At this point, it is too early to assess whether a CSSP could make a similar case based on article 102 TFEU, arguing that a refusal to grant a license for an upload-filter constitutes an abuse of a dominant position given the necessity of obtaining access to such technology. Whether an argument under article 102 TFEU could successfully be made will depend on many factors, such as whether upload-filters become essential to compete in a market, whether the technology is in the hands of a tech company with a dominant position, and whether a third-party provider such as Audible Magic is able to develop an affordable, sophisticated and DSM-compliant filtering technology.
52
Instead of traditional competition law solutions, a broader regulatory solution through a Fair, Reasonable and Non-Discriminatory (FRAND) access regime could be envisaged, as suggested by Heim and Nikolic. The authors propose using a FRAND regime to help secure access to critical infrastructure such as digital platforms and data, while ensuring effective competition and maintaining dominant platforms’ incentives to innovate. [82] This could be an interesting long-term solution. However, since it requires regulatory steps first, this approach would take time - time which, under the article 17 DSM regime, CSSPs will not have to gain access to filters.
53
At the current stage, it seems more obvious for a CSSP to try to invoke article 17(5) DSM and argue that it is not required to apply filtering technology (for instance because it is too expensive and therefore not proportionate). It will be important to observe how article 17(5) DSM works out for smaller CSSPs. At this point, however, relying on this provision seems to be a rather risky approach, as it contains a number of grey areas which will ultimately have to be clarified by the courts. For instance, the courts will have to define what type, audience and size of services and what types of works are likely to fall under article 17(5) DSM. This is currently very unclear, which is a problem for CSSPs hoping to find a way to avoid the filtering requirement. In any event, only in exceptional cases will a CSSP be able to benefit from article 17(5) DSM, and the provision will most likely not apply to a CSSP with a widely dispersed target audience.
54
If the legal regime does not provide a solution that gives every CSSP access to upload-filters for a reasonable fee, or exempts it from the filtering requirement, access to a sophisticated filtering system will become a market entry barrier or will push smaller companies out of the market entirely. This will hinder investment in, and innovation by, CSSPs in the EU.
55
As mentioned above (see Section 5), a survey amongst US and EU investors indicates that a majority of investors would be “uncomfortable investing in businesses that would be required by law to run a technological filter on user uploaded content”. [83] The issue is that investors are often precisely what young companies need to succeed. For this reason, and if the survey is any indicator, it seems unlikely that the next YouTube will come from the EU. Instead, we will likely see investments in companies offering content filtering technologies. In all of this, and perhaps unfortunately for the EU, the company most likely to benefit from the filtering requirement is US-based Audible Magic. Large US tech giants or Audible Magic may thus well end up with a monopoly on video, audio and other content filtering. The unintended consequence of the DSM would then be that such companies become even more powerful and collect extensive data about EU users, ultimately giving them the power to decide - to some extent - what can and cannot be posted online in the EU. This is precisely the kind of situation the EU wants to avoid.
56
The policy goal of redistributing resources from big US platforms to EU creators for uses of their works in the platform economy is undeniably well-intended. The positive impact that article 17 DSM might have for rightholders in the EU, however, comes at a price to be paid primarily by small and mid-sized EU CSSPs and by EU artists whose rightfully used works will be blocked due to over-blocking. There might soon be less competition for US tech companies in the EU, which will lead to greater market concentration among EU CSSPs. This situation is not in the interest of the EU and seems a high price to pay to try to close a gap that may not even exist.
* By Thomas Spoerri, attorney-at-law, LL.M. in Law, Science and Technology (Stanford), CIPP/E; email: thomasmspoerri@gmail.com . The author is grateful to Prof. Paul Goldstein and Jason Dumont from Stanford Law School, Evelyne Studer, Shivanghi Sukumar and to the JIPITEC reviewers for their valuable inputs and comments on earlier versions of this paper.
[1] All links to electronic sources were last accessed on August 22, 2019 (unless otherwise indicated).
[2] Formally the “Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC”, final version available at https://eur-lex.europa.eu/legal-content/EN/ .
[3] Petition available at: https://www.change.org/p/european- .
[4] James Vincent, European Wikipedias have been turned off for the day to protest dangerous copyright laws (March 21, 2019), The Verge, available at: https://www.theverge.com/2019/3/21/1827546 .
[5] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce) [2000] OJ L178/1, available at: https://eur-lex.europa.eu/legal-content/EN/TXT/ .
[6] With regard to the U.S. music industry, see for instance Ben Sisario and Karl Russell, In Shift to Streaming, Music Business Has Lost Billions (March 24, 2016), available at: https://www.nytimes.com/2016/03 ; Ben Beaumont-Thomas and Laura Snapes, Has 10 years of Spotify ruined music? (October 5, 2018), available at: https://www.theguardian.com/ .
[7] Google, How Google Fights Piracy, 21, available at: https://drive.google.com/file/d/0Bwxy .
[8] Roy Trakin, IFPI Report Finds Streaming Continues to Rise, YouTube Dominates Online Listening (October 9, 2018), available at: https://variety.com/2018/music/ .
[9] Giancarlo Frosio and Sunimal Mendis, Monitoring and Filtering: European Reform or Global Trend? (July 12, 2019), available at: https://ssrn.com/abstract=3411615 .
[10] Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Promoting a Fair, Efficient and Competitive European Copyright-based Economy in the Digital Single Market, COM (2016) 592 final (September 14, 2016), 7, available at: https://eur-lex.europa.eu/ , released in parallel with the DSM’s proposal.
[11] Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Tackling Illegal Content Online, Towards an Enhanced Responsibility of Online Platforms, COM (2017) 555 final (September 28, 2017), 10, available at: https://ec.europa.eu/transparency/regdoc/ .
[12] Giancarlo Frosio, To Filter or Not to Filter? That Is the Question in EU Copyright Reform (October 25, 2017), 36(2) Cardozo Arts & Entertainment Law Journal, 2018, 131 with further references, available at: https://ssrn.com/abstract=3058680 .
[13] See Annemarie Bridy, EU Copyright Reform: Grappling with the Google Effect, (n 6) at 2 et seq. with several references to these reports and campaigns, available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3412249 .
[14] Commission Staff Working Document, Impact Assessment on the modernisation of EU copyright rules, COM (2016) 301 final (September 14, 2016), PART 1/3, 136.
[15] General Opinion on the EU Copyright Reform Package, EUR. COPYRIGHT SOC’Y 5 (January 24, 2017), available at https://europeancopyrightsocietydotorg. .
[17] See Julia Reda, What the Commission found out about copyright infringement but ‘forgot’ to tell us (September 20, 2017), available at: https://juliareda.eu/2017/09/secret-copyright-infringement-study .
[18] Martin van der Ende et al., Estimating displacement rates of copyrighted content in the EU: Final Report 7 (European Commission, 2015).
[20] Dirk Visser, Trying to understand article 13, draft – work in progress (March 18, 2019), 4, available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3354494 .
[21] ibid.
[22] Maria Lillà Montagnani and Alina Trapova, New obligations for Internet intermediaries in the Digital Single Market – safe harbours in turmoil? Journal of Internet Law, Vol. 22 Issue 7, (January 1, 2019), 1, available at SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3361073 .
[23] Olivier Englisch and Giulia Priora, Safe harbour protection for online video platforms: a time to say goodbye? Rivista di diritto dei media (2019), available at: http://www.medialaws.eu/wp-content .
[24] CJEU, Joined Cases C-236/08 to C-238/08, Google France and Google (2010), § 112 ss.; CJEU, Opinion of the Advocate General, Case C-324/09, L’Oréal v eBay (2010), § 138 ss; CJEU, Case C-291/13, Papasavvas (2014), § 39 ss.
[26] L’Oréal v eBay (2010), § 139.
[29] While this paper does not further examine the licensing requirement of article 17(1) DSM, other authors have analyzed this topic in detail (for instance, Bridy [n [13]]).
[30] The numbers refer to the status as of 2018; it can be assumed that they have increased since then (see Paul Sawers, YouTube: We’ve invested $100 million in Content ID and paid over $3 billion to rightsholders (November 7, 2018), available at: https://venturebeat.com/2018/11/07 ).
[31] Andy, Article 13: YouTube CEO is Now Lobbying FOR Upload Filters (November 15, 2018), available at: https://torrentfreak.com/ .
[33] The DSM is silent with regard to the geographical scope that counts towards the 5 million visitors requirement. Absent any indication to the contrary, it has to be assumed that this refers to the number of visitors worldwide.
[34] Bruce Houghton, SoundCloud Revenues Up 80%, Tops $100 Million (January 29, 2019), available at: https://www.hypebot.com/ .
[35] Craig Smith, 16 Amazing SoundCloud Statistics and Facts (June 9, 2019), available at: https://expandedramblings.com/index.php/soundcloud-statistics/ .
[36] Artyom Dogtiev, Business of Apps, YouTube Revenue and Usage Statistics of 2018 (January 7, 2019), available at: http://www.businessofapps.com/data/youtube-statistics/ . According to analysts, this number could have increased to over USD 15 billion in 2018, see Adam Levy, YouTube Could Be a $15 Billion Business This Year (February 18, 2018), available at: https://www.fool.com/investing/ .
[37] See YouTube press material, available at: https://www.youtube.com/intl/en-GB/yt/about/press/ .
[38] The most commonly used filtering technologies are metadata searches, hash-based content filtering, and fingerprinting. Fingerprinting is technically more complex than the other two methods (for explanations of these filtering technologies, see Evan Engstrom and Nick Feamster, The Limits of Filtering: A Look at the Functionality & Shortcomings of Content Detection Tools (March 2017), 11-15, available at: https://www.engine.is/the-limits-of-filtering ).
[41] See YouTube Help Desk, available at: https://support.google.com/youtube/answer/2797370?hl=en .
[42] Audible Magic, Customers and Partners, available at: https://perma.cc/M27S-H45P .
[44] Audible Magic Press Release, Audible Magic Pursues Trademark Case Against Google (January 10, 2017), available at: https://www.audiblemagic.com/2017/01/10/ .
[45] According to Wikipedia ( https://en.wikipedia.org/wiki/SoundCloud ).
[47] Commission Staff Working Document, Impact Assessment on the Modernisation of EU Copyright Rules (draft) (2016), at 138.
[48] Comments of Audible Magic, U.S. Copyright Office, Section 512 Study, at 2 (2016).
[50] See Audible Magic official pricing list, available at: https://www.audiblemagic.com/compliance-service/#pricing (last checked on July 12, 2019).
[51] Evan Engstrom and Nick Feamster (n [38]) 22, referring to Janko Roettgers, SoundCloud Turns 5, Creators Now Upload 12 Hours of Audio Every Minute (November 13, 2013), available at: https://gigaom.com/2013/11/13/ .
[52] Evan Engstrom and Nick Feamster (n [38]) 22; Jennifer M. Urban et al., Notice and Takedown in Everyday Practice (March 22, 2017), UC Berkeley Public Law Research Paper No. 2755628, 64, available at SSRN: http://ssrn.com/abstract=2755628 .
[53] Mike Masnick, There Was Heavy Tech Lobbying On Article 13... From The Company Hoping To Sell Everyone The Filters (January 23, 2019), available at: https://www.techdirt.com/articles/ .
[56] For instance, Audible Magic is currently only offering filtering technology for audio and video files according to its pricing list (n [50]).
[58] Evan Engstrom, et al., The Impact of Internet Regulation on Early Stage Investment, (2014), available at: https://static1.squarespace.com/static/ .
[59] Examples taken from https://www.tutor2u.net/business/ .
[60] Federal Commissioner for Data Protection and Freedom of Information (Bundesbeauftragte für den Datenschutz und die Informationsfreiheit), press release: Reform des Urheberrechts birgt auch datenschutzrechtliche Risiken (“Copyright reform also entails data protection risks”) (February 26, 2019), available at: https://www.bfdi.bund.de/DE/ .
[61] Video available at: https://vimeo.com/198929871 .
[65] Maria Lillà Montagnani and Alina Trapova (n [22]) 4; Tito Rendas, Destereotyping the Copyright Wars: The 'Fair Use vs. Closed List' Debate in the EU (September 8, 2015), available at: https://ssrn.com/abstract=2657482 .
[67] See Karl Bode, Vice Magazine, “This Music Theory Professor Just Showed How Stupid and Broken Copyright Filters Are” (August 30, 2018), available at: https://www.vice.com/en_us/ .
[68] ibid.
[70] ibid.
[71] For an example of a fight between a record label and an online creator, see Julia Alexander, Youtubers and record labels are fighting, and record labels keep winning, The Verge, (May 24, 2019), available at: https://www.theverge.com/2019/5/24/ .
[72] Open Letter to European Members of Parliament from 240 EU Businesses Against Copyright Directive Art. 11 & 13 (March. 19, 2019), available at: https://perma.cc/VX2C-SAXC .
[73] Audible Magic, Copyright Compliance Service, available at: https://perma.cc/7ZWF-EC8B .
[75] Audible Magic website, patents, available at: https://www.audiblemagic.com/patents/ (“Audible Magic has been awarded 33 patents, and additional patents are pending with US and European patent offices. Patents are in the areas of digital fingerprint-based media detection technology; detection of content on media playing devices such as smart phones, televisions, video players, and other devices; identification of content as it flows across networks; and approaches to caching and indexing a reference database to improve the performance of the system.”).
[77] Christina Angelopoulos et al., “The Copyright Directive: Misinformation and Independent Enquiry” (June 29, 2018), available at: https://www.create.ac.uk/wp-content/ .
[78] Case 238/87 AB Volvo v Erik Veng (UK) Ltd, judgement of October 5, 1988.
[79] Joined Cases C-241/91 P and C-242/91 P RTE and ITP v Commission, judgement of April 6, 1995.
[80] Case C-7/97 Oscar Bronner GmbH & Co. v Mediaprint Zeitungs- und Zeitschriftenverlag GmbH, judgement of November 26, 1998.
[81] Case T-201/04 Microsoft v Commission, judgement of September 17, 2007. Microsoft did not appeal the judgement to the CJEU.
[82] Mathew Heim and Igor Nikolic, A FRAND Regime for Dominant Platforms, JIPITEC – Journal of Intellectual Property, Information Technology and E-Commerce Law (2019), available at: https://www.jipitec.eu/issues/jipitec-10-1-2019/4883 (date of last visit: August 18, 2019).