The power is down?
The disruption of traditional cultural dissemination in a decentralized p2p world
Prepared 23 April 2009 for Prof. Rick Salutin (UNI221H1S, University of Toronto).
————— —— —————
In this paper:
Introduction
Contesting the lowest common denominator
Synergy muscles flex: conglomerate revanchism over de-centrality
The “got no uncensored media conglomerate blues”: effects of data tampering
Conclusion: the destabilizing nature of a new medium quashed
References
Introduction
Not terribly long ago, entertainment industry concerns began decrying the emerging practice of online file sharing amongst consumers. Their refrain was familiar to anyone who remembered the “home taping is killing the industry” threats that predated the World Wide Web. The industry’s blanket argument was simple, albeit misleading: file sharing hurt artistic talent — namely, the talent it represented. Rather than defending file sharing and the internet’s unique ability to make cheap, “viral”, word-of-mouth promotion possible regardless of geography, the industry took legal action against suspected file sharers, even without direct evidence that the accused were specifically involved (McBride 2007, B4). Instead of convincing the public that file sharing was wrong on ethical or moral grounds, the campaign backfired as consumers realized there was more than one way to preview content of interest.
Then content portals like YouTube emerged, and users uploaded material for other users to consume under the auspices of fair use. The industry furiously objected, bringing lawsuits against the portals and demanding to know the identity of every person who had viewed uploaded content (Spangler 2008, 58). Apple then found that consumers were willing to set aside file sharing and online portals and pay for content so long as it was easily available, downloadable, and purchasable on a per-unit basis (Cherry 2004, 56). Consumers also found other ways to distribute content through distributed, peer-to-peer (p2p) protocols like BitTorrent, which evolved to use encryption like that employed for web banking. The entertainment industry objected, but the encryption rendered intercepted data opaque: it was impossible to tell what was inside.
That is, until another dispute, which at first seemed unrelated, emerged as internet service providers, or ISPs, began restricting subscriber internet access. They argued that subscribers who fully utilized the bandwidth of their paid service plans were overwhelming infrastructure and “hurting” light-use subscribers. Despite the logical problem with this argument (email and web surfing would never need so much bandwidth as to hinder performance!), these ISPs began blocking access based on the type of data transferred instead of simply eliminating high-bandwidth plans. Blocking required a unique, invasive technology that has since come under heavy debate. Coincidentally, these ISPs are not independent companies, but holdings of the same major entertainment conglomerates that objected to copyrighted content being shared online. Do ISPs have a vested right to know what subscribers are transferring? Does this violate subscriber privacy? Is this an inherent business flaw, as one conglomerate’s holding enforces policy at another holding? Or is this an attempt to re-centralize cultural content in an inherently decentralized reality?
Contesting the lowest common denominator
What is striking about these separate hostilities is how both relate to the internet’s disruptiveness as an entirely different information medium — one built on decentralized, unregulated, and inherently open principles (Cooper 2003, 196). They also highlight the conflict of interest between that open medium and the agency of centralized media distribution: the conglomerates that “converged” as the World Wide Web emerged invested heavily in hyper-centralizing the management, production, distribution, dissemination, and control of media content as a way to improve quality (Scatamburlo-D’Annibale and Boin 2006, 239). As companies merged, these convergences assumed a tighter grip over intellectual content, stripping artists of leverage over the fate of their work. Control, distribution, and cross-promotion are now regularly handled within the same conglomerate, narrowing the availability of catalogue offerings. This type of business rationalization, known as “hit-driven economics”, aggressively pushes high-yield (read: monetized) content to maximize revenue on a few lowest-common-denominator assets, to the detriment of a wide variety of lesser-known work (Anderson 2004, 171).
Consumer patience wore thin with this hit-driven consolidation trend. By the later 1990s, recorded music sales began falling as consumers chose alternative venues to discover new content (Legrand 2000, 86). As internet connection speeds improved and file sharing technology emerged, it became possible to side-step centralized distribution outright. This was also when established artists, weary of having so little leverage over their work, began trying to regain ownership of their original productions (Kennedy 2000, 503). This anomie of neglect by entertainment conglomerates set the stage for p2p file sharing, open-source software, and Creative Commons share-attribution (in lieu of traditional copyright) to emerge as decentralized alternatives to the logic that centralized media control was the only viable way to disseminate creative content or conduct the business of monetizing cultural content.
Synergy muscles flex: conglomerate revanchism over de-centrality
Bandwidth “traffic-shaping” — or throttling — by ISPs is controversial because it attenuates data transfer rates and discriminates against subscribers who use their allocated bandwidth. Major ISPs like Bell Canada, Rogers, and (in the U.S.) Comcast defend throttling on the grounds that “heavy” users degrade the network experience for “light” users (Rau 2008, 12). Their solution, however, is counterintuitive. Rather than reallocate infrastructure so that fewer subscribers are connected to a particular data node (like a neighbourhood switching box), the ISPs employed a new product to effect selective bandwidth throttling — blocking data based not on a subscriber’s total rate or volume, but on the type of data. Detecting the type requires special interceptive software called deep-packet inspection, or DPI (Riley and Scott 2009, 3). DPI can theoretically be used to improve the quality of data transfer if configured benignly, but ISPs instead employed it to rifle through subscriber data and determine what kind of content was being transferred. It is analogous to a messenger opening and reading envelopes en route to an addressed recipient: if the messenger objects to the contents, the envelope goes into a bin, and the recipient never receives it or any further deliveries from the same sender. Opponents of DPI technology include privacy advocates like the Electronic Frontier Foundation (EFF) and even lawmakers in the U.S. Congress (Wilson 2008, ¶1–2). Opposition to DPI centres on how it violates the doctrine of Net Neutrality, a founding cornerstone of the internet which has “allowed users to travel anywhere on the internet, free from interference . . . [and] has been a foundation of communications law and policy for decades” (Riley and Scott 2009, 3). Were DPI applied to postal mail instead of data packets, it would be punishable as a criminal offence, yet ISPs continue to defend and use it.
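To make the mechanism concrete, the following is a minimal, purely illustrative sketch (in Python) of how signature-based traffic classification of the kind described above can work: it checks a packet’s payload for a well-known plaintext application signature and applies a hypothetical throttling policy to matching flows. The signature list, the rate policy, and the function names are assumptions made for illustration; commercial DPI appliances are far more sophisticated and are not documented in the sources cited here.

    # Illustrative only: a toy, signature-based "deep packet inspection" classifier
    # and a hypothetical throttling policy. The signatures, the 10% rate cap, and
    # the function names are assumptions for this sketch, not any ISP's actual system.

    BITTORRENT_HANDSHAKE = b"\x13BitTorrent protocol"  # well-known plaintext handshake prefix
    HTTP_METHODS = (b"GET ", b"POST ", b"HEAD ")

    def classify_payload(payload: bytes) -> str:
        """Guess the application from the opening bytes of a packet's payload."""
        if payload.startswith(BITTORRENT_HANDSHAKE):
            return "p2p"
        if payload.startswith(HTTP_METHODS):
            return "web"
        return "unknown"  # encrypted traffic typically lands here, defeating the signature check

    def shaped_rate_kbps(payload: bytes, plan_rate_kbps: int) -> int:
        """Return the bandwidth actually granted to this flow under the hypothetical policy."""
        if classify_payload(payload) == "p2p":
            return plan_rate_kbps // 10  # throttle identified p2p flows to a tenth of the plan rate
        return plan_rate_kbps            # all other traffic receives the advertised rate

    if __name__ == "__main__":
        packet = BITTORRENT_HANDSHAKE + b"\x00" * 8
        print(classify_payload(packet), shaped_rate_kbps(packet, 4000))  # -> p2p 400

The “unknown” branch also hints at why encryption frustrated this approach: once a payload is encrypted, the plaintext signature disappears and the traffic can no longer be classified by simple inspection.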
Why were these ISPs so heavily invested in throttling subscriber data by intercepting it first?
Two issues challenge the logic of throttling at all, let alone doing so invasively. First, when subscribers order a service plan for a level of bandwidth (say, a four-megabit-per-second — 4 Mbps — connection rate), it is understood that the ISP, in good faith, is supposed to deliver that rate within technical limitations, notwithstanding incidental interference (such as a subscriber’s distance from an ISP’s switching station or wiring flaws inside a building) (de Souza et al. 2007, 1). Second, ISPs arbitrarily predicted “suitable” subscriber bandwidth usage rates based on how most people use the internet. This usage rate typically falls well below the maximum bandwidth promised by a subscriber’s service plan. ISPs forecast that if everyone uses this “suitable” average rate simultaneously, demand will still fall within their network’s capabilities. In other words, ISPs overbook — or “oversubscribe” — network capacity the way airlines overbook flights to maximize revenue (Bradner 2008, 25). Should every subscriber use their maximum bandwidth rate simultaneously, the network would be overloaded. This actually occurred on the extraordinary morning of 11 September 2001, when simultaneous usage crashed networks (Welsh and Culler 2003, 43).
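As a rough, back-of-the-envelope sketch of what oversubscription means in practice (the subscriber count, plan rate, and uplink capacity below are invented for illustration and are not figures from the ISPs discussed):

    # Hypothetical numbers only: how an oversubscribed neighbourhood node adds up.
    subscribers = 500              # households sharing one data node
    plan_rate_mbps = 4.0           # advertised per-subscriber rate
    uplink_capacity_mbps = 200.0   # assumed capacity of the shared uplink

    # Worst case: every subscriber uses their full plan at once.
    peak_demand_mbps = subscribers * plan_rate_mbps                    # 2000 Mbps
    oversubscription_ratio = peak_demand_mbps / uplink_capacity_mbps   # 10:1

    # What the ISP actually plans around: a forecast "typical" usage rate.
    typical_rate_mbps = 0.3
    expected_demand_mbps = subscribers * typical_rate_mbps             # 150 Mbps, fits comfortably

    print(f"worst case: {peak_demand_mbps:.0f} Mbps ({oversubscription_ratio:.0f}:1 oversubscribed)")
    print(f"forecast:   {expected_demand_mbps:.0f} Mbps of {uplink_capacity_mbps:.0f} Mbps capacity")

So long as actual behaviour stays near the forecast, nobody notices; the moment a sizable minority of subscribers approaches the advertised rate, the shared uplink, not the individual plans, becomes the bottleneck.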
North American ISPs remain loath to substantially upgrade infrastructure capacity unless another party is willing to finance it — such as by government subsidy — as it “requires extensive focus on research and development and being willing to take financial risk . . . Corporations that have to answer to stockholders . . . and strict profit margins are rarely eager to allocate funds to projects with no immediate pay-off” (Papacharissi and Zaks 2006, 73). This disregards Moore’s Law of computing — that computer power doubles roughly every two years — and its impact on network growth and demand. It also ignores innovations that can expand networks without adding more physical infrastructure. Johnson (2008b, 24) contended that subscriber usage evolves dynamically much the way biological systems do, in that needs grow and expand over time as life itself does. Cisco, a network systems company known for designing architecture for the internet, put it pithily: “Today’s ‘bandwidth hog’ is tomorrow’s average user” (2008, 12).
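To put that doubling rule in numbers, the following is a purely illustrative projection under the paraphrase of Moore’s Law given above; the starting per-subscriber figure and the time horizons are arbitrary assumptions, not data from the sources cited.

    # Illustrative projection only: assume demand doubles every two years,
    # echoing the doubling rule paraphrased above. The starting value is arbitrary.
    def project(value: float, years: float, doubling_period: float = 2.0) -> float:
        """Grow `value` assuming it doubles every `doubling_period` years."""
        return value * 2 ** (years / doubling_period)

    base_demand_mbps = 1.0  # assumed per-subscriber demand today
    for horizon in (2, 6, 10):
        print(f"in {horizon:2d} years: ~{project(base_demand_mbps, horizon):.0f} Mbps per subscriber")
    # -> roughly 2, 8, and 32 Mbps: today's "bandwidth hog" becomes tomorrow's average user.

Even a modest starting point grows by more than an order of magnitude within a decade, which is the dynamic Cisco’s remark captures.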
When subscriber throttling began this decade, ISPs first denied it was happening, even as complaints persisted that throttling particularly impacted users of p2p applications, from file sharing to gaming. Then, confronted with demonstrative evidence that it was indeed occurring, the ISPs defended the practice, contending that subscribers who used substantial bandwidth were de facto abusing it with p2p software to exchange copyright-protected cultural content (Geist 2007, D5). This defence was stunning, because it amounted to a confession of privacy intrusion by an intermediary trusted to benignly deliver data services to subscribers. DPI’s ability to pry open a packet of data and determine that it came from a p2p source was unprecedented. The admission also ignored subscribers who regularly viewed high-bandwidth cultural content (e.g., TV shows, user-made videos, movies) from services like YouTube, Apple, or Hulu. Such services were precisely the high-bandwidth sources that ISPs claimed would hurt their networks.
But the admission that DPI was being used to discriminate against certain subscribers also ran counter to a successful growth model (Johnson 2008a, 25). These ISPs could have looked to high-bandwidth users to gauge how the average subscriber would use future services. An early increase in YouTube viewers, for instance, might reliably foreshadow that many more subscribers would later use YouTube or similar services (Johnson 2008b, 24). Admitting to DPI use suggested otherwise: the offending ISPs, situated within major entertainment conglomerates, were more interested in complying with dictates from other holdings within the conglomerate, even if doing so violated subscriber privacy, compromised freedom of speech, or hampered competitive new businesses requiring large bandwidth to provide their services.
The public backlash echoes a chilling of the Net Neutrality doctrine that people had previously taken for granted. In the U.S., Comcast was directed by the FCC to cease and desist from throttling with DPI (Goth 2008, 1). In Canada, independent ISPs vehemently objected, formalizing a complaint against Bell’s use of DPI and throttling. As resellers of capacity from infrastructure owners like Bell, they were beholden to Bell’s “natural infrastructure monopoly, which had initially been created using taxpayer funds” — placing Bell at an unfair competitive advantage (CBC 2009, ¶12). Several internet concerns, including Google, openly challenged Bell’s tactic, arguing that they “recognize that sometimes ISPs will need to manage traffic over their networks, [but] you can’t do that in a manner that’s specific to certain applications or content. When you do that, you undermine innovation online” (Sorensen 2008, B1). Once Bell started using DPI to throttle not only its own subscribers but also third-party resellers, it entirely negated any competitive performance guarantees those third-party ISPs had offered to their subscribers.
In 2008, the CRTC reviewed the complaint but arrived at a completely different decision from the FCC’s Comcast ruling. It ruled that Bell could continue using DPI to throttle bandwidth both to individual subscribers and to resellers offering independent ISP services (Hartley and Avery 2008, B2). Not only was Bell affirmed in using privacy-invading techniques to discriminate against users, but the CRTC had also eliminated its effective competition in the DSL market. Rogers, meanwhile, created generic monthly data transfer limits and blocked certain computer ports from transmitting data on its network. While restrictive, this policy remained blind to the content being transferred. Rogers then expanded its restrictions to all encrypted data, even for email services like Gmail and virtual private networking, or VPN — used by governments and businesses to guarantee secure, remote connections for internal business (Geist 2007, D5).
The “got no uncensored media conglomerate blues”: effects of data tampering
The conflict of interest between providing cultural content and providing data access services now tilts in favour of major ISPs that use discriminating technologies like DPI. It is doubly problematic for consumers, because ISPs possess a direct financial stake in — or are part of — entertainment conglomerates that produce or distribute cultural content in related industries. These conglomerates control significant shares of cultural products through their entertainment subsidiaries. In Canada, this includes Quebecor, CanWest, CTVglobemedia (of which Bell Canada holds a 15 percent stake), and Rogers, whereas in the U.S., Time Warner, Comcast, and Cox Communications predominate (Shade 2006, 355–7). It means ISPs can practise content favouritism they believe will bolster bottom lines, such as Time Warner Cable giving delivery priority to CNN content, which in turn promotes a Warner film in a feature story.
Likewise, unauthorized content could be blocked, or worse, sanctions placed on subscribers. This is particularly troublesome when fair use is considered. For example, is it illegal for a university student to download a movie to watch for personal use when the student owns the DVD (and thus a licence) but left it behind at home? Is it fair use when someone downloads or uploads commercially discontinued or regionally inaccessible cultural content for the sake of accessibility, even when the content’s creators support such decentralized, “viral” promotion? Since the copyright holder, the company, declines to offer the content as a saleable commodity, is the company harmed when both the content’s creator and consumers circumvent it entirely in order to make the cultural transaction possible? Does such company obstruction amount to collusion against lesser-performing cultural creators, thus raising free-speech questions? And if a major ISP reserves unobstructed bandwidth for its affiliated companies’ web sites and content, does this disadvantage independent content creators (or cultural content from competing conglomerates)?
Conclusion: the destabilizing nature of a new medium quashed
While seemingly tedious, these concerns are neither academic nor trivial. DPI technology is improving to identify distinguishing traits within data packets ever more thoroughly — even to the point of identifying an entire file’s content from a single piece of intercepted data — and the discrimination it enables threatens not only innovation, but also open cultural exchange.
In a sense, the non-hierarchical nature of the internet comes closer to what Innis vaunted as the oral tradition and a remedy to “the problem of space”. In the absence of a face-to-face oral tradition, other mediums like the printed word and broadcasting dominated in state societies: “The concept of the state as an economic factor has become an indication of economic activity. Without religion as an anchorage the state has become more dependent on cultural development” (Innis 1951, 130). He was referring to cultural development in mature and immature state societies and the effects of written text, such as print. But print media are confined by design: open, collaborative communication cannot happen with static, monological texts such as books or recorded documents (even though annotation is a partial workaround). Text can also be used in a hegemonic capacity. The internet, despite hierarchical structures arising within it, is by design a non-hierarchical, continuous conversation that fosters real-time, dialogical communication irrespective of distance.
The implications of this medium are substantial, as it stands to upend and redefine power relationships. Stripping away the decentralized dimension of the internet — and access to that neutrality — poses the greatest threat to whether its neutral, open architecture can endure in its core form. Even if it does not, the collective social knowledge wrought from experience with the internet’s capabilities is certain to impress itself on whatever future mediums supersede it.
References
Anderson, Chris. 2004. The long tail. Wired, 12(10) October: 170–77.
Bradner, Scott. 2008. Broadband pricing: solutions falling short. Network World, 25(25) 23 June: 25.
CBC News. 2009. Bell defends plan to meter billing for wholesale ISPs. CBCnews.ca, 15 April. Retrieved 15 April 2009.
Cherry, Steven. 2004. Selling music for a song: online music stores make at most a dime per track. Where does the money go? IEEE Spectrum, 41(12): 56.
Cisco Systems. 2008. Approaching the zettabyte era (white paper). San Jose: Cisco Systems, Inc.
Cooper, Mark. 2003. Open communications platforms: the physical infrastructure as the bedrock of innovation and democratic discourse in the internet age. Journal on Telecommunications and High Technology Law, 2: 177–244.
de Souza et al. 2007. Impact of non-stationary noise on xDSL systems: an experimental analysis. In Leon Cohen (Ed.), Noise and fluctuations in photonics, quantum optics, and communications (Proceedings of SPIE, 21 May 2007, Florence), 6603(G): 1–9.
Geist, Michael. 2007. Business: ISP must come clean on ‘traffic shaping’. Toronto Star, 16 April: p. B5.
Goth, Greg. 2008. ISP traffic management: will innovation or regulation ensure fairness? IEEE Distributed Systems Online, 9(9) article #0809-o9002: 1–4.
Hartley, Matt and Simon Avery. 2008. Report on business: ‘traffic-shaping’ complaint quashed. The Globe and Mail, 21 November: p. B2.
Innis, Harold A. 1951. The bias of communication (2nd ed.). Toronto: University of Toronto Press.
Johnson, Johna Till. 2008a. With Amazon’s Kindle, it’s love at first byte. Network World, 25(25) 23 June: 25.
—————. 2008b. About those ‘net capacity issues: there’s more. Network World, 25(47) 8 December: 24.
Kennedy, Erica. 2000. ‘Come n 2 their dream world’: for the ever-spiritual Artist Formerly Known as Prince, home is just a frame of mind. In Style, 7(5) 1 May: 500–7.
Legrand, Emmanuel. 2000. SNEP blames burners for sales drop. Billboard, 112(6): 86.
McBride, Sarah. 2007. Media & marketing: music industry wins digital piracy case. Wall Street Journal, 5 October: p. B4.
Papacharissi, Zizi and Anna Zaks. 2006. Is broadband the future? An analysis of broadband technology potential and diffusion. Telecommunications Policy, 30: 64–75.
Rau, Krishna. 2008. Bell Canada controls internet traffic: technology gives company unlimited power over online usage. Xtra!, 18 December: p. 12.
Riley, M. Chris and Ben Scott. 2009 (March). Deep packet inspection: The end of the Internet as we know it? Northampton, MA: Freepress.net.
Scatamburlo-D’Annibale, Valerie and Paul Boin. 2006. New media. In Paul Attallah and Leslie Regan Shade (Eds.), Mediascapes: new patterns in Canadian communication (2nd ed.), 2007, 235–49. Toronto: Thomson-Nelson.
Shade, Leslie Regan. 2006. O Canada: Media (de)convergence, concentration, and culture. In Paul Attallah and Leslie Regan Shade (Eds.), Mediascapes: new patterns in Canadian communication (2nd ed.), 2007, 346–51. Toronto: Thomson-Nelson.
Sorensen, Chris. 2008. Business: internet traffic shaping seen stifling innovation. Toronto Star, 25 February: p. B1.
Spangler, Todd. 2008. Viacom, Google to shield user names. Multichannel News, 29(28): 58.
Welsh, Matt and David Culler. 2003. Adaptive overload control for busy internet servers. Proceedings of the 4th USENIX Symposium on Internet Technologies and Systems (Seattle WA), 26–28 March 2003: 43–57.
Wilson, Carol. 2008. DPI: the good, the bad, the stuff no one talks about. Telephony Online, 18 July. Retrieved 11 April 2009.