The goal of this chapter is not to make an ethical or political judgement on the monetization of dataveillance; that I will cover in the next chapter. Instead, I am simply drawing attention to the way this particular quality of the information landscape is mostly ignored or viewed as unproblematic by the thinkers I consider. Whether we ultimately decide this particular economic incentive is good, bad, or neutral, it is a major feature of the subject under analysis, and it would thus be imprudent not to investigate its role and influence. However, neither Benkler nor Jenkins acknowledges the role of dataveillance in their analyses of the practices and trends of networked digital activity. Benkler treats information solely as a medium that enables the practical exchange of and collaboration on information-based goods through networks, rather than as the medium that likewise allows for monetizing user activity. Likewise, his perspective on commons-based peer production is overly skewed toward users’ perception of these activities rather than the businesses whose digital services enable them. His description of commons-based peer production as not relying on “the price system or a managerial structure for coordination” (60) fails to recognize that the platforms which have come to support a great deal of what appears to be commons-based peer production exist purely on account of economic incentive and are designed to structure user coordination and activity in ways that maximize profits. Users are thoroughly alienated from the economic product of their labor in that they do not receive a portion of the profit, nor are steps taken to make them explicitly aware of the product they are creating and its value.
While commons-based peer production can be seen to support the democratization of certain aspects of knowledge and cultural production, this democratization is sharply constrained to the level of producing and distributing information-based content rather than extending to democratically understanding and governing the rules and structures which enable that exchange. Similarly, Jenkins’ observations fail to acknowledge the role of dataveillance in the phenomena he describes as convergence culture, collective intelligence, and participatory culture.
In some respects, the lack of attention to dataveillance in the texts of Benkler and Jenkins might be chalked up to the time period in which they were writing. In 2006, the year that both The Wealth of Networks and Convergence Culture were published, business models that relied on user data extraction were still in their infancy and not as broadly discussed in the media or scholarly literature as they are today. Tim O’Reilly’s Web 2.0 manifesto and Facebook were only two years old. Twitter had just launched, and Google Docs was still a year away. Many, though not all, of the activities Benkler points to as examples of commons-based peer production were in fact conducted in nonproprietary virtual spaces, such as Wikipedia, Project Gutenberg, SETI@Home, and the Free Software initiative. Jenkins’ examples of networked activity are likewise focused on virtual spaces that emerged prior to the widespread adoption of dataveillance. However, neither of them gives much thought to the economic models supporting the technologies they describe, and neither anticipates how these emerging economic models might complicate their claims. Instead, both Benkler and Jenkins pay more critical attention to the incumbents of the old industrial and cultural order, who threaten to stifle these new forms of activity through copyright protection and “moral panic” (Jenkins 2006, 260). As already noted, Jenkins argues that the most important issue at hand is not to fight media conglomerates, but to work with them in helping broaden the reach of participatory culture (260). Similarly, Benkler’s book aims to grapple with arguments and preconceptions from the old industrial order rather than identify new threats arising from the new networked information economy:
The rise of greater scope for individual and cooperative nonmarket production of information and culture, however, threatens the incumbents of the industrial information economy. At the beginning of the twenty-first century, we find ourselves in the midst of a battle over the institutional ecology of the digital environment. A wide range of laws and institutions—from broad areas like telecommunications, copyright, or international trade regulation, to minutiae like the rules for registering domain names or whether digital television receivers will be required by law to recognize a particular code—are being tugged and warped in efforts to tilt the playing field toward one way of doing things or the other. How these battles turn out over the next decade or so will likely have a significant effect on how we come to know what is going on in the world we occupy, and to what extent and in what forms we will be able—as autonomous individuals, as citizens, and as participants in cultures and communities—to affect how we and others see the world as it is and as it might be. (2)
Kevin Kelly’s book is markedly more attuned to the role of data in the development and use of networked digital technologies. Writing in 2016, ten years after The Wealth of Networks and Convergence Culture appeared, Kelly had the benefit of observing the way in which dataveillance would come to play an increasingly important role in the networked information economy. Kelly goes even further in observing that dataveillance isn’t simply about selling targeted ads, but rather about creating inputs for machine learning in an industry-wide dash toward developing artificial intelligence. Pointing to the ways in which forms of artificial intelligence are already deployed in everyday applications, Kelly argues that artificial intelligence will increasingly take over all forms of labor, from professional occupations to other types of activities such as recommendation systems, search results, marketing, construction, nursing, personal assistance, and so forth. Whether or not Kelly is right about the degree to which artificial intelligence “takes over,” his observations imply that networked user activity plays a direct role in “training” these artificial intelligences as well as in producing an industry product completely alienated from the majority of the users whose labor has unwittingly gone into producing it.
Inevitability and the narrow scope of action
Judging from the title of his book alone, The Inevitable: Understanding the Twelve Technological Forces That Will Shape Our Future, Kevin Kelly seems to hold a far more deterministic view of the nature of technological change. However, a close reading of all four texts under consideration reveals that Kelly, Jenkins, and Benkler share more similar views on this topic than one might initially suspect. Kelly believes that there are certain “natural,” “physical” forces that drive technological development, but that humans still have an important role in managing them through “legal and technological means.” For example, he writes: “…while we can’t stop (tracking), it does matter greatly what legal and social regimes surround ubiquitous (tracking). How we handle rewards for innovation, intellectual property rights and responsibilities, ownership of and access to (tracking) makes a huge difference to society’s prosperity and happiness. Ubiquitous (tracking) is inevitable, but we have significant choices about its character” (256). Throughout the book he makes a few specific suggestions about what those choices should include. For example, he argues that given the inevitability of tracking all human activity, measures should be put in place so that the benefits of tracking can be equitably shared by all: “If symmetry can be restored so we can track who is tracking, if we can hold the trackers accountable by law (there should be regulation) and responsible for accuracy, and if we can make the benefits obvious and relevant, then I suspect the expansions of tracking will be accepted” (261). However, these comments are sparse and fairly undeveloped, giving one the sense that Kelly does not see these issues as being of primary concern. What comes across more strongly in the book is the sense of glory manifesting itself in these inevitable changes.
In his last chapter, he invokes the sublime by describing how technology is evolving towards the development of an unfathomably vast intelligence which will make our current moment appear “ancient”:
Thousands of years from now, when historians review the past, our ancient time here at the beginning of the third millennium will be seen as an amazing moment. This is the time when inhabitants of this planet first linked themselves together into one very large thing. Later, the very large thing would become even larger, but you and I are alive at that moment when it first awoke. Future people will envy us, wishing they could have witnessed the birth we saw. It was in these years that humans began animating inert objects with tiny bits of intelligence, weaving them into a cloud of machine intelligences and then linking billions of their own minds into this single supermind. This convergence will be recognized as the largest, most complex, and most surprising event on the planet up until this time. (291)
Kelly’s tone in this passage recalls the nineteenth-century belief in manifest destiny, or the belief that American settlers were historically (and virtuously) destined to settle the continent as part of God’s work. Compare Kelly’s words with Thomas Paine’s “Common Sense” pamphlet, a document that greatly influenced this belief: “We have it in our power to begin the world over again. A situation, similar to the present, hath not happened since the days of Noah until now. The birthday of a new world is at hand” (118-119). Like Paine, Kelly regards ours as a moment in which the world is being reborn. There is a sense of virtuousness and inevitability in Kelly’s claims as well: “We are marching inexorably towards firmly connecting all humans and all machines into a global matrix” (296). In this future we will find a “new regime wherein our creations makes (sic) us better humans…” (296). In this final chapter, there is no mention of the stakes of this “inexorable march,” nor any reiteration of the importance or potential effects of human choice. It is also difficult to discern what he means by “better humans,” as he gives no concrete description of what “better” might entail. Indeed, only a few chapters earlier, he makes a statement that should make us further question what he means by “better,” and who in fact will benefit from these technological developments:
The big global system will not be utopia. Even three decades from now, regional fences will remain in this cloud. Parts will be firewalled, censored, privatized. Corporate monopolies will control aspects of the infrastructure, though these internet monopolies are fragile and ephemeral, subject to sudden displacement by competitors. Although minimal access will be universal, higher bandwidth will be uneven and clumped around urban areas. The rich will get the premium access. In short, the distribution of resources will resemble the rest of life. (294)
Thus, Kelly’s view of technological change offers only a very narrow scope of ill-defined action that humans might take in stewarding it. What seems perhaps even more inevitable than technological development, however, is the persistence of an economic order that preserves corporate domination and the unequal distribution of resources among social classes. Here we find a certain acceptance and perceived neutrality of the current economic order driving technological change. Nothing in his book suggests that he thinks this order could be changed by human action, such as the laws and policy changes he suggests earlier. Even the supermind composed of “billions of minds” is apparently powerless to think through new possibilities for social order.
Though Jenkins and Benkler are far more reserved than the enthusiastic Kelly, one can detect a similar attitude in their views on technological development. In their texts, exhortations for human intervention in technological change are somewhat more frequent and urgent than in Kelly’s. Like Kelly’s, however, these exhortations have curious parameters. Both Benkler and Jenkins urge policy changes to rein in restrictive copyright laws that threaten to stifle new forms of peer production. And Jenkins goes further in advocating concrete methods of instilling critical media literacy in students so as to foster these new forms of “participatory culture.” However, neither questions the particular economic foundation that is driving the development and availability of digital technologies.
Instead, technological changes are described as if they are happening without a subject, or as if the changes themselves are the subject. Take, for instance, Benkler: “A series of changes in the technologies, economic organization, and social practices of production in this environment has created new opportunities for how we make and exchange information, knowledge, and culture. These changes have increased the role of nonmarket and nonproprietary production, both by individuals alone and by cooperative efforts in a wide range of loosely or tightly woven collaborations. These newly emerging practices have seen remarkable success in areas as diverse as software development and investigative reporting, avant-garde video and multiplayer online games” (2). Or alternately, in Jenkins: “Changes in the media environment are altering our understanding of literacy and requiring new habits of mind, new ways of processing culture and interacting with the world around us” (21). As important as Jenkins’ advocacy is for developing education in critical media literacy, it is worth noting that it is an education meant to teach students how to adapt to the emerging technological environment and all its imperfections. Technology is presented as an alien force that human beings can only domesticate and prepare for. Nowhere in these three writers’ texts is there a suggestion that the technology itself, even prior to the development of laws and policies that govern it, is an expression of human action and human power; their discussion erases the human origins of these technologies. However, what they describe as an inevitable force of nature can also be viewed as the technological extension of capitalism and the expression of the interests of the dominant capitalist class. In the next chapter, I’ll explore an alternate framework that enables us to take a different view of technology.
This framework will allow us to pay better attention to how private interests shape the activities, mentalities, and capabilities of the networked user, and it will prepare us later on to imagine more equitable forms of participation in technology’s development.