Last night Ben O'Loughlin spoke at the launch for Tobias Blanke's new book, Digital Asset Ecosystems: Rethinking Crowds and Clouds. Tobias is Senior Lecturer at the Centre for E-Research at King's College London.
Tobias argues we inhabit ecosystems best understood through the complementary interaction of clouds (digital platforms, ubiquitous and heavily interlinked) and crowds (humans collaborating, knowingly or not). Clouds and crowds are ‘two sides of the same coin’ (p3). Through this division of labour, value is produced: cultural, social and economic, but primarily network value. This means rethinking what digital assets are. They are not files, objects or items with content; they are connectors whose value depends on their being circulated and consumed through networks.
The research task that follows, for those of us in political communication as well as in digital humanities and big data research, is to follow the assets. Tobias writes of ‘how digital assets integrate in digital networks in their life cycle, how they move from place to place and from system to system, and how they pass through the hands of “dedicated communities”’ (p8, italics added). This is similar to Arjun Appadurai’s approach to cultural economy in tribal societies: follow objects and the meaning they hold for their holders/consumers as they pass from person to person. The difference between passing a sacred artifact or gift around then and passing a campaign strategy document around today is that today network effects kick in. The circuit of connectivity around the object is open, unknowable in advance, and difficult to control without harsh rights management techniques. The NPCU has tracked and theorised how these assets become meaningful and valuable, for instance through Chadwick’s work on Obama’s campaign videos or O’Loughlin’s work on jihadist videos.
But a conceptual problem becomes apparent in a digital ecosystem. Is meaning, and therefore the ascription of value, only generated by humans? Tobias shows this might not be the case. The semantic web, or web 3.0, allows computers to evaluate how an object/asset is valuable in terms of what it can do and which functions it can support within digital networks. The result is a mix of computers calculating link-ability in big data and use-ability in networks, and humans calculating qualitatively; somehow these join together: clouds and crowds perform ongoing co-evaluation operations. In this way, digital assets are continually valued, assessed and integrated; their value is continually produced and affirmed or diminished.
This leads to a conception of rational, strategic action: harness and deploy this interplay of clouds and crowds to generate things of value, i.e. things that connect, sell, connect, sell. Tobias discusses the case of Amazon’s Mechanical Turk and other free, voluntary or cheap labour. The book explores the political economy of digital ecosystems and offers a fresh understanding of value, labour, property and other classic concepts in a way that moves on from the open source debates of the 1990s and 2000s. All of this leads naturally to questions of open data. Most clouds are privately owned, and owners like Amazon and Apple are strategic about how to open or close them in ways that maximise network value. Most citizens are not so strategic and lack the resources to create their own clouds. Could we have public service clouds? Not while trust in the state is so low, after the actions of the NSA and GCHQ. But what does public even mean anymore? Is it necessarily synonymous with the state? And what prospects are there for public demand for open data to be generated and then realised? Given that our political subjectivities and strategies are formed within these digital ecosystems, surely the loop is closed?
The book is well worth a read and shows the virtue of interdisciplinary thinking. As one of the social scientists present noted after the discussion, who knew that computer science had theory?