Earlier this year Aaron Swartz, a US writer and political activist, was arrested and charged with wire fraud, computer fraud and unlawfully obtaining information from a protected computer. Swartz had entered the MIT campus and used MIT computers to download thousands of scholarly articles. Despite the fact that JSTOR (the owners of the database containing the articles) stated that they had suffered no damage from Swartz’s actions and had asked the US government not to proceed with prosecution, the government went ahead and arrested Swartz. He now faces up to thirty-five years in prison. The charges focus not on the act of accessing and downloading articles per se, but on the claims that Swartz had downloaded too many articles at once and that, by exceeding the publisher’s licence agreement, he was engaged in serious hacking. Such zealous pursuit of intellectual property, in spite of the wishes of the ‘aggrieved’ party, indicates the degree to which knowledge capitalism has become an entrenched phenomenon.
The basis for prosecution, that Swartz was downloading the articles so as to redistribute them freely via torrent sites, is unlikely given the unwieldy result of such an action: lacking the search functions and architecture of the JSTOR originals, the information would be largely useless to any downloader. In fact Swartz’s previous work involved tracing the funding sources of legal research, an analysis of nearly half a million articles. His actions thus make sense as a means of generating large-scale quantitative data rather than as internet hacking or piracy.
Swartz’s case has renewed focus upon the role of contemporary scholarly publishing and the related question of access. In 2002 Paul James and Douglas McQueen-Thompson pointed out in Arena Journal that the rising costs of individual academic journals and the various strategies used by transnational academic publishers were creating a market monopoly from the products of publicly funded labour. Corporate takeovers of journals once run by groups of academics and individual university departments, the creation of software to measure citations and impacts (owned by the same publishing companies), hyperlinked footnotes that lead to other articles within the publisher’s ‘stable’, and the growing emphasis on journal articles as a measure of academic distinction have meant that within a decade the whole field of scholarly publishing changed. Prices went up, editorial practices became more impersonal and disconnected from place and institution, outputs became more frequent and the link between academic research and public debate became more tenuous.
The situation today is even more dire. Paper-based journals are in the minority and the majority of articles are found online. This has a twofold effect on access. Members of the public can no longer simply walk into a library and obtain a journal from the shelves; they have to be paid-up members of an institution. And outside of university membership, public libraries have their own difficulties—in an era of budget cuts, database subscriptions are often the first thing to go. A lack of paper content affects not just current journals but archives as well. Many libraries jettison their hard-copy holdings for reasons of space or ‘convenience’, but if the subscription is not kept up, access to past and present research disappears.
The result of all this is that the handful of transnational journal publishers that dominate the landscape find themselves the beneficiaries of the current environment of academia, where universities, faculties and departments compete with each other for funds—with journal publication a key measure of performance. While universities jostle for meagre funding and staff complain routinely of the pressures of overproduction, the winners are commercial publishers that derive huge profits from publicly subsidised work. At the same time these publishers limit the degree to which research is able to circulate in the public domain.
JSTOR is a relatively benign organisation, though it can still cost up to $20 for an individual article, which is prohibitively expensive for independent researchers. Commercial publishers such as Sage, Elsevier, Springer and Wiley, however, can easily charge triple this amount for individual articles. Journal subscription prices have rocketed, with some science journals now costing $20,000 a year. It is getting to the stage where many university libraries cannot afford such costs, so access to research and archival material is lost. A new division between ‘elite’ universities and the rest opens up, with only a select few academics being able to read essential material. While in theory the digitisation of information makes it easier to circulate, the larger framework of knowledge capitalism means that intellectual property regimes have been strengthened. Copyright laws in many countries have been extended, and previously free information has been recommodified, which is why it still costs money to download a paper by Albert Einstein.
There have been attempts to address this situation. Some academics place their material on their homepages, and some open-access journals do exist. But alternatives like these have been hindered by the degree to which scholars remain attached to ‘known’ journals (even as these are taken over by corporate publishers), and by the rise of auditing regimes which use citation statistics and the like to favour commercially produced journals with widespread distribution. A return to more locally based forms of co-operative publishing among scholars would redress the situation, but the current disposition of universities makes it unlikely that they would see any value in such an arrangement.
The corporatisation of the university and the commercialisation of knowledge have restricted wider access to research at a time when ‘trust’ in information is at an all-time low. To some extent this is due to the changed way in which we engage with news and information. The transformation of the public sphere via the internet has increased public participation but also led to the creation of customised information environments where users can privatise their information needs. Geert Lovink has noted the decline of the ‘netizen’, a representative figure of moderation and tolerance prominent in the early culture of the internet. Lovink argues that the netizen has largely disappeared in today’s polarised information landscape. Discussion and debate have been subsumed by the rise of extreme opinions that never need engage with alternative viewpoints. This kind of information bunkering does not simply occur on the net, however. Lovink’s depiction of the change in net culture might equally apply to the practices of the Murdoch press, among others.
One of the strongest elements of Robert Manne’s recent Quarterly Essay on Murdoch concerned the comprehensive distortion of the debate around climate change in The Australian. Manne convincingly argued that the newspaper ‘had waged war on science and reason’ by publishing ‘scores of articles’ from the few scientists who rejected the overwhelming consensus on climate change, and even more articles rejecting climate change by writers with little or no scientific background at all. The deeply partisan nature and insularity of The Australian could not have been better symbolised than by Paul Kelly’s no-show at a debate with Manne—The Australian only likes to talk to itself.
Whether the Australian government’s various media inquiries will be able to address the problems of ownership and accountability, trust and diversity remains an open question. The lack of focus upon media ownership means that the distortions made possible through media monopolies are likely to continue. It would be a shame, however, if the monopolies of print and broadcast media were allowed to encroach further into new media. Despite the tendency for net culture to fragment into communities of sameness, the digitisation of the media also contains the potential for more accountability in public debate.
Whatever significance one might attach to Julian Assange’s philosophy of radical transparency, his more modest project of ‘scientific journalism’—where journalists work with documents placed in the public domain so that members of the public can check and engage with the same material—is a step towards the restoration of trust in the media. In the United Kingdom, The Guardian used a version of this model for both the politicians’ expenses scandal and the cables released by WikiLeaks. In an age of declining trust, the sharing of source information between media organisations and the public can open up new forms of exchange.
This is why the privatisation of public research—the locking up of academic research behind paywalls and prohibitively expensive journals—needs to be contested. How can there be an adequate debate over climate change or genetic engineering if information cannot enter the public domain? One of the casualties of the degradation of the public sphere has been the status of informed opinion and expertise. Nowhere is this more evident than in the cynicism about scientific evidence of global warming, although one does not have to look far to see other instances of the marginalisation of experts. The main purpose of university research is to foster a critical and interpretative culture, not to generate profits for publishers by renting out material whose production they never funded in the first place. Brecht once asked whether the greater crime lay in robbing a bank or starting one. In the case of Aaron Swartz and the corporate publishers, one also has to ask where the real crime lies.
Simon Cooper is an Arena Publications editor.