Category: technology

Hackers can’t solve Surveillance

Médecins Sans Frontières (MSF), also known as Doctors Without Borders, is an organization that saves lives in war-torn and underdeveloped regions, providing health care and training in over 70 countries. MSF saves lives. Yet nobody thinks that doctors can “solve” healthcare. It is widely understood that healthcare is a social issue, and that universal healthcare cannot be achieved by the voluntary work of doctors or by donations and charity alone.

Just as doctors can’t solve healthcare, hackers can’t solve surveillance. Doctors can’t make human frailty disappear with some clever medical trick. They can mitigate problems and fight emergencies; they can be selfless, even heroic. But they can’t, on their own, solve healthcare.

One way hackers can fight surveillance is to develop better cryptographic communications tools and train people to use them. This is certainly critical work that hackers can contribute, but we can’t, on our own, solve surveillance.

Nothing hackers can do on their own can eliminate surveillance. Just as universal healthcare can only be achieved by social means, privacy-respecting mass communications platforms can only be achieved by social means. Safe mass communications platforms cannot be created by private interests, whether commercial or voluntary.

As we well know, private medical provisioning delivers unequal healthcare. The reason is obvious: health needs and the ability to pay are not usually correlated. Private provisioning means that those who can’t pay won’t be served by profit-driven institutions, and though this can be mitigated by voluntarism and charity, it can’t be fully overcome.

Likewise, mass communications built for the profit motive must either charge a fee, and thereby be exclusive, or be advertising supported. Other options can exist for connected and technically savvy users, but these will be niche by necessity. For the masses, the main options available will always be well-funded platforms with employees to do support, development, and marketing, without which it’s impossible to build up a mass user base.

The lucrativeness of advertising-based platforms makes it difficult even for fee-based systems to compete, since fees don’t generally produce enough revenue to invest significantly in support, development, and marketing. This makes such systems less attractive even to users who could or would pay. But the major issue that kills these platforms is that the fee means some people will not be able to use them at all.

Thus, commercial mass platforms tend to be advertising driven. This means that the business of platform operators is selling the audience commodity. Commodities are sold by measure and grade. You can buy 10 lbs of Fancy Grade Granny Smith apples, or two dozen Grade A free-range eggs. Or 2 million clicks from white males aged 18 to 35.

The audience commodity, the users of the platform, is sold to advertisers by measure of clicks or conversions, and by grade. For advertisers, audience is graded by specifications that include age, sex, income level, family composition, location, ethnicity, home or automobile ownership, credit card status, and so on. The demographics, as they say.

Since an advertising-funded platform must grade the audience commodity, it must collect data on its users in order to grade them. This means the one thing such a platform cannot offer its users is privacy. At least not privacy from the platform operators and their advertisers.

And so long as the platform operators collect such data, there is no way that this data will not be made available to local and foreign intelligence agencies.

This hard reality has been hard to grapple with, especially for a hacker community who saw the Internet as a new realm, as John Perry Barlow wrote in A Declaration of the Independence of Cyberspace: “We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity.” His colleague, John Gilmore, famously claimed “The Net interprets censorship as damage and routes around it.”

Those two quotations, born of the 90s heyday of net.culture, contrast starkly with what Adam Curtis describes in his BBC documentary All Watched Over By Machines of Loving Grace:

“The original promise of the Californian Ideology was that the computers would liberate us from all the old forms of political control, and we would become Randian heroes, in control of our own destiny. Instead, today, we feel the opposite, that we are helpless components in a global system, a system that is controlled by a rigid logic that we are powerless to challenge or to change.”

Oddly, the film doesn’t credit Richard Barbrook and Andy Cameron, who coined the term “the Californian Ideology” in their seminal 1995 text, which was among the first to identify the libertarian ideology endemic in Silicon Valley culture.

The visions of a free, uncensorable cyberspace put forward by Barlow, Gilmore and others were incompatible with the needs of Capital, and so the libertarian impulses that drive Silicon Valley caused a change in tune. Cyberspace was no longer a new world, declared independent with its own unalienable rights; it was now an untamed frontier, a wild west where spooks and cypherpunks do battle and your worth is measured by your crypto-slinging skills and operational security. Rather than united denizens of a new terrain, we are now crypto-individualists homesteading in hostile territory.

This, as Seda Gürses argues, leads to responsibilization: “Information systems that mediate communications in a way that also collects massive amounts of personal information may be prone to externalizing some of the risks associated with these systems onto the users.”

Users themselves are responsible for their privacy and safety online. No more unalienable rights, no more censorship resistant mass networks, no more expressing beliefs without fear of being silenced. Hack or be hacked.

Since libertarian ideology is often at odds with social solutions, holding private enterprise as an ideal and viewing private provisioning as best, the solutions presented often amount to more entrepreneurship, more voluntarism, and ever more responsibilization. We just need a new start-up, or some new code, or some magical new business model! This is what Evgeny Morozov calls solutionism, the belief that all difficulties have benign solutions, often of a technocratic nature. Morozov provides an example: “when a Silicon Valley company tries to solve the problem of obesity by building a smart fork that will tell you that you’re eating too quickly, this […] puts the onus for reform on the individual.”

Karl Marx makes a similar argument in The Eighteenth Brumaire of Louis Bonaparte:

“The proletariat […] gives up the task of revolutionizing the old world with its own large collective weapons, and, on the contrary, seeks to bring about its emancipation, behind the back of society, in private ways, within the narrow bounds of its own class conditions, and, consequently, inevitably fails.”

Solutionism underestimates social costs and assumes that social issues can be solved by individuals and private interests. Some may be, but where universality, equality, and fairness must be provided regardless of skill or wealth, this is not the case. These sorts of things can only be provided socially, as public goods.

Many hackers have always known this. In an excellent Journal of Peer Production essay, Maxigas quotes Simon Yuill:

“The first hacklabs developed in Europe, often coming out of the traditions of squatted social centres and community media labs. In Italy they have been connected with the autonomist social centres, and in Spain, Germany, and the Netherlands with anarchist squatting movements.”

Early hacklabs didn’t view their role as limited to solutionism. Though hackers have always helped people understand how online communications work and how to use them securely, hackers were embedded within social movements, part of the struggle for a fairer society. Hackers saw themselves as members of affinity groups fighting against privatization, war, colonialism, austerity, inequality, patriarchy and capitalism. They understood that working shoulder to shoulder with mass movements fighting for a new society was the way forward, and that their knowledge of networks and communications systems could be of service to these movements.

Yet, as Maxigas goes on to argue, “hackerspaces are not embedded in and not consciously committed to an overtly political project or idea.”

Instead, hackerspaces often focus on technological empowerment, which is certainly beneficial and important. But just as community health centers that teach health maintenance practices are beneficial yet cannot solve larger social issues, such each-one-teach-one projects cannot, on their own, solve social issues like privacy or health.

Hackers need to understand that there is no business model for secure mass communications. In order to achieve a society where we can expect privacy we need more hackers and hackerspaces to embrace the broader political challenges of building a more equal society.

Miscommunication Technologies with @dmytri & @baruch at @BerlinAtonal

Baruch Gottlieb and I will be giving a talk about Miscommunication Technologies at Berlin Atonal today.

http://www.berlin-atonal.com/

Below is a text written by us about the series of artworks, originally published in “Disrupting Business,” edited by Tatiana Bazzichelli & Geoff Cox and published by Autonomedia – Data Browser 05.

http://networkingart.eu/2013/10/disrupting_business/

Miscommunication Technologies
Telekommunisten Artworks 2009-2013
Dmytri Kleiner, Baruch Gottlieb

The development of communication technologies is not merely a neutral process driven by discovery, progress and innovation, but an intensely social and political process where choices are made in ways that fundamentally influence the reproduction of the class conditions of the societies that produce these technologies. Communications technologies embody and perpetuate the social relations of their mode of production.

The Miscommunication Technologies series of artworks by Telekommunisten explores these social relations by creating technologies that don’t work as expected, or work in unexpected ways. The artworks in the series allow the embedded social relations to be critically experienced and confronted. The series employs parody, juxtaposition, exaggeration and reductio ad absurdum to bring aspects of these relations which are normally hidden from view into the foreground.

The Miscommunication Technologies artworks illustrate some of the real-world challenges faced by anyone or any group that would like to challenge the dominance of capitalist models of production. Miscommunication Technologies takes a light-hearted approach to an intractable reality: capitalism is not only the system by which maximum value is extracted from social production, it is also the current global system which, in its unsatisfactory yet somewhat reliable manner, provides vital services we depend on every day. Any challenge to capitalist hegemony must be prepared to provide for the same social needs, which will persist under any system.

The illusions of the early Internet as a panacea platform for the emancipation of human intelligence and collaborative spirit emerged because it was financed for use value, not exchange value. Its early developers were universities, NGOs, hobbyists and, prominently, the military. The contributors to the early Internet built the platform according to what could be seen as a communist credo: “from each according to ability, to each according to need.”

As Richard Barbrook described in “The::Cyber.Com/munist::Manifesto”: “Within the Net, people are developing the most advanced form of collective labour: work-as-gift.” Information and software spread freely across the network. This, to many people, created the impression that a new society was emerging. For instance, “A Declaration of the Independence of Cyberspace” by John Perry Barlow stated, “We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity.” Barlow’s EFF co-founder, John Gilmore, claimed that “The Net interprets censorship as damage and routes around it,” implying that the Net existed beyond the jurisdiction of states, or even the organisations that operate it, since it can simply “route around” those who would seek to interfere with the freedom of exchange on the network.

This might have held true to some extent during the initial stages of commercialisation of the Internet, since the first commercial ventures, “Internet Service Providers” or “ISPs”, did not develop their own communications technologies, but only provided access to the public Internet and the decentralized, open technologies that ran on it, such as email and Usenet. The exchange value these ISPs were capturing was collectively created. Each ISP was independently earning income by being part of a common platform, not owned by anybody as a whole, but composed of the mutual interconnections of the participants. Though made up of parts owned by public and private organizations, the platform as a whole functioned as a commons, a common stock of productive assets used independently by the ISPs and their users.

In parallel to the Internet, “Online Services” like CompuServe emerged from the capitalist imagination; they were financed for exchange value, by profit-seeking investors, and as such did not employ a mesh topology like the Internet, but rather a star topology. Users could not communicate directly with each other, only through the central servers of the operator, which could not be “routed around.” This was required by profit-oriented business models, since control of user interaction and user data is required to monetise the platform, for instance by charging fees or selling advertising.
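The difference between the two topologies can be made concrete with a toy sketch. This is illustrative only, not any real protocol; the node names and both functions are hypothetical. The point is structural: in a star topology every message must transit the operator’s hub, while in a mesh peers can reach each other directly, so no single node is an unavoidable chokepoint.

```python
# Toy illustration of the two topologies discussed above.
# Node names and functions are hypothetical, for illustration only.

def star_path(sender: str, receiver: str, hub: str = "operator") -> list:
    """In a star topology every message transits the operator's hub,
    which can therefore observe, block, or monetise the exchange."""
    return [sender, hub, receiver]

def mesh_path(sender: str, receiver: str) -> list:
    """In a mesh, peers exchange messages directly (or via any willing
    relay), so there is no single point that cannot be routed around."""
    return [sender, receiver]

# The hub appears on every star delivery path, and on none of the mesh paths.
assert "operator" in star_path("alice", "bob")
assert "operator" not in mesh_path("alice", "bob")
```

Whatever the operator’s intentions, the star path makes surveillance and monetisation technically trivial; the mesh path makes them costly.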

Part of what fed the illusion of the emancipatory potential of the then-possible Internet was the fact that the platform made capitalist-funded “Online Services” like CompuServe and AOL obsolete. This happened largely because of the explosive growth made possible by its distributed infrastructure, which allowed the ISP industry to develop as a kind of petit-bourgeois industry of small producers. ISPs were a cottage industry of mom and pop telecoms of sorts. The design of the Internet allowed anybody with a connection to the internet to provide a connection to others, so the barrier to entry for becoming an ISP was relatively small: just an upstream connection, some computers, modems and telephone lines.

During the early days of the public Internet the communistic petit bourgeois ISPs prevailed over the feudalistic haute bourgeois Online Services, making it seem momentarily that the superior technical architecture of the Internet, combined with the cultures of sharing and gift economies would be able to surpass and even transcend Capital.

Both personal and commercial users migrated en masse to the Internet. For instance, in a letter to their customers that is still available online, the software company BASIS International, “The Big Little Software Company,” writes: “By the end of 1997, BASIS plans to move completely off CompuServe (CSi) and onto the Internet. This is a logical consequence of the many changes that have taken place in the online world over the past few years.” In their letter, BASIS spells out many of these changes: “While our CSi presence has served the company well in the past, its pay-to-access structure is increasingly harder to justify with the Internet providing almost limitless content at a negligible incremental cost. People are moving away from CSi in significant numbers, making it a less effective platform from which to address our current and future customers. We believe that moving our existing support infrastructure from CSi to the Internet will give us better access to our customers and our customers better access to us,” and goes on to explain how it will now use open platforms like email, Usenet and IRC instead of CompuServe’s proprietary and centralized applications. How ironic that now Web 2.0 platforms have companies and individuals returning to centralized, proprietary systems for their support and communications. The reason is not that centralized platforms were superior all along, but that they are the only kind of system that gets funded by capitalists.

While ISPs invested in bringing Internet access into households and offices worldwide, they did little to actually develop the communications platforms used on the network; these were largely developed within the gift economy of the users themselves. The ISPs were even less able to take over the provision of long-haul data transmission, dominated by international telecommunications conglomerates. Most ISPs got their start by simply connecting shelves full of consumer-grade modems to consumer-grade computers running free software, providing connectivity to an upstream internet provider for end-users who were using freely available communications platforms.

Thus, while the emergence of the ISPs and the rapid mainstream adoption of the Internet were spectacular, the ISPs were not able to capture enough profit to scale up and take over the more investment-heavy infrastructure of Internet provision. The end was already apparent in the beginning. Well-financed telecommunications conglomerates would soon replace the mom & pop ISPs, either buying them up or driving them out of business by providing “broadband” services which delivered internet to the home along with telephone service, leaving the remaining ISPs as mere resellers, providing service over telecom-managed circuits.

As Internet usage grew, technically-oriented users became the minority. The general Internet user became what Clay Shirky eventually called “everybody”. This had a significant impact on the culture of sharing and tolerance. The first wave of “everybody” arrived when AOL, in an effort to remain relevant, allowed its users to access the Internet; the period since has been called “The Eternal September”. The Jargon File, a glossary of hacker slang, describes this as “All time since September 1993. One of the seasonal rhythms of the Usenet used to be the annual September influx of clueless newbies who, lacking any sense of netiquette, made a general nuisance of themselves. This coincided with people starting college, getting their first internet accounts, and plunging in without bothering to learn what was acceptable. These relatively small drafts of newbies could be assimilated within a few months. But in September 1993, AOL users became able to post to Usenet, nearly overwhelming the old-timers’ capacity to acculturate them; to those who nostalgically recall the period before, this triggered an inexorable decline in the quality of discussions on newsgroups.”

The Jargon File mentions “netiquette,” a quaint term from the innocent times of net.culture. Yet netiquette was not simply a way of fitting in, like table manners at an exclusive dinner party. The cultural context of that Internet that made acculturation necessary was its relative openness and lack of stratification.
Netiquette was required because the network had relatively few constraints built into it; the constraints needed to be cultural for the system to work. There was much more to this culture than teaching new users not to abuse resources or make a “general nuisance of themselves.” Netiquette was not so much about online manners as about how to share. Starting from the shared network resources, sharing was the core of the culture, which not only embraced free software and promoted free communications, but generally resented barriers to free exchange, including barriers required to protect property rights and any business model based on controlling information flow.

As dramatic as the influx of new users was to the old-timers’ net.culture, the influx of capital investment and its conflicting property interests quickly emerged as an existential threat to the basis of the culture. Net.culture required a shared internet, where the network itself and most of the information on it was held in common. Capital required control, constraints and defined property in order to earn returns on investment. Lines in the sand were drawn, the primitive communism of the pre-September Internet was over. The Eternal September began, and along with it, the stratification of the internet began.

Rather than embracing the free, open platforms where net.culture was born, like Usenet, email and IRC, Capital embraced the Web. Not as the interlinked, hypermedia, world-wide-distributed publishing platform it was intended to be, but as a client-server private communications platform where users’ interactions were mediated by the platforms’ operators. The flowering of “Web 2.0” was Capital’s re-engineering of the web into an internet-accessible version of the online services they were building all along, such as the very platforms whose mass user bases were the influx that started the Eternal September, CompuServe and AOL most notable among them.

The gift-economy model of software development that produced platforms like email and Usenet was unable to compete with a quickly growing venture capital start-up scene pushing Web 2.0 platforms. Like the profit-oriented Online Services before them, these start-ups were also compelled by the profit motives of their investors to implement a centralised star topology, because once again, central control of user data and interaction was required to monetise the platforms. We have moved from a world of CompuServe and AOL to a world of Amazon and Facebook. Scratch off the Facebook logo and you’ll find the CompuServe logo underneath.

The OCTO P7C-1 prototype, premiered at transmediale 2013, was produced by Jeff Mann, Jonas Frankki, Diani Barreto, Baruch Gottlieb and Dmytri Kleiner with raumlaborberlin. OCTO exemplified this problematic. OCTO, the fictional venture-capital start-up, promised to build the next dimension of the Internet, the physical dimension of communication, through a pervasive pneumatic tube network. The utopian rhetoric of the OCTO boosters is exuberantly cliché, promising all manner of human empowerment and positive transformation, and conveniently leaving behind, in the shadow of bold promises, the fact that this technology will be completely centralised and completely transfused with invasive security and monitoring technologies.

OCTO P7C-1 presented the situation on several parallel levels. First, the actual working prototype, the P7C-1, allowed visitors to send capsules around the entire Haus der Kulturen der Welt. The P7C-1 stations were integrated everywhere at transmediale and used by staff and visitors alike. Use of the system was purposefully complicated, every capsule having to be sent through a central station, in coordination with and at the mercy of the operators positioned there. P7C-1’s cumbersome, labour-intensive and privacy-agnostic factuality flew in the face of the transcendent promises unflaggingly issued from the fictional directorship of the fictional OCTO company. The constant work of managing the central station, end-stations and tube network is labour theatre: unlike the internet, where the physical labour is hidden, the labour in OCTO P7C-1 is presented as a central theatrical aspect of the work. OCTO, the company, provided the second layer, the social fiction, constantly driving home the lesson that there is a price for the convenience of every new technological utopia under capitalism, and that the price will be extracted from those who are promised to benefit.

We have moved from administering our own email to using the centralized email services of giant entities like Google and Yahoo, which, as part of their mere functioning, parse and analyse private contents. Massive data sets have proven as useful for optimising AI applications such as automatic translation as any improvement from the (academic) information science community. Access to these storehouses of real-time contextual semantic data is the nec plus ultra of contemporary web profit models.

The revolutionary Internet that inspired Barbrook, Barlow, Gilmore and many others has become a dystopia, a platform whose capabilities and pervasiveness of surveillance, behavioural conditioning and influence surpass the wildest dreams of the tyrants and technocrats of previous eras. As we will see again and again, despite claims that culture and economy have gone ‘immaterial’, the rules of access to the physical technology of the internet condition the forms of services which are eventually at the disposal of users.

Whereas OCTO is the archetypal network start-up with an unabashed agenda of market-sector conquest, Thimbl appears as the light at the end of the long dark tunnel of centralised hegemonic corporate dominance of the Internet. Developed by Dmytri Kleiner, Jonas Frankki, Rico Weise and Mike Pearce, with contributions from a small community that developed around it, including Anthony Shull, Silja Neilson, Mark Carter and Fernando Guillen, Thimbl is made out to be a distributed, peer-to-peer alternative to microblogging platforms such as Twitter. Thimbl appears as an analogue of projects like Diaspora. Also launched in 2010, Diaspora is a purely altruistic project with no profit motive, driven only by the idealism of freedom of information.

The tragedy of projects like Diaspora is that they are not really viable replacements for capital-funded projects like Facebook, for economic and political reasons, not technical ones. Therein lies the message of Thimbl. Anyone who has some understanding of the elementary server architecture of the Internet can use Thimbl, because it is based on a protocol originally developed in the 1970s called Finger, which allowed users to post public “project” and “plan” messages akin to status updates. The free-access, non-commercial functionality of finger harkens back to the period when the Internet was still being developed for use value. By retrieving finger, Thimbl indicates how users today are allowing corporations to benefit from the value of their social interactions for services which, in principle, could be used freely and for free. Thimbl shows that all that is necessary to provide a microblogging experience like Twitter is available for free and built into the Internet right now, but, precisely because they are freely available, technologies based on protocols like finger will never be developed to the extent that they offer the satisfactory user experience of competitive commercial platforms.
Unlike the highly centralized OCTO, capital will never fund a project like Thimbl because it will not generate sufficient ROI. Thimbl is an economic fiction, or social fiction. Making it work is not the greatest challenge; making it financially viable is. Thimbl does not provide investors with the ability to control its users or their data, and as Thimbl’s Manifesto states, “This control is required by the logic of Capitalist finance in order to capture value. Without such control profit-seeking investors do not provide funds.”
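The finger protocol that Thimbl builds on is simple enough to sketch in a few lines. Per RFC 1288, a client opens a TCP connection to port 79, sends a username followed by CRLF, and reads a plain-text reply until the server closes the connection. The sketch below is a minimal illustrative client, not Thimbl’s actual implementation.

```python
import socket

def build_finger_request(user: str) -> bytes:
    """A finger request is just the username terminated by CRLF (RFC 1288)."""
    return (user + "\r\n").encode("ascii")

def finger(user: str, host: str, port: int = 79, timeout: float = 5.0) -> str:
    """Query a finger server: connect, send the request, read until EOF."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(build_finger_request(user))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # the server signals completion by closing
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")
```

A user’s “.plan” file served this way is, in effect, a status update, and since there is no central hub, any host can serve its own users’ plans. That is precisely the distributed property that makes such a platform unattractive to capital.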

For Thimbl, or any other platform with a similar vision, to become a real alternative to capitalist-financed platforms like Facebook and Twitter, we need more than running code, even more than a small, perhaps dedicated, user base. To get beyond this and actually break the monopolizing grip of centralized social media, we need to match their productive capacities. We need financing on a similar scale, so that the development, marketing, and operations budgets are comparable and sufficient to compete. Just as science fiction becomes reality when science transcends the limitations that existed when the fiction was imagined, for an economic fiction like Thimbl to become reality, society will need to transcend the political and economic limitations that we currently face. We can write code, we can write texts, we can create artworks, but as a small network of artists and hackers, we can’t change the economic conditions we work in by ourselves.

Free, distributed platforms are very practically suited to the work of radical communities, both symbolically, as a matter of solidarity, and practically, since support for privacy and cryptography is often desirable. These platforms should, in a meritocratic economy of technological product, become prevalent, but instead they are marginalized by the current ‘owners’ of the Internet. Free, distributed platforms cannot provide the same ease-of-use, the so-called user experience (UX), provided by capitalist platforms, because they simply lack the work-time to generate such quality. The result is that radical programmers pride themselves on the superiority of their software while bemoaning the state of things that prevents such software from becoming prevalent. Radical programmers are motivated to campaign on the level of code for a freer, anarchist, egalitarian Internet, but they are not motivated to confront the political and economic realities which prohibit the social adoption of these technologies. This generates much frustration and defensiveness, rather than the commitment to dedicate some small quanta of their formidable imaginations and intelligences to the problem of ownership.

Miscommunication Technologies show up the improvisatory economic structures of network-optimism in the way they inevitably ‘fail’ to deliver the seamless networked experience they provocatively advertise. The schism between the promise of utopia and the reality of a system which requires much spontaneous effort on the part of users even to provide a modicum of functionality playfully points to the immense work still needed to produce conditions that will support a radically different model of industrial communications, one which prioritizes the generation and cultivation of direct interpersonal engagement among a community of users.

General concern regarding censorship and surveillance on commercial online platforms is growing, and these concerns are opportunities to introduce political topics by arguing that these features are not unintended side-effects of these platforms, but central to their business models, and that platforms that do not surveil or control cannot and will not be financed by capital, but only by collective or public undertaking as an expression of priorities which diverge from capitalism. Once this becomes clearer, concern over privacy settings on Facebook can be directed towards capitalism itself, instead of the idiosyncrasies of that platform or its founders.

Privacy and surveillance, at the same time, become wedge issues to de-legitimize alternative networks and services for the general public. Under the banner of security and ‘quality’, corporations have lobbied governments to favour centralized ‘unfree’ network applications built on the still free but ever fading-from-view Internet. We have seen often enough how products like Bitcoin can be impugned as ‘enabling illicit activity’, cast as disreputable, until completely controlled and regulated by capital-concerned governments. Without acknowledging the systemic necessity, under the capitalist financing regime, of a centralized Internet, citizens’ legitimate concerns about corporate encroachment into private and personal spheres are co-opted to generate unfavorable opinions about technologies which could help disrupt the dominance of capitalist priorities of control.

It is worthwhile to re-emphasize that the Internet itself is not immaterial. The Internet is only accessible through hardware which needs to be built according to unfree and often unfair industrial production rules. The industrial production of electronics is a quintessentially capital-intensive undertaking requiring global flows of materials, which, under capitalism take place in extreme conditions of competition and extraction of labour value. Any challenge to how the Internet is run, or what it is available to be used for must also challenge how it is produced and reproduced.

iMine, an experimental art-app/game produced by Baruch Gottlieb in 2011 with Horacio González Diéguez and Cocomoya, prior to Baruch’s work with Telekommunisten, is now integrated into the Miscommunication Technologies series. iMine is a game, played on a smartphone, that builds the reality of labour exploitation in the mining industries, which produce the minerals required to make the very device used to play, into the experience of playing the game. iMine does not try to make the gameplay enjoyable or directly educational, but seeks to create an experience of bleakness and drudgery true to that of the mine workers: not to entertain the user with the story of the mining, but to evoke the experience of the miner. At the heating heart of the emancipatory digital device are highly hierarchic systems of production and control. iMine is dismalware.

The gameplay is designed from the start to be stripped down to the bare basics. Someone who wants to play first creates a new miner, giving it a unique name and a country. After this simple registration, the only thing left to do is repeatedly thrust the phone, as if it were a shovel, into the ground. The website keeps track of the global iMining action going on at any particular time, and also features an extensive resource section with information on mining and the political and economic stakes in the global supply chain for the minerals necessary in portable computing device production. iMine was developed and premiered at LABoral.

Miscommunication Technologies thus indicates that there can be no uniquely technological fix. Colonial wars and security states, corporate rule and centralization will persist despite the best intentions of emancipatory technologists. Worse, the best and most innovative technologies are not only appropriated to perpetuate capital, but to this end are incomparably better funded than the visionary projects of their emancipatory inventors ever were. The technologies which become dominant do so in the form dictated by the prevailing conditions of capitalist production under which we labour today.

Free, distributed platforms that can not be controlled or censored can not exist on any large scale under capitalism. Not for technical reasons, in fact the technology that enables such interaction is in many cases well-described and readily available, but for social and political reasons. The productive capacity required to build and support them will not be provided by Capital; so long as Capital is the dominant mode of production, it will produce platforms that reproduce itself, platforms that enable the accumulation of wealth by engineering control and extraction into communications systems.

R15N, originally developed as Jessycom by Dmytri Kleiner during a residency at the Israeli Center for Digital Art, was premiered as R15N in collaboration with Jonas Frankki, Jeff Mann, Baruch Gottlieb, Rico Weise and Mike Pearce at transmediale 2011. R15N is a project which pushes to absurdity the emancipatory rhetoric of mobile networked computing. Events like the anti-globalization protests in Copenhagen or the political upheaval often referred to as the ‘Arab Spring’ generate much enthusiastic hyperbole about how new realtime networks employing mobile devices can become an unstoppable democratising force. R15N points out that the economic predilections built into the provision of network connectivity may work against such emancipatory agendas.

R15N retrieves an obsolete form of social networking, the ‘telephone tree’, and dresses it up as the latest thing in robust circumventionist networking. Perfect for planning a flash mob, R15N easily becomes a nuisance as phone calls multiply, rendering the commitment one made to one’s community by joining the network a near-constant obligation to participate.

Whereas iMine proposes that critical games or critical media can only do so much to challenge the economic exigencies underlying an unacceptable status quo, and that the materiality of networked utopia is itself the key to understanding its injustices, R15N suggests that circumventionism will not fundamentally challenge intolerable social conditions without concurrent care and effort dedicated to actually building up strong communities committed to working together toward transforming society. As users of R15N are constantly reminded, the system depends on your competence and diligence.

Miscommunication Technologies are artworks with a principal purpose, that of engaging people in provocative networked experiences in which they inadvertently but necessarily confront the unadorned material and economic conditions under which such experiences are made possible.

Werkstatt TechOps Report #1

ThoughtWorks Werkstatt Berlin hosts many different working groups, including several Cryptoparties, The Kids’ Hacker Club, and the Marx-Engels Werkshau group. In order for the groups to plan and stay in touch with each other in between their meetings at Werkstatt, we have implemented Werkstatt Groups, an online discussion forum based on NodeBB.

Creating a discussion channel for Werkstatt is tricky, since working group participants range from Tor project contributors, who are very knowledgeable and concerned about technology and privacy issues, to kids, to political activists, who have other interests and areas of focus, and may be still learning about technology and privacy issues. So the Werkstatt Groups platform needs to be something that is usable across the spectrum, a place where privacy experts and privacy novices can interact online.

Looking at the options available, a simple web forum emerged as the most reasonable choice. With the many working groups at Werkstatt, managing dozens of mailing lists seems unworkable. Usenet, alas, has become entombed behind paywalls and is inaccessible to most people, except through untrusted interfaces like Google Groups. Platforms that offer groups functionality, like Facebook, obviously have privacy issues, among many others, and old favourites like IRC and Jabber are not particularly suitable for asynchronous group discussion.

So how to set up a web forum that respects privacy? Run it on a Tor hidden service!

Before I explain how this was done, I need to start with a disclaimer: Werkstatt Groups makes no guarantees of privacy or anonymity. Tor is designed to provide anonymity, but identifying all the possible ways in which the software running the forum may leak information is not easy, so use caution and report any issues or potential issues to us.

There are two ways to access this site; the recommended way is Tor Browser. Downloading and installing the Tor Browser Bundle takes only a few minutes and ensures that all your browser traffic goes over Tor, that your browser doesn’t leak any information, and that it is difficult to fingerprint.

Using Tor Browser, you can access Werkstatt Groups at this URL: http://vgnx2fk2co55genc.onion. Note that HTTPS is not used; the connection is already encrypted by Tor.

The other way of accessing it is by way of the public URL, http://groups.werkstatt.tw, which redirects to HTTPS when you access the forum. This is a reverse proxy running on a different server than the one that hosts the hidden service, accessing the hidden service over the Tor network. This makes the site publicly accessible outside of the Tor network by way of a public URL, while at the same time not revealing the location of the hidden service.

The NodeBB platform itself is a very dynamic, responsive platform which makes heavy use of websockets by way of socket.io. This is very advantageous over Tor, as a request to a hidden service needs to traverse six different servers, making page loads very expensive. Minimizing page loads by way of websocket requests compensates for this.

However, NodeBB also has some drawbacks: the platform uses Gravatar and Google Fonts, and socket.io includes a Flash fallback option, so a small Flash object is loaded on the site. All these issues are fixable and are on our issues list; however, the best way to defend against these kinds of issues is to use Tor Browser. This way, even requests to Gravatar and Google Fonts go over Tor, and potentially dangerous plugins like Flash are blocked. However, JavaScript running in the browser is always a security concern, as exploits are possible. Also, NodeBB is beta software in very active development, and we are running the bleeding-edge head-of-branch, so expect glitches and some downtime.

OK, OK, so with all that out of the way, here is how the setup works. If all you want to do is use the forum, just get started here: http://groups.werkstatt.tw. If you want to know how the setup works, keep reading. This assumes relatively expert knowledge of server setup, including Node.js, Tor, nginx and iptables.

NodeBB

NodeBB installation instructions
for various platforms are available. However, Werkstatt Groups uses git and npm to install NodeBB. So the steps are:

– Install and run Redis
– Pull head
git clone https://github.com/NodeBB/NodeBB.git
– Enter the directory, e.g.:
cd /srv/http/NodeBB
– Build dependencies
npm install
– Run the setup script
./nodebb setup
– Change the bind address in config.json to the local interface only
"bind_address": "127.0.0.1",
– If all that runs successfully, start NodeBB
./nodebb start
– Check the log
./nodebb log
This line should appear if all is good:
info: NodeBB is now listening on: 127.0.0.1:4567

Tor Hidden Service

– Install and run Tor
– Configure the Tor hidden service in /etc/tor/torrc

HiddenServiceDir /var/lib/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:4567

– Restart Tor and find out the onion address

cat /var/lib/tor/hidden_service/hostname
vgnx2fk2co55genc.onion

Your onion address will be different, of course.

To reduce the chances of the server revealing its address to other services via outbound requests, add iptables rules to ensure that requests originating from the server go over Tor. Here is an example of REDIRECT rules in the nat table OUTPUT chain configuration on Werkstatt Groups:

*nat
-A OUTPUT -p icmp -j REDIRECT --to-ports 9040
-A OUTPUT -s xxx.xxx.xxx.xxx/32 -p tcp -m owner ! --uid-owner 43 -j REDIRECT --to-ports 9040
-A OUTPUT -p udp -m udp --dport 53 -j REDIRECT --to-ports 5353

‘xxx.xxx.xxx.xxx’ is the IP address of the hidden service and 43 is Tor’s userid. This means that all requests that originate from this IP address, other than those from Tor itself, are redirected over Tor’s transparent proxy, which I’ve configured to run on port 9040. DNS requests are redirected over Tor as well. For good measure, ping is short-circuited too.
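For reference, the fragment above can be assembled into a complete iptables-restore style file (a sketch only; the placeholder address, Tor’s uid of 43, and the port numbers are taken from the example above and will differ between systems):

```
*nat
:PREROUTING ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
# Short-circuit ping so it can not reveal the server's address
-A OUTPUT -p icmp -j REDIRECT --to-ports 9040
# Anything from the hidden service's address, except Tor itself (uid 43),
# goes to Tor's transparent proxy on port 9040
-A OUTPUT -s xxx.xxx.xxx.xxx/32 -p tcp -m owner ! --uid-owner 43 -j REDIRECT --to-ports 9040
# DNS lookups go to Tor's DNSPort
-A OUTPUT -p udp -m udp --dport 53 -j REDIRECT --to-ports 5353
COMMIT
```

Load it with iptables-restore; where the rules file lives is distribution-dependent.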

– Enable DNSPort and TransPort in /etc/tor/torrc

DNSPort 5353
TransPort 9040

Restart Tor again and visit the onion address in Tor Browser, and you will see your NodeBB forum! Hooray!

Reverse Proxy

The reverse proxy runs nginx and Tor.

In order to set up the public HTTPS server we need to use a different server. The IP address of the hidden service should not be listed anywhere, so it can not be used in your DNS zone.

So, on this other server:
– Install and run Tor
– Install and run nginx
– Make an SSL certificate and set up an HTTPS server with nginx
– Set up proxy_pass in nginx for your onion node with websocket support, e.g.

proxy_pass http://vgnx2fk2co55genc.onion/;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
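In context, these proxy directives sit inside an ordinary HTTPS server block; here is a minimal sketch (the server name and certificate paths are placeholders, not the actual Werkstatt configuration):

```nginx
server {
    listen 443 ssl;
    server_name groups.example.org;                 # placeholder

    ssl_certificate     /etc/nginx/ssl/groups.crt;  # placeholder paths
    ssl_certificate_key /etc/nginx/ssl/groups.key;

    location / {
        # Forward everything to the hidden service over Tor's transparent proxy
        proxy_pass http://vgnx2fk2co55genc.onion/;
        # Websocket support, needed by NodeBB's socket.io
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```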

Now we need to set up Tor to transparently proxy requests for Tor hidden services.

– In /etc/tor/torrc

DNSPort 5353
TransPort 9040
VirtualAddrNetwork 10.192.0.0/10
AutomapHostsOnResolve 1

In addition to setting up the DNSPort and TransPort, the above configuration maps hidden services to the network 10.192.0.0/10, which we can then use in our iptables rules, as follows:

*nat
-A OUTPUT -p udp --dport 53 -j REDIRECT --to-ports 5353
-A OUTPUT -d 10.192.0.0/10 -p tcp -j REDIRECT --to-ports 9040

And voilà! Once you restart Tor and nginx, if you navigate to your HTTPS server, you should see your hidden service!

Questions and comments very welcome!

Dear #NETMundial, Governance is cool and all, but we need to DEMAND IPv6 NOW! cc #OurNetMundial

Many of my friends and colleagues were in São Paulo last week for NETMundial, the Multi-stakeholder Meeting on the Future of Internet Governance. Dilma Rousseff, President of Brazil, convened this initiative to “focus on principles of Internet governance and the proposal for a roadmap for future development of this ecosystem.”

NETMundial was originally motivated by revelations from Edward Snowden about mass surveillance conducted by the US and UK governments, including spying on President Rousseff herself. These revelations prompted Mrs Rousseff to state “In the absence of the right to privacy, there can be no true freedom of expression and opinion, and therefore no effective democracy” in a speech to the UN at the 68th General Assembly.

Yet, as important as Internet governance is for our future, and as valuable as any effort to address this is, it is unlikely to do much, if anything, about the right to privacy online. Why? Because surveillance is not an issue of Internet governance, but of the way the Internet is financed. The vast amount of consumer data amassed by private companies like Google, Facebook and Verizon is not the result of IANA or ICANN policy, but of the business models of these companies, which seek to generate profits by way of this data. It is inconceivable that these companies could amass such vast amounts of consumer data, use it for marketing purposes, sell and share access to it with other companies, and yet somehow keep it out of the hands of the NSA and similar intelligence agencies. Likewise, the extraordinary hacks, mods and exploits the NSA has conducted, as revealed by Snowden, would not be thwarted by any IANA regulation. Aggression by the US is not an Internet problem, and Internet governance can not do away with it, any more than it can do away with drone strikes and regime change projects.

Yet, there is lots that governments can do to ensure the right to privacy, and they can do so today, even absent any change in global Internet governance.

Governments have the ability to regulate the way telecoms and Internet companies operate within their countries; indeed, government is no stranger to creating regulation. Government regulation ensures buildings are built correctly, are structurally sound, follow the fire code, etc. Governments create rules that make sure highways, roads, and sidewalks are used safely. Governments pass laws to prevent consumers from being defrauded, create statutory warranties, set labour standards, regulate broadcast media, etc. Governments can likewise pass regulations to protect the right to privacy. The idea that governments such as Brazil, Germany and the others participating in NETMundial need reforms to IANA and friends before they can work towards guaranteeing their own citizens’ right to privacy is absurd.

To guarantee the right to privacy, communication systems must implement the end-to-end principle, which states that functionality ought to reside in the end hosts of a network rather than in intermediary nodes. The term “end-to-end” principle was coined in a 1981 paper by J.H. Saltzer, D.P. Reed and D.D. Clark at the MIT Laboratory for Computer Science, “End-to-End Arguments in System Design,” in which they specifically address privacy.

In the section titled “Secure transmission of data,” the authors argue that to ensure “that a misbehaving user or application program does not deliberately transmit information that should not be exposed,” the “automatic encryption of all data as it is put into the network […] is a different requirement from authenticating access rights of a system user to specific parts of the data.” This means that to protect the users’ rights to privacy, it is not sufficient to encrypt the network itself, or even the platform, as this does not protect against the operators of the network, or other users who have access to the platform. What is needed, the authors argue, is the “use of encryption for application-level authentication and protection,” meaning that only the software run by the user on the end-node, or their own personal computer, should be able to encrypt and decrypt information for transmission, rather than any intermediary nodes, and only with the user’s own login credentials.
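The distinction can be illustrated with a toy sketch: the message is encrypted by software running on the sender’s own machine and decrypted only on the recipient’s, so intermediary nodes carry nothing but ciphertext. (The XOR keystream below is an illustrative toy, not a secure cipher; real application-level protection requires an established authenticated encryption scheme.)

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key + nonce + counter.
    A toy construction for illustration only -- NOT cryptographically secure."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

decrypt = encrypt  # XOR is its own inverse

# The end hosts share a key; intermediary nodes never see the plaintext.
key, nonce = b"shared end-host secret", b"msg-001"
ciphertext = encrypt(key, nonce, b"meet at noon")  # encrypted on the sender's machine
# ... ciphertext traverses any number of untrusted intermediary nodes ...
plaintext = decrypt(key, nonce, ciphertext)        # decrypted only at the far end
print(plaintext)
```

The point of the sketch is architectural: the relay in the middle only ever handles `ciphertext`, which is the end-to-end arrangement Saltzer, Reed and Clark describe.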

The end-to-end principle is a key concept in the design of the Internet itself. The underlying “Transmission Control Protocol,” one of the core protocols of the Internet protocol suite (TCP/IP), exemplifies the end-to-end principle, and allows applications running on remote nodes to use the Internet for the reliable communication of arbitrary data across the network, without requiring any of the intermediary nodes to know or understand the purpose of the data being transmitted.

In principle, therefore, there is absolutely nothing technical stopping everybody from employing private communications on the Internet. So then, how did we get into the mess we’re in now? Why did the Internet, which has the end-to-end principle in its core architecture, become host to the largest-scale mass surveillance in history?

Two reasons: Capitalism and IPv4. Let’s start with IPv4.

Internet Protocol Version 4 (IPv4) was created in 1981, the same year the Saltzer, Reed, and Clark paper was published. IPv4 provides approximately 4.3 billion addresses, which sounds like a lot, until you realize that every device that connects to the Internet needs at least one. Running out was not presumed to be a big issue at the time, as this version was originally presumed to be a test of DARPA’s networking concepts, and not the final addressing scheme for the global Internet. In 1981, 4.3 billion addresses seemed like an awful lot, but when the public Internet began to take off in the Nineties, it became clear that this would not be nearly enough. In 1998 RFC 2460 was released; this document is the specification for IPv6, an addressing scheme that allows for a near limitless number of addresses, trillions of trillions for each person on earth. Yet, as NETMundial was taking place in Brazil, nearly 16 years since the protocol was invented, Google reports that about 3% of visits to its services use IPv6. The “World IPv6 Launch” site, which promotes IPv6 adoption, estimates that more than half of Internet users around the world will have IPv6 available by 2018. In other words, 20 years after the design of the protocol, nearly half of all Internet users will still not have access. It’s important to note that it is not hardware adoption that is holding things up; it’s highly doubtful that many devices made in the last 10 years could not support IPv6. It’s rather that the owners of the networks do not configure their networks to support it.
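The address arithmetic is easy to check (the world-population figure below is a rough assumption, used only for scale):

```python
ipv4_addresses = 2 ** 32    # IPv4 uses 32-bit addresses
ipv6_addresses = 2 ** 128   # IPv6 (RFC 2460) uses 128-bit addresses

print(ipv4_addresses)       # 4294967296 -- roughly 4.3 billion

world_population = 8 * 10 ** 9                    # rough assumption
per_person = ipv6_addresses // world_population   # addresses per person
print(per_person > 10 ** 24)                      # True: trillions of trillions each
```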

As everybody knows, 20 years is effectively infinity in Internet years. With IPv6 a far-away utopia, and with IPv4 addresses still the currency of Internet service, NAT was developed. The vast majority of devices available to users were not assigned public IP addresses, but only private ones, separated from the public Internet by “Network Address Translation” (NAT), a system that allows many end-nodes to share public IP addresses. This was an effective solution to IPv4 address exhaustion, but it introduced a bigger problem: the network was no longer symmetric. Software running on users’ computers can reach central Internet resources, but can not reach other users, who are also on private address space, without some intermediary service providing access.
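The asymmetry can be sketched with a toy model of a NAT table (hypothetical names and addresses; real NAT also tracks protocols, source ports and timeouts):

```python
# Toy model of NAT asymmetry: an outbound connection creates a translation
# entry at the router; an unsolicited inbound connection finds none.
nat_table = {}  # public port -> private host behind the NAT

def outbound(private_host: str, public_port: int) -> None:
    """A client behind NAT opens a connection; the router records a mapping."""
    nat_table[public_port] = private_host

def inbound(public_port: int):
    """A packet arrives at the public address; deliverable only if mapped."""
    return nat_table.get(public_port)

outbound("10.0.0.5", 40001)  # a user reaches a central server: this works
print(inbound(40001))        # the server's reply finds its way back
print(inbound(40002))        # None: another user can not initiate contact directly
```

This is why, without public addresses, any user-to-user system ends up routed through a central intermediary.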

What this means is that so long as users are on private address space, any communication system they use requires centralized resources to bridge connections between users, and what’s more, the scale of these central resources must grow in proportion to the number of users. In order for the end-to-end principle to be respected, these intermediary services need to support it.

And this is where we get to the Capitalism part: building, maintaining and scaling these resources requires money. In the case of “web scale” platforms, lots of money.

By and large, this money comes from Venture Capital. As Capitalists must capture profit or lose their capital, these platforms require business models, and while many business models are possible, the most popular today, the one presumed to be the most lucrative by investors, is big data. Thus, instead of respecting the end-to-end principle and engineering functionality into the end hosts of a network, capitalists invest only in applications where core functionality is built into the intermediary nodes, which can capture user data and control user interaction, which is how they make money.

Capitalist platforms grow and collect data around these intermediary nodes the same way mould grows around leaky pipes. In order to give alternative platforms that respect the right to privacy a fighting chance, and to rid the Internet of the mould of centralized data-collecting platforms, we must fix the pipes; we need to remove the asymmetry in the network.

We can not allow private initiative alone to push adoption of IPv6, and wait however many years or decades it takes to get it. If governments want to promote their citizens’ right to privacy, they need to mandate adoption of IPv6, to ensure their citizens are able to use software that respects the end-to-end principle.

Here is a charter of rights that all Governments can provide to their own citizens right now to promote the right of privacy:

– IPv6 connectivity with adequate public address space for all!
– At least one DNS Domain Name for every citizen!
– At least one Government signed SSL certificate for every citizen!

If each citizen had a public address space, a domain name and a signed certificate, the leaky pipes of the Internet could be fixed, the surveillance mould would dissipate, and new privacy-respecting applications could flourish!

DEMAND IPv6 NOW!

M-C-LOL: Circuits of value in the Lulz economy.

Neither free software nor crowd funding will save us from capitalism. We can’t overthrow capitalism by undertaking work merely for the lulz; we need to create new value circuits that allow us to build new means of survival for the planet, and only then can we do away with capitalism.

In the stages of capitalist production, the Capitalist comes to market twice: the first time as a buyer, the second time as a seller.

Marx described this as M – C – M’

In the first stage the capitalist buys commodities and labour time. In the second stage, the purchased commodities and labour time are put into production. The result is a commodity of more value than that of the elements entering into its production. In the third stage, the capitalist returns to the market as a seller; the new commodities are turned into more money.

As the capitalist winds up with more money as a result of the productive process, the capitalist can purchase more labour time and commodities and repeat the process again, and again.

Investing in production allows the capitalist to reproduce, increase and accumulate capital. This reproduction cycle is what makes capitalism a thriving, dynamic system, that expands.
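As a toy illustration of why this circuit compounds (the surplus rate here is an arbitrary assumption for the sketch, not a figure from Marx):

```python
def m_c_m_prime(money: float, surplus_rate: float) -> float:
    """One pass of M - C - M': money buys commodities and labour time (M - C),
    production yields commodities of greater value, which sell for M' (C - M')."""
    commodities_value = money                       # M - C: capital advanced on inputs
    return commodities_value * (1 + surplus_rate)   # C - M': realized with surplus

m = 100.0
for _ in range(3):              # the circuit repeats: M' funds the next M - C
    m = m_c_m_prime(m, 0.10)
print(round(m, 2))              # capital accumulates with each cycle
```

Each pass ends with more money than it began with, which is exactly what the voluntarist circuit described below lacks.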

This very process of capitalist production has many negatives, many of which extend from the inherent exploitation involved in making labour time into a commodity, many others from the practice of allocating productive assets in the interests of profit, instead of social good, still more from the dispossession and enclosure required to create the social conditions for capitalist production.

Yet, capitalism sustains us. Despite its social costs, its factories and institutions provide the means of survival that the world depends on, even while its contradictions jeopardize our survival.

In order to transcend capitalism, we need to find ways to provision the means of survival differently. “Ending” Capitalism before alternative productive strategies for survival are not only conceived, but actually existing at sufficient scale, would more likely lead to collapse and a new dark age than to a fairer and more sustainable society.

In order for any such alternative productive strategies to grow to a scale at which they could be a viable alternative to capitalism, they must, like capitalism, be thriving, dynamic systems capable of growth. They need to be able to reproduce their productive inputs. Economic alternatives need to have sustainable value circuits to be truly viable.

Free software, as well as the goods financed by Kickstarter and similar sites, seems like production; after all, stuff is produced. One can use free software, just as one can consume a movie, book, album or novelty gadget funded by Kickstarter.

Yet, the way the creation of these goods is financed can not reproduce its inputs.

In the creation of free software and in the funding of Kickstarter projects, the money to sustain the inputs comes from donation: either actual donation of money, in the case of crowd funding, or in-kind in the form of free labour, in the case of some free software. These donations and in-kind contributions are made voluntarily. Yet such voluntarist production is different from capitalist production.

M – C – LOL

Thus, like capitalists, voluntary producers come to market twice: the first time as buyers, the second time for the lulz. However, unlike capitalists, their circuit is not completed, because the lulz do not enable them to be buyers again, do not allow them to acquire the inputs they need to repeat such production.

Yes, in the case of Free Software, major corporations do provide funding, lots of it. But this is the Capitalist coming to market as a buyer, not a seller. It is capitalist consumption: they don’t need to make a profit from Free Software directly; they use it in their production process and make money when they return to the market with the resulting product, which is distributed for more money, not lulz.

The source of this money is not a new mode of production, but capitalism. It’s simply part of the investment capital must make in its means of production, it is consumption not production.

And yes, recipients of Kickstarter financing can use such financing to make money, but such income does not flow back to those that donated the funds in the first place. The donors, for the most part, need to go back to work to get another paycheck before donating again. Thus the money comes from their Capitalist employers and is spent out of their “disposable income,” in other words, once again it is consumption, not production.

Both free software and crowd funding are simply novel forms of distribution within the capitalist mode of production, and therefore not a new mode of production that could potentially disrupt capitalism.

In order to transform these practices into genuinely revolutionary forms, we must collectively own the means of production so created: not only must the software be free, but we must collectively own the wealth that results from using the software in production. We must collectively own the products produced by crowd funding, so that we can use the wealth created to reproduce the cycle, again and again.

So long as our free labour earns only lulz in return, Capitalism has the last laugh.

I’ll be at Cafe Buchhandlung tonight around 9pm or so, come by if you’re in town, hope we have lots of surprise guests still hanging around Berlin after transmediale.

Scratch off the Facebook logo, and you’ll find the CompuServe logo underneath.

During the summer I traveled to the Monostori Fortress near Komárom, Hungary to attend IslandCQ 2013 “Crisis! Re/Constructing Europe.” This text is for the IslandCQ 2013 publication. Rather than simply transcribing my presentation, I have created this text to cover some of the things we talked about, and to expand upon them and take the topic further. This text is a remix and extension of three previous texts, two from my blog, and one co-written with Baruch Gottlieb.

Remixing and forking both software and text is an approach I have used for years, and indeed most of my texts contain fragments of other texts, some of which I have written myself, some co-written with others. I inherited this technique from the long history of radical art, from practitioners of cut-up like Brion Gysin and William Burroughs, to Dada and the Situationist International, and into my own generation with the Neoist Network.

The Internet and free software, to me, were a natural extension of my already existing support of free communications and anti-copyright. When I encountered the Internet for the first time I immediately embraced it, its distributed architecture, its capacity for allowing free speech, and perhaps most significantly, its culture of sharing. The Internet embodied the social relations to match my political and artistic convictions.

However, when I encountered the Internet, though I didn’t know it, it was already dying. It was clear to me that there were challenges, to be sure, but I didn’t yet realize how bad the prognosis was. To me, my fight to save the Internet was against the censorious desires of other users and the timidity of the small companies providing internet services. This was a fight that seemed winnable. However, what I didn’t know at the time was that the real fight was against Capitalism, and as such, the inevitable end of the Internet was already evident.

A good example of my early participation is a text I posted on Usenet, which was republished on Wired Magazine’s HotWired site, which claimed to be the world’s first commercial web magazine. In it, I argue that sysadmins working for internet service providers should focus on keeping their servers running, and sanction users who are abusing system resources, but not interfere with content, because if they did so, if they assumed the role of online censor, they would jeopardise the spirit of the Net, and also jeopardise the viability of their own service.

In some way I was right, assuming the Net worked the way we thought it worked, worked the way that John Perry Barlow thought when he wrote “We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity,” or the way John Gilmore thought when he wrote “The Net interprets censorship as damage and routes around it,” or the way Richard Barbrook thought when he wrote “Within the Net, people are developing the most advanced form of collective labour: work-as-gift.”

Unfortunately, I wrote my article in September. The 790th day of September, 1993, to be exact. What would have been October 31st, 1995 on the pre-September calendar.

The Jargon File defines “The September that never ends” as “All time since September 1993. One of the seasonal rhythms of Usenet used to be the annual September influx of clueless newbies who, lacking any sense of netiquette, made a general nuisance of themselves. This coincided with people starting college, getting their first internet accounts, and plunging in without bothering to learn what was acceptable. These relatively small drafts of newbies could be assimilated within a few months. But in September 1993, AOL users became able to post to Usenet, nearly overwhelming the old-timers’ capacity to acculturate them; to those who nostalgically recall the period before, this triggered an inexorable decline in the quality of discussions on newsgroups.”

Once the internet was available to the general public, outside of the research/education/NGO world that had inhabited it before September, the large numbers of users arriving on the untamed shores of early cyberspace “nearly overwhelmed the old-timers’ capacity to acculturate them.” The Jargon File mentions “netiquette,” a quaint term from the innocent times of net.culture, yet netiquette was not simply a way of fitting in, it wasn’t like table manners at an exclusive dinner party. The cultural context of that Internet that made acculturation necessary was its relative openness and lack of stratification.

Netiquette was required because the network had relatively few constraints built into it; the constraints needed to be cultural for the system to work. There was much more to this culture than teaching new users how to not abuse resources or make a “general nuisance of themselves.” Netiquette was not so much about online manners, it was rather about how to share. Starting from the shared network resources, sharing was the core of the culture, which not only embraced free software and promoted free communications, but generally resented barriers to free exchange, including barriers required to protect property rights and any business models based on controlling information flow.

As dramatic as the influx of new users was to the “old-timers” net.culture, the influx of capital investment and its conflicting property interests quickly emerged as an existential threat to the basis of the culture. net.culture required a shared internet, where the network itself and most of the information on it was held in common. Capital required control, constraints and defined property in order to earn returns on investment. Lines in the sand were drawn; the primitive communism of the pre-September Internet was over. The Eternal September began, and along with it, the stratification of the Internet began.

Rather than embracing the free, open platforms where net.culture was born, like Usenet, email, IRC, etc, Capital embraced the Web. Not as the interlinked, hypermedia, world-wide-distributed publishing platform it was intended to be, but as a client-server private communications platform where users’ interactions were mediated by the platforms’ operators. The flowering of “Web 2.0” was Capital’s re-engineering of the web into an internet accessible version of the online services they were building all along, such as the very platforms whose mass user bases were the influx that started the Eternal September. CompuServe and AOL most notable among them.

The Eternal September started when these Online Services allowed their users to access Internet services such as Usenet and email. Web 2.0 replaced Usenet and email with social platforms embedded in private, centralized web-based services that look and work very much like the old Online Services.

Scratch off the Facebook logo, and you’ll find the CompuServe logo underneath.

The Internet is no longer an open free-for-all where old-timers acculturate new-comers into a community of co-operation and sharing. It is a stratified place where the culture of sharing and co-operation has been destroyed by the terms of service of online platforms and by copyright lobbies pushing for greater and greater restrictions and by governments that create legislation to protect the interests of property and “security” against the interests of sharing.

The culture of co-operation and sharing has been replaced by a culture of surveillance and control.

Much later that September, the 6,820th day of September, 1993, to be exact, I gave a talk with Jacob Appelbaum at the 6th annual Re:publica conference in Berlin. In part, I responded to the earlier presentation by Eben Moglen, the brilliant and tireless legal counsel of the Free Software Foundation and founder of the FreedomBox Foundation, who gave a characteristically excellent speech. However, in it was something that just couldn’t be right.

Moglen claimed that Facebook’s days as a dominant platform are numbered because we will soon have decentralized social platforms: based on projects such as FreedomBox, users will operate collective social platforms on their own hardware, retain control of their own data, and so on. The trajectory Moglen describes has centralized social media as the starting point and distributed social media as the place we are moving toward. But in actual fact, this transformation had already occurred, long ago.

During the twilight of the CompuServe era, both personal and commercial users migrated en masse to the Internet. For instance, in a letter to their customers that is still available online, the software company BASIS International, “The Big Little Software Company,” writes: “BASIS plans to move completely off CompuServe (CSi) and onto the Internet. This is a logical consequence of the many changes that have taken place in the online world over the past few years.”

In their letter, BASIS spells out a lot of these changes: “While our CSi presence has served the company well in the past, its pay-to-access structure is increasingly harder to justify with the Internet providing almost limitless content at a negligible incremental cost. People are moving away from CSi in significant numbers, making it a less effective platform from which to address our current and future customers. We believe that moving our existing support infrastructure from CSi to the Internet will give us better access to our customers and our customers better access to us.”

It goes on to explain how it will now use open platforms like email, Usenet and IRC instead of CompuServ’s proprietary and centralised applications. This letter was published around the same time HotWired reposted my Usenet article.

Contrary to Moglen’s trajectory of social media, the fact is that we already had distributed social media, we already abandoned the centrally controlled platforms such as CompuServe and AOL, and moved to the Internet, and despite this, our decentralized platforms have since been replaced, once again, with centralized social media. Why? Because Capitalism.

The Internet is a distributed social media platform. The classic internet platforms that existed before the commercialization of the web provided all the features of modern social media monopolies. Platforms like Usenet, email, IRC and finger allowed us to do everything we do now with Facebook and friends. We could post status updates, share pictures, send messages, etc. Yet, these platforms have been more or less abandoned. So the question we need to address is not so much how we can invent a distributed social platform, but how and why we started from a fully distributed social platform and replaced it with centralized social media monopolies.

The answer is quite simple. The early internet was not significantly capitalist funded. The change in application topology came along with commercialization, and this change is a consequence of the business models required by capitalist investors to capture profit. The business model of social media platforms is surveillance and behavioral control. The internet’s original protocols and architecture made surveillance and behavioral control more difficult. Once capital became the dominant source of financing, it directed investment toward centralized platforms, which are better at providing such surveillance and control, while the original platforms were starved of financing. The centralized platforms grew, and the decentralized platforms submerged beneath the rising tides of the capitalist web.

This is nothing new. This was the same business model that capital devised for media in general, such as network television. The customer of network television is not the viewer, rather the viewer is the product, the “audience commodity.” The real customers are the advertisers and lobby groups wanting to control the audience.

Network Television didn’t provide the surveillance part, so advertisers needed to employ market research and ratings firms such as Nielsen for that bit. This was a major advantage of social media. Richer data from better surveillance allowed for more effective behavioral control than ever before, using tracking, targeting, machine learning and behavioral retargeting, among the many techniques made possible by the deep pool of data companies like Facebook and Google have available.

This is not a choice that capitalists made, this is the only way that profit-driven organizations can provide a public good like a communication platform. Capitalist investors must capture profit or lose their capital. If their platforms can not capture profit, they vanish. The obstacle to decentralized social media is not that it has not been invented, but the profit motive itself. Thus reversing this trajectory back towards decentralization requires not so much technical initiative as political struggle.

So long as we maintain the social choice to provision our communication systems according to the profit motive, we will only get communications platforms that allow for the capture of profit. Free, open systems that neither surveil, nor control, nor exclude will not be funded, as they do not provide the mechanisms required to capture profit. These platforms are financed for the purpose of watching people and pushing them to behave in ways that benefit the operators of the platform and their real customers, the advertisers, and the industrial and political lobbies. The platforms exist to shape society according to the interests of these advertisers and lobbies.

Platforms like Facebook are worth billions precisely because of their capacity for surveillance and control.

As with the struggle for other public goods, such as education, child care, and health care, free communication platforms for the masses can only come from collective political struggle.

This is a political struggle, not a technical one.

Communist Semantic Drivel, The Good Parts. w/ @schneierblog

My Telekommunisten colleague Baruch Gottlieb wrote an excellent, considered response to Bruce Schneier’s recent essay, “The Internet is a Surveillance State.” While Baruch shares Schneier’s concerns about the increasing prevalence of surveillance on the internet, the focus of Baruch’s response is to investigate the political and economic origin of this. Baruch explains that although Schneier is certainly right about this state of affairs, he misses the mark on the political aspects of it.

Acknowledging the essay, Schneier posts a somewhat unusual reply:

“This Communist commentary seems to be mostly semantic drivel, but parts of it are interesting. The author doesn’t seem to have a problem with State surveillance, but he thinks the incentives that cause businesses to use the same tools should be revisited. This seems just as wrong-headed as the Libertarians who have no problem with corporations using surveillance tools, but don’t want governments to use them.”

Now, if Baruch wishes to comment on this, he will, so I’m not going to engage too much with either Schneier’s essay, or Baruch’s response to it; rather I would like to comment on what is implied in Schneier’s response above.

First of all, the second part of the comment, claiming that Gottlieb is somehow a champion of State surveillance, is an obvious straw man argument, which Schneier enthusiastically tears down with an irrelevant dismissal of “Libertarians.” A red herring.

And yet, remarkably, these are not the only logical fallacies in this short paragraph, for Bruce also deploys a tidy out-of-hand dismissal, using the term “semantic drivel,” and while not explicit, even the label “Communist” appears to imply guilt by association. So, a straw man, a red herring, an out-of-hand dismissal and perhaps an ad hominem, all in just a few sentences!

I don’t want to single Schneier out here, Bruce is a brilliant and insightful commentator and analyst. Who among us has not blustered on occasion when we’ve felt indignation?

What’s interesting to me is the source of the indignation.


What is causing Schneier to act out in this fashion? I suppose the answer lies in the fact that despite the fallacious dismissals, Schneier notes that “parts of it are interesting.” This communist semantic drivel has some good parts! Something struck a chord.

I’ve never met Bruce, but when smart people are overcome with indignation and bluster, it’s usually because they feel threatened. They feel unsure, and this feeling makes them defensive, makes them lash out.

I don’t believe that Bruce is threatened by Baruch’s response itself. Rather, there is something in it which challenges his world view, and his sense of place in the world.

Baruch’s essay recalls Schneier’s closing comments, as a point of departure: “Welcome to an Internet without privacy, and we’ve ended up here with hardly a fight.”

Baruch, citing EFF, The Open Rights Group, and others, notes that we certainly have fought! You can add many others to that list, including Schneier himself. We have fought! We have fought and lost.

In order to understand the reasons we have fought and lost, you need to address the structure of wealth and power in our capitalist society, which is what Baruch tries to do. I won’t expand on that here; it’s all there in his essay.

Schneier, perhaps, is not quite as ready to admit we’ve lost, that he himself has lost. This might explain the amnesia, refusing to remember the fight at all.

I hope his indignation is a sign his amnesia is passing, and he’ll soon be ready to confront the true cause of his disappointment with what the Internet has become. Once the initial revulsion and indignation passes, he may realize that the antagonist he is searching for is capitalism, not the laziness, stupidity or apathy of “we,” the masses, who supposedly neglected to fight, or the critical “semantics” of communists.

The problems he so expertly describes result from the profit motive itself.

Today: Octo stakeholder debriefing /// stammtisch

Octavia Allende Friedman has left Berlin, jet-setting on. Where to? Hong Kong? Milan? Havana? Perhaps only her personal biographer knows for sure.

Meanwhile, members of the Telekommunisten network will be present as usual, at Cafe Buchhandlung, to greet one and all and raise a drink to a successful launch of Octo P7C-1 at transmediale.

Many deserve a cheer for their amazing contributions to Octo.

Jeff Mann, chief inventor and head of pneumatics, creator of the P7C-1 prototype, contributed decades of research into pneumatics and art machines to his vision for the tubular system, and his master creation, the P7C-1 central operating station.

Jonas Frankki, Chief Designer, head of graphic identity, created the powerful branding and corporate identity that so perfectly expresses the numerous layers of the project.

Baruch Gottlieb, Chief Director, head of labour dramaturgy, for tirelessly directing the many facets of the project towards a coherent whole.

Diani Barreto, Chief Executive Performer, head of social representation, who brought the project persona to life online and at the festival.

And thanks to our Chief Communication Officer, Mike Pearce, who works towards bringing our often complex, perhaps even convoluted message, to the general public by adding simplicity and concision.

Behind the scenes, Rico Weise, Chief Operations Officer, manages the ever expanding administrative flow.

Not to mention our valiant team of ‘yellow-shirts’, the OCTO central and remote station volunteers, taking the smooth running and efficient delivery of OCTO P7C-1 to heart and ensuring we made a great demo for our current and future investors!

Please come and celebrate with us, share, retweet, all are welcome!

Cafe Buchhandlung is at Tucholskystr. 32

Here is a map: http://bit.ly/buchhandlung

9pm on.

Bitcoin and The Public Function of Money

I want to write a bit about the public function of money, especially as compared to the market function of money, in light of some of the recent discussion about Bitcoin.

Bitcoin is already a very useful technology because it allows transactions to take place without any central authority. This alone is significant. The technology behind it is also perhaps applicable in other areas, such as the Namecoin project to replace the centralized Domain Name System.

Does Bitcoin have the potential to replace Government fiat money? No. It doesn’t. It only has the potential to be one commodity form within the money economy.

Countless books and papers have described money; it is a very complex thing which serves many functions. Keith Hart has written about the “Two Sides of the Coin”: heads on one side, tails on the other. One way to interpret this might be to contrast the public function and the market function of money.

The origin of money is tribute. The source of money is the public, in whatever form, whether empire or democracy or something else, money is spent on public expenditure and demanded back as tribute. Whatever its commodity value, whether minted in gold, printed on paper or electrified as bits in a database, this sort of money has value because it can be used to fulfill tributary obligations, for example, it can be used to pay taxes. As the entire source of this money is government spending, the amount of this money is determined by the amount we want to provide on behalf of all as a society. This is the “Heads” side.

Not all economic activity is done for money. Much of it takes, and has historically taken, gift and kin-communal forms, where work and wealth are shared without specific prices for specific commodities, but rather on a basis of social trust and reciprocation. Markets emerge as economic activity extends beyond communal and neighbourly forms; markets extend the social beyond the kin-communal, and along with such social distance come more transient relationships that can not rest on trust and reciprocation, and thus must be encompassed by spot transactions, and as a result specific prices for specific commodities and specific price relationships between commodities. With these transient relationships comes money. But this sort of money is different.

Commodities can also be traded directly, even if their relative worth is counted in “Heads” money, and trade can also be done on-account, by credit, the amount of which is not limited to the physical amount of “Heads” money in circulation. In the wider economy, money is endogenous, the amount of money circulating in the economy is not a function of any monetary base, but rather is a function of the amount of things we want to make and do for each other. More specifically, the amount we want to make and do for each other for money. This is the “Tails” side.

This is vertical money and horizontal money. Vertical money is created and destroyed by the public, horizontal money expands and contracts as a result of the economic activity of private individuals and their incorporated forms.

Money that has a commodity base, i.e. Gold, is not completely rooted in a particular public form, since its value can cross international borders.

This is where Bitcoin, essentially a digital specie, emerges as a new and rather unique form of money. Its built-in cryptographic limits on supply make it a virtual commodity form of money, fixed and “hard”, like Gold, yet digital and transferable electronically across global telecommunications networks. As such, it has attractive features as both a means of exchange and a store of value. Yet, while it certainly is useful on the “Tails” side of money, as one of the various kinds of assets circulating in the global market economy, it does not serve the public function well. There is a reason that modern public forms of money are not commodities, why modern economies use “fiat” money, money that is not based in or guaranteed by conversion to any sort of commodity.
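Those “built-in cryptographic limits on supply” come from a halving schedule hard-coded into the protocol: the block subsidy starts at 50 BTC and is cut in half (with integer truncation, counted in satoshis) every 210,000 blocks, which bounds the total supply just under 21 million BTC. A minimal sketch of that arithmetic:

```python
SATOSHI = 10 ** 8           # satoshis per bitcoin
HALVING_INTERVAL = 210_000  # blocks between subsidy halvings

def total_supply() -> int:
    """Sum the block subsidies over all halving eras, in satoshis.

    The subsidy starts at 50 BTC and halves by integer division
    every 210,000 blocks, until it truncates to zero.
    """
    subsidy = 50 * SATOSHI
    total = 0
    while subsidy > 0:
        total += HALVING_INTERVAL * subsidy
        subsidy //= 2
    return total

print(total_supply() / SATOSHI)  # just under 21,000,000 BTC
```

The hard cap is thus not a policy that anyone administers; it is a consequence of the issuance schedule itself, which is exactly what makes Bitcoin commodity-like rather than fiat-like.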

If the public restricts itself to commodity-money for public expenditure, this means that what it spends must be limited to what it taxes plus what it borrows, since commodities have a fixed available supply. And though many ignorant or simply disingenuous commentators, such as promoters of austerity, present this to be the case even now, in a modern monetary economy based upon fiat money issued by the public for public purpose, this is factually not the case.

The thing about public money is that we can have as much of it as we want to have. How much we spend relative to how much we tax is a public policy choice, and the right-wing dogma that the appropriate choice is for the budget to be balanced, for taxes to be equal to spending, is universally understood to be false, even among the most celebrated right-wing economists. In his 1948 article “A Monetary and Fiscal Framework for Economic Stability”, “Chicago Boys” patriarch Milton Friedman proposed a counter-cyclical policy, where government spending would be increased beyond taxation during economic downturns, similar to Abba Lerner’s “Functional Finance” which is often referred to as “Keynesian” economic policy. Whatever their ideological stripes, there is little disagreement among economists that to the degree that public budgets need to be balanced, they must be balanced relative to economic cycles and sectoral balances and not merely between annual public spending and taxation.

The balance between spending and taxes is simply the balance of the public “Heads” side of the coin, always in counter-balance with the private “Tails” side of the coin, as expressed by the activity of private interests in the global market.

It is no secret that the national State form is unsatisfactory. Not only is it burdened by its aristocratic roots, and not only is it corrupted by the fact that its modern form is largely captured by the international corporate elite, but the State is clearly unsatisfactory for modern publics as a result of the fact that static territorial forms are increasingly ineffective and inappropriate structures to serve global, distributed communities.

The public form has to evolve from the state form to the networked form, but for that to happen, new, networked public forms will need to emerge that are able to take over the socially necessary public functions. Including the management of forms of public money.

The critical feature required of public money is that we can socially determine how much of it there is, and how much of it we want to apply to public purpose. We need ways to create and destroy public money so that we can have a counter-balance to private activity, to manage cycles, to counter-balance economic sectors, and to socially pursue public objectives, such as health, education, and justice.

Thus, Bitcoin’s innovation in terms of creating a networked form of commodity money is not useful in creating networked forms of public money, and as a result it does not create a way for networked public forms to replace the current State forms.

I’ll be at Stammtisch this evening at 9pm, please come if you’re in Berlin, if not, R15N continues at Mal au Pixel in Paris, you can join the network by calling +33 181 97 97 11

Eternal September // @A_MAZE_Festival

Last month was a long and busy month that started in Canada and ended in South Africa.

Along the way, SecuShare’s {1} Daniel Reusche and I agitated for decentralized social platforms at Berlin’s Campus Party {2}, I presented the first Octo demo {3} at the latest reSource transmedial culture {4} event with Jeff Mann, Jonas Frankki and Baruch Gottlieb, also, Baruch, Jonas and I built the Miscommunication Station {4} as an online project of the Abandon Normal Devices Festival.

Finally, Baruch and I traveled to the A MAZE / INTERACT festival {5} to present and represent iMine {6} and R15N {7}.

Now I’m back in Berlin and looking forward to tonight’s Stammtisch. And it’s September.

Tuesday, September 6944, 1993 to be exact {8}.

Day 6,944: 19 years and 3 days after the Eternal September began.
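The “September day” dates used in these posts are plain date arithmetic against 1 September 1993. A minimal sketch, assuming the convention that 1993-09-01 is day 1 (some counters treat it as day 0, which shifts every result by one day):

```python
from datetime import date, timedelta

# Day 1 of the September that never ends (assumed convention).
ETERNAL_SEPTEMBER_EPOCH = date(1993, 9, 1)

def eternal_september_date(day_number: int) -> date:
    """Convert an Eternal September day number to a Gregorian date."""
    return ETERNAL_SEPTEMBER_EPOCH + timedelta(days=day_number - 1)

def eternal_september_day(d: date) -> int:
    """Convert a Gregorian date back to its Eternal September day number."""
    return (d - ETERNAL_SEPTEMBER_EPOCH).days + 1

# The Re:publica talk mentioned earlier, "the 6,820th day of September, 1993":
print(eternal_september_date(6820))  # 2012-05-03, during re:publica 2012
```

Under this convention the round trip is exact in both directions, so any of the September day numbers in these posts can be checked the same way.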

A MAZE was fantastic, and the Braamfontein district of Johannesburg where the festival took place was an incredible place, not only to enjoy a great party in a really unbelievable community, but also to reflect on where we are now, nearly twenty years since the commercialization of the internet began to deliver a year-round flow of “newbies” to the Internet 1.0 that nobody yet called “the web”.

The Jargon File defines “The September that never ends” as “All time since September 1993. One of the seasonal rhythms of the Usenet used to be the annual September influx of clueless newbies who, lacking any sense of netiquette, made a general nuisance of themselves. This coincided with people starting college, getting their first internet accounts, and plunging in without bothering to learn what was acceptable. These relatively small drafts of newbies could be assimilated within a few months. But in September 1993, AOL users became able to post to Usenet, nearly overwhelming the old-timers’ capacity to acculturate them; to those who nostalgically recall the period before, this triggered an inexorable decline in the quality of discussions on newsgroups. Syn. eternal September.”

Once the internet was available to the general public, outside of the research/education/NGO world that had inhabited it before September, the large numbers of users arriving on the untamed shores of early cyberspace “nearly overwhelmed the old-timers’ capacity to acculturate them.”

Even in Africa, you’d have to go pretty far out of your way to find a community where it’s not September yet. Internet access is certainly not as ubiquitous, reliable or fast as it is in “the West,” but the African people do use the Internet, and are part of its culture.

The Jargon File mentions “Netiquette,” a quaint term from the innocent times of net.culture, yet Netiquette was not simply a way of fitting in, like table manners at an exclusive dinner party. The cultural context of that Internet that made acculturation necessary was its relative openness and lack of stratification.

Netiquette was required because the network had relatively few constraints built into it; the constraints needed to be cultural for the system to work. There was much more to this culture than teaching new users how to not abuse resources or make a “general nuisance of themselves.” Netiquette was not so much about online manners, it was rather about how to share. Starting from the shared network resources, sharing was the core of the culture, which not only embraced free software and promoted free communications, but generally resented barriers to free exchange, including barriers required to protect property rights and any business models based on controlling information flow.

As dramatic as the influx of new users was to the “old-timers” net.culture, the influx of capital investment and its conflicting property interests quickly emerged as an existential threat to the basis of the culture. Net.culture required a shared internet, where the network itself and most of the information on it was held in common. Capital required control, constraints and defined property in order to earn returns on investment. Lines in the sand were drawn; the primitive communism of the pre-September Internet was over. The Eternal September began, and along with it, the stratification of the internet began.

Rather than embracing the free, open platforms where net.culture was born, like Usenet, email, IRC, etc, Capital embraced the Web. Not as the interlinked, hypermedia, world-wide-distributed publishing platform it was intended to be, but as a client-server private communications platform where users’ interactions were mediated by the platforms’ operators. The flowering of “Web 2.0” was Capital’s re-engineering of the web into an internet accessible version of the online services they were building all along, such as the very platforms whose mass user bases were the influx that started the Eternal September. CompuServe and AOL most notable among them.

The Eternal September started when these Online Services allowed their users to access Internet services such as Usenet and email. Web 2.0 instead replaced Usenet and email with social platforms embedded in private, centralized web-based services that look and work very much like the old Online Services.

Scratch off the Facebook logo, and you’ll find the AOL logo underneath.

The internet is no longer an open free-for-all where old-timers acculturate new-comers into a community of co-operation and sharing. It is a stratified place. Privileged users have preferential access: broadband at home, servers online, control of their own “domain,” the ability to run their own mail and web services and to access the internet as a whole, including the old platforms such as Usenet and IRC. Below them are new users who may have broadband at home, but run no services of their own and need to use online services like Facebook or Gmail to communicate at all, subject to the terms of use of those companies. Then come users who have no broadband at home, and rely on internet cafes and libraries. And at the lowest tier are users who can only access the mobile internet, on locked-down iPhones and other smart phones, where app stores control the available apps users can use, and the apps tightly control the users that use them. And of course, each bit of data is paid for from the users’ precious mobile airtime.

As the African people finally cross the digital divide, the once-vibrant cyberspace they arrive in has already been colonized, enclosed and captured by the profit motive. The culture of sharing and co-operation has been destroyed by the terms of service of online platforms, by copyright lobbies pushing for greater and greater restrictions and by governments that create legislation to protect the interests of property and “security” against the interests of sharing.

The culture of co-operation and sharing has been replaced by a culture of surveillance and control.

We once believed that perhaps getting the Africans onto our Internet would help them in their struggles, now perhaps we can hope their capacity for struggle will allow us to find ways to make the Internet a transformational force again. Yet, like the urban centers of cities like Johannesburg, once access is finally won, the centers have been abandoned. The common squares and open markets have already been deserted in favour of protected suburbs and gated communities. Access is allowed not to extend freedom and welcome, but to facilitate exploitation.

If the modern Internet can’t be the liberating force early net.culture believed it could be, maybe we can hope that as the African people come online, their experience in working within environments where inequality, repressions and privilege rule will bring a transformational consciousness to us. They might be our last hope.

If you’re in Berlin this evening, join us at Cafe Buchhandlung {9}, while we reminisce and reflect on the unforgettable experience we had in Johannesburg at A MAZE / INTERACT. I’ll be there around 9pm.

{1} http://secushare.org
{2} http://www.youtube.com/watch?v=GW_imx0z3LY
{3} http://telekommunisten.net/octo/
{4} http://project.arnolfini.org.uk/miscommunication-station
{5} http://www.amaze-festival.de
{6} http://i-mine.org
{7} http://r15n.net
{8} http://www.eternal-september.org/?language=en
{9} http://bit.ly/buchhandlung