Responsibilization & Collective Social Aspirations

In her informative and entertaining talks, Seda Gürses {1} often refers to the process of “responsibilization.” Speaking about privacy, she argues that new, privately operated online platforms are transferring the responsibility for providing safety and privacy to the users of these platforms, where users would previously have had the expectation that operators and regulators would bear this responsibility. Responsibilization is the transfer of responsibility from being something social and shared to being something held solely by individuals.

Looking just at communications media, privacy in the postal system was legislated, and both operators and users were held responsible for complying with socially imposed standards.

Yet on modern online platforms, despite their increasing importance, users are for the most part governed by the site operators’ user agreements, and the responsibility for understanding a site’s privacy and safety implications, including its often complex settings and options, is held by the user alone.

This raises some very important questions.

Do we have a collective right to social aspirations? Do we have a collective right to work towards socially determined outcomes?

Should people simply get the privacy and safety they deserve, as determined by their own behaviour? Or do we want a society where people can expect their privacy and safety to be socially determined?

Worse still, there is a significant moral hazard at work. Even when legislation, regulation, or simply user outcry seeks to improve the privacy policies of a platform, the operator has a significant incentive to resist, foot-drag, or outright ignore such expectations. The business models of most operators are based on monetizing user interaction and user data, and therefore, whatever the regulatory environment or the users’ desires, so long as it’s up to the operators to implement privacy, we’re leaving the fox in charge of the hen house, as the saying goes.

Can we realistically expect private platforms to enthusiastically place social concerns above their shareholders’ profit interests?

And even if strong regulations, vigilant user advocacy groups, or some other incentives can keep the fox from making dinner of the hens, is this watchdog state of affairs, with social watchdogs watching profit watchdogs and fighting over every decision, really the best way to manage communications platforms?

Ours is often called the communications age, and the development of global digital communications networks is often cited as one of the most important developments in human history, comparable to electricity or even agriculture.

Yet the commercial Internet was born in a neoliberal era, an era in which we are no longer allowed to have collective social aspirations, no longer allowed to want social outcomes, or even to work towards them. We are allowed only to accept outcomes as facts, to believe that outcomes are determined by some sort of exogenous logic, be it the market, the economy, politics, or nature itself. This often comes hand in hand with blaming the victims for their misfortune: if they were faster, stronger, smarter, or even luckier, they would have done better with the whole privacy and safety thing, or, for that matter, the whole wealth and power thing.

Yet being white, male, and rich is often more significant than being strong, smart, or talented. By accepting a world where we get what we deserve, where outcomes are the result of individual merit, we embrace a delusion, a make-believe land where power, privilege, and wealth do not exist.

This delusion is killing the Internet.

Too often, solutions to the social issues of communications are presented as individual rather than collective. Too often, privacy concerns about online platforms are met with “Well, just don’t use Facebook if you don’t like it,” or, even more unrealistically, “You should make something better then. If there were a market demand for something better, somebody would have made it, so obviously people don’t care about your issues.”

The fact is that “making something better” requires investment, requires wealth, which in most cases means capital. Yet capitalists must invest in ways that capture profit, which brings up the moral hazard again. Don’t trust this fox with your hens? Find a different fox!

More and more, it feels like the biggest challenge of our age is making people believe that we have a right to collectively work towards our social aspirations, that we can and must work together to achieve collective social outcomes. Not only for privacy and safety online, but to create the kind of society we want more broadly: a society where wealth, power, and responsibility are more widely shared.

{1} http://networkcultures.org/wpmu/unlikeus/2012/03/09/seda-gurses-and-privacy-in-online-social-networks/

3 comments

  1. Joel Kaartinen

    The alternative to Facebook already exists. It’s still in development and not very widespread so far, but we do actually have an alternative. It’s called Diaspora, an open-source social network platform anyone can install on a computer, which can automatically join a network of other servers running the same platform.

  2. Dmytri

    Hi Joel, I have extensively covered Diaspora and the reasons why it is not a viable alternative to facebook, etc. The reasons are political, not technological. You may be interested in these:

    http://www.dmytri.info/thimbl-unlike-us-a-pair-of-inconvenient-paradoxes/
    http://networkcultures.org/wpmu/unlikeus/2012/03/09/dmytri-kleiner-the-responses-of-thimbl-r15n-and-deadswap-to-social-media-platforms/
    https://www.youtube.com/watch?v=Y3h46EbqhPo
    https://www.youtube.com/watch?v=GW_imx0z3LY

    Etc….

  3. Christopher Aquilino

    An analogy: Our relationship (with the elite) mirrors domestic abuse. But we deny the bullying, domination, and a narcissism enabling sociopath-empath relationship.
