“Glocal Search” done :)

My project “Glocal Search. Search technology at the intersection of global capitalism and local socio-political cultures” is finished. YAY. The project was funded by the Anniversary Fund of the Austrian National Bank (OeNB, project number 14702) and carried out at the Institute of Technology Assessment (ITA). It started in March 2012 and ended in September 2015. Its main aim was to investigate how search technology is shaped and governed at the intersection of global and local dynamics. During the project I decided to take the reform of the EU data protection legislation as a case study.

Due to large-scale political events, most importantly the NSA affair, the European Union became active in trying to unify its data protection legislation and to develop common data protection standards. Why this empowerment rhetoric does not easily translate into political practice, and where the problems lie, turned out to be a central part of my analysis.

A short summary of the project can be found below. The final report can be downloaded here. A couple of publications related to the project can be found in the publication list. Two more publications are currently under review with Social Studies of Science and Science, Technology & Human Values. A follow-up project has been submitted to the Austrian Science Fund (FWF) and will hopefully be funded! This project would enable me to draw together and cross-analyze results from my past, present, and future work to develop the “algorithmic imaginaries” driving the development and governance of search engines in global, European, and local contexts. This analysis is supposed to result in a habilitation and a peer-reviewed book. Given the current funding situation in Austria, though, I have to KEEP MY FINGERS CROSSED!!!

Project summary:

The search engine Google has a market share of more than 90 per cent in most European countries. It is developed in Silicon Valley and thus grows out of a very specific economic and innovation culture that has been termed the ‘Californian Ideology’. This label stands for the combination of the free-wheeling spirit of the hippies and the entrepreneurial zeal of the yuppies. The result of this combination is a search engine that conquered large parts of the world because of its state-of-the-art search algorithm, but also because of its clever business model.

Google transforms personal data into user profiles which are used to display personalized advertising. These profiles are fed with data from different Google services – ranging from simple web searches and map services to social media and ‘intelligent’ artefacts such as Google Glass. Especially since Snowden’s revelations, these business practices have been discussed critically. In Europe, Google is confronted with a number of accusations including the abuse of its quasi-monopoly, data protection violations and collaboration with secret services.
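
To make that mechanism a bit more tangible, here is a deliberately toy sketch in Python (nothing like Google’s actual pipeline – all service names, fields and the ad inventory are invented) of how signals from different services could be merged into one profile that is then matched against ads:

```python
from collections import defaultdict

# Hypothetical event logs from two separate services, keyed by the same account ID.
search_log = [{"user": "u42", "query": "hiking boots"},
              {"user": "u42", "query": "alpine trails austria"}]
maps_log = [{"user": "u42", "place": "Innsbruck"}]

def build_profile(user_id, *logs):
    """Collect all signals about one user across services into a single profile."""
    profile = defaultdict(list)
    for log in logs:
        for event in log:
            if event["user"] == user_id:
                for key, value in event.items():
                    if key != "user":
                        profile[key].append(value)
    return dict(profile)

def pick_ad(profile):
    """Match the aggregated interests against a toy ad inventory."""
    ads = {"outdoor gear": ["hiking", "alpine", "trail"],
           "city hotels": ["hotel", "innsbruck"]}
    text = " ".join(value for values in profile.values() for value in values).lower()
    scores = {ad: sum(word in text for word in keywords) for ad, keywords in ads.items()}
    return max(scores, key=scores.get)

profile = build_profile("u42", search_log, maps_log)
print(profile)           # {'query': ['hiking boots', 'alpine trails austria'], 'place': ['Innsbruck']}
print(pick_ad(profile))  # outdoor gear
```

The point of the sketch is simply that the value lies in the linkage: each log on its own says little, while the combined profile is what sells the ad.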

In the course of this growing criticism, the European Union announced a comprehensive data protection reform. In January 2012, the first draft of the ‘general data protection regulation’ was presented. It is supposed to be directly binding for all member states. The main goal of the data protection reform is to force Google & Co. to respect European values and legislation – e.g. with regulations requiring explicit consent to data transfer, penalties of up to 5 per cent of annual turnover in case of violations of the law, as well as common law enforcement with coordinated data protection authorities. At the same time, the European Court of Justice (ECJ) passed a remarkable judgement with the ‘right to be forgotten’: Based on the current EU Data Protection Directive, the ECJ is forcing Google to delist from its search results illegitimate content relating to a person if requested by the individual. In contrast to the ECJ, however, not all member states equally support strict data protection standards. In fact, Austria is one of the few European member states that try to hold on to their strict data protection standards and persistently fight for them in both the European Parliament and the Council of Ministers.
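
To illustrate what such delisting amounts to in practice, here is a minimal sketch (purely illustrative – the register and function names are my own assumptions, not Google’s implementation): links are suppressed only for searches on the person’s name, while the content itself stays online and remains findable via other queries:

```python
# Hypothetical delisting register: person's name -> URLs to suppress for name searches.
delisting_requests = {
    "jane doe": {"http://example.org/old-debt-notice"},
}

def apply_right_to_be_forgotten(query, results):
    """Drop delisted URLs, but only when the query is a search for that person's name."""
    blocked = delisting_requests.get(query.strip().lower(), set())
    return [url for url in results if url not in blocked]

results = ["http://example.org/old-debt-notice", "http://example.org/unrelated-page"]
print(apply_right_to_be_forgotten("Jane Doe", results))     # delisted URL removed
print(apply_right_to_be_forgotten("debt notice", results))  # untouched for other queries
```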

In view of these developments, the project ‘Glocal Search’ posed the following research questions: How are search engines imagined in Europe and how is Europe imagined in the context of search engines? What visions and values guide European search engine politics and how are they translated into political practice? How does the European imaginary play out in the Austrian context and how do national disparities contribute to the making and unmaking of Europe? What actors, interests, and strategies are central in negotiations of the EU data protection reform? And what role does Austria play in European search engine politics? To answer these questions, discourse analyses of European policy documents and Austrian media as well as 18 qualitative interviews with experts were conducted. Theoretically, concepts from critical Internet research, critical theory, and science and technology studies were used.

Results

To theorize globally operating search engines and their relation to the information economy, I extended my previous work on “algorithmic ideology” by drawing on concepts from critical theory. I argued that corporate search engines clearly contribute to materializing and fostering capitalist principles. Even though alternative search engines like DuckDuckGo aim at providing users with different styles of search, the capitalist ideology appears to be hegemonic since not all ideologies are equal in terms of exercising their power. With the globalization of search technologies, Google most importantly, not only technical tools are globalized but also societal norms, values and ideas that are increasingly at odds with local visions and value structures. To grasp the tensions between globally operating technologies and local socio-political cultures, I used the EU data protection reform as a case study and the concept of the “sociotechnical imaginary” (Jasanoff and Kim 2009) to investigate how search engines and Europe are co-produced and what role Austria can play in European search engine politics.

The empirical analysis of EU policy documents clearly showed the emergence of a European search engine imaginary over the past years – a vision in which fundamental rights such as the right to privacy, data protection and informational self-determination represent key aspects of the European identity. At first, policy documents adopted the US-American rhetoric of internet technologies as drivers of economic growth. In view of recent developments, however, they now increasingly describe Google & Co. as intruders into our privacy and as a threat to human rights. The data protection reform is being framed as a central tool to commit North American IT companies to protecting European visions and values.

A glance at the Austrian media enables us to see how fragile the European identity is when it comes to data protection. Following initial euphoria regarding the data protection reform, conflicts of interest soon came to the fore. The ‘battle’ (Falter 28/13) or ‘fight’ (Österreich 09/2012) is fought on multiple ‘fronts’ (Presse 10/2013). On the one hand, the media constructs a ‘divide’ (Presse 01/2011) between the USA and Europe – in this context Europe is described as consistently fundamental rights-friendly. On the other hand, the media discusses inner-European conflicts where practical negotiations of the data protection reform are concerned. Austria, as a data protection-friendly country, is seen as differing from countries such as Ireland and Great Britain. The latter interpret a strong data protection law as a threat to their economic success. As a result, politics and the media deploy a rhetoric of empowerment when Europe addresses other countries (the USA). Nevertheless, this rhetoric disintegrates into a chorus of very different voices and viewpoints when Europe is confronted with itself.

Interviews with experts finally showed that search engine policy has largely moved away from traditional policy-making. Search engine politics clearly mirrors the shift from government to governance, as described in the literature. The majority of interview partners describe US-American lobby organizations in particular, but also civil society groups, as central actors in the negotiations of the data protection reform. They all employ different strategies to inscribe their interests into the general data protection regulation, but their resources and possibilities are distributed very unequally. Moreover, companies such as Google ‘actually set data protection standards’, as one interviewee put it. This shows that European search engine politics not only reflects geopolitical power relations, but also hegemonic forces that go far beyond Europe’s borders.

Europe versus Google & Co.?

My project “Glocal Search” (OeNB) is coming to an end soon. That is why there will be a closing event on 23 April 2015 on search engine politics, European visions and values, and conflicts of interest (in a nutshell). The event will take place at 6 p.m. at the ÖAW. Here is the invitation text from the ITA website:

Is it Europe versus Google & Co.? Or can search engines be regulated? Technology researcher Astrid Mager from the ITA discusses her research findings with representatives from data protection, consumer protection and internet service providers.

Especially in Europe, Google is confronted with accusations ranging from the abuse of its quasi-monopoly to collaboration with secret services. In this debate, the right to privacy, data protection and informational self-determination are invoked as central elements of European identity – and their violation is critically discussed.

In her research project “Glocal Search”, Astrid Mager from the ÖAW’s Institute of Technology Assessment (ITA) has examined the visions and values of European search engine politics, their (difficult) translation into political action, and their relationship to Austrian discourses. The presentation of the results and the subsequent discussion will address how Europe could succeed in putting its canon of values into practice, how globally operating search engines can be regulated, and what role Austria plays in European search engine politics.

Panel participants:

Astrid Mager, Institute of Technology Assessment (ITA)
Andreas Krisch, European Digital Rights (EDRi), Verein für Internetbenutzer Österreich (VIBE)
Gerhard Kunnert, Federal Chancellery (BKA), Dept. V/7, Austrian representative in the EU Council working party on the General Data Protection Regulation
Maximilian Schubert, Internet Service Providers Austria (ISPA)
Daniela Zimmer, Consumer Policy Department, Vienna Chamber of Labour (AK Wien)

Moderation: Walter Peissl, Institute of Technology Assessment (ITA)

Details:

Date: 23 April 2015, 6 p.m., followed by an informal exchange and a buffet

Venue: Austrian Academy of Sciences – Clubraum
Dr. Ignaz Seipel-Platz 2, 1010 Vienna

Please register via the ITA website (at the very bottom). Thank you!

information society @ the crossroads, 3-7 june 2015

We’re happy that we got almost 20 papers for our panel “ICTs and power relations. Present dilemmas and future perspectives” (a panel I co-organize with my ITA colleagues Doris Allhutter & Stefan Strauss). Thanks to the conference organizers we’ll be able to put together two (maybe even three) sessions! YAY. We’ll go through all the abstracts in the next couple of weeks. I’m already looking forward to that!

That’s the abstract I submitted. I hope it will make it through our tough review process! 😉 Comments & thoughts are highly welcome!!

Algorithmic Imaginaries. Visions and values in the co-production of search engine politics and Europe

Information and communication technologies (ICTs) have been described as transcending and transforming national borders, political regimes, and power relations. They have been envisioned as creating a global network society with hubs and links rather than cities and peripheries; “technological zones” (Barry 2006) rather than political territories. This reordering of distance and space was described as going hand in hand with processes of reordering social life. Such deep entanglements of technological and social arrangements have been conceptualized as processes of co-production (Jasanoff 2005). While this “sociotechnical imaginary of the internet” (Felt 2014) was framed as all-encompassing and world-spanning at first, it is now increasingly seen as conflicting with the diversity of cultural, political, and social values on the ground. Accordingly, alternative interpretations of ICTs and their multiple socio-political implications have emerged over the past years.


Especially in the European context, tensions between US-American internet services – Google and its “algorithmic ideology” (Mager 2012, 2014) most importantly – and European visions and values may be observed. After the NSA scandal, critical voices have become louder and louder, both in the policy and the public arena. All of a sudden, issues like privacy, data protection, informational self-determination, and the right to be forgotten have been conceptualized as core European values (even though European secret services heavily surveilled their citizens too – arguably more intensely than the NSA, in the British case). This shows that there is a European voice forming that aims at distancing and emancipating Europe from US-American tech companies and their business models based on user-targeted advertising and large-scale citizen surveillance. However, it also shows that there are tensions running through European countries and their national interests, identities and ideologies. One reason is that Europe is neither a clear-cut, homogeneous entity, nor fixed and stable. In the context of biotechnology policy, Jasanoff (2005: 10) argues: “Europe in particular is a multiply imagined community in the minds of the many actors who are struggling to institutionalize their particular versions of Europe, and how far national specificities should become submerged in a single European nationhood – economically, politically, ethically – remains far from settled.”


So how is Europe imagined in the context of search engine politics and how are search engines imagined in Europe? And how does the European imaginary relate to national visions and values of search engines? These are the main questions to be answered in the presented analysis, taking Austria as a case study. Analyzing European policy discourses, the study examines how search engines – Google in particular – are imagined in the European policy context, what visions and values guide search engine politics, and how Europe is constructed in these narratives. Analyzing Austrian media debates, the project investigates how the European imaginary is translated into and transformed in the Austrian context, how Google is portrayed in these debates, and what national specificities shape the narratives. A particular focus is put on the ongoing negotiation of the European data protection reform since this is a central arena where search engines (and other data-processing technologies like social media) and the European identity are co-constructed these days, but also a site where European disparities, national interests, and local value systems are at stake. Using a discourse-analytical approach and the concept of “sociotechnical imaginaries” (Jasanoff and Kim 2009), this study will give insights into the way ICTs and Europe are co-produced, but also into the tensions and contradictions that appear between the European imaginary and national interests. While European policy documents try to speak with one voice, the Austrian media show more nuanced stories of power relations, struggles, and friction that open up the view on the fragility of the European identity when it comes to sensitive, value-laden areas like search engine politics.


Google is a particularly interesting technology in this respect since it was one of the first US-American tech companies that came under scrutiny in the European context. In 2010 Google tried to launch its Street View service on the European market. Rather than euphorically embracing the service, however, European citizens, NGOs, and policy makers took to the barricades and started protesting against Google cars in various cities and regions. An Austrian farmer, for example, sparked media publicity by attacking a Google car with a pickaxe. After Google’s illegal scraping of open WiFi data, Google cars were banned from Austrian streets for some time (not surprisingly, the service was continued later on after Google accepted some restrictions). While the Street View debate was the first one that had values like privacy and data protection at its core, the issue was handled nationally back then. Every European country took different actions according to its stance towards the service (varying from unrestricted acceptance in some countries to (initial) blockage in others).


Despite these differences among European countries (or also because of them), a European vision – a European “algorithmic imaginary” – started to form in the aftermath of the Street View debate. While it was only a silent voice at first, it grew into a stronger message that took its written form in the first draft of the European data protection reform, launched in early 2012. Since then, various actors have tried to inscribe their interests into the legislative text – most prominently the US-American IT industry, but also European NGOs and national stakeholders; some of them started lobbying even before the European Commission presented its very first draft. These heavy negotiations show how important this piece of text is for multinational actors doing business on the European market. Even though the reform is far from being finished, the “right to be forgotten” judgment that forced Google to obey European law may be seen as a first step towards putting the European imaginary into practice. The Austrian media frames this case as a success in showing US-American IT companies like Google that doing business on the European market requires obeying European law. Looking more closely and integrating national visions and values into the analysis, however, indicates how fragile the European imaginary still is, and what tensions and contradictions it faces when being translated into national and local contexts. It shows that Europe tries to speak with a strong voice when addressing other countries and continents, the US most importantly, but how weak its voice becomes when it is confronted with itself. The ongoing negotiation of the data protection reform offers particularly rich material to trace this dynamic. It is an arena where search engines, business models, and algorithmic logics are negotiated, but also an arena where Europe is forming and falling apart – both at the same time.


So if our information society is at the crossroads, as stated in the conference abstract, we need to understand the tight entanglements between technological and social arrangements before taking the next junction. Only by (re)grounding global ICTs in specific socio-political contexts may alternative routes be taken towards more democratic, more sustainable, and more culturally sensitive network technologies (whether this requires stricter regulation of US-American technologies or the development of alternative “European” services, or both, remains to be seen). What we may learn from the geopolitics of search engines in terms of global power relations, European identity construction, and concepts of nationhood will finally be discussed.


Acknowledgment
The research presented in this paper is supported by the Anniversary Fund of the Austrian National Bank (OeNB), project number 14702.

References

Barry, A. (2006) Technological Zones. European Journal of Social Theory 9(2): 239-253.

Felt, U. (forthcoming) Sociotechnical imaginaries of “the internet”, digital health information and the making of citizen patients, to appear in Hilgartner S., Miller, C., and Hagendijk, R.: Science and Democracy: Making Knowledge and Making Power in the Biosciences and Beyond, London/ New York: Routledge.

Jasanoff, S. (2005) Designs on Nature. Science and Democracy in Europe and the United States, Oxfordshire: Princeton University Press.

Jasanoff, S. and S. Kim (2009) Containing the Atom: Sociotechnical Imaginaries and Nuclear Power in the United States and South Korea, Minerva 47(2): 119-146.

Mager, A. (2012) Algorithmic Ideology. How capitalist society shapes search engines, Information, Communication & Society 15(5): 769-787.

Mager, A. (2014) Defining Algorithmic Ideology: Using Ideology Critique to Scrutinize Corporate Search Engines, Triple C. Cognition, Communication and Co-Operation 12(1).

ICTs and power relations

Together with my ITA colleagues Doris Allhutter and Stefan Strauss I put together a panel for the conference “THE INFORMATION SOCIETY AT THE CROSSROADS”, which will take place in Vienna next year (3-7 June). The title of our panel is “ICTs and power relations. Present dilemmas & future perspectives”. The deadline is 27 February.

I hope to see you there! Here’s the abstract:

The increasing presence of ICTs in a multitude of societal contexts alters the relations between social, political, technical, legal, and economic arenas. As cross-sectional technologies, ICTs enter and link different societal domains, often entailing a number of tensions and controversies, e.g. due to conflicting interests, hegemonic discourses, socio-political cultures and practices. Novel forms of interaction are accompanied by increasing complexity, diversity and overlaps between public and private spheres. The capacity of ICTs as a political tool is multidimensional: they can boost civil society participation (e.g. the Arab Spring) as well as amplify mass surveillance and privacy intrusion (e.g. as revealed by Snowden).

This panel is interested in the manifold interplay between societal power structures and ICTs. In line with the umbrella issue “at the crossroads”, a particular focus lies on contributions that present controversies, dilemmas, and imaginary futures that open up paths towards socio-technical alternatives.

Target groups

The panel embraces different scientific disciplines and welcomes theoretical as well as empirical contributions bridging different perspectives (e.g. computing and philosophy, technology assessment and science and technology studies, social, political, economic and technoscience).

Subjects and scope

Topics of interest thus include but are not limited to:

  • Values in design and responsible technology innovation
  • Socio-technical alternatives (e.g. peer production, commons, free software, etc.)
  • ICT-related political participation, activism and policy making
  • Norms, standards and hegemonies in ICT infrastructures, software, algorithms and code
  • ICT commercialization and ideologies
  • ICT at the intersection of global, European and local contexts
  • Co-emergence of ICTs with gender, sex, age, class, race, dis/ability (social sorting, standardization, etc.)
  • Emerging privacy and security challenges (privacy-by-design, encryption, EU data protection reform, etc.)
  • Technical and regulatory oversight and limits of surveillance technologies and practices

Important Dates

Submission deadline: 27 February 2015
Notification of acceptance: 20 March 2015

more to read

What a nice start to the day: the sun is out (after days of rain), a nice cup of coffee & a package on my desk from De Gruyter. It’s the brand-new edited volume on search engines by Birgit Stark, Pascal Jürgens and colleagues.

I’m really looking forward to reading it since it not only contains contributions on search engine use and the filter bubble, but also articles on the regulation of search engines and alternative tools – issues I’m dealing with in my current project too. Thank you, Birgit Stark, Pascal Jürgens et al., for putting together such a great volume!

politics of icts

For all STS people out there! My colleague Doris Allhutter and I are organizing a panel for the STS conference “Critical Issues in Science and Technology Studies” taking place in Graz (Austria) next year (5-6 May 2014). Our session focuses on the “Politics of ICTs” since we think that’s an important issue for STS scholars! Now we’re hoping for interesting papers concerned with the tight entanglements between ICTs and politics, socio-political cultures, practices, discourses and identity – that’s where you come into play! 😉

Further details on the abstract, deadline (31 January 2014), conference venue etc. may be found here. That’s our call for papers:

— Special Session 7: The politics of ICTs
(Doris Allhutter & Astrid Mager, Institute of Technology Assessment of the Austrian Academy of Sciences)

Information and communication technologies (ICTs) emerge along with hegemonic discourses, socio-political cultures, everyday practices and identities. Search engines, social media, wikis, open access portals, semantic software, surveillance tools, and code in a wider sense are not only created by programmers and technical people, but also negotiated in wider society. Policy makers, law, media discourses, economic rationales, cultural practices, computational infrastructures and algorithmic logics all take part in the negotiation of ICTs. At the same time, ICTs also create, stabilize and change cultural meaning, socio-political relations and materiality. ICTs and social power relations thus co-emerge.

Our panel welcomes both theoretical and empirical papers on practices of software design, power relations and material dimensions, and the socio-political implications of ICTs. Topics of interest include but are not limited to:

  • How are ICTs negotiated in design practices and wider socio-political frameworks?
  • What actor-networks, practices and arenas are involved in the creation of ICTs?
  • How are norms, values, and hegemonies inscribed in algorithms, code and software?
  • How are power relations enmeshed in such infrastructural materials?
  • What politics (e.g. gender relations, race biases, commercial dynamics, ideologies) do ICTs carry?
  • How can we investigate the micro-politics of artefacts?
  • What social, political, economic, cultural implications and challenges do ICTs cause?
  • How can we open up, investigate and renegotiate the politics of ICTs?
  • How can we work towards value-sensitive design and responsible innovation in ICTs?

momentum13. technology & regulation

Last week I had the pleasure to take part in the Momentum13 symposium. Momentum is a conference series that aims at bridging the gap between the sciences and politics. Initiated by the EU politician Josef Weidenholzer and Barbara Blaha, its main purpose is to integrate critical research, left-wing politics and practical experience to think about issues such as “progress”, the motto of this year’s conference. The 3-day event was organized in tracks focussing on various topics including gender equality, social movements, arts & culture, the future of work and politics, and technology & regulation – the track I moderated together with fukami; partly on a huge terrace by the lake with a decent glass of wine. Thanks for that! :)

In our track we had heated debates on small technical details such as internet ports and exploit regulations and on big societal questions relating to privacy, democracy and the future of the internet. But these two aspects, of course, closely relate to one another. Seemingly small technical decisions on the legitimacy or illegitimacy of a particular piece of code have large-scale political consequences in terms of IT security and the stability of the infrastructure we’re using day by day. And vice versa, broad societal developments and power relations influence the construction of information technology and the way the internet looks today. In a capitalist age, for-profit companies like Google figure as a central driving force in terms of technology development. The integration of more and more services into the web browser, for example, results in a black-boxing of technology. The less you understand your tools, the more dependent you are on their creators. Or, as fukami put it: “If you can’t break it, you don’t own it”.

This, however, raises a couple of questions: Do we all need to learn programming to use the computer? (Or how else would we be able to “break it”?) Or isn’t it the role of politics and law to set limits where limits are needed (e.g. data protection and the exploitation of user data by big US-American companies) and to protect us from harmful technology? Or is that an illusion in post-9/11 societies where extensive surveillance has become a central interest not only of companies, but also of nation states around the globe? And what can we do about all that? How can we regulate Google, Facebook, Twitter and other tech companies that increasingly shape our information universe, social relations, and political discourses, as we’ve seen in our track in presentations on Twitter politics and data journalism? What role can technology funding play in steering information technology? How can we make legal practices more transparent or measure – and promote – open data strategies; or “open everything”? What kind of copyright is feasible in times of file-sharing platforms and how can data protection be secured in companies aiming at full-scale observation of employees? How can we manage risks? These types of questions were discussed in our track. However, they are also the types of questions that future decision-making processes in the field of technology and society will be concerned with. Negotiations of the new EU data protection law, for example, will serve as an interesting test case for future technology development and socio-political agendas. How this negotiation process will end remains to be seen. That both lobbying on the part of internet businesses and the NSA scandal will crucially influence the reform process seems clear by now. Or, to cite fukami again, “we should thank Snowden” since his leaks have not only shaken up civil society, but EU policymakers too (hopefully!).

Our track discussions were accompanied by good food and great evening events, such as the keynote by Robert Pfaller or the book-reading by Kathrin Passig. Unfortunately, I missed the huge party that took place Saturday night and the Sunday evening matinee. But I’m sure that was fun too! Next year’s conference will be focused on “emancipation”. I highly recommend going there! (and not only because of the scenic location). More information can be found on the Momentum website (including info on the journals Momentum Quarterly and Momentum Policy Paper).


it’s the network, stupid

Yesterday I was on a panel discussion on surveillance organized by quintessenz and emergence of projects. It was a lively discussion, which left me with more open questions than answers though. Reinhard Kreissl (sociologist of law and criminology) and Markus Kainz (quintessenz, moderator) easily agreed on the bad guys (the usual suspects like the state, government, Google, Billa) and identified civil disobedience as an appropriate way to fight surveillance. Practical examples of such guerrilla activities were swapping Billa Vorteilscards or sabotaging the population census (by reproducing sheets and feeding them with wrong data). Even though I like the idea of creating a critical mass of disobedient citizens to mess with statistics, I think it’s not that easy anymore in times of digital surveillance. Cheating with sheets of paper and swapping discount cards is easy compared to messing with big data and algorithmic logics. The reasons for that are multiple:

First, digital surveillance is almost seamless. As Markus put it: “We are surveilled not once or twice, but various times”. Combinations of data from cell phones, surveillance cameras, credit cards, and digital tools like search engines and social networks make it hard to escape from your own data body. The data points we leave behind are simply too many and too heterogeneous. Here, I agree with Reinhard Kreissl: “We are leaky containers”.
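
A toy example of why that is so (all identifiers and records are invented): as soon as two traces share a single identifier, they can be joined into one record, and every additional source attaches itself to the same growing data body:

```python
# Invented data traces from different systems, each only partial on its own.
phone_records  = [{"phone": "+43-660-123", "cell_tower": "Vienna-07", "time": "09:05"}]
card_payments  = [{"card": "****1234", "shop": "Billa Neubaugasse", "time": "09:12"}]
loyalty_signup = [{"card": "****1234", "phone": "+43-660-123", "name": "A. Example"}]

def link_traces(records_a, records_b, key):
    """Join two sets of traces on a shared identifier (a trivial record-linkage step)."""
    return [dict(a, **b) for a in records_a for b in records_b if a.get(key) == b.get(key)]

# The loyalty card ties the phone trace and the payment trace to one name.
linked = link_traces(link_traces(loyalty_signup, phone_records, "phone"),
                     card_payments, "card")
print(linked)  # name, phone number, card, cell-tower location and shop visit in one record
```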

Second, most individuals do not have the knowledge and technical know-how to mess with such complex digital networks. And why should they? Most people, probably the majority, are pretty happy with how things are. They get discounts with their Billa card, they get free – and pretty good – online services from Google & Co., they have become used to, or even grew up with, extensive surveillance and advertising, so they don’t care anymore. That does not necessarily mean that people agree with all these data collections; it just shows that people take on a pretty fatalistic attitude in their daily lives. And yes, some people don’t care at all or simply like contemporary consumer culture – just like one of my interviewees, working in human design and engineering, phrased it: “I think the driving force behind this information economy is our, kind of, probably, possibly a little bit unhealthy desire to just keep consuming, and communicating, and producing at such a frenzy rate.” (Mager 2012: 10)

And, finally, even if people are discontented with the current surveillance state, why should it be the responsibility of the individual to fight a system that even politics and regulators seem powerless against? And how could we even step out of these powerful networks of surveillance? A quote by Scott Lash came to my mind when cycling home from the discussion: “The point that this book has tried to make is that we can no longer step outside of the global communication flows to find a solid fulcrum for critique. There is no more outside. The critique of information is in the information itself.” (Lash 2002: 220). Lash’s Critique of Information may be seen as an explanation for the digitization of political action. Even politics has become a matter of mouse clicks. Signing an online petition, liking a political group, sharing a critical initiative – all that is political engagement these days. The good thing, though, and I think that’s something we should not forget, is that new social movements are also emerging from these activities: Occupy Wall Street, the Arab Spring (whether successful in the end or not), and Uni Brennt have (also) been organized online and have ended up on the streets.

So what I’m trying to say, I guess, is that things are more complicated than they seem at first sight. Of course, surveillance states, Google, Billa and other players are spying on us and (ab)using our data, and that’s bad. No doubt about that. Blaming them, however, is not enough in my view. Rather, it’s important to understand the power relations and dynamics that are stabilizing them: political decisions, media debates, but also our own behavior, which is essentially feeding their power. Only then may proper ways out be found. Ways out that may even be digital. Times have changed since 1968 and so have we.

Lash, S. (2002) Critique of Information, London: SAGE.
Mager, A. (2012) Algorithmic Ideology. How capitalist society shapes search engines, Information, Communication & Society 15(5): 769-787.

become a data dealer

This is a great project! The online game DATA DEALER playfully deals with the practices of user profiling, the sellout of private data, privacy violations, etc. that companies like Google, Facebook, and other for-profit IT companies raise with their advertising-based business models. In the article Algorithmic Ideology I’ve shown that the power of search engines (and social media platforms, &&&) is enacted and stabilized in a complex network of actors and social practices. I’ve argued that it’s not enough to blame Google (and other companies) for making profit and having gained a quasi-monopolistic position on the internet. Rather, it’s important to understand how various actors including programmers and advertisers, but also policy makers, journalists, jurists – and last but not least – users help to stabilize its powerful role by simply using their services and contributing their data to the sophisticated capitalist accumulation cycle. Accordingly, critically examining and debating the business models and practices of Google, Facebook & Co. is a valuable first step on the long road towards a better understanding of new media services and, ultimately, a change of existing and future practices, products and privacy settings. The reform of the EU data protection law, for example, is a long and tough negotiation process! Playing, supporting, and sharing DATA DEALER, by contrast, is a quick move enabling us to think about and raise awareness of the matter. And it is fun too!

If you wanna join the undertaking, go to their website, watch their video trailer, install the demo version or donate money. They’ve just managed to raise $50,000 via crowdfunding. I’m sure they’ll manage to create an awesome – non-profit (!) – online game! Good luck!!!

net politics convent

Before I took off to Greece (two weeks of internet absence – yay!!!) I participated in a net politics convent of Austrian civil society organized by the World-Information Institute, Vienna (participants from activist groups, research institutes, arts & culture, technology experts, engaged citizens; supported by servus.at). The primary aim of the gathering was to formulate claims in the context of net neutrality, data protection and privacy rights, open data and open knowledge and, finally, copyright. The claims are directed at Austrian politics. The time is right, since all parties have started campaigning for the elections in fall. Net political issues should be part of their strategy! And there is much to discuss, as the vivid debates at the convent have shown! It was not easy, but we finally came up with three straight claims per issue, which are summarized below (translated from the German originals). For more in-depth information on and discussion of these claims go to the convent’s website. If you wanna support our claims, please sign the petition here and share it widely – via Facebook, Twitter or old-fashioned email and word of mouth!

Net neutrality

  • The same internet for everyone!
  • The net must be a public space!
  • No fast lane for big corporations!

Data protection and the right to privacy

  • Privacy by design!
  • An authority for freedom of information and data protection with real enforcement powers!
  • Unbundling of data monopolies!

Open data and open knowledge

  • A transparency act and the opening up of public-sector data holdings!
  • Free access to publicly funded scientific research and productions!
  • Free availability of teaching and learning materials from public institutions!

Copyright

  • Extension of the free use of works (e.g. remixes) with appropriate remuneration!
  • Strengthening the position of authors through copyright contract law!
  • Shorter terms of protection, with the possibility of extension by the authors!