Thanks for your interest and great response to the FOSSASIA 2019 workshop I advertised in my previous blog post! Are you a SUSI.AI developer/ contributor? Are you up for an experiment? Would you be willing to write a short piece of text on how the social appears in the technical development of SUSI.AI/ your daily work practices? This text should only be half a page or a page, and you shouldn't think about it too hard; rather, just find a nice spot (like I did last spring in Berlin, where the picture above was taken) and quickly write down what comes to your mind when you hear the following question:
When and how did you encounter SUSI (standing for the social in terms of social biases, user imaginations, gender relations, your own desires and expectations, or something else that comes to your mind..) when developing/ contributing to SUSI.AI and how did you handle SUSI back then?
Please send your memories to me (astrid.mager(at)oeaw.ac.at) so that we can discuss/ work with them during the workshop. Based on these texts we’ll be able to draw out how to (better) handle SUSI in the future, but also how SUSI can be made productive in terms of creating more “open”, “transparent” or “fairer” (AI) technology more generally.
If you don’t find the time to write such a memory, don’t worry! I’d still be happy to see you at the workshop and learn about your ideas on the way SUSI figures in your work and how you usually deal with it!
Remember: The workshop titled “Where is SUSI in the AI?” will take place on Saturday, 16th March, 18-18.55, at the Event Hall 2-1. I’m already looking forward to seeing you there!!! Please use this link to sign up for the workshop! Thank you!
If you’re interested in learning more about working with memories in software design, I’d be happy to give you further insights into the method of “mind scripting” I’ve been toying around with just recently. It’s a method developed by my colleague Doris Allhutter, who created it specifically to investigate (and potentially also to intervene in) software practices.
If you’re a SUSI.AI developer I’d love to get in touch with you to learn about your work practices, your ideas about SUSI.AI and open source more generally, and to discuss what role the social – in terms of social biases, user imaginations, gender relations, your own desires, or something else that is important for you as a coder – plays in the technical development of SUSI.AI/ your own work. I’ve organized a workshop to provide a space for mutual learning experiences and to initiate a dialogue between informatics and the social sciences; an interface I find tremendously important in times of growing social biases, discrimination and surveillance triggered by corporate tech. Please let me know if you’d like to participate in the workshop and what you’re interested in, so I can better prepare it in advance! Also, please spread the word and motivate other SUSI.AI developers to show up! The more participants, the better! 😉 If you don’t have time to participate in the workshop – I’m sure you guys will be busy over there – I’d still be happy to hear from you and find some other opportunity to chat at the summit. It’s going to be my first Asian tech summit, so I’m really looking forward to being there and learning more about your great work!! Thanks also to Michael Christen and Mario Behling for supporting my work so far! I’m of course looking forward to meeting you guys in Singapore too!!! YAY!
This ethnographic study on SUSI.AI is part of my ongoing research project “Algorithmic Imaginaries. Visions and values in the shaping of search engines”, funded by the Austrian Science Fund (FWF). A short – if a bit outdated – description of my project can be found at the ITA website. I’m happy to explain it further once we meet, of course!
Here’s the abstract for the workshop titled “Where is SUSI in the AI?” (Saturday, 16th March, 18-18.55, Event Hall 2-1). Please use this link to sign up for the workshop.
There is a long research tradition in the field of science and technology studies (STS) showing the importance of the social in technical design processes. The notion of sociotechnical design practices, for example, stands for the tight entanglements and co-shaping processes of technical and social elements. Following this basic assumption, critical algorithm studies, infrastructure studies, and software studies have started to investigate how social biases in big data, preferences of designers and coders, or imaginations of future users shape digital tools, software, or artificial intelligence. Moreover, innovative methods have been developed not only to analyze, but also to problematize and intervene in software practices. “De-biasing” has become an issue of concern, bringing together computer scientists and social scientists to learn from each other in the attempt to bring fairness, accountability and transparency to the table of software design.
Following this research tradition, the proposed workshop tries to bring together developers, coders, researchers and other contributors working on SUSI.AI to address the following question: “Where is SUSI in the AI?” During the workshop the participants are invited to show and share how SUSI (standing for the social in terms of social biases, user imaginations, gender relations, developers’ own desires, or something else that is important for the SUSI.AI team) actually figures in the design process, and how they deal with SUSI or hope to deal with it in the future. While the workshop mainly invites contributors working on SUSI.AI, it is open to developers working on similar AI projects as well.
If you’re up for experimenting with a method using memory work before and during the workshop, please check out my next blog post! To be continued.. 😉
In January I was kindly invited to give a lecture on my habilitation project “Algorithmic Imaginaries”. This talk was part of the lecture series “Aspects of the Digital Transformation” at the Centre for Informatics and Society (CIS) of the Faculty of Informatics. Thanks a lot to Florian Cech and Hilda Tellioglu for the warm welcome, including fine wine and bread! Thanks also to the audience, who triggered really interesting discussions! You can find the video on the CIS website if you want to watch it (in English):
Search engines in Europe – European search engines?
Search engines are subject to socio-political developments. But what role does Europe play in this?
Enjoy reading the full text here (in German)!
If you want to learn more about all the great members of the Young Academy, check out the summer series portraits of new YA members. Mine is titled “Kleine Davids gegen Google Goliath“. It’s a fine compilation of interdisciplinary research my young colleagues are doing.
I’m so (so so so) happy that my project “Algorithmic imaginaries. Visions and values in the shaping of search engines” will finally come true! After a really long application process the Austrian Science Fund (FWF) decided to fund this awesome habilitation project! You’ll find the abstract below; more information will follow once the project has started (November 2016, since I’m still on maternity leave). For all of you who have projects under review (or rejected already): don’t give up! It’s a nerve-wracking process, but if you finally manage to succeed, it’s all worth it!!! (Of course, in times like these peer review has become some sort of strange academic lottery, which does not make the practice any better..)
Visions and values in the shaping of search engines
Search engines like Google are developed in the US-American context, but are used around the globe. Their business models are based on user-targeted advertising. They collect user data, turn it into user profiles, and sell these to advertising clients. Since the NSA affair, practices of user profiling have been critically discussed, especially in European contexts with diverse data protection laws, historically shaped notions of privacy, and very different tax systems. The ongoing reform of EU data protection legislation is an important arena where tensions between global search engines and European policy visions and values can be observed. Besides, European search engines are emerging that aim to provide users with alternative styles of search. Some are explicitly developed as European competitors to US-based search engines (Quaero or the Independent Web Index). Others are developed in Europe, but draw on other value systems to distinguish themselves from big search engines, such as respecting users’ privacy (e.g. Ixquick), protecting the environment (e.g. Ecosia), or creating a non-commercial search engine owned by the public (e.g. YaCy).
This poses important questions: What motivations, value systems, and visions guide the development of European search engines? How are these imaginations translated into sociotechnical design practices? What power struggles, negotiations, and compromises may be observed? How do place and cultural context matter in the design process? Researchers in Science and Technology Studies (STS) have investigated the politics of search engines, the relevance of algorithms, and internet governance. What is missing is an in-depth analysis of the shaping of search engines in specific cultural contexts and of the role shared value systems and visions play in it. Rooted in the discipline of STS, the proposed habilitation project will fill this gap by investigating design practices of European search projects using a case-study approach (qualitative interviews, workshops, ethnographic observations).
Results from this analysis will be compared to and cross-analyzed with results from my past research on capitalist ideologies driving global search engines like Google and my present research on visions and values guiding European search engine governance. This overall analysis will result in a typology of algorithmic imaginaries, which describes visions and values in the development and governance of search engines in global, European, and local contexts. It will show how search technologies and society co-emerge in specific economic, political, and cultural settings. The primary focus on European contexts is a particular strength of the project since tensions between global search engines and European governance structures and search projects are growing, but have not been systematically studied yet, both in the field of STS and internet research.
This week I spent two sunny days in Graz to attend the STS conference “Critical Issues in Science and Technology Studies”. Doris Allhutter and I organized a panel on the “politics of ICTs”, which turned out to be really interesting! Great presentations, great topics, great participants. Also, we discovered quite a number of overlapping issues and shared interests, which is not always the case with regard to conference panels. I particularly liked the presentations on the material/ technological dimension of ideology and gender relations, sociotechnical/ digital work practices and cultural specificities, and questions on power relations in design practices of ICTs. Anne Dippel struggling with computer problems while talking about bugs in the CERN software and how they affect physicists’ work practices was just one highlight of our panel 😉 I still hope Doris and I will manage to put together a special issue on the fascinating co-emergence of social and digital cultures.
The second highlight of the week was the arrival of the Society of the Query Reader (eds René König & Miriam Rasch; Institute of Network Cultures (INC) reader #9). It’s great to see my contribution on big search and its alternatives in such a nicely designed book. Didn’t the conference designers even get an award for the beautiful flyers, badges and stuff? Anyway, the reader is a wonderful compilation of essays on corporate search engines and alternative styles of search. If interested, you can order or download the book for free (!); more information here..
A preprint of my Society of the Query #2 article has been published in the ITA manu:scripts series. The article is related to the talk I gave at the SOTQ conference in Amsterdam, November 2013. It’s concerned with the ideology of Google and alternative search engines. A final version of the paper will be published in the Society of the Query Reader edited by René König and Miriam Rasch (Geert Lovink as editor of the Institute of Network Cultures (INC) Reader series; spring 2014). I’d like to thank the conference participants, Georg Aichholzer as editor of the ITA manu:scripts series, and both the reviewers of the INC reader and the ITA manu:scripts for their helpful comments and feedback. That’s the abstract:
Google has been blamed for its de facto monopolistic position on the search engine market, its exploitation of user data, its privacy violations, and, most recently, for possible collaborations with the US-American National Security Agency (NSA). However, blaming Google is not enough, as I suggest in this article. Rather than being ready-made, Google and its ‘algorithmic ideology’ are constantly negotiated in society. Drawing on my previous work I show how the ‘new spirit of capitalism’ gets inscribed in Google’s technical Gestalt by way of social practices. Furthermore, I look at alternative search engines through the lens of ideology. Focusing on search projects like DuckDuckGo, Ecosia, YaCy and Wolfram|Alpha I exemplify that there are multiple ideologies at work. There are search engines that carry democratic values, the green ideology, the belief in the commons, and those that subject themselves to the scientific paradigm. In daily practice, however, the capitalist ideology appears to be hegemonic since 1) most users employ Google rather than alternative search engines, 2) a number of small search projects enter strategic alliances with big, commercial players, and 3) choosing a true alternative would require not only awareness and a certain amount of technical know-how, but also effort and patience on the part of users, as I finally discuss.
That’s the link to the full article. I would love to hear what you think about it!
The society of the query conference (Amsterdam) has sadly come to an end. It was a truly great event! Thanks to Geert Lovink, René König & Miriam Rasch for having made it happen! For all of you who missed the exciting discussions on the Google domination, search beyond borders (China, India etc.), artistic projects, search in context, the dark side of Google, or the filter bubble: there’s quite some material circulating online, e.g. abstracts to all sessions & talks, blog posts of all talks, links to alternative search engines, loads of pictures, and, finally, there should be videos of all talks coming up soon, so stay tuned! & here they are!
photo credits: society of the query (Martin Risseeuw)
Last week I had the pleasure to take part in the Momentum13 symposium. Momentum is a conference series that aims at bridging the gap between the sciences and politics. Initiated by the EU politician Josef Weidenholzer and Barbara Blaha, its main purpose is to integrate critical research, left-wing politics and practical experience to think about issues such as “progress”, the motto of this year’s conference. The 3-day event was organized in tracks focussing on various topics including gender equality, social movements, arts & culture, the future of work and politics, and technology & regulation – the track I moderated together with fukami; partly on a huge terrace by the lake with a decent glass of wine.. thanks for that!
In our track we had heated debates on small technical details, such as internet ports and exploit regulations, and on big societal questions relating to privacy, democracy and the future of the internet. But these two aspects, of course, closely relate to one another. Seemingly small technical decisions on the legitimacy or illegitimacy of a particular piece of code have large-scale political consequences in terms of IT security and the stability of the infrastructure we’re using day by day. And vice versa: broad societal developments and power relations influence the construction of information technology and the way the internet looks today. In a capitalist age, for-profit companies like Google figure as central driving forces of technology development. The integration of more and more services into the web browser, for example, results in a black-boxing of technology. The less you understand your tools, the more dependent you are on their creators. Or, as fukami put it: “If you can’t break it, you don’t own it”.
This, however, raises a couple of questions: Do we all need to learn programming to use the computer? (Or how else would we be able to “break it”?) Or isn’t it the role of politics and law to set limits where limits are needed (e.g. data protection and the exploitation of user data by big US-American companies) and to protect us from harmful technology? Or is that an illusion in post-9/11 societies, where extensive surveillance has become a central interest not only of companies, but also of nation states around the globe? And what can we do about all that? How can we regulate Google, Facebook, Twitter and other tech companies that increasingly shape our information universe, social relations, and political discourses, as we’ve seen in our track in presentations on Twitter politics and data journalism? What role can technology funding play in the steering of information technology? How can we make legal practices more transparent, or measure – and promote – open data strategies; or “open everything”? What kind of copyright is feasible in times of file-sharing platforms, and how can data protection be secured in companies aiming at full-scale observation of employees? How can we manage risks? These were the types of questions discussed in our track. But they are also the types of questions that future decision-making processes in the field of technology and society will be concerned with. Negotiations of the new EU data protection law, for example, will serve as an interesting test case for future technology development and socio-political agendas. How this negotiation process will end remains to be seen. That both lobbying on the part of internet businesses and the NSA scandal will crucially influence the reform process seems clear by now. Or, to cite fukami again, “we should thank Snowden”, since his leaks have not only shaken up civil society, but EU policymakers too (hopefully!).
Our track discussions were accompanied by good food and great evening events, such as the keynote by Robert Pfaller or the book-reading by Kathrin Passig. Unfortunately, I missed the huge party that took place Saturday night and the Sunday evening matinee. But I’m sure that was fun too! Next year’s conference will be focused on “emancipation”. I highly recommend going there! (and not only because of the scenic location). More information can be found on the Momentum website (including info on the journals Momentum Quarterly and Momentum Policy Paper).