...

Civic machines, now!

What do a simple traffic light and a social media platform have in common? How can we design socially just AI applications by learning from technologies of the previous century? This short post is an introduction to a series of blog posts that seek to understand algorithmic politics and to illustrate alternative paths for designing fair algorithmic implementations.

Munich. It’s two o’clock in the morning, and I’m leaving my favorite Munich bar. The pedestrians’ traffic light is red. I wait to cross the road, glancing at Google Maps to find my way to the subway. Not a lot of cars drive by, but as the cars’ traffic light turns red, a red Mazda stops in front of me. For a couple of seconds, the driver and I just look at each other. I bet you have also experienced these moments, when all the cars at an intersection stay still. That night I stood still, consciously understanding the power of a traffic light.

Well, technology is an instrument, and as such traffic lights are really efficient at preventing accidents and regulating transportation. But from a political perspective, this technological artifact consisting of a red, a green, and an orange lamp has the potential to give an order not only to me and the driver of the red Mazda, but to thousands of people passing through that intersection each day. Not only that, it gives orders at moments when obeying them seems intuitively absurd. For example, this particular street was a tiny one, and I could have crossed before the red car appeared. What is even more absurd is that I chose to follow the order of the traffic light, even though I did not have to. As my favorite Michel Foucault said, “power comes from below”; I was the one choosing to obey a set of lights.

The second thought I had reflecting on the situation was realizing the power held by the engineers who program the lights. As designers, they are able to regulate and dictate to a whole city when, how, and how quickly people travel between their daily destinations. For them, each decision about the infrastructure is the result of careful calculation based on theoretical and empirical studies spanning decades. A slight change has the potential to affect the functioning of a whole city, which lends their job a high degree of responsibility. It makes sense, then, that in each country there are extensive legal frameworks that guide engineers’ planning of these systems.

The third thought was about the big picture. The red car, me, the traffic light, and the engineers who designed its function were all momentarily part of the same hybrid society, consisting of technological artifacts and humans. As Norbert Wiener would say, we were parts of a cybernetic system. As Nigel Shadbolt and his colleagues would say, we were all participants in a social machine. Social machines appear when humans and technology interact, forming and reforming society. Studying these systems from a cybernetic perspective is always interesting, because one can uncover associations and understand phenomena in ways that were not possible before. Because I care a lot about politics, I would argue that each social machine is always also a political machine. As social behaviors always have a political dimension, so do these specific types of cybernetic systems. The traffic light, although made of soulless plastic, glass, and metal, was exerting power on me and the driver of the car. The same applies to the engineers who planned the system. Going one step further, the behavior of the engineers was governed by a political institution – the state of Bavaria.

...

The traffic light is an excellent example of a social machine. A trivial and banal interaction hides so many dimensions of influence, ethics, and communication. And it actually helps me bring you to the bigger picture. The most powerful aspect of the story above was only briefly mentioned and passed unnoticed. Leaving the bar and not knowing the way, I ended up at that intersection because Google Maps prescribed that I do so. In today’s datafied, digitized, or whatever fancy word you want to put there, world, the biggest designers of transportation flows besides state-employed engineers are the Google-employed engineers. By fulfilling their own zombie purposes – applying fancier, faster, and more efficient algorithms that optimize cost functions someone at Google decided to formulate – they are actually determining how a whole society reaches its destinations: which route to take, which means of transport. But this time, they do something more than the state engineers. Through their map functions, rating systems, and integrated advertising, they also influence what our destination is going to be in the first place.
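To make that point concrete, here is a minimal, purely illustrative sketch in Python – emphatically not Google’s actual system. The routing algorithm itself is a neutral, textbook shortest-path search; the politics lives entirely in the cost function someone chose to write. The toy road graph, its edge attributes, and both cost functions below are invented for illustration only.

```python
import heapq

# Tiny invented road graph: node -> list of (neighbor, travel_minutes, passes_advertiser)
GRAPH = {
    "bar":     [("plaza", 5, True), ("park", 2, False)],
    "plaza":   [("station", 4, False)],
    "park":    [("station", 6, False)],
    "station": [],
}

def best_route(graph, start, goal, edge_cost):
    """Plain Dijkstra shortest-path search; the politics hides entirely in `edge_cost`."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, minutes, passes_ad in graph[node]:
            heapq.heappush(queue, (cost + edge_cost(minutes, passes_ad), neighbor, path + [neighbor]))
    return None

# Cost function 1: pure travel time -> the fastest way to the subway.
print(best_route(GRAPH, "bar", "station", lambda minutes, ad: minutes))
# -> (8, ['bar', 'park', 'station'])

# Cost function 2: edges passing an advertiser get a hypothetical discount ->
# the user is quietly routed past the sponsored plaza instead.
print(best_route(GRAPH, "bar", "station", lambda minutes, ad: minutes - (3 if ad else 0)))
# -> (6, ['bar', 'plaza', 'station'])
```

Swap one cost function for the other and the “optimal” route – and with it the flow of people through the city – changes, without the user ever seeing the decision.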

I, my phone, the Google Maps app, the algorithms that calculate and optimize the routes, the Google employees, and everybody else who uses the Google Maps app to reach a destination, to get informed, or to advertise their services – we all belong to the same social machine. And in contrast to the banal traffic light, the political dimensions of that machine are uncountable. Off the top of my head:

  • The power that the app exerts on me as a zombie-user.

  • The in-app services: businesses that have to be added to the platform if they want customers to see them.

  • The circulation of data and the property rights over them.

  • The social influence bias that rating systems come with.

  • The shift of communication processes from humans and analog maps to highly digitized systems.

  • The way my behavior influences the algorithms’ recommendations, decisions, and, more generally, their model training.

  • Not to forget the social norms that the maps application mediates, by making us prefer destinations with high ratings, based on reviews from arbitrary (and sometimes even fake) accounts.

  • And behind all of this, there are countless regulations and legal frameworks that certify the functioning of the app: contracts I opted into without even reading them, cookies saved on my device with and without my consent, personalized systems that potentially violate human rights and discriminate against individuals and social groups.


I could go on and on, but I think you get the big picture. This is only an introductory post, after all. Google Maps is, of course, not a unique example of a political machine constituted by such complex ethical, legal, and political relations. An even higher degree of complexity can be found on social media, where significant political processes explicitly take place. There, politicians externalize their views and run political campaigns; individuals get informed by news media articles, contribute their political opinions, argue, engage, influence, or get influenced. Furthermore, manipulation and misinformation attempts belong to the daily agenda of the platforms. The platforms themselves follow their own ethical norms and filter content according to the platform owners’ personal opinions. Algorithmic systems and recommendations personalize the information users see according to criteria that serve the goals of the platforms’ business models, making decision and influence processes highly opaque and unexplainable. Questions about who is accountable, who is responsible, why something happened, and how a specific phenomenon or platform property contributed to the reorganization of human behavior circulate but rarely get an answer.

Political machines do not only appear at the level of the tech giants and their services. Similar systems self-organize constantly in socio-technological ecosystems: the application of robotics in a factory, the exploitation of data-intensive services for automated decision making (ADM), the regulation and control of individuals at workplaces based on performance-evaluation software and monitoring tools, the application of technological tools for law enforcement purposes. These are all systems where individuals and technological artifacts interact and reform human behavior. In these systems, a simple single-perspective analysis cannot uncover the multiplicity of influence processes, nor the ethical and legal dimensions involved.

The most important feature of the above social machines, the one that differentiates them from traffic lights, is the extent of consent that participants actually have over the organization of the machines; or rather, the lack thereof. If I did a survey asking you whether you agree with the function of traffic lights, their purpose, and their general implementation in society, the majority would more or less agree that they serve society pretty well, under ethics that you would find legitimate. If I posed the same question about the Google Maps app, or a social media platform, or an ADM algorithm, the answers would be quite different. That is because technological innovations have been disruptively generated and applied in recent decades, with their diffusion taking place at an uncontrollable pace and their integration into society left unregulated. Social media platforms were not made for political communication, but everybody uses them for that. The Google Maps app was not chosen to govern and co-design transportation flows, but it does. That happens because technology, as I said in the beginning, is powerful and efficient, and we, as technology zombies, become greedy for the possibilities it opens up to us.

Nevertheless, as the initial excitement of new technological advances wears off, we are realizing more and more that these technologies and the associated socio-technological ecosystems are not fruits of heaven that come without consequences. Technology always reforms us, in ways we do not see a priori, changing and affecting us in multiple ways. As the old Marshall McLuhan said, “The medium is the message.”

Now it is time to ask ourselves: are these new political machines fair? Do they actually serve us the way we want them to? Are they inclusive and diverse, and do they contribute to an open society that takes the needs of everyone into consideration? Did company owners and governments ask us before changing the politics of society in such a way, and do we legitimize their goals? Are these political machines actually supporting civic society? If not, it is now time to transform the political machines into civic machines. To co-design systems in the ways we want.

To choose our message and then design the medium according to it.

We need civic machines now!
