Spaces of (Im)Possibilities

The prevailing zeitgeist today is technological. By this, I mean that while contemporary societies may critique specific applications of technology, they rarely question its essence. While the general Western public acknowledges technological missteps, such as the industrial extermination executed by the Nazis—which was only possible because of technological inventions—technological progress is seen, by most, as general progress. The misuse of technological tools is treated as a matter of personal or social responsibility. The assumption is that humans maintain control over technology and can therefore determine whether it is used for beneficial or harmful purposes. Technology is viewed as having both positive and negative effects, with its impact largely dependent on who controls it. Thus technology is a mere tool, adhering to the notion that guns don’t kill people, but people do. A good example is the European Commission, which states in its white paper:

Like any technology, AI brings opportunities and risks. – [Com20]

Janina Loh refers to this view as the Neutrality Thesis of Technology [Loh19]. We view technology as something distinct from us, something we can command and control precisely. If we understand the modern project as the emancipation of humanity from the harsh environment determined by nature through science and technology, then this view is inherently modern. With this perspective, technology is the means to an end that is our emancipation from nature. It relies on the concept of the subject as an autonomous entity ruling over objects. The object can be harnessed for good or evil, for the ugly or the beautiful, and it’s our prerogative to dictate its usage.

[…] there is an idea that technology is in its essence something human beings have under their control. In my opinion, that is not possible. Technology is in its essence something that human beings cannot master of their own accord. – Martin Heidegger

The distinct subject-object differentiation might form a rational foundation for a worldview, enabling one to navigate through life ordinarily. Via Kant and other thinkers of the Enlightenment, it led to a libertarian Western society that, in theory, safeguards an individual’s right to a fulfilling life. However, as overwhelming evidence shows, the model of a society as merely the sum of all individuals no longer captures its complexity and dynamics. The same applies to technology: the dynamic relationship is far more intricate than the simple notion of a group of subjects negotiating and controlling the use of technology.

This viewpoint is especially challenged by postmodern and posthuman views on technology and by the advancements in artificial intelligence, especially in the field of machine learning. The question about our place within a technological society looms large. While machines compute and humans think, machines are now able to participate in communication [Esp22]. And since, according to Niklas Luhmann, society is, in fact, all of the communication that is happening, artificial communication may now be part of society. I think it is reasonable to construct a worldview where bodies, minds, technology, and other systems are deeply interconnected—they are interdependent; they co-evolve.

Lawrence Lessig outlines four regulatory mechanisms that influence the behavior of individuals and institutions: the law, norms, the market, and architectures [Les06]. He posits that, paradoxically, no society can exist without some form of regulation, and Niklas Luhmann adds that there is no society without norms. This forms our first strange loop. However, this paradox can be unraveled over time, as regulation and society are mutually dependent. The market, architectures, and norms arise to fill the gaps left by the law. Therefore, if the law does not regulate, all the other mechanisms will.

For Luhmann, a norm represents a type of expectation that we consciously decide not to modify, even if it is disappointed. For instance, if we expect our new coworker to be tall, and they aren’t, we adjust our expectation. But if we assume the coworker can send emails and they can’t, a societal norm is breached. Instead of adjusting our expectations, we express discontent. Luhmann believes these normative expectations are stabilized through societal communication. It’s important to note that for Luhmann, communication extends beyond just language; monetary transactions, for instance, also convey messages. He dismisses the idea of natural or metaphysical norms or moral principles, arguing they’re no longer applicable in our modern world, where traditional societal hierarchies have largely dissipated. For example, Human Rights, which are inspired by The Declaration of Independence, have the kind of status that the divine right of kings had in the Middle Ages. But even if “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights […]”, Human Rights are socially constructed and stabilized via the communication of mostly Western societies. There is no divine force or absolute natural law establishing or guaranteeing these rights. In fact, the enforcement of Human Rights requires a certain kind of power imbalance such that one party can enforce these rights, if necessary violently. It is therefore a dangerous path to use military intervention to enforce those rights if a society does not stabilize them via its communication.

Luhmann’s emphasis has shifted from norms to decisions. Yet these decisions are informed by norms, which are themselves constructed through decisions. Luhmann believes the normative structures of modern society are intertwined with values (or preferences). Our society envisions a hierarchy of values or specific ideals that are inherently beyond dispute. We behave as though these values are tangible truths, without requiring empirical verification. No one seeks empirical proof of the existence of justice—justice stands firm as a value. Concepts like justice, peace, equity, and health are considered values. From a societal standpoint, values are perceived as innate principles, meaning we feel no obligation to defend assertions based on them. We don’t ask: why should we champion peace or justice? Moreover, values aren’t mandated by any authority.

Luhmann contends that our value system, which melds necessity with contingency, ensures a stable structure while preserving the utmost freedom for decisions. Nevertheless, most decisions are characterized by a tug-of-war between values. For instance, do we prioritize safety over potential progress? Equity over individual freedom? Wealth equality over capital accumulation for more significant investments? Decisions invariably lead to these value conflicts, and a mere list of values, like a political party’s platform, can be empty of substance.

Now that we’ve delved deeper into the meanings and purposes of norms and values, let’s examine some instances of how technological advancements have influenced them.

In the 1920s, urban planner Robert Moses designed parkway overpasses in New York that were intentionally too low for buses to pass under. Since buses primarily catered to low-income communities and people of color, this design strategically limited their access to certain areas, preserving racial segregation. Here, architecture acted as a regulatory and controlling force.

When a building lacks wheelchair access, it effectively excludes a specific group, thereby embodying discrimination. Such exclusion is now largely frowned upon in our society, as we recognize it lacks justification. However, not all forms of discrimination are viewed as unjustifiable in contemporary times. For instance, the denial of voting rights to children is a type of discrimination that we currently accept.

Another prime illustration of this principle can be seen in digital spaces, i.e., digital architectures. These environments are imbued with values, determined by the control and regulation of permissible actions. Digital spaces, or cyberspace, as it’s often termed, have never been free from regulation, despite common misconceptions. Rather than being shaped by traditional architectural forms, digital spaces are regulated by source code. The design of websites or applications, dictated by this code, significantly influences the behavior of their users. In other words, the very structure of these platforms functions as a form of regulation, guiding user interactions and activities.

Code constitutes cyberspaces; spaces enable and disable individuals and groups. The selections about code are therefore in part a selection about who, what, and, most important, what ways of life will be enabled and disabled. – [Les06]
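Lessig’s point can be made concrete with a minimal sketch (all names and numbers here are hypothetical, chosen purely for illustration): in a digital space, a design decision in code does not forbid an action the way a law does, it simply makes the action impossible, much like a low overpass.

```python
# A hypothetical platform rule expressed as code. Like a low overpass,
# the constant below is a design decision that silently shapes what
# users can and cannot do.

MAX_POST_LENGTH = 280  # an arbitrary limit set by the platform's designers


def publish(text: str) -> str:
    """Accept a post only if it fits within the space the code allows."""
    if len(text) > MAX_POST_LENGTH:
        # The rule is not negotiated at the moment of action; the code
        # simply rules the action out.
        raise ValueError("post exceeds the architectural limit")
    return text


print(publish("Hello, world!"))  # a permitted action
# publish("x" * 500)             # an action the architecture makes impossible
```

Whether such a limit is wise is beside the point; the sketch only shows that the horizon of possible actions is fixed in code before any law or norm ever applies.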

Web pioneers often look back at the “good old days”. The nostalgic view of many pioneers (mine included) is that, in the beginnings of the internet, everything was possible. It was easy to build a website: a little bit of HTML, which you could hack into a simple text editor, and you were done. The emphasis lay on negative freedom, i.e., freedom from institutional interventions and regulation. However, without these regulations coming from institutions, social norms and code (architecture) regulated the web instead. In Western societies this led to power concentrated in the Big Five: Google, Facebook, Amazon, Apple, and Microsoft [Rosengrun21].

Similar to Lessig, Don Ihde, in Technology and the Lifeworld [Ihd90], discusses technology as an expansion or limitation of possibilities. [Rosengrun22] argues, and warns, that Western societies are currently shifting from the Rule of Law to the Rule of Code:

[…] source code is about to become the main regulator of individual and institutional behavior that regulates all other regulators including law – [Rosengrun22]

Rosengrün’s argument is not centered on the premise that regulation is inherently detrimental, or that large companies harbor malicious intentions. Instead, he acknowledges the necessity, and indeed the inevitability, of regulation. Moreover, he observes that the alignment of corporate regulatory practices with their profit-making objectives is to be expected. He forcefully advocates for the Rule of Law as an indispensable prerequisite for a democratic society. Consequently, any attempt to replace it could precipitate the dissolution of such a society. For this reason, Rosengrün asserts that code must be not only open but also subject to regulation. He underscores the potential dangers of allowing code to regulate law: when machine learning is used in the policy-making process, it is unrealistic to expect that law will maintain supremacy over code.

Rosengrün refers to Lessig and his emphasis on Wikipedia, an influential institution that has rather successfully stayed true to the principle of rule by the people [Rosengrun21], as an example of how artificial intelligence can be democratized. The free and open-source spirit in the IT industry, research, and education is still a vibrant one, one that has probably survived because of the (irrefutable) values it realizes: freedom, prosperity, equity, and self-determination. In contrast, Sam Altman, the CEO of OpenAI, recently proposed a licensing-based regulation during a US Senate hearing. This proposal is concerning, as it would potentially undermine the open-source community, achieving exactly the opposite of the intended effect. While licensing could introduce some safety measures, ultimately it would protect large corporations and obscure the inner workings of the technology that regulates regulations—it would serve surveillance capitalism.

The notion of regulation by code is familiar to us programmers, as we understand that even the selection of a programming language can either expand or limit our possibilities. For instance, if one’s objective is to engage in machine learning, the programming language Scheme wouldn’t be the optimal choice. Furthermore, the popularity of specific languages can influence their usage. It’s important to note, however, that this degree of openness isn’t solely controlled by a single programmer. Instead, it’s shaped by the broader social and technological systems that surround us. As a significant amount of control lies with large corporations, they inherently influence the horizon of the possible.

Programming is indeed very similar to architecture, but it feels much less restricted. If you are the only one on a project, you can feel like a god creating a world as you desire. It is a powerful kind of control and absolute determination.

A programmer is the creator of a universe of which he is the sole lawmaker. – [Wei78]

Technology not only provides a framework for action, but also actively shapes our environment. It constructs particular subjects and defines our identities, delineating not only who we are but also who we can potentially become.

Another stark illustration of technology’s pervasive (positive) influence can be found in modern birth control methods and in the strategies we deploy to shape our identities, such as curating online profiles to garner peer validation. It’s widely acknowledged that social media can inadvertently amplify negative aspects of human behavior, underscoring the necessity to incorporate media into our understanding of this phenomenon. Further instances of technology’s impact manifest in the realm of beauty and medical advancements. Technologies related to beauty and cosmetic surgery exert pressure to conform to evolving standards of beauty, standards that were unattainable just six decades ago. Similarly, assistive tools have profoundly enhanced the agency of individuals with disabilities, and advancements in medical transitions have carved out new avenues for expressing and experiencing transgender identities.

In summary, it is difficult to refute the idea that technology molds our behavior and how we interact with the world and one another. McLuhan’s assertion that the specific ways individuals use common tools are mostly insignificant, and that the mere existence of a specific technology shapes society, holds true.