Peter Weibel: Preface
Infosphere: The Transformation of Things into Data
In the old world, the analog world, there were mainly things. Human beings gave the things names, and these relationships between words and things defined culture and civilization for thousands of years. That’s why contemporary philosophical books still have titles like »Word and Object« (Willard Van Orman Quine, 1960) or »Les mots et les choses« [literal translation: Words and Things] (Michel Foucault, 1966). But people didn’t just name things; even back in primitive times, they were already making images of things. In the course of time, the world of words and the world of images have both developed lives of their own and become autonomous worlds. The relationships between words and objects and between images and objects constitute the two most important evolutionary stages of abstraction. The third stage was the substitution of objects, words, and images by numbers. Thus arose the new digital world of data. How did it come about that things, images, and words became data? It took an infinite number of theories and inventions to bring about this transition from things to data. I shall single out only a few.
Numbers created an abstract realm that transcended the existence of things, i.e. the existence of »sensua« (sensual data). All the numbers in the world can be expressed with just ten digits (0 to 9). Gottfried Wilhelm Leibniz’s 1697 invention of the binary number system, the expression of all numbers by combining just two digits, 0 and 1, constitutes one of the decisive axioms for the infosphere. Around 1800, the mental and machine-assisted efforts to mathematize the world intensified. In the preface to his masterpiece, »Méchanique analitique« [Analytical Mechanics] (1788), Joseph-Louis de Lagrange had already emphasized that he was able to describe the world completely, using algebraic operations alone. This brilliant work implicitly conceived the universe as a digital machine. In »Mechanization Takes Command« (1948), Sigfried Giedion describes the industrial effects of this mathematization. George Boole proved, in »An Investigation of the Laws of Thought« (1854), that logic and algebra are identical. This led to the development, with the aid of Gottlob Frege (»Begriffsschrift. Eine der arithmetischen nachgebildete Formelsprache des reinen Denkens« [Concept Notation, a Formula Language of Pure Thought Modelled upon the Formula Language of Arithmetic], 1879) and Bertrand Russell and Alfred North Whitehead (»Principia Mathematica«, 1910–1913), of a mathematical logic whose ideal was the isomorphism of thought and logic, and of logic and mathematics. This created the prerequisites for the programming languages that were to evolve in the 1950s.
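Both steps can be made concrete in a few lines. The following sketch (in Python, added here purely as an illustration, not part of the historical sources) writes a number in Leibniz’s two digits and checks Boole’s identification of logical conjunction with multiplication over the values 0 and 1:

```python
# Leibniz: any natural number can be written using only the digits 0 and 1.
def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder modulo 2 is the next binary digit
        n //= 2
    return "".join(reversed(digits))

# Boole: logical operations behave like algebraic ones over {0, 1}.
# Conjunction is multiplication, and x·x = x (a "law of thought").
for x in (0, 1):
    for y in (0, 1):
        assert (x and y) == x * y   # AND coincides with multiplication
        assert x * x == x           # idempotence: x squared equals x
```

The two assertions passing for every combination of 0 and 1 is, in miniature, Boole’s point: the laws of logic are algebraic identities.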
The mother of programming languages, developed from 1958 to 1963 by Peter Naur, Friedrich L. Bauer, and John W. Backus, among others, is ALGOL 60, short for »algorithmic language«. All programming languages are so-called semi-Thue systems, based on Axel Thue’s essay »Probleme über Veränderungen von Zeichenreihen nach gegebenen Regeln« [Problems Concerning Changes of Symbol Sequences According to Given Rules] (1914). Noam Chomsky’s models of universal grammars of natural languages (from 1956 on) are also semi-Thue systems.
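What a semi-Thue system is can be shown directly: a set of rules u → v, each licensing the replacement of a substring u by v anywhere in a string. A minimal sketch in Python (an illustration only; the sorting rule is a made-up example, not from Thue’s essay):

```python
# A semi-Thue system is a set of string-rewriting rules (u, v):
# wherever u occurs in a word, it may be replaced by v.
def rewrite_once(word, rules):
    """Apply the first applicable rule at its leftmost occurrence."""
    for lhs, rhs in rules:
        i = word.find(lhs)
        if i != -1:
            return word[:i] + rhs + word[i + len(lhs):]
    return None  # no rule applies: the word is in normal form

def normalize(word, rules, max_steps=1000):
    """Rewrite until no rule applies (rewriting need not terminate in general)."""
    for _ in range(max_steps):
        nxt = rewrite_once(word, rules)
        if nxt is None:
            return word
        word = nxt
    return word

# Example: the single rule "ba" -> "ab" sorts every a before every b.
rules = [("ba", "ab")]
```

Applying `normalize("bbaa", rules)` yields `"aabb"`: each rewriting step moves one a leftward past one b until no rule matches, which is exactly the rule-governed transformation of symbol sequences that Thue studied.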
Today’s programming languages, the numeric code on which the infosphere is based, have their philosophical forerunners in the work of the twentieth-century logicians and philosophers from Poland and Austria. Two exemplary titles are Rudolf Carnap’s books »Der logische Aufbau der Welt« [The Logical Structure of the World] (German original 1928) and »Logische Syntax der Sprache« [Logical Syntax of Language] (German original 1934). They clearly evidence the aim to extend logic beyond the analysis of the logical structure of language, in order to represent the universe itself as a mathematical structure. To exemplify the position of digital philosophy, a panorama projection (a production by ZKM’s Institute for Visual Media with the Institute Vienna Circle) shows the development of the Vienna Circle. Today this line of investigation is reflected in Max Tegmark’s book »Our Mathematical Universe« (2014).
All these mathematicians, logicians, and analytical philosophers of the last two hundred years were committed to the task of transforming, by means of mathematizing the world, the world of objects into a world of data controlled by human beings. In their »A Logical Calculus of the Ideas Immanent in Nervous Activity« (1943), Warren S. McCulloch and Walter H. Pitts successfully formalized, as a logical calculus, neural activity as the basis of thought.
Last but not least, Kurt Gödel and Alan Turing, at the end of their deliberations concerning the mathematization of the world, reflected on the mathematization of mathematics itself. Turing’s essay »On Computable Numbers« (1936/1937) poses the question of how numbers and numeric processes can themselves be computed. It thus became evident that earlier notions of truth, created and expressed in verbal sentences, had to give way to logical operations; illustrated, for example, by Ludwig Wittgenstein’s truth tables (»Tractatus Logico-Philosophicus«, 1921). The truth of statements is based on logical provability and, since Turing and Alonzo Church, only what is computable is considered provable. Numbers operate through numbers, but numbers also operate through images, words, and things. Numeric operations influence and affect images, words, and things. This gives rise to a new form of ontology, whose contours are only beginning to dawn on us. The theorem of Parmenides, asserting that thought and being are the same, is, in an odd way, confirmed by digital technology.
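Wittgenstein’s truth tables reduce the truth of a compound statement to a mechanical survey of all valuations of its parts – a procedure simple enough to write down as code. A minimal sketch in Python (illustrative only; the example formulas are our own, not Wittgenstein’s):

```python
from itertools import product

def truth_table(f, names):
    """List every valuation of the variables together with the formula's
    value -- the tabular method of the »Tractatus«."""
    rows = []
    for values in product([True, False], repeat=len(names)):
        env = dict(zip(names, values))
        rows.append((values, f(**env)))
    return rows

def is_tautology(f, names):
    """A statement is a tautology iff it is true under every valuation."""
    return all(result for _, result in truth_table(f, names))

# Example: p -> (q -> p) is a tautology; p and not-p is a contradiction.
always_true  = lambda p, q: (not p) or ((not q) or p)
never_true   = lambda p: p and not p
```

Here truth is no longer a matter of what a sentence says but of an exhaustive, mechanical check – exactly the shift from verbal truth to logical operation described above.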
Normally, people write numbers and calculation symbols, such as the »+« for addition, on paper; do the computing themselves, in their heads; and then write the result of the computation on paper. In pocket calculators, the symbols and calculation rules are part of the machine, which the human user applies as part of the arithmetic operation. The machine produces the result. The machine computes for the human – mentalism and mechanism being identical. In other words, if computation is an element of thought, computation, and thus thought, can be mechanized. The extended Church-Turing thesis asserts precisely this: anything that can be formalized can be computed, and anything that can be computed can be mechanized. In the digital universe, thinking and language, language and being, thinking and being converge at a defined event horizon. Not everything that exists can be thought, and not everything that can be thought can be said. So there is more than we can think and say. But that part of being that can be thought, and that part of thinking that can be said, can be formalized and digitized. In this area, the dictum of Parmenides is true, verified by the example of Boole and Shannon. We have only limited access to the universe, and it reveals to us only its formal, mechanical, digital side. Our minds and our tools grant us access only to the digital side of the universe, but to an increasing extent. To quote Parmenides, “Either it is, or it is not.” Statements like this reveal ontology itself to be a digital, binary code: to be or not to be, 1 or 0. Numbers operate above being, through being, with being. One is tempted to speak of an operative ontology. Anything that can be formalized can be realized. As a consequence, the domination of machines is followed by the domination of data. Both are new forms of reality construction; both are new manners of being, extending ontology.
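That arithmetic itself can be mechanized – the point of the pocket-calculator example – rests on the fact that addition reduces to logical operations on binary digits. A sketch in Python (illustrative; the half-adder construction is textbook circuit logic, not taken from the sources cited here):

```python
# Addition reduces to logic gates -- the step that lets a machine,
# rather than a mind, carry out the computation.
def half_adder(a, b):
    """Add two bits using only an XOR gate and an AND gate."""
    return a ^ b, a & b          # (sum bit, carry bit)

def add_bits(x, y):
    """Ripple-carry addition of two non-negative integers, bit by bit,
    built entirely from half adders and an OR of the carries."""
    result, carry, shift = 0, 0, 0
    while x or y or carry:
        a, b = x & 1, y & 1      # current binary digit of each number
        s, c1 = half_adder(a, b)
        s, c2 = half_adder(s, carry)
        result |= s << shift     # place the sum bit
        carry = c1 | c2          # propagate the carry
        x, y, shift = x >> 1, y >> 1, shift + 1
    return result
```

No step in `add_bits` is arithmetical in the mental sense; everything is XOR, AND, OR, and shifting – logical operations a circuit can perform, which is what the extended Church-Turing thesis asserts in general.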
This strange ontological turn was posited by Claude E. Shannon in his master’s thesis »A Symbolic Analysis of Relay and Switching Circuits« (1937). In this work, he proved that Boolean algebra can be used to simplify the layout of relays; and that, vice versa, a targeted application of electronic circuits can be used to solve Boolean equations. The combination of these two systems proposed by Shannon, plus the use of the binary property of electrical circuits (on – off, 1 – 0, current – no current) for the execution of logical functions, subsequently defined the design of all electronic digital computers. Shannon showed that the mental formulas of Boolean algebra could also be applied to material switching algebra.
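Shannon’s equivalence of relay circuits and Boolean formulas can be illustrated by simplifying a circuit algebraically and then confirming, by exhausting all inputs, that the simpler circuit computes the same switching function. A sketch in Python (illustrative only; the example circuit is our own, not from the thesis):

```python
from itertools import product

def equivalent(f, g, nvars):
    """Two switching circuits are interchangeable iff they compute the
    same Boolean function on every input -- checkable by exhaustion."""
    return all(f(*v) == g(*v) for v in product([0, 1], repeat=nvars))

# In relay terms: a series connection of contacts computes AND,
# a parallel connection computes OR.
series   = lambda a, b: a & b    # both contacts must be closed
parallel = lambda a, b: a | b    # either contact suffices

# Boolean algebra simplifies the circuit (A·B) + (A·B') to just A,
# so a single relay replaces three.
complex_circuit = lambda a, b: parallel(series(a, b), series(a, 1 - b))
simple_circuit  = lambda a, b: a
```

That `equivalent(complex_circuit, simple_circuit, 2)` holds is Shannon’s point in miniature: an identity of the mind (Boolean simplification) licenses the removal of physical relays.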
Electronic circuits – i.e. matter – behave in accordance with Boolean algebra, with the rules of the mind. Mind over matter? Linking Leibniz’s invention of the binary code with electronic circuits now enabled machines to compute all numbers using just two digits, thus answering Turing’s question. The computer age began, characterized by the transformation of things into data, and by operations on this data itself as if it were something existential. This explains why mathematics is so, apparently inexplicably, effective in physics (Eugene Wigner, »The Unreasonable Effectiveness of Mathematics in the Natural Sciences«, 1960).
Digital philosophy doesn’t claim that everything can be formalized – on the contrary (Kurt Gödel, »Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I« [On Formally Undecidable Propositions of Principia Mathematica and Related Systems I], 1931). It recognizes that more exists than we can express in language, i.e. than we can formalize. Nor can everything that we think be formalized. Yet human beings increasingly try to grasp with their thoughts what is or could be, and to get an ever stronger grasp on – i.e. to formalize – their thinking. So the world of data doesn’t complete the world of things, words, and images; on the contrary, it transforms them into an open system.
Aside from the mathematical information theory of communication, numerous other telecommunication inventions were necessary – from Heinrich Hertz’s experiments (1886–1888) to the transistor of John Bardeen, Walter Brattain, and William Shockley (1947) – to create the infosphere, meaning the technical infrastructure of the data world. For some 150 years, the telematic media – telegraphy, telephony, television, radar, radio, satellite, Internet – have created a technical network spanning the globe, enabling the global exchange of data as well as the organization of the transportation of people and goods. With his experiments, Heinrich Hertz provided empirical proof of the existence of electromagnetic waves. This launched the age of wireless radio technology, which enabled the separation of messenger and message, allowing data to travel through space without the body of the messenger.
In the twentieth century, building on the technical innovations of radio technology, combined with a computer technology based on fundamental mathematical and logical research, a closely interconnected communications and information network of mobile media evolved – the infosphere: an envelope of radio and other electromagnetic waves covering the planet. By means of artificial, technical organs, human beings can, for the first time, use electromagnetic waves – for which they have no natural sensorium – for the wireless transmission of words, images, and other data. The social media that have changed our everyday life are part of these technical networks. Therefore, the equation “Machinery, materials, and men” (Frank Lloyd Wright, 1930), which was valid for the nineteenth and twentieth centuries, must be reformulated for the twenty-first century into the equation “Media, data, and men” (Peter Weibel, 2011). Since the replacement of the alphabetical code by the numeric code, algorithms – from the stock exchange to the airport – have become a fundamental element of our social order. Today, people live in a globally interconnected society, in which biosphere and infosphere are interpenetrating and interdependent.