Boken och biblioteket (The Book and the Library)

Boken och Biblioteket is now available to order from Ink Bokförlag. I look forward to reading it for several reasons, perhaps above all from an object-oriented perspective. The title, after all, invites such an expectation.

Some free associations around the title:

Sein und Zeit – The correct Swedish translation would be Vara och Tid (Being and Time), but for some reason it became Varat och Tiden, in the definite form, something the translator Richard Matz later came to regret. The distinction matters, since Sein und Zeit in its “entirety” is not object-oriented (only certain, admittedly important, passages are, e.g. the tool analysis). Compare Varat och Intet (Being and Nothingness), whose German title is Das Sein und das Nichts. Unfortunately, I have not read it yet.

But perhaps this existentialist digression was not the point. I doubt that either Heidegger or Sartre figures in B&B. Rather, what is interesting is the definite form as a critique of the widespread format nihilism that marks our present age. (When it comes to books, textism was after all born with television.)

One more thing comes to mind: books whose titles get abbreviations. I happened to write “B&B” because the link ends in b&b.html. Other abbreviations in frequent use are GT for the Old Testament (Gamla testamentet), ATP for A Thousand Plateaus and TQO for The Quadruple Object.

Tingets rätta rum (the rightful room of the thing)

Today I dropped by the district court (tingsrätten), where the so-called 15-year-old was acquitted of the charge of unlawful file-sharing. We only got to see a few minutes of the trial itself, since the defence wanted it held behind closed doors in view of the defendant’s young age. That did not matter much, for the real value usually lies in the conversations that are allowed to emerge around a case.

The paradigm example of productive district-court seminars, which I write about in Det nätpolitiska manifestet, is the Spectrial of 2009, i.e. the trial against The Pirate Bay. Time for a quote:

Spectrial instead took place at a physical location and thereby constituted a meeting point for people flowing in and out of the urban space. Inside the district court, small islands of people formed, planning future activities; outside, activists braved the winter cold in Piratbyrån’s bus; a few blocks away, people gathered in cafés to blog about what had happened during the day; and in pubs and clubs, parties were arranged that were more or less connected to the drama in the courtroom. Spectrial was, considered as an event, as much an occupation of a multitude of local places as a three-week legal process.

District courts are truly underrated public places! But public places do not exist without publics. They must first be ‘occupied’, and then the public conversation must be made. The first thing that was done was to set up a file-sharing network. Among other things, my book was downloaded.

Then we talked about the protest movements Netopia, Rättighetsalliansen and Ifpi. None of them, however, staged a protest against us right there. Instead it happened via the prosecutor, behind the closed doors.

After that we made a brave attempt to break with the audience when a reporter from the newspaper GT asked various questions. A common question is, of course, “what do you represent?”. Representation requires that someone stands for something or someone else’s interests, or speaks on behalf of others. So at this point we pointed at the computers and argued that no one needed to speak on anyone else’s behalf. The files were already being shared as we spoke. Thinking in terms of representation and audiences permeates our culture to such a degree that it has become a kind of standard abstraction that is routinely invoked: “A discussion is taking place in the public sphere”, “We must inform the general public”, “I speak for my members on this issue”.

The tunnel-like character of net politics always escapes that type of abstraction. It is high time we sharpened the analysis a few levels. I believe that the absurd situation of a school principal reporting his own students to the police, thereby risking wrecking their lives for the sake of a few files, can be avoided if we demonstrate the copy (in the sense of showing, instructing) in its utterly simple existence. First came the copy, then came the protest movement with its lawyers and industries!

More trials on the same theme are coming up in Gothenburg. I hope we get more opportunities to build networks and build conversations. (Next time we will bring better routers and more fika.)

How to study the social sciences, part VI

Time for another piece from my dissertation-in-the-making, this time on the blackboxing of statistical measurements. It took more time than I expected, mostly because I had to struggle with the statistics :D. Comments from statisticians are most appreciated (in Swedish or English). Here we go!

The black boxes of quantification and statistics

So far we have looked very directly at how blackboxing takes place, how it communicates via interfaces, and how black boxes and interfaces are combined in epistemic assemblages. There is, however, another riddle that has to be solved, namely that of quantification and statistics. Once more I shall give an example from the Sociology of Scientific Knowledge, to be more precise from one work by Donald MacKenzie. I will also consider Latour’s notion of centers of calculation, the history of statistics and its epistemic status, and then advance into some more recent research in the sociology of quantification.

Pearson and Yule – A statistical controversy

In his article Statistical Theory and Social Interests (1978), Donald MacKenzie analyzes a controversy between Karl Pearson and Udny Yule, both regarded today as pioneering statisticians, and an emerging breakthrough in statistical methods that took place during the first one and a half decades of the 20th century.

The controversy between Pearson and Yule concerned how to measure association at the nominal scale level. Pearson had in 1905 suggested the tetrachoric coefficient as a solution to the problem of quantifying nominal scales, something which Yule criticized openly over several years (1). MacKenzie interprets this controversy through an analysis of their respective differences in social interests:

/…/ Pearson’s commitment to eugenics played a vital part in motivating his work in statistical theory. Pearson’s eugenically-oriented research programme was one in which the theories of regression, correlation and association played an important part /…/ Regression was originally a means of summing up how the expected characteristics of an offspring depended on those of its parents; the bivariate normal distribution was first constructed by [Francis] Galton in an investigation of the joint distribution of parental and offspring characteristics. (MacKenzie 1978: 53)

MacKenzie’s point is that advances in statistics, even though regarded as esoteric and mathematically ‘disembodied’, are guided and influenced by a set of social and cognitive interests that orient the goals and directions of what to develop and what to disregard. British statistics in the early 20th century was thus, at least in part, shaped by needs within eugenics and population control. In Britain at the time, eugenics and ‘national efficiency’ were regarded as legitimate political options, and were even discussed in government departments. Yule, by contrast, had no affection for eugenics, and instead argued that heredity was a largely unimportant factor in comparison with environmental ones (MacKenzie 1978: 58-59).

What we have is thus a classical social explanation of how statistics develops in line with needs defined by group interests (such as those of the eugenics movement) and larger social interests (for example, state governance). What MacKenzie pays less attention to is what happens next:

Contemporary statistical opinion takes a pluralistic view of the measurement of association, denying that any one coefficient has unique validity /…/ Yule’s Q remains a popular coefficient, especially amongst sociologists. Pearson’s tetrachoric coefficient, on the other hand, has almost disappeared from use except in psychometric work. (MacKenzie 1978: 65)
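
For concreteness, a gloss of my own (not MacKenzie’s): for a 2x2 table with cell frequencies a, b, c and d, Yule’s Q is

    Q = (ad - bc) / (ad + bc)

a coefficient running from -1 to 1, with 0 indicating no association.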

I am not in a position to evaluate whether this holds for statistics in general. What I do find necessary, on the other hand, is to think the dispersion, usage and effects of statistical methods within the terminology of blackboxing. In the early 20th century, many of the core statistical measurements used in the social sciences today were developed: the chi-square test, Pearson’s r, Galton’s advances in correlation and regression, etc. (see Hacking 1990: 180-188).

Just as in the case of the Michelson-Morley experiments, even now-deprecated statistical methods may very well come to reinforce or weaken which black boxes are opened and which ones are left closed. A statistical method may be blackboxed, taken out of its context of discovery, and applied widely. Or it may be broken, considered obsolete, or simply veiled in historical darkness for other reasons, perhaps only to re-emerge in the detailed archives of the historian of science.

An example of a ‘successful’ blackboxing is Pearson’s r. In a textbook on social-scientific methods, written by Gothenburg researchers close to the SOM-institute and taught in many social science classes locally, an interesting passage appears:

The calculation of Pearson’s r is complicated to say the least /…/ Even though it can be useful to make the calculations yourself on some occasion /…/ – not long ago researchers had to employ assistants to be able to do these calculations at all /…/ – it is of course [today] the computers that calculate Pearson’s r for us.
(Esaiasson et al. 2002: 392)

The Pearson product-moment correlation coefficient (r) demands time-consuming work and plenty of mathematical skill to calculate manually. The first moment of delegation thus meant involving more humans (assistants) to do this work. Finally, today we have computers doing the same work in milliseconds. A statistician, or social scientist for that matter, must of course master the usage and interpretation of the computer’s output, but in routine work he or she is able to forget the assistants and the hard work it once took to use this statistical tool.
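
To make the blackboxing tangible, here is a minimal sketch of the work the computer now does for us, in plain Python with no statistical packages (the data is a made-up toy example):

    import math

    def pearson_r(x, y):
        """Pearson product-moment correlation of two equal-length samples."""
        n = len(x)
        mean_x = sum(x) / n
        mean_y = sum(y) / n
        # Numerator: the summed co-deviations from the means.
        cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
        # Denominator: the product of the two root-sum-of-squares terms.
        ss_x = math.sqrt(sum((xi - mean_x) ** 2 for xi in x))
        ss_y = math.sqrt(sum((yi - mean_y) ** 2 for yi in y))
        return cov / (ss_x * ss_y)

    # Toy data: two roughly linearly related variables.
    print(pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 6]))  # ~0.85

What once required assistants is now a dozen lines that most users will never read: the formula itself has become the inside of the black box.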

Thus it is possible to conclude that statistical measurements, developed in the context of the British eugenics movement, can be dislodged from their context of discovery through blackboxing and find their way into the software packages used today for statistical calculation, as standardized measurements and tests for evaluating the quality of survey data. Now, this de-contextualization not only means that it is possible to forget the tedious work that had to be done before computers. It also means that it would be absurd to accuse someone calculating Pearson’s r of being a follower of eugenics, just as it is absurd to accuse someone of militarism for using the internet merely because the internet was originally constructed as a military computer network. For statistics, Latour’s actualist principle still applies: the fate of facts and machines is in later users’ hands, and their qualities are a consequence of collective action (Latour 1987: 29; see also the sections on Latour above).

But statistics are not only blackboxed as they are assembled into research practices. They also function as interfaces that are able to translate research results into comprehensible facts. The time is ripe to go further along this line, and to investigate how the modern state in particular has requested such scientific information.

Footnotes:

1. The controversy is much more elaborate than this. To save space, I refer the reader to MacKenzie’s article in its entirety.

How to study the social sciences, part V

Today was a very good day of dissertation writing. I managed to scribble down 10k characters, and I decided to put them up here on the interweb right away. This draft concerns what I will call “withdrawn hardened functions” in epistemic objects. It is partly influenced by object-oriented ontology, with which I am trying to enrich the standard Science and Technology Studies terminology. It’s quite a heavy read, so enjoy or surf along!

So far I have only dealt with the positive domains of scientific knowledge: blackboxing, interfaces and assemblages as productive elements. But a core problem in the theory of science is what is unknown, what hides in the unconscious or hidden domains of imperceptibility, what is included and what is left behind. This means that in order to deal with this complex issue I also need to adopt a terminology for talking about what evades a concrete epistemic assemblage.

On these questions, at first glance, pure actualism gives little room to navigate. I will propose that the black boxes I encounter have withdrawn hardened functions, which make them combinable and plastic. This feature has been called “immutable mobiles” by Latour (1999: 306-307), and it falls close to what Star & Griesemer (1989) call a “boundary object”. As mentioned before, blackboxing is in fact a process of forgetting, embedding and ‘hard coding’ the tasks and processes that are needed to produce, on a surface level, something that is positive knowledge. For example, a pre-compiled dataset of statistical information gathered from a survey makes computerized statistical calculation possible, and quite user-friendly compared to doing it manually, precisely because in a given moment it gives us the opportunity to forget thousands of questionnaires and how they were collected and assembled, and instead pay attention to creating bars and diagrams for a scientific report.
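
A minimal sketch of that forgetting in code (plain Python; the file name and the survey variable are hypothetical examples of my own):

    import csv
    from collections import Counter

    # The pre-compiled dataset: thousands of questionnaires reduced to one file.
    with open("survey.csv", newline="") as f:
        answers = [row["trust_in_government"] for row in csv.DictReader(f)]

    # The researcher now works with frequencies and diagrams, not questionnaires.
    print(Counter(answers))  # e.g. Counter({'high': 612, 'medium': 401, 'low': 237})

Two lines of reading code stand in for the entire collection and assembly process, which can be safely forgotten while the report is written.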

This is a wholly different approach from the outcome of Kuhnian thinking. Whereas the process of forgetting is historically very dramatic in Kuhn, I shall argue that it is shallow compared to the object-oriented aspect of blackboxing. Let’s take a look at a central passage in The Structure of Scientific Revolutions:

In short, they [textbooks] have to be rewritten in the aftermath of each scientific revolution, and, once rewritten, they inevitably disguise not only the role but the very existence of the revolutions that produced them. Unless he has personally experienced a revolution in his own lifetime, the historical sense either of the working scientist or of the lay reader of textbook literature extends only to the outcome of the most recent revolutions in the field. (Kuhn 1996: 137)

This leads Kuhn to think of different historical paradigms as incommensurable, and to the idea that the disguising of past revolutions produces a view of scientific progress as linear and cumulative. But while these two points are valid and refreshing for the history of science, they are very clumsy for closer studies of scientific activities. A paradigm would then appear as a monstrously large black box, where a whole generation of scientists is only able to think ‘within the box’, while the actual workings of the machinery are veiled. Only when enough anomalies appear, according to Kuhn, do scientists start to suspect that the whole paradigm might be wrong.

Two problems arise here. The ‘monstrous’ aspect of Kuhnian historicity leads to a sort of empirical over-determination. In the reports of the SOM-institute we find, for example, a terminology resembling the sociology of Durkheim, Parsons, Merton etc. The methods of surveys and quantification are likewise ‘borrowed’ from the intensified usage of these methods in sociology towards the end of the 19th century. Even though this is true on one level, I argue that it adds very little to our understanding of what is done and of what that practice means. The abstractness of paradigms, rather ironically, makes the co-production of scientific objects and other objects invisible. To take a crude example (which it would be unfair to attribute to Kuhn himself): if I read in the local newspaper “The researchers talk about a Gothenburg effect and a slow norm shift” (as already quoted in the prelude section of this chapter), and then conclude that this is knowledge within a Durkheimian paradigm since it talks about norms and norm shifts, I instantly remove myself from a process that has significant value for translating the research practice of the SOM-institute into a circulation of facts. The concept of norms is indeed built into the theoretical tools used (which in turn may be blackboxed), but if we ignore the fact that another actor, the local newspaper Göteborgs-Posten, made use of, and valued highly enough, the much-debated question of corruption scandals, then the role of science and its interfacing with other societal assemblages is abruptly veiled in darkness, and the analysis stops at what I consider to be a shallow level.

Another, more serious flaw in Kuhn-inspired theories of science is their human-centered character. For science to change, either scientists need to change their beliefs, theories and everyday practices, or they have to be replaced by a new generation of scientists (1). This is not true for technology, and with technoscience it is not valid either. Let me give two examples, one simple and one advanced:

Example 1 – The hammer

A carpenter uses hammers (2) as a routine piece of equipment when building houses. The hammer is connected to other objects such as nails, human users and wooden planks. Hammers are constructed objects, and in one respect they reconfigure the human user too, who has to learn how to use them. One could even say that hammers are paradigmatic technologies of house building, since they imply methods, can be calculated with by architects, etc. Now, the hammer may also be used to commit a brutal murder. Then it becomes a piece of evidence in a murder investigation, is placed in a plastic bag, checked for fingerprints, and may even be the technical evidence that puts the murderer in prison for several years. A skilled carpenter knows the difference between a good and a bad hammer, but in the moments of driving nails into wood his or her attention lies elsewhere than with the technological advances, means of production and price of the hammer. It is precisely because it is blackboxed, because it may withdraw from full inspection and reflection, that it is a powerful tool. As the house is completed and populated with new people, they in turn do not need to know anything about hammers, even though hammers may be ‘implicated’ in the house and need to be brought forth once again when the house is repaired. The hammer is thus more than its use together with nails and planks, more than the carpenter’s skills, and more than evidence in a courtroom. The hammer survives the house.

Example 2 – Experiments in relativity

Even though I consider the Sociology of Scientific Knowledge unsuitable to my theoretical needs, Harry Collins and Trevor Pinch (1993) have produced a textbook example of how scientific experiments may reinforce each other across historical paradigms. In their chapter Two Experiments that ‘Proved’ the Theory of Relativity, Collins & Pinch set out to understand how the 1919 solar eclipse experiment led by the physicist Arthur Eddington was accepted so swiftly, even though the results of the actual experiments were quite poor and inconclusive due to the harsh conditions of photographing starlight as it was supposedly displaced by the large gravitational field of the sun (thus proving the theory of relativity). The experiment was very difficult to perform at the time: cameras had to be mounted on remote islands in time for the solar eclipse, and they were sensitive to temperature and vibrations due to the long exposures needed to make the photographs.

A contributing factor to the quick acceptance of the inconclusive results of the Eddington experiment was, according to Collins & Pinch, that beginning in 1881 Albert Michelson (later in collaboration with Edward Morley) had performed a series of experiments with a wholly different purpose. They wanted to measure the ‘aether drift’ that was thought to occur as the earth moved through space. It was believed that light traveled through the medium of ‘aether’, and thus the movement of the earth would produce slightly different speeds of light in different directions. These experiments, which were repeated over half a century, failed to detect any significant variations, and thus many came to consider the speed of light constant.
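
A gloss of my own on the scale involved: for an interferometer arm of length L, the expected aether-drift time difference between the two arms is, to a first approximation,

    Δt ≈ (L/c) · (v²/c²)

and with the earth’s orbital speed v ≈ 30 km/s, v²/c² is of the order 10⁻⁸, which is why the experiments demanded such extreme sensitivity and kept being refined for decades.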

Now, it may seem that the Eddington experiment and the Michelson-Morley experiments are disconnected. But Collins & Pinch connect them, despite their being about two different things:

The way the 1919 observations fit with the Michelson-Morley experiment should be clear. They were mutually reinforcing. Relativity gained ground by explaining the Michelson-Morley anomaly. Because relativity was strong, it seemed the natural template through which to interpret the 1919 observations. (Collins & Pinch 1993: 52)

As the Michelson-Morley experiments kept failing, they unintentionally reinforced Einsteinian relativity theory, because it presupposes the constant speed of light. The results of Michelson-Morley, even though they were a ‘failure’, could become a component in strengthening the Eddington experiments, even though Eddington had a wholly different theoretical purpose. What I am getting at here is a somewhat dramatic comparison: just as the hammer can be used both for carpentry and for murder, scientific results, methods and machinery can be used for very different purposes, in different setups and epistemic practices. Even though carpentry and relativity physics are radically different activities, the point is that parts and components can be taken out of their contexts, since they are rendered mobile by way of blackboxing. Assemblages, architectural or scientific, mobilize and assemble their equipment, most of which is already there. But assembling and selecting which components to choose is not only about actively knowing where to go. It is equally important to forget. Be it the theoretical functioning of the hammer or the ‘aether wind’, exclusion is as important as inclusion.

This, I will argue, is also the case for the social sciences, especially concerning their use of quantification, which will be the topic of the next section.

(1) Of course paradigms may extend over centuries, but it can still be said that Kuhn organizes the durability of scientific beliefs around scientists and communities of researchers.

(2) Selecting this example is a tribute to Heidegger’s tool analysis in §15 of Sein und Zeit (1972 [1927]), where the hammer is used as an example of how a piece of equipment always stands in relation to other objects, and of how equipment has to withdraw from consideration in order to be used for something.

Booklooking* The Quadruple Object: Gdansk – Warsaw ICC

Two of my favourite objects are books and trains. Two of my dearest friends are Opalmar and Isabelle. Assembling these four objects in an event equals a great time!

So, I had them start reading The Quadruple Object, while I simultaneously gave mini-lectures on Heidegger’s tool analysis and OOO philosophy. The train served as my example, and it is a fantastic one because it is clearly a system, a very complex one in fact, where components interact all the time. But as an object it withdraws from the rails, the wires and the passengers, and must do so in order to be more than just an effect or a temporary occasion.

Poland, by the way, is one of my favourite places, and while re-reading the section “Anti-Copernicus” in TQO, I came to think of the fun fact that Copernicus actually stayed in Gdansk. Oh, and Schopenhauer was born there too. Philosophical travel at its best!

Since we had an early morning behind us and a late night before us, the train after a while gently withdrew from our human consciousness and tucked us in for a nap.

Finally home again in Gothenburg, it is time to give my full attention to the dissertation again, so expect heavy texts in English in the near future.

* Footnote: To “booklook” something is a philosophical fashion hack invented by my friend Karl, who is here wearing a DeLanda book.

China and censorship

Today I took part in Sahlberg on P1, where Hanna Sahlberg, Ola Wong and I talk about Chinese internet censorship. The programme gives a good deal of background and also looks back to the nineties.

I mostly talk about the Golden Shield and about how the “clones” of the “social media” so popular in the West (Twitter and the like) work to integrate the surveillance itself. Actually, it is wrong to call them copies or clones. Sina.com runs a larger microblog than Twitter, and soon China will have as many internet users as the US and the EU combined. Fair is fair, and a net is a net.

On the other hand, one may question whether the Chinese net really is an internet. The same question is always just as pertinent here in Europe.

Links:

Listen on SR’s website

Download the .mp3 from SR

Download the .ogg from the internet

Why technological concepts are smarter than sociological ones: interfaces I

A key question in philosophy and civil sociology (as opposed to ordinary State sociology) is how entities and objects communicate, interrelate and every now and then shape new emergent bodies.

Engineers already have one such brilliant concept: interface. But before elaborating on that, let us see what happens when an average sociologist “borrows” the terminology of technology:

/…/ interface analysis grapples with ‘multiple realities’ made up of potentially conflicting social and normative interests, and diverse and contested bodies of knowledge. It becomes imperative, then, to look closely at the question of whose interpretations or models (e.g., those of politicians, scientists, practitioners or citizens) prevail in given scenarios and how and why they do so (Long 2001: 88)

In two sentences much of the world disappears, and we are left with the “multiple realities” of human access, where there is nothing but “social” and “normative” interests. And the only objects doing the connecting are “politicians, scientists, practitioners or citizens”. What a weak interface! No wonder sociologists never seem to find the missing masses.

To understand concrete things and events, such as workplaces, scientific laboratories, parties, infrastructure and telephones, we are far better off turning to computer science or industrial design.

Interfaces can be made of hardware or software, of object-oriented code, or, in the case of soft humans, of places, protocols and translations.

Take the average desktop computer. It has multiple hardware interfaces, some of them on the outside, such as USB ports, VGA screen connectors and ethernet plugs. These in turn follow specific protocols to interface with other devices, such as keyboards, screens and the internet (TCP/IP is (one of) my religions <3). Besides these pretty blackboxed protocols (to hackers and programmers they are gray or white) there is usually a GUI (graphical user interface) and/or a CLI (command line interface). The average Macintosh user uses only the GUI to render advanced computing comprehensible to his or her performances. Different interfaces enable you to do different things, at different speeds and with different accuracy, and an off-the-shelf computer can thus be many different things. The same hardware can be configured to be a web server, a crypto-device or just a word processor for someone writing a novel. This is one example of the fantastic power of interfaces and their ability to make things multiple!
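
A minimal sketch of such a reconfiguration, assuming a standard Python installation (the port number is an arbitrary example): a few lines turn an idle machine into a web server.

    # Serve the current directory over HTTP; any browser on the network
    # can now interface with this machine as a "web server".
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    PORT = 8000  # arbitrary example port
    HTTPServer(("", PORT), SimpleHTTPRequestHandler).serve_forever()

The same hardware, reconfigured through its software interfaces, has become a different thing.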

But it’s not only computers and state-of-the-art technology that come with interfaces. Take a library. It too has plenty of hardware and software interfaces. There is a catalogue of thousands of books, there are chairs and tables for interfacing with the books you pick off the shelves, and if the library is next-level, it is equipped with a café where humans can interact using the protocol “language”.

Once you study one of these interfaces closely, you find even more interfaces. The library catalogue is programmed with lines of code that interact with computer hardware, which in turn synchronizes data via the network interface. The books on the shelves usually come in the standard ink-on-paper GUI, with the exception of tactile alphabets and audio books. And the café interface of interacting humans may be configured to promote people sitting together, perhaps excluding certain people by adjusting the price of coffee, etc.

Interfaces are always created with a certain degree of plasticity, so that they are able to go beyond a single-purpose link. The USB port is able to talk to thousands of devices, linked in a serial fashion. Tahrir Square is able to host millions of people overthrowing a dictator, and my notebook allows me to scribble down text in any language, with the bonus feature that I can draw pictures, diagrams and funny cats.

But the plasticity is always conditioned and configurable. My firewall prevents malicious traffic from entering my network interface, the electronic gates of the library try to stop book thieves, and by using academic jargon in the pub I can, in a very unsympathetic way, exclude people from entering a conversation.

But on a philosophical level there is a more profound feature of interfaces: they seem to interdefine sensible objects. With a few keystrokes on my command line interface I can turn an old half-wrecked computer into a web server that can be reached and interacted with over the internet. By hanging out in cafés talking about cool clothing, aesthetics and trendy cigarettes, I can turn myself into a hipster, and by reconfiguring a street with concrete barriers the local municipality can change the identity of a noisy, traffic-saturated street into a posh promenade for window-shoppers (gentrification).

These are only a few preliminary thoughts on the roles of interfaces and objects. Perhaps more will follow another day.