Traumas of Code (Code)


Excerpt from: N. Katherine Hayles, "Traumas of Code," Critical Inquiry 33, no. 1 (Fall 2006).

Language isn't what it used to be. In computer-mediated communication, including cell phone conversations, email, chat room dialogues, blogs, and all documents written on a computer, the language we learned at mother's knee is generated by computer code. Though computer-mediated language may appear to flow as effortlessly as speaking face-to-face or scribbling words on paper, complicated processes of encoding and decoding race up and down the computer's tower of languages as letters are coupled with programming commands, commands are compiled or interpreted, and source code is correlated with the object code of binary symbols, transformed in turn into voltage differences. Most of this code is inaccessible to most people. At the level of binary code, few are equipped to understand it with fluency, and even fewer can reverse engineer object code to arrive at the higher-level languages with which it correlates.1 As a result, contemporary computer-mediated communication consists of two categories of dynamically interacting languages: so-called natural language, which is addressed to humans (and which I will accordingly call human-only language); and computer codes, which (although readable by some humans) can be executed only by intelligent machines.
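
To make this tower of languages concrete, here is a minimal Python sketch (an editorial illustration added to this excerpt, not part of Hayles's text) that follows a single typed letter one level down, from character to code point to binary pattern; everything beneath that level, down to the voltage differences, remains invisible to the writer:

    # One character of human-only language descending toward binary code.
    char = "A"                              # what the writer types
    code_point = ord(char)                  # its numeric code point: 65
    byte = code_point.to_bytes(1, "big")    # the byte stored in memory (0x41)
    bits = format(code_point, "08b")        # the binary pattern: '01000001'
    print(char, code_point, byte, bits)
    # Below this level lie machine instructions and, ultimately, the
    # voltage differences that no reader ever sees directly.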

The vast majority of the literate public who are not computer programmers become aware of this dynamic interaction through ordinary experiences. The easy flow of writing and reading human-only languages on computers, increasingly routine for the millions who populate cyberspace, is regularly interrupted by indications that unseen forces are interacting with the language flow, shaping, disrupting, redirecting it. I mistype a word, and my word processing program rearranges the letters. I think I am making the keystroke that will start a new paragraph, and instead the previous paragraph disappears. I type a URL into the browser and am taken to a destination I do not expect. These familiar experiences make us aware that our conscious intentions do not entirely control how our language operates. Just as the unconscious surfaces through significant puns, slips, and metonymic splices, so the underlying code surfaces at those moments when the program makes decisions we have not consciously initiated. This phenomenon suggests the following analogy: as the unconscious is to the conscious, so computer code is to language. I will risk pushing the analogy even further; in our computationally intensive culture, code is the unconscious of language.
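
The silent substitutions described above can be sketched in a few lines of Python; the word list and replacement rule below are invented purely for illustration and stand in for the far more elaborate logic of a real word processor:

    # A toy autocorrect: the program substitutes a "corrected" word without
    # asking. The dictionary is invented for illustration only.
    corrections = {"teh": "the", "recieve": "receive", "langauge": "language"}

    def autocorrect(word):
        # The decision to rewrite happens in code, not in the writer's
        # conscious intention.
        return corrections.get(word.lower(), word)

    typed = "teh langauge we learned"
    print(" ".join(autocorrect(w) for w in typed.split()))
    # -> the language we learned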

How literally should we take this aphorism, hovering somewhere between an analogy and a proposition? If we take it seriously as a proposition, a skeptic may object that code is easily read and understood, whereas the unconscious is inherently unknowable. Such an objection depends on a naïve notion of programming that supposes code is transparently obvious to anyone who knows the coding language. On the contrary, people who have spent serious time programming will testify that nothing is more difficult than to decipher code someone else has written and insufficiently documented; for that matter, code one writes oneself can also become mysterious when enough time has passed. Since large programs — say, Microsoft Word — are written by many programmers and portions of the code are recycled from one version to the next, no living person understands the programs in their totality. Indeed, the number of person-hours necessary to comprehend a large program suite such as Microsoft Office exceeds a working lifetime.2 In the case of evolutionary algorithms where the code is not directly written by a human but evolves through variation and selection procedures carried out by a machine, the difficulty of understanding the code is so notorious as to be legendary. These examples demonstrate that in practice both code and the unconscious are opaque, although with code it is a matter of degree, whereas the opacity of the unconscious is assumed. Psychoanalysts position themselves as informed theorists and practitioners who can understand, at least partially, the workings of the unconscious; programmers constitute the group who can understand, at least partially, the workings of code.
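
For readers unfamiliar with such systems, the following Python sketch of variation and selection (a toy example, far simpler than the evolutionary algorithms Hayles mentions, with target and parameters chosen arbitrarily here) shows how a working result can emerge that no human wrote directly:

    import random

    # Evolve a bit string toward a target pattern through mutation (variation)
    # and survival of the fittest (selection).
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

    def fitness(genome):
        return sum(1 for g, t in zip(genome, TARGET) if g == t)

    def mutate(genome, rate=0.1):
        return [1 - g if random.random() < rate else g for g in genome]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break
        survivors = population[:10]                       # selection
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(10)]     # variation

    print(generation, population[0])

Even at this toy scale, reconstructing why a particular lineage won is harder than reading the dozen lines that produced it, which is the point of the comparison.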

A more cogent objection is articulated by Adrian Mackenzie in his groundbreaking study Cutting Code, where he considers code as the site of social negotiations that structure and organize human agency, behavior, and intention.3 His book illustrates the advantages of not black-boxing code. This stance is a valuable option, and the rich insights in his work testify to the need for more studies of this kind. Nevertheless, the argument Mackenzie makes for the agency of code — one of his major points — can be appropriated for the case I am making here for code as the unconscious of language. With admirable clarity, he shows that code is not merely a neutral tool but an ordered system of cognitions making things happen in the world, both among humans who can (sometimes) understand the code and those who cannot. The agency of code underscores its similarity to the unconscious in producing effects even when it remains hidden under a linguistic surface.

A framework extending code's effects into the nonlinguistic realm is provided by Nigel Thrift's technological unconscious.4 Thrift uses the term to reference the everyday habits initiated, regulated, and disciplined by multiple strata of technological devices and inventions, ranging from an artifact as ordinary as a wristwatch to the extensive and pervasive effects of the World Wide Web. Implicit in his argument is the idea that both the conscious and unconscious are influenced and shaped by the technological environments with which humans have surrounded themselves as far back as the domestication of fire. The argument suggests that the unconscious has a historical dimension, changing in relation to the artifactual environment with which it interacts. Thrift's vision resonates with recent arguments for thinking of cognition as something that, far from being limited to the neocortex, occurs throughout the body and stretches beyond body boundaries into the environment. Andy Clark and Edwin Hutchins, among others, see human thought as taking place within extended cognitive systems in which artifacts carry part of the cognitive load, operating in flexible configurations in which are embedded human thoughts, actions, and memories. For Hutchins, an anthropologist, an extended cognitive system can be as simple as a geometric compass, pencil, and paper.5 It is not only a metaphor, he asserts, that drawing a line on a navigation chart constitutes remembering, and erasing it is forgetting. Clark carries the argument further to envision humans as natural-born cyborgs who have, since the dawn of the species, excelled in enrolling objects into their extended cognitive systems, from prehistoric cave paintings to the laptops, PDAs, and cell phones pervasive today.6

...

Enmeshed within this flow of data, human behavior is increasingly integrated with the technological nonconscious through somatic responses, haptic feedback, gestural interactions, and a wide variety of other cognitive activities that are habitual and repetitive and that therefore fall below the threshold of conscious awareness. Mediating between these habits and the intelligent machines that entrain them are layers of code. Code, then, affects both linguistic and nonlinguistic human behavior. Just as code is at once a language system and an agent commanding the computer's performances, so it interacts with and influences human agency expressed somatically, implemented for example through habits and postures. Because of its cognitive power, code is uniquely suited to perform this mediating role across the entire spectrum of the extended human cognitive system. Through this multilayered addressing, code becomes a powerful resource through which new communication channels can be opened between conscious, unconscious, and nonconscious human cognitions.

Code and Trauma

A promising site for the possibility of new communication channels is trauma. In clinical accounts of trauma, such as those presented by Bessel van der Kolk and Onno van der Hart, trauma is understood as overwhelming the human capacity to process it.9 In this view, traumatic events are experienced and remembered in a qualitatively different way from ordinary experience. The characteristic symptoms of trauma — dissociation, flashbacks, reenactments, frighteningly vivid nightmares — suggest that traumatic memories are stored as sensorimotor experiences and strong emotions rather than as linguistic memory. Dissociated from language, trauma resists narrative. When traumatic events are brought into the linguistic realm, they are frequently divorced from appropriate affect. As Dominick LaCapra puts it, "Trauma brings about a dissociation of affect and representation: one disconcertingly feels what one cannot represent; one numbingly represents what one cannot feel."10 Moreover, van der Kolk and van der Hart's research indicates that when people experience traumatic reenactments while sleeping, their brain waves differ significantly from those characteristic of REM dreams. In light of these results, they suggest that traumatic nightmares should not be considered dreams but a different kind of phenomenon; to recognize the distinction, I will call traumatic reenactments and related experiences that occur outside and apart from conscious awareness the traumatic aconscious.

Experienced consciously but remembered nonlinguistically, trauma has structural affinities with code. Like code, it is linked with narrative without itself being narrative. Like code, it is somewhere other than on the linguistic surface, while having power to influence that surface. Like code, it is intimately related to somatic states below the level of consciousness. These similarities suggest that code can become a conduit through which to understand, represent, and intervene in trauma. Code in this view acts as the conduit through which traumatic experience can pass from its repressed position in the traumatic aconscious to conscious expression, without being trapped within the involuntary reenactments and obsessive repetitions that typically constitute the acting out of traumatic experience.

This possibility was explored in the early days of virtual reality, through simulations designed to help people overcome such phobias as fear of heights, agoraphobia, and arachnophobia. The idea was to present a simulated experience through which the affected person could encounter the phobia at a distance, as it were, where fear remained at a tolerable level. As the person grew habituated and less fearful, the simulated experience was gradually intensified, with habituation occurring at each step. When the stimulus reached real-life levels and the person could tolerate it, the therapy was considered successful.11
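
The stepwise logic of such graded exposure can be sketched schematically in Python; the levels, tolerance threshold, and fear-report function below are placeholders invented for this illustration, not clinical parameters from the studies cited:

    import random

    # Schematic graded exposure: intensify the simulated stimulus only after
    # reported fear at the current level falls below a tolerance threshold.
    # All numbers are placeholders, not clinical values.
    def run_exposure(report_fear, levels=10, tolerance=3.0, max_sessions=200):
        level = 1
        for _ in range(max_sessions):
            fear = report_fear(level)      # e.g., a 0-10 self-report
            if fear <= tolerance:
                if level == levels:        # real-life intensity now tolerated
                    return "therapy considered successful"
                level += 1                 # step up the stimulus
        return "continue therapy"

    # Toy fear model: fear scales with the stimulus but fluctuates enough
    # for habituation to carry the subject past each threshold.
    print(run_exposure(lambda level: random.uniform(0, level)))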

...

To explore these interrogations of the role of code in the cultural Imaginary, I will focus on three works, each setting up a different relationship between trauma and code and each produced within a different medium. First, William Gibson's print novel Pattern Recognition represents a complex transmission pathway for trauma in which code plays a central role, by breaking the cycle of obsessive repetition and allowing the trauma to reach powerful artistic expression that can touch others and even initiate a process of healing.12 Pattern Recognition makes extensive use of ekphrasis, the verbal representation of a visual representation, creating through its verbal art the representation of video segments released on the internet (and therefore mediated through code).13 The footage, as the 135 segments are called by those who avidly seek them out, becomes a topic of intense interest and speculation for the online discussion site F:F:F (Fetish:Footage:Forum), leading to a confrontation with trauma staged on multiple levels. Second, Mamoru Oshii's film Avalon explores a different problematic, how code controls and delimits the space of representation.14 Compared to the sensory richness and infinite diversity of reality, computer simulations are necessarily much more limited, typically evolving only within the parameters specified by the code. The film sets up a structural dichotomy between real life and the eponymous virtual reality war game Avalon. Death is the ultimate signifier separating the real world from the simulacrum, for in the game "reset" can be called and the game replayed. Code lacks the seriousness of real life because it provides only a simulacrum of death, not the thing itself. Paradoxically, the inability to experience the ultimate trauma becomes itself the presenting trauma of Avalon, a condition generated by and mediated through code. Finally, Jason Nelson's online fiction Dreamaphage takes this implication to its logical conclusion, presenting code as an infectious agent that inevitably leads to death.15 The three works thus present a spectrum of possibilities, from code opening the way to overcoming trauma, to code becoming so ubiquitous it threatens the very idea of real life, and finally to code as a virus eating away at life from the inside. Their differences notwithstanding, all three works entwine code with trauma and explore code's ability to influence and entrain human conscious, unconscious, and nonconscious cognition.

The different thematic significations of code in these works correlate with how deeply code entered into the work's production, storage, and transmission. As a print novel, Pattern Recognition was produced by manipulating electronic files. Indeed, digital encoding has now become so essential to the commercial printing process that print should properly be considered as a specific output form of digital text. Code thus generated the text but was not necessarily involved in its transmission or storage. Code was also used in the production of Avalon, created through a combination of filming live actors, generating special effects through computer graphics, and using nondigital effects such as hand mattes. In contrast to the print novel, code was also involved in transmission and storage processes, especially in marketing the film as a DVD. For the online work Dreamaphage, code is obviously crucial in all phases of creation, storage, and transmission. As code enters more deeply into the production and dissemination of these works, they become more concerned about the adverse effects of code on the fabric of reality. Thematic anxiety about code within the text thus appears to be reflexively entwined with how deeply code was involved in the production of the work as an artistic object. The more the work depends on code, the more it tends to represent code as not merely involved with traumatic pathways but itself the cause of trauma.

At crucial points in the narratives, each work highlights a doubled articulation, as if acknowledging the double address of code to humans and intelligent machines. The specific configuration of the doubling serves as a metaphor for the work's exploration of the ethical significance of coupling code with trauma. In Pattern Recognition, the doubled articulation connects a physical wound with the representational space of the footage, suggesting that the transmission pathways opened by code can overcome dissociation by forging new associations between life and fiction. In Avalon, doubling blurs the boundary between life and simulation; rather than promoting healing, the interpenetration of life and code troubles the quotidian assumption that there can be life apart from code. In Dreamaphage, doubling takes the form of imagining a physical virus that is indistinguishable from viral computer code. Here the transmission pathway opened by code is figured as an epidemiological vector along which disease travels, with fatal results for human agency, consciousness, and life. The implication is that code is a virulent agent violently transforming the context for human life in a metamorphosis that is both dangerous and artistically liberating. Notwithstanding the different ways in which the encounter with code is imagined, the works concur in seeing code as a central component of a complex system in which intelligent machines interact with and influence conscious, unconscious, and nonconscious human behavior.


Dreamaphage: Infecting Code

In Jason Nelson's online digital fiction Dreamaphage, code penetrates reality by first colonizing the unconscious. The backstory is narrated by Dr. Bomar Felt, investigating doctor for the Dreamaphage virus. People infected with the virus start dreaming the same dream every night; the dream differs from person to person, but for any one person it remains the same. Becoming increasingly obsessed with the dream, the infected person finds that it starts looping, a term significantly associated with the programming commands of machine cognition rather than the putative free will of humans. Soon the dream occupies waking thoughts as well as sleeping visions. Within three to four months after initial onset, the infected person slips into a coma and dies. Dr. Felt has encouraged patients to keep dream journals, and he suggests that they may hold the key to understanding the virus.
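
The programming sense of "looping" invoked here is literal: a loop whose condition never becomes false repeats the same content indefinitely. A generic Python illustration (not code from Nelson's work):

    # A literal endless loop: the same "dream" repeats until the process
    # is killed from outside; nothing inside the loop can wake it up.
    dream = "the same dream, every night"
    while True:          # the condition never becomes false
        print(dream)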

The next screen, an interactive animation programmed in Flash, shows rectangles whirling within a frame, suggesting that the work proceeds as an exploration of this digital space rather than as a linear account. Represented in diminishing perspective, the space seems to recede from the screen, intimating that it is larger than the screen can accommodate, perhaps larger than anyone can imagine. The navigation requires the user to catch one of the rectangles and, with considerable effort, drag it into the foreground so it can be read. The task is difficult enough that the user may feel relieved when she finally succeeds and finds the rectangle imaged as a small handmade book. If so, the relief is short-lived, for she discovers that the book's contents can be accessed only by laboriously catching the lower page corner and carefully dragging it to the other side, as if the work were punishing her for her desire to return to the simplicity and robustness of a print interface.

The dream journal narratives are wildly incongruous, telling of chairs impossible to move, grocery coupons exploding under the shopper's hat, and skin cells inhabited by couch potatoes. They are accompanied by clever interactive animations that act not so much as illustrations as performances accentuating the surrealistic mood. The following passage illustrates the logical disjunctions that the verbal narratives enact:

And by sunlight I mean those sparkling particles the super-intelligent viruses manipulating the fiery burst we call the sun use [sic] to control our, deceivingly harmless, aquarium fish. But then that's another story now isn't it. Moving on, this substance holds our world and all other worlds together. It makes us sad and happy and hungry for humping. [next page] Sometimes this goo collects between two people ... [next page] but love has nothing to do with goo. Instead love is governed by a complex system of ropes and wires haphazardly connected to cattle in the Texas panhandle. [next page] Lucky for us it seem the cattle haven't yet discovered their power over love.26

Although the text presents itself as narrating a linear causal chain, the connections it posits are preposterous, from sunbursts to aquarium fish to love controlled by wires and ropes running through Texas cattle. Recall the interactive animation from which this text was pulled; with its swirl of many different shapes receding into the distance, it suggests a large matrix of reading trajectories, which I have elsewhere called a possibility space.27 The narratives make no sense qua narratives because they function as if they were constructed by making random cuts through the possibility space and jamming together the diverse elements, resulting in texts that present themselves as sequential stories but are socially illegible as such. This does not mean that the narratives (or, better, pseudonarratives) fail to signify. They do so, powerfully, testifying to a cognisphere too dense, too multiply interconnected, too packed with data flows to be adequately represented in narrative form.
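
The "random cuts" that this reading attributes to the work can themselves be mimicked in a few lines of Python; the fragment pools below are invented stand-ins, echoing the quoted journal entries, for the much larger possibility space:

    import random

    # Make "random cuts" through a small possibility space: each sentence is
    # grammatical, but its parts are sampled independently, so the sequence
    # reads as a story without cohering as one.
    subjects = ["the sunlight", "a grocery coupon", "the cattle", "this goo"]
    verbs = ["controls", "explodes under", "collects between", "holds together"]
    objects = ["the aquarium fish", "the shopper's hat", "two people", "all other worlds"]

    pseudonarrative = ". ".join(
        " ".join([random.choice(subjects), random.choice(verbs), random.choice(objects)])
        for _ in range(4)
    )
    print(pseudonarrative.capitalize() + ".")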

This intricately coded work, with its interactive animations, accompanying sound files, and complex screen designs, testifies through its very existence to the extent to which code has become indispensable for linguistic expression. If, as noted earlier in the discussion of Pattern Recognition, database is displacing narrative as the dominant cultural form of our computationally intensive culture, here we see that process represented as an infection of narrative by data. Generating the linguistic surface, the code infects that surface with its own viral aesthetics. The symptomatic monologic dreams indicate that the unconscious has been colonized by the Dreamaphage virus, a screenic word generated by the underlying code (as are all the screen images). Readers of Neal Stephenson's Snow Crash will recognize in the Dreamaphage virus a remediation of the idea that computer viruses can be transmitted to humans and make them behave as if they were computers, here specifically by making them execute an endless programming loop consisting of the dream.28 Since it is not clear in Nelson's text how the virus is transmitted, we may suspect that viewing the screens of computers infected with the virus is a disease vector for human transmission (as in Snow Crash). In this case, the word that appears on the screen, Dreamaphage, at once names the phenomenon and spreads the infection, an implosion of signifier into signified that is possible because code is the underlying causative agent for both the screenic word and the disease it signifies. In a certain sense, then, the disease consists of nothing other (or less) than collapsing the distinction between artificial and human cognitions and a consequent conflation of computer code and human-infectious virus. The code-virus preempts the normal processes that produce dreams and installs itself in their place, creating visions of the cognisphere, its native habitat, that appear nonsensical when forced into the linear sequences of human-only language. It is not the virus that is diseased, however, but the human agents who cannot grasp the workings of the cognisphere except through stories no longer adequate to articulate its immense complexity. The individual patients may die, but the cognisphere continues to expand, occupying more and more of the terrain that the unconscious used to claim. That at least is the story Dreamaphage enacts, a bittersweet narrative that exults in the power of code to create digital art even as it also wonders if that power has exceeded the capacity of humans to understand — and by implication, control — the parasitical ability of machine cognition not merely to penetrate but to usurp human cognition.

Code/Coda

Although previous arguments have established that code is available as a resource to connect with trauma, they do not fully explain why, as our culture races over the millennium mark, this resource should be taken up by contemporary cultural productions. To explore that question, I want to reference a moment in Joseph Weizenbaum's Computer Power and Human Reason: From Judgment to Calculation, when his secretary becomes so engaged with the ELIZA computer program mimicking a psychoanalyst's routine that she asks him to leave the room so she can converse with the machine in private.29 The moment is all the more extraordinary because, as he notes, she is fully aware of how the program works and so is not deceived by the illusion that the machine in any way understands her problem.30 Shocked by the intensity of her engagement, Weizenbaum feels compelled to issue a stern warning about the limits of computer intelligence. Humans must not, he argues, think that computers can make ethical, moral, or political judgments — or indeed engage in any judgment at all. Judgment, in his view, requires understanding, and that is a faculty only humans possess.
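
The routine in question can be conveyed by a toy keyword-reflection script in Python, written here for illustration in the spirit of the example given in note 30; it is a reconstruction, not Weizenbaum's original code:

    import re

    # Toy keyword reflection in the spirit of ELIZA: echo key words back as
    # prompts. A reconstruction for illustration, not Weizenbaum's code.
    RULES = [
        (r"\bmy (mother|father|family)\b", "Tell me about your {0}."),
        (r"\bi feel (\w+)\b", "Why do you feel {0}?"),
        (r"\bi am (\w+)\b", "How long have you been {0}?"),
    ]

    def respond(utterance):
        text = utterance.lower()
        for pattern, template in RULES:
            match = re.search(pattern, text)
            if match:
                return template.format(*match.groups())
        return "Please go on."    # the endlessly patient, nonjudgmental default

    print(respond("I saw my father yesterday"))   # -> Tell me about your father.

The default response is what makes the program endlessly patient; everything else is pattern matching, with no understanding anywhere in the loop.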

I propose to revisit the scene with the secretary and ask again why she was so intensely engaged with what she knew was a dumb program. Let us suppose she was suffering from a traumatic experience and was using the computer to explore the significance of that experience for her life. What qualities does the computer have that would make it the ideal interlocutor in this situation? It does not feel emotion and so cannot be shocked or repulsed by anything she might reveal; it does not betray anyone (unless programmed to do so) and so can be assumed to function in a perfectly logical and trustworthy manner; and — precisely the point that so bothered Weizenbaum — it does not judge because it lacks the rich context of the human lifeworld that would make it capable of judgment. In brief, it possesses the kind of cognitive state that psychoanalysts train for years to achieve.

After four decades of research, development, and innovation in information technology, computers are becoming more humanlike in their behaviors. Research programs are underway to give computers "emotions" (although as software programs they remain very different from human emotions mediated by the endocrine system and complex cortical feedback loops). Object-oriented languages such as C++ are designed to mimic in their structure and syntax human-only languages, making possible more intuitive communication between humans and computers. Neural nets, within the parameters of their feedback information, can learn to make a wide variety of distinctions. Genetic programs use diversity and selection to create new emergent properties, demonstrating that computers can achieve human-competitive results in such creative endeavors as electronic circuit design.31 In addition, more and more code is written by software programs rather than humans, from commercial software like Dreamweaver that generates HTML code to more sophisticated programs designed to bootstrap computer-written software through successive generations of code, with each program more complex than its predecessor.
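
The claim that a neural net can learn distinctions within the parameters of its feedback can be seen at toy scale in the single-neuron perceptron below, a textbook exercise added here for illustration and not tied to any system discussed in the article; it learns the logical AND function from labeled feedback:

    # A single-neuron perceptron learning logical AND from feedback.
    samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    weights, bias, rate = [0.0, 0.0], 0.0, 0.1

    for _ in range(20):                        # repeated passes over the data
        for (x1, x2), target in samples:
            output = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
            error = target - output            # the feedback signal
            weights[0] += rate * error * x1
            weights[1] += rate * error * x2
            bias += rate * error

    print([1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
           for (x1, x2), _ in samples])        # -> [0, 0, 0, 1]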

The present moment is characterized, then, by a deep ambivalence in the roles that computers are perceived to play. In certain ways they remain like the relatively primitive machine on which Weizenbaum created the ELIZA program — unendingly patient, emotionless, and nonjudgmental. In this guise they are seen as interacting positively with humans to provide transmission pathways for the articulation of trauma. In other ways, however, they are taking over from humans more of the cognitive load, a maneuver widely perceived as an implicit threat to human autonomy and agency. The double speaking that characterizes my three tutor texts — in Pattern Recognition the map/trigger, in Avalon the bleeding/pixelated body, and in Dreamaphage the code-virus — signifies more than the double addressing of code to humans and intelligent machines. Rather, it interrogates the ambivalence inherent in the double role that the computer plays, as the perfect interlocutor and as the powerful machine that can not only penetrate but actually generate our reality.

Increasingly computers are seen as evolutionary successors to humans that are competing for the same ecological niche humans have occupied so successfully for the past three million or so years. The evolutionary progression that gave humans the decisive advantage over other species — the development of language, the coordination of larger social groups and networks that language made possible, and the rapid development of technologies to make the environment more friendly to the species — is now happening with intelligent machines, as computers have ever more memory storage and processing speed, as they are networked across the globe, and as they move out of the box and into the environment through interfaces with embedded sensors and actuators dispersed across the world.

The issues at stake, then, go well beyond linguistic address (although this is, I would argue, the fundamental characteristic from which other behaviors evolve, just as language was the fundamental development that initiated the rapid development of the human species). As the technological nonconscious expands, the sedimented routines and habits joining human behavior to the technological infrastructure continue to operate mostly outside the realm of human awareness, coming into focus as objects of conscious attention only at moments of rupture, breakdown, and modifications and extensions of the system. Trauma, the site in these fictions through which the ambivalent relations of humans to intelligent machines are explored with special intensity, serves as the archetypal moment of breakdown that brings into view the extent to which our present and future are entwined with intelligent machines. No longer natural, human-only language increasingly finds itself in a position analogous to the conscious mind that, faced with disturbing dreams, is forced to acknowledge it is not the whole of mind. Code, performing as the interface between humans and programmable media, functions in the contemporary cultural Imaginary as the shadowy double of the human-only language inflected and infected by its hidden presence.


1. The immense difficulty of reverse engineering object code was the key factor in the Y2K crisis. Although the feared catastrophic failure did not materialize, attempts to correct the problem vividly demonstrated code's opacity.
2. Robert Bach, vice president of Microsoft's Marketing Desktop Application division, reports that the company employed 750 people, working full-time for two years, to bring Office 97 to market; see "Office 97 Q and A with Robbie Bach," Go Inside, http://goinside.com/97/1/o97qa.html. Assuming forty-hour weeks and fifty weeks per year, that amounts to three million person-hours. To put this number in context, the average person puts in 80,000 person-hours at work during a lifetime. Of course, my argument is concerned with the amount of time necessary to understand the code, whereas the above figures indicate the time required to create and test the Microsoft product. Nevertheless, the comparison gives an idea of why no one person can comprehend a complex large program in its totality.
3. See Adrian Mackenzie, Cutting Code: Software and Sociality (New York, 2006).
4. See Nigel Thrift, "Remembering the Technological Unconscious by Foregrounding Knowledges of Position," Environment and Planning D: Society and Space 22, no. 1 (2004): 175–90.
5. See Edwin Hutchins, Cognition in the Wild (Cambridge, Mass., 1996).
6. See Andy Clark, Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence (Oxford, 2003).
7. See Antonio Damasio, Descartes' Error: Emotion, Reason, and the Human Brain (New York, 2005).
8. See Thomas Whalen, "Data Navigation, Architectures of Knowledge," www.banffcentre.ca/bnmi/transcripts/living_architectures_thomas_whalen.pdf.
9. See Bessel van der Kolk and Onno van der Hart, "The Intrusive Past: The Flexibility of Memory and the Engraving of Trauma," in Trauma: Explorations in Memory, ed. Cathy Caruth (Baltimore, 1995), pp. 158–82.
10. Dominick LaCapra, Writing History, Writing Trauma (Baltimore, 2000), p. 57.
11. See, for example, the study conducted at the Human Interface Technology Laboratory at the University of Washington, Seattle, "VR Therapy for Spider Phobia," http://www.hitl.washington.edu/projects/exposure/. For a comprehensive list of publications on the subject, see the Delft University of Technology and the University of Amsterdam website on collaborative research at a number of universities, especially Charles van der Mast, "Virtual Reality and Phobias," http://graphics.tudelft.nl/~vrphobia/.
12. See William Gibson, Pattern Recognition (New York, 2003); hereafter abbreviated PR.
13. This definition and insight are offered in W. J. T. Mitchell, "Ekphrasis and the Other," Picture Theory: Essays on Verbal and Visual Representation (Chicago, 1994), p. 152.
14. See Avalon, DVD, dir. Mamoru Oshii (Miramax, 2001).
15. See Jason Nelson, Dreamaphage, http://www.heliozoa.com/dreamaphage/opening.html.

...

30. The ELIZA program was designed to prompt its human interlocutor by picking up and repeating key phrases and words as questions or comments. For example, if the human mentioned, "I saw my father yesterday," the computer would respond, "Tell me about your father." See Weizenbaum, "ELIZA—A Computer Program for the Study of Natural Language Communication between Man and Machine," Communications of the Association for Computing Machinery 9 (Jan. 1966): 35–36.
31. See John Koza et al., Genetic Programming III: Darwinian Invention and Problem Solving (San Francisco, 1999).