The title of this essay derives from the words uttered by the protagonist, Henry Case, in William Gibson’s novel ‘Neuromancer’ (1984). This metaphor, which equates the human body with mere data made flesh, encompasses the theory that, in an age of increasing focus on information technologies and the ways in which people interface with them, the boundary distinguishing an individual from their surroundings becomes blurred, if not shattered entirely. As the University of Chicago’s William Fulton attests, we exist simply as information systems that happen to inhabit the material instantiation of our bodies (Theories of Media, 2007).
However, one cannot enter a discussion of this topic without first alluding to the particular school of thought that harbours critical theory of this ilk. Exponents of this style of critique usually come under the banner of ‘Posthumanism’. The term itself is steeped in hyperbole, in that it carries an ominous sense of foreboding in contemporary culture, where there is a strong case for the premise that society is becoming less ‘human’ as we retreat behind the veil of technology.
To draw upon the direct translation of ‘post’ as ‘after’ would yield the meaning ‘after-human’, which in some respects gives us a clearer understanding of the concept, as it tends to deal with the modern integration of technology with biology and the human body. At its simplest, it is the notion that we as humans have become so embedded in technology and the technological environment that the paradigm of the ‘natural’ human being has shifted in meaning.
Juxtaposed with this is the idea that as humans are increasingly subsumed in technology, technology is becoming more and more human through advances in science and Artificial Intelligence (AI). Fukuyama advanced the concept of Posthumanism as a negative case: that of ‘anti-humanism’, or the absence of humanism. He bemoans the transgression of “crucial moral boundaries” that has eroded the ethical distinction between therapy and technological enhancement (Our Posthuman Future, 2002).
Bostrom describes a posthuman as a being that has at least one posthuman capacity, i.e. a general central capacity greatly exceeding the maximum attainable by any current human being without recourse to new technological means (Why I Want to be a Posthuman When I Grow Up, in Gordijn & Chadwick (eds.), 2006). McLuhan did not use the exact word, but he predicted a future that dovetails succinctly with posthuman theories surrounding ‘cybernetics’ when he foretold a society whereby – to paraphrase Hayles’ theory of ‘reflexivity’ (1999) – that which has been used to generate a system is made to become part of the system it generates.
McLuhan alluded to the ‘cybernetic’ possibility of human beings interfacing and entangling with machines on a neurological and functional level. Just as binoculars are an extension of the eye and clothes are an extension of the skin, information technologies become McLuhan’s extension of the mind. Despite this early anticipation, the term ‘posthumanism’ has only really worked its way into contemporary critical discourse in the humanities and social sciences since the mid-1990s, over a decade after McLuhan’s death.
However, it may be traced back to the Macy conferences on cybernetics, held from 1946 to 1953, and the invention of systems theory (Wolfe, What is Posthumanism?, 2010). At these conferences, participants converged on a new theoretical model for biological, mechanical and communicational processes that removed the human, and Homo sapiens, from any particular privileged position in relation to matters of meaning, information and cognition. The term ‘cybernetics’ had been coined by Wiener in the 1940s to denote “the entire field of control and communication theory, whether in the machine or in the animal.”
Even at this early stage of technology, a definitive study was underway into the correlation of information between machines and living creatures. In the 1960s this theory was modified into the concept of ‘reflexivity’ alluded to above: systems re-entangle with themselves and become self-referential, blurring the traditionally accepted borders imposed on the world between subject and object, and object and environment (in other words, between the organic and natural on the one hand and the technological and cultural on the other), a principal tenet of modern posthumanist thought.
In current popular use, Fulton describes cybernetics as most often associated with the development of artificial intelligence, virtual technology and cyberspace. He attests that “cybernetics in the context of technology is only a limited part of a greater whole, which deals with the study of information systems and the media in which they exist, both inorganic and organic.” Further extension of the idea leads to a model of the world in which media serve as a series of “irrelevant substrates” through which pure information freely flows.
This situation ties in with many of McLuhan’s ideas about media in society, but most especially with his ideas about the extension of the mind, the extension of human beings’ central nervous systems into electromagnetic technology, a topic I will consider later in the essay. There is much critical opinion to support Gibson’s notion of humans as information systems, just like a machine or computer.
Clark (Natural-Born Cyborgs, 2003) believes that it is by virtue of our intrinsic ability to merge with external resources, even when performing the most rudimentary of calculations, that we are designed to walk hand in hand with technology in a posthuman future. He offers the example of how we utilise pen and paper to work out moderately complex mathematics, storing the intermediate results outside the brain and then repeating the pattern until the larger problem is solved.
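Clark’s pen-and-paper example can be made concrete. The sketch below is my own illustration, not Clark’s: it works a multi-digit multiplication the way a person with paper would, writing each partial product to an external “scratch” store rather than holding it in working memory, then combining the stored results at the end.

```python
# Toy illustration of externalised calculation: the partial products
# of a long multiplication are "written down" in an external list
# (the paper), not held in the head, and only combined at the end.

def long_multiply(a: int, b: int) -> int:
    scratch = []  # the "paper": intermediate results stored outside the loop
    for position, digit in enumerate(reversed(str(b))):
        partial = a * int(digit) * (10 ** position)
        scratch.append(partial)  # note the intermediate result down
    return sum(scratch)  # read the paper back and combine

print(long_multiply(47, 326))  # 15322, the same answer as 47 * 326
```

The brain here only ever handles one small step at a time; the “intelligence” of the overall procedure lies in the loop between the mind and its external medium, which is precisely Clark’s point.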
“It is because our brains, more than any other animal on the planet, are primed to seek and consummate such intimate relations with non-biological resources that we end up… capable of abstract thought. …we are natural-born cyborgs forever ready to merge our mental activities with operations of pen and paper and electronics…” (Clark, Natural-Born Cyborgs, 2003). Wiener asserted that we have to become technophiles to operate in a technological world (1954). He noted that because we have modified our environments so radically, it is now necessary to modify ourselves in order to exist in this new environment.
Furthermore, he equated the now routine breakdown and repair of the human body with the replacement of a faulty part in a machine. With contemporary advances in technology allowing us to alter or perfect many undesirable parts of our bodies with a specifically manufactured replacement, the questions arise: what does it mean to be human, and what does it mean to be a machine, in the 21st century? “If corneal implants are part of us, why not contact lenses? If contacts, why not eyeglasses? If eyeglasses, why not automated telescope?
If a telescope, why not the computer interfaced with it?” (Hayles, ‘Designs on the Body: Norbert Wiener, Cybernetics and the Play of Metaphor’, History of the Human Sciences, Vol. 3, No. 2). McLuhan’s adjunct to this concept is: if a telescope can be an extension of the eye, then information technologies can become extensions of the mind. He stated that after extending or translating “our central nervous system into the electromagnetic technology, it is but a further stage to transfer our consciousness to the computer world as well.”
The Internet clearly serves as the next step in this process of extension. By connecting all computers as part of a pervasive, global network of information, man is not only able to extend his nervous system to interface with technology, but is able to use that mediation to connect directly with the nervous systems of other human beings also tapped into the network. Present-day studies also show humans’ capacity to monitor their bodies in the same way that one might monitor a car for potential faults.
One report (The Quantified Self: Counting Every Moment, The Economist, 2012) attests that more people are using smartphone and tablet applications to monitor their health, in an effort to sustain a healthy lifestyle but also, in many instances, as a substitute for the much more expensive trip to the doctor. As populations age and health-care costs increase, there is likely to be a greater emphasis on monitoring, prevention and maintaining “wellness” in future, with patients taking a more active role – an approach sometimes called “Health 2.0” (The Quantified Self, 2012).
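The car-diagnostics analogy can be sketched in a few lines of code. What follows is a hypothetical illustration of the kind of rule a self-tracking application might apply; the metrics, baseline ranges and values are invented for the example, not taken from any real “Health 2.0” product.

```python
# Hypothetical self-monitoring check: daily readings are compared
# against personal baseline ranges, the way a car's diagnostics flag
# a component drifting out of tolerance. Metrics and ranges invented.

BASELINES = {
    "resting_heart_rate": (50, 80),  # beats per minute
    "hours_slept": (6, 9),           # hours per night
}

def check_readings(readings: dict) -> list:
    """Return a warning string for each reading outside its baseline range."""
    warnings = []
    for metric, value in readings.items():
        low, high = BASELINES[metric]
        if not (low <= value <= high):
            warnings.append(f"{metric} = {value} outside normal range {low}-{high}")
    return warnings

print(check_readings({"resting_heart_rate": 92, "hours_slept": 7.5}))
# flags only the elevated heart rate
```

The logic is deliberately trivial; the essay’s point is precisely that such threshold checks, long routine for machines, are now routinely run by people on their own bodies.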
Allied to this is the plethora of people undergoing voluntary surgical procedures to modify particular parts of their bodies, for any number of reasons, much as someone might change the wheels on their car or update a driver on their personal computer. The converse of the problem of humans becoming more like machines is the question of machines becoming more like humans.
If human identity has been reduced to an information system that happens to inhabit the body as its medium, what is to say that another information system, inhabiting a computer or the Internet, could not be perceived as equally “human”? In a recent article (Mind vs. Machine, The Atlantic, 2011), Brian Christian describes an annual contest between the world’s most advanced artificial-intelligence programs and ordinary people.
The contest, known as the Loebner Prize, is built around the Turing Test: it endeavours to find out whether a computer can act “more human” than a person, and Christian discovers that the march of technology is not just changing how we live; it is raising new questions about what it means to be human. He realises that convincing the judges that you are human “is about more than simply showing up [and being yourself]”. It is something that has to be “worked at.” This notion has a certain resonance for society as a whole.
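The shape of the contest Christian describes can be sketched schematically. Everything below is a toy of my own devising: the stock “bot”, the stand-in human and the naive judge are placeholders, not real contest entries, but the structure is that of the test itself: a judge converses blindly with two labelled respondents and must name the human.

```python
# Schematic Turing Test session: a judge questions respondents "A" and
# "B" (one human, one program, randomly assigned) and guesses which is
# human. All participants here are invented placeholders.

import random

def bot_reply(message: str) -> str:
    return "That's interesting. Tell me more."  # evasive stock response

def run_session(judge_guess_fn, human_reply_fn, questions) -> bool:
    """Return True if the judge correctly identifies the human."""
    respondents = {"A": human_reply_fn, "B": bot_reply}
    if random.random() < 0.5:  # hide which label holds the machine
        respondents = {"A": bot_reply, "B": human_reply_fn}
    transcript = {label: [fn(q) for q in questions]
                  for label, fn in respondents.items()}
    guess = judge_guess_fn(transcript)  # judge names the label they think is human
    actually_human = next(l for l, fn in respondents.items()
                          if fn is human_reply_fn)
    return guess == actually_human

questions = ["What made you laugh recently?", "Describe your morning."]
human = lambda q: f"Honestly? {q[::-1]}"  # stand-in human: varied replies
# A naive judge: pick the respondent whose answers actually vary.
judge = lambda transcript: max(transcript, key=lambda l: len(set(transcript[l])))
print(run_session(judge, human, questions))  # True: the bot's sameness gives it away
```

Christian’s observation that being judged human has to be “worked at” corresponds here to the fact that the human wins only by producing responses a canned program would not.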
With this in mind, I recall a report (Makwana & Irwin-Brown, We’re the Kids in Austerity, 2012) I came across some months ago, which stated that 57% of 7-15 year olds in the UK find it easier to talk with friends online than in person, and 56% find it easier to talk by SMS than in person. These figures represented for me a sea change in the emphasis on ‘natural’ human interaction, and in what it means to be a ‘natural’ human in today’s society. Perhaps we are witnessing a new beginning for society – one where children feel more comfortable interacting with technology than they do with their fellow human beings.
To understand why our human sense of self is inextricably linked with computers, it is important to realise that computers used to be human. From the mid-18th century onward, computers, many of them women, were on the payrolls of corporations, engineering firms and universities, performing calculations and numerical analysis. In the mid-20th century, as the ‘digital computer’ developed, the machine was said to be “like a computer”, that is, like the human professionals who bore the title. In the 21st century, it is the human mathematical whiz who is “like a computer” (Christian, Mind vs. Machine, The Atlantic, 2011).
In a strange but significant turn of events, humans are said to be “like” something that used to be “like” us. By this reasoning, one could assume that the modern-day computer is so called because it is intended to carry out any operation that could be done by a human computer. During the same period that gave rise to the human computer, there was also much debate among philosophers about what it was to be human. The French philosopher Julien Offray de La Mettrie (1747) suggested that human beings are only complex animal-machines.
This suggestion was no doubt inspired by Descartes’ assertion in the 17th century that the body was essentially like a machine, the only thing not reducible to mechanism being the human mind. Furthermore, the notion of man as a machine, or machine-like, resonated during the Industrial Revolution of the 18th and 19th centuries. Ferguson describes the perception of the plight of the factory worker in these times: “Many mechanical acts require no intellectual capacity.
They succeed best under a total suppression of sentiment and reason… Manufacturers, accordingly, prosper most where the mind is least consulted, and where the workshop may, without any great effort of the imagination, be considered an engine, the parts of which are men.” (Ferguson, An Essay on the History of Civil Society, 1767). The question of human identity being reduced to an information system that happens to inhabit the body as its medium has been a driving force for the study of artificial intelligence, and has manifested repeatedly. As Christian puts it, some people imagine the future of artificial intelligence as a kind of heaven:
“Rallying behind an idea called “The Singularity,” people like Ray Kurzweil (in The Singularity Is Near) and his cohort of believers envision a moment when we make smarter-than-us machines, which make machines smarter than themselves, and so on, and the whole thing accelerates exponentially toward a massive ultra-intelligence that we can barely fathom.” Such a time will arrive in which humans can upload their consciousness onto the Internet and get assumed, if not bodily then at least mentally, into an eternal, imperishable afterlife in the world of electricity (Christian, Mind vs. Machine, The Atlantic, 2011).
Others imagine the future of computing as a kind of hell, an almost Terminator-style apocalypse in which machines black out the sun, level our cities and create an atmosphere that destroys all living things. There is no doubt that technology has become an integral part of human lives and will only become more so. We have already taken the first step into the realm of the posthuman, or the cyborg: common examples include the athlete Oscar Pistorius, who runs on prosthetic blades, anyone who has undergone a sex change, or anyone who has modified their body with artificial implants for cosmetic reasons.
I do not believe there is any going back, but I feel technology and humans certainly have the capacity to complement each other and work side by side as we look to the future. It remains to be seen whether this synergy will come to pass, but the capacity for it certainly exists. And as the human race faces up to some of the toughest questions yet put to us, I would cautiously back us to prevail. To paraphrase Wiener: humans can continue to modify themselves to keep up with the modifications of the environment in which they find themselves.