Christopher Carlson
Humanity’s Extinction: The Loss of the Human Form for the Artificial One
The idea of what makes mankind human has always been a topic of
interest for me. Some of my favorite movies, TV shows, and books take up this
question, asking when one counts as human versus artificial. I plan to examine
how science fiction and speculative fiction take this idea further by probing
the very nature of cybernetic augmentation: at what point does one lose their
sense of humanity, and, alternatively, when does artificial intelligence attain
that sense of humanity? I will also present the testimony of experts on this
topic to help explain it.
Humanity is an exceedingly difficult word to define. When I think
about what it means to be human, I often imagine a regular person with morals
and feelings like myself. This definition, however, has become more complex. As
AI reaches a point where it can mimic our actions and perform tasks that
indicate an intelligence even higher than our own, how different can a human
mind and an AI really be? Sipper and Moore provide an example from the medical
profession when they state, “It is not
hard to imagine an advanced med-coach, where the patient relaxes comfortably
while Artificial Intelligence (AI) algorithms diagnose the irksome hairline
fracture, whereupon a 3D printer produces a cast, which a robotic arm proceeds
to place posthaste. And all this done sans human.” (1). If an AI with systems
advanced enough to simulate a human can take on the role of diagnosing patients
and applying the proper procedures, as a doctor would, is it really any
different from a mind and soul without a body? Who is to say that all these
actions could not be done by one machine with the ability to think and feel as
a human can?
Ghost in the Shell shows
another related idea with the character of the Puppet Master. The main character
of the film and manga is entirely cybernetic except for her human brain.
However, the Puppet Master, the main villain, permanently gains access to her
brain by hacking into it and lives alongside her as a separate consciousness,
much like a patient with dissociative identity disorder. When one loses every
aspect of the human body except the brain, and then has that brain hacked,
would not both presences, the human mind and the artificial one, be seen as
equally human? Huang and Rust describe a similar notion in their work on AI
replacing human jobs, stating, “Eventually, AI will be capable of performing even
the intuitive and empathetic tasks, which enable innovative ways of human
machine integration for providing service but also results in a fundamental
threat of human employment.” (1). Part of our day-to-day lives is the ability to
perform tasks that only certain people are capable of doing. Whenever the idea
of automation comes up, the claim is that a machine is better suited to take on
a human task, such as building or diagnosing. If AI becomes advanced enough to
perform every task a normal human would perform, what really sets it apart from
us? One might argue that a machine that has taken on a human position is, in
terms of performing human labor, more human than we are. While this claim does
not show that the AI has human feelings or a soul, it is not too far off.
Blade Runner 2049 examines
this very idea. Joi, a character within the film, is an advanced holographic AI
who takes the place of a physical romantic partner for the main character of
the movie. She is able to experience complex human
emotions like grief, happiness, longing, and love, despite the fact that she
lacks a human body and the ability to move independently of the flash drive she
inhabits. Like AI replacing human jobs, this film presents the idea of AI
replacing human relations. When one does not have to undertake the complex
journey of finding a compatible romantic partner, but can instead have an
advanced AI programmed that is capable of feeling and experiencing nearly all
things a normal human can, except physical intimacy, what really makes the
artificial partner different from a human partner? Oskar Gruenwald still sees a
great many problems here when he asks, “Are we, then, at the
threshold of exchanging the complexity of genuine relationships with other human
beings for the artificial relationships with machines that can be programmed to
feign human emotions?” (7). By using words like “genuine,” one can see that he
feels AI will be unable to surpass the emotional complexity of a human being,
due to its limited understanding of complex situations.
A less advanced, emotion-feeling AI also appears in the work Somebody Up
There Likes Me. Ralph Lombreglia writes, “A high-pitched squeal was emerging
from the thing. An oscilloscope portrayed the computer’s demise in ghostly green
wiggles-lots of waves, lines with some waves, nothing but lines…‘He’s an abuse
tester,’ I whispered to Boyce. ‘You didn’t tell me that. He kills computers for
a living.’” (228). The verbiage the author uses here implies that the
machine is more human than current computers. His use of words like ‘squeal’,
‘demise’, and ‘abuse’ all points to the idea that the AI is advanced
enough to feel pain, suffer, and die, as if it were a living organism. The way
the main character reacts to the destruction of the computer implies that he
sees it as being alive and Mickey as a murderer. If AI can become
so human-like that we question its artificiality, can the opposite of this take
place? Can a human lose nearly all aspects of their humanity to the point where
they are now artificial?
This very idea of eventually losing one’s humanity to the machine usually
arises in discussions of prosthesis and bodily augmentation, a topic the author
Stefan Greiner takes up in his work. Greiner states
that, “This advance in the history of prosthetics is important to note because
it marks the moment when the inside/outside view of the human body began to
blur. Myoelectric and, nowadays, nanotechnology-based neuroelectronic interfaces
are basically following in this tradition and make the inappropriateness of
distinguishing a biological body from its technical extensions even more
obvious.” (300-301). Here, Greiner is claiming that we are becoming more and
more cybernetic in nature and that this makes us look inward and outward at
ourselves and question how human we really are. By using a word like “blur,” he
implies that it is difficult to tell whether one is still human or machine once
this augmentation enters the mix.
Stone Lives examines
a similar idea through its protagonist’s prosthesis. Stone’s new eyes give him
the ability to see the world around him, as well as non-human ways of seeing.
He mentions how his augment has cost him the heightened hearing and smell he
cultivated while living in the poorer region. Since those senses were part of
who he was, does losing them to the augment imply that he has lost some of his
humanity as well? As Greiner suggested, can we really look at ourselves as still being
human when we see artificial augments on our bodies? We may still possess human
emotions and tendencies, but do we not also lose pieces of ourselves that make
us less of who we once were? Ghost in the Shell examines this idea
further.
As I mentioned in an earlier paragraph, Major Motoko, the main character, is
entirely cybernetic except for her very human brain. She works as a crime
fighter for the Japanese government. Her coworkers are also augmented to varying
degrees: one is as augmented as she is, while another is entirely human. These
varying degrees of augmentation let the viewer consider at what point the human
has become the machine. Another aspect that makes the human more like the
artificial is her ability to travel along the net, as an AI does when searching
the web. In her essay about the film, however, Hyewon Shin claims that the
Major’s humanity is still retained: “The voice, a
material sign of identity and human property, is an ambiguous vestige of
disembodied presence: the ghost. It is a simulacrum, an echo, an effect of life
in Ghost in the Shell. The ‘bodiless’ voice throughout the net seems to
imitate the theological self-presence of a spirit.” (10). Shin seems to be
stating that since the soul is able to travel along the net, the human still
exists even when it is not connected to a body. Described this way, the soul
almost seems to be like artificial intelligence: even as the traveling soul
proves continued human existence, it travels along the same artificial medium
that AI uses. In this way, her idea seems to say that the life is still there,
but not as it used to be. This talk sounds almost evolutionary in nature.
Perhaps authors are claiming that instead of us becoming artificial, the
artificial will help shape what we become in terms of our evolution. This is
called the singularity, the point when AI and human merge into one being, and
many thinkers have insight into this idea.
Authors Wang, Liu, and Dougherty doubt that such an AI will ever achieve a
status above that of humans. They state, “Since
we do not believe an R-Type system can exist, we do not think “singularity” (in
its original sense) can happen. However, we do believe AGI systems can be built
with meta-level capability comparable to that of a human mind (i.e., neither
higher nor lower, although not necessarily identical), and object-level
capability higher than that of a human mind.” (7). While this quote sounds
exceedingly complicated, these authors are essentially stating that, based on
the current systems in place, the kind of AI necessary for a singularity will
probably never arise, although AI could develop to a point relatively close to
it. AI might not be able to learn quite like a human, but the idea that it can
come even close is still relatively terrifying. This is especially true when
you consider what was deemed impossible in the past. Hundreds of years ago
people thought we would never leave our planet, and here we are in our current
era, having gone to the moon multiple times, with quite a few people also
having gone to the ISS. These authors may think it impossible, but the future
of computing is tough to predict. While they seem skeptical of an artificial
intelligence and human hybrid, the short story Drapes and Folds seems to take
a different stance on this issue.
Drapes and Folds features a character who
is a hybrid between human and robot. Unlike the characters discussed so far,
however, she was born/made as a human-robot mix rather than adapting into that
status. The story
describes these individuals as, “NewOnes, NewSociety citizens farmed after the
year 2025, were a ghastly mix of human and roboid.” (129). The process by
which these beings are created sounds exceedingly alien, as if they are grown
and harvested rather than bred like humans, yet Xera still exhibits
some human qualities. One such instance is when she says, “‘Gran,’ Xera
vevved. Oh, that word! The word I’d waited so long to hear!..as I moved to
embrace my dear girl, she rolled backwards and popped her arm right out of the
socket.” (138). Here, Xera utters a term of endearment that suggests a kind of
primitive human emotion of love. While the removal of her arm makes us again
question her status as a human, the author seems to want us to see how
difficult the distinction is to make. She is no different from her grandmother,
who has rollers put on to make it easier to get around (128). While this is
not a total cybernetic augmentation, the story shows the blurred line between
human and robot by making the childbearing process seem like a mixing of human
DNA and robotic implants, as well as by depicting humans augmenting their own
bodies with robotic parts. This is further seen in the story
Johnny Mnemonic.
In this story, an advanced world is
shown where cybernetic augmentation impacts a wide variety of people, and
animals, in a myriad of ways. Johnny acts as a kind of carrier: he stores
secrets for different individuals, and each secret is accessed when that person
says a code phrase, which causes Johnny to lose consciousness and forget that
he divulged the information. This can be seen when he says, “I
had hundreds of megabytes stashed in my head on an idiot/savant basis,
information I had no conscious access to. Ralfi
had left it there. He hadn’t, however, come back for it. Only Ralfi could
retrieve the data, with a code phrase of his own invention.” (1.5). Here, Gibson
describes Johnny’s memory much like a computer’s. One needs a password to gain
access to the data within a computer, and once the password is accepted, the
computer turns on. Ironically, when the password is said to Johnny, his human
side literally turns off, and a receptive AI of sorts seems to take over and
recite the information, which would explain Johnny’s lack of memory in this
story. In my opinion, Gibson seems to be saying that mankind can live alongside
the mechanical world, but that this closeness can be abused to bring harm upon
others. The tech in this story does not seem capable of consciousness itself;
it simply acts as a firewall for the human mind, which sounds like an
evolutionary idea. Since my discussion is leaning more toward the singularity,
Dr. Logan’s work will provide further background on this topic.
Dr. Logan appears to share the same
stance on a possible singularity as Dougherty, Liu, and Wang. He claims that, “A
computer through AGI can become a brain of sorts but not a mind because it does
not possess language and therefore cannot listen to its internal speech and
therefore cannot become conscious. A form of intelligence that is not conscious
of its mental processes is severely limited and therefore could never compete
with the human mind.” (4). When we think of the process of developing a
language, we as people had to work out the minute details and determine how all
the pieces would weave together to create the varied tapestry of our language.
AI currently does not have the means to create such a vast, intricate network
with which to commune with itself and other AIs. As such, Logan sees this as a
major blow against the notion that there will be a singularity. He also states
that because machines are currently unable to develop emotions, they will not
be able to solve complex issues and gain new knowledge from them; he claims
that many respected thinkers feel this is a vital piece of the puzzle (5).
Since Logan sees two strikes against the idea of a singularity, it is quite
possible that such a phase in our evolution may not occur or may take multiple
generations to come to fruition.
While I personally lean toward the belief that AI will become advanced enough
to comprehend all of our complex emotions and nuances, many of my experts seem
to feel that while AI will become ever more technologically advanced, it will
never be able to replace human interaction. Some of my other experts feel that
we as people are becoming more cybernetic and that this fusion with the machine
is a difficult idea to define. While some seem to feel that we are losing our
humanity, Shin makes it sound as though, as long as we have our souls, we will
always remain human no matter what container we are placed in.
Works Cited
Blade Runner 2049. Dir. Denis Villeneuve. Perf. Ryan Gosling, Ana de Armas, and
Harrison Ford. Alcon Entertainment, 2017.
Ghost in the Shell.
Dir. Mamoru Oshii. Perf. Atsuko Tanaka, Akio Ōtsuka, and Iemasa Kayumi. Bandai
Visual, 1995.
Greiner, Stefan. "Cyborg Bodies—Self-Reflections on Sensory Augmentations." Nanoethics,
vol. 8, no. 3, 2014, pp. 299-302.
Huang, Ming-Hui, and Roland T. Rust. "Artificial Intelligence in Service." Journal
of Service Research, vol. 21, no. 2, 2018, pp. 155-172.
Logan, Robert. "Can Computers Become Conscious, an Essential Condition for the
Singularity?" Information, vol. 8, no. 4, 2017, p. 161.
Lombreglia, Ralph. “Somebody Up There Likes Me.” Virtually Now, edited by
Jeanne Schinto, Persea Books, 1996, pp. 208-237.
Shin, Hyewon. "Voice and Vision in Oshii Mamoru's Ghost in the Shell: Beyond
Cartesian Optics." Animation: An Interdisciplinary Journal, vol. 6, no. 1,
2011, pp. 7-23.
Sipper, Moshe, and Jason H. Moore. "Artificial Intelligence: More Human with
Human." BioData Mining, vol. 10, no. 1, 2017, p. 34.
Wang, Pei, Kai Liu, and Quinn Dougherty. "Conceptions of Artificial Intelligence
and Singularity." Information, vol. 9, no. 4, 2018, p. 79.