
Skepticism and the Singularity

The Good Word

Karen Stollznow

August 20, 2010

A Brave New World?

"Do you want to attend a conference about artificial intelligence?" I was asked by Sean McCabe, former intern to James Randi.

"Sure," I responded, not knowing that he was referring to the Singularity Summit, the annual conference held by the Singularity Institute. This is artificial intelligence (AI) with a philosophy behind it.

The Technological Tipping Point

The futurist concept of "the singularity" is a far cry from Aldous Huxley's early visions of subliminal learning and reproductive technology in Brave New World. "Technological singularity" is the theory, prediction, and objective that artificial intelligence will soon surpass human intelligence.

A range of ideas and ideologies underpin the singularity. Back in 1965, I.J. Good envisioned an "intelligence explosion," leading to the invention of "ultraintelligent machines."1 In 1993 Vernor Vinge postulated that "superintelligence" will render us the self-executioners of humankind, creating our own obsolescence and ending "the human era."2 Ray Kurzweil forecasts that science will soon emulate then exceed the capabilities of the human brain, citing the exponential growth trend in technological development as evidence for his theories.3

These beliefs intersect with transhumanism, which aims to improve human characteristics and capabilities using science and technology while considering the resulting ethical issues. There are humanitarian ambitions to overcome poverty and disease, extend human longevity, and address problems of dwindling resources. Then there are bold claims that we will ultimately reverse engineer the human brain, replicate consciousness, and graduate from programming computers to programming people.

A Computational Critical Mass

Skeptics are often skeptical of the singularity.

Some skeptics see singularitarians as optimists at best or extremists at worst. Some think that the proponents posit unrealistic timelines and chase the unrealistic goal of a scientific utopia. Others see singularitarians as evil doctors playing at eugenics in the transhumanist name of "human enhancement." Some see the singularity as science fiction, with its dreams of immortality and the ability to upload and download the human brain.

Skeptics often draw parallels between singularitarians and cults or religious fervor. Kurzweil is called a guru or a mystic, and the title The Singularity is Near smacks a little of "the end of the world is nigh." P.Z. Myers describes him as "just another Deepak Chopra for the computer science cognoscenti."4

More than once that weekend I heard the singularity referred to as "the rapture for nerds." Conversely, skeptics are accused of being closed-minded, technology-resistant neo-Luddites. However, there was no brainwashing or propaganda at the Singularity Summit. There was no hard-selling or recruitment. There was no introduction to the theory, let alone indoctrination. Moreover, there was no single opinion as to how, when, where, or if the singularity would occur. There is no unified theory.

Also, there is no agreed-upon timeframe for the realization of these achievements. There are as many projected dates for "the coming of the singularity" as there are for Armageddon. Singularitarians speculate that it will occur by 2020, 2030, or 2045. Michael Shermer has quipped that advances in AI are always ten years away from success.

The Singularity is Not-So-Near

The summit was a conglomeration of scholars, students, science enthusiasts, and technophiles with diverse backgrounds, all bound by an interest in emerging technologies.

The conference was interdisciplinary, although most of the presenters treated their areas of expertise without necessarily relating them to the singularity. They were not invariably advocates: biophysicist Gregory Stock and neurobiologist Dennis Bray spoke pessimistically about the theories with reference to their own fields. It became apparent to me that the singularity is mostly conjecture, and that discussions raise more questions than they answer.

However, the summit was valuable for knowledge-sharing, networking, and introducing the latest advances in science. The seminars covered a wide range of themes central to the singularity, including neuroscience, evolutionary psychology, energy, robotics, machine learning, nanotechnology, and biotechnology. Some academics presented their research into life extension, which seems more realistic than notions of ending "involuntary death." Ben Goertzel examined the genomes of fruit flies and the "superflies" he has bred for longevity; Ellen Heber-Katz, who works in mammalian regeneration, discussed the implications of Goertzel's work for humans.

There were many novel talks, including one by Steve Mann, the "world's first cyborg." Interested in wearable computers, Mann joked "eye am a camera" while wearing a live-streaming camera over one eye. Due to an influx of viewers, his retina is now password protected. He also helped invent the hydraulophone, a musical instrument that produces sound from water vibrations.

Mandayam Srinivasan spoke about his research in the field of haptics and the Tadoma method, which allows deaf-blind people to perceive speech through touch. David Hanson is a roboticist who designs and develops robots with human-like expressive capabilities. His was one of the few lectures at the summit to investigate social cognition, an area where psychology plays a secondary role to biology. As Kurzweil conceded, consciousness may be too subjective to measure, but it can't be overlooked if smarter-than-human intelligence is the ultimate ambition.

Skeptical During Singular Times

There was still plenty of critical thinking at the conference. The speakers treated the scientific method, the evolution vs. creationism debate, and the importance of Ockham's Razor. Rationalist issues are a vital part of transhumanism, after all, and one of the objectives of the Singularity Institute is to "encourage rational thought about our future as a species."5

There was also a large contingent of skeptics in attendance, including Skepchick blogger Sam Ogden, magician Andrew Mayne, the James Randi Educational Foundation's D.J. Grothe, and James Randi himself, the plenary speaker of the summit.

At 6:15 pm on Sunday, a time when most conference crowds have thinned and the stragglers are falling asleep, the room was filled to capacity with people waiting to hear from Randi, who called himself the "curmudgeon of the conference" and looked a little like the author of On the Origin of Species. "The Charles Darwin look is good for me, don't you agree?" Randi asked. When the audience responded with applause, he added, "Charles Darwin is dead; remember that."

Randi remarked that we all make assumptions. For example, the audience assumed that he was talking into a hand-held microphone, until he switched on the beard trimmer he was using as a prop. We assumed that he was wearing spectacles, too, until he revealed that the frames held no lenses. His point was that assumptions are natural and necessary, but sometimes they can lead our thinking in the wrong direction.

Randi recounted a story about one of the times a caller announced that they had "won the Million Dollar Challenge." This particular call came from the Lawrence Livermore National Laboratory in California. The claim was that someone there had defied the laws of physics with a mysterious matchstick that could stand on its own. Within minutes, Randi's assistant faxed the caller a page from a magic book explaining this common trick.

Randi's lesson for the audience was that even experts can be deceived. He cited another instance, when a physicist accompanied him to a magic show. Amazed by a feat of levitation, the scientist marveled that the magician "must have used superconductivity!" Then he proceeded to provide a convoluted explanation for how this could be achieved. Randi replied simply, "No; it's an illusion!"

We can be fooled if we see only through the eyes of our own expertise. As Randi put it, "Sometimes you don't use your brain because you're too busy using your education." To demonstrate that the simplest explanation is often the correct one, Randi showed some classic footage of his Peter Popoff exposé on the Johnny Carson show and a performance of psychic surgery. He mused that as long as we still believe in magic, we can be fooled.

As an inventor himself, Randi encouraged the spreading of innovative ideas tempered with a healthy skepticism. Disappointingly, some attendees didn't grasp the relevance of a magician speaking at a conference about the singularity. However, skepticism is critical to a movement that is essentially about prediction.

References

1. Good, I.J. 1965. "Speculations Concerning the First Ultraintelligent Machine." Advances in Computers, vol. 6.

2. Vinge, Vernor. 1993. "The Coming Technological Singularity." Available at http://accelerating.org/articles/comingtechsingularity.html. Accessed 08/18/2010.

3. Kurzweil, Ray. 2005. The Singularity is Near: When Humans Transcend Biology. Penguin Group.

4. Myers, P.Z. 2010. "Ray Kurzweil Does Not Understand the Brain." Pharyngula. Available at http://scienceblogs.com/pharyngula/2010/08/ray_kurzweil_does_not_understa.php.

5. Singularity Institute for Artificial Intelligence. Available at http://singinst.org/aboutus/ourmission. Accessed 08/18/2010.

Karen Stollznow


Karen Stollznow is an author and skeptical investigator with a doctorate in linguistics and a background in history and anthropology. She is an associate researcher at the University of California, Berkeley, and a director of the San Francisco Bay Area Skeptics. A prolific skeptical writer for many sites and publications, she is the “Good Word” Web columnist for the Committee for Skeptical Inquiry, the “Bad Language” columnist for Skeptic magazine, a frequent contributor to Skeptical Inquirer, and managing editor of CSI’s Scientific Review of Mental Health Practice. Dr. Stollznow is a host of the Monster Talk podcast and writer for the Skepbitch and Skepchick blogs, as well as for the James Randi Educational Foundation’s Swift. She can be reached via email at kstollznow[at]centerforinquiry.net.