in print from | Smithsonian
Reaching the singularity.

feat. Ray Kurzweil
August 29, 2022



— contents —

~ the story


publication: Smithsonian
supplement: Air + Space Magazine
story title: Reaching the singularity may be humanity’s greatest + last accomplishment
deck: Should we be searching for post-biological aliens?
date: March 2020

read | story

presented by

Smithsonian | home
tag line: Chronicling science, history, art, popular culture, and innovation.

the Smithsonian Institution | home ~ channel
tag line: The world’s largest museum, education, and research complex.


the STORY

An introduction.

In a new paper published in The International Journal of Astrobiology, Joseph Gale from The Hebrew University of Jerusalem and co-authors make the point that recent advances in artificial intelligence (AI) — particularly in pattern recognition and self-learning — will likely result in a paradigm shift in the search for extraterrestrial intelligent life.

While futurist Ray Kurzweil predicted 15 years ago that the singularity — the time when the abilities of a computer overtake the abilities of the human brain — would occur in about 2045, Gale and his co-authors believe this event may be much more imminent, especially with the advent of quantum computing. It’s already been four years since the program AlphaGo, fortified with neural networks and learning modes, defeated Lee Sedol, the Go world champion. The strategy game StarCraft II may be the next to have a machine as its reigning champion.

If we look at the calculating capacity of computers — and compare it to the number of neurons in the human brain — the singularity could be reached as soon as the early 2020s. However, a human brain is ‘wired’ differently than a computer, and that may be the reason why certain tasks that are simple for us are still quite challenging for today’s AI. Also, the size of the brain or the number of neurons doesn’t equate to intelligence. For example, whales and elephants have more than double the number of neurons in their brains, but are not more intelligent than humans.
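
To make that capacity comparison concrete, here is a rough back-of-envelope sketch (an illustration, not taken from the paper or the article), assuming Kurzweil-style figures of roughly 10^11 neurons, about 10^3 synapses per neuron, and on the order of 10^2 operations per synapse per second, set against an exascale supercomputer at about 10^18 operations per second.

# back-of-envelope comparison; all figures are assumed orders of magnitude
neurons = 1e11                 # approximate neurons in a human brain
synapses_per_neuron = 1e3      # rough average connections per neuron
ops_per_synapse_per_sec = 1e2  # rough update rate per synapse, per second

brain_ops_per_sec = neurons * synapses_per_neuron * ops_per_synapse_per_sec
exascale_flops = 1e18          # order of magnitude of a current exascale machine

print(f"brain estimate:   {brain_ops_per_sec:.0e} ops/sec")
print(f"exascale machine: {exascale_flops:.0e} ops/sec")
print(f"machine / brain ratio: {exascale_flops / brain_ops_per_sec:,.0f}x")

By raw operation counts the machine already comes out ahead, which is the sense in which the singularity “could be reached as soon as the early 2020s.” The caveat above about wiring and neuron counts is exactly why such counts are not the whole story.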

The authors don’t know when the singularity will come, but come it will. When this occurs, the end of the human race might very well be upon us, they say, citing a 2014 prediction by the late Stephen Hawking. According to Kurzweil, humans may then be fully replaced by AI, or by some hybrid of humans and machines.

What will this mean for astrobiology? Not much, if we’re searching only for microbial extraterrestrial life. But it might have a drastic impact on the search for extraterrestrial intelligent life (SETI). If other civilizations are similar to ours but older, we would expect that they have already moved beyond the singularity. So they wouldn’t necessarily be located on a planet in the so-called habitable zone. As the authors point out, such civilizations might prefer locations with little electronic noise in a dry and cold environment, perhaps in space, where they could use superconductivity for computing and quantum entanglement as a means of communication.

We are just beginning to understand quantum entanglement, and it is not yet clear whether it can be used to transfer information. If it can, however, that might explain the apparent lack of evidence for extraterrestrial intelligent civilizations. Why would they use “primitive” radio waves to send messages?

I also think it is still unclear whether there is something special about the human brain’s ability to process information that would keep AI from surpassing our abilities in all relevant areas, especially in achieving consciousness. Might there be something unique to biological brains, after millions and millions of years of evolution, that computers cannot achieve? If not, the authors are correct that reaching the singularity could be humanity’s greatest and last advance.

— end —