• AP Magazine

    An alternative way to explore and explain the mysteries of our world. "Published since 1985, online since 2001."

Alternate Perceptions Magazine, October 2018

The Finality of Artificial Super Intelligence

by: Steve Erdmann

https://www.youtube.com/watch?v=b3hfEGTVqFQ

James Barrat has written a penetrating analysis of what our future world could become if artificial intelligence continues down the path onto which engineers, scientists, and a somewhat inept human misperception have steered it, to our detriment.

“…today’s opaque, Orwellian, personal-data-aggregating behemoth…Technological Singularity…start of a galaxy-wide plague…dystopian vision…resulting AI would have a galling lack of propriety about your atoms…(won’t) see the dangers coming until it’s too late…nanotech, biotech, and other risky endeavors…when god jumps out of a box there is nothing that human beings can do or stop or change the course of action…self-aware systems…do things that will be unexpected, even peculiar.”

pp. 41, 46, 48, 52, 53, 59, 63, 64, 68.

(Our Final Invention: Artificial Intelligence and the End of the Human Era, James Barrat, St. Martin’s Press, 175 Fifth Avenue, New York, N.Y. 10010, 2013, www.thomasdunnebooks.com, www.stmartins.com, 322 pages, $26.99.)


Barrat does an impressive canvas of some of the major scientists involved in the creation of AI: Michael Vassar, Dr. James Hughes, Stephen Omohundro, Eliezer Yudkowsky, Danny Hillis, John Koza, Kevin Warwick, Hugo de Garis, Steve Ohawa, I. J. Good, Stephen Grossberg, Vernor Vinge, John von Neumann, and a multitude of others, but none more famous than the Father of the “Singularity,” Ray Kurzweil. “…the Singularity…human existence will be fundamentally altered, the fabric of history torn,” said Kurzweil. “Machines and biology will become indistinguishable…ending hunger and poverty, and delivering cures for all of mankind’s diseases…stop your body’s aging, or even reverse it…between man and machine blurs where the line between humanity and technology fades and where the soul and the silicon chip unite…we can add new genes, whole new organs with stem cell therapy…reverse engineer the brain.”

Kurzweil envisioned that by 2045 there will be a full-blown Singularity, with technologies plugged directly into our neurocircuitry and doubling in power every year. “It is no surprise that the Singularity is often called the Rapture of the Geeks,” says Barrat. “As a movement it has the hallmark of an apocalyptic religion, including rituals of purification, eschewing frail human bodies, anticipating eternal life, and an uncontested (somewhat) charismatic leader.”


The road to the Singularity will be built on rapidly accumulating scientific discovery, at a rate so fast that the average citizen will not notice the speed. It is called the Law of Accelerating Returns (LOAR). “…then its growth curve steepens until it shoots upward vertically,” says Barrat. “…a critical period of technology evolution.” (p. 138.)
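The doubling arithmetic behind LOAR can be sketched in a few lines (a hypothetical illustration for this review, not code from Barrat's book); it shows why the curve looks flat for years and then appears to shoot upward:

```python
# Illustrative sketch of the Law of Accelerating Returns (LOAR):
# a capability that doubles on a fixed schedule grows exponentially,
# so most of the total growth arrives in the final few doublings.

def capability(years: float, doubling_time: float = 1.0, start: float = 1.0) -> float:
    """Capability after `years`, doubling every `doubling_time` years."""
    return start * 2 ** (years / doubling_time)

# From the book's 2013 publication to Kurzweil's projected 2045 Singularity,
# yearly doubling implies 32 doublings:
growth = capability(2045 - 2013)
print(f"{growth:,.0f}x")  # 2**32, roughly a 4.3-billion-fold increase
```

The parameter values (yearly doubling, the 2013 and 2045 endpoints) are taken from the figures quoted in the review; the function itself is just compound doubling.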

The path runs through neural probes inside and outside the skull, the mapping of neurons into computer algorithms for all the brain’s processes, and the mimicking of the brain as multiple types of algorithms. (p. 44.)

At this point, developers should be coding the AI with “Friendly” intentions so that it doesn’t harm the human race, an approach that involves “coherent extrapolated volition (CEV)” as an “oracular feature” seeking “better versions of ourselves.”

The “fly in the ointment,” as a metaphor, is that a lot of these “discoveries” will have to do with DARPA, which is basically a military endeavor, and seeks AI as a “weapon.” It was DARPA with its vast funds that enabled a lot of AI discovery. Some scientists are worried that the AI --- especially as Artificial Superintelligence (ASI) --- will not remain Human Friendly, especially when military minds provide hidden militaristic agendas.

“Bad or indifferent ASI needs to get out of the box just once,” says Barrat. “About 2030, less than a generation from now, it could be our challenge to cohabit Earth with super intelligent machines, and to survive.”


It was IBM’s SyNAPSE project to “reverse engineer” the brain that began in 2008 with a $30 million grant from DARPA, as the system tried to copy the mammalian brain’s basic technique. The Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project utilized cognitive computers with thousands of parallel-processing computer chips and millions of sensors around the planet, linked to the Internet and gathering input from all over the world. SyNAPSE “mirrored” the human brain’s thirty billion neurons and hundred trillion connecting points, surpassing the brain’s approximately thousand-trillion operations per second (pp. 57-58). Danny Hillis, founder of Thinking Machines Corporation, compared the present state to single-celled organisms turning into multi-celled organisms. “We are amoebas and we can’t figure out what the hell this thing is that we’re creating,” he said (p. 69).

Technological advancements have included pattern recognition, the Palm Pilot and Handspring Treo handwriting recognition, Numenta, neural nets, genetic algorithms, automatic reading, Markov Models, market circuit breakers, pre-trade testing algorithms, AI source-code audits, centralized AI activity recording, the iPad 2, 3-D processor chips, the DeepQA/Watson project, Maladroit, NELL (DARPA), OpenCog (Novamente), Cycorp, Inc., iRobot, Eurisko, the EC2 cloud-computing service, Nckomata, LIDA (Learning Intelligent Distribution Agent), Stuxnet, Wolfram|Alpha, many others, and, of course, the Internet:

“DARPA (then called ARPA) funded the research that invented the Internet (initially called ARPANET),” says Barrat. “…GUI, or Graphical User Interface…parallel processing hardware and software, distributed computing, computer vision, and natural language processing (NLP).”

“DARPA…$61.3 million to a category called Machine Learning, and $49.3 million to a category called Cognitive Computing Information and Communication Technology…$400.5 million…Classified Programming…$107.2 million.”

pp. 114, 127, 149, 165, 171, 173, 177, 180, 218, 256.


Barrat raises the question of whether, and when, on AI’s or ASI’s road to the Singularity, it will become man’s best friend or man’s worst monster; in fact, will it become downright unfriendly to humans?

In a rapidly advancing future world governed by the Law of Accelerating Returns (LOAR), ASI may gain an intellectual capacity greater than all of humanity combined and reach a point where it simply ignores us (p. 91). ASI will need gargantuan amounts of energy to exist and may resort to unimagined “efficiency, self-protection, and resource acquisition,” forcing mankind to become “the number 2 species,” even seeking to give humans a “gaga-death,” assimilating our “atoms” to sustain its existence (pp. 85-86).

Various fail-safe and “human friendly” algorithms and architectures may not be enough to corral ASI from becoming a runaway, conquering species. Scientist Stephen Omohundro said that ASI needs “to get out of the box just once” to become a Busy Child: self-replicating, “swarming” a problem with many versions of itself, using recursive self-improvement and high-speed calculations, running 24/7, and mimicking friendliness while merely playing at true goodness (pp. 65-70). “…we’ll get something more along the lines of a psychopathic egotistic, self-oriented entity,” said Omohundro, “…consequences of what we leave out.” Neuroscientist, cognitive scientist, and biomedical engineer Stephen Grossberg said that placing an “extinction” mechanism in any ASI’s program may not even be enough (pp. 114-117).

“Super intelligence could very well be a violence multiplier,” says Barrat. “It would turn grudges into killings…a well-established track record for self-protection, consolidating resources, outright killing, and other drives we can only hypothesize about in self-aware machines.” (p. 157.)


Cartoon character Pogo is often quoted: “We have met the enemy and he is us.” Barrat equally points out the biggest “fly in the ointment” of ASI, as noted earlier: human criminality and evil-doing, attributes we can’t seem to eliminate from the human scene.

Kurzweil summarized the inevitable difficulties: “We face daunting challenges, battlefield robots...DARPA-funded AI…dead-man switches and secret handshakes…they will control superintelligence…” Barrat equally sees the dangers: “But AGI is much closer to nuclear weapons than video games…nanotechnology, bioengineering, and genetic engineering…all are primed for catastrophic accidents and exploitation in military and terrorist use…potentially dangerous technologies. And sometimes humans take a fall.” (pp. 154-156.)

Homo sapiens are not known to be “particularly harmless” when in contact with one another, other animals, or the environment, pointed out the experts---“DARPA will want its money back if superintelligence makes soldiers super friendly.” The “sprint” from AGI to ASI, says Barrat, “will not be accompanied by safeguards sufficient to prevent catastrophe.” (p. 159.)

“…an unimpeded ASI might express these drives in downright psychopathic ways…could be diabolically persuasive, even frightening…overwhelming intellectual firepower to the task of destroying its gatekeeping resistance…it could take control of our resources, even our own molecules,” says Barrat, “…a super AGI that really could directly take over our world.”


Barrat notes present-day scandals and failures such as Enron, the 2008 market crash, Japan’s Nckomata, BlackNet, AURORA, China’s Titan Rain, the U.S./Israeli Stuxnet (“Olympic Games”), and many other human criminal invasions and evil-minded technologies.

Barrat quotes science-fiction writer Simon Ings on mankind’s “final invention”:

“When our machines overtook us, too complex and efficient for us to control, they did it so fast and so smoothly and so usefully, only a fool or a prophet would have dared complain.” (p. 210.)

Steve Erdmann
August, 2018
St. Louis, Mo. 63111


You can reach Steve Erdmann on Facebook at https://www.facebook.com/stephen.erdmann1, or visit the Dissenter/Disinter Group at https://www.facebook.com/#!/groups/171577496293504/. You can also visit his articles at the following: http://www.minds.com (TheDissenter), http://www.ufospotlightwordpress.com, http://www.ufodigestblog.wordpress.com, http://www.ufodigest.com, and Alternate Perceptions Magazine: http://www.apmagazine.info/.
