The Singularity Is Near: When Humans Transcend Biology


by: Ray Kurzweil


From Publishers Weekly
Starred Review. Renowned inventor Kurzweil (The Age of Spiritual Machines) may be technology's most credibly hyperbolic optimist. Elsewhere he has argued that eliminating fat intake can prevent cancer; here, his quarry is the future of consciousness and intelligence. Humankind, it runs, is at the threshold of an epoch ("the singularity," a reference to the theoretical limitlessness of exponential expansion) that will see the merging of our biology with the staggering achievements of "GNR" (genetics, nanotechnology and robotics) to create a species of unrecognizably high intelligence, durability, comprehension, memory and so on. The word "unrecognizable" is not chosen lightly: wherever this is heading, it won't look like us. Kurzweil's argument is necessarily twofold: it's not enough to argue that there are virtually no constraints on our capacity; he must also convince readers that such developments are desirable. In essence, he conflates the wholesale transformation of the species with "immortality," for which read a repeal of human limit. In less capable hands, this phantasmagoria of speculative extrapolation, which incorporates a bewildering variety of charts, quotations, playful Socratic dialogues and sidebars, would be easier to dismiss. But Kurzweil is a true scientist (a large-minded one at that) and gives due space both to "the panoply of existential risks" as he sees them and the many presumed lines of attack others might bring to bear. What's arresting isn't the degree to which Kurzweil's heady and bracing vision fails to convince (given the scope of his projections, that's inevitable) but the degree to which it seems downright plausible. (Sept.) Copyright Reed Business Information, a division of Reed Elsevier Inc. All rights reserved.

From Booklist
Continuing the themes of The Age of Spiritual Machines (1999), Kurzweil further expounds his conviction that the human being will be succeeded by a superintelligent entity that is partly biological, partly computerized. Welcoming this prospect, and regarding it as inevitable, Kurzweil plunges into contemporary technological arenas, particularly genetics, nanotechnology, and robotics. Citing examples from medical devices to military weapons in which human control is increasingly detached from the autonomy of machines, Kurzweil stresses that trends are accelerating in terms of miniaturization and computational power. Eventually, smallness and speed reach a point of development, a "singularity," with implications Kurzweil says even he cannot imagine. Disinclined to categorize his views as dystopian or utopian, the author recognizes that his vision is profoundly threatening to concepts of human nature and individuality. A closing section on philosophy and ethics accordingly addresses objections to his optimistic predictions. An involved presentation, this is best for readers of the wide-angle, journalistic treatment Radical Evolution (2005), by Joel Garreau. --Gilbert Taylor
Copyright American Library Association. All rights reserved.
About the Author
Ray Kurzweil is a prizewinning author and scientist. Recipient of the Lemelson-MIT Prize (the world's largest prize for innovation) and an inductee of the National Inventors Hall of Fame, he received the 1999 National Medal of Technology. His books include The Age of Spiritual Machines and The Age of Intelligent Machines.


Reviews:

Technically brilliant, culturally constrained: Ray Kurzweil is unquestionably the most brilliant guru for the future of information technology, but Joel Garreau's book "Radical Evolution" covers the same ground, with the same lack of soul, but with more interesting and varied detail. This is really four booklets in one: a booklet on the imminence of exponential growth within information technologies including genetics, nanotechnology, and robotics; a booklet on the general directions and possibilities within each of these three areas; a booklet responding to critics of his past works; and lengthy notes. All four are exceptional in their detail, but somewhat dry. I was disappointed to see no mention of Kevin Kelly's "OUT OF CONTROL: The Rise of Neo-Biological Civilization," and just one tiny reference to Stewart Brand (co-evolution) in a note. Howard Rheingold (virtual reality) and Tom Atlee (collective intelligence) go unmentioned. It is almost as if Kurzweil, who is surely familiar with these "populist" works, has a disdain for those who evaluate the socio-cultural implications of technology, rather than only its technical merits. This is an important book, but it is by a nerd for nerds. [Sorry, but anyone who takes 250 vitamin supplements and has a schedule of both direct intravenous supplements and almost daily blood testing is an obsessive nerd, however worthy the cause.] It assumes that information technologies, growing exponentially, will solve world hunger, eliminate disease, replenish water, create renewable energy, and allow all of us to have the bodies we want, and to see and feel in our mates the bodies they want. All of this is said somewhat blandly, without the socio-cultural exploration or global evaluation that is characteristic of other works by reporters on the technology, rather than the technologists themselves. The book is, in short, divorced from the humanities and the human condition, and devoid of any understanding of the pathos and pathology of immoral governments and corporations that will do anything they can to derail progress that is not profitable. It addresses, but with cursory concern, most of the fears voiced by various critics about runaway machines and lethal technologies that self-replicate in toxic manners to the detriment of their human creators. The book is strongest in its detailed discussion of both computing power and draconian drops in the energy needed both for computing and for manufacturing using new forms of computing. The charts are fun and helpful. The index is quite good.

I put the book down, after a pleasant afternoon of study, with several feelings. First, that I should give Joel Garreau higher marks for making this interesting, and recommend that his book be bought at the same time as this one. Second, that there is an interesting schism between the Kurzweil-Gates gang that believes they can rule the world with machines, and the Atlee-Wheatley gang that believes that collective **human** intelligence, with machines playing a facilitating but not a dominant role, is the desired outcome. Third, that there really are very promising technologies with considerable potential down the road, but that government is not being serious about stressing peaceful applications--the author is one of five advisors to the U.S. military on advanced technologies, and it distresses me that he supports a Defense Advanced Research Projects Agency (DARPA) that focuses on making war rather than peace--imagine if we applied the same resources to preventing war and creating wealth. Fourth, that information technologies are indeed going to change the balance of power among nations, states, and neighborhoods--on balance, based on his explicit cautions, I predict a real estate collapse in the over-priced major cities of the US, and a phenomenal rise of high-technology villages in Costa Rica and elsewhere.

The singularity may be near, as the author suggests, but between now and then tens of millions more will die. Technology in isolation is not enough--absent broad ethical context, it remains primarily a vehicle for nerds to develop and corporations to exploit. As I told an internal think session at Interval in the 1990s ("GOD, MAN, & INFORMATION: COMMENTS TO INTERVAL IN-HOUSE," Tuesday, 9 March 1993, can be used as a Google search), until our technologies can change the lives of every man, woman, and child in the Third World, they are not truly transformative. This book hints at a future that may not be achieved, not for lack of technology, but for lack of good will.

EDIT of 24 Oct 05: Tonight I will review James Howard Kunstler's "The Long Emergency: Surviving the Converging Catastrophes of the Twenty-First Century." His bottom line is that cheap oil underlies all of our suburban, high-rise, mega-agriculture, and car-based mobility, and that the end of cheap oil is going to have catastrophic effects on how we live, driving much of the country into poverty and dislocation, with the best lives being in those communities that learn to live with local agriculture and local power options. Definitely the opposite of what Kurzweil sees, and therefore recommended as a competing viewpoint.

Important extrapolations, but not as careful or concise as I wanted:
Kurzweil does a good job of arguing that extrapolating trends such as Moore's Law is better than most alternative forecasting methods, and he does a good job of describing the implications of those trends. But he is a bit long-winded, and tries to hedge his methodology by pointing to specific research results which he seems to think buttress his conclusions. He neither convinces me that he is good at distinguishing hype from value when analyzing current projects, nor that doing so would help with the longer-term forecasting that constitutes the important aspect of the book. Given the title, I was slightly surprised that he predicts that AIs will become powerful slightly more gradually than I recall him suggesting previously (which is a good deal more gradual than most Singularitarians). He offsets this by predicting more dramatic changes in the 22nd century than I imagined could be extrapolated from existing trends. His discussion of the practical importance of reversible computing is clearer than anything else I've read on this subject.

When he gets specific, large parts of what he says seem almost right, but there are quite a few details that are misleading enough that I want to quibble with them. For instance (talking about the world circa 2030): "The bulk of the additional energy needed is likely to come from new nanoscale solar, wind, and geothermal technologies." Yet he says little to justify this, and most of what I know suggests that wind and geothermal have little hope of satisfying more than 1 or 2 percent of new energy demand. His reference to "the devastating effect that illegal file sharing has had on the music-recording industry" seems to say something undesirable about his perspective. His comments on economists' thoughts about deflation are confused and irrelevant. On page 92 he says "Is the problem that we are not running the evolutionary algorithms long enough? ... This won't work, however, because conventional genetic algorithms reach an asymptote in their level of performance, so running them for a longer period of time won't help." If "conventional" excludes genetic programming, then maybe his claim is plausible (a minimal sketch of what such a conventional genetic algorithm looks like follows this review). But genetic programming originator John Koza claims his results keep improving when he uses more computing power.

His description of nanotech progress seems naive. (Page 228): "Drexler's dissertation ... laid out the foundation and provided the road map still being followed today." (Page 234): "each aspect of Drexler's conceptual designs has been validated." I've been following this area pretty carefully, and I'm aware of some computer simulations which do a tiny fraction of what is needed, but if any lab research is being done that could be considered to follow Drexler's road map, it's a well kept secret. Kurzweil then offsets his lack of documentation for those claims by going overboard about documenting his accurate claim that "no serious flaw in Drexler's nanoassembler concept has been described." Kurzweil argues that self-replicating nanobots will sometimes be desirable. I find this poorly thought out. His reasons for wanting them could be satisfied by nanobots that replicate under the control of a responsible AI.

I'm bothered by his complacent attitude toward the risks of AI. He sometimes hints that he is concerned, but his suggestions for dealing with the risks don't indicate that he has given much thought to the subject. He has a footnote that mentions Yudkowsky's Guidelines on Friendly AI. The context could lead readers to think they are comparable to the Foresight Guidelines on Molecular Nanotechnology. Alas, Yudkowsky's guidelines depend on concepts which are hard enough to understand that few researchers are likely to comprehend them, and the few who have tried disagree about their importance.
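A minimal sketch, in Python and not taken from the book, of the kind of "conventional" genetic algorithm debated above: fixed-length bitstrings evolved with tournament selection, one-point crossover, and bit-flip mutation against a toy fitness function. All names and parameters here (population size, mutation rate, and so on) are illustrative assumptions.

    # Minimal "conventional" genetic algorithm sketch (illustrative only; not from the book).
    # Fixed-length bitstrings, tournament selection, one-point crossover, bit-flip mutation.
    import random

    GENOME_LEN = 64                  # assumed toy parameters
    POP_SIZE = 50
    GENERATIONS = 200
    MUTATION_RATE = 1.0 / GENOME_LEN
    TOURNAMENT_K = 3

    def fitness(genome):
        # Toy "OneMax" fitness: count of 1-bits; the maximum possible is GENOME_LEN.
        return sum(genome)

    def tournament(pop):
        # Pick the fittest of K randomly chosen individuals.
        return max(random.sample(pop, TOURNAMENT_K), key=fitness)

    def crossover(a, b):
        # One-point crossover of two fixed-length parents.
        point = random.randrange(1, GENOME_LEN)
        return a[:point] + b[point:]

    def mutate(genome):
        # Flip each bit with a small, fixed probability.
        return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

    def run():
        pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
        for gen in range(GENERATIONS):
            pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(POP_SIZE)]
            best = max(fitness(g) for g in pop)
            if gen % 20 == 0:
                print(f"generation {gen:3d}: best fitness {best}/{GENOME_LEN}")

    if __name__ == "__main__":
        run()

On problems harder than this toy one, the best score of a fixed-representation GA like this typically flattens once the population loses diversity, which is the asymptote the reviewer is pointing to; Koza-style genetic programming instead evolves variable-size program trees, which is why its results can keep improving as more computing power is applied.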

Near, but not imminent: Kurzweil spends considerable energy pressing the power of exponential growth, and how we are approaching the "knee" in the curve of important technologies that will suddenly catapult our existence into a vastly new nano-cyber realm. The logical conclusion of our enhanced billion-human-minds-for-a-penny brain implants is that we'll convert all matter into computation devices, consuming or converting everything in a wave that crashes out over the universe. Kurzweil is a compelling writer, and his central idea, that intelligence is the most powerful force in the universe, is breathtaking enough for a sci-fi novel. But I'm still stuck just a bit in what he derisively calls "linear time". Clausewitz, the great general, wrote about "friction" that upsets and delays the best-laid plans, and I think Kurzweil's predictions will run into much more friction than he imagines. In my favorite section, he covers SETI, the search for extraterrestrial intelligence, a group that does great work on a tiny budget. Although an interested observer, he predicts they are wasting their time, as ETs would be impossible to miss if they wanted to be seen, and impossible to see if they didn't, and would anyway zip through the radio era to faster-than-light travel in a few decades.

The brainiest quantum computer...in the world is never satisfied that he has to share the planet with other brainy quantum computers, and tries to keep the other hyperbrainy quantum computers in their places. (They start complaining because the Man is riding them all the time with some authority complex.) Then he gets bored after exploring condensed methane wastelands orbiting other suns, and still not getting any. (But first, the organic things walking around before them were so machine-like that they didn't even arrive with operating instructions? No troubleshooting manuals in the glove compartment along with birth? That's not very machine-like. Sounds almost spiritual.) Then there's the timeframe--almost as if a Biology 101 class (an elective, I guess) was missing from the education track. In 40 years I know that as a quantum computer I may still have trouble getting a reliable automobile from the Big 3--not to mention a stable, secure operating system for my desktop. Sure, I'll probably be shopping in 3D online, and having clothing knitted down the street to custom order. But my desktop will still be processing only a few threads at a time at maybe 10 THz.

The Dawn Of A New Era!
There is so much in this book, it is difficult to decide where to begin. This book is a treasure-trove of information on post-singularity humankind, and the events leading up to it, which will allow us, if we desire, to transcend the physical limits of our bodies. The central message of Kurzweil is that we are rapidly approaching the technological singularity, where progress, as graphed on a linear chart, becomes almost vertical, changing the way we will view ourselves, our mortality, and the world around us, to nearly beyond recognition, during the next several decades. And for most people, the accelerating pace of scientific and technological change will come as a sort of future shock, as things around them are changed in rapid order. Kurzweil gives many examples of the accelerating pace of advancement, and extrapolates into the future, very well done with plenty of references. He asks: How singular is the Singularity? Throughout this book Kurzweil attempts to give some meaning to this question and what it will mean for us, as best as we can prognosticate with our current level of understanding. Obviously, over the next several decades many events will surprise us, but overall Kurzweil gives readers, it seems, a good understanding of what lies waiting for us in the near future.

Ray Kurzweil has written what I believe to be a landmark book, and he covers much ground along the way. He writes of the difference between knowledge and information, important with regard to the amount of 'noise' in the world today that attempts to pass as knowledge. The current state of brain scanning is given lengthy treatment, as is genetics, robotics, and nanotechnology, all transforming technologies according to Kurzweil. Very interesting to me was a very detailed analysis of the ultimate limits of computation, which physically could perhaps allow a liter-sized computer to simulate all the thoughts of all the people who have ever lived in about one ten-thousandth of a second, implying that there is plenty of room for improvement over existing computers (a back-of-the-envelope version of this arithmetic follows this review). In fact, it is thought that a $1000 personal computer will roughly equal the raw processing power of the human brain soon after 2020. In terms of software Kurzweil notes great strides there also, with brain scanning to allow reverse engineering of the human brain into a computer substrate as soon as the 2030s. Nanobots located in a person's brain, by the billions, will allow a person, perhaps in the 2040s, to have the vast majority of his or her thought processes located on a non-biological platform, and you really should read this book and find out the details of this most important future possibility. Other topics include the spread of intelligence outward into the universe, other civilizations, or the lack of them, out in space, and Kurzweil's personal view of what it means to be a 'Singularitarian'; his comments on page 370 are somewhat touching, and I can relate.

One point I think he may have glossed over a bit is our energy crisis, which seems to have the potential to become severe beginning in the next 5-15 years. Nanotechnology may help us in this area, but I see it as no panacea. We will see. Also, the modern-day Luddites are discussed; they have many followers and are trying to turn back the 'technological clock' to an earlier, simpler time. They are doomed to fail; technology always advances, either in the United States or elsewhere. I agree with Kurzweil on the near inevitability of progress, and if someone doubts the potential reality of the things expounded in this book, they are putting limits on future technology which I do not share. Progress will advance to the limits of the physically possible, in most cases, and usher in a new world order, and beyond. This tome is succinctly written and very comprehensive, with an extensive bibliography.
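The "one ten-thousandth of a second" figure mentioned above is a back-of-the-envelope estimate. Here is a minimal sketch of that arithmetic in Python, using assumed round-number inputs rather than figures quoted from the book: roughly 10^16 operations per second for one brain, about 10^11 humans ever having lived, a ~60-year average lifespan, and around 10^42 operations per second for a near-ultimate one-liter computer.

    # Back-of-the-envelope check of the "all human thought in a fraction of a second" claim.
    # Every number below is an assumed, commonly cited rough estimate, not a figure taken
    # from the book; the point is the orders of magnitude, not the exact answer.
    BRAIN_OPS_PER_SEC = 1e16            # rough functional estimate for one human brain
    HUMANS_EVER = 1e11                  # roughly 100 billion people have ever lived
    AVG_LIFESPAN_SEC = 2e9              # about 60 years expressed in seconds
    ULTIMATE_LITER_OPS_PER_SEC = 1e42   # ballpark for a near-ultimate one-liter computer

    total_human_thought_ops = BRAIN_OPS_PER_SEC * HUMANS_EVER * AVG_LIFESPAN_SEC
    seconds_needed = total_human_thought_ops / ULTIMATE_LITER_OPS_PER_SEC

    print(f"Total operations of all human thought, ever: ~{total_human_thought_ops:.0e}")
    print(f"Time for the hypothetical computer to replay it: ~{seconds_needed:.0e} s")

With these inputs the answer works out to a couple of microseconds; shave a couple of orders of magnitude off the assumed machine and it lands near the reviewer's one ten-thousandth of a second. Either way, the enormous headroom over today's hardware is the real point.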

The Singularity is near...unless we kill ourselves off, or get squashed by a meteor first: splat! Long before I ran across Kurzweil, I had little doubt that the Singularity would happen to humanity--eventually, somewhere deep into the distant future. Ray Kurzweil's 1999 book, The Age of Spiritual Machines, blew me away. Ray, backing up his ideas with convincing arguments, made me realize that the Singularity was no more than five decades away. Ray's new book, The Singularity is Near, just solidifies his arguments that some people alive today will live long enough to enter the Singularity as transhumans housed in powerful (quantum) computers. Which ones, however? The richest ones? Or "all" of us? This is a question Ray's new book, unfortunately, glosses over.

Why am I slightly paranoid? As a former officer who once worked on space-based military systems, a former econophysicist (financial engineer) who priced energy and weather derivatives for the greed-driven energy markets using advanced math and AI methods, and a current nuclear physicist at Los Alamos, I've gotten to worrying about extreme wealth gradients and the control of super powerful and risky technologies. It seems to me, unless we start addressing the issue soon, that the super rich will be able to afford Singularity technology long before the rest of us, and that we may thus be left behind to wither away. Take in vitro fertilization (IVF). It is very expensive, and few people can easily afford it. Now imagine, say, 15 years from now, the exorbitant cost of tweaking IVF embryos with state-of-the-art gene science to produce offspring with significantly improved physical and mental capabilities. Only the richest rich--think of the millionaire space tourists--will be able to afford such cutting-edge science. Then what, except to fall further behind, will happen to our kids when they try to compete against souped-up humans for jobs? Unless we take a stand, the extreme wealth and power gradients that already exist today between us poor slobs and the billionaires will likely grow far worse. Beyond advanced IVF, will privileged "transhumans" sporting highly expensive advanced bio-nano technologies so exceed us, and use up planetary resources so fast, that we, the rest of us poor slobs, will be made extinct? This is, after all, what we--the dominant species on Earth--are doing to tens of thousands of other "lesser" species.

Fortunately, many people are thinking about the dangerous possibilities of ultra wealth gradients, and working hard to come up with solutions. Working to this end--making sure we will all have the opportunity to partake of the Singularity--is, among others, the World Transhumanist Association. I also recommend the book by James Hughes, "Citizen Cyborg," on the issue of the "super" democracy we will need with the Singularity. Finally, I am a member of the International Humanist and Ethical Union, a very old UN advisory organization, and I recently moderated an IHEU/UN panel on the bioethics of advanced cloning involving senior scientists, politicians, ambassadors, bioethicists, the clergy and the public, based in part on my recently well-received "strong" science fiction book (Beyond Future Shock, ISBN 1419609440). Given our innate competitiveness, it explores the perils and promises of near-future advanced bio/nano technology in a world filled with increasing wealth gradients, limited resources, and still roiled by Dark Ages religious conflict. Supplementary material: Google Dr. Hughes's radio show, Changesurfer Radio, in which I and many other scientists, futurists, ethicists, philosophers, theologians and industry leaders share thoughts on the Singularity. Lastly, pick up a copy of Peter Turchin's wonderful book, War and Peace and War. (See my review of this great work on Amazon.)

In summary, Kurzweil's new book does a great, credible job of letting a mass audience know just how shockingly close we are to the Singularity--just three to five decades--and I strongly recommend it. But while you read it, think about ethics and the competitive and warlike history of humanity, and the extreme wealth gradients between the billionaires and ourselves. We need to address "Singularity" science now, before we are swallowed up. If we do things right, I am hopeful that we might end up in Kurzweil's utopia without first having ripped ourselves a new one.


