A Usenet thread of mine two years ago, on Singularity confusion, humanism of Bujold and Pratchett, and Scottish materialist revel.

Comments

pompe
Sep. 1st, 2007 10:43 pm (UTC)
I don't think genetic intelligence enhancement qualifies as Singularity. While there is no reason to assume we can't increase average (or maximal) intelligence genetically, it seems odd to me to believe that path leads to the kind of drastic and sustained intelligence growth that would qualify as a Singularity-approaching event. I mean, we can code for higher strength too, but no human so coded will ever be able to lift even a fairly average-sized mountain.

What I'm mostly tired of is Singularitarianism as a secular faith. It's the enthusiastic, almost breathless, expectation some of the visionaries indulge in. That's a bad sign for any ideology.

Then there is a relevant critique of the field we need to ask ourselves: is this really something we should focus on as a species? It can be argued that people are basically happy once some rather basic needs and services are taken care of. Studies don't show that high intelligence is one of them, nor do I think we can argue it is a particularly important survival or mate-selection trait. Maybe social reform is a better first step towards a world where heightened intelligence actually would make us happy.
mindstalk
Sep. 1st, 2007 10:50 pm (UTC)
I have to run off to anime club, but I'll point out that pure biotech is in fact one of Vinge's four paths to Singularity (after AI, human-computer links, and group minds), in both his fiction (Tatja Grimm) and his non-fiction (the essay the list of four paths comes from): http://mindstalk.net/vinge/vinge-sing.html

We could debate whether a society where everyone has the mental abilities of Mozart, Einstein, (continue list to taste) is possible, and whether it would qualify as Singularity, or whether humans might opt to have even larger brains, to pack in the various capabilities... but if we can't look to Vinge for some touchstone as to what Singularity should mean, we're screwed.
pompe
Sep. 1st, 2007 11:54 pm (UTC)
Whatever admiration we may have for Vinge's centrality in popular Singularitarianism shouldn't stop us from critically examining what he says. His logic isn't entirely consistent, because his fourth path is much looser than the previous three, which I agree make some sense. But he takes this:

"Biological science may provide means to improve natural human intellect."

and extrapolates it to mean this:

"When greater-than-human intelligence drives progress, that progress will be much more rapid. In fact, there seems no reason why progress itself would not involve the creation of still more intelligent entities -- on a still-shorter time scale. The best analogy that I see is with the evolutionary past: Animals can adapt to problems and make inventions, but often no faster than natural selection can do its work -- the world acts as its own simulator in the case of natural selection. We humans have the ability to internalize the world and conduct "what if's" in our heads; we can solve many problems thousands of times faster than natural selection. Now, by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our human past as we humans are from the lower animals.

From the human point of view this change will be a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control. Developments that before were thought might only happen in "a million years" (if ever) will likely happen in the next century. (In [5], Greg Bear paints a picture of the major changes happening in a matter of hours.)

I think it's fair to call this event a singularity ("the Singularity" for the purposes of this paper). It is a point where our old models must be discarded and a new reality rules."

...which, although nicely prophetic, does not necessarily follow from the first claim. There's no radical change there; it is just an improvement. He's notably fuzzy in the rest of the paper about what that fourth path is supposed to be; there's much more on AI, networks, and human-computer interfaces. That is problematic because I think he's missing the critical issue, namely _how_ do you improve the "natural" human intellect through "biological science" to the same degree as the other three paths would potentially improve some sort of global reasoning capacity, enough to enable the Singularity? For that matter, what is "greater-than-human intelligence"? Einstein and Mozart were both just humans, and both certainly had aspects of their intellects we perhaps do not particularly envy.

Then there's the fifth path, which I actually think is the most interesting one, and which I don't think he mentions much at all. The critical question, if we consider the Singularity to be "a future time when societal, scientific and economic change is so fast we cannot even imagine what will happen from our present perspective" (to use a fairly non-loaded version), is whether we perhaps don't need greater-than-human intellects, AIs and computer-human interfaces to set that off at all.