Problem with Ray Kurzweil's definition of Singularity

Discussion in 'SciFi & Fantasy' started by baftan, Dec 17, 2009.

  1. baftan ******* Valued Senior Member

    “Human beings will merge with machine and this will be “Singularity” (Kurzweil’s definition) around 2045 (again Kurzweil’s prediction).”

    I have no problem with the estimated time for this happening; that is beside the point. My problem is with the definition, or the name, “Singularity”. By definition, a singularity harbours no difference within itself: anything and everything within a “singular” existence must be the same, undifferentiated, unique, and single, unless there is some other definition of the term.

    However, what we observe in nature, as well as in human history, is that things tend to develop from simple to complex, from single to many, from singularity to plurality. Kurzweil claims that when machines become able to simulate every atom in our bodies and brains, and to completely understand the workings behind thinking, intelligence and life, around 2029 (Kurzweil’s estimate), we will be able to re-program reality however we wish. He relies on the idea that technology grows exponentially. Not only that: newly emerging fields such as nanotechnology, computer technology and bio-engineering are adding new dimensions to this growth, making it more widespread, substantial and rapid.

    Wikipedia defines “technological singularity” as follows:

    From this definition, I gather that the singularity implies the “unpredictability” of the future, just as some people call the pre-Big-Bang period or the centre of a black hole a “singularity” simply because we do not yet understand it. But why not “plurality”, why not “unpredictable future” (or “nature”, depending on the context), why not something else entirely? The term signifies an unshared, concentrated, centralized source of power and administration, which is a highly problematic and unsupported prediction for anything complex.

    I repeat the problem: I can accept that there could be a great shift in technology, and that our role as the only super-intelligent species on earth could be shattered by an artificial intelligence within this century. I can imagine that this intelligence, depending on how we define intelligence, may one day exceed human control, and that we would be forced to become part of this machine or go extinct. What I do not understand, or do not accept, is why these people call this paradigm shift a “Singularity”. The most complicated process known to us is life and its DNA-based existence. If anything is going to be more complicated than this, it should have a more complex structure, not a simpler one, and certainly not a “single” structure. Saying that a more complex mechanism will be called a “Singularity” is similar to claiming that a single creator created and controls everything. Maybe I have misunderstood the singularity, maybe I have missed some modern or futuristic hint behind the term; if not, then I see a big misrepresentation in this word as a description of the future path of technological growth.

    By the way, this discussion has nothing to do with “look, our computers work on 1s and 0s, but DNA works on a 4-based coding, so ours is simpler than DNA”. That is irrelevant, since 1s and 0s are still designed to be different from one another, and the whole idea is “being more than 1”, allowing a play of differences. This debate also has nothing to do with “what future technology will be like, what it will be called, when we will disappear from the earth” or similar fortune-telling issues. I know that we cannot pinpoint future technological growth, let alone guess what the next generations will name it. My problem is: why do we call it “Singularity” when there is not a single piece of evidence for the existence of singular yet complex structures in this universe?
     
