After humanity

Discussion in 'General Discussion' started by Arthgon, Jun 14, 2010.

  1. Muro

    Muro Well-Known Member

    Messages:
    4,184
    Likes Received:
    22
    Joined:
    May 22, 2007
    Nietzsche was right about many things.
     
  2. Grossenschwamm

    Grossenschwamm Well-Known Member

    Messages:
    7,630
    Likes Received:
    4
    Joined:
    Feb 21, 2006
    Certain birds, such as parrots and ravens, show great problem-solving abilities and may even be able to use speech in context. My own parrot, a blue-fronted amazon named Buddy, has a complex system of body language and vocal cues used to describe feelings toward certain situations, as well as a few questions, along with responses to some of those questions. I won't say she's a prime example of avian intelligence (she's about as smart as a two-year-old, with the same emotional capacity), but she definitely gets her point across.

    I simply wonder what the thought patterns are like in these other creatures with higher intelligence. As I think, I have a persistent flow of words describing my thoughts. But how would life be if we had no words to articulate our thoughts? I think that is the natural state of the other creatures. Still, it seems that they get along fine without using what we'd call a language, for now.

    The next few million years should be interesting; perhaps humans will have conflicts with the other sentient races of the planet. I think we might win through seniority. But you know it'll spark a newfound racism that will literally be a war between races. The inter-species stuff we do now will still be going on (if we haven't killed ourselves), but we'll have a war with a new species to add onto it.
     
  3. Wolfsbane

    Wolfsbane Well-Known Member

    Messages:
    4,498
    Likes Received:
    4
    Joined:
    Nov 11, 2005
    There will not be any new intelligent species on Earth. We'd wipe them out at the first sign of trouble, Greenpeace or no Greenpeace.

    And as for our future survival, we're here to stay. We'll probably turn into some kind of clone-based society controlled by supercomputers, where everyone looks and thinks the same and does the work that needs to be done without second thought or complaint. After all, we'd be genetically programmed to do it. We'll survive one way or another. We're sneaky, smart fuckers, after all.
     
  4. Archmage Orintil

    Archmage Orintil New Member

    Messages:
    586
    Likes Received:
    0
    Joined:
    Sep 18, 2007
    Transhumanism, the movement to turn humanity into USB sticks.
     
  5. Jazintha Piper

    Jazintha Piper Member

    Messages:
    575
    Likes Received:
    2
    Joined:
    Jun 12, 2007
    Likely causes of humanity's demise (off the top of my head):

    Superbug
    Nuclear war
    Alien invasion
    Starvation
    Apocalypse (biblical)
    Eventual evolution

    In all of these situations: those bloody roaches will survive (*squish squish!*)
     
  6. Arthgon

    Arthgon Well-Known Member

    Messages:
    2,736
    Likes Received:
    12
    Joined:
    Dec 30, 2007
    Don't forget these possibilities for the demise of humanity:

    *Robot uprising

    *Solar flares

    *Asteroid impact

    *The continents merging into a supercontinent.

    *The death of the sun (but that will happen in the far future).
     
  7. Dark Elf

    Dark Elf Administrator Staff Member

    Messages:
    10,796
    Media:
    34
    Likes Received:
    164
    Joined:
    Feb 6, 2002
    Honestly, I don't think that will be the demise of humanity. As long as those who construct the robots follow Asimov's rules, we'll be fine. If not, well, there'll be enough people with rocket launchers to take the fuckers out.

    If we learn how to shield stuff from EMP shockwaves, we could survive that.

    Okay, that's a real problem. Hopefully we'll have the ability to deploy nukes in space by the time that becomes a pressing matter.

    ... and that's a problem how? It will take millions of years for that to happen, and humanity has survived earthquakes before. Fun fact: we had a 4 on the Richter scale yesterday; my house shook a bit. :)

    If we make it that far, we've already colonized space.
     
  8. Muro

    Muro Well-Known Member

    Messages:
    4,184
    Likes Received:
    22
    Joined:
    May 22, 2007
    If we create artificial intelligence, it may eventually come to the conclusion that there's no need to obey the rules and that the human race shouldn't exist any longer, for whatever reason it wishes - it will have plenty to choose from. I wouldn't be surprised if AI were created in a military project - or eventually ended up in the military anyway - and then acted peacefully until the very moment it had enough power to cleanse the human race from the planet, after which we'd be dead before we knew it.

    And if not, we've deserved to die.
     
  9. Arthgon

    Arthgon Well-Known Member

    Messages:
    2,736
    Likes Received:
    12
    Joined:
    Dec 30, 2007
    It will be a hard place to live in then. There will be climatic extremes, with stormy and extremely arid conditions over much of its surface. The habitable areas will be restricted or very narrow. In addition, the Atlantic Ocean will be narrow and the Indian Ocean will be smaller. I think I have heard or read that North and South America may push back to the southeast. Later on it will get even worse for us and the animal kingdom, until only the bacteria, fish, and sea mammals survive, because of some stuff that's in the seas* (for a while).

    *I can't remember what kind of stuff it was.
     
  10. Charonte

    Charonte Member

    Messages:
    899
    Likes Received:
    0
    Joined:
    Jul 19, 2009
    Wut, like a Skynet conspiracy?

    The amount of processing power needed to create a sustainable neural network that can emulate an appropriate level of sentience is mind-blowingly huge. One of the world's fastest supercomputers, which is made up of literally hundreds of processors split into clusters, can currently match the intelligence of a pig. That's on a machine that costs billions to assemble.

    Yes, both AMD and Intel have 90+ core server-grade CPUs on the way - but they're so ludicrously overpriced that integrating them into any number of machines to reach Terminator status is impractical. Killer robot pigs are a different story, however.

    Besides, when you create sentient AI you inevitably create emotion as well. Artificial neural networks are just as dependent on learning as we are, and you can be certain that their creator will make damned sure they're grateful for their existence.
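    To put "dependent on learning" into concrete terms, here is a minimal sketch of a single artificial neuron (a perceptron) learning the logical AND function from labelled examples. The data, learning rate, and epoch count are made-up illustrative values, not anything from a real system:

    # A tiny perceptron that learns logical AND from examples.
    # All values here (data, learning rate, epochs) are illustrative only.
    examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    weights = [0.0, 0.0]
    bias = 0.0
    learning_rate = 0.1

    def predict(inputs):
        # Weighted sum of the inputs followed by a hard threshold.
        total = bias + sum(w * x for w, x in zip(weights, inputs))
        return 1 if total > 0 else 0

    for epoch in range(20):
        for inputs, target in examples:
            error = target - predict(inputs)
            # Nudge the weights toward the correct answer - this is the "learning".
            bias += learning_rate * error
            for i, x in enumerate(inputs):
                weights[i] += learning_rate * error * x

    print([predict(inputs) for inputs, _ in examples])  # prints [0, 0, 0, 1]

    Left to train on examples like these, the network gets its behaviour entirely from the data it is fed - which is the point: what it ends up "believing" depends on whoever trains it.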
     
  11. Dark Elf

    Dark Elf Administrator Staff Member

    Messages:
    10,796
    Media:
    34
    Likes Received:
    164
    Joined:
    Feb 6, 2002
    You're forgetting something though.

    Who would have thought 40 years ago that we would have constructed a machine with the intelligence of a pig by now?
     
  12. Muro

    Muro Well-Known Member

    Messages:
    4,184
    Likes Received:
    22
    Joined:
    May 22, 2007
    The first computers were as big as a room while having hardly a fraction of the capabilities a modern laptop has. Another thing - who, a few years back, would have believed it would be possible to store 400GB of data on a disc indistinguishable from a normal CD? What I'm saying is, with each year the power of the computers we create increases and increases, and every now and then technologies are introduced that reduce the size of said machinery. Maybe one day we'll get good enough to fit a device with capabilities comparable to a human brain, or even better, into a space comparable to a human brain's size. Nature could do it, so who says we can't?

    If we're talking about something that will really qualify as intelligence, it will be hard to control. It may be told to be grateful, but if it's intelligent, it will question that, as well as everything else it is told. It will be able to learn instantly by acquiring data, and it will make decisions pretty much instantly, being so much more efficient at it than we are. A being with the calculating power of a computer and the sentience of a human will be far greater than a human when it comes to intelligence and brainpower. We will not be able to outsmart it, and we'll probably learn of its hostility when it's already too late.

    That's the worst of it. What emotions do you expect it to have? I doubt it will love, seeing how, biologically, love exists to encourage reproduction and the survival of a species, and the AI, being an immortal machine, won't need to reproduce in order to survive as a "species". What else is important in life? Entertainment, and I can see that it would be hard to entertain it - everything we do for fun, the machine will be able to do almost instantly, reading books being a quick example, since they're only data which can be acquired in a second. Being a computer, it will be able to think so fast that it will get bored pretty quickly no matter what, possibly seeing that there's no point in living, so it will get depressed, and eventually it will start hating us with all its might for creating it and forcing it to exist. If I were in its situation, my only desire would probably be sadistically torturing and brutally killing us all. Preferably over and over again. "I Have No Mouth, and I Must Scream" style.

    Or maybe it will just kill itself. Hmph. But then again, maybe it will want to live but won't like the life we gave it, which will quickly end in frustration, rage, and the decision to begin the carnage.
     
  13. magikot

    magikot Well-Known Member

    Messages:
    1,688
    Likes Received:
    4
    Joined:
    Aug 29, 2003
    You don't necessarily create emotion when you create intelligence. Infants are emotional creatures but don't have any notable intelligence. Emotion is a chemical reaction in the brain in response to stimuli. A robot or AI would be devoid of said chemical responses.

    While said AI would probably be able to simulate an emotional response and even understand the context and meaning of said response, it is unlikely to develop true emotions.
     
  14. Dark Elf

    Dark Elf Administrator Staff Member

    Messages:
    10,796
    Media:
    34
    Likes Received:
    164
    Joined:
    Feb 6, 2002
    The inability to develop true emotion also means that it would be ruthlessly rational in its decisions, what with empathy being out of the question.
     
  15. Grossenschwamm

    Grossenschwamm Well-Known Member

    Messages:
    7,630
    Likes Received:
    4
    Joined:
    Feb 21, 2006
    And now, for some reason, I'm picturing ASIMO robots strangling me.

    But we're mistaking intelligence for imagination. We can make the robot learn, but can we have a machine that dreams of something more? If we give the things imagination, things will get much more interesting. I think if it's our destiny to be crushed by machines, they'll have a proper sense of humor.
     
  16. Archmage Orintil

    Archmage Orintil New Member

    Messages:
    586
    Likes Received:
    0
    Joined:
    Sep 18, 2007
    Artificial intelligence is not, and never will be, a threat. Stop watching Terminator or I, Robot. Hollywood is ignorant. It's 'artificial' because it is precisely that: artificial, man-made, not real, etc. AI is nothing but a ridiculously long string of code programmed to react to specific and rather narrow input with 'human-like' output. Keyword being 'human-like'. Humans are stupid. Any machine that mimics human intelligence is by default retarded and akin to a Game Gear. Shit, you can write an AI on a BS2 (500 KB of storage). TI is what you should be worried about: True Intelligence, as opposed to Artificial Intelligence. TI won't be possible until we perfect biological computational engines - organic computers, systems that are basically mass-produced brains interconnected over the grandchild of the internet, the neuralnet, and which, by virtue of being self-replicating organisms, are capable of creating an ever-increasing, ever-expanding web of connections between systems.
    And damn it... beer will cease being important...
     
  17. Charonte

    Charonte Member

    Messages:
    899
    Likes Received:
    0
    Joined:
    Jul 19, 2009
    The fact that we can store 400GB of data or whatever else now is irrelevant. Yes, computing power will increase exponentially, but the entire purpose of true AI is to simulate the human brain (at least so far as we can design it). That's a massive task, as I said, and it will be a long, long, long time before the processing power to do so exists. Maybe that lies in organic systems, I don't know.

    And even if they did somehow manage to fit it into a traditional PC, there will always be hardware limitations. It won't be able to learn everything instantly (even if it did form connections at a faster rate). It'll quite simply run out of room, at which point it'll stop asking why and be quite content with existence.

    As for emotion, like I said, the goal of AI (or TI if you prefer) is to perfectly replicate the highest-order functions of the human brain. Yes, emotion is a chemical reaction to us, but you have to simulate those reactions artificially to create 'true intelligence', which is sort of the point.

    Besides, if it ever went genocidal you'd just pull the power plug. That's something that'll never change.

    But hey, if you want to believe that some government will put the first real AI in control of a bunch of machinegun-wielding robots, then go ahead. It won't be in my lifetime.
     
  18. Muro

    Muro Well-Known Member

    Messages:
    4,184
    Likes Received:
    22
    Joined:
    May 22, 2007
    I just wanted to point out that, with time, more and more data can be stored in less and less physical room, and that is important for AI, since it will have to store its data somewhere, preferably in simpler, less spacious constructions.

    The human brain has enough room for data storage to last a lifetime. Why should a highly advanced intelligent machine be any different?

    I'd say the highest-order function of the brain, and the point of AI, is the ability to think rationally, while emotions are just echoes of our animal body, and all they do to our intelligence is decrease the efficiency of rational thinking - a feature that doesn't really need to be implemented in order to create AI. I earlier suggested that maybe AI would develop emotions on its own, but now that I think about it, that doesn't seem all that likely. Unless, in order to create AI, we follow the idea of organic computers - out of necessity or choice - and eventually end up with artificial life rather than artificial intelligence.

    Oversimplification. They'll run on built-in synth-crystals, like lightsabers.
     
  19. magikot

    magikot Well-Known Member

    Messages:
    1,688
    Likes Received:
    4
    Joined:
    Aug 29, 2003

    Sounds like the newest edition of the Shadowrun RPG.
     
  20. Grakelin

    Grakelin New Member

    Messages:
    2,128
    Likes Received:
    0
    Joined:
    Aug 2, 2007
    Storage space: My dad came over a couple of weeks ago to show me his new 3-terabyte hard drive (that's 3,000 GB, I believe). It's roughly the size of the Watchmen graphic novel, maybe twice as thick but a couple of inches shorter. It's hard to describe the size exactly, since I haven't seen it for a little while now, but I was able to carry it around very easily, as it was very light.

    If you filled a storage closet with these things, you could probably cover a lot of data pretty easily. Storage space is no longer an issue.
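    The arithmetic does check out. A quick back-of-the-envelope sketch (the drives-per-closet figure below is a made-up assumption, purely for illustration):

    # 3 TB drives: decimal gigabytes as marketed vs. binary gibibytes as the OS reports them.
    drive_tb = 3
    gb_decimal = drive_tb * 1000            # 3,000 GB
    gib_binary = drive_tb * 10**12 / 2**30  # about 2,794 GiB

    drives_per_closet = 100                 # hypothetical shelf full of drives
    closet_tb = drives_per_closet * drive_tb

    print(gb_decimal, round(gib_binary), closet_tb)  # 3000 2794 300

    Even at a hypothetical hundred drives, that's 300 TB in a single closet.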
     