While watching I, Robot my thoughts went back to my MDS302 class with Nell Kriesburg last spring. In the class we talked about the differentiation between conscious beings and those beings who are not at that level - humans/primates/animals/insects... and then on to robotic beings.

What are some thoughts on giving robots "life"? And what do you think about robots evolving (like in AI)?

I personally believe that robots can be given a certain level of "life" and "being," and that after a certain level of intelligence/being has been achieved, robots could possibly evolve... just as living organisms do

heh - #500

[Edited on February 6, 2006 at 8:57 PM. Reason : .5k]
2/6/2006 8:54:44 PM
CYLONS
2/6/2006 8:56:48 PM
I, Robot was a disgrace. What a stupid movie. Will Smith is a horrible actor and the robots looked like glossed-over Barbie toys.
2/6/2006 9:44:49 PM
2/6/2006 9:51:01 PM
haha, but i was not just talking about I, Robot... more along the lines of P.K. Dick and Asimov
2/6/2006 10:20:50 PM
life is made with matter. advanced science might be able to reproduce it on a meaningful level someday, assuming we avoid too much war or dark ages etc, but i don't expect to see the Matrix during my lifetime.
2/6/2006 10:26:27 PM
i believe that life is something we'll be able to emulate electronically, but I really don't believe we'll be able to create real emotions.
2/6/2006 10:51:01 PM
humans as a whole have a hard enough time controlling their own emotions and feelings; I doubt they will be emulatable to any decent quality any time soon.
2/6/2006 10:53:42 PM
wow, i'm surprised at the responses thus far

i think we'll definitely replicate human emotion within our lifetimes
2/6/2006 11:14:10 PM
replicate to the point that it seems real, or replicate in a real way?
2/6/2006 11:20:12 PM
2/6/2006 11:32:58 PM
what if?
2/6/2006 11:36:51 PM
yeah, what if. what if there are a million parallel universes and you are simultaneously living a million parallel lives.

the good thing about what if is that what if doesn't have to consider the realities of technology.
2/6/2006 11:39:23 PM
of course, but realistic viewpoints do kill a lot of thought on various topics. and we definitely don't know it all.
2/6/2006 11:40:45 PM
we dont? oh. yeah. good point

[Edited on February 6, 2006 at 11:42 PM. Reason : seriously.]
2/6/2006 11:41:19 PM
2/6/2006 11:41:54 PM
the turing test is about as good a measure as you're going to get
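(for the record, the test itself is just a blind-judging protocol: an interrogator chats with two hidden parties and has to guess which one is the machine. a rough sketch below - the judge object and the respond() callables are made up for illustration, not any real library:)

```python
import random

def turing_test(judge, human_respond, machine_respond, n_questions=10):
    """Imitation game: the judge questions two hidden parties,
    then guesses which one is the machine."""
    # randomly hide who is behind label "A" and label "B"
    parties = {"A": human_respond, "B": machine_respond}
    if random.random() < 0.5:
        parties = {"A": machine_respond, "B": human_respond}

    transcript = []
    for _ in range(n_questions):
        question = judge.ask(transcript)            # hypothetical judge interface
        transcript.append(("judge", question))
        for label, respond in parties.items():
            transcript.append((label, respond(question)))

    guess = judge.guess_machine(transcript)         # "A" or "B"
    truth = "A" if parties["A"] is machine_respond else "B"
    # the machine "passes" when judges can't beat chance over many runs
    return guess == truth
```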
2/7/2006 12:08:03 AM
I have another question: does anyone think it would be possible to merge human essence (or awareness, I can't think of a better word) into a mechanical being? Something along the lines of uploading your consciousness into a computer. Because if we could do that, it would be awesome.
2/7/2006 12:09:50 AM
do you think it's possible to take the data on a floppy disk and transfer it to a hard drive?
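(and to push on that analogy a bit: a "transfer" is really copy-then-delete - for a moment there are two identical instances, and the original only goes away if you erase it. the paths below are made up:)

```python
import shutil

src = "A:/mind_state.img"             # hypothetical floppy image
dst = "C:/backups/mind_state.img"     # hypothetical destination

shutil.copyfile(src, dst)   # now two bit-identical copies exist
# os.remove(src)            # only after this does it look like a "move"
```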
2/7/2006 12:16:50 AM
Doesn't matter, I suspect. Even if we do manage to replicate "real" thought/emotions, I suspect robots will always be treated as the property of others. Another reason it won't matter is that the only people working on "real" thought/emotions will be universities. The computers/robots which will flood the world market will be the ones with non-real intelligence, because they will be predictable. You can't have a machine you are selling to people actually "feel pain," because it might refuse the valid orders of its owner. I suspect the I, Robot rules are alright, but the actual programming will be a bitch.

This, of course, would make the legal questions irrelevant. If I "want" to work for free, that is my decision...

"Robot, why do you follow the orders of your owner?"
"I feel that I must."
"You don't have to, legally. If you wanted to quit you could."
"I am programmed to obey my owner. I do not want to do anything else."
2/7/2006 12:19:31 AM
what would count as real? would it be similar physical electrochemical reactions as humans have, or would it not count if they didn't have a supernatural element like a soul?
2/7/2006 12:30:46 AM
you have to account for those extra 21 grams somehow
2/7/2006 12:37:08 AM
WHY DO THE SCIENTISTS MAKE THEM???
2/7/2006 12:56:49 AM
2/7/2006 1:09:32 AM
i believe that within a hundred or so years, our technology will have advanced enough that a computer capable of emulating the human brain will be possible.

I'm no computer expert or anything, but i am a firm believer that a brain is 'simply' a "biological computer" that will sooner or later be replicated.

as far as emotions go, they are physical reactions to complex stimuli in the brain, and that could be programmed into a computer.

and yes, a computer with a brain as complex as a human's brain would be as self-aware as a human.

that being said, a computer will never have a soul, if you believe in that type of thing.
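(the 'emotions are reactions to stimuli' idea is easy to caricature in code, fwiw. a toy sketch - the stimuli and weights are invented, this isn't a claim about how brains actually do it:)

```python
# toy "emotion" model: internal state nudged by stimuli, which then biases behavior
state = {"fear": 0.0, "pleasure": 0.0}

STIMULI = {
    "loud_noise":  {"fear": +0.4},
    "food":        {"pleasure": +0.3, "fear": -0.1},
    "pain_signal": {"fear": +0.6, "pleasure": -0.5},
}

def perceive(stimulus):
    for emotion, delta in STIMULI.get(stimulus, {}).items():
        # keep each value bounded in [0, 1]
        state[emotion] = min(1.0, max(0.0, state[emotion] + delta))

def choose_action():
    return "flee" if state["fear"] > 0.5 else "explore"

for s in ["food", "loud_noise", "pain_signal"]:
    perceive(s)
print(state, choose_action())   # fear ends up high, so the thing "flees"
```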
2/7/2006 1:25:03 AM
and as far as transferring a human consciousness into a computer... that's tricky. if you can somehow download all of the data and pathways in the human brain into a machine designed to emulate the human brain, then it would appear that the human's consciousness had been transferred to the computer. but from the instant the download happened, there would then be two minds that up until that point in time had had the same unique experiences.

a way around that could be to replace the physical human brain with implants one piece at a time, so that in the end the human brain would be gone, and the computerized brain would then "be" the person.

it sounds like i'm kinda talking out of my ass, but i've thought about it... it's just hard to communicate what exactly i feel about the subject without typing for 30 minutes.
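(the replace-it-one-piece-at-a-time idea is basically a ship-of-Theseus transfer, and it's easy to sketch - even if the sketch hides all of the actually hard parts, like what the "state" of a brain region even is:)

```python
# ship-of-Theseus transfer: swap components one at a time while the whole keeps running
brain = [{"kind": "biological", "state": i} for i in range(5)]   # toy stand-in for brain regions

def replace_one(brain, index):
    old = brain[index]
    brain[index] = {"kind": "synthetic", "state": old["state"]}  # carry the state across

for i in range(len(brain)):
    replace_one(brain, i)
    # at every step the system is whole; there is never a moment with two complete copies

print(all(part["kind"] == "synthetic" for part in brain))  # True
```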
2/7/2006 1:29:11 AM
We won't be able to transfer consciousness until FAR after we develop the first true AI. Unless of course the AI is so brilliant that it figures it out for us.
2/7/2006 1:53:36 AM
2/7/2006 8:28:57 AM
ROBOTS DON'T HAVE A SOUL!!!
2/7/2006 8:49:56 AM
2/7/2006 9:06:52 AM
in MDS we read something - i think it was a short story - called "Learning to Be Me." the basic gist of it (if i can remember) was that the main character had gotten a "chip" installed in his brain. after a period of years, the chip would have learned everything about him... and basically be exactly as his brain was. soooooo, then he would have his brain removed and replaced with a very similar synthetic organ to which the chip would be attached (basically the same thing as a brain, without functioning brain activity... it still had blood vessels and such). and the story went on about his conflict with this replacement, yada, yada... which led to a discussion on how long a person should/could live.

for some reason i dont know where i was going with that, but you guys should check that story out

[Edited on February 7, 2006 at 9:57 AM. Reason : blah]
2/7/2006 9:56:44 AM
2/7/2006 10:38:56 AM
2/7/2006 11:44:34 AM
i know

like we're not just a meat computer

i mean i like what i am

i just know what i am
2/7/2006 11:47:46 AM
Meatbag.
2/7/2006 11:49:54 AM
^^^ exactly... who's to say that whatever emotional responses a robot has are not real, if some level of being has been introduced to it? primates and dogs, for example, have various emotional responses to different stimuli... although not like the responses humans might have to the same stimuli, the emotions are still "real," only different.

[Edited on February 7, 2006 at 11:53 AM. Reason : postpostpost]
2/7/2006 11:53:30 AM
background emotions are almost certainly identical among mammals, at least - they basically are a physical response arising in those same centers responsible for core consciousness. more advanced emotions are a product of extended consciousness (memories of previous experiences linked to a similar physical response). (Damasio says the consciousness of animals is probably most like what we experience when we dream: very strong emotions with no real feeling of "being in control"... b/c, well, they aren't and neither are we.)

a lot of what gives us emotional depth is the plasticity of our brain: similar stimuli can evoke very different responses based not only on additional external stimuli but also on "reflection". one of the key components of human extended consciousness (and that of quite a few other animals too, most likely) is the ability of the brain to also stimulate this same type of plasticity within other regions of itself.

there is really no reason to program emotions into a machine, however. they are the biological solution for providing negative and positive reinforcement for activities. consciousness and an awareness of your own mortality is the ultimate survival mechanism: an innate fear of death and knowledge of our own mortality is certainly a major contributor to the success of the human species. but it does seem infinitely simpler to just provide guidelines for a machine rather than try to recreate millions of years of evolution.
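(that last point maps pretty directly onto reinforcement learning, for what it's worth: you don't give the machine fear, you just give it a reward signal and let the numbers do the work. a bare-bones sketch with invented actions and rewards:)

```python
import random

# minimal one-state Q-learning: "pain"/"pleasure" are just negative/positive rewards
actions = ["approach", "avoid"]
rewards = {"approach": -1.0, "avoid": +0.5}   # the stimulus happens to be harmful

Q = {a: 0.0 for a in actions}                 # value estimate per action
alpha, epsilon = 0.1, 0.2

for step in range(1000):
    # epsilon-greedy: mostly exploit the best-known action, occasionally explore
    a = random.choice(actions) if random.random() < epsilon else max(Q, key=Q.get)
    r = rewards[a]
    Q[a] += alpha * (r - Q[a])                # nudge the estimate toward the observed reward

print(Q)  # the agent learns to "avoid" without ever feeling fear
```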
2/7/2006 12:15:09 PM
2/7/2006 12:30:03 PM
2/7/2006 12:54:18 PM
^^ That will all one day be possible. However, such machines will NOT be widely distributed for one obvious reason: they might be bad at math. A true neural network can get wrong answers, sometimes horribly wrong answers, against the general will of the owner. We can make BETTER machines by mixing neural networks, to deal with unforeseen events in the real world, with classical always-true computing based around programming and fixed behavior.

[Edited on February 7, 2006 at 1:08 PM. Reason : ^]
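(this is roughly how people combine the two in practice, fwiw: let the statistical part propose, let fixed code enforce the hard limits. a hand-wavy sketch - policy() stands in for some trained, fallible network:)

```python
import random

MAX_SPEED = 1.0   # hard, non-negotiable bound enforced by classical code

def policy(sensor_readings):
    # stand-in for a trained network: flexible, handles messy input, can be horribly wrong
    return random.gauss(0.0, 5.0)

def safe_command(sensor_readings):
    proposed = policy(sensor_readings)                    # neural part: might be nonsense
    return max(-MAX_SPEED, min(MAX_SPEED, proposed))      # classical part: always obeys the limit

print(safe_command([0.2, 0.7]))
```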
2/7/2006 1:08:18 PM
2/7/2006 1:19:37 PM
has anyone ever read "Learning to be Me" ?
2/7/2006 1:28:35 PM
2/7/2006 2:02:55 PM
i doubt you would be able to find any neurobiologists who deny that animals experience emotions on some level. it is almost certain, however, that background emotions like anxiety are relatively uniform throughout mammals: they basically involve physiological changes (body temp, heart rate, etc.) as well as activation of regions of the brain that are virtually identical in both humans and animals. the underlying neural circuitry does not differ drastically in the more "primitive" parts of the brain involved in these processes. additionally, there is no reason to think these emotions would change drastically due to evolutionary pressures: the anxiety response is extremely effective at increasing vigilance and wakefulness in preparation for a threat.

the real problem with creating a machine that feels emotion would be creating the core consciousness that produces the illusions of there being both an "observer" and free will. there isn't much research being done into this now, but the general belief seems to be that it involves self-referential neural circuits that also reference memories of past body states as well as "future" memories of body states (past real and future anticipated information from basic sensory receptors). there isn't much real research being done in this area, which is unfortunate b/c it is something that i would think most people would be deeply interested in.

[Edited on February 7, 2006 at 3:28 PM. Reason : .]
[Edited on February 7, 2006 at 3:28 PM. Reason : a lot of people are saying stuff from phi340 up in here]
2/7/2006 3:26:57 PM
It'll happen, and probably soon. Just read some Ray Kurzweil. Copy a person and you'll have the emotion. A computer will probably pass the Turing Test before 2030.
2/7/2006 4:36:42 PM
speaking of Ray Kurzweil, i'm actually reading The Age of Spiritual Machines right now (http://www.amazon.com/gp/product/0140282025/). i don't agree with everything he says, but it's an interesting book
2/7/2006 4:52:00 PM
He plans on living forever. And he's got $10,000 that says a computer will pass the Turing Test by 2029.
2/7/2006 4:53:18 PM
bttt?
2/14/2006 9:31:44 AM
ray kurzweil is awesome, but I think a lot of what he says is pretty far-fetched

i'm a total believer in the singularity, however. Of course, what that means is that there's absolutely no way to know what will actually happen during the singularity, because it's beyond our comprehension
2/14/2006 10:14:45 AM
Supplanter makes a pretty good point - people freak out so much over the fact that if you were to recreate a person atom for atom, there'd be "two of that person" who'd start to diverge. For some reason, this seems like a peculiar and strange "error case" situation.

However, what about yourself? You're a different person than you were even days or weeks ago. You're diverging from yourself all the time. What makes you the same person as a day, year, or decade ago? If you answer "memories" (or especially "a soul"), then you still have a lot of explaining to do.

I'm more of the opinion that life is nothing special, just a self-preserving pattern. As long as the pattern is being preserved, the creature is "alive". But what if the pattern changes shape slightly, or even drastically, while still maintaining an ordered state? It would still be alive, but fundamentally different.

It seems that what makes us "us" is our memories, which might explain why we cling to them so often, even when it doesn't do us justice.
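(Conway's Game of Life is a nice toy version of the self-preserving-pattern idea: a glider keeps its identity even though, a few steps later, the actual live cells making it up are completely different ones. a small sketch:)

```python
from itertools import product

def step(live):
    """One Game of Life generation over a set of (x, y) live cells."""
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                counts[(x + dx, y + dy)] = counts.get((x + dx, y + dy), 0) + 1
    # a cell is alive next step if it has 3 live neighbors, or 2 and was already alive
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
later = glider
for _ in range(4):
    later = step(later)

# after 4 steps the "same" glider sits one cell down and right: same pattern, different cells
print(later == {(x + 1, y + 1) for (x, y) in glider})  # True
```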
2/14/2006 10:20:05 AM