posted by
shriker_tam at 12:14am on 22/10/2008 under jurism
My gut feeling is that things don't change as much as we think they do. Not where it matters. I believe that tech changes while people basically stay the same. I think cave men probably worried about whatever their equivalent to asking girls out was and fought over stupid things like why Grog couldn't keep his cave as tidy as all the other neighbours did. And I think that future humans or post-humans or whatever probably still will do those things a thousand years from now.
The basic human experience stays relevant; topical issues and politics become dated. That's why people still like Homer and Shakespeare, but don't really care about Lewinsky jokes anymore. Until those become of historical interest, of course. There is also something to be said about the fact that people use everything for sex. The Sims was originally a practical architecture simulator - but when people were given free rein over it, they (re)created prostitution instead. Regardless of what Demolition Man says, sex will probably be just as much on people's minds in the future as it is now.
We eat, sleep, crap, screw, love, hate, fight, make friends, break up, are jealous, envious, proud, greedy, vain, kind, giving, grateful and rude - and I think we always have, and always will. The methods and expressions may change, but not the essential facts.
On the other hand, our definitions change all the time. Our tools for interpreting those essential facts change. What is a human? And what makes them so? Those questions probably have different answers for different people in different times. And as tech develops, this may become even more true. So will this be a crisis? Or do we just fear it because we (like everyone else since the dawn of time) are caught up in the current and feel out of control?
A cyborg is a creature that is part man, part machine. An android is a machine that looks like a man, to a greater or lesser extent. These creatures appear in SF all the time - but what is the essential difference? Data and the Terminator are androids, made (almost) entirely out of artificial parts and programmed like computers. Jamie Sommers, Del Spooner and Berlusconi are cyborgs, originally fully human and outfitted with machine replacements for broken or missing bits. What about Roy Batty? Entirely artificial, but entirely biological. What about Krang (if he'd been human) - a biological brain in a machine body. Where do we draw the line? Are you human as long as somewhere down the line you were born, no matter what you then do to your body later? Conversely - if you were never born, can you ever become human?
Does the fact that we already ask these questions, as evidenced by the fact that I could rattle all those characters off without doing any research, mean that maybe there is not as much of a crisis as some would think? These questions are not new - the essential question, "what is it to be human", has probably been asked since we invented language - it's only the robot part that is new. And even that is not as new as you might think. Frankenstein's monster is sort of like Roy Batty, whatever he is, and asked the same questions. (A Flesh Golem, if you play Oblivion... Clay golems are sort of robots, but my mythology knowledge doesn't extend far enough to know whether anyone debated their human/non-human status before Feet of Clay.)
We don't have an answer yet - maybe we never fully will - but what will it mean that those who will have to decide whether to grant the first fully independently functioning AI citizen status have already seen Picard and Data go through that trial on TV?
Is there a paper here somewhere that I would be willing to write? Random philosophising is easy, paper writing is hard... Of course, course requirements are light too.