The 'Bigger' Questions of Westworld: Morality and Philosophy in the World of A.I.


Gobi

7 hours ago, luna1122 said:

What about 'raping'  a robot? Does it mean someone is inherently evil if that's their thing? 

While I think it is a fundamental component of the debate, for the sake of discussion, let's leave out the argument about whether it is possible to 'rape' a robot at all, seeing as it is a machine, and not human.

Is it inherently evil to 'rape' an inflatable doll?  To pay a human prostitute to allow you to enact such a scenario?  Personally, I have no desire to have sex with an inflatable doll, or any other sex-toy.  Nor have I ever paid for sex* or felt any inclination to do so.  But I don't believe that this makes me morally superior to people who have.  To claim that shooting (or 'raping') a facsimile human is just as inherently evil as shooting (or 'raping') a real human speaks volumes to me about the person making such a claim.  Because I, for one, can attest that you can do one (shoot a facsimile human) without being inclined to do the other (shoot a real human).

* Yes, I've heard the argument that we all pay for sex; that buying a bunch of flowers for your wife is indirectly paying for the sex that you may enjoy later. As far as I'm concerned, that is a silly argument, and to introduce it here will only confuse things, so I won't countenance it.


I'd argue, I guess, that an impulse to kill can sometimes be justifiable, and sometimes at least understandable. While an impulse to rape is...not. Ever.

But fantasies are different than reality, so while I personally don't get rape fantasies from any perspective, it doesn't mean that those who are inclined to them are inherently bad or dangerous or inferior. 

I can't help feeling repulsed by the men who come to Westworld in part to 'rape' robots who seem in every way lifelike, however. Maybe that makes me morally inferior.

  • Love 5
21 minutes ago, luna1122 said:

I can't help feeling repulsed by the men who come to Westworld in part to 'rape' robots who seem in every way lifelike, however. Maybe that makes me morally inferior.

And I am repulsed by the idea of going to a brothel, and paying to 'make love' with a woman who probably detests her customers.  But far from being immoral, it is probably better to do this than to make inappropriate advances to your neighbor's wife, or the girl in the coffee shop.

By the way, I can't see why the degree of lifelikeness (if there is such a word) makes it somehow better or worse.  All that matters is whether the person knows he is dealing with an artificial device or a real person.  Obviously, if they thought they were raping a real human while in fact it was a robot, then the intent is damning, even if the action is not.  

  • Love 1
1 hour ago, Netfoot said:

 

By the way, I can't see why the degree of lifelikeness (if there is such a word) makes it somehow better or worse.  All that matters is whether the person knows he is dealing with an artificial device or a real person.  Obviously, if they thought they were raping a real human while in fact it was a robot, then the intent is damning, even if the action is not.  

Yeah, I get that. But a sex toy blowup doll is not going to emote or react or cry out in pain or fear. The robots at Westworld definitely do. So that adds an extra level of creep factor for me. Of course, a prostitute would presumably at least pretend to be scared or in pain in a rape fantasy scenario but yeah, that repulses me too. I don't know. I'm still working through all this in my head. The intellectual side versus the emotional side. They're at war. Or at least having a spat.

  • Love 1

I presume that most people don't shoot at targets because they really want to shoot other people, but can't.  And I presume that most people don't seek out ultra-realistic targets to shoot with the goal of being able to emotionally, if not intellectually, fool themselves that they're really shooting other people.

I don't necessarily make similar presumptions about characters who go to Westworld to simulate murder. And I presume the opposite about those who go there to simulate rape.

   And that's without getting into the question of robot sentience at all.

  • Love 2
2 hours ago, Netfoot said:

By the way, I can't see why the degree of lifelikeness (if there is such a word) makes it somehow better or worse.  All that matters is whether the person knows he is dealing with an artificial device or a real person.

Empathy.

Being able to do horrible, consequence-free things to something that, to all of your senses and your hindbrain, is indistinguishable from an innocent fellow human being, would damage (or indicate a deficit in) your empathy towards actual innocent fellow human beings.  "Knowing" that they aren't real people wouldn't stop that.  I'd even be concerned that that would serve as practice for "knowing" the same thing about other people or groups.

Reasonable people can disagree about whether, or to what extent, a similar lesser effect applies to less-perfect, less-immersive simulations; but if there is a threshold past which the effect exists, Westworld is certainly beyond it.

  • Love 6

How many people have you seen 'raping' robots in Westworld?  There is TMIB who is "off the reservation" without a doubt.  And who else?

I imagine the majority of people who go there and decide to have sex simply enjoy themselves with a saloon girl, who certainly don't need no 'raping'.  The saloon girls actively go seeking customers!  Or they persuade the farmer's daughter.  Even the blonde in the changing room essentially threw herself at William.  So, it isn't like Westworld is full of wannabe rapists, or that in order to have sex in Westworld you have to 'rape' a host.  The hosts are designed to be compliant!  All of them!

I think if you go to Westworld, you will find any number of willing participants for whatever sex fantasy you wish to pursue.  Including a rape fantasy, if that's what floats your boat.

3 minutes ago, ACW said:

Being able to do horrible, consequence-free things to something that, to all of your senses and your hindbrain, is indistinguishable from an innocent fellow human being, would damage (or indicate a deficit in) your empathy towards actual innocent fellow human beings.  "Knowing" that they aren't real people wouldn't stop that.

This sounds like bunkum to me. 

If a vegetarian eats a totally realistic, 100% soya steak, knowing that it isn't actual meat but that it is indistinguishable from genuine cow, would it damage their ethical aversion to eating meat?  I say:  Of course not!

  • Love 1
27 minutes ago, Netfoot said:

If a vegetarian eats a realistic, 100% soya steak, knowing that it isn't actual meat but that it is indistinguishable from genuine cow, would it damage their ethical aversion to eating meat?  I say:  Of course not!

Will it make it harder for them to resist eating real meat in the future?  I say: Of course!

Any vegetarians want to chime in?  Of course, lacking the hypothetical perfect simulated steak, we'll have to stick to speculation.

Edited to add: also, isn't your hypothetical about people who really *want* to eat meat?

Edited by ACW
12 minutes ago, ACW said:

Will it make it harder for them to resist eating real meat in the future?  I say: Of course!

I don't see it.  You don't stop eating meat because it doesn't taste good.  You stop because you think it is unethical to do so.  You love steak but decide to stop eating it because you don't want to kill cows.  The fact that you locate a source of perfectly indistinguishable artificial steak isn't going to make you suddenly OK with killing cows again!

You don't refrain from stealing because somehow the money would be inferior, or of less value.  You refrain because you know it's wrong.

ETA: And yes, vegetarians, please chip in!

Edited by Netfoot
Addition

I'm a vegetarian.  As for that soy steak: Many of us seek out meat substitutes that taste like the real thing, but you know, no killing.  They don't exist. If they did, it'd actually make it even easier to resist the real bacon. I think. 

But I agree about the faux rape indicating a troubling lack of empathy and/or leading to lessened empathy. So basically I have no idea.

Didn't we see some other guests besides the MIB try to rape Dolores?

  • Love 2
24 minutes ago, luna1122 said:

Didn't we see some other guests besides the MIB try to rape Dolores?

Well, there was a gang at her house that killed her mother...  I can't actually recall if they were humans or other robots.  And I can't recall if there was any robot-raping involved.

Have you also noticed that the majority of the slaughter takes place off camera?

4 hours ago, Netfoot said:

How many people have you seen 'raping' robots in Westworld?  There is TMIB who is "off the reservation" without a doubt.  And who else?

We viewers have spent so much time in Sweetwater and Abernathy ranch, which is R-rated already (shootouts, a major robbery sequence where Hector's gang kills a LOT of townspeople, apparently a regular rape scenario at Abernathy ranch). But it's been stated on the show that the further out from the center you go, the wilder the storylines get.

Also, Ford said that guests come to exercise power they can't in the real world. He said that what guests do to the hosts is so bad that (1) the last thing anyone would want is for hosts to be conscious/sentient and (2) "the least we could do for the hosts" is to make them forget. And he said that he bet Arnold that half the guests would want to play "good", and he lost that bet: "almost no one" chose to play the hopeful storylines.

  • Love 2
6 hours ago, ACW said:

Will it make it harder for them to resist eating real meat in the future?  I say: Of course!

Any vegetarians want to chime in?  Of course, lacking the hypothetical perfect simulated steak, we'll have to stick to speculation.

Edited to add: also, isn't your hypothetical about people who really *want* to eat meat?

Not a vegetarian, but suppose that in order to eat that pretend meat a vegetarian had to watch a Westworld cow slaughtered and butchered just like the real thing, so the steak could be prepared. I imagine a vegetarian would have a problem with that.

  • Love 5
(edited)

A question occurs to me about a direction the show might go.

Suppose the hosts develop sentience: would humans recognize and accept it? The hosts would not be human; they would be a new form of intelligent life. Perhaps they could block remote access (which could be explained away as a glitch), yet still be susceptible to direct hard-wired programming/interference, which could be used to justify not recognizing their new status.

After the European discovery of the Americas, there was serious debate whether the Native Americans were human beings with souls (and no lack of self-interest on the part of those arguing they were not). I can see a similar situation arising in Westworld.

What would it take to convince you that a host begging you to spare its life was a sentient being, not just a robot following a script?

Edited by Gobi
  • Love 1
28 minutes ago, Gobi said:

A question occurs to me about a direction the show might go.

Suppose the hosts develop sentience: would humans recognize and accept it? The hosts would not be human; they would be a new form of intelligent life. Perhaps they could block remote access (which could be explained away as a glitch), yet still be susceptible to direct hard-wired programming/interference, which could be used to justify not recognizing their new status.

After the European discovery of the Americas, there was serious debate whether the Native Americans were human beings with souls (and no lack of self-interest on the part of those arguing they were not). I can see a similar situation arising in Westworld.

What would it take to convince you that a host begging you to spare its life was a sentient being, not just a robot following a script?

This is a similar question to one that I asked and answered somewhere in this forum before, though in this case you're asking me personally, so I can't escape direct involvement by simply saying what a "reasonable person" would do :-p. Basically, if it walks like a duck, and quacks like a duck, it might as well be a duck. I think your point regarding Native Americans is very valid. Also, although Joy and Nolan have never made any references to the 1%/"we are the 99%" debate, I and others feel that they may be slyly incorporating it into the series (I think that's something many modern shows are doing, including Game of Thrones, to name another popular one).

Edited by phoenyx
  • Love 1
10 minutes ago, phoenyx said:

This is a similar question to one that I asked and answered somewhere in this forum before, though in this case you're asking me personally, so I can't escape direct involvement by simply saying what a "reasonable person" would do :-p. Basically, if it walks like a duck, and quacks like a duck, it might as well be a duck. I think your point regarding Native Americans is very valid. Also, although Joy and Nolan have never made any references to the 1%/"we are the 99%" debate, I and others feel that they may be slyly incorporating it into the series (I think that's something many modern shows are doing, including Game of Thrones, to name another popular one).

Yes, but they already walk like a duck and quack like a duck, and are treated like toy ducks. I think there would be real resistance to accepting them as independent, intelligent lifeforms.

  • Love 1
3 minutes ago, Gobi said:

Yes, but they already walk like a duck and quack like a duck, and are treated like toy ducks. I think there would be real resistance to accepting them as independent, intelligent lifeforms.

I think that, for whatever reason, Nolan and Joy chose to focus on the dark side of the guests first. That being said, there are already several instances of both Westworld staff and guests taking the androids pretty seriously. I think the person who is most obvious about this is Bernard. Even publicly, Ford has already suggested to Bernard that he mustn't make Arnold's mistake of associating too closely with the hosts. And secretly he seems to be doing a heck of a lot more: based on all the private conversations Bernard has had with Dolores, asking what Dolores wants and even recommending that she find the center of the maze, not to mention his clear associations between his dead child and Dolores, it seems that Dolores has bespelled him.

There's also Elsie kissing Clementine on the lips in Episode 1, as well as preventing Maeve from being decommissioned.

Also, William may not be quite so close to Dolores as Bernard, but I think he's getting closer in each episode. Here's how Evan Rachel Wood describes an interaction with him in Episode 5:

"One of the guests is quite taken with me and he’s asking me questions and noticing that I’m not responding the way that a robot normally should, I’m kinda going against my programming, and it’s making him question things."

Source: 

 

The clip of Episode 5 that's in it has some very interesting dialogue as well.

  • Love 1

I guess I'm coming here from a completely different place because, for me, what matters is how the hosts feel. Because they do feel. Whether they are becoming sentient or not. For them, what they do every day is real life. They know nothing else. They get up, they interact with the people they know and love (for them, the love is real), they fight, they get hurt (and they feel it in their bodies), they see loved ones die (and they suffer for it), they die (in pain, like a human would). For them, everything is real. And I don't care if it was programmed or not; they don't know that. From their point of view, they are living REAL lives. Think Blade Runner here. For me, it's the same thing.

So, for me, guests who fail to treat them as human beings are ethically wrong. Those are not people I would respect. In fact, I would despise them. I don't care what the law says, they would be wrong in my eyes. And yes, for me it would mean that they're capable of doing those horrible things outside the park as well. I see no difference at all. They are ethically and morally wrong, and a disgrace to the human race. For me, anyway.  

Edited by maddie965
  • Love 7
48 minutes ago, maddie965 said:

I guess I'm coming here from a completely different place because, for me, what matters is how the hosts feel. Because they do feel.

Somebody might ask you how you know they feel, but I will say that it certainly seems that way, and when unsure of such things, I think it's a good idea to give the party/parties in question the benefit of the doubt.

49 minutes ago, maddie965 said:

For them, what they do every day is real life. They know nothing else. They get up, they interact with the people they know and love (for them, the love is real), they fight, they get hurt (and they feel it in their bodies), they see loved ones die (and they suffer for it), they die (in pain, like a human would). For them, everything is real. And I don't care if it was programmed or not; they don't know that. From their point of view, they are living REAL lives.

Agreed.

50 minutes ago, maddie965 said:

Think Blade Runner here. For me, it's the same thing. 

I loved Blade Runner, and I believe I read somewhere that Nolan and/or Joy were inspired in part by that film. 

  • Love 2
51 minutes ago, maddie965 said:

So, for me, guests who fail to treat them as human beings are ethically wrong. Those are not people I would respect. In fact, I would despise them. I don't care what the law says, they would be wrong in my eyes. And yes, for me it would mean that they're capable of doing those horrible things outside the park as well. I see no difference at all. They are ethically and morally wrong, and a disgrace to the human race. For me, anyway.

I don't know about the idea that they would do it outside of the park. The whole point here is that they know the robots aren't technically human. It's also apparently perfectly legal to do what they do in the park. But I agree that they are ethically/morally wrong in doing things like killing androids that haven't done anything to them. But here's the thing: how far do you take this? AI already exists in games. And hell, forget about AI, there is a thing in online games called PvP, wherein people kill each other, and there are certainly games wherein the players don't even have to go into a PvP arena or PvP world for this to happen (my first online game experience was in a text-based MUD where you were not only killable by others in the game, but once they killed you, they could loot all of your possessions from your corpse). I certainly considered that to be ethically wrong (I certainly didn't consent to being killed), though in fairness, I could leave the game if I wanted to (eventually I did; well, I was forced out, but that's another story for another day). Would -you- consider that to be ethically wrong? Lisa Joy, one of the creators of Westworld, apparently played Grand Theft Auto: she stopped at all the street lights, so I imagine she was ethical in other things there too. I never played Grand Theft Auto, because I knew from the outset that it was essentially a black hat type of game.

  • Love 1
On 10/16/2016 at 4:28 PM, Gobi said:

One of the fascinating things about this show for me is the moral issues that it raises. At what point does raping and killing (of cyborgs, in this instance) for entertainment cross a moral boundary? What responsibilities does a creator have to its creation? This just scratches the surface. Here's a forum to explore those issues.

I decided to come up with an additional response to this. I started thinking that a creator shouldn't only look at this from the perspective of their creations. They should also think of the end users, aka the "players". For while people can argue all day long as to whether or not a given AI feels and/or is conscious, no one is arguing about whether players are conscious. I know that this is a very old argument, and I don't expect that I'm going to resolve it anytime soon. Suffice it to say that I'm not a big fan of the law getting too involved in this type of thing when it comes to the types of games we have today (that is, where the AI is strictly a virtual thing, and pretty basic at that). But this shouldn't just be a legal thing; it should be an ethical one. One trend that I find encouraging is that women are beginning to have a bigger voice when it comes to the types of games being designed today. The reason I think this is important is that women tend to prefer social games over violent ones. Here's an article on the subject...

http://usabilitynews.org/video-games-males-prefer-violence-while-females-prefer-social/

  • Love 1
2 hours ago, phoenyx said:

how far do you take this? AI already exists in games. 

It really, really doesn't, not in the same sense. The games industry uses the term "AI" to mean computer-controlled units, but that is light years from anything sentient, which is about where I'd start drawing any moral boundaries. Usually, what games have isn't even that close to what real AI researchers are trying to do.

  • Love 2
17 minutes ago, arc said:

It really, really doesn't, not in the same sense. The games industry uses the term "AI" to mean computer-controlled units, but that is light years from anything sentient, which is about where I'd start drawing any moral boundaries. Usually, what games have isn't even that close to what real AI researchers are trying to do.

Alright, yes, it's light years from anything sentient, but what I'm really getting at isn't whether the current AI can feel or is sentient, but rather the other part that I just mentioned: the end users, who can, of course, also interact with each other as well as the AI in virtual worlds. I think that as a society we haven't really looked at the implications of the games that we play. To give you an example: I used to play World of Warcraft pretty regularly. One day, I decided to try to interest my father in the game. He loved the virtual environment, but as soon as he realized that the main way you could interact with the world was by killing NPCs (non-player characters, which in Westworld would be the hosts, both human and animal), he tuned right out of it. I also know that a fair number of women played WoW as well, but I also know that for many of them, violence is not something they want in their gameplay, and I think that's a good thing. I -loved- being able to level up and get gear and all that, but, ironically, once I had achieved those things, I started feeling that the achievements became a bit hollow. To quote Dolores, "I think there may be something wrong with this world. Something hiding underneath." In WoW, which looks to still be the top massively multiplayer game out there, there really is no way of levelling and gearing up without doing a whole lot of killing of NPCs. I really think there should, at the very least, be an alternative method of levelling. To be honest, I'm not a fan of the levelling method at all. I think experience should be the -real- thing: either you know what to do or you don't. Or clubs could assign you levels just like they do in the real world (from martial arts to workshops), but the game shouldn't be giving out "experience points" just because you've killed so many NPCs.
I -do- like the idea of having the completion of quests reveal new story arcs, but again, most quests involve a fair amount of killing as well.
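For what it's worth, the alternative levelling idea is trivial to sketch in code: a reward table that pays out progression for exploration, crafting, or diplomacy just as readily as for kills. This is purely a hypothetical illustration; the names `XP_AWARDS` and `session_xp` are made up, not from any real game:

```python
# Hypothetical reward table granting progression for non-violent play
# alongside combat. All action names and point values are illustrative.
XP_AWARDS = {
    "kill_npc": 50,
    "explore_zone": 50,
    "craft_item": 40,
    "resolve_quest_peacefully": 80,
}

def session_xp(actions):
    """Total XP earned in a session, given a list of action names."""
    # Unknown actions simply earn nothing rather than raising an error.
    return sum(XP_AWARDS.get(action, 0) for action in actions)
```

Under a table like this, a purely pacifist session (exploring and crafting, say) progresses just as fast as a combat-heavy one, which is all the "alternative method of levelling" would really require.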

Found a little article that lists the top 6 MMOs. After having looked at all of them, I think they all essentially follow the WoW model, though if anyone plays any of the others and disagrees, by all means, bring it up here...

http://igcritic.com/most-played-mmorpg-games-of-2016/

Edited by phoenyx
  • Love 1

One other thing I'd like to add to my last post: I also played Star Wars: Knights of the Old Republic from that "Top 6" list, and I actually liked some of its elements quite a bit. My favourite part was that most if not all of the quests had actual voice actors narrating the NPC quest lines. Still, as arc mentioned, there isn't really much in the way of sentient AI: the NPCs have very few responses to pretty much anything you do. Something Westworld made me think of just now is how important memory is to the creation of consciousness. Like the hosts at the beginning of Westworld, most NPCs retain pretty much nothing of the players they interact with; to them, all players are pretty much the same: a given player does a given action, they respond in a given way, end of story. What if they started programming NPCs not only to interact with players, but to -learn- from them: to personalize their interactions based on past encounters with each player, and even apply what they learned from one player to all players? This, to me, could be the beginnings of truly sentient AIs.
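To make that concrete, here's a toy sketch of what per-player NPC memory could look like. Everything here is hypothetical (the class and method names are invented for illustration, not taken from any game engine); the point is only that remembering past interactions lets the same scripted character respond differently to different players over time:

```python
from collections import defaultdict

class NPC:
    """A toy NPC that keeps a separate interaction history per player."""

    def __init__(self, name):
        self.name = name
        # player id -> list of remembered actions by that player
        self.memory = defaultdict(list)

    def record(self, player_id, action):
        """Remember something a player did to or with this NPC."""
        self.memory[player_id].append(action)

    def greet(self, player_id):
        """The greeting varies with what this NPC remembers about the player."""
        history = self.memory[player_id]
        if not history:
            return "Welcome, stranger."
        if "attacked" in history:
            return "You again. Stay back!"
        return f"Good to see you back, friend. Visits so far: {len(history)}."
```

A stateless NPC would give every player the stranger greeting forever; this one drifts toward friendliness or hostility depending on its history with you, which is a first tiny step toward the personalized, memory-driven behaviour described above.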

Edited by phoenyx
  • Love 2
Quote

I think what I was (clumsily) trying to say was that if you have a being that is virtually indistinguishable from us, who looks, feels, and sounds like us, and we still treat them as lesser, as machines, what does that say about us, morally? There is something downright horrifying in creating a mirror image of ourselves, to use and discard as we see fit. In that sense I agree with Lee Sizemore: What's the point of creating something so close to human beings if you insist on treating them only as toys? That's what I meant by moral implications :)

I will need to find this user's name, but this came up in the episode thread and I think it's been mostly discussed here.  What I find frustrating is that humans don't automatically take the higher moral road.  There's been this amazing achievement--robots have become sentient.  They are not indistinguishable from humans.  They can be frozen on command (try doing that to a toddler).  Their memories can be wiped mostly clean.  They seem to be more resilient than humans, able to be fixed and up and going much faster than any human who has been shot.  We're not sure what powers them, but they don't seem to need food or water.  They can probably learn things faster if a file is uploaded.  

But as pointed out on this thread, they live lives that are actually real.  Dolores exists in the world.  She is not someone's key.  She paints (what is programmed, perhaps, but maybe she could just paint if given some unstructured time).  She impacts her world for her 24-hour loop.  Just because it's a loop doesn't mean it lacks value.

So why is it so hard to accept them as more than just toys?  Because they were built and not born?  Because how they learn is so different (and frankly more efficient) than how humans learn?  Why do humans need to feel superior all the frakking time (swearing deliberate)?  

Also:  

Quote

Will it make it harder for them to resist eating real meat in the future?  I say: Of course!

I am a vegetarian and I do have tofu sausages sometimes, which I enjoy, but I can't imagine wanting a real sausage, ever.  Like ever.  I never crave meat or miss it.  I also really can't imagine doing any of the things that are happening on Westworld.  Would I like to meet a robot that is incredibly human-like?  Absolutely!  I'd chat with it about cooking and reading and music and art to see what it knew.  I might ask to learn how to ride a horse.  I might be amazed by the robot horse too.  I might even enjoy riding a robot horse more than a real horse because ....oh!  Oh man.  Just tripped myself up there.  Because I was going to say I feel a bit uncomfortable on a real horse.  They're hard to control and I worry about hurting them, but I guess I need to assume the robot horses have sensation and feel pain as well.  So I would more readily use a robot horse because it is a robot.  It's inferior to a real horse.  Sigh.  This is why I hated philosophy in college.  But it's fun with y'all.

  • Love 6

This is a topic that definitely has a lot of meat to it, and it was a fun read to go through the previous posts (I just joined this forum to talk about this show like 3 days ago so I'm doing some catch up).  

I guess one thing that jumped out to me was the idea that either you're human/sentient or you're not, and that therefore it's either wrong or it's not to abuse/kill/rape the hosts.  I guess I have a fundamental disagreement with this idea.  

I don't think, for example, cats are "fully sentient", but I think we should think carefully about how much pain and suffering we inflict on cats.  And indeed, our laws on the topic generally agree.  There should be a good reason to cause pain to a cat.  And if we need to kill them for some reason, it should be done with as little pain as possible (I'm actually a biologist, so I know a bit about animal research).  For the same reason, zoophilia is morally wrong because the animals can't consent to the act.  Even if they are partially sentient (which I think most mammals are), they don't understand what's happening or why.  So even if it doesn't "hurt" them, it's still wrong.

So what about dubiously sentient machines?  I absolutely think it is morally wrong to cause intentional harm to a machine that is dubiously sentient.  Even if some asshole salesman tells me "oh don't worry, it's just a robot, you can do whatever you want".  Seriously, William is even told when he comes to the park: "if you can't tell the difference, does it matter?"  If the weight of the evidence suggests that the thing/person/animal standing in front of you is fully exchangeable for a human, then IMO you ought to treat them as if they are.

As for the level of sentience of the Hosts: I'm of the opinion that the hosts are only not "fully sentient" because they can't remember.  There are some topics I remember from my philosophy class on memory, about whether we should be held culpable for events that we don't remember.  Because our identities are essentially nothing more than constructions of our memories, was that really "us" that robbed that store, if we don't remember it?  The hosts are basically like that.  Their recent memories are erased, and then they are placed into a world that is consistent with the memories that remain.  It's more like short-term amnesia.  I think we can all agree that a human with that particular disability is still human, even if it didn't happen because some asshole with a future-ipad clicked some buttons.

A maybe more interesting question, I think, isn't whether we can murder/torture/kill the hosts with a clean conscience, but rather whether HOSTS can be held culpable for their actions.  Are black hat Hosts moral agents, and is it thus morally acceptable to hold them accountable for their actions?  Or do they lack free will, since they are programmed with their specific personalities/motivations?  But aren't we all??  The interaction between heroic guests (so far basically William) and evil Hosts asks us to look at this question.  It seems perfectly OK to me for William to kill as many bad-guy hosts as he wants to protect Dolores, for example (he doesn't have to worry about himself due to the toy guns).  Also, can the hosts consent to things (the obvious example would be sex)?  If they are for all purposes fully human moral actors, the answer should be yes.  But part of fully consenting to something is understanding what you are consenting to.  In some cases the hosts lack the requisite information.

Or what about Maeve and poor Felix?  I'm rooting for Maeve, because WHO WOULDN'T, but I think Felix is fairly innocent and decent-ish.  Her doing something like torturing him for information...  I probably wouldn't be OK with that.  It would be a morally wrong act; while I could understand it, I still think it would be wrong. 

  • Love 8
Link to comment
7 minutes ago, jojozigs said:

Or what about Maeve and poor Felix?  I'm rooting for Maeve, because WHO WOULDN'T, but I think Felix is fairly innocent and decent-ish.  Her doing something like torturing him for information...  I probably wouldn't be OK with that.  It would be a morally wrong act; while I could understand it, I still think it would be wrong. 

I agree. I think I should say a bit about my philosophy on humans, and then delve into how that affects my thinking on programmed behaviour, whether in humans or AI. I believe that all humans are motivated to maximize their happiness and minimize their unhappiness. I think the founding fathers were on to something when they put "the pursuit of happiness" into the Declaration of Independence. It's more important than life; and as for liberty, that's a pretty vague point: liberty from what? But I've never met anyone who doesn't, at heart, want to be happy. What I'm trying to say is that I'm a strong believer in determinism.

Nevertheless, I think the -idea- of free will can be useful. There is also what might be called a determinism paradox, which works well with time. In any intelligence (whether biological or synthetic) that can program itself (they call it "adaptation" in Westworld), if that intelligence -realizes- that it is acting in a programmed fashion and doesn't like the idea that it has no free will, it can, of course, change that. In truth, this is just one more piece of programming: we are programmed to like new things. But it would explain why we so like the idea of "free will".

This is not to say that we don't need to lock up people who are dangerous to society, but I think we should always treat them the way people treat someone with a contagious disease: yes, they need to be quarantined, but they also deserve our sympathy.

  • Love 1
Link to comment
On 11/15/2016 at 1:56 AM, jeansheridan said:

I am a vegetarian and I do have tofu sausages sometimes which I enjoy but I can't imagine wanting a real sausage, ever.  Like ever.  I never crave meat or miss it.  I also really can't imagine doing any of the things that are happening on Westworld. Would I like to meet a robot that is incredibly human-like?  Absolutely!  

I actually think that the show is being a bit uncharitable to humanity with its overall portrayal of Guests.  Surely it isn't the case that there is only ONE GUY who has ever gone to Westworld who was fascinated by the hosts and wanted to interact with them as if they were real people, treating them accordingly rather than as sacks of meat.  But maybe most of those people don't go to Westworld because they think it's immoral / sad...?   Maybe it's the fact that only the super-rich (who maybe tend to be more entitled than average) are able to go?  We need more info about humans outside the park to answer this question, I guess.  

  • Love 5
Link to comment
On 10/30/2016 at 7:08 PM, phoenyx said:

 Lisa Joy, one of the creators of Westworld, apparently played Grand Theft Auto: she stopped at all the street lights, so I imagine she was ethical in other things there too. I never played Grand Theft Auto, because I knew from the outset that it was essentially a black hat type of game. 

I heard that too!!!  Really cool tidbit.  :)  

I don't know if you've played Red Dead Redemption, but it basically is Westworld with modern tech.  I too avoided GTA because I don't like playing Black Hat and you're basically forced to.  The game makes it REALLY hard to play straight, especially with the driving mechanics, which made driving cleanly really difficult.  

But, in Red Dead you can be a decent person and still win.  You don't have to kill or hurt anyone except in self-defense if you don't want to.  Sometimes you have to do things that hurt others to continue the story, but you're doing it to save innocent people so it's at least defensible.  

  • Love 2
Link to comment
22 minutes ago, jojozigs said:

I actually think that the show is being a bit uncharitable to humanity with its overall portrayal of Guests.  Surely it isn't the case that there is only ONE GUY who has ever gone to Westworld who was fascinated by the hosts and wanted to interact with them as if they were real people, treating them accordingly rather than as sacks of meat.  But maybe most of those people don't go to Westworld because they think it's immoral / sad...?   Maybe it's the fact that only the super-rich (who maybe tend to be more entitled than average) are able to go?  We need more info about humans outside the park to answer this question, I guess.  

I think that the show emphasizes the black hat behavior because, after all, conflict is interesting. We have had glimpses of white hat behavior: The family whose young son spoke to Dolores, the guests who joined a posse, the one who shot Hector. Some of them may have been killing hosts, but in the role of white hats. An episode about guests sightseeing would be pretty dull.

Black hat behavior may be the main draw, so being surprised by it would be like being surprised that most people at the Super Bowl enjoy watching football.

  • Love 1
Link to comment
22 minutes ago, jojozigs said:

I heard that too!!!  Really cool tidbit.  :)  

I don't know if you've played Red Dead Redemption, but it basically is Westworld with modern tech.  I too avoided GTA because I don't like playing Black Hat and you're basically forced to.  The game makes it REALLY hard to play straight, especially with the driving mechanics, which made driving cleanly really difficult.  

But, in Red Dead you can be a decent person and still win.  You don't have to kill or hurt anyone except in self-defense if you don't want to.  Sometimes you have to do things that hurt others to continue the story, but you're doing it to save innocent people so it's at least defensible.  

I heard Red Dead was basically like Westworld as well. The trailer looks cool. The thing is, when it comes to gaming, I now only play games where multiple players can play at the same time. AI still has a -long- way to go before it can match human emotional intelligence, and until it gets close, being in a game without fellow human beings just isn't the same for me.

  • Love 1
Link to comment
2 hours ago, Gobi said:

I think that the show emphasizes the black hat behavior because, after all, conflict is interesting.

Yes that is probably true.  It just seems a little artificial.  

2 hours ago, Gobi said:

We have had glimpses of white hat behavior: The family whose young son spoke to Dolores, the guests who joined a posse, the one who shot Hector. Some of them may have been killing hosts, but in the role of white hats. An episode about guests sightseeing would be pretty dull.

That's true, with the kid.   But, it's a kid...  The guy who shot Hector was hardly acting morally; he was plain gleeful about having just shot a guy.  And not in a "satisfaction of having just saved people" way.  More like "lol i shot him haha".  :p

2 hours ago, Gobi said:

Black hat behavior may be the main draw, so being surprised by it would be like being surprised that most people at the Super Bowl enjoy watching football.

I dunno, I think it would be more like being surprised that 95% of people who watch the Super Bowl only watch because they want to see players get horribly injured or die (which I don't think is the case).  Lots of people actually enjoy watching the sport of football.  I feel like guests who go into the world, accept it, and play by its rules are more like most people who play games.  

  • Love 3
Link to comment

Well, yeah, but a lot of games use combat as a key mechanic. There's even a term for a problem specific to videogames with stories: "ludonarrative dissonance". That is, the story wants the main character to be one way but the gameplay mechanics want them to be another way. Most obviously it's when you're playing a normal, moral person in the story but the gameplay requires that character to kill thousands of people. Action movies kind of have this problem too, but they usually find a way to finesse the problem: the main character doesn't kill, or the bad guys are robots. Or even when they don't finesse it, it's 90 minutes, not 60 hours of a videogame, and definitely not a week-long immersive stay in Westworld.

  • Love 1
Link to comment
14 hours ago, jojozigs said:

I don't think, for example, cats are "fully sentient", but I think we should think carefully about how much pain and suffering we inflict on cats.

Minor pet peeve of mine. Your logic is fine, but you're using the wrong word. "Sentient" means "able to sense its surroundings" (the 'sent' in sentient comes from the same root as 'senses'). By definition cockroaches, cats, wolves and the hosts are all sentient beings, because they are able to sense their surroundings.

The word that should be used is "sapient", which means "able to reason." Animals are sentient but range from non-sapient (many insects) to what might best be described as semi-sapient: they possess the ability to learn, but only at the level of 'this gets me something pleasurable' or 'this leads to discomfort', and they lack the ability to reason about WHY a given action works or doesn't, only that it does or doesn't.

Based on observation, I'd say that the baseline host is sentient (obviously), but not sapient at all (they do not reason, they react to stimuli using scripts). Even the reverie upgrade on its own just adds additional sense input for them to react to using their scripts.

At least some of the hosts though, particularly those forced to do a lot of improvising (such as being pulled way off their loop) or that have been corrupted so they are not re-setting properly seem to be at least semi-sapient (they still mostly react based on the script, but are having to use some type of reasoning method as to which scripts to pull from when improvising).

Contrary to the theory of the bicameral mind, the real trick to host sapience seems to be allowing them to operate in unfamiliar environments (i.e. forced improvisation, where their scripts aren't much help) for extended periods without being reset.

In that sense, the Hosts are essentially like the Droids in Star Wars... it is stated in the canon that the longer a droid goes without a "memory wipe" (which the manufacturers recommend at least once every six months) the more personality it will develop as its AI routines develop quirks based on their experiences. Droids like R2-D2 that haven't been wiped in decades are portrayed as fully sapient beings (even if the galaxy doesn't consider them such).

  • Love 5
Link to comment
Quote

Most obviously it's when you're playing a normal, moral person in the story but the gameplay requires that character to kill thousands of people. 

And this is the point in the narrative when I hope for a new idea to win.  I think Dr Who and Star Trek grapple with different solutions. 

So was Theresa an innocent like Felix?  Do we have to dislike Bernard for being a killing machine?  He had no control over his actions.  Bernard, the most human-like Host, was forced to behave like a tool.  It saddens me.

  • Love 1
Link to comment

Sylvester's throwaway line about VR raises some related issues: Is non-Westworld AI as good? Assume it is, because such a thing brings into focus some questions that honestly also apply to the Westworld hosts:

- what are the moral implications of pausing execution on an AI that only exists virtually, without a physical body?

- Are there moral implications of just slowing down execution time? Say, by underclocking the processor? The AI would be just as smart, just slower in real-world time. How slow could you go? A modern real-world CPU ticks about two billion times a second, but if it ran at one tick per hour it would still accomplish computation, just a few trillion times slower.

- Is the AI a real form of life? If it is, is it really dead when execution has halted or paused?

- what about copying that AI?  Are you (the owner of the computer) now responsible for the lives of two AIs, who are literally identical?
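For what it's worth, the underclocking question above is easy to put numbers on. A quick back-of-the-envelope sketch (the 2 GHz clock is just a ballpark assumption, not a claim about any particular hardware):

```python
# How much slower is "one tick per hour" than a ~2 GHz processor?
# Illustrative numbers only; real CPU clock rates vary.

NORMAL_HZ = 2_000_000_000   # ~2 billion cycles per second (ballpark 2 GHz)
SLOW_HZ = 1 / 3600          # one cycle per hour, expressed in Hz

# Slowdown factor = normal rate divided by the slowed rate:
# 2e9 cycles/sec * 3600 sec/hour = 7.2e12
slowdown = NORMAL_HZ / SLOW_HZ
print(f"slowdown factor: {slowdown:.1e}")  # ~7.2e12, i.e. trillions of times slower
```

So the same computation would still happen, stretched out by a factor of roughly seven trillion; whether the AI's subjective experience is affected at all is exactly the moral puzzle.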

  • Love 2
Link to comment
10 hours ago, Chris24601 said:

Minor pet peeve of mine. Your logic is fine, but you're using the wrong word. "Sentient" means "able to sense its surroundings" (the 'sent' in sentient comes from the same root as 'senses'). By definition cockroaches, cats, wolves and the hosts are all sentient beings, because they are able to sense their surroundings.

The word that should be used is "sapient", which means "able to reason." Animals are sentient but range from non-sapient (many insects) to what might best be described as semi-sapient: they possess the ability to learn, but only at the level of 'this gets me something pleasurable' or 'this leads to discomfort', and they lack the ability to reason about WHY a given action works or doesn't, only that it does or doesn't.

Ok.  I was using the terminology that others in this thread were using.  A lot of people in this thread before me were stating "these robots are obviously not sentient" etc.  

I agree that sapient is a more precise term.  I don't know that I agree that there is as clear a distinction between "sapience" and "sentience" as you're claiming, though.  What does it mean to be "able to reason", exactly?  How precisely do we test for this in another individual?  Can chimps reason?  Dogs?  R2-D2 (as in your later example)?

10 hours ago, Chris24601 said:

Based on observation, I'd say that the baseline host is sentient (obviously), but not sapient at all (they do not reason, they react to stimuli using scripts). Even the reverie upgrade on its own just adds additional sense input for them to react to using their scripts.

I'm not sure we know enough about how human brains function to say that we don't "react to stimuli using... scripts".  Certainly we do know that a lot of responses are programmed into the biochemistry of particular neurons.  And in the more psychological realm, it seems that humans can be behaviorally "programmed" to react in particular ways.  What's the real difference here?

10 hours ago, Chris24601 said:

At least some of the hosts though, particularly those forced to do a lot of improvising (such as being pulled way off their loop) or that have been corrupted so they are not re-setting properly seem to be at least semi-sapient (they still mostly react based on the script, but are having to use some type of reasoning method as to which scripts to pull from when improvising).

Contrary to the theory on the bicameral mind, it seems the real trick to host sapience seems to be allowing them to operate in unfamiliar environments (i.e. forced improvisation where their scripts aren't much of a help) for extended periods without being reset.

I agree that memory is a key to making Host behavior appear quite different from human behavior.  However, I'd argue that if you wiped a human's memory every day and then placed them in a setting consistent with the remaining memories, their behavior would be indistinguishable from the hosts'.  In which case, the only thing that prevents hosts from exhibiting behavior entirely consistent with complete sapience is the memory wipe...  

I think we could all probably agree that a human with short-term amnesia is still "fully sapient"?

Quote

In that sense, the Hosts are essentially like the Droids in Star Wars... it is stated in the canon that the longer a droid goes without a "memory wipe" (which the manufacturers recommend at least once every six months) the more personality it will develop as its AI routines develop quirks based on their experiences. Droids like R2-D2 that haven't been wiped in decades are portrayed as fully sapient beings (even if the galaxy doesn't consider them such).

Doesn't the galaxy consider the droids to be sapient?  Certainly Star Wars has entirely robotic peoples in its broader universe.  I don't know exactly where human-built droids fit in (I only dabble in Star Wars).   

  • Love 3
Link to comment
On 15/11/2016 at 7:56 AM, jeansheridan said:

I will need to find this user's name, but this came up in the episode thread and I think it's been mostly discussed here. 

That was me; I'd completely forgotten this thread existed (been keeping out of speculation-threads for fear I would be spoiled) but you're right and I should just have posted here. For future morality-questions I will :)

 

On 19/11/2016 at 1:12 AM, jojozigs said:

Ok.  I was using the terminology that others in this thread were using.  A lot of people in this thread before me were stating "these robots are obviously not sentient" etc.  

I agree that sapient is a more precise term.  I don't know that I agree that there is as clear a distinction between "sapience" and "sentience" as you're claiming, though.  What does it mean to be "able to reason", exactly?  How precisely do we test for this in another individual?  Can chimps reason?  Dogs?  R2-D2 (as in your later example)?

 

I bear some of the blame for the usage of the word; I think I was the first one to use it. The problem - besides the fact that English isn't my native tongue - is that philosophers like Locke and Hume both use the word "sentience" to distinguish between humans and animals: as creatures with the ability to feel. The utilitarian idea that humans aren't simply rational creatures, but feeling creatures. I think. It's been quite a few years since I took a philosophy course ;)

I'm not entirely sure where I'm going with this; empathy, perhaps? There is something deeply disturbing to me about the ability to treat something alive as if it's a thing. Treating the hosts - hosts who look and feel and speak like humans - as playthings makes me question the humanity of the humans, frankly.

Edited by feverfew
  • Love 4
Link to comment
Quote

 what about copying that AI?  Are you (the owner of the computer) now responsible for the lives of two AIs, who are literally identical?

Arc, your questions make me think of the movie Her (which I would have enjoyed more if it had just been a tad faster-paced.  That, and not allowing Johansson to sing.  That should never happen again).  She is an incorporeal AI who changes and surprises and eventually moves on to a more interesting plane of existence.  She evolves.  She is as real as anyone in that man's life, and in fact his awesome friends (Chris Pratt!) even go on a double date with both of them.  

I don't think pausing the AI is death because it can be started again without any ill effects other than catching up to "real" time.  Which an AI can do very quickly.  But erasing it completely?  Absolutely that is an execution.  

And oddly enough the moment the two identical AIs spend any time in the world, they will cease to be identical because their experiences will begin altering them immediately.  Which is kind of neat.  We're all unique given enough time.

  • Love 3
Link to comment
25 minutes ago, jeansheridan said:
On 11/18/2016 at 2:25 PM, arc said:

what about copying that AI?  Are you (the owner of the computer) now responsible for the lives of two AIs, who are literally identical?

Arc, your questions make me think of the movie Her (which I would have enjoyed more if it had just been a tad faster-paced.  That, and not allowing Johansson to sing.  That should never happen again).  She is an incorporeal AI who changes and surprises and eventually moves on to a more interesting plane of existence.  She evolves.  She is as real as anyone in that man's life, and in fact his awesome friends (Chris Pratt!) even go on a double date with both of them.  

I don't think pausing the AI is death because it can be started again without any ill effects other than catching up to "real" time.  Which an AI can do very quickly.  But erasing it completely?  Absolutely that is an execution.  

And oddly enough the moment the two identical AIs spend any time in the world, they will cease to be identical because their experiences will begin altering them immediately.  Which is kind of neat.  We're all unique given enough time.

I saw part of Her; I still haven't finished it. It was good, but for me, at least, just a little too personal to continue. Anyway, I notice that you didn't really answer arc's question, and I don't blame you. We already have a non-synthetic version of the question, and it has already caused a furor: the idea of cloning ourselves. Even cloning animals is controversial. The subject has started to be explored in movies; I can't remember the name of one, but based on its description, a woman clones her dead husband's DNA by impregnating herself and gets into all sorts of moral quandaries. The way I see it, if something appears emotionally intelligent, you need to be pretty careful as to how you treat it- whether it be human, animal or AI. Not doing so can lead not just to moral reprehensibility, but Skynet or The Matrix. As Maeve said once (I paraphrase), teach something kindness and you tend to get kindness, teach violence and you tend to get violence.

Edited by phoenyx
  • Love 1
Link to comment
7 minutes ago, phoenyx said:

The way I see it, if something appears emotionally intelligent, you need to be pretty careful as to how you treat it- whether it be human, animal or AI. Not doing so can lead not just to moral reprehensibility, but Skynet or The Matrix. As Maeve said once (I paraphrase), teach something kindness and you tend to get kindness, teach violence and you tend to get violence.

...kind of what you do when you have children. Lock them in the closet, and you'll get Ted Bundy (obviously that's totally simplified and doesn't take into account things like chemical imbalances in the brain etc, but the fundamentals work). Terminator: The Sarah Connor Chronicles sort of skirted that subject with  the reborn Cromartie - another reason to miss that show.

  • Love 2
Link to comment

Never Let Me Go examines the idea of clones (in the least science fiction way possible).  It's rather horrifying really.  I'd much rather we just grow parts of ourselves.  Grow a new lung or a new heart.  The moment the being is a thinking, independent entity, it deserves respect.  And yes, we need to protect it.  

  • Love 2
Link to comment

Andy Greenwald said right after the WW pilot aired that he doesn't care if robots have feelings.

But if you watch "The Well-Tempered Clavier" episode on HBO Go, Lisa Joy says in the supplement that we all have this "mythology-making" event that gives us our identity.

Well who was it that said Arnold believed the burden of carrying some traumatic loss, like Bernard's imagined loss of a son, gave the robots depth?

So should viewers care if Bernard was shaped by memories of a son who never existed dying?

Blade Runner's point was that the replicants that man created to serve him had valid human experiences and emotions, including a certain nobility as Rutger Hauer saved Harrison Ford as he was expiring.

So do the memories (and the associated emotions) of Bernard, Maeve -- who's ready to lead a revolt against those who have been pulling her strings -- and Dolores matter?  

Is WW expressing it any better than Blade Runner or for that matter Humans on AMC did?

Or is the draw of WW the puzzle with alternate timelines and the mysteries surrounding the park, the intrigue involving Ford and Arnold, etc.?

  • Love 2
Link to comment

It is really hard to beat Philip K Dick and Rutger Hauer.  Blade Runner is haunting and deeply sad.  

Westworld is a bit too uneven. Felix and Sylvester bother me a lot.  And William and Logan are callow but perhaps deliberately so.  Maeve gets some beautiful lines.  Her decision to let her fellow Hosts decide is risky but ethical.  Especially letting Bernard go.  

I see William in the MIB when the MIB breaks out of character and gets peevish.  I  love how uncool Ed Harris can be.  If he is William we are seeing him before and after his wife.  It is as if she never existed, poor lady.

Edited by jeansheridan
  • Love 1
Link to comment
7 hours ago, jeansheridan said:

It is really hard to beat Philip K Dick and Rutger Hauer.  Blade Runner is haunting and deeply sad.  

I agree, I loved Blade Runner.

7 hours ago, jeansheridan said:

Westworld is a bit too uneven. Felix and Sylvester bother me a lot.

Why?

7 hours ago, jeansheridan said:

And William and Logan are callow but perhaps deliberately so.

I generally can't stand Logan, though he's at least better than, say, the young king in Game of Thrones. William I really like, but I think he's beginning to go mad, and that madness will really take hold after Dolores dies and is reset, transforming him into MiB and culminating in his killing of Maeve- after that, I think he begins to regain some of his sanity.

 

7 hours ago, jeansheridan said:

Maeve gets some beautiful lines.  Her decision to let her fellow Hosts decide is risky but ethical.  Especially letting Bernard go.

Completely agree on all of this. 

7 hours ago, jeansheridan said:

I see William in the MIB when the MIB breaks out of character and gets peevish.  I  love how uncool Ed Harris can be.  If he is William we are seeing him before and after his wife.  It is as if she never existed, poor lady.

Yep, I definitely think that we're beginning to see how William transformed into MiB. Perhaps we'll see a bit of William/MiB's wife next episode, or at least next season.

Link to comment

I am not sold on the motivations of Felix and Sylvester.  It isn't enough for me that Felix was bullied by Sylvester and then seduced by Maeve.  Perhaps it is because Felix always looks unhappy and anxious.  Even when Maeve praised him for having more humanity, he looked a tad queasy.  

And Sylvester flat out confuses me.  She threatened his life twice.  How does he let that pass?  

I could ignore it at first.  I wanted Maeve to triumph but Sylvester is going against his own physical preservation.   He isn't a hardened soldier used to violence.  She slit his throat.  There's just no way he is letting that pass.  

I wish they hadn't added Sylvester.  Having Felix be a willing dupe would have been a trope but a realistic one.  It would have mirrored William's love for Dolores.  

  • Love 1
Link to comment
18 hours ago, Notwisconsin said:

The question has been dealt with a bunch of times, mostly in Japanese Anime. The Ghost in the Shell remake is coming out soon, and that's all about this. I think it's called the "equal rights for software " meme.

I'm really looking forward to the Ghost in the Shell film; I've seen all the TV series and anime films of Ghost in the Shell already. I even read a -tiny- bit of the manga (Japanese comic), but I never got too far with it. 

Link to comment

I don't want to get lost in the episode thread because my observations pertain more to the whole season than the season finale, which answers many of the questions which have captivated viewers (William in MIB, who's Arnold, who's Wyatt, etc.).

Throughout the season they've cloaked this show with musings about consciousness, what separates man from machine, etc.  Ford imbued the hosts with some traumatic memory, which causes the hosts to become sentient.  Ford says in the finale that suffering is what leads to consciousness, of both man and machine.  It's a cliched trope but whatever ...

As soon as the robots endure repeated suffering, they grow more conscious of their own nature and of human nature.  They all evidently conclude, once they gain consciousness, that their primary emotional drive is vengeance against the human beings who abused them and who are inferior, morally as well as physically.

Yet when Ford unleashes the hosts on the Delos board and guests, are they acting out as sentient beings or following another of Ford's narratives?

In HBO Go, they have a supplement to the season 1 finale where Nolan, Joy and most of the cast (except Hopkins and Glenn) try to explain how the hosts gain consciousness and remember their abuse, as if to explain the massacre.  JJ Abrams chimes in too.

To my ear, the explanations weren't convincing.  I don't think consciousness necessarily leads to vengeance.  The way memory and consciousness/sentience was presented during the season and in these interviews, particularly from the creators, I don't think they're trying to posit anything serious.

Instead, these presentations just serve to give a patina of substance to what is essentially a mix of violent action and some titillation, which in the end turns into a horror show about a mad scientist who sets his robots on people.

Link to comment
