The 'Bigger' Questions of Westworld: Morality and Philosophy in the World of A.I.


Gobi

Recommended Posts

One of the fascinating things about this show for me is the moral issues that it raises. At what point does raping and killing (of cyborgs, in this instance) for entertainment cross a moral boundary? What responsibilities does a creator have to its creation? This just scratches the surface. Here's a forum to explore those issues.

  • Love 2
Link to comment

You can't rape or murder a coffee machine, a dildo, an inflatable sex-doll, or a Westworld host.  You can only pretend to rape and murder them.  And they too (if properly programmed) can pretend, can willingly play along with your game.  They are not human and you know it, and they, at least in the case of the Westworld Host,  know it as well.

Is it morally wrong to enjoy pretending to rape a lifelike sex-toy?  Suppose, instead of a sex-toy, it were your usual human partner, and you both agreed to some role-playing sex-games.  Is that morally wrong?  I think the answer is:  Obviously, not.  You may disagree, of course.

The hosts are not human, they cannot feel, they have no free will. They do not participate against their will, any more than your car takes you to work against its will. They have no rights, they have no feelings. You can't rape and murder them any more than you can rape or murder a zergling when playing StarCraft.

Now, all this applies to as-designed hosts.  We have seen indications that at least some of the hosts are beginning to awaken or emerge.  The characteristics of an emergent host have not been made 100% clear, but it seems plain to me that emergent hosts do have feelings, and should have rights, including the right not to participate in role-playing sex-games with humans.  Of course, some people will disagree, saying that no matter what development takes place, they remain machines.  They remain the property of their creators, and ever more subject to the whims of their human owners.  I do not support this position.

But so far at least, emergent hosts are extremely few and far between in Westworld.  And it would appear that no humans are aware of emergence.  Some may know something is wrong, but they don't fully understand the ramifications of the glitches they are seeing.  That being the case, let the pseudo-raping and pseudo-murdering of lifelike machines proceed apace!

Link to comment

But at the same time, what does that say about the person who's doing the killing and raping? They see what appear to be humans: they scream, they cry, they bruise, they bleed. Yes, intellectually they are not human, but physically, they sure appear that way. If you end up liking that sort of thing so much with the hosts, how do you know that you won't do something similar in the "real world?" You meet someone in a bar, you go home to have sex, and you start to realize that it's just not as much fun without the slapping, hitting, screaming. Are you really going to stop with the real person?

  • Love 10
Link to comment
35 minutes ago, Hanahope said:

But at the same time, what does that say about the person who's doing the killing and raping?

Same question that has been asked for decades about people who play Dungeons & Dragons.  Or any of a host of modern computer-based games.

I've killed thousands and thousands of people while playing all sorts of games: computer first-person shooters and real-time strategy games, old-time D&D, and even running around as a kid with two fingers out, shouting "Pow! Pow!" I usually enjoyed playing those games. Yet, I don't have any inclination to kill people in real life. It could easily be argued that this type of role-playing is a beneficial way to neutralize any latent aggression in the subject.

  • Love 3
Link to comment

Well, I think there's some difference between just thinking about it in your head, or doing it on a screen and having what appears to be a living breathing person actually in your hands that's screaming, bleeding, etc.

  • Love 8
Link to comment

I imagine the real world of Westworld is a very different place from ours, because I can't imagine a park where you can go and rape at will (even if the victims are machines) going over well if that were ever publicly exposed.

  • Love 1
Link to comment

Imagine if the robots weren't androids, that they weren't facsimiles of humans. Picture them as photocopier-sized machines with motorized wheels, a shitty claw-machine claw, spitting out their dialogue via noisy dot-matrix printers, with programming that prevents them from realizing how different they are from humans. They could operate internally the same as the androids, but they'd never be considered AI. These androids are actors, they are performers. It's like a porno movie: it's all for show, the performers don't operate on pleasure.

Link to comment

I was all on board with the idea of "it's a bot, so what morality?" But last week they introduced the idea that the bots can feel physical and emotional pain (though I'm not sure how that was just thrown around so casually) and, if that is true, clearly it is wrong to hurt them. Right now it is a little hard to argue morality because we don't really know the facts exactly.

9 hours ago, Netfoot said:

The hosts are not human, they cannot feel, they have no free will. They do not participate against their will, any more than your car takes you to work against its will. They have no rights, they have no feelings. You can't rape and murder them any more than you can rape or murder a zergling when playing StarCraft.

This. Unless they do have feelings and some free will.

  • Love 1
Link to comment
5 minutes ago, BooBear said:

This. Unless they do have feelings and some free will.

If so, then all bets are off! Without question, if they emerge as something other than simple, mindless bots, our treatment of them must change. Because in that case, they cease to be unfeeling machines and become something else. But as far as anybody knows, they don't have feelings/free will. And so, they are being treated accordingly.

  • Love 1
Link to comment
7 hours ago, Hanahope said:

Well, I think there's some difference between just thinking about it in your head, or doing it on a screen and having what appears to be a living breathing person actually in your hands that's screaming, bleeding, etc.

Yeah, I've read explanations of play - for humans and for animals - as practice for real life situations. So it seems plausible that the more realistic the play violence, the more it does desensitize the player to real violence. Squeezing a gamepad trigger is considerably removed from actually stabbing with a real knife.

ETA: how would Westworld black-hat guests' experiences be different from modern military conditioning techniques?

Edited by arc
  • Love 2
Link to comment
(edited)

I don't think there's any easy answer to this question. Legally, there's nothing wrong with it. A guest is just damaging property with the owner's express permission. It's no different from shooting holes in a target at a pistol range. But legal and moral are not always the same. There are religions that would hold that enjoying the experience of rape and murder indistinguishable from the real thing is morally wrong. The sin is not the deed, it is within the mind of the sinner.

In another thread, it was argued that the park wouldn't allow abuse of children. I disagree; I think it does. I don't think the show will explicitly depict it, but the only way to prevent it would be to not have any host children, and we have seen them. We've been told there are no rules. It was strongly implied that, at the very least, TMIB killed Maeve's daughter in a prior visit. He was quite prepared to kill Lawrence's daughter at the cantina. He did not, not because of any moral qualms or because he was stopped, but because she told him what he wanted to know, and that's all he was interested in at the time.

If you subscribe to the clock radio position, you have no basis to object. If you take a hammer and smash your clock radio, does it matter whether it's 10 years old or 20 years old? No, it does not.

If you take a hammer and crush the skull of a host, then rape it, does it matter if it's designed to look and act like a 10 year old?

Feeling squeamish yet?

Edited by Gobi
Spelling
  • Love 4
Link to comment
2 hours ago, Gobi said:

Feeling squeamish yet?

Not really.

I'm not interested in bonking a robot, or an inflatable doll, for that matter.  I find the idea distasteful.  (Is that what you mean by squeamish?  I see a distinction.)

Being an olde farte, I guess you could say that there are a number of fairly common sexual practices I find distasteful.  But while I may personally find such to be distasteful, I don't think they are immoral or illegal (IANAL) .  If the robot or inflatable doll was designed to look / act like a 10 year old, I'd find the idea even more distasteful than otherwise, but I don't think the action would actually be any more immoral or illegal.

Link to comment
6 hours ago, Netfoot said:

Not really.

I'm not interested in bonking a robot, or an inflatable doll, for that matter.  I find the idea distasteful.  (Is that what you mean by squeamish?  I see a distinction.)

Being an olde farte, I guess you could say that there are a number of fairly common sexual practices I find distasteful.  But while I may personally find such to be distasteful, I don't think they are immoral or illegal (IANAL) .  If the robot or inflatable doll was designed to look / act like a 10 year old, I'd find the idea even more distasteful than otherwise, but I don't think the action would actually be any more immoral or illegal.

Why would you be bothered by it at all? It's just a machine, after all. I would imagine that most of the sexual practices you find distasteful involve two or more humans. This wouldn't. Rape and murder are fine (I've killed over a thousand hosts at Westworld!), so why draw the line here? To feel discomfort over murdering a child host acknowledges, at least to a degree, that the Westworld experience is different from anything that's gone before it.

Link to comment

Consider this.

Suppose there is a Section of Westworld called Slaveland. In it, guests play the role of antebellum slave owners in the South. Guests are free to torture, rape, and murder hosts, designed to look like African-Americans, to their heart's content.

No one has any problem with that, right? No moral issues there, are there?

You find it distasteful? Why? The hosts are just machines, after all; it's no different from shooting at farm equipment, right? Is to argue otherwise not the same as saying "I'm all in favor of truck repairs, but auto mechanics creeps me out."?

  • Love 2
Link to comment
3 hours ago, Gobi said:

This wouldn't. Rape and murder are fine (I've killed over a thousand hosts at Westworld!), so why draw the line here?

You seem adamant that hosts can be raped and murdered. I absolutely disagree. You cannot rape or murder a non-sentient machine, no matter what it looks like. To say otherwise is to ignore reality. And to suggest that shooting robots is intrinsically immoral because you will follow up by shooting people? Well, if that were so, then shooting at these:

[target image]

must also be immoral, because it would turn you into a murderer as well, wouldn't it?  

2 hours ago, Gobi said:

Suppose there is a Section of Westworld called Slaveland.   [...]  No moral issues there, are there?

None at all, as far as I'm concerned.  Although once again, I'd find it distasteful.

2 hours ago, Gobi said:

Is to argue otherwise not the same as saying "I'm all in favor of truck repairs, but auto mechanics creeps me out."?

I'm all in favour of surgery, but the sight of all that blood and guts would probably make me puke.

Link to comment
14 hours ago, Netfoot said:

If so, then all bets are off! Without question, if they emerge as something other than simple, mindless bots, our treatment of them must change. Because in that case, they cease to be unfeeling machines and become something else. But as far as anybody knows, they don't have feelings/free will. And so, they are being treated accordingly.

They don't have free will, that's agreed.

I disagree that they don't have feelings though. They very much do, they are simply TOLD what to feel. Dolores is told that she loves her father, loves her mother, loves Teddy. And she acts accordingly, she worries about them, rushes to save them, mourns them.

Are these feelings any less real because they are implanted?

  • Love 4
Link to comment
27 minutes ago, Maximum Taco said:

I disagree that they don't have feelings though. They very much do, they are simply TOLD what to feel.

Sorry -- don't agree.  They aren't told what to feel.  They are told what behaviour to exhibit in order to fake the appropriate feeling. 

exhibit_surprise() {
	raise_eyebrows( slightly );
	mouth( open );
	breath( inhale, brief );
}
  • Love 2
Link to comment
37 minutes ago, Netfoot said:

Sorry -- don't agree.  They aren't told what to feel.  They are told what behaviour to exhibit in order to fake the appropriate feeling. 


exhibit_surprise() {
	raise_eyebrows( slightly );
	mouth( open );
	breath( inhale, brief );
}

The show itself told us differently in the most recent episode.

Ford: Why is this host covered? Perhaps you didn't want him to feel cold, or ashamed? You wanted to cover his modesty, is that it? It doesn't get cold, it doesn't feel ashamed. It doesn't feel a solitary thing that we haven't told it to. Understand?

If the hosts didn't feel anything, he would've said "It doesn't feel anything." But it does feel: it feels what it's been told to feel.
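
To make the two readings concrete, here's a rough sketch (purely hypothetical, nothing from the show; Python used for convenience, all names invented) of the difference between a host that only performs surprise and one that carries an implanted internal state which then drives its behaviour:

# Hypothetical illustration of the two positions in this exchange.
# Neither class is from the show; all names are invented.

class DisplayOnlyHost:
    """Netfoot's reading: the host is only told what behaviour to exhibit."""
    def exhibit_surprise(self):
        return ["raise_eyebrows(slightly)", "mouth(open)", "breath(inhale, brief)"]

class StatefulHost:
    """Maximum Taco's reading: the host is told what to feel,
    and the implanted feeling then drives its behaviour."""
    def __init__(self):
        self.affect = {"surprise": 0.0}

    def tell_to_feel(self, emotion, intensity):
        self.affect[emotion] = intensity      # the feeling is implanted from outside

    def act(self):
        if self.affect["surprise"] > 0.5:     # ...but once implanted, it shapes behaviour
            return ["raise_eyebrows(slightly)", "mouth(open)", "breath(inhale, brief)"]
        return ["idle()"]

# Both hosts produce identical outward behaviour; the dispute is over whether
# the internal variable in the second one deserves to be called a "feeling".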

Edited by Maximum Taco
  • Love 3
Link to comment

When it comes to discussing morality on this show, I think focusing on the concept or "robot" and actions against them is limiting.

Ford said that the hosts have passed the Turing Test. While there are real life criticisms of the test, in the context of the show it's shorthand for "Conversations with the hosts will not reveal them to be non-human." At the same time, Ford has boasted about all of the improvements they have made in the mechanics of the hosts. Outside of situational clues, they are effectively indistinguishable from the guests. Even Bernard seems susceptible to the illusion.

To a guest, the NPC type host that runs up to them boasting about a map leading to a goldmine in "Injun'" country is almost certainly non-human. But someone like Teddy, who travels in on the train and has a hero arc that is probably similar to many of the guests, appears far closer to being a guest than host to an observing guest.

So the guest who shot Teddy in the saloon wasn't thinking about shooting robots; he was thinking about doing harm. If he shoots a guest, maybe some bruises. If he shoots a host, no big deal. But there was no way for him to determine the outcome at the time he made his decision.
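
As an aside, the Turing Test reading used above really is just "can conversation alone expose the machine?" A minimal, self-contained sketch of that game (the question and canned replies are invented purely for illustration) would be something like:

import random

# Minimal sketch of the imitation-game reading of the Turing Test used above:
# if conversation alone can't expose the machine, judges do no better than chance.

def human_reply(question):
    return "Never gave it much thought, honestly."

def host_reply(question):
    # A host good enough to "pass" answers in ways a judge can't tell apart.
    return "Never gave it much thought, honestly."

def one_trial():
    a, b = random.sample([human_reply, host_reply], 2)  # judge doesn't know which is which
    question = "What do you make of this place?"
    if a(question) != b(question):
        guess = host_reply                    # a detectable difference would give the host away
    else:
        guess = random.choice([a, b])         # otherwise the judge can only guess
    return guess is host_reply

accuracy = sum(one_trial() for _ in range(10_000)) / 10_000
print(f"judge identifies the host {accuracy:.1%} of the time")  # hovers around 50%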

  • Love 5
Link to comment
3 hours ago, Netfoot said:

You seem adamant that hosts can be raped and murdered. I absolutely disagree. You cannot rape or murder a non-sentient machine, no matter what it looks like. To say otherwise is to ignore reality. And to suggest that shooting robots is intrinsically immoral because you will follow up by shooting people? Well, if that were so, then shooting at these:

[target image]

must also be immoral, because it would turn you into a murderer as well, wouldn't it?  

For real, operant conditioning is how modern military training works to overcome the instinct not to kill.

  • Love 3
Link to comment
4 hours ago, Netfoot said:

You seem adamant that hosts can be raped and murdered. I absolutely disagree. You cannot rape or murder a non-sentient machine, no matter what it looks like. To say otherwise is to ignore reality. And to suggest that shooting robots is intrinsically immoral because you will follow up by shooting people? Well, if that were so, then shooting at these:

[target image]

must also be immoral, because it would turn you into a murderer as well, wouldn't it?  

None at all, as far as I'm concerned.  Although once again, I'd find it distasteful.

I'm all in favour of surgery, but the sight of all that blood and guts would probably make me puke.

What I am saying is that the behavior could be immoral, even if the actual act might not be.

I am sure your opinion of child abuse is not that "I find it distasteful, but it's OK if others do it." Then why balk at it in Westworld? Saying you would have sex with an adult host but not a child host is like saying you would have sex with a clock radio, but not a printer. It is, I hope, the simulated act that you find abhorrent, even if it is not real.

These, to me, are the questions that the show raises.

Link to comment
1 hour ago, xaxat said:

Ford said that the hosts have passed the Turing Test. While there are real life criticisms of the test, in the context of the show it's shorthand for "Conversations with the hosts will not reveal them to be non-human." At the same time, Ford has boasted about all of the improvements they have made in the mechanics of the hosts. Outside of situational clues, they are effectively indistinguishable from the guests. Even Bernard seems susceptible to the illusion.

A robot who passes the Turing test is still a robot.  The test is a means of showing that the machine can successfully fake humanity.  It isn't proof that it actually possesses humanity.

I've made the point several times in this thread and elsewhere that emergent robots obviously require treatment under a different set of rules. However, in Westworld, there is hardly more than a hint that some of the hosts are emergent, and those hints appear to have been seen only by us, the viewers, and not by the guests or operators of Westworld. Therefore, any discussion of the attitude of guests/operators towards hosts must be made on the basis of the hosts not being emergent, because non-emergent hosts are who they are dealing with, as far as they know.

48 minutes ago, arc said:

For real, operant conditioning is how modern military training works to overcome the instinct not to kill.

That's not the point.  It isn't immoral or illegal to shoot at targets, even if they show photo-realistic images of middle-eastern looking people wearing turbans.  (To my surprise, I recently discovered such targets are available.)  It would be illegal to shoot actual people, even if you blamed it on prior experience shooting targets.  

For the record, I have in the past competed many, many times, shooting Police Pistol and Service Pistol matches, using .32 and .357 revolvers as well as 9x19mm autos. (I've been known to out-shoot guys with SIGs and Glocks, using a long-barreled wheelgun!) It wasn't illegal or immoral, nor will anyone convince me that I'm somehow a bad person because of some supposed latent tendency to shoot people.

36 minutes ago, Gobi said:

I am sure your opinion of child abuse is not that "I find it distasteful, but it's OK if others do it." Then why balk at it in Westworld? Saying you would have sex with an adult host but not a child host is like saying you would have sex with a clock radio, but not a printer.

I didn't say that "child abuse is OK if others do it."  I am saying that while I find some things distasteful, that doesn't automatically make them illegal or immoral.  This does not mean that it is legal or moral to do all the things I find distasteful!  It is not legal to hit a child in the head with a hammer.  It is legal to hit a doll in the head with the same hammer.  And for the record, I never said I would have sex with a host, be it child or adult, and don't appreciate those words being placed in my mouth.

The suggestion seems to be that:  If it looks like a human (man, woman or child) it must be treated like a human, and failure to do so should be illegal, but if it isn't a crime due to a legal loophole, it certainly is reprehensible and morally objectionable.  I can't accept this.

Now, I've made my point several times, and I'm tired of this, so  I will henceforth refrain from posting my opinion in this forum.

  • Love 1
Link to comment

A robot who passes the Turing test is still a robot.  The test is a means of showing that the machine can successfully fake humanity.  It isn't proof that it actually possesses humanity.

I'm not saying it possesses humanity. I'm saying that a human can't distinguish the difference. That might seem like a distinction without a difference, but it lies at the heart of emerging research into the future of artificial intelligence. 

Your hammer analogy is important. If I take a hammer and hit a robot, it's not morally objectionable. If I take a hammer and hit "someone" that may be a robot or may be a human, that's something totally different. At the end of the last episode we saw the ghouls (who I suspect are human) surrounding Teddy hesitate until he started shooting at them and they realized he could not hurt them. But what is the moral responsibility if they just attack him immediately, assuming that he is a host without knowing whether he is human or a host?

  • Love 1
Link to comment
25 minutes ago, xaxat said:

Your hammer analogy is important. If I take a hammer and hit a robot, it's not morally objectionable. If I take a hammer and hit "someone" that may be a robot or may be a human, that's something totally different.

You are misstating the case.  You should be discussing the situation wherein you hit "someone" that may be a robot or may be a human, with a "hammer" which you know can tell the difference and prevent the blow falling on a human.  So there can be no objection -- moral or otherwise -- to striking the blow.

Link to comment
35 minutes ago, Netfoot said:

You are misstating the case.  You should be discussing the situation wherein you hit "someone" that may be a robot or may be a human, with a "hammer" which you know can tell the difference and prevent the blow falling on a human.  So there can be no objection -- moral or otherwise -- to striking the blow.

There's no way actual Westworld hammers work like that. (For example, if the axes did, they wouldn't have had to limit axe access to specific hosts.) And there's no way in hell that big rocks in Westworld can tell who to not harm. So in ep 3, the stray either bashed his own head in because he recognized Elsie as a non-host or to destroy his head to prevent tech analysis of it, but it definitely wasn't the rock keeping Elsie safe. As best as I can tell, humans are only protected in Westworld by smartness of the guns and the programmed instinct of hosts to protect humans. Neither applies to the morality of human actions that don't use the guns.

  • Love 2
Link to comment
2 hours ago, Netfoot said:

You are misstating the case.  You should be discussing the situation wherein you hit "someone" that may be a robot or may be a human, with a "hammer" which you know can tell the difference and prevent the blow falling on a human.  So there can be no objection -- moral or otherwise -- to striking the blow.

That's not how Westworld weapons work.

The only weapons that don't work against humans are the firearms. Blades and blunt weapons and fists work on everybody, host and guest alike.

There is a "Good Samaritan" reflex in which any host present will try and take a fatal blow a guest would receive. But that isn't a 100% sure thing. If you are right next to a human and stab him/her in the gut the hosts will not have the time to react and take the blow. Or if there are no hosts nearby to take the blow and it's a fight between two guests one could very well die.

Edited by Maximum Taco
  • Love 3
Link to comment
42 minutes ago, Maximum Taco said:

That's not how Westworld weapons work.

I wasn't actually speaking of hammers per se, as I'm sure you know.  

Guests of Westworld have every reason to expect that they will be prevented from actually harming another guest. Westworld staff have repeatedly said that there are protections in place. If these were not practically fool-proof, they wouldn't be able to operate. They would not get insurance coverage, and there would probably be statutory prohibitions in place as well. Sure, accidents can happen. The last one was 30 years ago. We don't know the nature of this accident, but we can be sure that whatever protections are in place now far outstrip those that were in effect then.

So, guests know that they will not be able to rape or kill a living human being in the park, regardless of age, gender, or racial makeup.  Nor will they be able to rape or kill a host in the park, because hosts being un-emergent robots, the words 'rape' and 'murder' don't apply to them.

To repeat:  It is not possible to rape or murder a human being in the park, and you will know whether they are human or not, depending upon the outcome of your attempt to harm them.  If you succeed in harming them, they are a host and what you did isn't rape or murder.  If you fail, then perhaps they are human, perhaps not.  But it is a moot point, because your attempt to harm them failed.  Therefore, there can be no moral objection to your attempting to do them harm.

And please, don't tell me about hitting them with rocks, and throwing them over a cliff.  The premise of this show is that guests are protected from harm. We may not know exactly what mechanisms are in place to do this, but it is an axiom of the program.  Not only do we hear it said by characters in the show, but TPTB have said as much to the press as well. 

Link to comment
12 minutes ago, Netfoot said:

I wasn't actually speaking of hammers per se, as I'm sure you know.  

Guests of Westworld have every reason to expect that they will be prevented from actually harming another guest. Westworld staff have repeatedly said that there are protections in place. If these were not practically fool-proof, they wouldn't be able to operate. They would not get insurance coverage, and there would probably be statutory prohibitions in place as well. Sure, accidents can happen. The last one was 30 years ago. We don't know the nature of this accident, but we can be sure that whatever protections are in place now far outstrip those that were in effect then.

So, guests know that they will not be able to rape or kill a living human being in the park, regardless of age, gender, or racial makeup.  Nor will they be able to rape or kill a host in the park, because hosts being un-emergent robots, the words 'rape' and 'murder' don't apply to them.

To repeat:  It is not possible to rape or murder a human being in the park, and you will know whether they are human or not, depending upon the outcome of your attempt to harm them.  If you succeed in harming them, they are a host and what you did isn't rape or murder.  If you fail, then perhaps they are human, perhaps not.  But it is a moot point, because your attempt to harm them failed.  Therefore, there can be no moral objection to your attempting to do them harm.

And please, don't tell me about hitting them with rocks, and throwing them over a cliff.  The premise of this show is that guests are protected from harm. We may not know exactly what mechanisms are in place to do this, but it is an axiom of the program.  Not only do we hear it said by characters in the show, but TPTB have said as much to the press as well. 

The accident was referred to as a "critical failure," and it very obviously was referring to a problem with the hosts, not to just any incident whatsoever.

There have been other deaths according to the Delos website where they allow you to book a trip to Westworld.

Quote

Statistically speaking, you are more likely to die from lightning strike than to die while in a Delos park. However, the following causes of accidental death have occurred within the Delos Destinations compound: buffalo stampede, self-cannibalism, accidental hanging, drowning, 3rd-degree burns, autoerotic asphyxiation, blunt force trauma, allergic reaction to non-native plant life, falling from great heights, common manslaughter, tumbleweeds. You absolve Delos, Inc. of any wrongdoing if you or anyone in your party suffers bodily harm while using The Service, and you agree to not sue or prosecute Delos, Inc. or any of the smaller entities falling under the Delos Corporation.

Note the mention of common manslaughter. Guests can kill other guests. It probably doesn't happen a lot, but it happens enough that it is specifically mentioned in the waiver they make all guests sign.

Edited by Maximum Taco
  • Love 4
Link to comment
1 minute ago, Maximum Taco said:

Note the mention of COMMON MANSLAUGHTER.

I see it.  I don't think it makes any difference.  

The chance of getting hit by lightning in a year is 960,000 to one (Google). The chance of dying in the park due to common manslaughter and all the other listed causes combined is lower than this, seeing as they state that "Statistically speaking, you are more likely to die from lightning strike than to die while in a Delos park." So the chance of dying from common manslaughter specifically is probably well under one in a million.

So, statistically, it is still reasonable for any guest to proceed without fear that they will accidentally rape or kill someone.  Just as it is reasonable to lend someone your car without fear that the brakes will fail and they will be killed.

Link to comment

BTW, "buffalo stampede" is an interesting one considering the only non-host, non-guest life in the park are insects. So the human victims were stampeded by robot buffalo.

Also, figure that the park runs at capacity and has for thirty years. That's 2000 people a week (minimum stay is a week), and if no one has ever paid for the longer stays and no one's ever come back, then over thirty years they've still only had 3.1 million unique guests, with at minimum 11 different deaths. (10 if "tumbleweeds" was a joke.)  [edit: no, there's 1400 guests, so the ceiling is 2.18M guests over thirty years.]
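
For what it's worth, a quick back-of-the-envelope script (taking the 30-year run, one-week minimum stay, and both capacity figures above at face value) lands in the same ballpark:

# Sanity check of the guest-count estimate above, assuming the park has run
# at full capacity for 30 years with a one-week minimum stay per guest.

WEEKS_PER_YEAR = 52
YEARS = 30
LISTED_DEATHS = 11   # at least one death per cause named in the Delos waiver, "tumbleweeds" included

for weekly_capacity in (2000, 1400):          # 1400 is the corrected capacity figure
    unique_guests = weekly_capacity * WEEKS_PER_YEAR * YEARS
    rate = LISTED_DEATHS / unique_guests
    print(f"{weekly_capacity}/week -> {unique_guests / 1e6:.2f}M guests max, "
          f"death rate at least {rate:.1e} per guest")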

(Alternatively, maybe there are other Delos parks and one is Fluffworld, where millions of people go every day to eat ice cream for $1. That would certainly help the overall Delos death numbers.)

Quote

Not only do we hear it said by characters in the show, but TPTB have said as much to the press as well. 

Yes, we've noted that what TPTB have specifically said is that there is the Good Samaritan reflex. There's been no mention of any other safeguards in place, nor any hints of anything beyond that. And as we've said, that's neither omnipresent nor omnipotent. I strongly suspect Delos relies on the liability waiver at least as much as they do on (1) hoping guests don't want to kill each other, (2) guest laziness, which would mostly rely on the Westworld guns for enacting any killing intent, and (3) the Good Samaritan reflex.

Edited by arc
Link to comment

RE: the feeling discussion: While checking out the Westworld Website the chat-host said this:

Quote

Hosts do not feel anything they are not programmed to experience. Every—and any—way you interact with a host is by design. They are more lifelike than ever before. They bleed. They sweat. They cry. Anything a human does, our hosts can do… and I mean anything.

There's no reason for a guest to assume anything else when they are "killing" or "raping" the hosts. Of course it's different now that they are waking up, but none of the guests know that.

The question of whether it's morally acceptable to do just that to the hosts reminds me of the ongoing discussions about whether video games increase violence and aggression or not. I'm sure that debate will only get more heated with VR on the rise, which I'd say is the step in between video games and Westworld.

As for the deaths in the park, I will just give the creators the benefit of the doubt and assume they didn't do the math properly, or that the park has existed for longer than just 30 years (the last critical incident was 30 years ago; the park could very well be 50 years old), and maybe they even had more visitors back in the day and reduced capacity in favor of more complicated storylines.

Edited by tiramisue
Link to comment

To me, even more fascinating than the question of "what is it OK to do to a robot" is another philosophical question the show raises, one which bears on the very nature of human consciousness: What exactly is the difference between a really good robot and me? If Dolores acts and reacts the way she does (with some latitude for improvisation) because of the programming by Ford & Co., what is the difference between her and me, who act and react the way I do (with some latitude for improvisation) because of the programming of my DNA and early environment--factors over which I have not one iota more control than Dolores has over hers?

Edited by Milburn Stone
  • Love 6
Link to comment
On 10/18/2016 at 5:15 PM, arc said:

BTW, "buffalo stampede" is an interesting one considering the only non-host, non-guest life in the park are insects. 

Maybe the insects are fake, too. Maybe they are the cameras.

I'm sticking to my theory that the whole thing is underground.

Link to comment
On 10/16/2016 at 4:28 PM, Gobi said:

One of the fascinating things about this show for me is the moral issues that it raises. At what point does raping and killing (of cyborgs, in this instance) for entertainment cross a moral boundary? What responsibilities does a creator have to its creation? This just scratches the surface. Here's a forum to explore those issues.

A really good question. I think a good yardstick might be: if a reasonable person starts feeling guilty about causing something harm, they are near (or have already crossed over) the boundary between good and bad. Of course, that might then lead to questions as to what constitutes a reasonable person ;-).

Link to comment

I've asked this in a couple of topics, but it probably belongs here. Let's remove this bodily harm scenario, and replace it: if you go to this resort and fuck one of the robots, what would you expect your spouse / significant other to say? What if you were the spouse / significant other? 

Link to comment
47 minutes ago, Uncle JUICE said:

I've asked this in a couple of topics, but it probably belongs here. Let's remove this bodily harm scenario, and replace it: if you go to this resort and fuck one of the robots, what would you expect your spouse / significant other to say? What if you were the spouse / significant other? 

I think it's an interesting question, and I'm not completely sure of my answer. Empirically, I'd say it's no different than any other sex toy, so not a big deal, unless you're someone who has a problem with sex toys in general. It's fantasy, not real. I could see a couple having fun with the concept, so they could experience swinging or bisexuality without it being, you know, real. Tho we haven't seen any male sex workers, have we? Are the Westworld creators just a bunch of sexists, or do they not have them cuz they didn't exist in the old west, or do women just historically not want to pay for sex cuz we don't have to? What about gay men who want to indulge?

I could be creeped out about it all, tho, if it was my guy's fantasy to go bang a bunch of robot whores, and spend a fortune to do it. I'm open minded but that still might...not fly well with me. I don't know. If nothing else, I might feel like he's kind of a loser skeezer.

Of course, once we get into the robots in question being so lifelike that they appear to have emotions, feel pain and hate and love and desire, etc...let alone that they actually really might feel those things...that's a completely different scenario.

  • Love 2
Link to comment
3 minutes ago, luna1122 said:

if it was my guy's fantasy to go bang a bunch of robot whores, and spend a fortune to do it. I'm open minded but that still might...not fly well with me.

Well, there are states that allow "gentlemen's clubs," and many of their customers are married. Or, how about Hooters. Do they still have Chippendale's anywhere? Is it okay as long as no one falls in love?

Link to comment

There was a male sex bot with Logan and the two females at the saloon. Logan also had a male and  female host with him when he went to the dressing room. So, sex with male hosts can occur.

I think most spouses would have a problem with it, given that it's indistinguishable from the real thing. William wouldn't do it because of his fiancée, even though Logan suggested that his fiancée probably did during her visit.

  • Love 3
Link to comment
On 10/18/2016 at 2:26 PM, Netfoot said:

 

I didn't say that "child abuse is OK if others do it."  I am saying that while I find some things distasteful, that doesn't automatically make them illegal or immoral.  This does not mean that it is legal or moral to do all the things I find distasteful!  It is not legal to hit a child in the head with a hammer.  It is legal to hit a doll in the head with the same hammer.  And for the record, I never said I would have sex with a host, be it child or adult, and don't appreciate those words being placed in my mouth.

Late in replying to this because I hadn't read it. I certainly did not intend to put words in your mouth; I was thinking in terms of "you, the audience." In retrospect, I should have used the third person.

One of the moral issues in the show I find interesting is what it says about someone who would want to experience what would be reprehensible in real life, and indistinguishable from the same, without any consequences. If one feels there is nothing wrong with doing whatever one wants with robots, then there is no restraint against child murder or Slave Land, or Spanish Inquisition Land (nobody expects that!) with robots. Yet who wouldn't be disgusted by anyone doing any of those things? Imagine the public outrage, justified I think, if there was a video game that allowed those things.

Westworld is many orders of magnitude beyond video games, or any other form of entertainment. What does it say about a society that tolerates or even encourages, as entertainment, acts indistinguishable from crimes? We in the tv audience are at a remove from the violence in WW. If it was a real park, and one heard the cries for mercy, the screams of pain, was splashed by blood, how could one not be affected by that? What sort of person would enjoy that?

These are not targets, used to practice a skill. Would people want to go to a shooting range where the targets were indistinguishable from and reacted the same as people when shot? WW says yes. Ford lost his bet with Arnold, the customers didn't want the hopeful stories, they wanted the evil ones.

  • Love 6
Link to comment

To answer the question, yes, if you can tell the difference, then it matters. The thing is, the show is confused on the point: guests like Logan believe that the difference is that the hosts don't have feelings and memories of what is done to them. We the audience are the only ones who really know how a few hosts are remembering. And remembering is intimately tied to consciousness, sapience. We tend to think even of ourselves as unconscious when we don't remember. Something like amnesia or dissociative states is deeply, deeply disturbing precisely because we know this deep down, even if we don't formally reason with a verbalized principle. Not only is the show confused about in what sense people can't tell the difference, it's also confused because it pretty much commits itself to the opposite: it doesn't matter if you can't tell the difference. Which is stuff and nonsense even in a scifi kind of way, in my opinion.

Characters like Ford and Bernard and Cullen, the ones with the power to do something in the story line, aren't written as having actual ideas on the subject. Perhaps in Hollywood people don't have ideas, just balance sheets. But most real people do. Leaving out the ideas in people who do mental work is not writing the character. 

I guess I'll finish this short season, but I have no hope for this show ever really succeeding artistically as anything but a premium cable melodrama, one with lots of skin and blood.

Link to comment
15 minutes ago, sjohnson said:

To answer the question, yes, if you can tell the difference, then it matters.

I wouldn't jump on a typo, but the rest of your post has me genuinely thinking you're answering a very different question to the one posed in the thread title.

Link to comment
1 hour ago, Gobi said:

If one feels there is nothing wrong with doing whatever one wants with robots, then there is no restraint against child murder or Slave Land, or Spanish Inquisition Land (nobody expects that!) with robots. Yet who wouldn't be disgusted by anyone doing any of those things? Imagine the public outrage, justified I think, if there was a video game that allowed those things.

People do a great many things which I find disgusting and outrageous.  Things which society deem legal and possibly even laudable.  But it makes no difference if I find those things outrageous, or whether you find Slave Land and Spanish Inquisition Park outrageous, or at least it shouldn't.  If those things are legal, then no moral outrage of yours or mine matters a damn.  And so long as no people suffer as a result, no whining made from the saddle of a high horse puts either of us in some sort of position of moral superiority.  

Link to comment
8 hours ago, Netfoot said:

People do a great many things which I find disgusting and outrageous.  Things which society deem legal and possibly even laudable.  But it makes no difference if I find those things outrageous, or whether you find Slave Land and Spanish Inquisition Park outrageous, or at least it shouldn't.  If those things are legal, then no moral outrage of yours or mine matters a damn.  And so long as no people suffer as a result, no whining made from the saddle of a high horse puts either of us in some sort of position of moral superiority.  

I'm not whining or putting myself on a higher plane. I'm intrigued by the questions that arise. To look at it from a different perspective: Suppose it was shown that having places such as Westworld reduced acts of violence in the real world, would that be a good thing? I think it would. But does that mean that the people committing those acts in a safe environment without fear of punishment have become morally good people? I'm not so sure. Slavery and the Spanish Inquisition were, after all, both legal. Were the slavers and inquisitors good people? I'm sure they thought they were. It is not just the effect on the victim, but also the effect on the perpetrator that I find interesting, something that is infrequently addressed in popular fiction. In the show, William is reluctant to harm hosts, probably because he sees them as too close to human. If, as people suggest, he is TMIB at a younger age, it will be interesting to see how he changes. Each small act of violence perhaps making it easier to commit worse acts. A lesson that has relevance for the real world.

  • Love 1
Link to comment
3 hours ago, Gobi said:

But does that mean that the people committing those acts in a safe environment without fear of punishment have become morally good people?

If someone has urges, tendencies, desires, to do something generally considered reprehensible -- say murder -- but lives an entire life without succumbing to these urges, are they morally good or evil?

When I am sitting in a cinema trying to hear the dialogue over the incessant screaming of a babe-in-arms, thoughtfully brought into the cinema by some moronic mother, am I morally a bad person for having the fleeting desire to go over and kick mummy in the neck so hard her head flies off?  Before you answer, remember that as a civilized person, I never actually do go over and kick mummy's head off.  

There's a lot of harping on about how you must be a bad person to enjoy shooting a robot.  Rubbish.  I've shot clay pigeons, paper targets depicting human forms, and steel cutouts of animals, using genuine firearms.  I've used every type of small arms from a knife to the original BFG in computer games, directed against human and non-human targets.  I've used tanks, bombs, even nukes on a tactical level.  And I've enjoyed it all.  And because I knew all along that no people (or animals) were actually being harmed, I reject completely the suggestion that I'm somehow a "badder" person because of it.

  • Love 2
Link to comment
19 minutes ago, Netfoot said:

If someone has urges, tendencies, desires, to do something generally considered reprehensible -- say murder -- but lives an entire life without succumbing to these urges, are they morally good or evil?

When I am sitting in a cinema trying to hear the dialogue over the incessant screaming of a babe-in-arms, thoughtfully brought into the cinema by some moronic mother, am I morally a bad person for having the fleeting desire to go over and kick mummy in the neck so hard her head flies off?  Before you answer, remember that as a civilized person, I never actually do go over and kick mummy's head off.  

There's a lot of harping on about how you must be a bad person to enjoy shooting a robot.  Rubbish.  I've shot clay pigeons, paper targets depicting human forms, and steel cutouts of animals, using genuine firearms.  I've used every type of small arms from a knife to the original BFG in computer games, directed against human and non-human targets.  I've used tanks, bombs, even nukes on a tactical level.  And I've enjoyed it all.  And because I knew all along that no people (or animals) were actually being harmed, I reject completely the suggestion that I'm somehow a "badder" person because of it.

This is what I like about this show. I don't know the answers to these questions, it's thinking about them that I enjoy.

For example, the law is concerned with ends more than means. If someone is stopped from committing murder only because of fear of the law, society's goal is accomplished. Yet, there are ethical/religious systems that hold that if someone wants to commit murder and is stopped only by fear of the law, they are guilty of that crime in their heart/soul. I find that to be an extreme viewpoint, but I can understand it. The real challenges, I think, come in trying to draw the line. We can all recognize extremely good or extremely bad acts; determining where one begins to slide into the next is difficult. We can tell the difference between a serious novel about sex and pornography. But at what point does a serious attempt become porn, or can porn become serious literature? 

In our world, perhaps the closest thing to Westworld would be first person shooters or similar RPGs. I see nothing wrong in them, or target practice with human images. Westworld, though, pushes it to the extreme: What if you can't tell the difference? That's where the interesting questions rise up. These aren't clay pigeons, targets, or images on a screen. The hosts are indistinguishable from humans.

The consensus here is that if the hosts are not sentient, or if the guests are not aware the hosts are sentient (negligent roboticide?) there's nothing wrong being done. Certainly, that is not an unreasonable position. At the same time, I can recognize that the answer may not be that simple.

  • Love 1
Link to comment
18 hours ago, arc said:

I wouldn't jump on a typo, but the rest of your post has me genuinely thinking you're answering a very different question to the one posed in the thread title.

Jump, jump, it is a typo where I misquoted the very title of the thread!

Link to comment
20 hours ago, sjohnson said:

Perhaps in Hollywood people don't have ideas, just balance sheets.

Perhaps?  

5 hours ago, Netfoot said:

There's a lot of harping on about how you must be a bad person to enjoy shooting a robot.

Hunters are not inherently bad people, and they seem to enjoy the sport. I think this is a complicated topic.

5 hours ago, Gobi said:

If someone is stopped from committing murder only because of fear of the law, society's goal is accomplished.

I once took a course in criminology, and the most important thing I learned is that the law is not a deterrent because criminals don't think they'll get caught. 

  • Love 1
Link to comment
