Gobi October 23, 2016

1 minute ago, okerry said: Between the actors and whoever's coaching them, the skills of those playing the hosts are really amazing. The scene between Ford and Dolores's father was just astounding. Don't think I've seen anything quite like that. To me, whether it matters to the host or not really isn't the question. It will matter to You. And if You get off on causing pain and torment to what looks and feels to you like another human being, maybe You should go home and rethink your life.

Reminds me of stories that some veterans had their PTSD triggered by watching "Saving Private Ryan". What kind of effect would being in Westworld have on people?
phoenyx October 28, 2016

On 10/23/2016 at 0:25 PM, okerry said: Between the actors and whoever's coaching them, the skills of those playing the hosts are really amazing. The scene between Ford and Dolores's father was just astounding. Don't think I've seen anything quite like that. To me, whether it matters to the host or not really isn't the question. It will matter to You. And if You get off on causing pain and torment to what looks and feels to you like another human being, maybe You should go home and rethink your life.

I completely agree with that 'causing pain and torment' bit. I'd also like to say that if the hosts can be said to have feelings (and they certainly -look- like they have feelings), then it also matters to the hosts.
John.G November 25, 2016

Nobody seems to be talking about this, which might mean I'm wrong, but here goes: it sure seems to me that TMIB killed a Guest during the shootout in Lawrence's town. We see a scared younger man run across the scene, shooting blindly. He hides behind an adobe wall, muttering to himself. Did not seem like "cousin of Lawrence" behaviour to me. TMIB laughs, then does something to the hammer on his pistol, then shoots the kid through the wall.
Gobi November 25, 2016

6 minutes ago, John.G said: Nobody seems to be talking about this, which might mean I'm wrong, but here goes: it sure seems to me that TMIB killed a Guest during the shootout in Lawrence's town. We see a scared younger man run across the scene, shooting blindly. He hides behind an adobe wall, muttering to himself. Did not seem like "cousin of Lawrence" behaviour to me. TMIB laughs, then does something to the hammer on his pistol, then shoots the kid through the wall.

That was my thought at first; there's a discussion about it somewhere in this thread. The bit with the hammer involves setting the gun to fire a large-caliber round through a barrel underneath the main barrel. The larger round is shown in the scene with Lawrence's family. The victim says a prayer in Spanish, which may or may not mean he's a host. The consensus is that TMIB is killing another host, not a guest, although it sure doesn't look like it. Even if he is, it does show that he's got a gun and ammo that can shoot through adobe walls. Seems awfully unsafe.
arc November 25, 2016

We did talk about whether that guy was a host and what TMIB's gun did when the hammer was toggled in the previous two or so pages of this thread.
Netfoot November 25, 2016

4 hours ago, John.G said: then does something to the hammer on his pistol,

Essentially, he is lowering the firing pin so it strikes closer to the central axis of the cylinder. There are nine regular rounds on the periphery of the cylinder, plus one larger round located on the axis. By lowering the firing pin, TMIB is setting the weapon to fire the central cartridge instead of one of the smaller cartridges at the periphery. This, I assume, is to penetrate the wall. And I'm guessing he knows he will need to do this because he's played out this scene before.
MrWhyt November 25, 2016

14 minutes ago, Netfoot said: Essentially, he is lowering the firing pin so it strikes closer to the central axis of the cylinder. There are nine regular rounds on the periphery of the cylinder, plus one larger round located on the axis.

It's a modification of a LeMat revolver.
Gobi November 25, 2016

23 minutes ago, Netfoot said: Essentially, he is lowering the firing pin so it strikes closer to the central axis of the cylinder. There are nine regular rounds on the periphery of the cylinder, plus one larger round located on the axis. By lowering the firing pin, TMIB is setting the weapon to fire the central cartridge instead of one of the smaller cartridges at the periphery. This, I assume, is to penetrate the wall. And I'm guessing he knows he will need to do this because he's played out this scene before.

I don't think he knew he would have to do this, just that he was preparing for such an eventuality. I say this because he was surprised to find out that Lawrence had a family, and also, I believe, he mentioned that he had never been in that town before.
ElectricBoogaloo December 13, 2016

The Man in Black is super creepy. On a shallow note, for me the Man in Black title goes to Westley in The Princess Bride, so I just keep calling the guy on this show Evil Ed Harris (sometimes abbreviated in my head to Crazy Asshole).

I'm still getting used to the cast and trying to keep the characters' names straight. The non-Thandie prostitute looked so familiar to me, so I kept trying to figure out where I'd seen her before. It turns out Angela Sarafyan has been on a bunch of shows, but I recognize her from Buffy the Vampire Slayer, Nikita, and The Good Guys. And now I know that her character's name is Clementine Pennyfeather, which cracked me up.

I thought the little boy was Ford's kid. I know we saw some kids in the previous episode; they were running around on the main street in town. I really hope the kid robots are just there to provide a realistic atmosphere, not because there are perverts who want to rape them.

For some reason, the Danish actress's accent was much more pronounced to my ears in this episode, and I had a harder time understanding everything she said.

I like that we got some focus on some of the other hosts in this episode (no offense to Dolores). I was kind of appalled by the way the park people were trying to "fix" Maeve by making her more aggressive. Maybe the recent guests are more into Clementine, or maybe they're more interested in the other adventures available at the park.

I really enjoy seeing Bernard and Ford talk to the hosts. They both seem very lonely. I'm impressed by how the actors playing the hosts change their facial expressions when talking to Bernard and Ford so that we can see when they're using their host personas versus when they're in analysis mode.
markx February 8, 2017

The whole question of whether AIs can be sentient, and whether what we see in the show is unethical, is something that I'm loving about this series. We still don't have a clue as to what causes consciousness, or what qualia are. Chances are we still won't have a clue even by the time we are capable of creating human-level AI.

A philosophical zombie is a being that's indistinguishable from humans, except that they lack consciousness. They'd react to pain in the same way, they'd tell you they experienced pain, but in fact there would be no mind. One viewpoint is that philosophical zombies can't exist: once you have an intelligent being with all of the intelligent functions, memory, and reactions of a human, such a being is necessarily conscious, even if this happens only as part of a computer program. On the other hand, maybe human brains contain an important property that a computer does not.

Many people would see the robots as glorified NPCs, and even if they thought the actions distasteful for whatever reason, they wouldn't be doing so out of concern for AIs being abused. People often think of "what computers can't yet do" as examples of intelligence, but once they can do it, "it's just a program". Someone from 100 years ago might wonder if the Google Assistant bot in a phone is like a real person, yet no one today would even think there was any sentience, or any ethical issue. A self-driving car might seem pretty cool and futuristic, but I don't think anyone's going to think it has any kind of sentient mind. Now, I don't think this means software can never be sentient, but I can see there will be a problem: we won't have any way to know if or when it happens.

I do think it's a bit odd they don't have any kind of separation for different types of visitors, though. I mean, even if I think there's nothing wrong with people who want a rape/murder fantasy, if I'm there to enjoy the experience or play it like an RPG, I'd be a bit annoyed if some other guy decides to stab all the hosts in the room. It would sort of ruin the immersion (even "But it's the Wild West" doesn't work when the guests have god-like powers, nor do other guests have the ability to stop them without resorting to real violence). MMORPGs generally don't let players go on god-mode killing sprees of all the NPCs, because even if some people might like it, other players won't. Second Life caters to many users who roleplay sex and BDSM, but my understanding is there tend to be different regions for different kinds of things. Second Life also has a minimum age of 13, and age-restricted regions; it's odd to see child visitors in Westworld, but maybe things have changed in the future it's set in.

"I'm no biologist, but if the hosts have vision -- which requires receptors between eye and brain -- isn't it possible that they have pain receptors too? If they can see and differentiate between light and dark, can they also differentiate between heat and cold? Soft and hard?"

There's a difference between a receptor that can detect light of, say, a certain wavelength, and the experience (qualia) of seeing, say, a particular colour. A thermometer doesn't feel heat. One could make a robot that could detect the things which cause a human pain, and it could be wired up to give the same reactions a human gives. It could be programmed to say "Yes, that really did hurt, I do feel pain." At what point is it sentient and really experiencing pain?
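To make that last point concrete, here is a minimal, purely illustrative Python sketch of such a "pain-reporting" machine. The class name, method name, and temperature threshold are all invented for this example; nothing here comes from the show or from any real robotics library.

```python
# A toy illustration of the philosophical-zombie point above: a program can
# detect damaging stimuli and produce all the outward reports of pain without
# there being any inner experience behind them.

class PainReportingRobot:
    def __init__(self):
        self.damage_log = []

    def sense(self, stimulus_temperature_c: float) -> str:
        """Map a stimulus onto the same outward reactions a person would give."""
        if stimulus_temperature_c > 50:
            self.damage_log.append(stimulus_temperature_c)
            return "Ow! Yes, that really did hurt. I do feel pain."
        return "That feels fine."

robot = PainReportingRobot()
print(robot.sense(75))   # prints a convincing pain report
print(robot.sense(20))   # prints "That feels fine."
# Nothing in this program experiences anything -- which is exactly the question
# the post raises: at what point (if any) would that change?
```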
SlackerInc April 9, 2017

On 2/8/2017 at 3:48 PM, mdwh said: The whole question of whether AIs can be sentient, and whether what we see in the show is unethical, is something that I'm loving about this series. We still don't have a clue as to what causes consciousness, or what qualia are. ... A philosophical zombie is a being that's indistinguishable from humans, except that they lack consciousness. ... One viewpoint is that philosophical zombies can't exist. ... I do think it's a bit odd they don't have any kind of separation for different types of visitors, though. ... MMORPGs generally don't let players go on god-mode killing sprees of all the NPCs, because even if some people might like it, other players won't.

I'm a little late to this party, but I only get HBO four months a year, and I'm just starting to get into this show. Pretty good so far! (And please, no one spoil me.)

Nice to see someone conversant with the ideas of "qualia" and "philosophical zombies", meaning they have thought deeply about the nature of consciousness and what it means for AI. I am one of those people you referred to who don't believe philosophical zombies are possible. I believe that if an AI can simulate every nuance of a person's personality, it must have sentience, must experience qualia. There is "something it is like" to be that so-called "zombie", meaning it's not a zombie at all. This is of course a nearly impossible opinion to prove, and it may remain so even after AI becomes very advanced. We may have AIs that seem to have emotions and intellects beyond ours, but how will we know they are conscious?
We technically don't know other humans are conscious (the solipsistic perspective), but we can reason by analogy that since we were created the same way they were (in a woman's womb) and have the same basic stuff inside us, they must be conscious since we are. That reasoning obviously doesn't hold in the case of an AI.

Something I find particularly interesting (albeit annoying) about responses to fictional representations of AI on TV and in movies is that many viewers (including more than a few professional critics) are so unable or unwilling to imagine that an AI could be conscious that it makes them "watch [the show or movie] wrong". I'm thinking in particular of reactions I've seen to the movies Her and AI: Artificial Intelligence. Even if you think conscious AIs are impossible in the real world, I think it's hard to dispute that the people who made those movies intended the AIs in them to be considered sentient. To watch them and stubbornly see it otherwise, as mere unthinking machines, is for my money missing the point: like watching a Jurassic Park movie and saying there's no reason for the characters to run from the T-rex because "it's impossible to recreate dinosaurs from amber".

The other point I quoted, about separating different kinds of visitors, is also a really good one. The people who choose black hats should be sent to another section of the park. Or rather, maybe it should go the other way: those who want an experience without that kind of random, senseless violence should be able to go to a "bunny hill" area. But maybe that's too limiting, acting as though only children or the very sensitive would want that. You could also have an area where people can be violent but are expected to respect the narrative and preserve a sense of realism so other guests can have a fully immersive experience. That would also include not loudly talking about how this is a vacation, or using anachronistic language.

Relatedly, I wondered about the fact that the guests (as we've mainly seen with the Ed Harris character) cannot be harmed. I don't play video games myself, but I understand the concept of "god mode". I also understand it's generally only available via a "cheat code". Clearly this isn't because video game makers want to keep people from having fun. Okay, in arcades they want you to put in more money. But I don't think many games sold for home use have "god mode" as an overt option, or at least not the default option, and I'd venture to say that's because it's boring. What is the point of being a quick draw if your enemy's shots just flutter your clothing a bit? The Ed Harris character is not actually as badass as he makes himself out to be: we have seen him get "hit" quite a bit, but he's not on a level playing field.

So I'm surprised we haven't (at least thus far) seen anyone playing it like paintball or Lazer Tag: when you get shot you obviously wouldn't actually get hurt, but you'd at least "lose a life", meaning you'd have to go down and then, once the action moved away from you, be ushered out through a back door to either be done completely or at least sent back to the train station. (You would need some way of penalizing people who refused to play along: a financial penalty, or maybe being banned from the park.)
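As a rough sketch of how the "lose a life" rules proposed above might look, here is a short Python illustration. The class name, number of lives, and dollar amounts are all invented for the example; none of it comes from the show.

```python
# Guests who are "shot" lose a life and are escorted out of the scene;
# guests who refuse to play along accrue penalties and can be banned.

class GuestSession:
    def __init__(self, name: str, lives: int = 3):
        self.name = name
        self.lives = lives
        self.penalties = 0.0
        self.banned = False

    def register_hit(self) -> str:
        """Called whenever a host lands a 'kill shot' on this guest."""
        self.lives -= 1
        if self.lives <= 0:
            return f"{self.name} is out: escort back to the train station."
        return f"{self.name} goes down and rejoins once the scene moves on."

    def refuse_to_play_along(self):
        """Apply an escalating fine; ban repeat offenders."""
        self.penalties += 10_000
        if self.penalties >= 30_000:
            self.banned = True

guest = GuestSession("William")
print(guest.register_hit())        # loses a life, plays dead for the scene
guest.refuse_to_play_along()
print(guest.penalties, guest.banned)
```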
femmefan1946 April 21, 2018

"Maybe (MIB) is a sadistic bastard who wants people to die slowly, but if that is so, that aspect of him hasn't been introduced so far."

But his first action was to rape Dolores. His second was to tell her he would not be visiting her tonight, which would only confuse her as a host but would terrify a human woman. So sadism from the first.

"When William and his friend are talking, he says he thought the friend didn't want to talk about work, and the guy goes 'this is work.' Makes me wonder if they don't work for Delos Co."

Or it's some sort of team-building exercise. Like the one in Good Omens.

"The ones with the lower parts of their faces painted (in what looked like red, white and blue?) were in Maeve's dream and then were 'introduced' as part of a new storyline that Ford rejected."

It's the Wild West. And many, if not most, First Nations used face paint, with some patterns being traditional and some personal.

"How does Ed Harris stay in the park so long if it's so expensive?"

At $40K a day, that's "only" about $14,600,000 annually (a quick check of the arithmetic follows after this post). And he may get Frequent Flyer points. It would take about 68 years to spend a billion dollars at that rate.

"Kind of like if you go to a Renaissance Fair and buy a turkey leg there. There is no way turkey legs in medieval times could be as big and as juicy as what you buy at the fair."

Since turkeys are American birds, they would not have existed in medieval Europe. Goose or swan would be available.

"Since Bernard was never wiped at the end of the day like other hosts, he became more human with every passing day."

He would also never age. If the rest of the staff are around for any length of time, this might be noticeable.

"They showed Maeve talking to Japanese guests in Japanese, so it isn't just an American thing in the show. Westerns have been very popular in European countries as well, so I think there would be a market for Westworld beyond America."

It would be big with Germans. They love their Karl May.

"Can you rape or murder someone who in every respect gives you the same experience you'd get from doing the same thing to a human, even if you know intellectually that they're not human, without it having some kind of effect on the psyche?"

Race-based slavery was only outlawed in the British Empire in 1834; other governments took longer. (Bonded labour and debt slavery are still with us.) To most slave holders, the slaves were property. Among other atrocities, gynecological experimentation was still going on well past the date when anesthesia became common, because they decided the slaves did not feel pain.
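A quick check of the cost arithmetic in the post above, assuming the $40K/day figure from the post and spending every day of the year:

```python
# Back-of-the-envelope check of the Ed Harris park-cost estimate.
daily_rate = 40_000
annual_cost = daily_rate * 365                      # 14,600,000 per year
years_to_spend_a_billion = 1_000_000_000 / annual_cost
print(f"${annual_cost:,} per year; about {years_to_spend_a_billion:.0f} years to spend $1B")
# -> $14,600,000 per year; about 68 years to spend $1B
```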