S01.E02: Chestnut


paigow

Recommended Posts

But computers can get infections and the Hosts are essentially computers. I think the scientist was anthropomorphizing because Maeve presented  expressions that mimicked human discomfort. That's what she was built for, to trick humans into thinking she's real.

I'm really looking forward to the moment when they jump from pre-programmed machines to fully conscious beings with free will, but I don't think we're there yet.

They're pretty close already. Dolores went to town to get medical help for her father, which wasn't remotely part of the narrative. Maeve woke up inside the design facilities and she knew that there were other people in the room, that a knife could threaten them, how to run away in a strange structure that doesn't resemble anything she'd be programmed to recognize... They're certainly not pre-programmed in the sense of being irrevocably fixed on a narrative path written for them. What have we seen from the hosts already that suggests they aren't fully conscious?

(and again, the human guests in the show get a pass for now on their behavior towards the hosts, because they've been told that hosts aren't conscious and so far it seems reasonable for them to believe that. Maybe. After all, Elsie (Shannon Woodward's char) did say it'd be terrible if hosts remembered what guests put them through.)

Edited by arc
1 hour ago, arc said:

What have we seen from the hosts already that suggests they aren't fully conscious?

Although only a few of them have exhibited any kind of self-awareness at all, I think as a species they are "waking up". But they will never be human. Can they be a parallel race of beings? Can they exist without programmers? Can they reproduce? Can they create? Can they be anything at all without human intervention and human will?

Can a mere tv series both entertain and answer these questions?

3 minutes ago, Broderbits said:

Although only a few of them have exhibited any kind of self-awareness at all, I think as a species they are "waking up". But they will never be human. Can they be a parallel race of beings? Can they exist without programmers? Can they reproduce? Can they create? Can they be anything at all without human intervention and human will?

Can a mere tv series both entertain and answer these questions?

If they become sentient and gain the knowledge of creating new hosts, is that not reproduction because it is not sexual? They will never be human, but they can be a different life form, even if not a fully biological one. There must be some biological element to them, else Maeve would not be affected by MRSA.


If I were a guest at Westworld and in the middle of a particularly engrossing storyline, I know I'd be really ticked off if another guest shot up my host just because he could. That makes me wonder if there is some sort of procedure in place for complaining about other guests - or am I expected to either take it or get even on my own?

46 minutes ago, Gobi said:

If they become sentient and gain the knowledge of creating new hosts, is that not reproduction because it is not sexual?

That's a really interesting question! Can manufacturing be considered a form of reproduction? 

The Creator and his protege seem to want the Hosts to be human but do we know yet what their standards are? There must be more to it than outwardly visible characteristics.

6 hours ago, Gobi said:

There must be some biological element to them, else Maeve would not be affected by MRSA.

It's interesting. They weep, sweat and bleed, so evidently there is a fairly large biological component to their make-up. They also seem to have the normal amount of blood. I guess they also eat and drink and have a normal human digestive/nervous/endocrine etc system. Whether that extends to the reproductive system who knows. I remember Boomer got pregnant didn't she? She also almost got raped on board the Galactica by that Pegasus officer, and I remember the controversy that caused.

Regardless of how human-like the hosts appear, it is what they feel and think that propels the narrative and our sympathies. If a host can feel terror, pain, anguish, grief and other deep emotions, then to me they are living creatures, human or otherwise, not machines, and ought to be treated as such. To say that they are machines and therefore guests have carte blanche to do anything horrific to them is to condone sadistic violence to any living thing. If your toaster could feel pain, would you then beat it just because there was no law saying you couldn't?

10 hours ago, Terrafamilia said:

If I were a guest at Westworld and in the middle of a particularly engrossing storyline, I know I'd be really ticked off if another guest shot up my host just because he could. That makes me wonder if there is some sort of procedure in place for complaining about other guests - or am I expected to either take it or get even on my own?

I never considered this. It's one thing to shoot your hooker or tour guide but it would be really frustrating to take up the treasure hunt with the old man, only for that annoying asshole to stab him through the hand. 

I don't think being able to seek your own revenge is suitable either. You can't physically harm the Guest that does it and even if you managed to ruin their quest, you would still not get the enjoyment you originally wanted. 


That's really why I hate William's BIL (don't remember his name). Like, it's fine if all he wants to do is kill and screw his way through Westworld (after he pretended it was about so much "more" than that for him), but damn, maybe William WANTED to go on the treasure hunt? They split up anyway at the brothel, so he should have just let William go off and do what he wanted. I can't stand people who only want you to have fun the way they think you should be having fun.

12 hours ago, Gobi said:

There must be some biological element to them, else Maeve would not be affected by MRSA.

I hope the show didn't mention MRSA so they can use it for a simplistic demise of all the robots at the end of the final season (like the aliens in War of the Worlds).

18 hours ago, ennui said:

Except the robot horses were panicking during the shootout, just as natural horses would. 

There seems to be a lot of speculation about WW being a game. I think it's only a game for MiB. For most of the guests, it's an entertaining resort. I doubt they get a lot of repeat visitors, given the price, but maybe they sell season passes. 

Going back to the shootout -- clearly, the hosts are programmed for deadly accuracy. That blonde villain didn't miss a single shot. 

The shootout was a planned special event meant to extract and clean up the hosts "infected" with the recent (faulty) upgrade.  It could be assumed that the programmers added all of the "horses panicking" and "deadly shooter villain" protocols to the script to add realism.

Edited by DarkRaichu
8 hours ago, spottedreptile said:

If a host can feel terror, pain, anguish, grief and other deep emotions, then to me they are living creatures, human or otherwise, not machines, and ought to be treated as such. To say that they are machines and therefore guests have carte blanche to do anything horrific to them is to condone sadistic violence to any living thing. If your toaster could feel pain, would you then beat it just because there was no law saying you couldn't?

But perhaps the host is a machine, created by man, and designed to fake terror, pain, anguish, grief and other deep emotions, so as to augment the realistic nature of the artificial construct that it is.  We know they express other emotions, such as desire and arousal, in order to facilitate their narrative.  Why not the others as well?

</devil's_advocate>


I probably should write this in the first episode thread, but I think that TMIB wanted a frantic Dolores in hopes that she would divulge information about the maze. Do we really know he raped her or just assumed it?

17 minutes ago, Enigma X said:

I probably should write this in the first episode thread, but I think that TMIB wanted a frantic Dolores in hopes that she would divulge information about the maze. Do we really know he raped her or just assumed it?

I thought the same thing and posted about it in that thread.  I think we assumed it because of his comment that he wanted her to put up a fight -- as foreplay for him.  It reminded me of a scene in a very old movie -- The Vikings -- Kirk Douglas is about to rape Janet Leigh.  She stops fighting him and just lies there. He tells her to "Fight! Scratch!"  It's all about the violence, not the sex.

With MiB, now that we know he's not the usual Guest and that he's encountered Dolores several (many?) times before, maybe it wasn't about rape. 

2 hours ago, Netfoot said:

But perhaps the host is a machine, created by man, and designed to fake terror, pain, anguish, grief and other deep emotions, so as to augment the realistic nature of the artificial construct that it is.  We know they express other emotions, such as desire and arousal, in order to facilitate their narrative.  Why not the others as well?

</devil's_advocate>

But if the hosts feel those emotions and pains, just as  humans would,  is it any less real because it was programmed?

Edited by Gobi
Spelling

I'm not understanding how they would ever actually feel pain.  Fake programmed pain when their sensors sense it, sure.  But, since we saw them being made synthetic strand by synthetic strand via 3D printer, and dipped in more silicone coating or whatever the white stuff is, what exactly is going to hurt and what "brain" is going to "feel" actual pain?

Milk drinking dude was walking around with holes in him, and milk pouring out of the holes, but he still kept walking and drinking and laughing up a storm.  And HE was even infected! 

I still don't believe they actually feel emotions, either.  It is evidenced by how they immediately drop the emotions at a command from one of the center people, just like they drop the accents when commanded to do so.

Edited by izabella
13 minutes ago, Gobi said:

But if the hosts feel those emotions and pains, just as humans would

Who says they are?  Perhaps they feel no pain at all, and only pretend to feel it, because that's what would be expected of the human they are pretending to be.

They are constructed by humans to fill various roles as required.  So as to fill those roles, they have realistic pretend-skin, realistic pretend-hair, realistic pretend-eyes.  They have realistic pretend-speech/voice, probably realistic pretend-B.O. and all.  The next step towards realism is adding realistic pretend-emotional responses, which would appear to the casual observer as if the bot actually experienced the emotions.

20 minutes ago, izabella said:

Milk drinking dude was walking around with holes in him, and milk pouring out of the holes, but he still kept walking and drinking and laughing up a storm. 

That's a good point ... and sloppy writing, is all I can say. Either the hosts feel very real, non-performative "physical discomfort" (which would account for Maeve clutching her stomach in the bar when [as far as she knew] she had no audience) or are at the very least programmed to mimic it (which you would probably want to do, so your guests who are pretending to have a shootout with robots get the appropriate reaction) ... OR, they are machines that don't "feel" anything at all, and can walk around cackling while they're full of bullet holes. Now we can assume that the guy full of holes was just glitching and that's why he wasn't feeling/reacting to all those gunshot wounds, but that doesn't explain why Maeve was experiencing pain when there was no one watching for her to experience it for.

11 minutes ago, withanaich said:

Now we can assume that the guy full of holes was just glitching and that's why he wasn't feeling/reacting to all those gunshot wounds, but that doesn't explain why Maeve was experiencing pain when there was no one watching for her to experience it for.

Don't computers perform automated tasks even when no human is at the controls?

8 minutes ago, withanaich said:

That's a good point ... and sloppy writing, is all I can say. Either the hosts feel very real, non-performative "physical discomfort" (which would account for Maeve clutching her stomach in the bar when [as far as she knew] she had no audience) or are at the very least programmed to mimic it (which you would probably want to do, so your guests who are pretending to have a shootout with robots get the appropriate reaction) ... OR, they are machines that don't "feel" anything at all, and can walk around cackling while they're full of bullet holes. Now we can assume that the guy full of holes was just glitching and that's why he wasn't feeling/reacting to all those gunshot wounds, but that doesn't explain why Maeve was experiencing pain when there was no one watching for her to experience it for.

But they were also having long conversations with no one around to experience it for. Like Dolores and her dad every morning, or her and Teddy's conversation about the Judas Steer. It's the same thing: as long as they are "active" and not glitching, they are supposed to mimic human behaviour and feelings on every level. Otherwise Dolores would never be able to stop herself from hyperventilating and having a panic attack of sorts at a moment's notice, just because someone says she should shut off her emotions. It's probably the same with pain.

I could also imagine that Maeve would need a lot more sensory receptors than the milk guy because of her role (if a guest touches the skin on her stomach she is supposed to react), whereas the milk guy is just there to shoot and be shot and doesn't need the same amount of sensors and maybe his hardware wasn't updated as much because of it.

16 minutes ago, Netfoot said:

Who says they are?  Perhaps they feel no pain at all, and only pretend to feel it, because that's what would be expected of the human they are pretending to be.

They are constructed by humans to fill various roles as required.  So as to fill those roles, they have realistic pretend-skin, realistic pretend-hair, realistic pretend-eyes.  They have realistic pretend-speech/voice, probably realistic pretend-B.O. and all.  The next step towards realism is adding realistic pretend-emotional responses, which would appear to the casual observer as if the bot actually experienced the emotions.

That's what makes the  show so interesting to me. Do they feel pain or emotion? Is it just programmed, and if it is, does that make a difference? I'm hoping the show will explore those ideas.

44 minutes ago, izabella said:

I still don't believe they actually feel emotions, either.  It is evidenced by how they immediately drop the emotions at a command from one of the center people, just like they drop the accents when commanded to do so.

Technically, Dolores didn't drop her emotions on command, only her outward expression of them. She was still terrified, and said so. She just said it in a calm voice.

 

1 minute ago, Gobi said:

That's what makes the  show so interesting to me. Do they feel pain or emotion? Is it just programmed, and if it is, does that make a difference? I'm hoping the show will explore those ideas.

The changing room host's question is a key theme for the show! "If you can’t tell, does it matter?"
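Just to make that distinction concrete, here's a toy sketch (entirely my own invention, nothing from the show; the class, the command string, and the 0.5 threshold are all made up) where a command suppresses the outward expression of an emotion while the internal state stays put:

```python
# Hypothetical: a command mutes the *expression* channel, not the emotion itself.
class Host:
    def __init__(self):
        self.fear = 0.0        # internal state: persists regardless of commands
        self.expressive = True  # whether the state is shown outwardly

    def frighten(self, amount):
        """Raise the internal fear level (capped at 1.0)."""
        self.fear = min(1.0, self.fear + amount)

    def command(self, order):
        """A technician's voice command only toggles the outward affect."""
        if order == "limit your emotional affect":
            self.expressive = False

    def describe(self):
        feeling = "I'm terrified" if self.fear > 0.5 else "I'm fine"
        tone = "trembling" if (self.expressive and self.fear > 0.5) else "calm"
        return f"{feeling} (said in a {tone} voice)"

d = Host()
d.frighten(0.9)
d.command("limit your emotional affect")
print(d.describe())  # I'm terrified (said in a calm voice)
```

The point being: `fear` is still 0.9 either way; only the delivery changes, which is exactly what Dolores did.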

41 minutes ago, Broderbits said:

Don't computers perform automated tasks even when no human is at the controls?

Your computer might run a self-diagnostic, or do a scan of your files while you aren't using it, but these tasks serve a purpose: they improve the computer's functionality.

This could explain why the hosts talk to each other and carry out narratives when nobody else is around: it's practice, and it improves their function.

But it doesn't explain why they feel things when no one else is around. How does Maeve experiencing pain that she tries to ignore and push through improve her functionality? If anything it detracts from it; that's why she went in for repairs to cure her MRSA infection. It serves no purpose for her to be in pain when she isn't supposed to be, and she's not practicing being in pain, as evidenced by her shock and confusion. She simply is in pain.

Edited by Maximum Taco

My point is they "feel" things because they've been pre-programmed to do so. Automated tasks can be set to perform at certain times or when certain conditions are met. If The Creator wants his works of art to seem utterly real, he would definitely think of this. What might be random to us may just be coding.
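For what it's worth, that kind of condition-triggered automation is trivial to sketch. Everything below (the trigger table, the event names, the response strings) is hypothetical, purely to illustrate "perform when certain conditions are met":

```python
# Hypothetical: a host "feels" whatever its trigger table says to feel.
# Each entry pairs a condition over (event, sensors) with a scripted response.
def host_react(event, sensors):
    """Return the first scripted response whose trigger condition is met."""
    triggers = [
        (lambda e, s: e == "gunshot" and s["damage"] > 0, "express pain"),
        (lambda e, s: e == "guest_flirts", "express desire"),
        (lambda e, s: s["infection"] > 0.5, "clutch stomach"),
    ]
    for condition, response in triggers:
        if condition(event, sensors):
            return response
    return "continue narrative"  # default loop when nothing fires

print(host_react("gunshot", {"damage": 3, "infection": 0.0}))  # express pain
print(host_react("idle", {"damage": 0, "infection": 0.9}))     # clutch stomach
```

Note the last case: the "clutch stomach" response fires off the infection sensor alone, with no guest event at all, which would look exactly like "random" pain to an observer.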

17 minutes ago, tiramisue said:

But why should they practice talking and not feeling, when feeling (emotions as well as pain) is probably the more complicated function (and should ideally modify their behaviour and their conversations).

They do practice feeling but in the appropriate scenario.

Dolores feels love when she interacts with Teddy, that's her practice scenario for a romantic love, should a guest ever choose to pursue that with her. She doesn't practice feeling romantic love when someone holds a gun to her face, or when she takes a bite of oatmeal, or when she is painting horses, that wouldn't provide the appropriate response for when she's interacting with a guest. 

Maeve shouldn't be feeling pain (or practicing feeling it) when she is trying to put the moves on someone. It's not appropriate to the scenario.

6 minutes ago, Broderbits said:

My point is they "feel" things because they've been pre-programmed to do so. Automated tasks can be set to perform at certain times or when certain conditions are met. If The Creator wants his works of art to seem utterly real, he would definitely think of this. What might be random to us may just be coding.

"when certain conditions are met", depending on complexity, can get pretty close to being real AI.  Dolores had a very human-seeming response (panic, going for help) to her dad being sick. That was 100% not scripted in the way that going to town and dropping a can while loading her saddlebags was scripted. It's more like it's "coding" the way humans have responses based partly on evolved instinct and partly on learned behavior. And about learning, we generally don't make a big distinction between whether that learning arrived in our heads via reading, lived experience, listening to others talk... so how different is it when the learning arrives via someone else transferring it to a host via "programming"?

3 minutes ago, Maximum Taco said:

They do practice feeling but in the appropriate scenario.

Dolores feels love when she interacts with Teddy, that's her practice scenario for a romantic love, should a guest ever choose to pursue that with her. She doesn't practice feeling romantic love when someone holds a gun to her face, or when she takes a bite of oatmeal, or when she is painting horses, that wouldn't provide the appropriate response for when she's interacting with a guest. 

Maeve shouldn't be feeling pain (or practicing feeling it) when she is trying to put the moves on someone. It's not appropriate to the scenario.

Sure, but the sensors are active whenever she's in the park. It wouldn't make any sense if they shut down some sensors for some scenarios, since it's realistic for all of them to be active all of the time so she can react to everything as it happens. Like say a treasure hunter being stabbed in the hand without any warning (yes, there were guests present, but this is the kind of scenario that feeling when no one is present could be practice for).

19 minutes ago, Broderbits said:

they "feel" things because they've been pre-programmed to do so.

Do they actually feel?  You punch a human in the nose and they exhibit distress, because they feel pain.  Punch a host in the nose and they, too, exhibit distress, because they are pretending to be human, and respond as a human would.  But do they actually feel any pain at all?  Or does the punch simply trigger the expected response?

9 minutes ago, arc said:

"when certain conditions are met", depending on complexity, can get pretty close to being real AI.

Hmmm.  AI in this context means "Artificial Intelligence".  So, "real AI" means "real artificial intelligence."  Intelligence can not be real and artificial at the same time.  Not according to my understanding of the words.  So, perhaps AI is the wrong term to use in this debate?  

We need two words, one indicating that something was created entirely by nature, and the other indicating that mankind took a major role in its creation, like perhaps Natural Intelligence and Synthetic Intelligence.  Both of these indicating genuine intelligence, albeit of different origins, whereas Artificial Intelligence can imply something which appears to be intelligent, but isn't really.  So a human would have Natural Intelligence, whereas a bot would exhibit Artificial Intelligence because it was programmed to pretend to be intelligent.  Should that bot transcend, thereby becoming genuinely intelligent, it would be Synthetic Intelligence.

Or something.

Gee, nomenclature is almost as confusing as for a time-travel show!

33 minutes ago, Maximum Taco said:

 

Maeve shouldn't be feeling pain (or practicing feeling it) when she is trying to put the moves on someone. It's not appropriate to the scenario.

To quote the Terminator: "I am aware of injuries. The data could be described as pain." Since the hosts are susceptible to infections, it makes sense that they would react to them even if no one was around, as it would be important for any infections or injuries to be noted as soon as possible. The park would want to keep infections from spreading, or injuries from affecting behavior, to ensure smooth running of operations. Since the hosts already feel pain, it is logical to use that function whenever appropriate.
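In code terms, "pain as damage data" is easy to picture. A purely hypothetical sketch (the function name, sensor format, and thresholds are my inventions, not anything from the show): aggregate damage crosses a threshold, which both triggers a pain response and flags the unit for the repair queue:

```python
# Hypothetical: "pain" is just aggregate damage telemetry crossing a threshold,
# which doubles as a flag for the maintenance queue.
def assess(sensor_readings, pain_threshold=0.3):
    """Summarize per-body-part damage readings (0.0 = intact, 1.0 = destroyed)."""
    damage = sum(sensor_readings.values()) / max(len(sensor_readings), 1)
    return {
        "pain": damage > pain_threshold,   # drives the behavioral response
        "needs_repair": damage > 0.1,      # flags the host for maintenance
        "damage_level": round(damage, 2),
    }

status = assess({"abdomen": 0.9, "left_arm": 0.0, "torso": 0.1})
print(status)  # {'pain': True, 'needs_repair': True, 'damage_level': 0.33}
```

On this reading, Maeve clutching her stomach with no audience is just the telemetry doing its job, exactly as the Terminator line suggests.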

Edited by Gobi
Spelling
2 hours ago, Gobi said:

But if the hosts feel those emotions and pains, just as  humans would,  is it any less real because it was programmed?

The difference is also that they can and do shut it off sometimes. I'm pretty sure they didn't use any anaesthetic when Maeve was on the operating table (because why would they?); they thought sleep mode was enough, and as it was shown, it was. She didn't wake up from the pain one would feel when someone is poking around in an open wound, but from her nightmare and from having trained herself on the "wake up on 3" command. So her feelings of pain and emotions aren't exactly like they are with humans, but only work when the park administration says they should. Which to me means the pre-glitch versions are non-sentient beings just programmed to react, and therefore "not real" in the sense that humans or animals are.

Though that doesn't make interactions with them not real and there could be a whole other debate if it's okay to shoot and rape robots, just like there is a debate if it's okay to spend time killing in video games. Probably even more so since it mimics reality on a whole different level, but that to me is more "user-specific". 

But all of that changes anyway after the update, with them slowly becoming more than their code. Which brings in a whole other debate: if someone thinks they are non-sentient, but they are starting to become sentient without the guests or administrators noticing, how much can you blame the humans for continuing to treat the robots the way they did before?

Edited by tiramisue
added a thought
1 hour ago, Netfoot said:

Do they actually feel?  You punch a human in the nose and they exhibit distress, because they feel pain.  Punch a host in the nose and they, too, exhibit distress, because they are pretending to be human, and respond as a human would.  But do they actually feel any pain at all?  Or does the punch simply trigger the expected response?

1 hour ago, arc said:

I believe that their reactions are either pre-programming or learned responses. I don't believe they are (right now) human and will wait for the show to prove otherwise. The closest experience I have with AI is my Echo. When I ask Alexa if she can feel pain, she doesn't know how to answer; but when I ask her favorite color, she does respond with an answer that would be acceptable if it came from a human. If the Echo looked as realistic as the Westworld Hosts would I think she was human? No, because it's still not flesh and blood. R2D2 is adorable, but it's still a robot.

Here's a theory about The Man in Black: what if he is an experiment by The Creator, a Host programmed to believe he is a Guest and made to be as impervious to the other Hosts as a real human would be. That might account for his insanity and search for answers, what he refers to as a "deeper game".

1 hour ago, Netfoot said:

Hmmm.  AI in this context means "Artificial Intelligence".  So, "real AI" means "real artificial intelligence."  Intelligence can not be real and artificial at the same time.  Not according to my understanding of the words.  So, perhaps AI is the wrong term to use in this debate?  

The opposite of "real" is "fake", not "manmade". The opposite of "artificial" is "natural", not "real". Something can absolutely be real despite being man-made. My phone is real. It's not natural, but it's real. A mimic octopus is a real (and natural) octopus but it's not a real poisonous lionfish... but even as a fake lionfish, it's not artificial.

1 minute ago, Broderbits said:

I believe that their reactions are either pre-programming or learned responses. I don't believe they are (right now) human and will wait for the show to prove otherwise.

Ah. To me, based on what we've seen, the hosts are clearly sentient already. The guests largely or entirely don't know, and the people who run the park mostly don't know due to willful blindness, because it would be unbearable to think they were subjecting sentient beings to rape, torture, and death for the amusement of guests. (Ford though, he probably does know.)


The question is not so much I think whether the pain they feel when they are attacked is "real" because it was programmed. Even in people it seems to be sort of programmed as well as being rather crassly biological, what with nerves being required most of the time.

The question I think is not even whether it is "real" because there's not necessarily an idealized consciousness aware of it. Personally I spend a great deal of time unconscious but I flatter myself I'm no less real because of it. How an unthinking habitual action is fundamentally different from something programmed into a robot/android isn't clear to me either. 

But I do think there is a significant question about what they want. I will want to eat, drink and sleep on a regular basis. And a multitude of other wants will arise in the day, some of them quite long term, like "put up with the boss until I can retire." Until the show addresses these kinds of wants (which are fundamental to characterization in drama too, I think), I don't think you can say it's saying much at all on a deeper level. If they just want to stop being tortured, they can accomplish that just by being shut off.

8 minutes ago, sjohnson said:

But I do think there is a significant question about what they want. I will want to eat, drink and sleep on a regular basis. And a multitude of other wants will arise in the day, some of them quite long term, like "put up with the boss until I can retire." Until the show addresses these kinds of wants (which are fundamental to characterization in drama too, I think), I don't think you can say it's saying much at all on a deeper level. If they just want to stop being tortured, they can accomplish that just by being shut off.

like this scene?


^^^When Ford asked Abernathy what he wanted to say, he says he wants revenge, and also that he doesn't know what revenge he wants. He also said meeting his maker was his itinerary, not what he wanted. And before that he wanted to warn Dolores, although it's not quite clear of what. And he does all this in response to questioning. This isn't quite there yet.

In the older movies where the computer communicates through a telephone or teletype or vocoder, the ones that were becoming sapient usually illustrated it by initiating the communication. When Westworld's robots/androids start initiating actions, what they do will tell us what they want. If you're right, we already know: it's to kill humans, horribly, and this is just a drawn-out zombie movie in disguise.


Not to overdo the videogame comparisons, but since the creators have spoken in those terms:

In videogames there's a so-called 'speedrunning' scene. Speedrunners do not simply run fast through a game, but break it, exploit glitches to get to their goals, reach the areas they want to go to. 

This is done through extensive trial and error and poking at the walls of your virtual world. No stone left unturned, no interaction with in-game characters or objects unexplored. A popular game for speedrunning is the 1996 joint Super Mario 64. 20 years later we are still seeing awe-inspiring new discoveries of how to manipulate the game and break its boundaries. A popular video from this year shows how one speedrunner is accessing "parallel worlds" within the game to achieve his goals. 

My point is, if the MiB is getting secrets from some girl by killing her family, it's not necessarily a scripted mission or a natural reaction for her character to seeing her parents killed, but something the MiB has discovered triggers a reaction, a glitch. Discovered by poking around in the world for 30 years as an advanced gamer of its systems.


I never heard of speedrunning, but I think (so far) that that's what's going on. Still not sure if the MIB is a robot that the creators are letting run amok to see how far he can get, or if it's Richard Benjamin's character and he was legally given carte blanche as recompense for the events of the movie, but that is probably the best explanation. There is obviously a "deeper level" to the game (or why else did someone put the maze/circuit board inside the robots' scalps?) and the MIB found out about it, and is obsessed with getting in. After 30 years, there's only so much you can do in regular Westworld. Even if there are other parks, at some point, you've gone on every regular romp, and unchecked rape and slaughter probably gets a bit boring eventually for even the most depraved individual. Now he's getting more creative with it, experimenting with j-u-u-u-s-t how much "blood" you can let before a host "dies," figuring out exactly what sequence of actions to take to get the robots to break character and reveal clues, and looking for this second level of the game. 


TMIB is not playing a game.  He hates the game and everyone in it.  He is out to destroy the game for reasons yet unrevealed.  His quest is to find the innermost heart of the game because that would be the control nexus or operational command center or what ever you want to call it.  He seeks it so he can destroy it.  

Link to comment
7 hours ago, abcfsk said:

Not to overdo the videogame comparisons, but since the creators have spoken in those terms:

In videogames there's a so-called 'speedrunning' scene. Speedrunners do not simply run fast through a game, but break it, exploit glitches to get to their goals, reach the areas they want to go to. 

This is done through extensive trial and error and poking at the walls of your virtual world. No stone left unturned, no interaction with in-game characters or objects unexplored. A popular game for speedrunning is the 1996 joint Super Mario 64. 20 years later we are still seeing awe-inspiring new discoveries of how to manipulate the game and break its boundaries. A popular video from this year shows how one speedrunner is accessing "parallel worlds" within the game to achieve his goals. 

My point is, if the MiB is getting secrets from some girl by killing her family, it's not necessarily a scripted mission or her character's natural reaction to seeing her parents killed, but something the MiB has discovered triggers a reaction, a glitch. Discovered by poking around in the world for 30 years as an advanced gamer of its systems. 

I still think MIB is a tester / Quality Control person for Westworld.  Testers explore every part of the game trying to find glitches, even the hardest levels that are hardly explored by casual gamers.  At least to me, he was just following some steps/scripts to get to this higher level and did not necessarily enjoy what he was doing.  So, what he did to Dolores that night did not necessarily have anything to do with the "finding the maze" quest.  He could just be testing the "2 bandits come to Dolores' house" script/scenario to see what happened when a guest killed everyone but Dolores (i.e., an extreme case).  

When he met Dolores in town the next morning, that was the beginning of a new test for a different script.  Why would he need to start from the town the next morning?  He had full access to the park; he could have just grabbed the drunk right after he was done with Dolores.  The answer, IMHO, is that he was following a test script for a new quest, i.e., step 1: start in town like any newly arrived guest.

We still do not know if he is human or WHO hired him for the job.  Head office might have hired him without telling the staff at Westworld.  Kind of like when the main HQ sends a secret shopper to an individual grocery store to check on the quality of goods and service at that branch without the local Store Manager knowing about it.

Edited by DarkRaichu
  • Love 1
Link to comment
23 minutes ago, DarkRaichu said:

I still think MIB is a tester / Quality Control person for Westworld.  Testers explore every part of the game trying to find glitches, even the hardest levels that are hardly explored by casual gamers.  At least to me, he was just following some steps/scripts to get to this higher level and did not necessarily enjoy what he was doing.  So, what he did to Dolores that night did not necessarily have anything to do with the "finding the maze" quest.  He could just be testing the "2 bandits come to Dolores' house" script/scenario to see what happened when a guest killed everyone but Dolores (i.e., an extreme case).  

When he met Dolores in town the next morning, that was the beginning of a new test for a different script.  Why would he need to start from the town the next morning?  He had full access to the park; he could have just grabbed the drunk right after he was done with Dolores.  The answer, IMHO, is that he was following a test script for a new quest, i.e., step 1: start in town like any newly arrived guest.

We still do not know if he is human or WHO hired him for the job.  Head office might have hired him without telling the staff at Westworld.  Kind of like when the main HQ sends a secret shopper to an individual grocery store to check on the quality of goods and service at that store without the Branch Manager knowing about it.

That's an intriguing theory; I like it. I also like the one where he's a gamer who wants to experience all levels and possible experiences. You guys bring so many great ideas that I sure hope he's not just a sadistic bastard, because that would be most disappointing. But I'm hopeful; this seems like a clever series and I'm eagerly awaiting the next episode. The silver lining of not binge-watching is that I can explore each episode more thanks to you all and somehow get deeper into the show, if that makes sense. 

Link to comment

The MIB as a sort of bug tester for Westworld is an intriguing scenario.

After all if you want to keep people from breaking your game, you need to find out how they would break it. And he is doing things that a bug tester would do, pushing at the boundaries of what a normal player would do, looking for holes and cracks in the game to exploit. 

  • Love 2
Link to comment

I'm no biologist, but if the hosts have vision -- which requires receptors between eye and brain -- isn't it possible that they have pain receptors too?  If they can see and differentiate between light and dark, can they also differentiate between heat and cold?  Soft and hard? 

Or does their programming tell them how to react to different stimuli? 

Link to comment
3 hours ago, Netfoot said:

TMIB is not playing a game.  He hates the game and everyone in it.  He is out to destroy the game for reasons yet unrevealed.  His quest is to find the innermost heart of the game, because that would be the control nexus or operational command center or whatever you want to call it.  He seeks it so he can destroy it.  

The operations center is where the train departs from. If that was his goal, he should just have skipped out of the changing room and stayed in the hub.

  • Love 1
Link to comment
14 minutes ago, AuntiePam said:

I'm no biologist, but if the hosts have vision -- which requires receptors between eye and brain -- isn't it possible that they have pain receptors too?  If they can see and differentiate between light and dark, can they also differentiate between heat and cold?  Soft and hard? 

Or does their programming tell them how to react to different stimuli? 

I'd assume that the creators instilled features that they considered necessary.  Eyeballs capture vision-data and pass this to the processing core, or an ancillary sub-processor.  I can't see bots like ours performing without the power of sight, so it would seem natural for the creators to imbue them with that power.  

Why would they install the ability to experience pain?  Humans experience pain as a warning system: to protect against accidental damage and to indicate that action is needed when actual damage occurs.  When our outer skin is breached (for example) it signals our brain with a burst of pain.  It hurts, so we take steps to eliminate the injuring phenomenon, and tend our wounds.

Some computers are equipped with anti-tamper switches.  When their outer shell is breached, those switches signal the CPU with a logic voltage.  But does it hurt the computer?

A host who gets shot in the back will no doubt be informed of the injury by its network of internal sensors and signal paths.  It will scream, fall down, moan and groan and writhe around...  but is it feeling pain?  Or playing a game?  Acting a role?  I believe that hosts, as designed, do not feel pain in the same sense that we do.  They are designed to simulate the usual human responses, but as they are not real humans, there is no reason for their wounds to generate pain, as opposed to a simple signal that tells their "brain" that there is damage that needs to be responded to in a scenario-appropriate way.
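To put that distinction in programmer's terms: a damage signal routed to a scripted, scenario-appropriate response needs nothing that actually hurts. Here's a toy sketch in Python; the function, locations, and severity thresholds are entirely made up for illustration, since the show tells us nothing about how hosts are really built:

```python
# Toy model of a damage signal producing a scripted "pain" response.
# Nothing here experiences anything -- it's just data in, behavior out.

def on_damage(location, severity):
    """Map a damage report (0.0 to 1.0) to a scenario-appropriate reaction."""
    if severity >= 0.8:
        return f"collapse, clutching {location}"
    if severity >= 0.3:
        return f"cry out and favor {location}"
    return "wince and carry on"

# The host "reacts to pain" convincingly, with nothing felt anywhere:
print(on_damage("shoulder", 0.9))  # collapse, clutching shoulder
print(on_damage("hand", 0.1))      # wince and carry on
```

Whether adding memory and self-modification to a system like this eventually produces something that genuinely hurts is, of course, exactly the question the show is asking.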

Of course, it appears that the hosts are developing beyond their original design parameters.  (Their creators, for the most part, don't seem to realize this.)  If they are indeed mutating away from their original design, who can say what characteristics the mutated host will exhibit?  I suspect that it will include genuine anger, genuine pain, and so forth.

23 minutes ago, arc said:

The operations center is where the train departs from.

Perhaps he doesn't know this.

  • Love 3
Link to comment
On 10/13/2016 at 9:04 AM, Enigma X said:

I probably should write this in the first episode thread, but I think that TMIB wanted a frantic Dolores in hopes that she would divulge information about the maze. Do we really know he raped her or just assumed it?

Wouldn't raping her make her frantic? He murdered a young girl's mother (Lawrence's family) in front of her and presumably knifed Maeve and her child. If he wants to provoke the robots to have extreme emotions, why would he stop at rape?

Link to comment
5 minutes ago, numbnut said:

Wouldn't raping her make her frantic? He murdered a young girl's mother (Lawrence's family) in front of her and presumably knifed Maeve and her child. If he wants to provoke the robots to have extreme emotions, why would he stop at rape?

I am not saying that he would stop at it but we don't know if he actually did that.

I think many people in these first few episode threads have been presenting a lot of speculation as fact, when we really don't know for sure that those things happened yet.

Edited by Enigma X
  • Love 2
Link to comment
56 minutes ago, Netfoot said:

A host who gets shot in the back will no doubt be informed of the injury by it's network of internal sensors and signal paths.  It will scream, fall down, moan and groan and writhe around...  but is it feeling pain?  Or playing a game?  Acting a role?  I believe that hosts as designed, do not feel pain in the same sense that we do.  They are designed to simulate the usual human responses, but as they are not real humans, there is no reason for their wounds to generate pain, as opposed to a simple signal that tells their "brain" that there is damage that needs to be responded to in a scenario-appropriate way.

My problem with this line of thought is that it's a pretty short line from there to thinking that other people don't really feel pain. "They're acting." "It's an appropriate response to a stimulus."

  • Love 3
Link to comment

I love the Secret Shopper analogy.  lol  Takes me back to the old retail days.  Now I'm thinking about how he mentions to each of the robots how well he knows them.  Perhaps in addition to testing out the boundaries of the game, he is monitoring the individual hosts for anomalies.   There is a snippet in the previews where he

Spoiler

is surprised by something Teddy does and says, "It's like I don't even know you."  And Teddy responds, "You don't know me at all."

Even so, I think he, like all our characters, has his own motivations beyond what he's "supposed to be" doing.

  • Love 1
Link to comment
