Everything posted by The Companion

  1. I think Delos believes it is close (which we know is not as true as one would hope). I do think this level of knowledge could allow for some pretty specific targeting. Corporate espionage, for example, by bringing in an applicant for a position that you know will particularly appeal to one of the guests. Creating a sleeper host as a potential partner or spouse for an influential guest. Creating a shadow/electronic version of the person to try to figure out how they will react to particular things (and tailor an approach based on those tests).
  2. I guess what Dolores became was more than Teddy could bear. Sorry, not sorry.
  3. To clarify, I don't think the subject number means they have been replaced. I think it is the designation within The Forge. I am admittedly speculating based on the screen, which gives a subject number, personality type and other data gathered about the particular person. That indicates to me that it is basically the file number within The Forge. I don't know that Logan was actually ever "forged." William didn't propose collecting the data until after he visited with Logan. His info wouldn't have been gathered and backed up unless he returned to Westworld. Possible but not probable based on his state at the subsequent party.
  4. I don't think the Subject Number 002 thing really tells us much. While others had been in the park, he might have been the second subject because they weren't tracking people and trying to recreate them until Delos took over. So his father-in-law was Subject Number 001 because there was an intent to re-create him initially. William likely would have set himself up next, to allow himself to be recreated down the road at some point. He was the second person to get the full scan treatment, because he and his father-in-law were the ones who started that program.
  5. Technically, MiB original flavor could also still be alive and hidden. I think the story is better if he has just lost touch with reality, but they definitely left the possibility open that he is no longer human. I agree (though I also agree with the sentiment that he was thinking of all the hosts as his children and Maeve as a particular favorite). From the scenes we saw, in her loop Maeve was happy, confident and satisfied with the life she had carved out for herself. She seemed to be subject to a lot less abuse than Dolores, from the glimpses we got.
  6. I think it is a testament to Angela Sarafyan that she hasn't said anything all season and I still want to take her somewhere quiet, give her a cup of tea and let her just be for a minute. Poor Clem. She looks so broken. I really hope she gets something awesome next episode or next season. Perhaps it is a different answer to the question. People can't tell, but at the heart of it William feels nothing for his family and delights in wreaking havoc. Perhaps that does matter. It certainly mattered to his family. We already saw Teddy's body in a secondary location, so I am holding out hope. Maeve is the best and she had better make it in some form. I was a little confused, because why kill yourself? Does she think that she won't be able to convince others? That doesn't keep her from walking away, getting a divorce and threatening him with exposure if he doesn't give her what she needs to survive. Some of what she did felt premeditated. She obviously watched him put the card up and pretended to be sleeping. Were we supposed to assume she planned to kill herself and this just gave her some closure? I thought Teddy was re-asserting himself. He said he remembered everything later, which would indicate he woke up. If that is the case, he might be able to fight programming, including the changes from Dolores. I agree. I thought that was where it was going (and it would have been consistent with what we know he feels is his real character). I thought the water he gave her was going to be drugged. I guess it's fine either way . . . as long as Charlotte dies. I am not sure if William is going to end up being a host (and I think it was intentionally ambiguous, although we certainly saw the test stating he was human), but there is something still open about his daughter (Emily/Grace) being able to find him multiple times. We are missing some critical information there (teased but never explained in this episode).
  7. Poor Teddy. James Marsden killed that scene. I do love the connections between the hosts that seem deeper than the narrative. Speaking of amazing scenes, the scene where Bernard is trying to push Ford out of his head was awesome. Loved that his daughter knew all about him after all.
  8. The legal world is already trying to catch up with reproductive science, and I suspect you would have a substantial amount of time where things were unsettled. In fact, interestingly enough, Louisiana treats embryos as "juridical persons" while other states treat them similarly to tissue or organs. The surrogacy laws are all over the place, as are laws on donor eggs and sperm. It is a really interesting exercise, academically, to think about how you would want to set up the replicants and hosts. I think you would need to give them basic personhood status, because you need them to have the ability to sue or be sued. Even if they are bought and sold like property (something that is possible with corporations), you can't hold the owner legally liable for something he/she can't control. That would make your options the corporation that programmed the bot (which is going to be something Delos fights against, and which is probably impractical with something that evolves and changes over time) or the host/replicant itself. That being said, you might then end up with the equivalent of a bunch of shell companies with no assets running around to limit liability. You could set a minimum wage for them, but you might want it to be lower. Full personhood status would open a can of worms with regard to basic human rights. Depending on how you see the replicants and/or hosts, that may or may not be something that works from a policy perspective. The answer may vary depending on whether or not they are considered a continuation of the person. Ok, so maybe just an interesting exercise academically for me.
  9. I am a lawyer, so of course I go to the legal nerd stuff, but it would present an interesting legal problem. If "immortality" were achieved for certain individuals, how do you handle successions/estates? Is someone who is brought back after death considered dead and their estate considered settled? If a bot replaces someone hurt or killed in an accident, does that affect damages in a court of law? Depending on the cost of the bot, it might be more beneficial to replace them than pay lifelong medical expenses for a plaintiff. Yeah, I don't think the initial separation was particularly wrong. I think those groups would definitely exist, I just also think there would be other considerations at play. What I do find interesting on the host side (and possibly on the replicant side) is how much deviation would be considered permissible. If you wanted hosts to work a particularly dangerous job or to do something that might require special equipment in humans, for example, a host might be cheaper to use as a deep-sea diver if you could eliminate the need for oxygen. There is also the question of personality duplication. We have already seen it somewhat in Samurai World, but you might theoretically have types for mass production in an industry. You also might give them different skills for their purposes. A nannyhost might be faster, the better to protect kids, and have some sort of basic medical package built in. That being said, you might also choose to have greater resilience for a replicant. You wouldn't want to pay all this money just to have a replicant die again in a few months because of an accident. Would it be considered permissible to make them less likely to die in an emergency situation? Would people with enough money consider replicants disposable enough to take greater risks? If they could eat and drink and smoke (which I think you would want and which we did see Delosbot doing), would you remove any harmful effects? I also wonder how much the biology is actually compatible with human biology. Could you create a surrogacy bot with the ability to constantly monitor the baby? Grow organs compatible with a human? That opens up a whole different can of worms, ethically and morally.
  10. I agree. I was really thinking as far as treatment by third parties and designation of legal status. Essentially, I can imagine treating the Delosbot-style bots as continuations of the prior person. They would operate under the same Social Security number, continue in the same profession, etc. Arnold and the hosts are "unique creations." Ford seemed to think this was a distinction worth noting. I do think that, as a legal matter, they would be treated as new people if they were granted that status at all (I think that assumes someone takes their consciousness seriously and doesn't treat it as a programming glitch). I can imagine, as well, that they might be treated similarly to "juridical people" like a corporation if it is presumed they do not get full personhood status (allowing them to sue and be sued, enter into contracts, etc.), but they would most likely be treated as property until their species status is recognized. Even if that were to occur, I suspect the Delosbot-style bots would still get the continuation-of-personality treatment. I think there are definitely plenty of people who would jump at the chance for immortality, in any form that was offered. I can also imagine people who are severely limited in their day-to-day lives jumping on it. If you had the chance to be able-bodied again after a devastating accident or if you were suffering from a degenerative disease, for example. Certainly not everyone would, but there are those who would jump at the chance. There would also be great minds that we might, as a society, hope to preserve. How many more works or theories could the scientific and artistic greats have produced had they been given the chance? Then there would be the vanity. You could be a younger, hotter version of yourself. That is going to appeal to some people. I agree. I think this is actually what they are getting wrong. What makes the hosts more human is stepping outside of their programming.
  11. I think that is true, but I would also maybe hone it a bit further. I think there are some people who might consider our consciousness like a soul, even if they are atheists (and they might not believe the actual consciousness is the same as even the most faithful recreation), and others who are religious who might believe that this development was a gift from God and could in some way be used to either bring back those who were taken too soon or extend life for those who were dying. I really love the treatment of this in Old Man's War, and I think a lot of really good work has been done on the concept because it is truly one of the great unknowns. Come to think of it, I also love the treatment in the Bobiverse series, which sort of comes to a different conclusion than Westworld is presenting without a purely theistic view (but I don't want to go into it because spoilers). Certainly, Ford sees the brain as a computer to be decoded. Others would probably see things the same way. I think there are plenty of people who would believe the Company if it said your consciousness was actually transferred into a robot. I think there are two different concepts: the Hosts (they are often treated as something new or different, a new creation) and the Delosbot-style bots (which are being somewhat considered as a replacement for humans). I do think some interesting things could be considered when you take away the need for an organic body. I do think the premise of "Well, if you can't tell, does it matter?" is definitely being explored in detail, and the answer seems to be changing for the characters.
  12. I found this odd during the episode. Given how tight most of the show is, I sort of expected some confirmation that Arnold didn't know about them. After all, when we saw Arnold working with the hosts prior to that point, he was still trying to get them to behave naturally. The Natives we saw seemed to be much more natural/closer to the final product. I wondered if they were secreted away by Ford (though it does seem less likely given that Akecheta was definitely present for the presentation, and therefore would have likely interacted with Arnold). I can't imagine Arnold not wondering what happened to that host. Ford did say: "I built you to be curious," which could imply that he built the native tribe (as he created himself without anyone's knowledge, and later Bernard), but that is definitely not a confirmation. I suppose the other explanation is that he never intended for Dolores to kill every host. Instead, she just had to do enough damage to destroy the funding, and therefore any chance that the park would open. Did they actually say she was killing all the hosts, or did we just assume that? I honestly don't remember.
  13. I mean, that's fair, but why watch it then? I admittedly love this show. I am a nerdy puzzle and sci-fi type, so that is probably not surprising. I don't like traditional tear-jerking dramas (this is probably as close as I get), for example. We like what we like. This episode was one of my favorites, and I still can't quite put my finger on why. There is something really impressive about writing an almost stand-alone story with so much heart, and making it fit into the season. It answered some questions, but the real meat of this episode was an in-depth look at some of the quiet, hidden tragedy and the connections the hosts make when the guests aren't looking. It was about the unintentional and casual cruelty of the humans. It also established another measure of consciousness. There has been a lot of talk this season about what makes us human. What makes us us. There have been several different answers, but I think loving another is a valid answer (and was the answer in this case). The connection between Akecheta and Kohana arguably transcended their programming. The connection between Maeve and her daughter as well. It was a quiet episode in a sea of chaos, and I think this season is the better for it.
  14. Imagine the pranks DelosJim could pull on DelosDwight.
  15. Yes, the security is terrible. I think it speaks to employee apathy and a bit of arrogance. In their mind, these are just robots and they do what they are programmed for. You wouldn't expect your car to suddenly start up and head to the store. Given what we have seen of the employees, it is very possible that they do have strict guidelines for host lockdown that just aren't followed. I suspect there is a false sense of security on the part of Delos because there is a controllable method for entry and exit to the park. If you have a malfunctioning host, it isn't going to get very far. I actually really like the fact that many of the employees tend to phone it in. Humans are, after all, inherently human and they do human things. They make mistakes, they take the easiest route, they have varying levels of work ethic.
  16. Even if he hadn't been, I suspect they have an innate subconscious understanding of escalators, elevators, etc. built in to allow them to be moved from place to place. We have seen them being walked by techs multiple times (rather than moved on a gurney). It would make sense for them to be able to navigate these things subconsciously. It was a little odd that he knew to go to the bottom floor, but I actually thought it might tie to the down below myth. They reference going "down below" so it would make sense to start at the lowest part of the headquarters. Yeah, that definitely seemed like a CYA move to me. Get it done and get rid of him before anyone thinks to look too closely.
  17. I loved this episode. It packed a potent emotional punch. Between this and watching Coco, I have ugly cried at my TV way too much today. I have a lot of thoughts, but my initial reaction is that I actually loved the placement. Earlier in the season would have placed it too close to the other outlier with regard to story, and it was a nice contrast to the high-action story from last week. The love story was beautiful. Sweet and compelling and heartbreaking. I loved the connection between Akecheta and Maeve. I loved the resolution of several unanswered questions. The music was, as always, stellar. Who knew "Paint It Black" was a love song? I loved the use of "Heart-Shaped Box" as well. The reveal at the end was great. I figured out something was happening when he mentioned the promise that Maeve could not keep, but was still a bit surprised by the reveal and found the final words really touching. This show is always visually amazing, but I found it to be particularly gorgeous this week. Those sweeping views of the landscape were stunning.
  18. I suspect the original intent in making the humans so much more objectionable was the concern that people would over-identify with them. That being said, there are good humans (Elsie is pretty rad) and plenty of bad robots (note how much murdering is happening among the hosts when their programming plays out). I think Season 1 was supposed to make us root for the robots. I am not sure Season 2 is delineated in the same way at all. We aren't really rooting for a side, and even the bad characters are becoming different shades of grey (except Charlotte, honestly). It's certainly a rich area that I hope they explore moving forward. There are a lot of great treatments of the issue out there (another is how Scalzi approaches it in Old Man's War), and they could certainly go in a lot of different directions. I do think we have already seen a bit of exploration with the Westworld gang meeting their duplicates. More directly rooted in the episode: I think it is interesting that Bernard/Arnold is distinct from Delosbot (both in approach and intent). It does raise the question: if you were going to market to family members, would the Bernard/Arnold version be enough? Would people want that option or be creeped out by it? If Dolores could be used to calibrate a copy, the idea of a duplicate bot is already attainable.
  19. Meh. The allure of eternal life might be worth the risk, particularly if you were sick or dying or otherwise limited. You have a bot who says: "It's me! They took me into a room and hooked me up to a machine, and the next thing I remembered, I was waking up in my own body, but younger and healthier." Their loved ones also assure you that the bot is the same person. Same personality, same memories, same outlook. They tell you the McGuffin transfers you to another place, which is a special database perfectly tailored to you. How do you prove otherwise? It really goes back to the question of what makes us us. What are we? Is there a soul, or are we just a collection of memories and thought patterns and electrical impulses? Personally, I think we are more, and it sounds like you do too. There is something more. However, I don't think everyone sees it that way. And if one was dying or sick or vain enough, perhaps the allure of a healthy, possibly custom-designed body is enough. It's better than nothing. Don't get me wrong, I think the recap is also absolutely correct. A surviving loved one is going to have a huge temptation to move heaven and earth to bring back their loved one, and they are likely to be willing to accept a substitute if it is good enough. The other, more sinister use would be to replace someone without anyone's knowledge. It's not what they are selling, but it wouldn't be, would it? How much money could you make by controlling rich and powerful people for a few years and then having them meet with an accident before their failure to age becomes noticeable?
  20. The answer for an evil corporation, of course, is to make it a destructive process. Delosbot didn't know he wasn't the original and didn't seem to have much concern about being the same guy when he was ready to head out. You tell a person, "OK, we are going to transfer you now," and kill them. Wake up their bot and tell them they have been transferred. Nobody will be around who has a complaint.
  21. I don't necessarily disagree, but I still think there could be some assumptions in effect here. Even if they are told the robots are malfunctioning and dangerous, they still wouldn't necessarily expect tactics. They don't have problems killing the robots they believe present a threat, but I don't think they expected something like an ambush from what they see as malfunctioning equipment. They expect unthinking robots shooting at everyone, not self-aware tactical engagements. They continue to treat the hosts as equipment, and therefore they continue to underestimate the level of critical thinking the hosts are capable of. Basically, they have a lot less information to go on than we do, as the viewers, and I don't think their actions are particularly out of the realm of possibility except for talking to a hot host instead of taking her down (although that also could be the same assumption that the host is either going to be unthinkingly shooting everything that moves or sticking to its original programming). We know they eventually get to the shoot-on-sight, ask-questions-later mentality, but at the point in the timeline when the train scene happens, there is reason to believe they are not yet there. ETA: I think that is some of the progression we are getting as well. The difference between the way hosts were treated on the beach and the way they were treated in this episode indicates a change in the way the security team sees them.
  22. I am not going to argue any of these security teams are competent, for sure, but I wonder if we aren't supposed to see this in part as a continued assumption by the humans that the hosts are inherently not dangerous or autonomous. There has to be a lot of conditioning to see them as harmless, and the robot rebellion is still relatively recent. These guys are treating the hosts like malfunctioning equipment, rather than as people with autonomy and the ability to strategize/think independently. I got nothing here, though. Snerk.
  23. I agree generally, but the MiB was pretty firmly a bad guy last year and Dolores was pretty firmly good, so I won't rule it out. Nonetheless, it feels like we are heading towards a satisfying villain death for her.
  24. I think this is a possibility (as is some sort of enhanced healing ability). We know health research has advanced considerably, and there was a reference to Delos cutting particular research, which indicates the company has medical holdings, so he could even have access to something not on the market. She really does evoke a strong reaction, doesn't she? Her smirk and arrogance make her so hateable. She is unrepentant. I can't decide if we are leading to a satisfying death scene or a redemption.
  25. I was a little embarrassed by how hot I found BadTeddy. I loved when she informed Hale that begging never worked, as she had learned from her many timelines. I am not sure. She offered to put her out of her misery, and then left her to the fate she chose. I loved the second meeting. They seemed to have a mutual respect and understanding, even if they had totally different viewpoints. So many amazing things about this episode. I don't even know where to begin. Loved the big reveal at the beginning. I should have seen it coming, but I didn't. Bernard seems so vulnerable. I am assuming there was more to his directions, but you still want to protect him. Speaking of vulnerability: RIP Clem? I want her to hop back up, head out to a nice safe area and get some rest. I really enjoyed the scenes in that timeline. Dolores saying goodbye to her father and showing vulnerability, the gang fighting it out, the great use of "Welcome to Westworld" by Angela. Also really enjoyed Lawrence turning on the MiB. Being semi-decent to a few hosts this time around doesn't make up for your past plays, where you raped and murdered them. And what a menacing performance by Anthony Hopkins. He really is a master.