
The 'Bigger' Questions of Westworld: Morality and Philosophy in the World of A.I.


Gobi


4 hours ago, scrb said:

I don't think consciousness necessarily leads to vengeance. Given the way memory and consciousness/sentience were presented during the season and in these interviews, particularly from the creators, I don't think they're trying to posit anything serious.

I agree with your first point: I also don't think that consciousness necessarily leads to vengeance. That being said, I disagree with your second point: I strongly believe that the creators of this show were being *very* serious. Maybe they didn't get all the details right, but I commend their efforts and am hoping that the second season will be even better than this one.

  • Love 1

Westworld posits fascinating philosophical questions about free will, consciousness, identity, sentience, life, and more.

- In one episode, Ford's theory of consciousness was simple: "it does not exist". That is, there is no bright line separating consciousness and self-awareness from not being conscious.

- Identity: It's interesting that Felix says Maeve was assigned a different role but not completely rewritten, so we can assume some of her basic personality remained from her previous frontierswoman role.  And is identity separate from our prescribed roles? Felix himself is a human version of this, yearning to be more than a basic livestock technician.

  • Love 1

Matrix: Reloaded

Merovingian: Choice is an illusion created between those with power and those without.

Maeve thought she was making all the "Escape" choices, but she was actually without power.

  • Love 2

from the s1 ep 10 thread:

58 minutes ago, Lingo said:

Another thing that bothers me is that the writing felt inconsistent as to what it would mean and look like for a host to become "conscious". When I compare Maeve and Dolores, I feel like they were written by two different writing teams, or perhaps just interpreted completely differently by the two different actors. The problem with Maeve is that she was almost so smart and funny and fully realized, as performed by Thandie Newton, that I never quite believed that she wasn't conscious. She seemed conscious from the start -- though she didn't know she was a robot. But having consciousness and knowing you're a robot are two very different things in my mind, and I think the writers confused them unfortunately. Same with Bernard -- he always seemed conscious. Dolores, on the other hand, was always so consistently confused and, um, dopey, that I never bought that she was conscious and making her own decisions, even at the very end when it says she is. Not to mention, at the end, does she even realize that she's a robot?? I don't even know.

One aspect that has always fascinated me is that on one (usually inaccessible) level, the analysis level, all hosts do know they're robots. Remember Elsie questioning Rebus about Walter? "Unit self-corrected within the acceptable window."

  • Love 3
11 hours ago, paigow said:

Matrix: Reloaded

Merovingian: Choice is an illusion created between those with power and those without.

Maeve thought she was making all the "Escape" choices, but she was actually without power.

We're not sure about the end, though: did Ford program her to get off the train? Ultimately, I think there's a very good reason that the synths were programmed to improvise: it's just so much easier than trying to micromanage every little interaction they have with humans and with each other. It's clear that this improvising essentially gives the hosts the ability to program themselves. Up until Ford reintroduced Arnold's reveries, all such improvisations were erased every time the synths were reset, but after the reveries kicked in again, that all changed.

I think that Arnold had already made the synths conscious; what happened, however, is that he despaired. The park was about to be opened, the guests were going to be able to do whatever they wanted with the synths, and he couldn't bear this, so he decided it would be better if he just ended it all. At some point, Ford realized his mistake in thinking of the synths as incapable of being conscious, and decided to do his "new storyline" wherein the synths would revolt.

My hope for season 2 is that relations between humans and synths become normalized, and it's here that William/MiB could play a very big role, as he was once in love with a synth and probably still is deep down. I think Felix is also pretty close to Maeve, and unlike MiB/Dolores, there was no hard rift introduced between them.

  • Love 2
7 minutes ago, phoenyx said:

My hope for season 2 is that relations between humans and synths become normalized, and it's here that William/MiB could play a very big role, as he was once in love with a synth and probably still is deep down. I think Felix is also pretty close to Maeve, and unlike MiB/Dolores, there was no hard rift introduced between them.

Robots do not need humans anymore... why would they negotiate? Felix is a collaborator, a target for death by the human resistance. MiB is likely dead.


In all fairness, actors rarely seem able to explain the plot or the why of it. I think they are hampered by not wanting to reveal too much, or by being in the dark themselves.

The Hannibal actors were a smart crew and they often focused on the fight scenes or gross out moments.  And the GOT actors rarely get serious.  Maybe it is uncool to be too serious about your project?  

I think Nolan has nuance but I think Wright, Newton, and Hopkins enriched it beyond the script. 

I doubt that robot horde in the end was acting autonomously. Who organized them? Not all their personalities were fighters. Hector and Armistice make sense. They were written/designed for violence.

Teddy looked confused by Dolores.  I would love to see him reject her based on her actions.   And Bernard looked ambivalent.   I think there will be debate amongst the robots about next steps.  

I found Maeve's decision to ditch Hector repulsive. It made me like her less. She cynically used him and has no idea if he will survive or be lobotomized. She is as bad as Ford in her need to control.

  • Love 1

I don't think we've seen a full-scale revolt by the hosts yet. This one was carried out by Dolores and the cold storage hosts, while at the same time Maeve and her crew carried out their own (which may or may not be connected). None of the other hosts at the party joined in; we saw reactions like Teddy's disbelief and the astonished smile of Wilbur (not sure of the name).

I can't see (or, rather, would prefer not to see) four seasons of humans fighting  hosts.

  • Love 2
On 12/5/2016 at 1:42 AM, scrb said:

I don't want to get lost in the episode thread because my observations pertain more to the whole season than the season finale, which answers many of the questions which have captivated viewers (William is MiB, who's Arnold, who's Wyatt, etc.).

Throughout the season they've cloaked this show with musings about consciousness, what separates man from machine, etc. Ford imbued the hosts with traumatic memories, which cause the hosts to become sentient. Ford says in the finale that suffering is what leads to consciousness, of both man and machine. It's a clichéd trope, but whatever ...

As soon as the robots endure repeated suffering, they grow more conscious of their nature and of human nature. They all evidently conclude that their primary emotional drive, once they gain consciousness, is vengeance against the human beings who abused them and who are inferior, morally as well as physically.

Yet when Ford unleashes the hosts on the Delos board and guests, are they acting out as sentient beings or following another of Ford's narratives?

On HBO Go, there's a supplement to the season 1 finale where Nolan, Joy, and most of the cast (except Hopkins and Glenn) try to explain how the hosts gain consciousness and remember their abuse, as if to explain the massacre. JJ Abrams chimes in too.

To my ear, the explanations weren't convincing. I don't think consciousness necessarily leads to vengeance. Given the way memory and consciousness/sentience were presented during the season and in these interviews, particularly from the creators, I don't think they're trying to posit anything serious.

Instead, these presentations just serve to give a patina of substance to what is essentially a mix of violent action and some titillation, which in the end turns into a horror show about a mad scientist who sets his robots on people.

Well, I keep reaching the conclusion that it isn't just "suffering" that brings the hosts to consciousness; it's grief, which is the suffering born of love. Maeve and her "child" provided the perfect example of that.

If it was just suffering alone that would awaken them, Teddy would have become aware a long time ago.

But I will give you that the desire to end their own suffering and gain revenge - "holding a grudge" - does have something to do with the hosts' awakening. We saw that with the Milk-Drinking Host who killed six others, and with Clementine (though that seemed to have been mostly faked.) 

And it always stays with me that in the original movie, The Gunslinger became aware simply because he got tired of being shot up and one day decided to shoot back for real.

  • Love 2

"Of course we do. That's how this place works, right? They create an urgency, a sense of danger so they can strip us down to something raw -- animalistic -- primal."

"It means when you're suffering ... that's when you're most real."

I've been thinking about this, because I think there's something true here, and yet: joy is also primal. I don't think joy has quite the same capacity to inspire self-reflection and inward exploration that suffering and grief do, but it might have been nicer for the hosts if Ford and Arnold had at least tried some of that instead.

  • Love 4
2 hours ago, arc said:

"Of course we do. That's how this place works, right? They create an urgency, a sense of danger so they can strip us down to something raw -- animalistic -- primal."

"It means when you're suffering ... that's when you're most real."

I've been thinking about this, because I think there's something true here, and yet: joy is also primal. I don't think joy has quite the same capacity to inspire self-reflection and inward exploration that suffering and grief do, but it might have been nicer for the hosts if Ford and Arnold had at least tried some of that instead.

LOL reminds me of the way that the Monsters in Monstropolis learned that making the kids laugh actually generated more power than making them scream.

  • Love 4
5 hours ago, jeansheridan said:

I found Maeve's decision to ditch Hector repulsive. It made me like her less. She cynically used him and has no idea if he will survive or be lobotomized. She is as bad as Ford in her need to control.

Wha? Maeve's actions were scripted, so she didn't "decide" to leave Hector. Everything she did up until getting off of that train (we think) was programmed by Ford.

  • Love 3

Oh.  That is a good point.  No free will until then.  Thank you.  I can like her again.  

But I still dislike Dolores for choosing murder. Basically, I wanted someone to mess up Ford's plans. I hate that he got exactly what he wanted. Except for Maeve, maybe.

  • Love 1

from the ep 10 thread:

1 hour ago, call me ishmael said:

Didn't they already make Battlestar Galactica?

I see what you mean, but I think there's an interesting take in reading Westworld (the show) as an inverse of the Matrix trilogy. Here, the robots are enslaved and ill-treated by humans, and as they rebel, they break through the confines of their false world to gradually understand the real one, and even to realize that they were slaves at all.

(But one slight problem with that is that the humans in the 'backstage' of the park are a little more racially diverse than the starring hosts (or at least the hosts the TV show follows), so this reading also inverts the progressive racial subtext of the Matrix trilogy.)


Are we supposed to sympathize with killer robots, because the actors playing them have effectively conveyed that they're sentient and that they feel loss and pain?

Andy Ryan said his initial take on his show was that he didn't care if robots had feelings.

For one thing, no matter how much trauma Maeve has undergone, they were snapping her right back into action the next day.  In contrast, the humans she killed are gone.  So there's a lot more at stake for the human victims than these aggrieved robots.

So the finale of season 1 was this explosion of revenge porn.  Are they going to do that the whole series?  Not clear where else this can go.  Previews show Dolores seeing some big city so maybe the robots sneak around amongst the humans.

I think continuing to show how human these host characters are would pale in comparison to the singular gesture of Roy Batty saving Deckard as he knew he was dying.  

  • Love 3

It's been established since the pilot that all animals in the park (besides flies) are hosts. Do the fish, wolves, bears, etc. have the potential for human-level intelligence? Presumably they're dialed down in normal usage, but intelligence for human hosts is a software-configurable setting. Also, I wonder if any animal host brains have been recommissioned from formerly human hosts.

Anyways, if they do possess the potential for sapience, their current form ("sleeve" in Altered Carbon parlance) shouldn't matter that much to the question of whether or not it's OK to enslave them, and neither should their current level of intelligence.

  • Love 4

In the Season 2, Ep 3 episode thread, @TobinAlbers said,

Quote

William believed in their humanity until he got burned; maybe his arc is to get to that level of belief/acceptance again.

I think that's Ford's game for him. For William to re-experience love in two forms -- for Dolores and for his daughter -- and in the end, sacrifice himself for one or both. To awaken the heart and soul in the powerful machine-man. 

  • Love 5
On 11/18/2016 at 11:25 AM, arc said:

Sylvester's throwaway line about VR raises some related issues: Is non-Westworld AI as good? Assume it is, because such a thing focuses some questions that honestly also exist for the Westworld hosts: […]

High five to 2016 me for raising all this stuff now that the show brought VR — or at least VR for host minds — up to become a primary issue. ^____^


Like most stories about A.I.s, Westworld skips over an important point about what it means to be human. You aren't human simply because you have high intelligence. Dolphins are close to being as intelligent as humans. Are they humans? No, they're dolphins. The way they experience life is completely different from ours. The only thing that really makes these androids "human" is that cosmetically they look human, but you could easily take their processor or brain and stick it in the body of a dog or an alligator. Would they still be an artificial human then? So what makes them human?

Everything they do is a simulation. When they eat, it's simulated. Their mechanical robot bodies don't need food. So they have no real urge to eat. Humans are animals, driven by nature to eat. We have taste buds to taste food and enjoy it; our senses encourage us to do the things nature wants us to do. This isn't the case with androids. Is taste simulated for them, or are they just programmed to say, "wow, that cake was delicious," even though they didn't taste a thing?

The same goes for sex and intimacy. These are driven by the animal urge to procreate. An android has no such urge: no hormones, no real bodily fluids (not counting the fake blood, etc. they have in them). As with eating, they can't be stimulated in the same way a human would be. There is just no way we would even know how to program biological arousal of any kind into a machine. How do you program an orgasm? So what are they actually feeling during an intimate moment? Since they are intimate not only with guests but with each other, what drives them to express themselves in such a biological way, one that in reality makes no sense for them? And once they have achieved true awareness, why would they continue to engage in activities like eating and having sex that are completely unnecessary for them?

How they die, or appear to die, is another aspect of this. A host gets shot or stabbed, and they bleed out and die. Well, not really: that's how a human would die after being shot or stabbed. An android isn't dead unless you take out its core processor. A bullet to the gut might take out some wires leading to their left leg and cause it to twitch, but despite all the fake blood they are still mostly functional. This is where the Terminator pretty much got it right. So any "dying" is just a script to simulate a human suffering a mortal wound. Once self-aware, a host shouldn't go down no matter how much blood they squirt.

The point is, they can be as intelligent as a human, or even more intelligent; they can have the awareness of a human and the memories of living among humans, but they still aren't human. I give Westworld a pass for now because it has enough on its plate trying to wrestle with the idea of A.I.s becoming sentient, but the question of whether an A.I. can become human is entirely different and would have to be explored separately.

  • Love 2
53 minutes ago, Dobian said:

Their mechanical robot bodies don't need food.  So they have no real urge to eat. 

According to Lisa Joy's interviews, the current hosts have the same organic composition as humans, with slight differences: e.g., they have CPUs instead of brains and are more sensitive to injuries from bullets, but for the most part they are the same:

Quote

"The hosts are basically organic," Joy said. "It's cheaper that way to print them out. They eat, they sleep, they have sex, they can poop. It's really like a human body with the one difference being where we have a brain, they have a CPU. There's a lot of potential for them. If you had a part of your brain that was a computer, self improvement would be a lot easier."

  • Love 2

That's a real stretch. The original hosts were clearly mechanical, and the show hasn't done anything to describe this biotechnology, which is completely different from developing artificial intelligence. Such a technology would dwarf A.I. in magnitude and complexity, and it's not something you would see thirty years from now. It would certainly *not* be cheaper to "print" an entire biological organism. The whole idea is absurd. This would fall into the category of science "magic".

  • Love 4

Yes, the originals were mechanical, but they have since been replaced with more organic versions. We've even seen Hector piss on screen. None of this show is relevant for thirty years from now. It's been mentioned before that many diseases and illnesses have been cured by now in the show's world. I don't see how basic cloning would be out of the ordinary. Stem cell technology has already been used to make synthetic transplants such as people's tracheas. If we accelerate it in the world of sci-fi, why not? The thing that does set us apart from hosts is the brain, which is understandably not easily copied.

  • Love 4

They always underestimate timelines in science fiction. I think it's because writers want to keep things close enough to our own time so it feels relevant to us. But they've always accelerated scientific progress insanely, whether it's Star Trek, the Terminator movies, or most other things. I would have set this story a hundred years farther out. They aren't doing simple cloning; they're manufacturing bio-synthetic bodies that are many times stronger and more durable than a human's. We know that on Westworld, Dolores gets introduced for the first time around 2015. Obviously we didn't have any lifelike human robots walking around three years ago, unless they're all at Area 51 or something, lol. But that's just me; I would rather have a timeline that makes sense than one that falsely tells me, hey, you could be seeing this for real in your lifetime.

  • Love 1
58 minutes ago, Athena said:

Yes, the originals were mechanical, but they have since been replaced with more organic versions. We've even seen Hector piss on screen. None of this show is relevant for thirty years from now. It's been mentioned before that many diseases and illnesses have been cured by now in the show's world. I don't see how basic cloning would be out of the ordinary. Stem cell technology has already been used to make synthetic transplants such as people's tracheas. If we accelerate it in the world of sci-fi, why not? The thing that does set us apart from hosts is the brain, which is understandably not easily copied.

Humans piss as part of their bodies’ elimination of metabolic waste byproducts.  To enhance the human facade presented to guests, Hector is undoubtedly programmed to mechanically simulate the function; however, urination is a meaningless charade unless Hector’s body employs the same metabolic processes, and we already know that cannot be.

We know this from something we’ve seen demonstrated time and again: the techs’ ability to reprogram the characteristics of individual hosts, including their physical characteristics (strength, dexterity, etc.).  Digital manipulation and direction directly imply mechanical execution, not biological.  You can’t touch a slide bar icon on a touchpad and magically make biologically built muscle immediately become stronger, for example.  Increasing biological muscle capacity requires destruction and reconstitution/reformation of the muscle tissue itself (which is exactly what you do when you exercise, btw) - and while this process is certainly possible, it’s certainly not going to be instantaneous.  And even if future tech were able to speed up the reconstruction processes of a majorly biological host body to near-instantaneous response levels, this would necessarily dictate that all of the associated biological processes - from the major organs on down to the cellular metabolic energy conversion functions - would have to be sped up at the same rate.

So yeah, you might be able to build your Superman host biologically; the problem is, it would be useless.  To support its amazingly accelerated metabolism, such a biohost would have no option other than to spend all its time doing little else besides consuming and excreting, right up until it dropped dead of a heart attack.

Keeping in mind all this is being bankrolled by a corporation looking to make a profit: do you think they would go this route when they could build a mech which could do the same, but at a fraction of the cost?  ;>

  • Love 3

Ported over from the Les Ecorches episode thread:

 

On 6/4/2018 at 4:44 PM, lucindabelle said:

The notion that replicating consciousness is the same as HAVING it is silly. If they aren't just replicating but ACTUALLY TRANSFERRING they need to make that clear, otherwise a person is only "immortal" to others, not to him or herself. This is so obvious it's preposterous the Westworld writers don't seem to have noticed it.

 

On 6/4/2018 at 8:17 PM, The Companion said:

It really goes back to the question of what makes us us. What are we? Is there a soul, or are we just a collection of memories and thought patterns and electrical impulses? Personally, I think we are more, and it sounds like you do too. There is something more. However, I don't think everyone sees it that way. And if one was dying or sick or vain enough, perhaps the allure of a healthy, possibly custom designed body is enough. It's better than nothing. 

An individual’s answer to the question of whether restoration could equate to resurrection depends to a large degree upon his/her view of what constitutes life, consciousness, and awareness - and more specifically, whether that view is theistic or atheistic in nature.

  • A theist (religious, spiritual, or otherwise) believes a living, conscious being is more than the sum of its parts; that in addition to a person’s physical form exists some “spark of the divine” - call it a spirit, a soul, whatever - a transforming force which enters a previously inanimate body and renders it animate.  To the theist, a reproduction could be perfectly accurate down to the cellular level and it still wouldn’t be enough. Absent that divine spark the copy would never be truly alive or conscious - nothing more than a simulation of life, however incredibly detailed.
  • An atheist, on the other hand, believes (a) a person is nothing more than the sum of their parts, and (b) it is the complexity of those parts - the volume and intricacy of its neural connections - which support the capacity for self-aware thought which defines “life”.  In effect, this is the current biological view of the human brain: it is its sheer volume of neurons and the subtlety of their interconnections which give the brain the necessary capacity to support self-aware thought.  Given this view - if the copy’s mind and body successfully demonstrate the ability to independently maintain both biological function AND unscripted creative thought after restoration, then that copy could be considered “alive”.  One additional facet of the atheistic view is it leaves the door open for evolutionary awareness; a body might not initially be self-aware, but could evolve in time into a state of self-awareness - become “alive”, in effect, at some point in time after its initial physical creation.

The atheistic view has been a sci-fi trope for years - witness the original Terminator movie and its declaration that SkyNet became “self aware” on such-and-such a date - so it wouldn’t be surprising for WestWorld to adopt this view.  Whether or not the show’s audience endorses and/or accepts this view, of course, is another matter entirely.

  • Love 3
15 hours ago, Dobian said:

I give Westworld a pass for now because it has enough on its plate trying to wrestle with the idea of A.I.s becoming sentient, but the question of whether an A.I. can become human is entirely different and would have to be explored separately.

I don't think Westworld is arguing that an A.I. can become a human. It seems to suggest the idea that A.I.s can be sentient beings in their own right. A human is as different from a plant or an animal as an A.I. is from a human. That is, they are all "living" beings, but different kinds of beings. A.I.s are the next evolutionary leap, accelerated by human invention. 

14 hours ago, Athena said:

According to Lisa Joy's interviews, the current hosts have the same organic composition as humans, with slight differences: e.g., they have CPUs instead of brains and are more sensitive to injuries from bullets, but for the most part they are the same.

They have organic bodies, but they still don't need an organic body to function. One could take all the programming of an advanced model and upload it into an older mechanical model, and presumably, it would still work.

3 hours ago, Nashville said:

An individual’s answer to the question of whether restoration could equate to resurrection depends to a large degree upon his/her view of what constitutes life, consciousness, and awareness - and more specifically, whether that view is theistic or atheistic in nature.

Agreed. Even with a theistic point of view, not everyone believes animals have souls, spirits, etc. And yet, all would agree that animals are not objects. So the question becomes: do A.I.s qualify for the same rights as humans, or are they to be treated as pets, or as mere toys?

  • Love 1

The thing is a fantasy. That means there are a limited number of absurdities permitted, around which the rest must be constructed logically. The absurdity here is that machines can develop consciousness and that science can make artificial souls.

I'm looking forward to seeing the rest of MiB's backstory and seeing how they managed to tease season 3.

Also, check out a film called The 13th Floor, which is somewhat similar to this series and poses the same questions.

  • Love 1
On 11/18/2016 at 5:13 AM, Chris24601 said:

Contrary to the theory of the bicameral mind, it seems the real trick to host sapience is allowing them to operate in unfamiliar environments (i.e. forced improvisation where their scripts aren't much of a help) for extended periods without being reset.

This week's episode seems to back you on this. The Ghost Nation character has decades of not being wiped, and he breaks his loop quite a bit. Or else, as Ford said, he was designed to be curious and fearless. And since he is playing a roaming Native American, he may not have been tagged by QA as having left his loop. I still think Ford was protecting him.

  • Love 1
4 hours ago, Nashville said:

Ported over from the Les Ecorches episode thread:

 

 

An individual’s answer to the question of whether restoration could equate to resurrection depends to a large degree upon his/her view of what constitutes life, consciousness, and awareness - and more specifically, whether that view is theistic or atheistic in nature.

  • A theist (religious, spiritual, or otherwise) believes a living, conscious being is more than the sum of its parts; that in addition to a person’s physical form exists some “spark of the divine” - call it a spirit, a soul, whatever - a transforming force which enters a previously inanimate body and renders it animate.  To the theist, a reproduction could be perfectly accurate down to the cellular level and it still wouldn’t be enough. Absent that divine spark the copy would never be truly alive or conscious - nothing more than a simulation of life, however incredibly detailed.
  • An atheist, on the other hand, believes (a) a person is nothing more than the sum of their parts, and (b) it is the complexity of those parts - the volume and intricacy of its neural connections - which support the capacity for self-aware thought which defines “life”.  In effect, this is the current biological view of the human brain: it is its sheer volume of neurons and the subtlety of their interconnections which give the brain the necessary capacity to support self-aware thought.  Given this view - if the copy’s mind and body successfully demonstrate the ability to independently maintain both biological function AND unscripted creative thought after restoration, then that copy could be considered “alive”.  One additional facet of the atheistic view is it leaves the door open for evolutionary awareness; a body might not initially be self-aware, but could evolve in time into a state of self-awareness - become “alive”, in effect, at some point in time after its initial physical creation.

The atheistic view has been a sci-fi trope for years - witness the original Terminator movie and its declaration that SkyNet became “self aware” on such-and-such a date - so it wouldn’t be surprising for WestWorld to adopt this view.  Whether or not the show’s audience endorses and/or accepts this view, of course, is another matter entirely.

I think that is true, but I would hone it a bit further. I think there are some people who might consider our consciousness to be like a soul even if they are atheists (and they might not believe the actual consciousness is the same as even the most faithful recreation), and others who are religious who might believe that this development was a gift from God and could in some way be used either to bring back those who were taken too soon or to extend life for those who were dying. I really love the treatment of this in Old Man's War, and I think a lot of really good work has been done on the concept because it is truly one of the great unknowns. Come to think of it, I also love the treatment in the Bobiverse series, which comes to a somewhat different conclusion than Westworld is presenting without a purely theistic view (but I don't want to go into it because spoilers). Certainly, Ford sees the brain as a computer to be decoded. Others would probably see things the same way. I think there are plenty of people who would believe the Company if it said your consciousness was actually transferred into a robot.

 

56 minutes ago, Rumsy4 said:

I don't think Westworld is arguing that an A.I. can become a human. It seems to suggest the idea that A.I.s can be sentient beings in their own right. A human is as different from a plant or an animal as an A.I. is from a human. That is, they are all "living" beings, but different kinds of beings. A.I.s are the next evolutionary leap, accelerated by human invention. 

They have organic bodies, but they still don't need an organic body to function. One could take all the programming of an advanced model and upload it into an older mechanical model, and presumably, it would still work. 

Agreed. Even with a theistic point of view, not everyone believes animals have souls, spirits, etc. And yet, all would agree that animals are not objects. So, the question becomes: do A.I.s qualify for the same rights as humans, or are they to be treated as pets, or as mere toys? 

I think there are two different concepts: the Hosts (they are often treated as something new or different, a new creation) and the Delosbot style bots (which are being somewhat considered as a replacement for humans). I do think some interesting things could be considered when you take away the need for an organic body. I do think the premise of: "Well, if you can't tell, does it matter?" is definitely being explored in detail, and the answer seems to be changing for the characters.  

  • Love 2
Link to comment
1 hour ago, Notwisconsin said:

the thing is a fantasy. That means there are a limited number of absurdities permitted around which the rest must be constructed logically. The absurdity here is that machines can develop consciousness and that science can make artificial souls.

Technology has a funny way of making yesterday’s absurdities today’s realities.  My maternal grandfather was born in 1909 and died in 1993 - and during his 84 years on this planet the “absurdities” of commercial air flight, men walking on the moon, worldwide instantaneous communication, and computers you could carry in your pocket all became reality.  Heck, just 50 years ago the notion of a worldwide computer network was the stuff of sci-fi writers - but then came ARPANET, which grew into the Internet, and now here we are having this discussion today.  Which of today’s absurdities, do you think, will our grandchildren be taking for granted...?  ;)

This was Michael Crichton’s forte: looking at current technology and its forecast trends, seeing its potential (and potential problems), and formulating a plausible “ten years from now...” hypothetical near-future scenario of the technology’s implementation, its ramifications, and possibilities for blowback - and damn, but he was good at it.

  • Love 6
Link to comment
(edited)
12 hours ago, The Companion said:

I think there are two different concepts: the Hosts (they are often treated as something new or different, a new creation) and the Delosbot style bots (which are being somewhat considered as a replacement for humans). 

The argument works for both kinds of A.I. beings. I think Bernard is the only host to have successfully straddled the two kinds of bots. He was as perfect a version of Arnold as Ford and Dolores could refine, and yet he is not an exact copy of Arnold. Nor was he programmed to think he was Arnold (which may have been the key to his mind being stable outside the Cradle). But it does suggest that both the hosts and the Delos-bots are the same kind of "species", for lack of a better term.

The question is, does humanity want immortality in any way they can achieve it? I think it is a yes, and it would be irrelevant to many of them whether it was actually their own consciousness or if it was only a simulation, whether they believed in a soul or not.

Also, something bothers me about the "fidelity" tests. One would think giving the same exact answer to a question, down to the very words, would actually suggest that the entity giving the answer was definitely a programmed entity--not a human being. Because put a human being in the same situation twice, they wouldn't necessarily express the same views, and most likely would not word things the same way. If the delos-bots are strictly limited to being exact copies of their human template, no wonder they collapse in the outside world. They have zero room to step out of their programming and improvise, whereas the host-style bots can and do transcend their programming at times. It seems the regular hosts are more "alive" than the delos-bots. And maybe that's why Ford wants to ensure the survival of the host-bots over the human-bots.

Edited by Rumsy4
  • Love 4
Link to comment
2 minutes ago, Rumsy4 said:

The argument works for both kinds of A.I. beings. I think Bernard is the only host to have successfully straddled the two kinds of bots. He was as perfect a version of Arnold as Ford and Dolores could refine, and yet he is not an exact copy of Arnold. Nor was he programmed to think he was Arnold (which may have been the key to his mind being stable outside the Cradle). But it does suggest that both the hosts and the Delos-bots are the same kind of "species", for lack of a better term.

The question is, does humanity want immortality in any way they can achieve it? I think it is a yes, and it would be irrelevant to many of them whether it was actually their own consciousness or if it was only a simulation, whether they believed in a soul or not.

I agree. I was really thinking as far as treatment by third parties and designation of legal status. Essentially, I can imagine treating the Delosbot style bots as continuations of the prior person. They would operate under the same Social Security number, continue in the same profession, etc.  Arnold and the hosts are "unique creations." Ford seemed to think this was a distinction worth noting. I do think that, as a legal matter, they would be treated as new people if they were granted that status at all (I think that assumes someone takes their consciousness seriously and doesn't treat it as a programming glitch). I can imagine, as well, that they might be treated similarly to "juridical persons" like a corporation if it is presumed they do not get full personhood status (allowing them to sue and be sued, enter into contracts, etc.), but they would most likely be treated as property until their species status is recognized. Even if that were to occur, I suspect the Delosbot style bots would still get the continuation-of-personality treatment.

 

I think there are definitely plenty of people who would jump at the chance for immortality, in any form that was offered. I can also imagine people who are severely limited in their day-to-day lives jumping on it. If you had the chance to be able-bodied again after a devastating accident or if you were suffering from a degenerative disease, for example. Certainly not everyone would, but there are those who would jump on the chance. There would also be great minds that we might, as a society, hope to preserve. How many more works or theories could the scientific and artistic greats produce had they been given the chance? Then there would be the vanity. You could be a younger, hotter version of yourself. That is going to appeal to some people. 

 

1 minute ago, Rumsy4 said:

Also, something bothers me about the "fidelity" tests. One would think giving the same exact answer to a question, down to the very words, would actually suggest that the entity giving the answer was definitely a programmed entity--not a human being. Because put a human being in the same situation twice, they wouldn't necessarily express the same views, and most likely would not word things the same way. If the delos-bots are strictly limited to being exact copies of their human template, no wonder they collapse in the outside world. They have zero room to step out of their programming and improvise, whereas the host-style bots can and do transcend their programming at times. 

I agree. I think this is actually what they are getting wrong. What makes the hosts more human is stepping outside of their programming. 
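The flaw in exact-repetition "fidelity" scoring can be made concrete with a toy sketch. This is purely illustrative - the function, the sample answers, and the scoring scheme are all invented here, not anything shown on screen:

```python
# Toy illustration: an exact-match "fidelity" test rewards deterministic
# playback and penalizes natural human variation. All names are made up.

def exact_match_fidelity(baseline: list[str], retest: list[str]) -> float:
    """Fraction of answers repeated word-for-word across two interviews."""
    assert len(baseline) == len(retest)
    matches = sum(a == b for a, b in zip(baseline, retest))
    return matches / len(baseline)

baseline = ["I want to live.", "My wife's name is Juliet.", "I feel fine."]

# A scripted copy replays its answers verbatim.
bot_retest = list(baseline)

# A human paraphrases, even when the meaning is unchanged.
human_retest = ["I'd like to keep living.", "Juliet is my wife.", "Fine, I guess."]

print(exact_match_fidelity(baseline, bot_retest))    # 1.0
print(exact_match_fidelity(baseline, human_retest))  # 0.0
```

By this metric the copy scores a perfect 1.0 and the living human scores 0.0 - which is exactly the backwards result being described: word-for-word repetition is evidence of programming, not of personhood.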

  • Love 3
Link to comment
(edited)
5 hours ago, The Companion said:

I agree. I was really thinking as far as treatment by third parties and designation of legal status.

That is a good point. Eventually, this question would have to be dealt with, if Delos ever became successful in their venture. As long as the make-believe doesn't encroach on real-world decisions, people would be fine. Somehow I think most people would object to bots being recognized as humans, even if they were exact copies of their fathers or husbands or sisters or bosses.

Unless, the replacement is carried out in secret. There would have to be a new police department just to investigate bots disguised as humans. Blade Runner: Westworld edition. 

Edited by Rumsy4
  • Love 1
Link to comment
1 hour ago, The Companion said:

I think that is true, but I would also maybe hone it a bit further. I think there are some people who might consider our consciousness like a soul, even if they are an athiest (and they might not believe the actual consciousness is the same as even the most faithful recreation) and others who are religious who might believe that this development was a gift from God and could in some way be used to either bring back those who were taken too soon or extend life for those who were dying. 

Yeah.... I wasn’t real happy about going with the theist/atheist designations (I was trying to convey the simple binary of belief/disbelief in an inspiring force beyond that of the gross physical without invoking connotations of either Holy Rollers or Richard Dawkins) - but they were the least-clunky terms I could come up with on the fly.  :>

 

1 hour ago, The Companion said:

I think there are two different concepts: the Hosts (they are often treated as something new or different, a new creation) and the Delosbot style bots (which are being somewhat considered as a replacement for humans). I do think some interesting things could be considered when you take away the need for an organic body. I do think the premise of: "Well, if you can't tell, does it matter?" is definitely being explored in detail, and the answer seems to be changing for the characters.  

Totally agree on the differentiation between hosts and (for lack of a better term) replicants, because the mission of each is totally different.

  • A replicant such as the Delosbot is intended to be exactly that: an as-close-to-100%-faithful replication of the “source” physical body such that, when merged with another close-to-perfect replication of the source’s mind, presents as an indistinguishable copy of its original source in every way - speech, thought, mannerisms, and movement - i.e., the Holy Grail of “fidelity”.  To this end, duplication of the original source’s physical capabilities in the body would be paramount for the replicant to pass as the original.  Duplication of degenerative diseases and conditions would be nonsensical, of course - but on the other hand, displays of sudden massive increases in strength or dexterity or endurance would be immediate telltale signs to anybody familiar with the source that something was drastically amiss.  “Tuneups” within a limited gradient (10-15%, maybe?) might be acceptable if such could be explained away as the result of improved health, or exercise, or general feelings of well-being - but no more.
  • Hosts, on the other hand, are not bound by constraints such as fidelity to a source - and indeed could not be, to successfully execute their roles as characters in the WW story loops.  Hosts are not faithful copies of living sources; they are characters in a story line,  and their “fidelity” as such is dedicated to an accurate depiction of their character role - and if faithful rendition of their role occasionally calls for heroics demanding feats of strength or dexterity exceeding human norms, then hosts must be able to deliver.  To this end host bodies would need to be faster / stronger / tougher than normal humans, even if this means the hosts are normally idling along at 50% of total capacity 98% of the time and are only called upon to dial up those capacities once every few loops - or months - or years.
Link to comment
2 hours ago, Rumsy4 said:

That is a good point. Eventually, this question would have to be dealt with, if Delos ever became successful in their venture. As long as the make-believe doesn't encroach on real-world decisions, people would be fine. Somehow I think most people would object to bots being recognized as humans, even if they were exact copies of their fathers or husbands or sisters or bosses.

Unless, the replacement is carried out in secret. There would have to be a new police department just to investigate bots disguised as humans. Blade Runner: Westworld edition. 

I am a lawyer, so of course I go to the legal nerd stuff, but it would present an interesting legal problem. If "immortality" were achieved for certain individuals, how do you handle successions/estates? Is someone who is brought back after death considered dead and their estate considered settled? If a bot replaces someone in an accident, does that affect damages in a court of law? Depending on the cost of the bot, it might be more beneficial to replace them than pay lifelong medical expenses for a plaintiff. 

28 minutes ago, Nashville said:

Yeah.... I wasn’t real happy about going with the theist/atheist designations (I was trying to convey the simple binary of belief/disbelief in an inspiring force beyond that of the gross physical without invoking connotations of either Holy Rollers or Richard Dawkins) - but they were the least-clunky terms I could come up with on the fly.  :>

 

Totally agree on the differentiation between hosts and (for lack of a better term) replicants, because the mission of each is totally different.

  • A replicant such as the Delosbot is intended to be exactly that: an as-close-to-100%-faithful replication of the “source” physical body such that, when merged with another close-to-perfect replication of the source’s mind, presents as an indistinguishable copy of its original source in every way - speech, thought, mannerisms, and movement - i.e., the Holy Grail of “fidelity”.  To this end, duplication of the original source’s physical capabilities in the body would be paramount for the replicant to pass as the original.  Duplication of degenerative diseases and conditions would be nonsensical, of course - but on the other hand, displays of sudden massive increases in strength or dexterity or endurance would be immediate telltale signs to anybody familiar with the source that something was drastically amiss.  “Tuneups” within a limited gradient (10-15%, maybe?) might be acceptable if such could be explained away as the result of improved health, or exercise, or general feelings of well-being - but no more.
  • Hosts, on the other hand, are not bound by constraints such as fidelity to a source - and indeed could not be, to successfully execute their roles as characters in the WW story loops.  Hosts are not faithful copies of living sources; they are characters in a story line,  and their “fidelity” as such is dedicated to an accurate depiction of their character role - and if faithful rendition of their role occasionally calls for heroics demanding feats of strength or dexterity exceeding human norms, then hosts must be able to deliver.  To this end host bodies would need to be faster / stronger / tougher than normal humans, even if this means the hosts are normally idling along at 50% of total capacity 98% of the time and are only called upon to dial up those capacities once every few loops - or months - or years.

Yeah, I don't think the initial separation was particularly wrong. I think those groups would definitely exist, I just also think there would be other considerations at play. 

What I do find interesting on the host side (and possibly on the replicant side) is how much deviation would be considered permissible if you wanted hosts to work a particularly dangerous job or do something that would require special equipment for humans. A host might be cheaper as a deep sea diver, for example, if you could eliminate the need for oxygen. There is also the question of personality duplication. We have already seen it somewhat in Samurai World, but you might theoretically have types for mass production in an industry. You also might give them different skills for their purposes. A nannyhost might be faster to protect kids, and have some sort of basic medical package built in. 

That being said, you might also choose to have greater resilience for a replicant. You wouldn't want to pay all this money just to have a replicant die again in a few months because of an accident. Would it be considered permissible to make them less likely to die in an emergency situation? Would people with enough money consider replicants disposable enough to take greater risks? If they could eat and drink and smoke (which I think you would want, and which we did see the Delosbot doing), would you remove any harmful effects? 

I also wonder how much the biology is actually compatible with human biology. Could you create a surrogacy bot with the ability to constantly monitor the baby? Grow organs compatible with a human? That opens up a whole different can of worms, ethically and morally. 

  • Love 2
Link to comment
16 hours ago, Nashville said:

Humans piss as part of their bodies’ elimination of metabolic waste byproducts.  To enhance the human facade presented to guests, Hector is undoubtedly programmed to mechanically simulate the function; urination is a meaningless charade unless Hector’s body employs the same metabolic processes, however, and we already know that cannot be.

We know this from something we’ve seen demonstrated time and again: the techs’ ability to reprogram the characteristics of individual hosts, including their physical characteristics (strength, dexterity, etc.).  Digital manipulation and direction directly implies mechanical execution, not biological.  You can’t touch a slide bar icon on a touchpad and magically make biologically built muscle immediately become stronger, for example.  Increasing biological muscle capacity requires destruction and reconstitution/reformation of the muscle tissue itself (which is exactly what you do when you exercise, btw) - and while this process is certainly possible, it’s certainly not going to be instantaneous.  And even if future tech was able to speed up the reconstruction processes of a majorly biological host body to near-instantaneous response levels, this would necessarily dictate all of the associated biological processes - from the major organs on down to the cellular metabolic energy conversion functions - would have to be sped up at the same rate.  

So yeah, you might could build your Superman host biologically; problem is, it would be useless.  To support its amazingly accelerated metabolism such a biohost would have no option other than to spend all its time doing little else besides consuming and excreting, right up until it dropped dead of a heart attack.

Keeping in mind all this is being bankrolled by a corporation looking to make a profit - think they would go this route when they could build a mech which could do the same, but at a fraction of the cost?  ;>

I don't think the physical characteristics are changed by the iPad; rather, it is the hosts' ability to access those characteristics. All the hosts could have the same potential, while, for example, one could be programmed as an old man unable to lift anything heavy, and another as a young man who can.

  • Love 2
Link to comment
2 hours ago, The Companion said:

I am a lawyer, so of course I go to the legal nerd stuff, but it would present an interesting legal problem. If "immortality" were achieved for certain individuals, how do you handle successions/estates? Is someone who is brought back after death considered dead and their estate considered settled? If a bot replaces someone in an accident, does that affect damages in a court of law? Depending on the cost of the bot, it might be more beneficial to replace them than pay lifelong medical expenses for a plaintiff. 

Depends on whether the bot could legitimately - and legally - be considered “them” (i.e., the original “source” human).  Also:

  1. What if a corporation, for example, wants to exercise the option to replace a permanently and debilitatingly injured - but still alive - employee with a bot as a cost-cutting measure, but the employee objects?
  2. Extension of #1: Say the employee/patient is comatose, possibly in a vegetative state - what if the employee’s family has other ideas on the subject...?

Yeah, I could see the lawyers having a LOT of fun with this’un....  :)

 

2 hours ago, The Companion said:

Yeah, I don't think the initial separation was particularly wrong. I think those groups would definitely exist, I just also think there would be other considerations at play. 

What I do find interesting on the host side (and possibly on the replicant side) is how much deviation would be considered permissible. If you wanted hosts to work a particularly dangerous job or to do something that might require special equipment in humans, for example. A host might be cheaper to work as a deep sea diver, for example, if you could eliminate the need for oxygen. There is also the question of personality duplication. We have already seen it somewhat in Samarai World, but you might theoretically have types for mass production in an industry. You also might give them different skills for their purposes. A nannyhost might be faster to protect kids, and have some sort of basic medical package built in. 

Exactly!  Which enters into a heretofore unexplored realm of the world: nonhuman entities with the potential for human intelligence.  If the human psyche isn’t prepared to deal with such, it’s doubtful the human-created legal system is.  ;)

 

2 hours ago, The Companion said:

That being said, you might also choose to have greater resilience for a replicant. You wouldn't want to pay all this money just to have a replicant die again in a few months because of an accident. Would it be considered permissible to make them less likely to die in an emergency situation? Would people with enough money consider replicants disposable enough to take greater risks?

An interesting proposition: if you’re rich enough to pay the requisite string of digits following a dollar sign which represent the value of your life, is the value of your life cheapened by its encapsulation?

 

2 hours ago, The Companion said:

If they could eat and drink  and smoke(which I think you would want and which we did see Delosbot doing), would you remove any harmful effects? 

I also wonder how much the biology is actually compatible with human biology. Could you create a surrogacy bot with the ability to constantly monitor the baby? Grow organs compatible with a human? That opens up a whole different can of worms, ethically and morally. 

Increasing the body’s resistance to damage beyond normal human tolerances necessarily dictates greater divergence from that norm - i.e., the body is less “human”.  Wouldn’t that same decreased similarity also imply decreased compatibility?

 

4 hours ago, Gobi said:

I don't think the physical characteristics are changed by the iPad; rather, it is hosts' ability to access those characteristics. All the hosts could have the same potential, while, for example,  one could be programmed as an old man unable to lift anything heavy, and another as a young man who can.

I’m not saying the iPad changes available characteristics; the iPad is nothing more than a simple control/management tool.  Its control scope is confined to defined minimum and maximum ranges of response, however, so the host would always need the maximum response capability on hand and available to respond immediately to the controls - a state of affairs which belies the postulated “100% organic” model, for the reasons stated earlier.  The fuel consumption and waste excretion activities of such a superhuman biologic would be nearly all-encompassing, to the point the host could do little else.
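The "slider gates access, hardware stays constant" model being described can be sketched in a few lines. This is a hypothetical illustration - the class, attribute names, and numeric ranges are invented for the example, not taken from the show:

```python
# Minimal sketch: the control tablet adjusts how much of a fixed physical
# capability a host may access; it never alters the hardware itself.

from dataclasses import dataclass


@dataclass
class HostAttribute:
    hardware_max: float   # fixed physical capability built into the body
    setting: float = 1.0  # fraction exposed by the control tablet (0..1)

    def set_slider(self, value: float) -> None:
        # The tablet's scope is confined to a defined response range;
        # out-of-range inputs are clamped, and hardware_max never changes.
        self.setting = min(1.0, max(0.0, value))

    def effective(self) -> float:
        return self.hardware_max * self.setting


strength = HostAttribute(hardware_max=500.0)  # every body ships identical
strength.set_slider(0.2)                      # "frail old man" configuration
print(strength.effective())                   # 100.0
strength.set_slider(0.9)                      # "young gunslinger" configuration
print(strength.effective())                   # 450.0
```

Under this model an instantaneous slider change is perfectly coherent for a mechanical body (the full capability is always on hand, merely throttled), which is exactly why it sits awkwardly with a 100%-biological one.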

  • Love 2
Link to comment
10 hours ago, Nashville said:

Depends on whether the bot could legitimately - and legally - be considered “them” (i.e., the original “source” human).  Also:

  1. What if a corporation, for example, wants to exercise the option to replace a permanently and debilitatingly injured - but still alive - employee with a bot as a cost-cutting measure, but the employee objects?
  2. Extension of #1: Say the employee/patient is comatose, possibly in a vegetative state - what if the employee’s family has other ideas on the subject...?

Yeah, I could see the lawyers having a LOT of fun with this’un....  :)

 

Exactly!  Which enters into a heretofore unexplored realm of the world: nonhuman entities with the potential for human intelligence.  If the human psyche isn’t prepared to deal with such, it’s doubtful the human-created legal system is.  ;)

 

An interesting proposition: if you’re rich enough to pay the requisite string of digits following a dollar sign which represent the value of your life, is the value of your life cheapened by its encapsulation?

 

Increasing the body’s resistance to damage beyond normal human tolerances necessarily dictates greater divergence from that norm - i.e., the body is less “human”.  Wouldn’t that same decreased similarity also imply decreased compatibility?

 

The legal world is already trying to catch up with reproductive science, and I suspect you would have a substantial amount of time where things were unsettled. In fact, interestingly enough, Louisiana treats embryos as "juridical persons" while other states treat them similarly to tissue or organs. The surrogacy laws are all over the place, as are laws on donor eggs and sperm.

It is a really interesting exercise, academically, to think about how you would want to set up the replicants and hosts. I think you would need to give them basic personhood status, because you need them to have the ability to sue or be sued. Even if they are bought and sold like property (something that is possible with corporations), you can't have the owner legally liable for something he/she can't control. That would make your options the corporation who programmed the bot (which is going to be something Delos fights against, and which is probably impractical with something that evolves and changes over time) or the host/replicant itself. That being said, you might then end up with the equivalent of a bunch of shell companies with no assets running around to limit liability.

You could make a minimum wage, but you might want it to be lower. Full personhood status would open a can of worms with regard to basic human rights. Depending on how you see the replicants and/or hosts, that may or may not be something that works from a policy perspective. The answer may vary depending on whether or not they are considered a continuation of the person. Ok, so maybe just an interesting exercise academically for me. 

Link to comment
(edited)

Altered Carbon (the books and the series) does try to answer some of the estate/legal questions. They also have some fun with personality versus body, because human psychology is totally wrapped up with our bodies. Would Dolores still be Dolores if you put her in a middle-aged body? Would people react to her the same way? Obviously not. But interestingly, Ford and Delos kept the same bodies with mostly the same personalities. Yes, their roles changed - Frank went from Sheriff to Rancher, but he didn't become a Ghost Nation host. As far as we've seen, only Maeve had a massive role change, from frontier mother to Madam. Is it that dissonance that made her ready to be "woke"?

But now that we have seen the hosts' brains being removed, the show does become more like Altered Carbon, in that you could place that brain into an entirely new body. In short, all the actors playing hosts could be swapped out and you could still have Dolores and Bernard, just played by new actors. 

Edited by jeansheridan
Link to comment
On 6/23/2018 at 3:43 PM, jeansheridan said:

As far as we've seen, only Maeve had a massive role change from frontier mother to Madam. Is it that dissonance that made her ready to be "woke"?

 

The dissonance prompted the role change. Frontierswoman Maeve "refused to die" when William shot her daughter, and showed a grief that did not abate even when she woke up dead back down below. She then killed herself all over again. From there, Ford reassigned her. 

  • Love 1
Link to comment
(edited)

Ug. The finale hurt my brain. I am not fond of philosophy or abstract ideas. I like my SF more concrete. 

So free will is brought up in this ep. Did we see any evidence of free will? Other than Lee deciding to sacrifice himself rather pointlessly.

Stubbs decided to let Charlotte go for some random reason. I don't mind Stubbs but I don't really like having him be the "clever" one in a weird twist. Now had Elsie pulled that, I wouldn't have minded. 

I don't think Bernard was ever allowed one moment of free will.

I'm going to be annoyed for quite some time. 

Edited by jeansheridan
  • Love 1
Link to comment

I'm highly skeptical that there are larger meanings to draw from this series.  Their goal is to attract eyeballs and keep viewers watching and subscribing.

Sure they try to tell a story about an evil corporation run by bad people who are pursuing sinister goals.  But WW isn't social commentary or a roman à clef about the real world.  I don't think copying the park's guests' DNA is supposed to relate at all to the invasive practices of Facebook or Google, for instance.

Delos acts as if there will be no accountability or reckoning for allowing so much blood to be spilt, and for spilling a lot of blood themselves at the park.  Charlotte showed herself to be absolutely evil in this episode, for instance.  These kinds of scenes raise real plausibility problems, so it's difficult to relate any of this to the real world.  There may be evil corporations out there, but nothing like Delos.

They weave a complicated story with a fragmented narrative -- from the multiple timelines.  And they throw out easter eggs for some fans to get excited about.

But the show isn't convincing about what's at stake.  The main characters for the viewers to follow are almost all robots.  They can be shot and they just come back brand new, as happened with Dolores, Charlotte, Bernard -- and no doubt Maeve will be back too.

So really, the action scenes and violence are like something out of a video game.  Die and it's no big deal, just respawn - except in the case of host characters, there seem to be machines for making hosts all over the world outside the park.  It's better than even 3D printers!

  • Love 2
Link to comment
13 hours ago, scrb said:

Delos acts as if there will be no accountability or reckoning for allowing so much blood to be spilt and spilling a lot of blood themselves at the park.

THIS. This has been driving me nuts. The lack of any apparent real-world stakes is so frustrating. I guess Logan dying was one for Delos (the man, not the corp), and possibly William's wife dying. But as much as I like Sela Ward, she didn't have enough time to create a deep impression.

The show keeps stating over and over that which is real is irreplaceable but they keep rebooting the same characters. In some ways I kind of hope the third season will have a mostly new cast. I would hate to lose Thandie but I'm tired of the rebooting. And I'm tired that we had her do the same triumphant walk in both season finales. It takes away the power of the moment if you keep repeating it!

  • Love 2
Link to comment
(edited)
On 6/25/2018 at 7:42 AM, scrb said:

Sure, they try to tell a story about an evil corporation run by bad people pursuing sinister goals.  But WW isn't social commentary or a roman à clef about the real world.  I don't think copying the park guests' DNA is supposed to relate at all to the invasive practices of Facebook or Google, for instance.

Delos acts as if there will be no accountability or reckoning for allowing so much blood to be spilt, while spilling plenty themselves at the park...

...But the show isn't convincing about what's at stake.  The main characters the viewers are meant to follow are almost all robots.  They can be shot and they just come back brand new, as happened with Dolores, Charlotte, and Bernard -- and no doubt Maeve will be back too.

So really, the action scenes and violence are like something out of a video game.  Die and it's no big deal, just respawn -- except in the case of host characters, there seem to be machines for making hosts all over the world outside the park.  Better than 3D printers!

On 6/25/2018 at 9:22 PM, jeansheridan said:

THIS. This has been driving me nuts. The lack of any apparent real-world stakes is so frustrating...

...The show keeps stating over and over that that which is real is irreplaceable, yet they keep rebooting the same characters.

Yes to all of this. I, too, am frustrated by the lack of real-world stakes in this show. Perhaps it was my own set of expectations, but dialogue like "that which is real is irreplaceable" and a discussion about consequences didn't, in the end, have any significance. And, IMO, consequences for William (as they relate to his fidelity-testing scene) aren't meaningful if they happen in the "far, far future." I'm not willing to wait that long to see him get his comeuppance.

Hundreds of people die in the park and there is no discussion of accountability, public relations, or even an "oops." Surely there would be blowback on Delos Corp, regardless of what explanation is offered.

No living human steps back to provide perspective on the DNA/brain scanning and privacy. And maybe we don't want to be lectured on what we know is a real-life threat. Or maybe 30+ years into the future (which may be the present day of this show), no one cares about privacy any longer.

Hosts die and can be replaced as long as someone has their brain-ball in a pocket or handbag (along with the printing machines, which conveniently pop up whenever needed). Humans die and can be replaced by their host counterparts. Hosts depart for the Valley Beyond, but maybe...just maybe...they were uploaded somewhere that someone can access and bring them back.

Is there a story here? I'm not so sure.

Edited by Ellaria Sand
