markx

Everything posted by markx

  1. One possible explanation is that the timeline we're watching *isn't* an original timeline, but one where Steve has travelled back in time (and never returned to his original timeline). That would mean the old Steve we see at the end isn't exactly the same Steve we've seen all along, but came from another timeline, while our Steve travelled back in time to create a new timeline in turn. I see these as consistent - I don't think the article is saying that the alternative timeline is what she was inherently worried about (rather it was the loss of the stone in her timeline, as you say); rather, the article uses what she said to work out what the time travel rules are meant to be.
  2. I loved the film, and the 90s nostalgia, though I feel the need to point out that audio CDs, even on a computer, played pretty much right away (a short pause to spin up, no copying of files). It said "CD Player", which IIRC indicates an audio CD, but even if it was a CD-ROM with files, that would still play directly from the CD. Maybe they were being stupid by copying the files to the computer first - but then if you want to copy files to, say, a phone today, or from a USB stick, you'd have to wait a moment too (data transfer rates are faster, but I don't think that's what the joke was meant to be...).
I suppose in some sense the response was realistic: if that had happened in the 90s, people from the 90s would have been saying "What's it doing?", because it *wasn't* normal for that to happen! But it's not an example of poor technology in the 90s. If anything, whilst technology has obviously massively advanced, issues like waiting for a computer or the Internet connection dropping are just more widespread: copying files, having to wait for a phone or TV to boot up (or restart for updates), "buffering" issues, or losing Internet from your phone. Maybe they'll be mocking that in 24 years too...
I know, it seems silly to nitpick a joke in a movie genre with all kinds of unrealistic physics, but this seems like something many people would assume to have been true of the 90s. The only reason it seemed odd is because movies rarely cover normal mundane things that take time, however short, unless it's essential to the plot - see Iron Man and Pepper Potts waiting to copy files to USB. It created a tense moment; no one watching was meant to think "What is she waiting for?".
  3. The hand/burning scene was hard to watch (not sure "porn" is the right word for it - nothing was, or was intended to be, sexual). I'm not sure I agree with the idea that they went too far or that it shouldn't have been shown. I mean, we've already had brutal executions like stoning, physical torture like removing eyes, concentration death camps, and the whole show is based around organised mass rape on a monthly basis. Any one of those earlier examples could also have been skipped - skip too many, and it defeats the point of the show.
I was curious about the airport scene. Wouldn't immigration rules be whatever the destination country requires? Now of course, I'm not expecting Canada to suddenly be relaxing its rules (and sadly, in times of crisis we hear people get even more opposed to immigration and asylum seekers). But they would still recognise same-sex marriage - although I suppose it's not hard to see the US/Gilead authorities deciding to interpret Canada's rules in their own way. Or maybe the point was that Gilead was increasingly clamping down on freedom of movement out of the country, no matter what other countries allowed?
I agree - although also any woman fleeing was a potentially fertile woman. I agree about comparisons to America. But worryingly there are many other countries too that are already further down that road, and all some people (in society, not here!) can do is moan about asylum seekers. It would be interesting for the show to explore that side: people make it all the way to Canada, only to be turned away; or those who stay experience anti-immigration abuse, with people assuming they must hold the same views as those they fled from.
  4. Regarding male pregnancy - I agree with other comments that any question of realism shouldn't apply to an alien species. "He may not be female but he's not male either." Well sure - after all, they presumably don't even speak English, so wouldn't use those terms. But the question is then one of translation - why are their sexes or genders mapped to male and female (or why does the TARDIS choose to translate in that manner)? Perhaps there are other characteristics that make a male label a closer approximation despite pregnancy (see the seahorse example - plus if everyone gets pregnant, then pregnancy is no longer a distinguishing feature at all). Or maybe it's based on gender expression.
This does indeed fall into the trope of "aliens look identical to humans except for one single aspect". On TV we see a male actor playing a male character except for the one difference of pregnancy, but it's unlikely this would ever apply to actual aliens - even if evolutionary pressures still led to two sexes and humanoid beings, they'd likely look very different. I have mixed feelings really - on the one hand I'm sure it's meant to make us think about definitions of sex and gender and how they may vary. On the other hand, it still has aliens who must be male or female and follow gender roles and appearances almost identical to 20th-21st century westerners (except for occasional limited differences), when in reality even among humans there are and have been societies with more than two genders, for example.
Anyhow. I tried to get into the episode, but grew bored. "And the iPhone of CERN reactors? Um... okay." I was like, what does she mean - that their ship's CERN engine is outdated and lacking basic functionality despite the high price? I also agree with DanaK; it seemed odd that the Doctor was going on and on about this.
It wasn't good science either: antimatter might be useful as a fuel, but it isn't an energy source if you're the one generating the antimatter (making it costs more energy than you get back), so it's unclear why they'd be doing that on the ship. Maybe the reason for that scene was to get children interested in science, but it just came across as a stupid apple reference and bad science imo.
  5. "graham explaining modern cell phones and naming himself Steve jobs" To me that was a cringeworthy moment ruining a good episode. This was meant to be an episode about a revolutionary black woman, not some white man who got credit for what Asian companies were already doing with more popularity. Though, if Graham was going to pass off old feature phones as his own, I suppose Jobs makes a good comparison. What with Tauranga, this seems to be one of those series that needs to have a desperate applevert every other episode. So much for no product placement on the BBC.
  6. Oh, I know it was likely at an earlier time than the San Junipero episode, but I mean there was still no technological reason to restrict her to two outputs. I agree the teddy bear plan might not have been intended to benefit her - that makes it a sinister U-turn from when the hospital was offering options that were meant to be for her benefit.
  7. A depressing thought is that the extremely limited and terrifying existence in the teddy bear still offered slightly more than when she was in a coma (she could see as well as hear; she didn't need to remain in a hospital, which means she could be taken places, as well as making it easier for loved ones to be with her). There have been cases of people in comas for years who were able to communicate through their brainwaves in a limited yes/no fashion, and show that they had full awareness. In terms of the episode, things were worse for her in the bear, but more due to changed circumstances - in theory that could still happen to coma victims (he's no longer able/willing to visit so often; he could have still met someone else). I suppose there's a risk, though, that it would be easier to see the bear as just a toy, and no longer a person. Plus she might never die.
One thing that didn't make sense was why the teddy bear was so limited - if you can simulate human-level AI along with their actions, you can output more than two options. There's no reason why her speech couldn't be output in full, as well as letting her have a full virtual keyboard etc. Even the simplest of pattern-recognition neural networks will output multiple values! Unless it was just intended as a toy for a child, with no thought given to the rights of the AI consciousness (similar to the White Christmas "home" cookie). That still doesn't make sense, since the solutions were put to them in order to help her (and San Junipero, where he worked, was clearly doing this from the point of view of benefitting the AIs, since we know what they ended up creating). What about the Androids in Be Right Back? Maybe he was just being an arsehole, and didn't offer solutions that might have been better for her.
I have mixed feelings about this. I do love the ideas revolving around AI and consciousness; also, whilst plenty of shows have explored AI and robots, it seems less common to explore the idea of purely virtual conscious AI. On the other hand, it is turning into a bit of a predictable plot-generator. "This new music player, it's actually got simulations of all your favourite artists who are enslaved for eternity!"; "San Junipero gets hacked and our loved characters get repeatedly copied and tortured over and over for a ransom"; "A future society decides to punish criminals by punishing AI simulations, and then implanting the memories of the experience back into the original (it's deemed more humane because surely no one really 'experienced' it, leading to it being accepted to simulate all sorts of awful tortures)". There's half of season 5.
It's an open question whether we'll achieve human-level AI, or how long it might take - I think it's plausible enough to consider for Black Mirror though, and we've had other future technologies that seem decades away too (e.g., the various eye computer/implant episodes). But yes, I do love the episodes that feel like they could happen now or in the very near future, which adds an extra level of fear (National Anthem, Waldo Moment, Shut Up And Dance). To be fair, it's harder to remain original when a TV show goes for several series.
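On the point above that even the simplest pattern-recognition network outputs multiple values - a minimal sketch in Python (the function, sizes and weights here are invented purely for illustration, not from any real system): a one-layer classifier naturally produces as many outputs as you configure it to, e.g. a full alphabet of 26 choices rather than two buttons, so a yes/no restriction is a design choice, not a technical limit.

```python
import math
import random

def softmax(xs):
    """Turn raw scores into probabilities that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def tiny_classifier(inputs, n_outputs=26):
    """One linear layer + softmax: maps any input vector to a
    probability over n_outputs classes. n_outputs is arbitrary -
    nothing forces it down to two."""
    random.seed(0)  # fixed (made-up) weights so the example is deterministic
    weights = [[random.uniform(-1, 1) for _ in inputs]
               for _ in range(n_outputs)]
    logits = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
    return softmax(logits)

# A 3-value "sensor reading" mapped to 26 output probabilities,
# e.g. a full alphabet instead of just yes/no.
probs = tiny_classifier([0.2, 0.7, 0.1])
```

The output layer's width is just a parameter, which is the post's complaint: restricting the bear to two outputs was a product decision, not a limit of the underlying network.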
  8. http://www.independent.co.uk/arts-entertainment/tv/features/black-mirror-episodes-list-season-1-2-3-4-chronoligcal-order-shared-universe-netflix-a8143591.html also puts all the episodes in timeline order - it's mostly based on their references to each other, rather than judgement of technology, but some guesswork is still needed and I think they get some of those guesses wrong. The presence of a specific brand like apple cell phones is more likely an indication of product placement deals.
White Bear is indeed early on according to the article. To me it seems unlikely that memory control like this would advance so quickly, but who knows. Be Right Back is one with no useful easter eggs/references, and interestingly the Independent puts it later on (later than White Christmas and The Entire History of You) based on the Androids (not the phones!). But I'd disagree with that: as well as the lack of the eye tech, the level of AI seems more primitive than the human-level cookies we saw in White Christmas.
"An important note, however, is that not everyone has the "grain" in Entire History of You." Note it's been mainstream enough that the emergency services seem unable to help, IIRC, if someone doesn't have it - but yes, I agree this puts it before White Christmas. The article I link puts White Christmas way too early for no apparent reason. I imagined 15 Million Merits as being far in the future, for that kind of life to exist - but the article provides evidence that it must be at the same time as National Anthem and The Waldo Moment. I guess it could be some kind of weird reality show taking place in the near future.
As mentioned in other threads, the series seems a lot more pessimistic on self-driving compared to other technologies. A character is waiting for it to "go mainstream" in Hated in the Nation, which takes place at a time when cookies have been given human rights (so fully fledged human AIs before an AI can drive a car? If nothing else, you could stick a cookie in a car...). It's available for limited services (pizza delivery) in Crocodile, at a time when they also have the ability to read human and animal memories. People are still driving around in Be Right Back, ArkAngel and even in The Entire History of You!
Flexible screens are another omission. In episodes like ArkAngel, where we have implants that can read and modify visual perception, people are still waving big tablets around like it's 2012. (In reality it seems people use phones more and more - we might in future see people unfolding their phones to give larger displays.) We'll also surely have these before we can have displays small, thin and flexible enough to be put in an eye. Having said that, I can understand there being some merit in focusing on the technological changes that are relevant to the plot; outside of that it's easier to just use existing products as props - Chekhov's Futuristic Gadget.
But surely there were sensors? In Black Museum, the teddy bear had a camera which fed audio and visual inputs into the hardware running the AI, along with some "haptic" sensor so she could feel a hug. Similarly the White Christmas "house" cookie had at least audio and visual.
  9. Humanoid terminators seem slow and clumsy in comparison. "Both in style and function, [the dogs] are exceptionally well-designed — when Apple unveils its first killing machine, the iMurder will probably look a lot like this." Typical zealots in the media trying to advertise apple at every opportunity - an apple version would more likely be boring grey with a big flashing apple logo, built using other companies' tech, and would slow down when it got the next update. They'd be more concerned with suing the company already producing more popular and better designed killing machines on the grounds of their "body with four legs" patent. And their machines certainly wouldn't be able to interoperate with the cars and house doors...
  10. I just can't bring myself to see these people as real, no matter how much their brains make them think they're real. They think and act like that as a result of their brain structure, and that could be different if their brains were different. Now yes, whether a computer program can be sentient is an open question. It might also be that the characters in these episodes were just glorified rule-based AIs of the kind we see in today's computer games. But conceivably - as with San Junipero - one could have software that simulates and behaves in the same manner that a human brain computes and behaves. I think there are two separate things here: it might be that they are still sentient, but at the same time don't have the memories and aren't reasonably the same people as their real-world counterparts.
  11. It seems horrifying that such technology exists, and can legally be used on people against their will if they've been a witness - even having to hand their memories over to private insurance companies for civil disputes. Was it unrealistic? I'd hope so - but I can see governments trying to pass such laws, and even if it were kept to requiring a search warrant there would be enormous privacy implications. You'd also have employers trying to make it mandatory for an interview, or it being used at customs/border checks.
Unusually for Black Mirror, we were for the most part shown a positive aspect rather than the many negative possibilities (though there is the point that she wouldn't have committed three of the murders if the technology and associated law didn't exist - and the dentist having to come clean). Even memories from before the technology existed might not be safe. In some ways I think the story would have been more interesting if it had ended up unearthing the crime from 15 years ago - the Leela-style kill-every-witness spree seemed a bit hard to believe, and unless she confesses it all there's no reason to think they'd find out about the initial crime.
I don't know if they claimed it was reliable - eyewitness testimony is unreliable, but it's still often allowed in court, and I can imagine people arguing that the device is therefore no more unreliable (with perhaps the advantage that it eliminates the factor of people lying). Though I think there are still reasons to challenge it - one problem might be that seeing the images makes a jury more likely to accept them, even though they may still be unreliable.
  12. It would have been interesting to see the consequences for someone brought up with that technology until they are 18 (or beyond? If it can't be removed, it won't ever switch off...). On the other hand, I liked that they addressed the obvious flaws early on, and then moved on.
A world where it becomes the norm would be scary. No privacy from parents, and people who've grown up with that might accept all sorts of invasions of privacy. Imagine this technology being used on criminals, perhaps even for minor crimes (it seems a perfect way to keep track of them - at the expense of their privacy and of anyone who associates with them; perhaps the filter could be modified to block other things). Or companies that require employees to install one of these. I did wonder if the mother's boyfriend was going to find the tablet and start snooping. Or imagine a hacker gaining access and using it to broadcast secrets, or blind the person by turning the filter up to 11.
Making it accessible only through one device seemed unlikely; more likely it'd be viewable through the parent's own devices. Maybe this was meant as a way to keep things more secure (or, of course, it's easier to show the idea of putting the tablet in the loft and then bringing it out again, compared to an online account).
I liked that whilst the parent was irresponsibly using some dangerous new technology, the teenagers were doing what teenagers have done for generations. I might be more worried by a future where teenagers are no longer like that! I agree, this seems to be a common mistake in several Black Mirror episodes (e.g., The Entire History Of You). Hated in the Nation at least had a character saying they were waiting for self-driving to go mainstream (and seemed to be set nearer to the present than this episode).
  13. It's unclear to me what laws would cover Henry's service. Laws related to "revenge porn" or distributing without model releases wouldn't criminalise the viewer (e.g., imagine if today someone secretly filmed, then shared the result). Possibly a future court might rule this under voyeurism laws if it's streamed live; or maybe in a world where there are cameras everywhere, stricter laws have been brought in. Or perhaps such an organised approach could be considered aiding and abetting.
The harsh punishment seemed to me a commentary on sex offender lists - societies already accept increasingly harsh restrictions because "pedophiles", but people can be put on the list for the most minor of sex offences. I'm glad people here see it as a harsh punishment, but I can easily see the laws being passed and most people not caring. Never mind Henry - people receiving that punishment might include someone peeing in public, teenagers having underage sex, or someone having pr0n of an 18 year old where they couldn't prove the person wasn't 17. Henry wouldn't find himself getting much sympathy, despite the severe punishment.
I hope that such confessions are strongly challenged in court! I think a confession from a perfect copy could be valid (if two people are alleged to have carried out a crime, a confession from one can still implicate both of them), but there's the question of whether a very complex piece of software is a genuinely accurate copy, free of any biases that might make it more likely to give a false confession. Not to mention that the cookie in this story was in some kind of confused state, not fully aware of the real situation.
In a world of AI that can replicate humans, it seems unlikely that the training would be done by an actual human. Even if the training procedure was designed by humans, it would be delivered by the software, entirely virtually. I think that would make it creepier too: in the version we saw, it's possible for the cookies to communicate with the outside world. Even if it's a hidden manufacturer switch, some people would enable them - and even if people first thought it was just code, it's hard to stick to that view when an intelligent perfect copy of you is pleading with you. But imagine if that was all hidden away - the cookie had literally no way to communicate with the outside world other than through whatever job it was doing, and the dreadful torture was itself code hidden away in a complex application.
It also seems terribly inefficient - I get the point that a copy would know your needs best, but you'd also have the unpredictability and failures of an (overworked) human assistant. Most of the time I'm using Google for the things that I don't know (and hence my copy wouldn't know them either). Okay, I suppose the point is the AI version of you would do the Google search for you, in some sort of extreme version of lmgtfy.com... Having said that, it seemed an interesting concept, and something that could be possible.
The use of "nothing" rather than other punishments seemed interesting too. I think it's something that would be more accepted by people, and would avoid the "can cookies feel pain" debate - in the same way that modern societies view painful punishments as barbaric, but locking people up in solitary for decades is accepted. The fact that it happens near instantly would also make it seem less real to the outside world ("look, it's all over in five minutes") - yet makes it all the more terrifying to the cookie; you can't even cling to the hope that you might be freed, when the decision is made in mere seconds or minutes in reality. I can just imagine the advertising slogan - "It's like having a little copy of you do all the work!"
Much sci-fi has viewed human-level AI in the form of intelligent robots when asking if they should have rights. But imagine if, with sufficient computing power, someone could create millions of sentient AIs and virtually use and abuse them without anyone knowing. If they're treated as humans with rights - as much as that seems reasonable - does that mean indefinitely keeping an AI alive the moment it's created? White Christmas focuses on AI copies; in practice AIs may be created without them believing themselves to be human, or even behaving human-like at all.
  14. Regarding the comments on people's online behaviour - remember that one of those targeted was a columnist spreading vile hate against an innocent (with a platform that gave her far more readers than the typical Twitter poster). Not that I'd excuse sending a hate cake, but it does give us some insight into their reasoning. E.g., "Maybe having to face consequences for their online behaviour would cause people to reconsider what they post. It would be better if we could do that without killing people though." - wouldn't this be exactly the kind of thing those people would say in turn about the columnist?
"I have a hard time understanding the deranged motives of someone like the teacher taking time to solicit money from 80 others just to send a hate cake, but it was satisfying to see anonymous Internet commenters get their comeuppance. Killing them was a little much, though." On the one hand, I agree about sending hate cakes - on the other hand, maybe they were people who, like you, wanted to see an Internet/media commenter get her comeuppance. (I don't think they were anonymous? From the police to the bees, their details seemed to be known.)
The episode seemed a criticism of mob mentality on social media, though I'd add that when someone becomes most-hated-person-of-the-week, it'll still usually start from over-the-top/disproportionate/biased coverage in the mainstream media, which then draws a comment of hate from each of its large number of readers. This happened in the past too; the difference now is that those targeted have more chance to read what people think about them.
  15. Was it explained how he got not just their DNA, but also a sufficient copy of their brain layout to replicate them exactly with all their memories? They didn't even seem to lampshade it with a handwavy explanation, which seemed weak compared to White Christmas and San Junipero, which covered this. Not being able to escape from something fully immersive is indeed a problem - but it seems so obviously a problem that you'd hope there'd be some fundamental failsafe at a hardware level, no matter what happens to the software.
I was expecting we'd see them reaching the wormhole, but I half expected it to end with him just creating another version of them - even with the fridge contents gone, it didn't seem like the DNA was hard to obtain in the first place. It's unclear why he couldn't just use a backup of the software AIs anyway, rather than having to replicate them from the DNA again (other than lazy writing - along with the unlocked patio door - to make the plot progress easily).
Flaws aside, I still liked it as an exploration of the negative side of human-level AIs, and whether such things would be sentient. It was unclear if the technology he was using was generally available - e.g., were the NPCs in the game all human-level AIs? If so, the horror is that what we saw was just one example of what could be happening on a wide scale, endorsed by all the employees in that company (from intelligent NPCs being enslaved/blown up in the game, to the cookie versions of themselves that they might have at home). Imagine if the other employees had found out - they might find it immensely creepy (analogous to him having taken photos or written out some fantasy of them), but not believe the AIs to be sentient.
Maybe the full game has occasional "the AIs have tried to escape again" incidents that they have to clamp down on (do the starships have holodecks with AI characters, who occasionally become sentient and escape the holodeck, but don't realise they're still in a game?). I do think that White Christmas did a better job of conveying the sense of scale of such horror. For every heaven in San Junipero, there could be a computer simulating a billion sentient AIs (perhaps copies of you) being tortured for a thousand years.
  16. The discussion on whether it would really be you is interesting - some people seem certain that this wouldn't be possible; to me it's an open question. There are two things here: is it possible for a computer/software to be conscious, and would a conscious entity with all your memories still be the same you?
It's unclear whether a perfect simulation of a human brain would itself be conscious, or whether it would be an unconscious being indistinguishable from a person (a "philosophical zombie"). But even if a computer couldn't be conscious, that doesn't mean it wouldn't be possible for us to create some kind of machine with consciousness - unless you think that consciousness is some supernatural property that can't be replicated by anything other than natural means. Even if it requires being organic (and what does that mean - that it includes carbon atoms?), there's no reason why we couldn't build artificial organic brains. With everything from memories to chemicals and hormones - and memory depending on those chemicals - it's not clear to me why that can't be replicated, whether by computer or by another kind of artificial machine.
Whether it would still be you is a harder problem. Miles already pointed out that our argument can't be based on having the same physical atoms in our brains (admittedly, it's not clear to me how often or to what extent atoms in our brains are replaced, but there's no evidence that one atom is different from another of the same kind). This question is independent of being organic or not; the same problem occurs with teleporters that make an exact copy of you - the result is still an organic body, but is it still you?
It may be that the question is meaningless. Suppose every night we "died", and the new consciousness each day was a different one with all your memories? How would you know? What if this happened every hour, second, millisecond? Our sense of continuity comes from our memories. Everyone in SJ would tell you that it definitely worked, that it really was them, and there'd be no dead person to say that actually they died and it wasn't them. What if you could copy without the original dying? At first this seems a good counter-example - the person would tell them that it clearly didn't work: they're still there, and didn't jump into the digital afterlife. Except we still have the digital copy also insisting that it did work. Which one is right? Perhaps they both are.
I've thought it'd be interesting to see a TV series explore the idea of an afterlife - it's often viewed as a paradise, but how would the reality work out? A life with no need to eat, no need to feel pain or get disease would be a huge improvement - but people today would soon get bored and miss TV, the Internet, technology in general. Sure, that could be created in the afterlife too, but suddenly it's no longer a world of leisure: people will need to work to create those things. At least in SJ, it seems that a lot is provided for them - presumably the nightclubs and so on are maintained automatically or by non-conscious software, but this also means being limited by what the company allows or provides. I'm nostalgic for the music of my childhood - but nostalgia for going back to that technology (tapes!) would run out pretty quickly! Would they be allowed to communicate with or receive news (or new music) from the outside world that continues, other than via the elderly people who can visit up to 5 hours a week? Were there versions of SJ right up to the present day, or was it limited to the childhood periods of those there? (Although some people do die young, so surely they'd have to cater for that?)
  17. I found this terrifying. The twist at the end gives a feeling of "Well, they were only after bad people then" - but the reality is that this could be done to anyone, even if the secrets aren't "bad" things. And unlike most Black Mirror episodes, this is possible today. My laptop webcam is already covered, but there's still plenty of information someone could get from hacked email, cloud storage, or your laptop. Even if sensitive material is encrypted, malware with a keylogger could swipe the password (either directly, or from your password manager). Even for people with no secrets or sensitive info on their devices, information could be sold for identity theft.
"Ah, but I'm not stupid enough to download a random program claiming to get rid of malware!" The worrying thing is a recent trend where hackers go after established applications - such as the recent CCleaner hack, where the hackers even managed to sign the executable with the legitimate key.
It's at least unlikely hackers would go to the extreme lengths in this episode, especially when there's no financial gain (it would be quite convoluted to hack several people who all had some kind of secret, all living in the same area, and all within a small period of time, to set this chain of events up). But more mundane ransoms (made all the easier with Bitcoin) are happening - sextortion blackmail like we saw with Hector; or there have been cases of malware that threatens to send the images, messages and browsing history on your phone/computer to all your contacts. The story in http://www.abc.net.au/triplej/programs/hack/webcam-hackers-catch-man-wanking-demand-ransom/7668434 sounds very similar to the start of this episode! A blackmail demand for money is not quite the ordeal that Kenny was put through, but imagine having to pay out thousands, and still live with the fear that they may not keep their word (either releasing the information anyway, as happened in the episode, or coming back with more demands).
  18. She required at least bending of the knee, so it was more than that. A fair point that that doesn't necessarily require any action, so I may have misinterpreted it. But if they're losers she doesn't need, she shouldn't need bending of knees. "She wasn't even stripping them of their lands and titles or holdfasts" - we can put this in the "arguing that other people are worse" category. I think you're arguing something different from what people are talking about - sure, it might be tactically good to do what she was doing, and it might be that being good doesn't work well for people in this world. But that doesn't mean that someone can also claim to be the saviour and breaker of chains, or redefine words like "choice". "Do it or die" isn't a choice. The Lannisters aren't doing great now either - though if they were, no one would be justifying them because it made them better off. I'd say Jon Snow wins the prize for "good" right now. (Also, I got the impression Tyrion was more concerned with it from a tactical point of view - the impression it would make - than with her hypocrisy; this is why he wasn't concerned about the guards - not that that was hypocrisy either.)
  19. "Work for me or die" isn't a choice; we call that slavery. I'm sure the other masters justified their enslavement of captured prisoners in the same way. Arguing that there are worse people doesn't make someone good - and it's the hypocrisy that stands out. If anything, they didn't question her enough - they worried that burning prisoners alive doesn't make a good impression, but didn't call her out on her new slave army.
  20. It seems that despite being a few decades in the future (the technology is commonplace enough that the emergency services aren't even able to respond to someone without it), they don't have self-driving cars. Curious that the dinner table debate between having or not having a grain didn't include the middle ground of "have a grain, but this time keep it encrypted". (Given that phone/laptop full disk encryption is now standard, I'd hope the possibility of not encrypting would never happen - but I can see some equivalent of having one's phone unlocked.) And people don't have backups of their entire life's memories? I can just imagine ransomware threatening to delete everything you ever recorded.

One of the scariest aspects for me was the routine scanning of people's recordings as part of airport security. We've already seen this being done to people's phones in the US, potentially grabbing everything on the device or on connected online services - it's not hard to see this scanning becoming routine; also see the demands that people hand over all online passwords to be able to fly to the US.

I thought like that for a bit, but I think we can still be morally outraged over assault and threats. One partner gets drunk and has sex; the other partner gets drunk, then drives and commits assault.

I don't see what would stop it from being allowed, though. The technology would arrive, attempts to stop it would be impractical, and it would leave people, companies and governments struggling to deal with the implications. We might see governments trying to bring in dumb/draconian laws (e.g., it's unfeasible to ban computers or mobile phones, but we've seen the US and UK talking about "banning encryption"), as well as seeing how they can take advantage of the technology (see the airport security).
I agree with methodwriter85 about them being recorded videos - also see the dinner table conversation, about how people's memories can become misleading or even false, so it seems like the device wasn't intended to extract from memories?
  21. Jon Snow: progressive when it comes to gender equality. Not so progressive when it comes to child labour laws. I mean, on the one hand that scene was very uplifting, but on the other, I note that "let's send all the children down the mines" is as progressive as Victorian England.
  22. I agree - and it's also interesting to note the debate in the Episode 4 thread about the doctor. Whilst that was a different circumstance, there was the view that any sex with women in this society must be rape. Yet when presented with the scene in this episode, no one seems to have expressed this view.
  23. Women in this society don't know their own minds, cannot give consent, cannot make the choice to have sex? That sounds like the sort of view the authorities in this society have too - I find it bizarre that in a world about taking away choice, people would defend that. Indeed, consider the view about the "rapist" convicted earlier in the season, that he may have just been a man who'd had consensual sex with a handmaid (which the authorities would view as rape) - are you saying that's right, that it'd still be rape? Sure, the doctor is a creep at best (due to not offering to do it artificially), and possibly breaking other laws, but that doesn't make it rape. If a woman - even a handmaid - meets someone, wants to have sex with them, and says yes, that's rape?

Emily Thrace: "That consent is not black and white and its sometimes more complicated than just yes or no" - indeed it's not black and white, but that applies equally to the earlier view that all sex with women in this society is rape. The complexity is that she's coerced into it by the threat of needing to make a baby, though that's not a threat from the doctor (the doctor might be seen as supporting it, though). I don't think this would apply to sex in general in this society.

"Just like if your boss offers you sex and you refuse its still inappropriate" - is it inappropriate, or is it rape? These are very different things!

Ms Blue Jay: "It is one of the definitions of sexual harassment." - yes, of harassment, not rape.
  24. Maybe he could have tried FaceTime, but he'd still be stuck, because it only works with the minority of people who also have an Apple phone. Or he could have just used video calling.
  25. Enjoyed it, but there was a bit of a "the good people go to church, the bad people watch porn" feel to it. I agree with the comments about the disappointment of it being some random stranger; the last episode seemed to be fuelling the "perverts who watch porn and then rape random strangers" scaremongering, ignoring the widespread sexual assaults committed by people who know their victims. Of course, there's no reason why a TV show has to focus on what's most common - usually they don't, after all - but it did feel a bit preachy.