The whole question of whether AIs can be sentient, and whether what we see in the show is unethical, is something that I'm loving about this series. We still don't have a clue as to what causes consciousness, or what qualia are. Chances are we still won't have a clue even by the time we're capable of creating human-level AI.
A philosophical zombie is a being that's indistinguishable from a human, except that it lacks consciousness. It would react to pain in the same way, and it would tell you it experienced pain, but in fact there would be no mind behind it.
One viewpoint is that philosophical zombies can't exist: once you have an intelligent being with all the intelligence, memory, and reactions of a human, such a being is necessarily conscious, even if it exists only as a computer program. On the other hand, maybe human brains have some important property that a computer lacks.
Many people would see the robots as glorified NPCs, and even if they found the actions distasteful for whatever reason, it wouldn't be out of concern for AIs being abused. People often point to "what computers can't yet do" as examples of intelligence, but once computers can do it, "it's just a program". Someone from 100 years ago might wonder if the Google Assistant bot in a phone is like a real person, yet no one today would even think there was any sentience involved, or any ethical issue. A self-driving car might seem pretty cool and futuristic, but I don't think anyone believes it has any kind of sentient mind. Now, I don't think this means software can never be sentient - but I can see the problem that we won't have any way to know if or when it happens.
I do think it's a bit odd they don't have any kind of separation for different types of visitors, though - I mean, even if I think there's nothing wrong with people who want a rape/murder fantasy, if I'm there to enjoy the experience or play it like an RPG, I'd be a bit annoyed if some other guy decided to stab all the hosts in the room. It would rather ruin the immersion (even "But it's the Wild West" doesn't work when the guests have god-like powers and other guests can't stop them without resorting to real violence). MMORPGs generally don't let players go on god-mode killing sprees against all the NPCs, because even if some people might like it, other players won't. Second Life caters to many users who roleplay sex and BDSM, but my understanding is that there tend to be different regions for different kinds of things. Second Life also has a minimum age of 13, and age-restricted regions - it's odd to see child visitors in Westworld, but maybe things have changed in the future it's set in.
"I'm no biologist, but if the hosts have vision -- which requires receptors between eye and brain -- isn't it possible that they have pain receptors too? If they can see and differentiate between light and dark, can they also differentiate between heat and cold? Soft and hard?"
There's a difference between a receptor that can detect light of, say, a certain wavelength, and the experience (qualia) of seeing, say, a particular colour. A thermometer doesn't feel heat. One could build a robot that detects the same things that cause pain in a human, and wire it up to give the same reactions a human gives. It could be programmed to say "Yes, that really did hurt, I do feel pain". At what point is it sentient and really experiencing pain?
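To make that concrete, here's a toy sketch (the class name, threshold, and responses are entirely my own invention, not anything from the show): the detect-and-report behaviour is a few trivial lines of code, and nothing in those lines tells us whether anything is actually felt.

```python
# Toy illustration: detection + reaction is trivially programmable.
# Whether there is any *experience* behind it is the open question.

class PainMimic:
    """Hypothetical robot controller that mimics a human pain response."""

    PAIN_THRESHOLD_C = 45.0  # heat at roughly this level is painful to humans

    def read_skin_sensor(self, temperature_c: float) -> str:
        # Detection: a thermistor reading crosses a threshold.
        if temperature_c >= self.PAIN_THRESHOLD_C:
            # Reaction: the same verbal output a human would give.
            return "Ow! Yes, that really did hurt. I do feel pain."
        return "That feels fine."

robot = PainMimic()
print(robot.read_skin_sensor(60.0))  # reports pain
print(robot.read_skin_sensor(20.0))  # reports no pain
```

The thermometer comparison applies directly: the sensor and the threshold check are the thermometer part, and the canned sentence is just more output. The program's correctness says nothing either way about qualia.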