To build on Digital Jedi's excellent answer, there's a term of art for what he's describing:
the flaws of AI you interact with at home are especially noticeable
The term here is uncanny valley.
The uncanny valley effect is a hypothesized psychological and aesthetic relation between an object's degree of resemblance to a human being and the emotional response to the object.
And
Mori's original hypothesis states that as the appearance of a robot is made more human, some observers' emotional response to the robot becomes increasingly positive and empathetic, until it becomes almost human, at which point the response quickly becomes strong revulsion. However, as the robot's appearance continues to become less distinguishable from that of a human being, the emotional response becomes positive once again and approaches human-to-human empathy levels. When plotted on a graph, the reactions are indicated by a steep decrease followed by a steep increase (hence the "valley" part of the name) in the areas where anthropomorphism is closest to reality.
Data's problem is that his uncanny valley sits right on the surface, and it goes beyond his unnatural skin and eye color: it's his "personality", if you will. It's a constant foil for storytelling, such as when Data is approached romantically by a fellow crew member, Jenna D'Sora, in the episode In Theory. One scene in particular highlights this very well. Data starts a "lover's quarrel" to improve the relationship, but the whole thing is terribly stilted.
DATA: My most recent self-diagnostic revealed no malfunctions. Perhaps there is something wrong with you.
JENNA: I've never seen you behave so foolishly. Why are you doing this?
DATA: You don't tell me how to behave. You're not my mother.
JENNA: What?
DATA: You are not my mother. That is the appropriate response for your statement that I am behaving foolishly.
JENNA: Data, I think you should just leave.
DATA: You do not wish to continue our lovers' quarrel?
JENNA: Is that what this is?
DATA: In my study of interpersonal dynamics, I have found that conflict followed by emotional release often strengthens the connection between two people.
JENNA: But there's something so forced and artificial about the way you're doing it, Data. It's just not the real you.
DATA: With regard to romantic relationships, there is no real me. I am drawing upon various cultural and literary sources to help define my role.
JENNA: Kiss me.
(they kiss)
JENNA: What were you just thinking?
DATA: In that particular moment, I was reconfiguring the warp field parameters, analysing the collected works of Charles Dickens, calculating the maximum pressure I could safely apply to your lips, considering a new food supplement for Spot.
JENNA: I'm glad I was in there somewhere.
At the end of the episode, Data is neither happy nor sad that they broke up. He just... is. The uncanny valley is the centerpiece here: Data, incapable of emotion, is trying to force an emotional simulation and failing miserably.
The Holodeck seems better at avoiding the uncanny valley, but that's largely because its valleys are not front and center. The TNG episode 11001001 features Riker falling in love with Minuet, an incredibly elaborate holodeck character created by the Bynars. The program is so good that, when Picard meets her, he remarks on the sophistication of her programming, implying she is far beyond any holodeck character he's seen.
MINUET: Will was saying how much he enjoys this assignment. It's a credit to you. For a ship and crew to function well it always starts with the Captain. You set the tone.
PICARD: At the moment, it's you who are setting the tone. The sophistication of this programming is remarkable.
MINUET: In what way?
PICARD: The holodeck has been able to give us woodlands and ski slopes, figures that fight and fictional characters with which we can interact, but you, you're very different. You adapt. You spoke to me in French.
MINUET: It was very simple. When I heard your name, I merely accessed the foreign language bank.
PICARD: That's very impressive.
MINUET: Oui, mon chou.
The Holodeck is, in many ways, like ChatGPT. It can give you some amazing approximations and even passable personalities, but, as Geordi La Forge discovered the hard way, it can only approximate actual reality based on what it knows at the time. In the episode Booby Trap, he inadvertently asks the holodeck to "show him" what it means, so it generates Dr. Leah Brahms (one of the designers of the ship's warp engines, and an attractive woman) as a suitable stand-in. The uncanny valley of her cold, clinical delivery gets to Geordi, so he asks the computer to approximate a personality for her, which it does. Unfortunately, the program goes a bit too far, and when the real Brahms shows up (TNG: Galaxy's Child), she's nothing like the simulation (nor is she amused by her holodeck facsimile).
The simple fact is that the holodeck is a form of entertainment, and the expectations there are far lower. In many ways, it's like watching a TV show or movie: nothing there is real, but you get immersed in the world you're watching. Bad writing and acting are the holodeck's analogues to the uncanny valley; the holodeck just makes the immersion literal. In Elementary, Dear Data, Data is too good at the Sherlock Holmes bit, so to restore the immersion, Geordi asks the computer for an adversary capable of defeating Data. The computer obliges by cheating: it breaks the Holodeck's fourth wall and gives Moriarty the capacity to realize that he is a program in a simulation on a starship.
Voyager's EMH Doctor comes with a pre-programmed personality that does a passable job of avoiding a huge uncanny valley out of the gate. But he still starts out with a cold, abrasive demeanor, and his personality improves over time. As with any AI, the more data you can supply, the more useful the results.