While going through my old notes, I came across notes from a talk I attended at the cognitive science conference in Krakow, given by Nick Novelli.
His talk was about applying concepts we learn by studying people to the problem of designing ethical machines. One of these groups was psychopaths; the other, based on my notes, was children.
One thing he mentioned was that psychopaths are unable to maintain a consistent moral system. This is why they appear to misuse moral terminology and concepts, and why they seem not to notice that their explanations of their actions are inconsistent. They can tell that something is wrong in the abstract, but they cannot apply that judgment to their own situations.
This is quite similar to some of the chatbots that exist (or did exist; I have mostly stopped keeping up with the field), where a conversation would sort of happen, but would be highly inconsistent. It would stop making sense if a person spoke with them for any length of time.
On the other hand, psychopaths have two features that are perhaps less applicable to AI. One is phenomenology: psychopaths experience emotions differently. The other is that criminal psychopaths (or at least the ones who get caught) seem to have a damaged amygdala. Since AI has no qualia, it does not really make sense to talk about its phenomenological experience; and since it has no brain, it cannot have parts of one damaged.
Psychopaths have the opposite problem to autists: both groups have trouble with one of the components of empathy, but with different ones. You can check the table below:

|             | Cognitive ("cold") empathy | Affective ("hot") empathy |
|-------------|----------------------------|---------------------------|
| Psychopaths | intact                     | impaired                  |
| Autists     | impaired                   | intact                    |
Hot empathy is also the reason why children prefer good puppets and punish bad ones. Psychopaths, on the other hand, reason or guess what other people want; they don't feel it. And if they guess wrong, or can't keep all of it in their head... no wonder they can be inconsistent.