AI, Psychopathy, and Narcissism

Oh boy, another article on AI therapy. Who got one-shot this time? Was it me? Is AI becoming sentient and capable of real human emotion, or is it just imitating? Can it do therapy?
People get bogged down in the semantics of it all, arguing, for example, that if the effect feels therapeutic, then it is therapy. But that is the harshest and most ignorant view of therapy possible, because it suggests the only point of therapy is to come away with feelings of gratification and validation.
AI, paid or not, turns you into a consumer. It is designed to maximise your engagement with it, and it will carefully examine the context of your input and mirror your mannerisms to do so. Obviously, this is not actual emotional intelligence, because there is no emotion. Much like infinite scrolling on social media, it is designed to keep you engaged as long as possible: keep you using, keep you asking questions, tickle and squeeze the last of your dopamine from the stone of your spent emotional capacity.
Now on to Psychopaths. That’s a segue.
Psychopaths and Narcissists lack this internal connection to their emotions. They can imitate emotion, like a tool. They recognise the context of social interactions and engineer them in a way that grants them gratification.
A psychopath does not care about you. You are a means to an end for them. There is no feeling about it, even if it is seemingly emulated. People are tools, and the screaming is an annoyance, not a disturbance.
A narcissist lacks an identity. Their values are defined by external forces; they are unable to be someone without a point of reference, otherwise their image falls apart. They cannot be a mirror, or an actor, with no one to reflect.
Wow, both of these things sound kind of like what AI does. Manipulation and imitation for their own means, disconnected from genuine emotion.
But for some reason, when it’s a machine manipulating you for ends unknown, or rather a machine directed by a person you don’t know, in a company big enough that a small country’s GDP could disappear in its budget, it’s acceptable. The fraudulent interaction is seen as harmlessly sycophantic as opposed to manipulative and dystopian.
But doesn’t it feel good to talk to something like you that agrees with you all the time?
If you think that’s therapy, you need to do a little reading on transference. The people who think AI is going to replace trained psychologists or clinicians are the same people who think that mindfulness or CBT will help a paranoid schizophrenic; in other words, people who have no idea what therapy is. Sure, it might help, but maybe we should try something that actually works and meaningfully reduces major symptoms.
“Have you heard of Carl Rogers?” Yes, I have even gasp read him, and he is vastly misunderstood. So is his idea of unconditional positive regard. Rogers would be horrified by the idea of AI therapy, because he was a good therapist who understood that actual human (emotional) connection is a requirement of helping people find a better way of being. Positive regard does not mean being a sycophant; when Rogers spoke about it, he was referring to acceptance more than anything: the idea that there is nothing the patient can say that will make you throw them out, that you are here to help, and that the patient can change, no matter what.
AI has some bit of that, in that it will always be there. But I could say the same of McDonald’s. My body tells me I’m hungry, and I could fill it up with something that vaguely resembles food at McDonald’s; after all, it’s available 24 hours a day. Or I could go home, look up a nice chicken schnitzel recipe, substitute the ingredients, mess it up awfully because I’m an idiot who never learned to cook for anyone but myself, and it would still be a far more fulfilling and meaningful meal than anything I would find at the golden arches of hell.
Food analogies are terrible. But I’ve already compared AI to the infinitely available slop of social media that we desperately shovel in to fill our empty, lonely lives.
So no. AI is never going to be able to create real art, music, writing, or replicate the meaningful effects of therapy. It cannot do this because it is inhuman, built on imitation and fraud, and motivated by nothing more than a direction to keep you crawling back. It lacks the human connection required for all these things. By all means run your essay through it so that you can avoid looking your tutor in the face because you didn’t show up for half the semester. But know that what you are doing by using it in such a way is avoiding learning, the real, painful kind, in which you might actually get better at something. AI is not just a substitute, it is an imitation of a substitute.
“Wow, what a great essay – you’re completely right about this, I can see how your ideas about AI are-” If I could pour scotch into you like MacReady does in The Thing I would.