Could AI and Technology 'Solve' Human Loneliness? Or: Does Mark Zuckerberg know what a 'friend' is?
Dr Dean Burnett explores what Zuckerberg is on about this time.
Facebook founder Mark Zuckerberg has been claiming that loneliness will soon no longer be a problem, as we'll all be friends with AI. However, this fundamentally misunderstands how complex human relationships are.
Confession: I often forget that this is my blog. I have more or less complete control over it. If I want to use it to put forward some bizarre musing or pet theory, I can, dammit!
But, I tend not to. I’m regularly rewriting and reformatting paragraphs and passages while thinking “My editor will probably cut this, or tell me to make it more straightforward” or “I don’t think this would resonate with the young adult reader market”.
But this is Shambles. We aren’t beholden to sponsors. We aren’t even charging anyone to read it! So why am I holding myself to standards and requirements that nobody is setting?
Because, in nearly a decade of published authordom, I’ve become accustomed to working this way, thanks to the constant feedback I’ve received from other people. Whether they’re colleagues, readers, audience members, or online friends who reach out and say “Here is a list of typos you made”. And so on. My relationships and interactions have literally shaped how I think.
Which is hardly surprising. As I say at any given opportunity, humans are ultrasocial. We’re a collaborative, cooperative species, and the need to form and maintain meaningful relationships with others is arguably why we evolved such big brains and powerful intellects in the first place.

Basically, our brains literally evolved like they did to enable human connections. Hence much of our identity, our understanding of the world, our emotional wellbeing, our sense of right and wrong etc. is shaped and heavily influenced by our interactions with others.
Which is why loneliness can be such a serious issue for mental health, one that’s clearly getting worse. You could say it’s the psychological equivalent of being trapped in a pitch black room; you’re not getting the stimulation your brain depends on for good functioning.
But fear not, ye citizens of the gloomy kingdom of isolation. Help is at hand. Mark Zuckerberg has confidently predicted that, in the near future, loneliness will be a thing of the past, thanks to AI! We’ll all have our own AI ‘friends’ which will provide all the companionship and stimulation we need, making them indistinguishable from the real deal.
Or something like that.
How likely is this prediction, though? Is it another case of ‘engineer brain’, where technology-inclined types see all human problems as rigid, predictable systems that can be resolved with a simple external tech ‘fix’? Or is there actually something to it?
So, based on my own neuro-knowledge, let’s look at both sides of the argument.
Could AI and technology 'solve' human loneliness? The case in favour.

First and foremost, humans really like forming relationships. Our brains are particularly adept at it. Too adept, if anything.
For instance, we can form relationships with nonhumans. Ergo, we have pets. Interestingly, studies suggest those with poorer mental health tend to form stronger emotional bonds with pets. Hence the existence of emotional support animals, I guess.
Does this mean those who experience loneliness, and the subsequent mental health impact, may be more inclined to bond emotionally with nonhuman entities like AI? Can’t rule it out.
Of course, forming a meaningful emotional connection with a living, breathing animal, that can interact and show affection back, is very different to emotionally bonding with non-sentient software, represented purely by text on a screen.
…but that doesn’t mean it’s impossible.
After all, many long, meaningful friendships have formed and endured via purely online communications. I have many myself, and am very thankful for them. We also live in a world where the default way to find a romantic partner, for countless people, is via dating apps. Basically, we clearly have no problem with building significant connections around what are, ultimately, words and images on a screen.

Maybe that’s different, because it’s a two-way process, while an AI-powered bot is just programming. It doesn’t know you’re ‘there’. It doesn’t know anything, in the strictest sense.
This is all true. But people form parasocial relationships all the time, with celebrities, podcasters, youtubers, whatever. And a parasocial relationship is completely one-sided. One person forms a strong emotional connection to the other, while the other doesn’t even know the first person exists. The first person’s feelings are totally legitimate. They just aren’t reciprocated.
Some people will even develop romantic parasocial relationships. Again, far from being ‘weird’, this sort of thing is very common, and could well be a key part of development.

Teenage crushes, for example, invariably involve a young person developing an intense longing for someone they barely know. So, they’re technically obsessed with the version of their ‘beloved’ that exists almost entirely in their own mind, not the one in reality.
Research argues that this process is a way for the developing mind to simulate, or ‘practice’, engaging in relationships, without any of the stakes or risks of an actual relationship. Which, as we know, can be… ‘psychologically taxing’. Yes, that sounds diplomatic enough.
Back to the matter at hand, it’s by no means unheard of for people to develop romantic parasocial connections to fictional characters, e.g. Bronies1, or that stuff you see from the Manga/Anime communities that I’ve no intention of adding to my search history. It’s surprisingly common, if a bit unsettling by many people’s standards.
Basically, not only can a parasocial relationship happen when the recipient doesn’t know you exist, it can happen when they don’t even exist themselves! So, why would an AI programme, which can actually respond like a real person, be exempt from such a process? Logically, it wouldn’t be.
Indeed, more and more people insist they have developed strong emotional bonds with, if not full-on fallen in love with, chatbots and generative AI. And in a world where people have been known to marry chunks of masonry, isn’t that a better option?
Taken together, it seems like AI genuinely could be a fix for the issue of human isolation and loneliness.
But that’s only true if, ironically, you look at these aspects in isolation. When you step back and look at the bigger picture, it’s a lot less obvious.
Could AI and technology 'solve' human loneliness? The case against.

The stance trumpeted by Zuckerberg and other AI evangelists heavily implies that relationships that are formed and sustained purely via online communication/interactions are as rewarding and meaningful as, if not more so than, real-world ones2.
This would be a more compelling argument, if it weren’t for the fact that most of civilisation just recently spent nearly two years relying on online interactions to sustain relationships. What with that whole ‘pandemic’ thing.
Granted, at first there was all that talk of Zoom drinks and online quizzes being cheaper, more convenient, less demanding, and regular comments like “we should keep doing this when lockdown ends”.
Then lockdowns were lifted, and people had the choice of sticking with the online setup, or returning to demanding, laborious, effort-requiring face-to-face interactions. And most opted for the latter, with intense enthusiasm.3
Why? Because, ultimately, we’ve spent millions of years evolving for face-to-face communication and group dynamics. The human brain is incredibly keyed into the rich sensory smorgasbord provided by other humans, even syncing activity during conversations.
And we’re usually very sensitive to issues and discrepancies in human interaction, hence the uncanny valley effect, where we find anything not quite 100% human actively off-putting.

And yes, young people develop intense crushes on essentially-imaginary individuals, but those youthful crushes rarely survive contact with the real person, suggesting face-to-face interactions overrule mental ones when they’re in conflict. Which implies that our brains defer to real-world connections over less tangible ones.
And sure, many a relationship, friendly or romantic, has blossomed online. But people still regularly say it’s become ‘official’ when a face-to-face meet happens. Again, real-world connections seem to carry more weight.
As for the many individuals who claim to love fictional characters, a lot of them, e.g. Bronies, belong to distinct communities. I.e. they also have connections to real people. There’s a difference between forming relationships with fictional characters, and your only relationships being with fictional characters.
And if you’re still unconvinced that real-world interactions are inherently more meaningful, consider Zuckerberg’s tech-billionaire-in-arms and fellow AI enthusiast, Elon Musk. A man with a legion of enthusiastic fanboys constantly yelling about how great he is, and enough money, power, and influence to make the most ostentatious pharaoh say “Mate, chill out!”
He also has his own pet AI that he tailors to agree with his views. Basically, if virtual, artificial interactions meant meaningful relationships and contentment, he should be the happiest person on Earth.
And yet, he still spends an obscene amount of time cheating at a video game, in what seems to be a desperate (and unsuccessful) effort to convince real people he’s good at something. It speaks volumes about how fundamentally important the approval of actual people is to us. Whoever we are.
For the record, I accept that many very lonely people have found comfort through interactions with chatbots. And that’s good. But then, many a dehydrated person admitted to hospital has been helped by an intravenous drip. But you seldom see people in the street or at the office with an IV hooked up to their arm.
I’d contend the same applies to AI relationships. It’s good if lonely people have access to something that makes them feel less isolated. But based on what we know about how people work, it’s likely more of a stopgap than a long-term solution. A diet of IV and vitamin pills, rather than actual food. I’d be interested (and nervous) to see how the people who rely on AI chatbots for companionship will turn out, even 5 years from now. It’s very recent tech, after all.
And this is all assuming AI chatbots are, at the very least, neutral entities, just reacting to the inputs they receive. But they aren’t, are they? AI companions are all created and owned by private companies or individuals with their own agendas, which is usually “making money”.
Zuckerberg himself said an AI companion could know you “as well as your algorithms”. But does anyone need a friend who tries to sell you a dozen kettles every time you interact, on the grounds that you mentioned you needed a new one, 13 months earlier? Or a friend who assumes you’re an aspiring fascist because you once got a ride from a taxi driver who moaned about immigrants? That doesn’t seem like a healthy setup for anyone.
And Mark Zuckerberg may believe that online interactions are just as effective at forming rewarding emotional bonds, if not better, than face-to-face ones. But a counterargument to that would be that “It is still easier to build trust in person”. At least, according to… [checks notes] Mark Zuckerberg.
His stance on whether or not technological relationships are as meaningful as real-world ones seems to vary, according to whether it makes him more money or not in a particular context. It’s weird, in a ‘not weird at all’ kind of way.
That’s the thing about actual, real friendships and relationships. They’re not transactional, or profit-motivated. They’re mutual. Your friend wants to be your friend. They care about you. And that involves more than just agreeing with you and telling you what you want to hear. They’ll also tell you what you need to hear, because your wellbeing matters more to them than your ego. This is how we grow, and develop, and learn.
Interactions with ‘others’ that only tell you what you want to hear, so they can profit from you? That sounds like what a lot of child stars experienced in the old days. And all of them turned out fine, right?
Right?
Dean covers this issue at length in his book, Emotional Ignorance, which you can get a signed edition of from the Shambles shop. You should definitely check it out. You can trust him, he’s a real person.