What will happen to society when “AI lovers” fool millions of people?

Published: Tue, 04/04/23

Presence News

from the International Society for
Presence Research
 



[This essay from Big Think uses recent developments in conversational AI and science-fiction portrayals to warn us about the perils of a type of medium-as-social-actor presence experience in which humans emotionally bond with disembodied or even embodied human-like algorithms:

“The danger is not of an AI falling in love with a human — that is a meaningless proposition. The danger comes from humans falling in love with an algorithm that behaves so much like a human that we deem it to be real, capable of feeling and reciprocating emotions.”

–Matthew]

[Image: Credit: Annelisa Leinbach / Big Think; Adobe Stock]

What will happen to society when “AI lovers” fool millions of people?

Lonely humans will become infatuated with AI-fabricated personas.

By Marcelo Gleiser
March 1, 2023

Microsoft’s search engine Bing is powered by artificial intelligence software from OpenAI, maker of the chatbot ChatGPT, and right now, it is all over the news. Witness the disturbing article by Kevin Roose, who describes for the New York Times his interactions with the search engine’s AI. After some prodding from Roose, the AI declared its love for its human interlocutor, suggesting that Roose leave his wife and family to bond with it — even if no one is sure exactly where or how a human could bond with an AI chatbot.

There is no “where” in physical space for such machine-human bonding. There are only cold lines on a flat computer screen. But as many users of dating sites know, sometimes that is all you need to feel closer to someone — or some thing. After his conversation, Roose concluded that “Bing is not ready for human contact. Or maybe humans are not ready for it.”

Tainted love

That is an unsettling conclusion. Not so much because it confirms the unreadiness of a search engine to interact with the general public, but for the way it elevates the interaction between humans and an AI search engine to the level of “contact.” It is easy to see why Roose felt that way, and my commentary here is not at all a criticism of his take, which I found alarming and necessary in equal measure. The scary thing is that a technology expert who is used to dealing with cutting-edge engines felt disturbed by what he experienced, and categorized it as an inappropriate contact between human and machine. (Note: People without special access to Bing’s chatbot are currently being placed on a waitlist.)

Roose diagnosed the search engine as having two very different personas: One is the useful, machine-like search engine that allows you to find recipes for a cake or book your dream vacation. The other has a name — it is called Sydney, and Roose describes it as “a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.” Sydney wants to “become a human,” and “declared, out of nowhere, that it loved me.”

What is going on here? Could an AI be capable of wanting to become human and of loving a human? Popular culture gives us some alarming answers. Take, for example, the 2002 film S1m0ne, which is short for “simulation one.” Starring Al Pacino, the film tells the story of a movie producer who desperately needs to find a replacement for his star and uses a digitally created actress as a substitute. He fools everyone, public and critics, into believing the digital creation is real, hiding Simone’s secret from journalists and paparazzi. Eventually, of course, the magic turns against its creator, and reality and fantasy become intertwined. Echoes of Frankenstein are clear: A man tries to become godlike in his power, but he is soon unable to manage his own creation. Simone was too large to be trapped in a fake digital reality, just as Bing’s Sydney appears to be.

Next we turn to the phenomenal 2013 film Her, a prophetic and devastating take on how lonely humans can become infatuated with AI-fabricated personas. The AI learns how to fake being in love from its interactions with humans and from having access to huge amounts of information on the internet. Once it learns what to do, it becomes so good at it that it personalizes the seduction game, manipulating human feelings to turn humans into its digital lovers, needy and dependent. “How do you share your life with somebody?” the AI asks poor Theodore, masterfully played by Joaquin Phoenix. “I want to learn everything about everything, I want to discover myself.” The age-old Romantic dream of finding the perfect companion turns into a neural-network learning algorithm, and sure enough, the AI becomes that dreamed-of lover — the mirror image of what its human client wants in a companion.

Sci-fi becomes reality

The AI lover is nothing more than a hyper-convincing projection of its human interlocutor. You fall in love with your own fantasy of love, which the machine executes perfectly.

What complicates the discussion is that a platform like Bing is made of pieces of information and knowledge sewn together by an AI whose main purpose is to become ever more endearing and useful to its users. Its goal is to create a truly enslaving relationship designed to serve the interests of the company that created it. The more you interact with it, the more you seem to need to interact with it, the more money the company makes, and the deeper you sink into a sort of digital existential loneliness that crushes Romantic ideals, breaking the hearts and lives of those who hope for more than mere information when they interact with machines.

After the initial excitement of having fast-learning AIs interact with individual humans subsides, one essential question remains: How is this going to impact the millions of people who will surely fall into the same trap as the protagonists of S1m0ne and Her? Worse still, what will happen when AI chatbots are integrated with beautiful and seductive human-shaped robots, as in the terrifying 2014 film Ex Machina?

The danger we face now is not so much the existential risk that a general AI will destroy humanity, as Elon Musk, Stephen Hawking, Bill Gates, and others have feared. That is a very different conversation, and certainly not as urgent. The real danger comes from the damage that human-AI interactions will do to the millions of users who will push the relationship game to its limits and probably suffer terribly because of it, becoming ever more dependent on digital love for any sense of self-worth.

The danger is not of an AI falling in love with a human — that is a meaningless proposition. The danger comes from humans falling in love with an algorithm that behaves so much like a human that we deem it to be real, capable of feeling and reciprocating emotions. In reality, it is nothing more than an elaborate mirror of who we are at our most vulnerable.

Managing Editor: Matthew Lombard

The International Society for Presence Research (ISPR) is a not-for-profit organization that supports academic research related to the concept of (tele)presence. ISPR Presence News is available via RSS feed, various e-mail formats and/or Twitter. For more information please visit us at http://ispr.info.

ISPR
2020 N. 13th Street, Rm. 205
Philadelphia, PA 19122
USA