A school counselor at a private school in California recently emailed me to ask for help with students misusing specific AI bots. Her concerns did not center on general-purpose AI bots like Snapchat’s My AI or Microsoft’s Copilot, but rather on those specifically used as virtual girlfriends and virtual boyfriends. I thought it would be instructive for other youth-serving professionals (and parents and guardians) to get up to speed on the positives and negatives of these in-app “software agents” or “artificial conversational entities.” So, let’s dive right in!
Just like Khan Academy’s Khanmigo provides learners with specific educational guidance, strategies, and solutions, and Expedia’s chatbot helps vacationers sort out their destination ideas and trip details, virtual girlfriend and virtual boyfriend bots give those interested a screen-confined romantic partner that can interact much like a human thanks to AI and the technologies that undergird it (e.g., natural language processing, machine learning, deep learning, and neural networks). Some of us may remember the critically acclaimed, Academy Award-winning 2013 film Her starring Joaquin Phoenix and Scarlett Johansson, in which the male lead (Theodore) falls in love with an AI virtual assistant (Samantha). That film vividly depicted how incredibly human-like, conversational, and engaging these chatbots can be as they interact with a user, and now the technology is available to almost everyone via a simple app download from Apple’s App Store or Google’s Play Store. While interactions occur primarily via text chat, some apps provide voice messages, voice calls, and even image exchange functionality. Users can also customize their virtual boyfriend or girlfriend bot to look, dress, act, and interact how they want, and this personalization may foster a deeper attachment than if the avatar with whom they are talking and flirting were generic or less human-like.
A search in the app stores for “virtual girlfriend” and “virtual boyfriend” brings up numerous results, including iGirl, AI Girlfriend, AI Boyfriend, and Eva AI. What might be some benefits of using these apps? Well, we understand that youth in particular long for companionship, seek belongingness within intimate relationships, explore their sexuality in novel ways, and find enjoyment and excitement in certain risk-taking behaviors. Teenagers may gravitate toward virtual boyfriends and girlfriends to address feelings of loneliness or disconnection, or to receive the affirmation, attention, affection, and validation missing from their other relationships. One app markets itself as having the ability to make users feel “cared, understood and loved.” Another states that its product helps users experiment with romantic advances and exchanges with “someone” before doing so in their normal social circle.
Potential concerns, though, relate to what a user may directly or unwittingly expose themselves to. For instance, a teen may begin flirting innocuously with their virtual girlfriend but then be introduced to mature sexual language, imagery, or experiences well before they are developmentally ready to handle them. While one hopes that a teen would immediately exit such an app, it’s possible they stay engaged too long and the inappropriate content they read or see produces a measurable traumatic outcome (or at least introduces confusion, fear, and an unhealthy view of romantic relationships and/or sexual activity).
A teen might also become heavily involved with their virtual boyfriend and play out romantic or sexual fantasies in ways that distort reality, feed overuse, and misrepresent how relationships with other humans actually work. For instance, research indicates that interactions with chatbots do not require much cognitive effort and are therefore sometimes preferred over human interactions. The problem, of course, is that youth who disproportionately or primarily interact with chatbots because of their simplicity may fail to develop the social skills necessary to navigate the messy complexities and nuances of actual human romantic relationships. Such users might also struggle with unhealthy emotional attachments and dependencies that can lead to psychological damage if they do not understand the importance of maintaining their own individuality and sense of self.
Relatedly, engaging intensely with a virtual boyfriend or girlfriend may alter a teen’s expectations of the availability, malleability, and amenability of others. Said another way, if I am able to construct a girlfriend within an app to abide by my ideals of physical beauty, and also control how she dresses, talks, and acts toward me, it is reasonable to assume that this will color and condition my view and treatment of girls and women over time if I have no other reference points or teachable moments. Gender roles and perceptions may also be affected by the fact that giving money to these apps unlocks additional (often sexual) content and features. I wouldn’t want my son or daughter to think that they can simply pay more or give up more to get someone else to be romantically interested in them or promiscuous with them. Wow, even writing out that sentence felt very icky, which underscores the discomfort of this topic. But this is where we are, and educators, mental health professionals, families, and others who work with young people must understand the pull of this phenomenon.
So what can we do in response? I predict the use of virtual girlfriend and virtual boyfriend bots will persist, and perhaps even grow in frequency. It’s relatively easy for anyone of any age to download one of these apps, build their dream romantic partner in avatar form, and then communicate with it. As such, I wouldn’t use fear-based messaging to keep youth from such experimentation. What I would do is have a conversation with them that looks toward the future they are shaping. I’ve taken the liberty of fleshing out some points worth considering in your role as an educator, parent, or other youth-serving adult should you want to broach this topic with a teen.
1. It is completely normal and natural to feel a strong desire to connect with someone else, even if it’s online and even if it’s a bot. We all want to feel truly seen, understood, and valued by others, and when that is not happening, loneliness, self-pity, and sometimes even self-hatred can take over. We don’t want our teens to feel lonely, and we want them to be seen and valued by more than just their family or teachers. But AI bots may very well be a short-term fix that does not truly meet that visceral need over the long term. Interestingly, recent research suggests that using AI chatbots may actually make many users feel more lonely. Perhaps a different strategy is needed to help a teen find their “people” – or at least find one or two other members of their peer group with whom they can get their relational needs met.
2. Chatbots are built on large language models, which are trained by analyzing the structure and patterns of sentences and paragraphs across billions upon billions of words posted or uploaded by people all over the Internet. As such, a virtual girlfriend or boyfriend is using computational models to determine what to say in response to what you’ve typed (by predicting the next most sensible word, and then the next, and then the next). That’s it. It’s all very artificial and contrived. It can quickly get boring when everything the bot says in response is cobbled together from the intelligent processing of seemingly relevant but absolutely generic textual content online. It’s not really personalized, and it’s not true intimacy by any stretch.
3. Virtual girlfriend and boyfriend bots can negatively affect how someone perceives and interacts with the person they are romantically interested in. The person your teen has a crush on has their own unique hopes, dreams, commitments, values, imperfections, and idiosyncrasies. The most beautiful thing about a romantic relationship is slowly unveiling those to your partner, and treasuring and uplifting what they unveil to you. Real relationships are bidirectional, not self-serving. Real relationships are hard work and inconvenient, not available only when you feel like it. Real relationships are messy and challenging, but incredibly worth the effort. It seems more valuable for youth to spend their time focusing on their current friendships and potential relationships with other humans, because through those they learn so much about patience, grace, kindness, tolerance, and mutual respect – traits that will serve them well in their romantic relationships. The current iteration of AI bots in this space is not helping them level up in those areas.
4. It’s helpful to ask the teen about the girlfriend or boyfriend they would like to have one day. Encourage them to share what they hope for, their particular likes and dislikes, and the type of person they envision as a potential partner. Then shift gears and ask, “What do you think your potential partner would want in a boyfriend or girlfriend?” The point is to gently challenge them to become that person and to recognize the areas of life in which they need to grow. It is also instructive to ask them whether chatting with an AI bot is moving them in that direction or holding them back. Remind them that they have a choice right now to do the things that will help them achieve the romantic goals they have for the future.
5. Consider discussing the reality of premature exposure to sexual language and content. Almost every app, platform, and search engine provides access to such material, but it may be the major motivation for having a virtual girlfriend or boyfriend. Teens may believe they can handle all sorts of mature interactions and fantasy roleplay, but it is such a gift to be young and relatively innocent. Plus, those who are consistently exposed to sexually explicit content tend to engage in problematic and unhealthy sexual behaviors and also experience decreased sexual satisfaction. This may or may not be relevant to the youth under your care, but wise choices made while young can help them avoid certain struggles when they are older.
6. This also applies to almost every platform, but virtual boyfriend and girlfriend apps are very likely selling user data, sharing it for targeted advertising, and caring only about acquiring subscribers and paying customers. Pointing this out may give teens information they hadn’t fully considered yet, and may lead them to decide against using these apps.
Youth-serving adults can pick and choose the talking points they’d like to emphasize, but I hope what I’ve shared has provided some options and shed new light on the adolescent landscape. I’d be interested in learning more about what you’re seeing among your students, and whether you’ve been able to tackle this topic in a way that broadens their minds as to the short-term and long-term implications. As with all novel technological developments that affect youth, we must avoid fear-mongering, consider the benefits these tools may provide, remain rational, calm, and non-judgmental when conveying our concerns, and stay in front of potential problems through education, awareness-raising initiatives, and continued dialogue.