Arguably one of the biggest concerns about the metaverse, especially for law enforcement, parents, guardians, and caregivers of children, is the potential for pedophilia, online grooming, and child sexual exploitation. To be sure, the media is quick to focus on this hazard with any new environment (offline or online) that allows adults and youth to commingle in some capacity, and metaverse platforms are no exception. It is important, therefore, to try to separate fact from fiction in order to assess the possible extent of these concerns and to determine how best to proactively and reactively address them.

It is fair to say that participation in virtual reality (VR) is increasingly occurring among younger populations. Based on a Spring 2022 survey of 7,100 teens from 44 US states, 26% own a VR headset, while 5% say they use it on a regular basis.1 Some research has identified that adolescents make more rational and mature decisions in situations that are not emotionally arousing but struggle to choose wisely (e.g., make riskier choices) in high-arousal environments.2, 3 Considering that the metaverse provides a richer, deeper, and more visceral experience marked by heightened emotionality, it stands to reason that youth may be especially susceptible to grooming on these immersive platforms. Compounding factors, of course, relate to their unique life stage, which typically involves a desire to follow their impulses and take risks,4, 5 to sexually experiment,6 and to be validated by others outside of their family.7, 8 It also bears mentioning that certain populations of youth are disproportionately susceptible to online grooming, such as those who suffer from emotional distress or mental health problems,5, 9, 10 low self-esteem,11 or poor parental relationships and weak family cohesion.12-15 Because of the intrinsic characteristics of metaverse environments, these more vulnerable groups merit additional attention and support.

Unfortunately, research on online victimization in metaverse environments is sorely lacking. Some guidance can be found, though, in the incipient research base involving gaming, VR, and sexual violence among adults.16-18 While uncommon, instances of groping,19-21 forced engagement with sexual imagery and interactions in extended reality spaces,22, 23 unwanted sexual advances, and the overriding, misunderstanding, or outright disregard of sexual consent have taken place over the years in various online multiplayer environments. The exchange of child pornography was a non-trivial problem on Second Life (created in 2003 by Linden Lab),24-26 and it intuitively makes sense that other metaverse properties have the potential to serve as shadowy alternatives to the “real world” where those so inclined can operate with a measure of perceived impunity. Of course, detection technologies such as Microsoft’s PhotoDNA, Google’s Content Safety API and CSAI Match, and Meta’s PDQ and TMK+PDQF – as well as other advances in AI – have been developed in recent years to help forestall the distribution of child sexual abuse material (CSAM) in VR spaces.
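These systems generally work by computing a compact perceptual “fingerprint” of an image and comparing it against a database of hashes of already-verified abuse material, so that re-encoded or resized copies can still be flagged. As a minimal sketch of that general hash-and-match approach – not PhotoDNA or any of the proprietary systems named above, whose algorithms are not public – the example below uses the open-source Python imagehash library; the hash database and match threshold are hypothetical placeholders.

```python
# Minimal sketch of perceptual hash matching, the general technique behind
# tools like PhotoDNA. This is NOT any vendor's actual algorithm; it uses the
# open-source `imagehash` library (pip install imagehash pillow) purely to
# illustrate the hash-and-compare workflow.
from PIL import Image
import imagehash

# Hypothetical database of hashes of known, verified abuse material. In
# practice such hash lists are maintained by organizations like NCMEC and
# are never distributed as raw images.
known_bad_hashes: set[imagehash.ImageHash] = set()

# Assumed Hamming-distance threshold: a small distance means the uploaded
# image is a near-duplicate of a known item, even after resizing or
# re-encoding.
MATCH_THRESHOLD = 8

def should_block_upload(path: str) -> bool:
    """Return True if the uploaded image is a near-match to known material."""
    uploaded_hash = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    # ImageHash subtraction yields the Hamming distance between two hashes.
    return any(uploaded_hash - known <= MATCH_THRESHOLD
               for known in known_bad_hashes)
```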

Some studies have been conducted on what has been termed “sexual ageplay” in Second Life.27-29 This phenomenon involves consenting adults intentionally choosing child avatars to engage in child/child or child/adult sexual activities within that virtual world. It has not been proven conclusively that such actions in Second Life are correlated with offline sexual harms toward children.24 Moreover, sexual ageplay is not (currently) illegal because the participants are consenting adults, the content involves fantasy images, and no actual children are directly harmed.30, 31 However, there is concern that such role-playing may normalize this aberrant behavior for onlooking children, may intentionally facilitate the grooming of children (either individually or planned with like-minded others), and may bolster inappropriate feelings toward this population.29, 32, 33 Strict policies and controls for how avatars can interact – especially avatars representing minors – should be in place to prevent misuse (one such rule check is sketched below). Additionally, VR environments – like many forums or channels on other online platforms – have the potential to spawn insular communities where children are viewed as sexual commodities; within such communities, these behaviors are reinforced and validated, specific tactics are shared, inhibitions are lowered, and rationalizations proliferate.28, 34
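To make the call for “strict policies and controls” concrete, here is one minimal, hypothetical sketch of how a platform might gate interactions involving unverified accounts or child-presenting avatars. Every field, interaction type, and rule below is an illustrative assumption rather than any platform’s actual policy.

```python
# Hypothetical sketch of an avatar-interaction policy check. The fields and
# rules here are illustrative assumptions, not any real platform's logic.
from dataclasses import dataclass

@dataclass
class Avatar:
    account_id: str
    is_verified_adult: bool   # account passed the platform's age verification
    child_presenting: bool    # avatar visually represents a minor

# Interaction types assumed to carry elevated risk and therefore gated.
RESTRICTED_INTERACTIONS = {"private_room_invite", "intimate_gesture", "gift_exchange"}

def interaction_allowed(initiator: Avatar, target: Avatar, interaction: str) -> bool:
    """Conservatively deny restricted interactions in risky configurations."""
    if interaction not in RESTRICTED_INTERACTIONS:
        return True
    # Deny if either party has not been age-verified...
    if not (initiator.is_verified_adult and target.is_verified_adult):
        return False
    # ...or if either avatar presents as a child.
    if initiator.child_presenting or target.child_presenting:
        return False
    return True
```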

As such, there must exist simple, frictionless ways for any user who witnesses CSAM, grooming, sexual ageplay, or other prohibited conduct to report it in a way that captures virtual location, participants, onlookers, date and time stamps, and footage. For immersive environments that cater to adults seeking to engage in mature behaviors, age-gating is a must – even though age verification remains one of the largest (and clunkiest) challenges in the Trust and Safety space, given concerns about personal privacy, the potential for data leaks, forged identification documents, and the legitimate need for anonymity in some contexts. It may be a necessary evil that more companies will invariably adopt (e.g., Roblox currently requires a government ID and a selfie to be uploaded in order to use its voice chat feature – a policy it adopted after evaluating multiple options, including the use of video call verifications and government database crosschecks).
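The fields named above map naturally onto a structured report payload that the client assembles automatically, so the reporter only has to confirm and submit. The sketch below shows one hypothetical shape such a report might take; all field names are assumptions for illustration.

```python
# Hypothetical schema for a frictionless in-world abuse report. Field names
# are illustrative; the point is that the client captures context (location,
# participants, witnesses, time, footage) automatically.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AbuseReport:
    reporter_id: str
    category: str                   # e.g., "csam", "grooming", "sexual_ageplay"
    world_id: str                   # virtual location (room/instance identifier)
    participant_ids: list[str]      # avatars directly involved
    onlooker_ids: list[str]         # avatars present as witnesses
    captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    footage_ref: str | None = None  # pointer to an auto-captured clip buffer

# Example: a one-tap report where the client fills in everything but category.
report = AbuseReport(
    reporter_id="user-123",
    category="grooming",
    world_id="plaza-instance-77",
    participant_ids=["avatar-456"],
    onlooker_ids=["avatar-789"],
    footage_ref="clip://rolling-buffer/last-120s",
)
```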

While AI solutions continue to improve, manual content moderation by Trust and Safety team members who work for each respective platform also seems essential. Trained, empowered users must be present to supervise rooms and landscapes where sexual exploitation might take place, serving a deterrent function through their presence and a reactive function through their flagging and reporting. Similar to this is the concept of bottom-up, user-driven volunteers whose primary purpose is to keep tabs on situations and scenarios where various kinds of harm can take place, and to play an intervening, protective role.35, 36 Accordingly, they can act as “arbiters, governors, community managers, teachers, role models, curators, and enforcers” and operate based on a strong commitment to preserving the integrity of the space they surveil.37:1421

Clearly stated Terms of Service and Community Guidelines remain valuable,38, 39 and they should be available not only in corners of the platform’s corporate website but also pushed to users via messages and prompts that remind them about appropriate behavior in both public and private forums. As another idea, third-party blocklists have been used to help individuals stay safe from harassment on Twitter by easily allowing them to guard against interactions from those communally identified as “harassers”.40 While understanding the obvious limitations of knowing exactly who is behind an account, could a blocklist of usernames, avatars, or personas regularly suspected of sexually harmful behavior be made available by conscientious third-party developers or watchdog organizations so that other users (youth or adults) can preemptively avoid interacting with them?
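Mechanically, subscribing to such a shared list and screening interactions against it is simple. The sketch below is a hypothetical illustration; the blocklist URL and its one-ID-per-line format are assumptions.

```python
# Hypothetical sketch of consuming a third-party, community-curated blocklist
# and using it to preemptively filter interactions. The URL and line-based
# format (one account ID per line) are assumptions for illustration.
import urllib.request

def fetch_blocklist(url: str) -> set[str]:
    """Download a shared blocklist: one suspected-harmful account ID per line."""
    with urllib.request.urlopen(url) as resp:
        return {line.strip() for line in resp.read().decode().splitlines()
                if line.strip()}

# A user could subscribe to one or more lists from watchdog organizations.
blocked = fetch_blocklist("https://example.org/watchdog/vr-blocklist.txt")

def can_interact(sender_id: str) -> bool:
    """Drop chat, friend requests, and proximity voice from listed accounts."""
    return sender_id not in blocked
```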

Finally, platforms are creating in-app or in-game tools – like Meta’s Personal Boundary or Microsoft’s Space Bubble – to protect users, but they must make sure that individuals are actively onboarded in a way that confirms they know, understand, and are capable of using these safety features when needed. Similarly, users must also understand how they can quickly extricate themselves from any unsafe interaction (e.g., in AltspaceVR you can use the Radial Menu to quickly select “Title Screen,” which pulls you out of the current scene and returns you to your Command Center). Relatedly, in June 2022 Meta released new supervision tools so that parents and guardians can better control what their teen downloads, plays, or experiences via the Quest headset, view their teen’s list of Friends, and monitor headset screen time. Presumably, other hardware and software manufacturers will follow with similar family safety features in the near term and will push each other to continually enhance such offerings. However, it is arguably not enough to shift the onus of responsibility onto the userbase by simply creating safety controls for them to employ; rather, platforms must walk users through those controls to increase the likelihood they know how to access and utilize them. This seems especially true when attempting to support and safeguard younger users from the potential for sexual exploitation and related harms in metaverse environments.
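Under the hood, a feature like Personal Boundary or Space Bubble amounts to a simple proximity rule: if another avatar’s movement would bring it within a set radius of you, that movement is refused (or the avatar is faded out). The sketch below illustrates the basic distance check; the radius value and all names are assumptions, not Meta’s or Microsoft’s actual implementation.

```python
# Minimal sketch of a "personal boundary" proximity rule of the kind behind
# Meta's Personal Boundary or Microsoft's Space Bubble. The 1.2 m radius and
# all names here are assumptions, not either vendor's implementation.
import math

BOUNDARY_RADIUS_M = 1.2  # assumed per-user setting, in meters

Point = tuple[float, float, float]

def clamp_approach(my_pos: Point, other_pos: Point, proposed_pos: Point) -> Point:
    """Block another avatar's movement that would enter my boundary.

    Returns the position the other avatar is allowed to occupy: the proposed
    position if it stays outside the boundary, otherwise the avatar's current
    position (movement into the bubble is simply refused).
    """
    if math.dist(my_pos, proposed_pos) >= BOUNDARY_RADIUS_M:
        return proposed_pos
    return other_pos
```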

Images: Mart Production, Emily Wade, Meta, and Julia M Cameron

References

1. Piper Sandler. Taking Stock with Teens: 21+ Years of Researching U.S. Teens, GenZ Insights. 2022. Accessed June 2, 2022. https://www.pipersandler.com/private/pdf/TSWT_Spring_2022_Full_Report.pdf

2. Whittle H, Hamilton-Giachritsis C, Beech A, Collings G. A review of young people’s vulnerabilities to online grooming. Aggression and Violent Behavior. 2013;18(1):135-146.

3. Van Duijvenvoorde AC, Jansen BR, Visser I, Huizenga HM. Affective and cognitive decision-making in adolescents. Developmental Neuropsychology. 2010;35(5):539-554.

4. Ybarra ML, Mitchell KJ, Finkelhor D, Wolak J. Internet prevention messages: Targeting the right online behaviors. Archives of Pediatrics & Adolescent Medicine. 2007;161(2):138-145.

5. Soo K, Bodanovskaya Z. Risk factors of becoming a victim of Internet-related sexual abuse. In: Online Behaviour Related to Child Sexual Abuse: Literature Report. European Union and Council of the Baltic Sea States, ROBERT Project (Risktaking Online Behaviour: Empowerment Through Research and Training); 2012.

6. Quayle E, Jonsson L, Lööf L. Online Behaviour Related to Child Sexual Abuse: Interviews with Affected Young People. ROBERT Project (Risktaking Online Behaviour: Empowerment Through Research and Training), European Union & Council of the Baltic Sea States; 2012.

7. Dombrowski SC, LeMasney JW, Ahia CE, Dickson SA. Protecting children from online sexual predators: technological, psychoeducational, and legal considerations. Professional Psychology: Research and Practice. 2004;35(1):65.

8. Stanley J. Child Abuse and the Internet. Child Abuse Prevention Issues, Vol. 15; 2001.

9. Mitchell KJ, Finkelhor D, Wolak J. Risk factors for and impact of online sexual solicitation of youth. JAMA. 2001;285(23):3011-3014.

10. Wolak J, Finkelhor D, Mitchell KJ, Ybarra ML. Online “predators” and their victims: Myths, realities, and implications for prevention and treatment. American Psychologist. 2008;63(2):111-128.

11. Olson LN, Daggs JL, Ellevold BL, Rogers TK. Entrapping the innocent: Toward a theory of child sexual predators’ luring communication. Communication Theory. 2007;17(3):231-251.

12. Wolak J, Finkelhor D, Mitchell K. Internet-initiated sex crimes against minors: Implications for prevention based on findings from a national study. Journal of Adolescent Health. 2004;35(5):424.e11-424.e20.

13. Mitchell KJ, Finkelhor D, Wolak J. Youth Internet users at risk for the most serious online sexual solicitations. American Journal of Preventive Medicine. 2007;32(6):532-537.

14. Mitchell KJ, Finkelhor D, Wolak J. Online requests for sexual pictures from youth: Risk factors and incident characteristics. Journal of Adolescent Health. 2007;41(2):196-203.

15. Stith SM, Liu T, Davies LC, et al. Risk factors in child maltreatment: A meta-analytic review of the literature. Aggression and Violent Behavior. 2009;14(1):13-29.

16. Anderson M, Vogels EA, Turner E. The virtues and downsides of online dating. Pew Research Center; 2020. Accessed June 3, 2022. https://www.pewresearch.org/internet/2020/02/06/the-virtues-and-downsides-of-online-dating/

17. Fox J, Tang WY. Women’s experiences with general and sexual harassment in online video games: Rumination, organizational responsiveness, withdrawal, and coping strategies. New Media & Society. 2017;19(8):1290-1307.

18. Tang WY, Fox J. Men’s harassment behavior in online video games: Personality traits and game factors. Aggressive Behavior. 2016;42(6):513-521.

19. Blackwell L, Ellison N, Elliott-Deflo N, Schwartz R. Harassment in social virtual reality: Challenges for platform governance. Proceedings of the ACM on Human-Computer Interaction. 2019;3(CSCW):1-25.

20. Adams D, Bah A, Barwulor C, Musaby N, Pitkin K, Redmiles EM. Ethics emerging: The story of privacy and security perceptions in virtual reality. In: Proceedings of the Fourteenth Symposium on Usable Privacy and Security (SOUPS 2018); 2018:427-442.

21. Maloney D, Freeman G, Robb A. A virtual space for all: Exploring children’s experience in social virtual reality. In: Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY ’20); 2020:472-483.

22. Blackwell L, Ellison N, Elliott-Deflo N, Schwartz R. Harassment in social VR: Implications for design. In: 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR); 2019:854-855.

23. Sparrow L, Antonellos M, Gibbs M, Arnold M. From ‘Silly’ to ‘Scumbag’: Reddit discussion of a case of groping in a virtual reality game. 2020.

24. Russell G. Pedophiles in Wonderland: Censoring the sinful in cyberspace. J Crim L & Criminology. 2007;98:1467.

25. Garcia-Ruiz MA, Martin MV, Ibrahim A, Edwards A, Aquino-Santos R. Combating child exploitation in Second Life. IEEE; 2009:761-766.

26. Laue C. Crime potential of metaverses. In: Virtual Worlds and Criminality. Springer; 2011:19-29.

27. Kierkegaard S. Cybering, online grooming and ageplay. Computer Law & Security Review. 2008;24(1):41-55.

28. Reeves C. Fantasy depictions of child sexual abuse: The problem of ageplay in Second Life. Journal of Sexual Aggression. 2013;19(2):236-246.

29. Meek-Prieto C. Just age playing around? How Second Life aids and abets child pornography. NCJL & Tech. 2007;9:88.

30. Reeves C. The virtual simulation of child sexual abuse: online gameworld users’ views, understanding and responses to sexual ageplay. Ethics and Information Technology. 2018;20(2):101-113.

31. Richards C. Further sexualities. In: The Palgrave Handbook of the Psychology of Sexuality and Gender. Springer; 2015:60-76.

32. Lanning KV. Child molesters: A behavioral analysis for law enforcement officers investigating cases of child sexual exploitation. National Center for Missing & Exploited Children; 1992.

33. Kim C. From fantasy to reality: The link between viewing child pornography and molesting children. Prosecutor. 2005;39(2).

34. Durkin K, Forsyth CJ, Quinn JF. Pathological internet communities: A new direction for sexual deviance research in a postmodern era. Sociological Spectrum. 2006;26(6):595-606.

35. Seering J. Reconsidering Community Self-Moderation: the Role of Research in Supporting Community-Based Models for Online Content Moderation. Proc ACM Hum-Comput Interact. 2020;3

36. Jhaver S, Appling DS, Gilbert E, Bruckman A. “Did you suspect the post would be removed?” Understanding user reactions to content removals on Reddit. Proceedings of the ACM on Human-Computer Interaction. 2019;3(CSCW):1-33.

37. Seering J, Wang T, Yoon J, Kaufman G. Moderator engagement and community development in the age of algorithms. New Media & Society. 2019;21(7):1417-1443.

38. Kraut RE, Resnick P. Building Successful Online Communities: Evidence-Based Social Design. MIT Press; 2012.

39. Matias JN. Preventing harassment and increasing group participation through social norms in 2,190 online science discussions. Proceedings of the National Academy of Sciences. 2019;116(20):9785-9789.

40. Jhaver S, Ghoshal S, Bruckman A, Gilbert E. Online harassment and content moderation: The case of blocklists. ACM Transactions on Computer-Human Interaction (TOCHI). 2018;25(2):1-33.