“Half of these worlds took the additional step of rejecting a child’s immediate attempt to re-register as an age-eligible user from the same computer,” the report says. “Three of the teen- and adult-oriented virtual worlds in which the Commission found a moderate to heavy amount of explicit content had separate ‘adult only’ sections to keep minors from viewing age-inappropriate content; these worlds also employed supplemental age-segregation initiatives to prevent interactions between adults and minors.”
While the rules of conduct for the teen and adult sites with moderate to heavy explicit content did prohibit certain types of sexual, threatening, or abusive material, “they did so in vague terms that provide little guidance to users about specific prohibited conduct. Indeed, the Commission found explicit content in these worlds despite their rules of conduct, a fact that indicates that conduct standards, on their own, are insufficient to stem the creation of or exposure to explicit material.”
While many virtual worlds use “age screens”—requiring users to enter a birth date before accessing a site—to keep minors from accessing sites intended for adults, the report says that age screening is “only a threshold measure that operators should take to prevent youth access.”
It recommends that virtual-world operators include separate “adults only” sections, either by subscription or through age verification, to keep minors from viewing inappropriate content, as well as age-segregation initiatives that give users different experiences depending on their birth date.
Many virtual worlds already use community policing measures, such as abuse reporting, flagging, and live moderators, and some use filtering technologies to enforce community standards.
In particular, the FTC makes five recommendations to virtual-world operators to help reduce the risk of youth exposure to explicit content:
• Use more effective age-screening mechanisms to prevent children from registering in adult virtual worlds.
• Use or enhance age-segregation techniques to make sure that people interact only with others in their age group.
• Re-examine language filters to ensure that they detect and eliminate messages that violate rules of behavior in virtual worlds.
• Provide more guidance to community enforcers in virtual worlds so they are better able to review and rate virtual-world content, report potential underage users, and report any users who appear to be violating rules of behavior.
• Employ a staff of specially trained moderators who are equipped to take swift action against rule violations.
The report also recommends that parents and children become better educated about online virtual worlds.
Effectiveness of age screening
The study examined sites such as Second Life, Kaneva, There.com, IMVU, and Red Light Center.
Second Life’s age-screening system is unique among the virtual worlds the FTC studied, the report noted, in that Second Life automatically segmented registrants into three age categories based on the date of birth the users first entered during the registration process.
All users registering for the Second Life grid, whether for the adult-oriented Second Life or for Teen Second Life, register at the same URL. Depending on the birth date entered, a user is either rejected (if the age entered is under 13), directed to Teen Second Life (if the age entered is between 13 and 17), or directed to adult Second Life (if the age entered is 18 or older). In the first two cases, a persistent cookie correlating to the entered age is set, giving the user either no access to Second Life at all or access only to Teen Second Life.
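The routing logic described above can be sketched in a few lines. This is an illustrative reconstruction of the behavior the FTC report describes, not Second Life’s actual code; the function name, grid labels, and cookie values are all hypothetical.

```python
from datetime import date

# Hypothetical labels for the three registration outcomes the report describes.
REJECTED = "rejected"    # under 13: no access at all
TEEN_GRID = "teen"       # Teen Second Life (ages 13-17)
ADULT_GRID = "adult"     # adult-oriented Second Life (18+)

def route_registrant(birth_date: date, today: date):
    """Return (destination, persistent_cookie) for an entered birth date.

    In the 'rejected' and 'teen' cases a persistent cookie is returned,
    mirroring the report's description: an immediate attempt to re-register
    from the same browser with an older birth date can then be detected
    and refused.
    """
    # Age in whole years, adjusting down if this year's birthday hasn't passed.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if age < 13:
        return REJECTED, "age_gate=rejected"
    if age < 18:
        return TEEN_GRID, "age_gate=teen"
    return ADULT_GRID, None  # adults get full access; no restricting cookie
```

For example, a registrant who enters a birth date making them 15 would be routed to the teen grid and receive a persistent cookie, while an 18-or-older registrant is sent to the adult grid with no cookie set.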