As if porn sites and pedophiles in chat rooms weren’t frustrating enough for educators whose students use the internet, now online postings of amateur video featuring skin and violence are raising concerns. The explosion in online video-sharing sites, where clips of any nature can be easily uploaded for the world to see, has become the latest challenge for adults trying to shield children from the dangers of cyberspace.

Carol Kiesman, a mother and fourth-grade teacher in Houlton, Maine, enrolled her 14-year-old daughter in a cyberspace club called “Zoey’s Room,” so the teen could chat away online with other girls in a gated community where all participants are screened.

Imagine, then, how Kiesman cringed when her daughter, her 10-year-old son, and her fourth-grade students recently encountered homemade videos online that included nudity and animal cruelty.

“I don’t like that innocent kids can click on stuff like that,” Kiesman said. “What you view as entertainment as an adult shouldn’t be entertainment for 13-year-olds.”

Popular web sites such as YouTube, Yahoo, and Google, and soon Microsoft Corp.’s MSN, feature user-generated videos that have quickly become a phenomenally popular form of entertainment. YouTube, the leading video site that helped catapult the genre with its public launch in December, attracted more than 20 million visitors in May. The company says it averages 50,000 new video uploads per day.

The infectiousness of the video-sharing sites (users can quickly eMail friends and family to alert them to favorite videos) has created feverish sensations. The uncanny star of “The Evolution of Dance,” a comedic performance of different dance styles, has amassed more than 25 million page views in two months to become the all-time most-viewed video on YouTube, and the explosive backyard science experiment of mixing Mentos candies with Diet Coke has snowballed into hundreds of copycats, remixes, and spin-offs.

Within minutes, an auteur’s work can be viewed by thousands. At some web sites, videos garnering the most page views are automatically pushed to a highlighted list or “most popular” section.

But alongside the cute animal tricks, comic sports bloopers, and corny lip-synching sessions are extremely weird antics and clips of the crudest kind. There is a plethora of videos of people vying for attention and of young women flaunting their bodies.

The emergence of these shared communities on the broader internet has created a host of problems for educators and concerned parents, many of whom struggle with the moral and ethical challenges posed by the openness of cyberspace. There are obvious social and educational benefits to allowing children to have access to video resources on the internet, but sometimes there is a need to censor and control those experiences for fear of what kids might find, critics say.

“Even before YouTube and similar sites [emerged], access to graphic images on the internet was becoming an issue for schools,” wrote Susan Brooks-Young, a veteran educator and longtime ed-tech advocate who contributes regularly to eSchool News Online’s Ed-Tech Insider blog.

For many educators, she said, it’s a matter of interpreting existing policies–and, in most cases, making tough decisions.

“My thinking is that schools and districts need to embrace video technology, incorporating video production into classroom activities and teaching adults and students about ethics and responsibilities in creating and posting their work,” she said.

“For example, a flurry of teachers across the country allowed students to view an online video of the execution of an American in Iraq. My understanding is that in those instances, district leaders turned to existing board policies related to showing age-appropriate material and/or using non-board approved materials without parental permission to make decisions about how to handle these incidents.”

Said Brooks-Young: “I think we need to remember that we are the adults. Just because something like YouTube is available, and just because our students might see it anyway, doesn’t mean we need to make it available in classrooms; but that doesn’t mean we ignore the medium altogether.”

Some viewers, including Ellen Harris of Palo Alto, Calif., consider the racier posts an outgrowth of today’s culture.

“We certainly shake our heads when we see certain stuff, but there’s stuff like that on prime-time TV as well,” said the mother of three teenagers.

Harris thinks the homemade video explosion is an exciting new form of creativity; her family has gathered to watch some hilarious online clips together. The risque byproducts have simply become another source for family discussions–alongside television and movies–on matters such as sex, violence, or exploitation.

Still, for now, she’s asked her youngest child, a 13-year-old daughter, to stay away from MySpace, the leading social networking web site, which added video-sharing features this year.

To raise awareness that explicit or inappropriate videos could be accessible to children through popular web video sites, the New York State Consumer Protection Board last month issued a consumer alert and pushed Google Inc. to do more to protect children.

“Parents have a hard enough time policing the internet without Google Video making it easier to see and to save these types of videos,” said Teresa Santiago, the board’s chair.

While catering to a mass audience whose entertainment tastes run the gamut, the online video web sites are aware of the challenges they face in welcoming uncensored clips. They strive to be an open stage for budding musicians, comedians, and filmmakers, but they also don’t want to drive away offended viewers or advertisers.

“We are concerned about this issue and are aware that it affects most services that make video available on the internet,” Google stated in response to the New York consumer board alert.

One dilemma is that while some videos could be considered offensive or inappropriate for underage viewers, they don’t necessarily amount to pornographic or obscene material, which is prohibited by YouTube, MySpace, Yahoo, and Google.

The web sites require that those uploading a video sign off on an agreement acknowledging the prohibition of obscene submissions, such as pornography or nudity. But users who click to agree to those terms can ignore them and post anyway, slipping the clips online for a while before they get pulled.

Those top web sites all rely on viewers to alert them to objectionable clips, a form of community policing that has been used for years by other internet stalwarts such as auctioneer eBay Inc. and classified-ads provider Craigslist.

YouTube spokeswoman Julie Supan said “the really objectionable material gets flagged very quickly” and is pulled from the site usually within 15 minutes. Supan did not disclose specific figures but said “a small percentage” of daily uploads are removed, including those marked for copyright violations.

But not all flagged content gets pulled; clips stay online if the site’s editorial team doesn’t believe they violate the user agreement.

Like MySpace, YouTube sometimes keeps flagged material online but makes the clip accessible only to registered users who are 18 and older. People who say they’re younger than 13 are barred from registering, though parents and industry observers know youths can easily work around the age restrictions by entering a false birth year.
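
To make the flag-and-review process described above concrete, here is a minimal sketch of how such a moderation queue might be modeled in code. The class names, the one-flag review threshold, and the three possible outcomes (remove, age-restrict, or leave online) are illustrative assumptions drawn from the sites’ descriptions, not any company’s actual system.

```python
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    flags: int = 0                  # number of viewer reports
    removed: bool = False           # pulled from the site
    age_restricted: bool = False    # visible only to registered adult users


class ModerationQueue:
    """Illustrative flag-and-review workflow: viewers flag clips, an editor
    reviews each flagged clip and either pulls it, age-restricts it, or
    leaves it online."""

    def __init__(self, review_threshold: int = 1):
        self.review_threshold = review_threshold   # flags needed before review
        self.pending = []                          # clips awaiting an editor

    def flag(self, video: Video) -> None:
        """A viewer reports a clip as objectionable."""
        video.flags += 1
        if video.flags >= self.review_threshold and video not in self.pending:
            self.pending.append(video)

    def review(self, video: Video, violates_terms: bool,
               adults_only: bool = False) -> None:
        """An editor's decision on a flagged clip."""
        if video in self.pending:
            self.pending.remove(video)
        if violates_terms:
            video.removed = True             # the clip is pulled
        elif adults_only:
            video.age_restricted = True      # kept, but limited to adult members
        # otherwise the clip simply stays online

```

In this sketch a single flag is enough to queue a clip for an editor, which loosely matches Supan’s account of objectionable material being flagged and pulled within minutes; a real system would presumably weigh many more signals before acting.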

“We’re all battling the same thing, keeping this stuff off our site,” Supan said. “But the reality is there’s a handful of people who try to take advantage of the system. And we are trying to put more controls in place.”

Yahoo Inc., which last month added video-upload features to its already vast index of videos culled from throughout the internet, lets parents turn on a “safe search” setting that blocks children from viewing any content that has been flagged as adult. Schools also can use this feature to restrict access.
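
The “safe search” behavior described here amounts to a simple visibility filter. The sketch below is a hypothetical illustration of that idea, assuming each clip carries an “adult” flag set during review; it is not Yahoo’s actual implementation.

```python
def visible_videos(videos, safe_search_on: bool):
    """Hypothetical 'safe search' filter: when the setting is on, hide any
    clip that has been flagged as adult content; otherwise show everything."""
    if not safe_search_on:
        return list(videos)
    return [v for v in videos if not v.get("adult", False)]


catalog = [
    {"title": "Dance montage", "adult": False},
    {"title": "Flagged clip", "adult": True},
]
print([v["title"] for v in visible_videos(catalog, safe_search_on=True)])
# -> ['Dance montage']
```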

And while Yahoo, like its video-sharing rivals, doesn’t prescreen every uploaded video, any clips that appear on its featured pages must first pass muster with the company’s human editors, said Jason Zajac, Yahoo’s general manager of social media.

Still, Zajac acknowledges the system isn’t perfect. Yahoo is looking into advanced image-recognition technologies that could detect, for example, whether a certain percentage of an image consists of skin tones.
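
Yahoo has not said how such a screen would work. As a rough illustration only, the sketch below estimates what fraction of an image’s pixels fall within a commonly used skin-tone color range, using the Pillow imaging library; the RGB thresholds and the example cutoff are assumptions for demonstration, not any vendor’s actual rule.

```python
from PIL import Image  # assumes the Pillow imaging library is installed


def skin_tone_fraction(path: str) -> float:
    """Estimate the fraction of pixels that fall in a typical skin-tone
    RGB range (a widely cited heuristic; thresholds are illustrative)."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    skin = sum(
        1
        for r, g, b in pixels
        if r > 95 and g > 40 and b > 20           # reasonably bright
        and r > g and r > b                       # reddish cast typical of skin
        and max(r, g, b) - min(r, g, b) > 15      # not a flat gray
    )
    return skin / len(pixels) if pixels else 0.0


# A screening pass might sample key frames from an uploaded clip and route
# the clip to human review when, say, skin_tone_fraction(frame_path) > 0.4.
```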

Google Video said it has added more screening methods for videos that appear on its “Top 100” and popular sections. It’s also considering a “safe search” feature similar to Yahoo’s, among other improvements.

But even additional human screeners wouldn’t be a complete solution. “It’s really subjective,” YouTube’s Supan said. “What might offend some might not offend others.”

Links:

YouTube.com
http://www.youtube.com

MySpace.com
http://www.myspace.com

Yahoo
http://www.yahoo.com

Google
http://www.google.com

eSN Ed-Tech Insider
http://www.eschoolnews.com/eti/etiintro1.cfm

eSN Video Archive
http://www.eschoolnews.com/marc/video.cfm

eSN Conference Blog
http://www.eschoolnews.com/cic/necc/blog/