Yesterday's Family Online Safety Institute conference was held in Washington, DC at the Newseum, just a laptop's throw from the Capitol. During the closing evening session, we could see the glowing marble of the well-lit Capitol building from our vantage point on the 8th floor of this amazing new museum dedicated to news and journalism.
Stephen Balkam, the chief executive of FOSI, delivered a series of recommendations that I'll summarize as follows:
1. Hold an annual White House online safety conference
2. Create a US Council for Internet Safety
3. Establish a $100 million campaign to fund research and help educate the public about issues of internet safety
4. Establish the role of National Safety Officer within the office of the new Chief Technology Officer
You can read the full text of the recommendations here.
One aspect of yesterday's festivities that caught a few of us by surprise came during a panel on First Amendment speech that included North Carolina Attorney General Roy Cooper. Larry Magid of CBS Radio (and ConnectSafely) questioned AG Cooper about the focus of the Attorneys General's task force, convened at Harvard Law School's Berkman Center to review strategies for keeping children safer on social networking sites; that task force's recommendations will be issued in mid-January. Magid's question was why focus solely on the issue of online predators (at the heart of law enforcement's beef with social networks) when a more common issue facing America's youth is online harassment, or cyberbullying.
AG Cooper's response was that his data on what the biggest issue is comes directly from law enforcement. Magid, followed closely by a similar query from Nancy Willard, author and head of the Center for Safe and Responsible Internet Use, countered that if data were the issue, one might review the body of research that supports their concern. Citing statistics such as 43% of children having already been victims of cyberbullying and 1 in 4 children admitting to being cyberbullies, along with one researcher's concern that we've yet to address the problem of the silent observer population (those who see the bullying but make no effort to stop or report it), both Magid and Willard felt there was overwhelming evidence that the AGs' effort, though well-intentioned, may have diverted funds, talent and energy away from the most significant and frequent Internet danger in favor of the most terrifying and media-hyped Internet issue.
AG Cooper responded that he had not seen that research.
I'm not expressing this as well as others have (I flew home on a red-eye and have had about 4 hours' sleep). Here's how the Washington Internet Daily wrote it up:
"North Carolina’s attorney general said he doubts the research finding that most cases in which children are bullied or solicited sexually online involve other children. Speaking at the Family Online Safety Institute’s annual conference, Attorney General Roy Cooper urged further study, in response to audience questions about why attorneys general seem to emphasize age verification and walling off children from adults, when children are most likely to be victimized by other children.
“I question the research on that,” Cooper said. “That doesn’t square with what law enforcement is telling us on a consistent basis.”
Agents who pose as children online are solicited very quickly, he said. Yet the experience of agents doesn’t match what researchers have found, said Larry Magid, of SafeKids.com, from the audience. Perhaps children aren’t nearly as naive as everyone seems to believe, he said.
The attorneys general seem fixated on following their “myopic path” in search of the “golden grail of age verification,” said audience member Nancy Willard, of the Center for Safe and Responsible Internet Use. She asked Cooper why the attorneys general don’t seem interested in a comprehensive solution. “I would reject your notion that we have been myopic,” Cooper said. The AGs are acting comprehensively, he said, and that's why the Internet Safety Technical Task Force was formed.
Even if fewer children are solicited than believed -- if the figures were one in 29 rather than one in five -- there are still too many, Cooper said. Panelist Robert Corn-Revere, though, said he worries about possible side effects of a standardized age-verification system. A pedophile might get a free pass if he got verified as a minor, Corn-Revere said.
Panelists didn’t seem very worried about private bodies taking on the role of speech arbiters. But Gene Policinski, executive director of the First Amendment Center, asked whether this middle ground -- not the government and not the home -- of businesses deciding which speech can be on a Web site could be a threat to the First Amendment. “What about the middle speech? Hate speech -- perhaps hate to you and I, but not to another person,” he said. Do Internet companies regulating speech in effect create hidden censorship, he asked. Christopher Wolf, speaking for the Internet Task Force of the Anti-Defamation League, said he doesn’t think so. If a Google site removes content, there’s an opportunity to object, and the rest of the Internet available as a platform to complain, he said. Wolf said hate speech is usually the last topic covered in discussions of protecting children online, but it has a “profound and very harmful effect on kids.”
Corn-Revere also said he doesn’t see a problem. The problem would be if a take-down system were federalized and made uniform across all sites, he said: Then a person wouldn’t have the choice of a different site. Andrew McLaughlin, director of global public policy and government affairs at Google, said it’s a thorny problem. Businesses could take the approach that their Web sites are simply platforms, and not regulate speech except for illegal activities. But sites want people to like and use their services, he said, so monitoring for inappropriate content becomes a business necessity. McLaughlin said Google has decided on transparency as its best option, so it alerts Web users when a page has been taken down, either for inappropriate content or because of local laws.
Google also added an Abuse and Safety Center on YouTube, it announced Thursday, to give people information about cyberbullying, online harassment and hate speech and easier access to reporting and control tools. "