
Internet companies are a different breed. Because they traffic in speech — rather than, say, corn syrup or warplanes — they make decisions every day about what kind of expression is allowed where. And occasionally they come under pressure to explain how they decide, on whose laws and values they rely, and how they distinguish between toxic speech that must be taken down and that which can remain.

The storm over an incendiary anti-Islamic video posted on YouTube has stirred fresh debate on these issues. Google, which owns YouTube, restricted access to the video in Egypt and Libya, after the killing of a United States ambassador and three other Americans. Then, it pulled the plug on the video in five other countries, where the content violated local laws.

Some countries blocked YouTube altogether, though that didn’t stop the bloodshed: in Pakistan, where elections are to be held soon, riots on Friday left 19 people dead.

The company pointed to its internal guidelines to explain why it rebuffed calls to take down the video altogether. The video did not meet its definition of hate speech, YouTube said, and so it was allowed to stay up on the Web. The company said little more than that.

That explanation revealed not only the challenges that confront companies like Google but also how opaque they can be in explaining their verdicts on what can be said on their platforms. Google, Facebook and Twitter receive hundreds of thousands of complaints about content every week.

“We are just awakening to the need for some scrutiny or oversight or public attention to the decisions of the most powerful private speech controllers,” said Tim Wu, a Columbia University law professor who briefly advised the Obama administration on consumer protection regulations online.

Google was right, Mr. Wu believes, to selectively restrict access to the crude anti-Islam video in light of the extraordinary violence that broke out. But he said the public deserved to know more about how private firms made those decisions in the first place, every day, all over the world. After all, he added, they are setting case law, just as courts do in sovereign countries.

Mr. Wu offered some unsolicited advice: Why not set up an oversight board of regional experts or serious YouTube users from around the world to make the especially tough decisions?

Google has not responded to his proposal, which he outlined in a blog post for The New Republic.

Certainly, the scale and nature of YouTube make this a daunting task. Any analysis requires combing through over a billion videos and weighing them against the laws and mores of different countries. It’s also unclear whether expert panels would make room for unpopular minority opinion. The company said in a statement on Friday that, like newspapers, it, too, made “nuanced” judgments about content: “It’s why user-generated content sites typically have clear community guidelines and remove videos or posts that break them.”
