Monday, February 18, 2008

Internet content filtering practices vary widely by country.

In China, the government controls access to Internet content and
online publishing through a combination of technical filtering methods
and extensive regulations and guidelines. The technical filtering
is implemented primarily at the national backbone level, with
requests for information filtered against both banned Internet Protocol
(IP) addresses and keywords. Although sometimes inconsistent,
China's centralized system of content filtering ensures uniform
blocking throughout the country of web sites related to human rights,
opposition political movements, Taiwanese and Tibetan independence,
international news, and other topics. There is very little transparency
about Internet filtering, and no public accountability process.
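
To make the IP- and keyword-based approach described above concrete, the following is a minimal, purely illustrative Python sketch of blocklist-style filtering. The addresses and keywords are hypothetical placeholders, and real backbone-level systems rely on far more elaborate techniques than this simple check.

    # Illustrative sketch of blocklist-based request filtering.
    # The IP addresses and keywords below are hypothetical placeholders,
    # not real entries from any country's filtering system.

    BANNED_IPS = {"203.0.113.10", "198.51.100.7"}       # hypothetical banned destinations
    BANNED_KEYWORDS = {"banned-term", "blocked-topic"}   # hypothetical banned keywords

    def is_blocked(dest_ip: str, url: str) -> bool:
        """Return True if a request should be filtered under these toy rules."""
        if dest_ip in BANNED_IPS:                        # match on destination IP address
            return True
        url_lower = url.lower()
        return any(kw in url_lower for kw in BANNED_KEYWORDS)  # match on URL keywords

    if __name__ == "__main__":
        print(is_blocked("203.0.113.10", "http://example.org/news"))      # True: banned IP
        print(is_blocked("192.0.2.1", "http://example.org/banned-term"))  # True: keyword match
        print(is_blocked("192.0.2.1", "http://example.org/weather"))      # False

The point of the sketch is simply that a request can be dropped either because of where it is going (its destination IP address) or because of what it asks for (keywords in the request), which is why such filtering can block whole sites as well as individual pages.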

In Iran, there is no nationwide uniform filtering system. Instead,
Internet Service Providers (ISPs) are responsible for implementing
censorship according to explicit guidelines stipulated by the state.
Individual ISPs choose how they filter, with some using American
commercial filtering software and others relying on more manual
methods. Users accessing the Internet through different ISPs can therefore
experience significant variation in the accessibility of web sites. Iran uses this
system to filter Iran-related and Persian/Farsi-language content
critical of the regime, including politically sensitive sites, gay and
lesbian content, women's rights sites, streaming media, and blogs.
While debates within the government openly acknowledge
and discuss Internet content filtering policies, there is very little
transparency about the specific content targeted for filtering.

In the United States, public institutions (e.g., schools and libraries)
are required by law (the Children's Internet Protection Act,
or CIPA) to use filtering software to block access to obscene,
pornographic, and other materials related to the sexual exploitation
of children. Most implement the filtering policy using
commercial filtering technologies, which are prone to miscategorization
and error. Researchers have found that commercial
filtering technologies mistakenly block access to content related
to women's health, gay and lesbian rights groups, and sexual
education for teenagers.

Reference: Citizen Lab