Children's Internet Protection Act (CIPA) Ruling Part 8

Ct. at 1404; see also Stanley, 394 U.S. at 567-68 (holding that individuals have a First Amendment right to possess obscene material, even though the existence of this right makes it more difficult for the states to further their legitimate interest in prosecuting the distribution of obscenity). By the same token, even if the use of filters is effective in preventing patrons from receiving constitutionally unprotected speech, the government's interest in preventing the dissemination of such speech cannot justify the use of the technology protection measures mandated by CIPA, which necessarily block substantial amounts of constitutionally protected speech.

CIPA thus resembles the Communications Decency Act, which the Supreme Court facially invalidated in Reno v. ACLU, 521 U.S. 844 (1997). Although on its face, the CDA simply restricted the distribution to minors of speech that was constitutionally unprotected with respect to minors, as a practical matter, given Web sites' difficulties in identifying the ages of Internet users, the CDA effectively prohibited the distribution to adults of material that was constitutionally protected with respect to adults. Similarly, although on its face, CIPA, like the CDA, requires the suppression of only constitutionally unprotected speech, it is impossible as a practical matter, given the state of the art of filtering technology, for a public library to comply with CIPA without also blocking significant amounts of constitutionally protected speech. We therefore hold that a library's use of a technology protection measure required by CIPA is not narrowly tailored to the government's legitimate interest in preventing the dissemination of visual depictions that are obscene, child pornography, or in the case of minors, harmful to minors.

For the same reason that a public library's use of software filters is not narrowly tailored to further the library's interest in preventing its computers from being used to disseminate visual depictions that are obscene, child pornography, and harmful to minors, a public library's use of software filters is not narrowly tailored to further the library's interest in protecting patrons from being unwillingly exposed to offensive, sexually explicit material. As discussed in our findings of fact, the filters required by CIPA block substantial numbers of Web sites that even the most puritanical public library patron would not find offensive, such as http://federo.com, a Web site that promotes federalism in Uganda, which N2H2 blocked as "Adults Only, Pornography," and http://www.vvm.com/~bond/home.htm, a site for aspiring dentists, which was blocked by Cyberpatrol as "Adult/Sexually Explicit."

We list many more such examples in our findings of fact, see supra, and find that such erroneously blocked sites number at least in the thousands.

Although we have found large amounts of overblocking, even if only a small percentage of sites blocked are erroneously blocked, either with respect to the state's interest in preventing adults from viewing material that is obscene or child pornography and in preventing minors from viewing material that is harmful to minors, or with respect to the state's interest in preventing library patrons generally from being unwillingly exposed to offensive, sexually explicit material, this imprecision is fatal under the First Amendment. Cf. Reno, 521 U.S. at 874 ("[T]he CDA lacks the precision that the First Amendment requires when a statute regulates the content of speech."); League of Women Voters, 468 U.S. at 398 ("[E]ven if some of the hazards at which [the challenged provision] was aimed are sufficiently substantial, the restriction is not crafted with sufficient precision to remedy those dangers that may exist to justify the significant abridgement of speech worked by the provision's broad ban . . . .").

While the First Amendment does not demand perfection when the government restricts speech in order to advance a compelling interest, the substantial amounts of erroneous blocking inherent in the technology protection measures mandated by CIPA are more than simply de minimis instances of human error. "The line between speech unconditionally guaranteed and speech which may legitimately be regulated, suppressed, or punished is finely drawn. Error in marking that line exacts an extraordinary cost."

Playboy, 529 U.S. at 817 (internal quotation marks and citation omitted). Indeed, "precision of regulation must be the touchstone in an area so closely touching our most precious freedoms." Keyishian v. Bd. of Regents of the Univ. of the State of N.Y., 385 U.S. 589, 603 (1967) (internal quotation marks and citation omitted); see also Bantam Books, Inc. v. Sullivan, 372 U.S. 58, 66 (1963) ("The separation of legitimate from illegitimate speech calls for sensitive tools.") (internal quotation marks and citation omitted). Where the government draws content-based restrictions on speech in order to advance a compelling government interest, the First Amendment demands the precision of a scalpel, not a sledgehammer. We believe that a public library's use of the technology protection measures mandated by CIPA is not narrowly tailored to further the governmental interests at stake.

Although the strength of different libraries' interests in blocking certain forms of speech may vary from library to library, depending on the frequency and severity of problems experienced by each particular library, we conclude, based on our findings of fact, that any public library's use of a filtering product mandated by CIPA will necessarily fail to be narrowly tailored to address the library's legitimate interests. Because it is impossible for a public library to comply with CIPA without blocking substantial amounts of speech whose suppression serves no legitimate state interest, we therefore hold that CIPA is facially invalid, even under the more stringent standard of facial invalidity urged on us by the government, which would require upholding CIPA if it is possible for just a single library to comply with CIPA's conditions without violating the First Amendment. See supra Part III.

3. Less Restrictive Alternatives

The constitutional infirmity of a public library's use of software filters is evidenced not only by the absence of narrow tailoring, but also by the existence of less restrictive alternatives that further the government's legitimate interests.

See Playboy, 529 U.S. at 813 ("If a less restrictive alternative would serve the Government's purpose, the legislature must use that alternative."); Sable, 492 U.S. at 126 ("The Government may . . . regulate the content of constitutionally protected speech in order to promote a compelling interest if it chooses the least restrictive means to further the articulated interest.").

As is the case with the narrow tailoring requirement, the government bears the burden of proof in showing the ineffectiveness of less restrictive alternatives. "When a plausible, less restrictive alternative is offered to a content-based speech restriction, it is the Government's obligation to prove that the alternative will be ineffective to achieve its goals." Playboy, 529 U.S. at 816; see also Reno, 521 U.S. at 879 ("The breadth of this content-based restriction of speech imposes an especially heavy burden on the Government to explain why a less restrictive provision would not be as effective . . . ."); Fabulous Assocs., Inc. v. Pa. Pub. Util. Comm'n, 896 F.2d 780, 787 (3d Cir. 1990) ("We focus . . . on the more difficult question whether the Commonwealth has borne its heavy burden of demonstrating that the compelling state interest could not be served by restrictions that are less intrusive on protected forms of expression.") (internal quotation marks and citation omitted).

We find that there are plausible, less restrictive alternatives to the use of software filters that would serve the government's interest in preventing the dissemination of obscenity and child pornography to library patrons. In particular, public libraries can adopt Internet use policies that make clear to patrons that the library's Internet terminals may not be used to access illegal content. Libraries can ensure that their patrons are aware of such policies by posting them in prominent places in the library, requiring patrons to sign forms agreeing to comply with the policy before the library issues library cards to patrons, and by presenting patrons, when they log on to one of the library's Internet terminals, with a screen that requires the user to agree to comply with the library's policy before allowing the user access to the Internet.

Libraries can detect violations of their Internet use policies either through direct observation or through review of the library's Internet use logs. In some cases, library staff or patrons may directly observe a patron accessing obscenity and child pornography. Libraries' Internet use logs, however, also provide libraries with a means of detecting violations of their Internet use policies. These logs, which can be kept regardless of whether a library uses filtering software, record the URL of every Web page accessed by patrons. Although ordinarily the logs do not link particular URLs with particular patrons, it is possible, using access logs, to identify the patron who viewed the Web page corresponding to a particular URL, if library staff discover in the access logs the URL of a Web page containing obscenity or child pornography. For example, David Biek, Director of Tacoma Public Library's main branch, testified that in the course of scanning Internet use logs he has found what looked like attempts to access child pornography, notwithstanding the fact that Tacoma uses Websense filtering software. In two cases, he communicated his findings to law enforcement and turned over the logs to law enforcement in response to a subpoena.
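
For illustration only, the following is a minimal sketch, in Python, of the kind of after-the-fact log review described above. It assumes a hypothetical tab-separated access log in which each line records a timestamp, a terminal identifier, and a URL; actual library systems and log formats vary, and the record does not specify any particular implementation.

    import csv

    def find_flagged_visits(log_path, flagged_urls):
        """Return (timestamp, terminal, url) entries whose URL is on a staff review list."""
        hits = []
        with open(log_path, newline="") as log_file:
            # Hypothetical format: timestamp <TAB> terminal id <TAB> URL, one visit per line.
            for timestamp, terminal, url in csv.reader(log_file, delimiter="\t"):
                if url in flagged_urls:
                    hits.append((timestamp, terminal, url))
        return hits

    # Hypothetical usage: staff supply the URLs of pages found to contain illegal
    # content; matching a terminal and time to a particular patron remains the
    # separate, manual step the opinion describes.
    for entry in find_flagged_visits("internet_use.log", {"http://example.com/reported-page"}):
        print(entry)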

Once a violation of a library's Internet use policy is detected through the methods described above, a library may either issue the patron a warning, revoke the patron's Internet privileges, or notify law enforcement, if the library believes that the patron violated either state obscenity laws or child pornography laws. Although these methods of detecting use of library computers to access illegal content are not perfect, and a library, out of respect for patrons' privacy, may choose not to adopt such policies, the government has failed to show that such methods are substantially less effective at preventing patrons from accessing obscenity and child pornography than software filters. As detailed in our findings of fact, the underblocking that results from the size, rate of change, and rate of growth of the Internet significantly impairs the software filters' ability to prevent patrons from accessing obscenity and child pornography. Unless software filters are themselves perfectly effective at preventing patrons from accessing obscenity and child pornography, "[i]t is no response that [a less restrictive alternative] . . . may not go perfectly every time." Playboy, 529 U.S. at 824; cf. Denver Area Educ. Telecomm. Consortium, Inc. v. FCC, 518 U.S. 727, 759 (1996) ("No provision . . . short of an absolute ban, can offer certain protection against assault by a determined child.").

The government has not offered any data comparing the frequency with which obscenity and child pornography is accessed at libraries that enforce their Internet use policies through software filters with the frequency with which obscenity and child pornography is accessed at public libraries that enforce their Internet use policies through methods other than software filters. Although the government's library witnesses offered anecdotal accounts of a reduction in the use of library computers to access sexually explicit speech when filtering software was mandated, these anecdotal accounts are not a substitute for more robust analyses comparing the use of library computers to access child pornography and material that meets the legal definition of obscenity in libraries that use blocking software and in libraries that use alternative methods. Cf. Playboy, 529 U.S. at 822 ("[T]he Government must present more than anecdote and supposition.").

We acknowledge that some library staff will be uncomfortable using the "tap-on-the-shoulder" method of enforcing the library's policy against using Internet terminals to access obscenity and child pornography. The Greenville County Library, for example, experienced high turnover among library staff when staff were required to enforce the library's Internet use policy through the tap-on-the-shoulder technique. Given filters' inevitable underblocking, however, even a library that uses filtering will have to resort to a tap-on-the-shoulder method of enforcement, where library staff observes a patron openly violating the library's Internet use policy, by, for example, accessing material that is obviously child pornography but that the filtering software failed to block. Moreover, a library employee's degree of comfort in using the tap-on-the-shoulder method will vary from employee to employee, and there is no evidence that it is impossible or prohibitively costly for public libraries to hire at least some employees who are comfortable enforcing the library's Internet use policy.

We also acknowledge that use of a tap on the shoulder delegates to librarians substantial discretion to determine which Web sites a patron may view. Nonetheless, we do not believe that this putative "prior restraint" problem can be avoided through the use of software filters, for they effectively delegate to the filtering company the same unfettered discretion to determine which Web sites a patron may view. Moreover, as noted above, violations of a public library's Internet use policy may be detected not only by direct observation, but also by reviewing the library's Internet use logs after the fact, which alleviates the need for library staff to directly confront patrons while they are viewing obscenity or child pornography.

Similar less restrictive alternatives exist for preventing minors from accessing material harmful to minors. First, libraries may use the tap-on-the-shoulder method when minors are observed using the Internet to access material that is harmful to minors. Requiring minors to use specific terminals, for example in a children's room, that are in direct view of library staff will increase the likelihood that library staff will detect minors' use of the Internet to access material harmful to minors.

Alternatively, public libraries could require minors to use blocking software only if they are unaccompanied by a parent, or only if their parent consents in advance to their child's unfiltered use of the Internet. "A court should not assume that a plausible, less restrictive alternative would be ineffective; and a court should not presume parents, given full information, will fail to act." Playboy, 529 U.S. at 824.

In contrast to the "harmful to minors" statute upheld in Ginsberg v. New York, 390 U.S. 629 (1968), which permitted parents to determine whether to provide their children with access to material otherwise prohibited by the statute, c.i.p.a, like the Communications Decency Act, which the Court invalidated in Reno, contains no exception for parental consent: [W]e noted in Ginsberg that "the prohibition against sales to minors does not bar parents who so desire from purchasing the magazines for their children." Under the CDA, by contrast, neither the parents' consent nor even their partic.i.p.ation in the communication would avoid the application of the statute.

Reno, 521 U.S. at 865 (citation omitted); see also Ginsberg, 390 U.S. at 639 ("It is cardinal with us that the custody, care, and nurture of the child reside first in the parents, whose primary function and freedom include preparation for obligations the state can neither supply nor hinder." (quoting Prince v. Massachusetts, 321 U.S. 158, 166 (1944))).

The Court in Playboy acknowledged that although a regime of permitting parents voluntarily to block cable channels containing sexually explicit programming might not be a completely effective alternative to the challenged law, which effectively required cable operators to transmit sexually explicit programming only during particular hours, the challenged law itself was not completely effective in serving the government's interest: There can be little doubt, of course, that under a voluntary blocking regime, even with adequate notice, some children will be exposed to signal bleed; and we need not discount the possibility that a graphic image could have a negative impact on a young child. It must be remembered, however, that children will be exposed to signal bleed under time channeling as well. . . .

The record is silent as to the comparative effectiveness of the two alternatives.

Playboy, 529 U.S. at 826. Similarly, in this case, the government has offered no evidence comparing the effectiveness of blocking software and alternative methods used by public libraries to protect children from material harmful to minors.

Finally, there are other less restrictive alternatives to filtering software that further public libraries' interest in preventing patrons from unwillingly being exposed to patently offensive, sexually explicit content on the Internet. To the extent that public libraries are concerned with protecting patrons from accidentally encountering such material while using the Internet, public libraries can provide patrons with guidance in finding the material they want and avoiding unwanted material.

Some public libraries also offer patrons the option of using filtering software, if they so desire. Cf. Rowan v. Post Office Dept., 397 U.S. 728 (1970) (upholding a federal statute permitting individuals to instruct the Postmaster General not to deliver advertisements that are "erotically arousing or sexually provocative").

With respect to protecting library patrons from sexually explicit content viewed by other patrons, public libraries have used a variety of less restrictive methods. One alternative is simply to segregate filtered from unfiltered terminals, and to place unfiltered terminals outside of patrons' sight lines and areas of heavy traffic. Even the less restrictive alternative of allowing unfiltered access on only a single terminal, well out of the line of sight of other patrons, however, is not permitted under CIPA, which requires the use of a technology protection measure on every computer in the library. See CIPA Sec. 1721(b)(6)(C) (codified at 47 U.S.C. Sec. 254(h)(6)(C)), CIPA Sec. 1712 (codified at 20 U.S.C. Sec. 9134(f)(1)(A)) (requiring a public library receiving E-rate discounts or LSTA grants to certify that it "has in place a policy of Internet safety that includes the operation of a technology protection measure with respect to any of its computers with Internet access . . . ." (emphasis added)); In re Federal-State Joint Board on Universal Service: Children's Internet Protection Act, CC Docket No. 96-45, Report and Order, FCC 01-120, ¶ 30 (Apr. 5, 2001) ("CIPA makes no distinction between computers used only by staff and those accessible to the public.").

Alternatively, libraries can use privacy screens or recessed monitors to prevent patrons from unwillingly being exposed to material viewed by other patrons. We acknowledge that privacy screens and recessed monitors suffer from imperfections as alternatives to filtering. Both impose costs on the library, particularly recessed monitors, which, according to the government's library witnesses, are expensive. Moreover, some libraries have experienced problems with patrons attempting to remove the privacy screens. Privacy screens and recessed monitors also make it difficult for more than one person to work at the same terminal.

These problems, however, are not insurmountable. While there is no doubt that privacy screens and recessed terminals impose additional costs on libraries, the government has failed to show that the cost of privacy screens or recessed terminals is substantially greater than the cost of filtering software and the resources needed to maintain such software. Nor has the government shown that the cost of these alternatives is so high as to make their use prohibitive. With respect to the problem of patrons removing privacy screens, we find, based on the successful use of privacy screens by the Fort Vancouver Regional Library and the Multnomah County Public Library, that it is possible for public libraries to prevent patrons from removing the screens. Although privacy screens may make it difficult for patrons to work at the same terminal side by side with other patrons or with library staff, a library could provide filtered access at terminals that lack privacy screens, when patrons wish to use a terminal with others. Alternatively, a library can reserve terminals outside of patrons' sight lines for groups of patrons who wish unfiltered access.

We therefore conclude that the government has failed to show that the less restrictive alternatives discussed above are ineffective at furthering the government's interest either in preventing patrons from using library computers to access visual depictions that are obscene, child pornography, or in the case of minors, harmful to minors, or in preventing library patrons from being unwillingly exposed to patently offensive, sexually explicit speech.

4. Do CIPA's Disabling Provisions Cure the Defect?

The Government argues that even if the use of software filters mandated by CIPA blocks a substantial amount of speech whose suppression serves no legitimate state interest, and therefore fails strict scrutiny's narrow tailoring requirement, CIPA's disabling provisions cure any lack of narrow tailoring inherent in filtering technology. The disabling provision applicable to libraries receiving LSTA grants states that "[a]n administrator, supervisor, or other authority may disable a technology protection measure . . . to enable access for bona fide research or other lawful purposes." CIPA Sec. 1712(a)(2) (codified at 20 U.S.C. Sec. 9134(f)(3)). CIPA's disabling provision with respect to libraries receiving E-rate discounts similarly states that "[a]n administrator, supervisor, or other person authorized by the certifying authority . . . may disable the technology protection measure concerned, during use by an adult, to enable access for bona fide research or other lawful purpose." CIPA Sec. 1721(b) (codified at 47 U.S.C. Sec. 254(h)(6)(D)).

To determine whether the disabling provisions cure CIPA's lack of narrow tailoring, we must first determine, as a matter of statutory construction, under what circumstances the disabling provisions permit libraries to disable the software filters.

It is unclear to us whether CIPA's disabling provisions permit libraries to disable the filters any time a patron wishes to access speech that is neither obscenity, child pornography, nor, in the case of a minor patron, material that is harmful to minors.

Whether CIPA permits disabling in such instances depends on the meaning of the provisions' reference to "bona fide research or other lawful purpose." On the one hand, the language "to enable access for bona fide research or other lawful purpose" could be interpreted to mean "to enable access to all constitutionally protected material." As a textual matter, this reading of the disabling provisions is plausible. If a patron seeks access to speech that is constitutionally protected, then it is reasonable to conclude that the patron has a "lawful purpose," since the dissemination and receipt of constitutionally protected speech cannot be made unlawful.

Moreover, since a narrower construction of the disabling provision creates more constitutional problems than a construction of the disabling provisions that permits access to all constitutionally protected speech, the broader interpretation is preferable. "[I]f an otherwise acceptable construction of a statute would raise serious constitutional problems, and where an alternative interpretation of the statute is fairly possible, we are obligated to construe the statute to avoid such problems."

INS v. St. Cyr, 121 S. Ct. 2271, 2279 (2001) (internal quotation marks and citations omitted). On the other hand, interpreting CIPA's disabling provisions to permit disabling for access to all constitutionally protected speech presents several problems.

First, if "other lawful purpose" means "for the purpose of accessing const.i.tutionally protected speech," then this reading renders superfluous c.i.p.a's reference to "bona fide research,"

which clearly contemplates some purpose beyond simply accessing const.i.tutionally protected speech. In general, "courts should disfavor interpretations of statutes that render language superfluous." Conn. Nat'l Bank v. Germain, 503 U.S. 249, 253 (1992).

Furthermore, Congress is clearly capable of explicitly specifying categories of constitutionally unprotected speech, as it did when it drafted CIPA to require funding recipients to use technology protection measures that protect against visual depictions that are "obscene," "child pornography," or, in the case of minors, "harmful to minors." CIPA Sec. 1712(a) (codified at 20 U.S.C. Sec. 9134(f)(1)(A)(i)(I)-(III)); CIPA Sec. 1721(b) (codified at 47 U.S.C. Sec. 254(h)(6)(B)(i)(I)-(III)). If Congress intended CIPA's disabling provisions simply to permit libraries to disable the filters to allow access to speech falling outside of these categories, Congress could have drafted the disabling provisions with greater precision, expressly permitting libraries to disable the filters "to enable access for any material that is not obscene, child pornography, or in the case of minors, harmful to minors," rather than "to enable access for bona fide research or other lawful purposes," which is the language that Congress actually chose.

At bottom, however, we need not definitively construe CIPA's disabling provisions, since it suffices in this case to assume without deciding that the disabling provisions permit libraries to allow a patron access to any speech that is constitutionally protected with respect to that patron. Although this interpretation raises fewer constitutional problems than a narrower interpretation, this interpretation of the disabling provisions nonetheless fails to cure CIPA's lack of narrow tailoring. Even if the disabling provisions permit public libraries to allow patrons to access speech that is constitutionally protected yet erroneously blocked by the software filters, the requirement that library patrons ask a state actor's permission to access disfavored content violates the First Amendment.

The Supreme Court has made clear that content-based restrictions that require recipients to identify themselves before being granted access to disfavored speech are subject to no less scrutiny than outright bans on access to such speech. In Lamont v. Postmaster General, 381 U.S. 301 (1965), for example, the Court held that a federal statute requiring the Postmaster General to halt delivery of communist propaganda unless the addressee affirmatively requested the material violated the First Amendment: We rest on the narrow ground that the addressee in order to receive his mail must request in writing that it be delivered. This amounts in our judgment to an unconstitutional abridgment of the addressee's First Amendment rights. The addressee carries an affirmative obligation which we do not think the Government may impose on him. This requirement is almost certain to have a deterrent effect, especially as respects those who have sensitive positions.

Id. at 307.

Similarly, in Denver Area Educational Telecommunications Consortium, Inc. v. FCC, 518 U.S. 727 (1996), the Court held unconstitutional a federal law requiring cable operators to allow access to patently offensive, sexually explicit programming only to those subscribers who requested access to the programming in advance and in writing. Id. at 732-33. As in Lamont, the Court in Denver reasoned that this content-based restriction on recipients' access to speech would have an impermissible chilling effect: "[T]he written notice requirement will . . . restrict viewing by subscribers who fear for their reputations should the operator, advertently or inadvertently, disclose the list of those who wish to watch the 'patently offensive' channel." Id. at 754; see also Fabulous Assocs., Inc. v. Pa. Pub. Util. Comm'n, 896 F.2d 780, 785 (3d Cir. 1990) (considering the constitutionality of a state law requiring telephone users who wish to listen to sexually explicit telephone messages to apply for an access code to receive such messages, and invalidating the law on the ground that "[a]n identification requirement exerts an inhibitory effect").

We believe that CIPA's disabling provisions suffer from the same flaws as the restrictions on speech in Lamont, Denver, and Fabulous Associates. By requiring library patrons affirmatively to request permission to access certain speech singled out on the basis of its content, CIPA will deter patrons from requesting that a library disable filters to allow the patron to access speech that is constitutionally protected, yet sensitive in nature. As we explain above, we find that library patrons will be reluctant and hence unlikely to ask permission to access, for example, erroneously blocked Web sites containing information about sexually transmitted diseases, sexual identity, certain medical conditions, and a variety of other topics. As discussed in our findings of fact, software filters block access to a wide range of constitutionally protected speech, including Web sites containing information that individuals are likely to wish to access anonymously.

That library patrons will be deterred from asking permission to access Web sites containing certain kinds of content is evident as a matter of common sense as well as amply borne out by the trial record. Plaintiff Emmalyn Rood, who used the Internet at a public library to research information relating to her sexual identity, testified that she would have been unwilling as a young teen to ask a librarian to disable filtering software so that she could view materials concerning gay and lesbian issues. Similarly, plaintiff Mark Brown stated that he would have been too embarrassed to ask a librarian to disable filtering software if it had impeded his ability to research surgery options for his mother when she was treated for breast cancer.

As explained in our findings of fact, see supra at Subsection II.D.2.b, the reluctance of patrons to request permission to access Web sites that were erroneously blocked is further established by the low number of patron unblocking requests, relative to the number of erroneously blocked Web sites, in those public libraries that use software filters and permit patrons to request access to incorrectly blocked Web sites. Cf. Fabulous Assocs., 896 F.2d at 786 ("On the record before us, there is more than enough evidence to support the district court's finding that access codes will chill the exercise of some users' right to hear protected communications.").

To be sure, the government demonstrated that it is possible for libraries to permit patrons to request anonymously that a particular Web site be unblocked. In particular, the Tacoma Public Library has configured its computers to present patrons with the option, each time the software filter blocks their access to a Web page, of sending an anonymous email to library staff requesting that the page be unblocked. Moreover, a library staff member periodically scans logs of URLs blocked by the filters, in an effort to identify erroneously blocked sites, which the library will subsequently unblock. Although a public library's ability to permit anonymous unblocking requests addresses the deterrent effect of requiring patrons to identify themselves before gaining access to a particular Web site, we believe that it fails adequately to address the overblocking problem.

In particular, even allowing anonymous requests for unblocking burdens patrons' access to speech, since such requests cannot immediately be acted on. Although the Tacoma Public Library, for example, attempts to review requests for unblocking within 24 hours, requests sometimes are not reviewed for several days. And delays are inevitable in libraries with branches that lack the staff necessary immediately to review patron unblocking requests. Because many Internet users "surf" the Web, visiting hundreds of Web sites in a single session and spending only a short period of time viewing many of the sites, the requirement that a patron take the time to affirmatively request access to a blocked Web site and then wait several days until the site is unblocked will, as a practical matter, impose a significant burden on library patrons' use of the Internet. Indeed, a patron's time spent requesting access to an erroneously blocked Web site and checking to determine whether access was eventually granted is likely to exceed the amount of time the patron would have actually spent viewing the site, had the site not been erroneously blocked. This delay is especially burdensome in view of many libraries' practice of limiting their patrons to a half hour or an hour of Internet use per day, given the scarcity of terminal time in relation to patron demand.
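
By way of illustration, the following is a minimal sketch, in Python, of how a filter's block page might submit the kind of anonymous unblocking request described above. It assumes the block page can invoke a script with the blocked URL and that the library operates a local mail server; the addresses and message wording are hypothetical and are not drawn from any library's actual configuration.

    import smtplib
    from email.message import EmailMessage

    def send_unblock_request(blocked_url, smtp_host="localhost"):
        """Email staff an unblocking request that identifies the page but not the patron."""
        msg = EmailMessage()
        msg["From"] = "block-page@library.example"    # shared address, so no patron identity is disclosed
        msg["To"] = "filter-review@library.example"   # staff mailbox that reviews unblocking requests
        msg["Subject"] = "Request to review a blocked Web page"
        msg.set_content("A patron asks that staff review and, if appropriate, unblock:\n" + blocked_url)
        with smtplib.SMTP(smtp_host) as smtp:
            smtp.send_message(msg)

Because the request carries no patron identifier, staff can act on it without learning who asked, which is the feature of the anonymous-request approach that the opinion credits; the delay before staff review, discussed above, remains.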
