Section 230 of the Communications Decency Act

From Wikipedia, the free encyclopedia

Section 230 of the Communications Decency Act of 1996 (a common name for Title V of the Telecommunications Act of 1996) is a landmark piece of Internet legislation in the United States, codified at 47 U.S.C. § 230. Section 230(c)(1) provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

In analyzing the availability of the immunity offered by this provision, courts generally apply a three-prong test. A defendant must satisfy each of the three prongs to gain the benefit of the immunity:

  1. The defendant must be a "provider or user" of an "interactive computer service."
  2. The cause of action asserted by the plaintiff must treat the defendant as the "publisher or speaker" of the harmful information at issue.
  3. The information must be "provided by another information content provider," i.e., the defendant must not be the "information content provider" of the harmful information at issue.


Prior to the Internet, case law drew a clear liability line between distributors of content and publishers of content: publishers were expected to be aware of the material they published and thus could be held liable for any illegal content, while distributors would likely not be aware and thus would be immune. This was established in Smith v. California (1959), where the Supreme Court ruled that putting liability on the distributor (a book store in this case) would have "a collateral effect of inhibiting the freedom of expression, by making the individual the more reluctant to exercise it".[1]

In the early 1990s, the Internet became more widely adopted and created means for users to engage in forums and other user-generated content. While this helped to expand the use of the Internet, it also resulted in a number of legal cases putting service providers at fault for content generated by their users. This concern was raised in legal challenges against CompuServe and Prodigy, early service providers of the time.[2] CompuServe stated it would not attempt to regulate what users posted on its services, while Prodigy employed a team of moderators to validate content. Both faced legal challenges related to content posted by their users. In Cubby, Inc. v. CompuServe Inc., CompuServe was found not to be at fault: by allowing all content to go unmoderated, it was a distributor and thus not liable for libelous content posted by users. In Stratton Oakmont, Inc. v. Prodigy Services Co., however, the court found that because Prodigy had taken an editorial role with regard to customer content, it was a publisher and legally responsible for libel committed by customers.[3]

Service providers made their Congresspersons aware of these cases, believing that if the rulings were upheld across the nation, they would stifle the growth of the Internet. At the time, Congress was preparing the Communications Decency Act (CDA), part of the omnibus Telecommunications Act of 1996, which was designed to make knowingly sending indecent or obscene material to minors a criminal offense. Based on the Stratton Oakmont decision, Congress recognized that requiring service providers to block indecent content would cause them to be treated as publishers in the context of the First Amendment and thus become liable for other illegal content, such as libel, not set out in the existing CDA.[2] Representatives Christopher Cox (R-CA) and Ron Wyden (D-OR) wrote the bill's Section 509, titled the Internet Freedom and Family Empowerment Act, designed to override the decision in Stratton Oakmont so that service providers could moderate content as necessary and did not have to act as wholly neutral conduits. The section was added to the CDA while the act was in conference within the House. The overall Telecommunications Act, with both the CDA and Cox and Wyden's provision, passed both Houses by near-unanimous votes and was signed into law by President Bill Clinton in February 1996.[4] Cox and Wyden's section was codified as Section 230 in Title 47 of the U.S. Code.

The anti-indecency portions of the CDA were challenged immediately upon passage, and in 1997 the Supreme Court ruled in Reno v. American Civil Liberties Union that all of the anti-indecency sections of the CDA were unconstitutional, leaving Section 230 in place.[5] One of the first legal challenges to Section 230 was Zeran v. America Online, Inc., in which a federal appeals court affirmed that the purpose of Section 230 as passed by Congress was "to remove the disincentives to self-regulation created by the Stratton Oakmont decision."[6] Under that court's holding, computer service providers who regulated the dissemination of offensive material on their services risked subjecting themselves to liability, because such regulation cast the service provider in the role of a publisher. Fearing that the specter of liability would therefore deter service providers from blocking and screening offensive material, Congress enacted § 230's broad immunity "to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material."[6] In addition, Zeran notes "the amount of information communicated via interactive computer services is . . . staggering. The specter of tort liability in an area of such prolific speech would have an obviously chilling effect. It would be impossible for service providers to screen each of their millions of postings for possible problems. Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted. Congress considered the weight of the speech interests implicated and chose to immunize service providers to avoid any such restrictive effect."[6]


Section 230 immunity is not unlimited. The statute specifically excepts federal criminal liability and intellectual property claims.[7] However, state criminal laws have been held preempted in cases such as Backpage.com, LLC v. McKenna[8] and Voicenet Commc'ns, Inc. v. Corbett[9] (agreeing that "[t]he plain language of the CDA provides ... immunity from inconsistent state criminal laws.").

As of mid-2016, courts have issued conflicting decisions regarding the scope of the intellectual property exclusion set forth in 47 U.S.C. § 230(e)(2). For example, in Perfect 10, Inc. v. CCBill, LLC,[10] the 9th Circuit Court of Appeals ruled that the exception for intellectual property law applies only to federal intellectual property claims such as copyright infringement, trademark infringement, and patents, reversing a district court ruling that the exception applies to state-law right of publicity claims.[11] The 9th Circuit's decision in Perfect 10 conflicts with conclusions from other courts including Doe v. Friendfinder. The Friendfinder court specifically discussed and rejected the lower court's reading of "intellectual property law" in CCBill and held that the immunity does not reach state right of publicity claims.[12]


Section 230 has been controversial because several courts have interpreted it as providing complete immunity for ISPs with regard to torts committed by their users over their systems. In Zeran v. AOL, a 1997 Fourth Circuit decision, the court held that Section 230 "creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service." This rule effectively protects online entities, including user-generated content websites, that qualify as a "provider or user" of an "interactive computer service."[citation needed]

Allow States and Victims to Fight Online Sex Trafficking Act - Stop Enabling Sex Traffickers Act (FOSTA-SESTA)

Section 230 has recently been applied to dismiss a lawsuit filed by victims of sex trafficking against Backpage for allowing advertisements with sex-trafficking content to remain on the website. The First Circuit affirmed a lower court decision granting Backpage's motion to dismiss under Section 230 immunity.[13] In response, Congress introduced two bills: the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), introduced in the U.S. House of Representatives by Ann Wagner in April 2017, and the Stop Enabling Sex Traffickers Act (SESTA), introduced in the U.S. Senate by Rob Portman in August 2017. Combined, the FOSTA-SESTA bills modified Section 230 to exclude service providers from Section 230 immunity in civil or criminal cases related to sex trafficking,[14] making them liable if they "knowingly assist, facilitate, or support sex trafficking", creating serious legal consequences for websites that profit from sex trafficking, giving prosecutors tools they need to protect their communities, and giving victims a pathway to justice.[15] The bill passed both Houses and was signed into law by President Donald Trump on April 11, 2018.[16][17]

The bills were criticized by pro-free speech and pro-Internet groups as a "disguised internet censorship bill" that weakens the Section 230 safe harbors, places unnecessary burdens on internet companies and intermediaries that handle user-generated content or communications, requires service providers to proactively take action against sex trafficking activities, and requires a "team of lawyers" to evaluate all possible scenarios under state and federal law (which may be financially unfeasible for smaller companies).[18][19][20][21][22] Online sex workers argued that the bill would harm their safety, as the platforms they used for offering and discussing sexual services (as an alternative to street prostitution) had begun to reduce their services or shut down entirely due to the threat of liability under the bill.[23][24]

Social media

Many social media sites, notably Facebook and Twitter, came under scrutiny as a result of the alleged Russian interference in the 2016 United States elections, where it was asserted that Russian agents used the sites to spread propaganda and fake news to swing the election in favor of Donald Trump. These platforms also were criticized for not taking action against users that used the social media outlets for harassment and hate speech against others. Shortly after the passage of FOSTA-SESTA acts, some in Congress recognized that additional changes could be made to Section 230 to require service providers to deal with these bad actors, beyond what Section 230 already provided to them.[25]

Platform Neutrality

Some conservatives in Congress have increasingly taken aim at Section 230 of the CDA amid charges that social media sites such as YouTube, Twitter, and Facebook have "taken sides", offering "preferential treatment" to certain political viewpoints and engaging in censorship and de-platforming of others. In a C-SPAN interview, Senator Josh Hawley charged that "Big Tech" has received a "sweetheart deal" from government courtesy of Section 230.[26] He claimed that making changes to this section of the law would reduce discrimination against conservatives and libertarian-leaning individuals. Senator Ted Cruz has also targeted Section 230, saying that "The predicate for Section 230 immunity under the CDA is that you're a neutral public forum". Cruz believes that if these platforms cannot remain neutral, then they have not earned CDA 230 protections. The focus on CDA 230 has also reached the Trump Administration.[27]

In the House of Representatives, Congressman Louie Gohmert introduced legislation in December 2018 that would remove all CDA 230 protections from any provider that used filters or any other type of algorithm to display user content when not otherwise directed by a user.[28][29][30] Slate author Joshua Geltzer argues that this approach is mistaken: "Section 230 empowers tech companies to experiment with new ways of imposing and enforcing norms on new sites of discourse, such as deleting extremist posts or suspending front accounts generated by foreign powers seeking to interfere with our elections," wrote Geltzer.[31] One criticism of this viewpoint, even among some conservatives, is that weakening Section 230 would only lead to an even further crackdown on conservatives by social media platforms.[32]

Hawley introduced a bill in the Senate in June 2019 to modify Section 230. Affecting only the largest service providers, measured by user count or revenue, Hawley's bill would require the Federal Trade Commission (FTC) to review the content-screening policies used by these large sites every two years to make sure the screening was "politically neutral". Approval would require a supermajority vote of the FTC and would allow the company to retain its Section 230 protections; otherwise it would lose them.[33][34] Hawley's bill was met with bipartisan criticism, with only extreme right-leaning groups showing clear support. Many Republicans feared that by adding FTC oversight, the bill would continue to fuel fears of a big government with excessive oversight powers, and would lead to further reductions in free speech rather than more.[35] According to law professor Jeff Kosseff, who has spoken about Section 230 at length with Cox, Wyden, and lobbyists active at the time Section 230 was introduced, political neutrality was not the intent of Section 230; the intent was only to ensure providers had the ability to make content-removal judgments without fear of liability.[2]

Case law

Defamatory information

  • Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).[36]

Immunity was upheld against claims that AOL unreasonably delayed in removing defamatory messages posted by a third party, failed to post retractions, and failed to screen for similar postings.

  • Blumenthal v. Drudge, 992 F. Supp. 44, 49-53 (D.D.C. 1998).[37]

The court upheld AOL's immunity from liability for defamation. AOL's agreement with the contractor allowing AOL to modify or remove such content did not make AOL the "information content provider" because the content was created by an independent contractor. The Court noted that Congress made a policy choice by "providing immunity even where the interactive service provider has an active, even aggressive role in making available content prepared by others."

  • Carafano v. Metrosplash.com, Inc., 339 F.3d 1119 (9th Cir. 2003).[38]

The court upheld immunity for an Internet dating service provider from liability stemming from a third party's submission of a false profile. The plaintiff, Carafano, claimed the false profile defamed her, but because the content was created by a third party, the website was immune, even though it had provided multiple-choice selections to aid profile creation.

  • Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).[39]

Immunity was upheld for a website operator for distributing an email to a listserv where the plaintiff claimed the email was defamatory. Though there was a question as to whether the information provider intended to send the email to the listserv, the Court decided that for determining the liability of the service provider, "the focus should be not on the information provider's intentions or knowledge when transmitting content but, instead, on the service provider's or user's reasonable perception of those intentions or knowledge." The Court found immunity proper "under circumstances in which a reasonable person in the position of the service provider or user would conclude that the information was provided for publication on the Internet or other 'interactive computer service'."

  • Green v. AOL, 318 F.3d 465 (3rd Cir. 2003).[40]

The court upheld immunity for AOL against allegations of negligence. Green claimed AOL failed to adequately police its services and allowed third parties to defame him and inflict intentional emotional distress. The court rejected these arguments because holding AOL negligent in promulgating harmful content would be equivalent to holding AOL "liable for decisions relating to the monitoring, screening, and deletion of content from its network -- actions quintessentially related to a publisher's role."

  • Barrett v. Rosenthal, 40 Cal. 4th 33 (2006).[41]

Immunity was upheld for an individual Internet user from liability for republication of a defamatory statement on a listserv. The court found the defendant to be a "user of interactive computer services" and thus immune from liability for posting information passed to her by the author.

  • MCW, Inc. v. Ripoff Report/Ed Magedson/XCENTRIC Ventures LLC, 2004 WL 833595, No. Civ.A.3:02-CV-2727-G (N.D. Tex. April 19, 2004).[42]

The court rejected the defendants' motion to dismiss on the grounds of Section 230 immunity, ruling that the plaintiff's allegations that the defendants wrote disparaging report titles and headings, and themselves wrote disparaging editorial messages about the plaintiff, rendered them information content providers. The website, Ripoff Report, allows users to upload "reports" containing complaints about businesses they have dealt with.

  • Hy Cite Corp. v. Ripoff Report/Ed Magedson/XCENTRIC Ventures LLC, 418 F. Supp. 2d 1142 (D. Ariz. 2005).[43]

The court rejected immunity and found the defendant was an "information content provider" under Section 230 using much of the same reasoning as the MCW case.

False information

  • Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 830 (2002).[44]

eBay's immunity was upheld for claims based on forged autograph sports items purchased on the auction site.

  • Ben Ezra, Weinstein & Co. v. America Online, 206 F.3d 980, 984-985 (10th Cir. 2000), cert. denied, 531 U.S. 824 (2000).[45]

Immunity for AOL was upheld against liability for a user's posting of incorrect stock information.

  • Goddard v. Google, Inc., 2008 WL 5245490 (N.D. Cal. Dec. 17, 2008).[46]

Immunity was upheld against claims of fraud and money laundering. Google was not responsible for misleading advertising created by third parties who bought space on Google's pages. The court found that the creative pleading of money laundering did not cause the case to fall into the crime exception to Section 230 immunity.

  • Milgram v. Orbitz Worldwide, LLC, ESX-C-142-09 (N.J. Super. Ct. Aug. 26, 2010).[47]

Immunity for Orbitz and CheapTickets was upheld for claims based on fraudulent ticket listings entered by third parties on ticket resale marketplaces.

Sexually explicit content and minors

  • Doe v. America Online, 783 So. 2d 1010, 1013-1017 (Fl. 2001),[48] cert. denied, 122 S.Ct. 208 (2000)

The court upheld immunity against state claims of negligence based on "chat room marketing" of obscene photographs of a minor by a third party.

  • Kathleen R. v. City of Livermore, 87 Cal. App. 4th 684, 692 (2001).[49]

The California Court of Appeal upheld the immunity of a city from claims of waste of public funds, nuisance, premises liability, and denial of substantive due process. The plaintiff's child downloaded pornography from a public library's computers which did not restrict access to minors. The court found the library was not responsible for the content of the internet and explicitly found that section 230(c)(1) immunity covers governmental entities and taxpayer causes of action.

  • Doe v. MySpace, 528 F.3d 413 (5th Cir. 2008).[50]

The court upheld immunity for a social networking site from negligence and gross negligence liability for failing to institute safety measures to protect minors and failing to institute policies relating to age verification. The Does' daughter had lied about her age and communicated over MySpace with a man who later sexually assaulted her. In the court's view, the Does' allegations were "merely another way of claiming that MySpace was liable for publishing the communications."

  • Dart v. Craigslist, 665 F. Supp. 2d 961 (N.D. Ill. 2009).[51]

The court upheld immunity for Craigslist against a county sheriff's claims that its "erotic services" section constituted a public nuisance because it caused or induced prostitution.

  • Backpage.com, LLC v. McKenna, et al., Case No. C12-954-RSM[52]
  • Backpage.com, LLC v. Cooper, Case No. 12-cv-00654[53]
  • Backpage.com, LLC v. Hoffman et al., Civil Action No. 13-cv-03952 (DMC) (JAD)[54]

The court upheld immunity for Backpage in contesting a state of Washington law (SB6251)[55] that would have made providers of third-party content online liable for any crimes related to a minor in Washington State.[56] The states of Tennessee and New Jersey later passed similar legislation. Backpage argued that the laws violated Section 230, the Commerce Clause of the United States Constitution, and the First and Fifth Amendments.[55] In all three cases the courts granted Backpage permanent injunctive relief and awarded them attorney's fees.[53][57][58][59][60]

The court ruled in favor of Backpage after Sheriff Tom Dart of Cook County, Illinois, a frequent critic of Backpage and its adult postings section, sent a letter on his official stationery to Visa and MasterCard demanding that these firms "immediately cease and desist..." allowing the use of their credit cards to purchase ads on Backpage. Within two days both companies withdrew their services from Backpage.[62] Backpage filed a lawsuit asking for a temporary restraining order and a preliminary injunction against Dart granting Backpage relief and a return to the status quo prior to Dart's letter. Backpage alleged that Dart's actions were unconstitutional, violating the First and Fourteenth Amendments to the U.S. Constitution as well as Section 230 of the CDA, and asked that Dart retract his "cease and desist" letters.[63] After injunctive relief was initially denied by a lower court,[64][65] the Seventh Circuit U.S. Court of Appeals reversed that decision and directed that a permanent injunction be issued enjoining Dart and his office from taking any actions "…to coerce or threaten credit card companies…with sanctions intended to ban credit card or other financial services from being provided to" Backpage.[66] The court cited Section 230 as part of its decision.

Discriminatory housing ads

The court upheld immunity for Craigslist against Fair Housing Act claims based on discriminatory statements in postings on the classifieds website by third party users.

The Ninth Circuit Court of Appeals rejected immunity for the roommate matching service for claims brought under the federal Fair Housing Act[69] and California housing discrimination laws.[70] The court concluded that, through the manner in which the service elicited information from users concerning their roommate preferences (by having dropdowns specifying gender, presence of children, and sexual orientation), and the manner in which it utilized that information in generating roommate matches (by eliminating profiles that did not match user specifications), the matching service created or developed the information claimed to violate the FHA, and thus was responsible for it as an "information content provider." The court upheld immunity for the descriptions posted by users in the "Additional Comments" section because these were entirely created by users.


A California Appellate Court unanimously upheld immunity from state tort claims arising from an employee's use of the employer's e-mail system to send threatening messages. The court concluded that an employer that provides Internet access to its employees qualifies as a "provider . . . of an interactive service."

Failure to warn

The Ninth Circuit Court of Appeals rejected immunity from claims of negligence under California law. Doe filed a complaint against Internet Brands alleging a "failure to warn" her of a known rape scheme, despite her relationship to the company as a member of its website; Internet Brands also allegedly had the requisite knowledge to prevent future victimization by warning users of online sexual predators. The Ninth Circuit concluded that the Communications Decency Act did not bar the claim and remanded the case to the district court for further proceedings.

In February 2015, the Ninth Circuit panel set aside its 2014 opinion and set the case for reargument. In May 2016, the panel again held that Doe's case could proceed.[71][72]

Similar legislation in other countries

European Union

Directive 2000/31/EC[73] establishes a safe haven regime for hosting providers:

  • Article 14 establishes that hosting providers are not responsible for the content they host as long as (1) the acts in question are neutral intermediary acts of a mere technical, automatic and passive capacity; (2) they are not informed of its illegal character, and (3) they act promptly to remove or disable access to the material when informed of it.
  • Article 15 precludes member states from imposing general obligations to monitor hosted content for potential illegal activities.

However, the pending Article 17 of the Directive on Copyright in the Digital Single Market would make providers liable if they fail to take "effective and proportionate measures" to prevent users from uploading certain copyright violations and do not respond immediately to takedown requests.[74]


Australia

In Dow Jones & Company Inc v Gutnick,[75] the High Court of Australia treated defamatory material on a server outside Australia as having been published in Australia when it is downloaded or read by someone in Australia.

Gorton v Australian Broadcasting Commission & Anor (1973) 1 ACTR 6

Under the Defamation Act 2005 (NSW),[76] s 32, a defence to defamation is that the defendant neither knew, nor ought reasonably to have known of the defamation, and the lack of knowledge was not due to the defendant's negligence.

New Zealand

Failing to investigate the material or to make inquiries of the user concerned may amount to negligence in this context: Jensen v Clark [1982] 2 NZLR 268.


France

Directive 2000/31/EC was transposed into French law as the LCEN. Article 6 of the law establishes a safe haven for hosting providers as long as they follow certain rules.

In LICRA vs. Yahoo!, the French court ordered Yahoo! to take affirmative steps to filter out Nazi memorabilia from its auction site. Yahoo!, Inc. and its then president Timothy Koogle were also criminally charged, but acquitted.


Germany

In 1997, Felix Somm, the former managing director of CompuServe Germany, was charged with violating German child pornography laws because of the material CompuServe's network was carrying into Germany. He was convicted and sentenced to two years' probation on May 28, 1998.[77][78] He was cleared on appeal on November 17, 1999.[79][80]

The Oberlandesgericht (OLG) Cologne, an appellate court, found that an online auctioneer does not have an active duty to check for counterfeit goods (Az 6 U 12/01).[81]

In one example, the first-instance district court of Hamburg issued a temporary restraining order requiring message board operator Universal Boards to review all comments before they can be posted to prevent the publication of messages inciting others to download harmful files. The court reasoned that "the publishing house must be held liable for spreading such material in the forum, regardless of whether it was aware of the content."[82]

United Kingdom

See also: Defamation Act 2013.

The laws of libel and defamation will treat a disseminator of information as having "published" material posted by a user, and the onus will then be on a defendant to prove that it did not know the publication was defamatory and was not negligent in failing to know: Goldsmith v Sperrings Ltd (1977) 2 All ER 566; Vizetelly v Mudie's Select Library Ltd (1900) 2 QB 170; Emmens v Pottle & Ors (1885) 16 QBD 354.

In an action against a website operator based on a statement posted on the website, it is a defence to show that it was not the operator who posted the statement. The defence is defeated if the claimant shows that it was not possible to identify the person who posted the statement, that the claimant gave the operator a notice of complaint, and that the operator failed to respond in accordance with regulations.


References

  1. ^ "Section 230 as First Amendment Rule". Harvard Law Review. 131: 2027. May 10, 2018. Retrieved June 21, 2019.
  2. ^ a b c Robertson, Adi (June 21, 2019). "Why The Internet's Most Important Law Exists And How People Are Still Getting It Wrong". The Verge. Retrieved June 21, 2019.
  3. ^ Stratton Oakmont, Inc. v. Prodigy Services Co., 31063/94, 1995 WL 323710, 1995 N.Y. Misc. LEXIS 712 Archived 2009-04-17 at the Wayback Machine (N.Y. Sup. Ct. 1995).
  4. ^ Congressional Record. 140: H 8478. August 4, 1995.
  5. ^ Reno v. ACLU, 521 U.S. 844, 885 (United States Supreme Court 1997).
  6. ^ a b c Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997)
  7. ^ 47 U.S.C. § 230(e)(1) (criminal) and (2) (intellectual property); see also Gucci America, Inc. v. Hall & Associates, 135 F. Supp. 2d 409 (S.D.N.Y. 2001) (no immunity for contributory liability for trademark infringement).
  8. ^ Backpage.com, LLC v. McKenna, 881 F. Supp. 2d 1262 (W.D. Wash. 2012).
  9. ^ Voicenet Commc'ns, Inc. v. Corbett, 2006 WL 2506318, 4 (E.D.Pa. Aug. 30, 2006).
  10. ^ Perfect 10, Inc. v. CCBill, LLC, 481 F.3d 751 (9th Cir. Mar. 29, 2007, amended May 31, 2007).
  11. ^ Cf. Carafano v. Metrosplash.com, Inc., 339 F.3d 1119 (9th Cir. Aug. 13, 2003) (dismissing, inter alia, right of publicity claim under Section 230 without discussion). But see Doe v. Friendfinder Network, Inc., 540 F.Supp.2d 288 (D.N.H. 2008) (Section 230 does not immunize against state IP claims, including right of publicity claims).
  12. ^ Doe v. Friendfinder Network, Inc., 540 F.Supp.2d 288 (D.N.H. 2008).
  13. ^ Staff, Ars (2017-12-23). "How do you change the most important law in Internet history? Carefully". Ars Technica. Retrieved 2017-12-26.
  14. ^ Jackman, Tom (2017-08-01). "Senate launches bill to remove immunity for websites hosting illegal content, spurred by Backpage.com". Washington Post. ISSN 0190-8286. Retrieved 2017-12-26.
  15. ^ Ann, Wagner (March 21, 2018). "H.R.1865 - 115th Congress (2017-2018): Allow States and Victims to Fight Online Sex Trafficking Act of 2017".
  16. ^ Elizabeth Dias (2018-04-11). "Trump Signs Bill Amid Momentum to Crack Down on Trafficking". New York Times. Retrieved 2018-04-11.
  17. ^ Larry Magid (2018-04-06). "DOJ Seizes Backpage.com Weeks After Congress Passes Sex Trafficking Law". Forbes. Retrieved 2018-04-08.
  18. ^ "ACLU letter opposing SESTA". American Civil Liberties Union. Retrieved 2018-03-25.
  19. ^ "SWOP-USA stands in opposition of disguised internet censorship bill SESTA, S. 1963". Sex Workers Outreach Project. Retrieved 2017-10-23.
  20. ^ "Wikipedia warns that SESTA will strip away protections vital to its existence". The Verge. Retrieved 2018-03-08.
  21. ^ "Sex trafficking bill is turning into a proxy war over Google". The Verge. Retrieved 2017-09-20.
  22. ^ Quinn, Melissa. "Tech community fighting online sex trafficking bill over fears it will stifle innovation". Washington Examiner. Retrieved 2017-09-20.
  23. ^ "How a New Senate Bill Will Screw Over Sex Workers". Rolling Stone. Retrieved 2018-03-25.
  24. ^ Zimmerman, Amy (2018-04-04). "Sex Workers Fear for Their Future: How SESTA Is Putting Many Prostitutes in Peril". The Daily Beast. Retrieved 2018-04-07.
  25. ^ Zhou, Li; Scola, Nancy; Gold, Ashley (November 1, 2017). "Senators to Facebook, Google, Twitter: Wake up to Russian threat". Politico. Retrieved March 12, 2019.
  26. ^ Josh Hawley attacks CDA 230
  27. ^ Takeaways from SOTN 2019
  28. ^ Google CEO Sundar Pichai Faces Lawmakers Skeptical Over Privacy, Alleged Anti-Conservative Bias
  29. ^ H.R.7363 Biased Algorithm Deterrence Act of 2018
  30. ^ Gohmert Introduces Bill That Removes Liability Protections for Social Media Companies That Use Algorithms to Hide, Promote, or Filter User Content
  31. ^ The President and Congress Are Thinking of Changing This Important Internet Law
  32. ^ Expect More Conservative Purges on Social Media If Republicans Target Section 230
  33. ^ [1]
  34. ^ Wellons, Mary Catherine (June 18, 2019). "GOP senator introduces a bill that would blow up business models for Facebook, YouTube and other tech giants". CNBC. Retrieved June 21, 2019.
  35. ^ Lecher, Colin (June 21, 2019). "Both parties are mad about a proposal for federal anti-bias certification". The Verge. Retrieved June 21, 2019.
  36. ^ Zeran v. AOL Archived 2008-10-31 at the Wayback Machine, 129 F.3d 327 (4th Cir. 1997).
  37. ^ Blumenthal v. Drudge, 992 F. Supp. 44, 49-53 (D.D.C. 1998).
  38. ^ Carafano v. Metrosplash.com, Inc., 339 F.3d 1119 (9th Cir. 2003).
  39. ^ Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).
  40. ^ Green v. AOL Archived 2008-05-12 at the Wayback Machine, 318 F.3d 465 (3rd Cir. 2003).
  41. ^ Barrett v. Rosenthal, 40 Cal. 4th 33 (2006).
  42. ^ MCW, Inc. v. Badbusinessbureau.com, L.L.C., 2004 WL 833595, No. Civ.A.3:02-CV-2727-G (N.D. Tex. April 19, 2004).
  43. ^ Hy Cite Corp. v. Badbusinessbureau.com, L.L.C., 418 F. Supp. 2d 1142 (D. Ariz. 2005).
  44. ^ Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 830 (2002).
  45. ^ Ben Ezra, Weinstein & Co. v. America Online Archived 2008-07-24 at the Wayback Machine, 206 F.3d 980 (10th Cir. 2000).
  46. ^ Goddard v. Google, Inc., C 08-2738 JF (PVT), 2008 WL 5245490, 2008 U.S. Dist. LEXIS 101890 (N.D. Cal. Dec. 17, 2008).
  47. ^ Milgram v. Orbitz Worldwide, LLC, ESX-C-142-09 (N.J. Super. Ct. Aug. 26, 2010).
  48. ^ Doe v. America Online Archived 2009-05-23 at the Wayback Machine, 783 So. 2d 1010 (Fla. 2001).
  49. ^ Kathleen R. v. City of Livermore, 87 Cal. App. 4th 684 (2001)
  50. ^ Doe v. MySpace, 528 F.3d 413 (5th Cir. 2008)
  51. ^ Dart v. Craigslist, 665 F. Supp. 2d 961 (N.D. Ill. Oct. 20, 2009)
  52. ^ BACKPAGE.COM LLC, Plaintiff, and THE INTERNET ARCHIVE, Plaintiff-Intervenor, vs. ROB MCKENNA, Attorney General of Washington, et al., Defendants, in their official capacities (United States District Court Western District of Washington at Seattle July 30, 2012).
  53. ^ a b BACKPAGE.COM, LLC v. ROBERT E. COOPER, JR., et al., Defendants, Case 3:12-cv-00654, Document 88 (The United States District Court for the Middle District of Tennessee, Nashville Division May 22, 2014).
  54. ^ BACKPAGE.COM, LLC, Plaintiff, v. JOHN JAY HOFFMAN, Acting Attorney General of the State of New Jersey, et al., Defendants, in their official capacities; THE INTERNET ARCHIVE, Plaintiff-Intervenor, v. JOHN JAY HOFFMAN, Acting Attorney General of the State of New Jersey, et al., Defendants, in their official capacities (United States District Court for the District of New Jersey June 28, 2013).
  55. ^ a b DMLP Staff (2 August 2012). "Backpage.com v. McKenna, et al". Digital Media Law Project. Retrieved 18 May 2014.
  56. ^ 62nd Legislature 2012 Regular Session. "Certification of Enrollment: Engrossed Substitute Senate Bill 6251" (PDF). Washington State Legislature. Retrieved 2014-05-18.
  57. ^ "Judgment in a Civil Case: Backpage.com, LLC and The Internet Archive v. Rob McKenna, Attorney General of the State of Washington, et al" (PDF). United States District Court for the Western District of Washington. 2012-12-10. Case Number C12-954RSM, Document 87. Retrieved 2014-05-18.
  58. ^ Nissenbaum, Gary (May 29, 2014). "Are Internet Publishers Responsible for Advertisements for Potential Sexual Liaisons with Minors?". Nissenbaum Law Group, LLC. Retrieved January 21, 2016.
  59. ^ BACKPAGE.COM, LLC, Plaintiff, v. JOHN JAY HOFFMAN, Acting Attorney General of the State of New Jersey, et al.; Defendants, in their official capacities. & THE INTERNET ARCHIVE, Plaintiff-Intervenor, v. JOHN JAY HOFFMAN, Acting Attorney General of the State of New Jersey, et al.; Defendants, in their official capacities., CIVIL ACTION NO. 2:13-03952 (CCC-JBC) (The United States District Court for the District of New Jersey May 14, 2014).
  60. ^ BACKPAGE.COM, LLC, Plaintiff, v. JOHN JAY HOFFMAN, Acting Attorney General of the State of New Jersey, et al.; Defendants, in their official capacities. - THE INTERNET ARCHIVE, Plaintiff-Intervenor, v. JOHN JAY HOFFMAN, Acting Attorney General of the State of New Jersey, et al. (The United States District Court for the District of New Jersey May 13, 2014).
  61. ^ BACKPAGE.COM, LLC, Plaintiff-Appellant, v. THOMAS J. DART, Sheriff of Cook County, Illinois, Defendant-Appellee (United States Court of Appeals for the Seventh Circuit November 30, 2015).
  62. ^ Stempel, Jonathan (November 30, 2015). "Backpage.com wins injunction vs Chicago sheriff over adult ads". Reuters. Retrieved January 21, 2016.
  63. ^ Sneed, Tierney (July 21, 2015). "Backpage Sues Chicago Sheriff Over Pressure Campaign to Stop Sex Ads". Talking Points Memo. Retrieved January 21, 2016.
  64. ^ BACKPAGE.COM, LLC, Plaintiff, v. SHERIFF THOMAS J. DART, Defendant, No. 15 C 06340 (The United States District Court Northern District of Illinois, Eastern Division July 24, 2015).
  65. ^ "BACKPAGE.COM, LLC v. DART". Leagle. August 24, 2015. Retrieved February 19, 2016.
  66. ^ BACKPAGE.COM, LLC, Plaintiff-Appellant, v. THOMAS J. DART, Sheriff of Cook County, Illinois, Defendant-Appellee (United States Court of Appeals for the Seventh Circuit November 30, 2015).
  67. ^ Chicago Lawyers' Committee For Civil Rights Under Law, Inc. v. Craigslist, Inc. Archived 2008-05-22 at the Wayback Machine 519 F.3d 666 (7th Cir. 2008).
  68. ^ Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc).
  69. ^ 42 U.S.C. § 3604(c).
  70. ^ Cal. Gov. Code § 12955 Archived 2010-08-02 at the Wayback Machine.
  71. ^ Proctor, Katherine (May 31, 2016). "Raped Model's Suit Against Website Revived". Courthouse News Service. Retrieved June 1, 2016.
  72. ^ Jane Doe No. 14 v. Internet Brands, Inc., no. 12-56638 (9th Cir. May 31, 2016).
  73. ^ "EUR-Lex - 32000L0031 - EN".
  74. ^ "Proposal for a directive on copyright in the Digital Single Market" (PDF). 25 May 2018. p. 26.
  75. ^ "Dow Jones & Company Inc. v Gutnick [2002] HCA 56 (10 December 2002)".
  76. ^ "Defamation Act 2005".
  77. ^ "The CompuServe Germany Case". Archived from the original on 2004-02-25. Retrieved 2003-11-23.
  78. ^ Christopher Kuner. "Judgment of the Munich Court in the "CompuServe Case" (Somm Case)".
  79. ^ Prof. Dr. Ulrich Sieber. "Commentary on the Conclusion of Proceedings in the "CompuServe Case"".
  80. ^ "World: Europe Ex-CompuServe boss acquitted". BBC. 17 November 1999.
  81. ^ Noogie C. Kaufmann (12 March 2004). "BGH: Online-Auktionshäuser müssen Angebote von Plagiaten sperren" [BGH: Online auction houses must block listings of counterfeit goods]. heise online.
  82. ^ "heise online - IT-News, Nachrichten und Hintergründe" [heise online - IT news, reports and background]. heise online. Archived from the original on 2008-10-22.

External links