Bill combines the worst of online censorship schemes
The House of Representatives is about to vote on a bill that would force online platforms to censor their users. The Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA, H.R. 1865) might sound noble, but it would do nothing to stop sex traffickers. What it would do is force online platforms to police their users’ speech more forcefully than ever before, silencing legitimate voices in the process.
Back in December, we said that while FOSTA was a very dangerous bill, its impact on online spaces would not be as broad as the Senate bill, the Stop Enabling Sex Traffickers Act (SESTA, S. 1693).
That’s about to change.
The House Rules Committee is about to approve a new version of FOSTA [.pdf] that incorporates most of the dangerous components of SESTA. This new Frankenstein’s Monster of a bill would be a disaster for Internet intermediaries, marginalized communities, and even trafficking victims themselves.
FOSTA would undermine Section 230, the law protecting online platforms from some types of liability for their users’ speech. As we’ve explained before, the modern Internet is only possible thanks to a strong Section 230. Without Section 230, most of the online platforms we use would never have been formed—the risk of liability for their users’ actions would have simply been too high.
Section 230 strikes an important balance for when online platforms can be held liable for their users’ speech. Contrary to FOSTA supporters’ claims, Section 230 does nothing to protect platforms that break federal criminal law. In particular, if an Internet company knowingly engages in the advertising of sex trafficking, the U.S. Department of Justice can and should prosecute it. Additionally, Internet companies are not immune from civil liability for user-generated content if plaintiffs can show that a company had a direct hand in creating the illegal content.
The new version of FOSTA would destroy that careful balance, opening platforms to increased criminal and civil liability at both the federal and state levels. This includes a new federal sex trafficking crime targeted at web platforms (in addition to 18 U.S.C. § 1591)—but which would not require a platform to have knowledge that people are using it for sex trafficking purposes. This also includes exceptions to Section 230 for state law criminal prosecutions against online platforms, as well as civil claims under federal law and civil enforcement of federal law by state attorneys general.
Perhaps most disturbingly, the new version of FOSTA would make the changes to Section 230 apply retroactively: a platform could be prosecuted for failing to comply with the law before it was even passed.
Together, these measures would chill innovation and competition among Internet companies. Large companies like Google and Facebook may have the budgets to survive the massive increase in litigation and liability that FOSTA would bring. They may also have the budgets to implement a mix of automated filters and human censors to comply with the law. Small startups don’t. And with the increased risk of litigation, it would be difficult for new startups ever to find the funding they need to compete with Google.
Today’s large Internet companies would not have grown to prominence without the protections of Section 230. FOSTA would raise the ladder that has allowed those companies to grow, making it very difficult for newcomers ever to compete with them.
More dangerous still is the impact that FOSTA would have on online speech. Facing the threat of extreme criminal and civil penalties, web platforms large and small would have little choice but to silence legitimate voices. Supporters of SESTA and FOSTA pretend that it’s easy to distinguish online postings related to sex trafficking from ones that aren’t. It’s not—and it’s impossible at the scale needed to police a site as large as Facebook or Reddit. The problem is compounded by FOSTA’s expansion of federal prostitution law. Platforms would have to take extreme measures to remove a wide range of postings, especially those related to sex.
Some supporters of these bills have argued that platforms can rely on automated filters in order to distinguish sex trafficking ads from legitimate content. That argument is laughable. It’s difficult for a human to distinguish between a legitimate post and one that supports sex trafficking; a computer certainly could not do it with anything approaching 100% accuracy. Instead, platforms would have to calibrate their filters to over-censor. When web platforms rely too heavily on automated filters, it often puts marginalized voices at a disadvantage.
Most tragically of all, the first people censored would likely be sex trafficking victims themselves. The very same words and phrases that a filter would use to attempt to delete sex trafficking content would also be used by victims of trafficking trying to get help or share their experiences.
There are many, many stories of traffickers being caught by law enforcement thanks to clues that police officers and others found on online platforms. Congress should think long and hard before dismantling the very tools that have proven most effective in fighting trafficking.
There is no amendment to FOSTA that would make it effective at fighting online trafficking while respecting the civil liberties of everyone online. That’s because the problem with FOSTA and SESTA isn’t a single provision or two; it’s the whole approach.
Creating more legal tools to go after online platforms would not punish sex traffickers.
It would punish all of us, wrecking the safe online communities that we use every day. And in the process, it would also undermine the tools that have proven most effective at putting traffickers in prison. FOSTA is not the right solution, and no trimming around the edges will make it the right solution.
If you care about protecting the safety of our online communities—if you care about protecting everyone’s right to speak online, even about sensitive topics—we urge you to call your representative today and tell them to reject FOSTA.