
Here's the text.

https://www.congress.gov/bill/119th-congress/senate-bill/146...

I can't identify where the EFF's concerns are coming from. There's a specific limitation of liability for online platforms, and the entire process appears to be complaint-driven and requires quite a bit of evidence from the complainant.

What actually concerns me in this bill?

> (B) INVOLVING MINORS.—Except as provided in subparagraph (C), it shall be unlawful for any person, in interstate or foreign commerce, to use an interactive computer service to knowingly publish a digital forgery of an identifiable individual who is a minor with intent to—

> “(i) abuse, humiliate, harass, or degrade the minor; or

> “(ii) arouse or gratify the sexual desire of any person.

> “(C) EXCEPTIONS.—Subparagraphs (A) and (B) shall not apply to—

> “(i) a lawfully authorized investigative, protective, or intelligence activity of—

> “(I) a law enforcement agency of the United States, a State, or a political subdivision of a State; or

> “(II) an intelligence agency of the United States;

Wut? Why do you need this? Are we the baddies?



The scenarios the bill envisions don't map sanely onto how the Internet actually works. Anyone can send any number of (valid or not) reports to any service provider. That service provider then has to somehow decide whether each of those reports fits the definition of an intimate visual depiction and, if so, take the content down. There's nothing preventing someone from making fraudulent claims, nor any punishment for doing so. The requirements in Section (3)(a)(1)(B) are trivial to automate ("Hi, my name is So-and-so. The image at URL is an intimate image of me posted without my consent. For reference, URL is a picture of me that confirms that URL contains a picture of me. My email address is [email protected]." satisfies the requirements of that section, at a glance).
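
To make "trivial to automate" concrete, here's a rough sketch in Python. Everything in it is invented for illustration: the bill specifies what a request must contain, not how it gets submitted, so the template, the example.com URLs, and the way targets are collected are just placeholders.

    # Illustrative only: mass-produce notices that nominally contain the
    # elements Section (3)(a)(1)(B) asks for: who you claim to be, where the
    # image is, a statement of non-consent, and contact info. Nobody verifies
    # any of it before it lands in a provider's queue.
    TEMPLATE = (
        "Hi, my name is {name}. The image at {target} is an intimate image of "
        "me posted without my consent. For reference, {reference} is a picture "
        "of me that confirms that {target} contains a picture of me. "
        "My email address is {email}."
    )

    def build_notices(name, email, reference, targets):
        # 'targets' can be any list of URLs scraped off the platform.
        return [TEMPLATE.format(name=name, email=email,
                                reference=reference, target=t)
                for t in targets]

    # All addresses and URLs below are placeholders.
    notices = build_notices("So-and-so", "someone@example.com",
                            "https://example.com/any-photo.jpg",
                            ["https://example.com/post/1",
                             "https://example.com/post/2"])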

The limitation on liability is only saying they're not responsible for the consequences of taking something down, not for the consequences of leaving something up.

That, plus the FTC being able to sue any company for the inevitable false negatives that will happen means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them. It'll inevitably be abused for spurious takedowns way more than the DMCA already is.


> That, plus the FTC being able to sue any company for the inevitable false negatives that will happen means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them.

... or to finally hire enough moderators to make competent judgements to avoid getting a counter lawsuit for restricting free speech.


The safe harbor provision ensures that they don't have to worry about getting sued for restricting free speech.


> fits the definitions of an intimate visual depiction,

Hardly seems difficult. I think a lot of services have TOSes which cover this type of content. The text of the bill also plainly defines what is covered.

> are trivial to automate

And removal is trivial to automate. I'm pretty sure providers already have systems which cover this case. Those that don't likely don't allow posting of pornographic material whether it's consensual or not.
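
A minimal sketch of what that automated removal path could look like, with the Report fields and the content_store API invented purely for illustration (real providers have their own abuse-report plumbing):

    from dataclasses import dataclass

    # Illustrative only: a notice carrying the required elements gets acted on;
    # hide() stands in for whatever takedown mechanism a provider already uses
    # for TOS-violating content.
    @dataclass
    class Report:
        requester_name: str
        requester_email: str
        target_url: str
        states_nonconsent: bool  # the "posted without my consent" statement

    def handle_report(report, content_store):
        # Confirm the required elements are present, then remove the content.
        if not (report.requester_name and report.requester_email
                and report.target_url and report.states_nonconsent):
            return "rejected: incomplete notice"
        content_store.hide(report.target_url)  # hypothetical storage API
        return "removed"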

> they're not responsible for the consequences of taking something down

So the market for "intimate depictions" got a little harder to participate in. This is a strange hill to fight over.

> It'll inevitably be abused for spurious takedowns

Of pornographic content. The law is pretty well confined to "visual depictions." I can see your argument on its technical merits, but I just can't rationalize it into the real world other than for some absurdly narrow cases.


How does your trivial removal automation distinguish between 'intimate depictions' and 'political imagery I dislike'?

The whole point of this discussion is that this is going to be used to censor everything, not just 'intimate visual depictions'.


> How does your trivial removal automation distinguish between 'intimate depictions' and 'political imagery I dislike'?

It doesn't; however, you would immediately have a civil case against the person and/or their representative for their false claim. That's not spelled out in the bill but it should be obvious.

> The whole point of this is discussion is that this is going to be used to censor everything

That's the claim. You may accept it without objection. I simply do not. Now I'm offering a slightly modified discussion. Is that alright?

> not just 'intimate visual depictions'.

I'm sure you would agree that any automation would obviously only be able to challenge images. This does create a vulnerability to be sure, but I do not agree that it automatically creates the wholesale censorship of political speech that you or the EFF envisions here.

It also makes an effort at being scoped only to sites which rely on user-generated content, effectively limiting it to social media platforms and certain types of adult content websites. Due to their nature, it's already likely that these social media platforms do _not_ allow adult content on their websites and have well-developed mechanisms to handle this precise problem.

The bill could be refined for civil liberties' sake; however, in its current state, I fail to see the extreme danger of it.


>however, you would immediately have a civil case against the person and/or their representative for their false claim. That's not spelled out in the bill but it should be obvious.

Wow. Not sure if this is ludicrously bad faith or just ludicrously naive/ignorant/unthinking but it's ludicrous either way. Plenty enough to nullify every other thing you attempt to say on the topic.


> however, you would immediately have a civil case against the person and/or their representative for their false claim.

Look how well that's worked against DMCA and YouTube copyright strike abuse. When the posts being taken down aren't commercial, the need to prove damages means the effectiveness of such a deterrent is minimal. The act could have put in some sort of disincentive against fraudulent reports, or even provided a way to find out why your stuff was taken down, but notably does not do so.


> It doesn't; however, you would immediately have a civil case against the person and/or their representative for their false claim.

Great. So a motivated bad actor can send out 10,000,000 bogus takedowns for images promoting political positions and people they disagree with, and they have to be taken down immediately, and then all 10 million people affected have to individually figure out who the hell actually submitted the takedowns, and have enough money for lawyers, and enough time to engage in a civil suit, and in the end they might get money, if they can somehow prove that taking down those specific images damaged them personally, rather than just a cause they believe in, but will they get the images restored?

This just smacks of obliviousness and being woefully out of touch, even before we get to

> That's not spelled out in the bill but it should be obvious.

...which almost makes it sound like this whole thing is just an elaborate troll.


The limitation of liability is itself concerning because it means platforms won't care about fake takedowns. This law doesn't even have the counter notice process that DMCA has. You say take down X, they take it down, they're not liable. There's no appeal process that I see.


It applies to a narrow category of providers.

> IN GENERAL.—The term “covered platform” means a website, online service, online application, or mobile application—

> (i) that serves the public; and

> (ii) (I) that primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files; or

> (II) for which it is in the regular course of trade or business of the website, online service, online application, or mobile application to publish, curate, host, or make available content of nonconsensual intimate visual depictions.

If you publish the content on your own website or on certain public websites, you don't even have to respond to these requests. You do, however, open yourself to criminal and civil liability for publicly hosting any nonconsensual intimate visual depictions. Your provider is also excluded from these requirements and cannot legally be compelled to participate in their removal.


This sounds like any online forum of any kind, commercial or hobbyist, which allows uploading images. I guess you could call that a narrow category, but I wouldn't.


What’s stopping someone from taking down this post of yours?

Assume they are a serial liar. Or that they are a political opponent with zero shame.


Sounds like something meant to allow the vice squad to post "selfies" when they're pretending to be kids?


We need it so I don't file a takedown request against what you just posted because I disagree with it.


Sting operation maybe?


The article says the takedown section is much broader than other sections.



