The scenarios envisioned in the way the bill is written don't actually apply sanely to how the Internet works. Anyone can send any number of (valid or not) reports to any service provider. That service provider then has to somehow decide if every one of those reports fits the definitions of an intimate visual depiction, and, if so, take it down. There's nothing preventing someone from making fraudulent claims, nor any punishment for doing so. The requirements in Section (3)(a)(1)(B) are trivial to automate ("Hi, my name is So-and-so. The image at URL is an intimate image of me posted without my consent. For reference, URL is a picture of me that confirms that URL contains a picture of me. My email address is [email protected]." satisfies the requirements of that section, at a glance).
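The fill-in-the-blanks nature of that notice is the whole problem. A minimal sketch of how a bulk submitter could stamp out facially valid notices (every name, URL, and address here is invented for illustration; nothing in the bill's Section (3)(a)(1)(B) requires verifying any of the claims):

```python
# Hypothetical sketch: a Section (3)(a)(1)(B)-style notice is just a
# template, so nothing technical stops generating them in bulk.
# All names, URLs, and addresses below are made up.

NOTICE_TEMPLATE = (
    "Hi, my name is {name}. The image at {target_url} is an intimate image "
    "of me posted without my consent. For reference, {reference_url} is a "
    "picture of me that confirms that {target_url} contains a picture of me. "
    "My email address is {email}."
)

def make_notice(name, target_url, reference_url, email):
    """Fill in the template -- no claim in it is ever checked."""
    return NOTICE_TEMPLATE.format(
        name=name,
        target_url=target_url,
        reference_url=reference_url,
        email=email,
    )

# A bad actor could loop this over any list of target URLs:
targets = [
    "https://example.com/img1.jpg",
    "https://example.com/img2.jpg",
]
notices = [
    make_notice("So-and-so", url, "https://example.com/ref.jpg",
                "someone@example.com")
    for url in targets
]
```

The provider receiving these has no cheap way to distinguish them from legitimate notices, which is the asymmetry the rest of this thread argues about.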

The limitation on liability is only saying they're not responsible for the consequences of taking something down, not for the consequences of leaving something up.

That, plus the FTC being able to sue any company for the inevitable false negatives that will happen means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them. It'll inevitably be abused for spurious takedowns way more than the DMCA already is.



> That, plus the FTC being able to sue any company for the inevitable false negatives that will happen means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them.

... or to finally hire enough moderators to make competent judgments, avoiding a countersuit for restricting free speech.


The safe harbor provision ensures that they don't have to worry about getting sued for restricting free speech.


> fits the definitions of an intimate visual depiction,

Hardly seems difficult. I think a lot of services have TOSes which cover this type of content. The text of the bill also plainly defines what is covered.

> are trivial to automate

And removal is trivial to automate. I'm pretty sure providers already have systems which cover this case. Those that don't likely don't allow posting of pornographic material whether it's consensual or not.

> they're not responsible for the consequences of taking something down

So the market for "intimate depictions" got a little harder to participate in. This is a strange hill to fight over.

> It'll inevitably be abused for spurious takedowns

Of pornographic content. The law is pretty well confined to "visual depictions." I can see your argument on its technical merits; I just can't rationalize it into the real world other than for some absurdly narrow cases.


How does your trivial removal automation distinguish between 'intimate depictions' and 'political imagery I dislike'?

The whole point of this discussion is that this is going to be used to censor everything, not just 'intimate visual depictions'.


> How does your trivial removal automation distinguish between 'intimate depictions' and 'political imagery I dislike'?

It doesn't; however, you would immediately have a civil case against the person and/or their representative for their false claim. That's not spelled out in the bill but it should be obvious.

> The whole point of this is discussion is that this is going to be used to censor everything

That's the claim. You may accept it without objection. I simply do not. Now I'm offering a slightly modified discussion. Is that alright?

> not just 'intimate visual depictions'.

I'm sure you would agree that any automation would obviously only be able to challenge images. This does create a vulnerability to be sure, but I do not agree that it automatically creates the wholesale censorship of political speech that you or the EFF envisions here.

It also makes efforts at being scoped only to sites which rely on user-generated content, effectively limiting it to social media platforms and certain types of adult content websites. Due to their nature, it's already likely that these social media platforms do _not_ allow adult content on their websites and have well-developed mechanisms to handle this precise problem.

The bill could be refined for civil liberties' sake; however, in its current state, I fail to see the extreme danger of it.


>however, you would immediately have a civil case against the person and/or their representative for their false claim. That's not spelled out in the bill but it should be obvious.

Wow. Not sure if this is ludicrously bad faith or just ludicrously naive/ignorant/unthinking but it's ludicrous either way. Plenty enough to nullify every other thing you attempt to say on the topic.


> however, you would immediately have a civil case against the person and/or their representative for their false claim.

Look how well that's worked against DMCA and Youtube copyright strike abuse. When the posts being taken down are not commercial, the difficulty of proving damages means that the effectiveness of such a deterrent is minimal. The act could have put in some sort of disincentive against fraudulent reports, or even provided a way to find out why your stuff was taken down, but notably does not do so.


> It doesn't; however, you would immediately have a civil case against the person and/or their representative for their false claim.

Great. So a motivated bad actor can send out 10,000,000 bogus takedowns for images promoting political positions and people they disagree with, and they have to be taken down immediately, and then all 10 million people affected have to individually figure out who the hell actually submitted the takedowns, and have enough money for lawyers, and enough time to engage in a civil suit, and in the end they might get money, if they can somehow prove that taking down those specific images damaged them personally, rather than just a cause they believe in, but will they get the images restored?

This just smacks of obliviousness and being woefully out of touch, even before we get to

> That's not spelled out in the bill but it should be obvious.

...which almost makes it sound like this whole thing is just an elaborate troll.



