Hacker News | colpabar's comments

I don't think this is a Japanese thing. The way they are askew feels familiar; I have definitely seen food that looks weirdly “off” on other menus. It's probably just a way to stand out, like how so many models have gaps between their two front teeth. You're gonna remember the one that's different.


Any other examples of burgers shown like this?


Usually threads like this fill up with comments about how "think of the children" is always a lie used to justify something draconian. I agree with that to an extent, but for those who think that applies here, is there _nothing_ we can do as a society to address this? I'd like a better answer than "let parents deal with it," and the whole "this wouldn't be a problem if america wasn't so puritanical maaaaaan" argument is total bullshit that completely ignores the young girls who get hurt by this.


I'm a parent, and all I can think of as a solution is to devalue nudity. I'm not saying we should all walk around naked, but we should make sure children understand the human body earlier, and why we wear clothes, the values and customs, without the stigma. That also makes it easier to teach what's okay and what's not, and if someone does something "not okay", they are easier to identify as the ones responsible. Fake or real, a nude photo should give no control to anyone.

It's extremely easy to do all sorts of weird things with AI, all locally. Controlling that means controlling the hardware, something none of us wants, and it will only get easier, I suppose. So doing what I said earlier becomes even easier if this stuff gets automatically devalued by commoditization.

I'm not saying I know better than the people who want restrictions, and I'm not trying to offer yet another "ban all bans" opinion; I just don't see any other realistic solution. There are, however, many people much more knowledgeable than me in these matters, so maybe I'll be positively surprised.


It won't work, because if nudity doesn't get the rise those bullies crave, they will move toward depicting sexual acts. And you don't want to devalue that; girls have enough trouble reporting SA as it is.


Perhaps they observe being "pure" and "clean" being valued too much, and they see again and again the people who had their sex tapes leaked getting bad publicity. Perhaps that's why they cannot report it? I remember Sibel Kekilli's German family (of Turkish origin, exactly like me) disowning her because someone found some porn movie of her. Also this:

> In 2017, Kekilli blocked her Instagram account from being accessed in Turkey, saying that users from that country had sent a multitude of abusive and threatening messages.

and this:

> A discussion has been trending on X (formerly Twitter) after a post featuring side-by-side images of Sibel Kekilli from the early 2000s and her later look in the popular series Game of Thrones. The caption read, “She was once a p** star, but HBO gave her the role anyway,” which has garnered close to 10 million views.

from https://www.yahoo.com/entertainment/celebrity/articles/inter...

Why should that even matter? So you can see her naked online... So what?! I'm reading "Sluts" by Beth Ashley, and she identifies these patterns perfectly in today's world, even when they're not obvious.

Males too: they value being tough too much, so they don't report abuse. One example is Chester Bennington (may he rest in peace):

> Bennington was afraid to ask for help, not wanting people to think he was gay or a liar, and the abuse continued until age 13

I have a son, and I would be devastated if I couldn't give my son the courage to report something like this. I'm thankful that in this day and age, the "gay" stigma is much less pronounced (like 0 in this wonderful country called Germany). That said, we still have a lot to do though!

So how do we solve this, other than by making porn and nudity nothing sacred? I remember first coming to Germany and people being confused because I was too shy to change my clothes in the men's changing room with everyone else. Then I realized... Everyone is naked, why should I be ashamed?


> blocked her Instagram account from being accessed in Turkey

Wait, you buried the lede here! Instagram can block countries? How can I do that?


I agree. My partner works in schools, and recently she was talking about how they now run classes telling the kids not to send nude pictures to each other, because it will ruin their careers and, if they get out, people will bully them, etc.

Of course I agree with teaching kids that people might have various views about nudity, but I think teaching them that if they take nude photos of themselves it is the end of the world and will inflict permanent damage on their reputation, as a means of trying to prevent it happening, is absurd.

I think if anything the opposite would be the better solution – to teach kids that it's perfectly normal and respectable in this day and age for people to share nudes with each other, but that it's important to trust those you share the nudes with if you don't want them getting out.

Similarly with deepfakes, I don't think we should be telling kids how awful it is for them to be deepfaked and that they are a victim, etc., but that this is just something that's likely to happen these days, and while it's disrespectful, and while they have a right to be angry, it's also not something to get overly worked up about.

I just think we have to be pragmatic about this. The only reason there's any shame in any of this is because we have a societal stigma around nudity. You're not going to get rid of deepfakes and nudes being leaked, but you might be able to change attitudes such that it doesn't really matter.


This isn't just nudity.

These fakes are made of young women with what looks like cum all over them or in a pose to give a blowjob or be penetrated. Devaluing nudity does not change how people interpret porn.


Sounds like people here want to outlaw non-physical abuse, but that contradicts the first amendment, doesn't it?


No. The first amendment does not provide unlimited protection for all things that resemble speech. Students have further limited speech protections, and sexual speech by students at school has been expressly found to not be protected by the first amendment. Abusive and bullying behavior is similarly unprotected. And while I don't believe that there has been a first amendment challenge to deepfake revenge porn laws, I'd be stunned if such a law didn't survive strict scrutiny.


Democratize the spank bank


I think a pretty good portion of parents would agree with a blanket "your phone shall not be seen during school hours" policy. Something that would probably need to be decided per district.


I think in this instance the issue is that these tools shouldn't be available at all to anyone, adult or child. There are not many use-cases for generating porn from images of real people who aren't porn stars and most of them are not good.


How much violence (government control) do you need to exert to block all of these tools? How successful have we been blocking pirated content?

I don't think the genie is easily returned to the bottle and the cure may be worse than the disease.


Well, I'd start by making the production of deepfake porn that depicts real people illegal, doubly so if it depicts children or teens. Fictional depictions of underage people in sexual situations are already illegal in many jurisdictions.

I don't necessarily like the idea of punitive justice in general but that's the framework under which our society is currently operating so it's time to make lemonade I guess.


I used to think the same about pirated content until I traveled to a country that has been successful in blocking it. It's impossible to even find a free movie on YouTube there. The government really cracked down on it and succeeded.


What country was this, specifically? I have yet to hear of a country that successfully eliminated 100% of piracy, so I'm very curious which country this is. Obviously, most piracy sits outside of YouTube; not sure why you're using that as an example, since it's a US property and you're clearly not talking about the US.


Yeah, we've pretty successfully gotten rid of online child porn from like front page search results and public social media, there's no reason why we couldn't do the same for this.


The article isn't about content on the front page of search results. The content in question is being circulated privately through chats. The proposed solutions, so far, have been the broad, nasty, invasive content scanning that nobody on HN wants (for good reason).


Nudify apps are very much present on the front page of search results. Get rid of that and you'll significantly cut down on the problem.


Over the past 20 years, technology has become more and more catastrophic for children. I suspect a large number of grownups who oppose child safety legislation don't really grasp the bleakness. They project their idea of the technology that they grew up with. They don't get to see the kind of data that policymakers, healthcare providers and LEAs see. I'm not saying the proposed legislation is justified, but I don't think the people behind it are necessarily draconian.


There are broadly speaking two options, neither of which will ever be implemented. What will happen is young girls will learn to accept deep fakes are made of them and it's part of life, in the same way explicit sexual commentary starting at the age of ~9 is, or how men follow you when you go somewhere in public, etc. They'll accept nobody cares in the same way nobody cares if you're sexually harassed in other ways.

The two options are either the people in power standing up for the girls or giving the girls the power to deal with it themselves.

People who have power generally are benefiting from the structure in place and so don't want to change it and/or they don't want to do any more work. Expelling all the kids who create deepfakes would cause a lot of arguing from parents and people who are on the boys' side, and they just don't want to deal with that. It's easier to tell the girls to be quiet.

The other option is setting up a system that rebalances the power. For example, if a kid gets caught making deep fakes, give their victim access to every single thing on their devices: Private messages, Discord chats, images, etc. and let the victim decide who and what to release their private information to. Not going to happen.

Another reason nothing is going to be done is if we teach 11 year old girls it's not acceptable for people to do this to them, they'll carry that forward through their life and a lot of people who find it gross to create fake porn of children are fine with doing it to older women and they don't want to create women who create a fuss about it. There are a lot of people who think it's disgusting I was sexually propositioned when I was 10/11 but think it's fine I can't go for a walk in my neighborhood now without being bothered since I'm older.


No, you make it a crime and prosecute it. What looney-tunes world do you live in where you start giving away people's other info? This just incentivizes you to abstract your data like businesses abstract their profits.

Make it a crime. Prosecute it. This isn't hard, unless you have a legislature that is incapable of passing laws. People are becoming fed up with this stupidity. Give it a few years and a few congressmen's kids getting targeted as it becomes more mainstream, and things will suddenly change. We're at the leading edge of the stupidity; voters, families, and politicians aren't angry enough yet. It will change, legislation is cyclical.


That's what will likely happen. And that's why girls will just accept it. Plenty of harassment is already illegal, and it's still a normal part of life for women and girls.

And I agree that giving up that personal information would be a nightmare. That's one reason it won't happen. It's just probably the only analogue to what the perpetrators do that would actually scare them.

We're likely just going to put up with this state of affairs.


> Usually threads like this fill up with comments about how "think of the children" is always a lie used to justify something draconian

It also highlights HN's demographics. What younger women feel is problematic is viewed as trifling by a number of younger or middle aged men on HN (especially those without kids).


Only to certain kinds of defective men/men suffering from ressentiment.

No healthy man of good will wishes to see women get hurt or disrespected.


If you think the sentiments of comments in one thread (not even specified which) highlight the general HN demographics, I think saying that reveals more about you than about HN.


No, it shows that people on Hacker News understand two important things:

1: Unintended consequences

2: That power-hungry people latch on to issues like this to further political agendas that have severe negative consequences; mostly by using "think of the children" to stifle important debate and discussion of unintended consequences.


So since there are other consequences, nothing should be done? I think sexually demeaning children is a really big issue, and pretty well-defined and widespread. We may not need to ban things, but not doing anything is a cop-out.


No one's saying to do nothing.

When you debate like that (specifically, twisting words like you did), you leave yourself open to being a victim of power-hungry people, and you trade one problem for another, potentially worse, problem.


What would the effect of hysteria from technologists achieve?

If you think it’s a really big issue, why don’t you own the problem?

You could just go turn off the trillions in AI spending and destroy computing as we know it. No cop-outs, remember.


It depends on what the other consequences are, obviously.

I can fix the problem right now. We just throw all the children into the orphan grinder. Can't sexualize ground meat. Is that what you want? Wooooww, so you're telling me you want to grind up children?? That's really messed up dude.

It's super easy to twist any argument against this stuff to being "against children", which is extremely annoying and unproductive. That's why you, and others, get a lot of pushback - you're annoying and dishonest.

Nobody here, and I do mean nobody, wants children to be hurt. That includes me and you.

If you want to fix these issues, you have to answer three questions head-on:

1. What is the actionable, real solution?

2. Will it work?

3. How well will it work?

"Durrr sexually demeaning children bad" doesn't answer any of those. We all know sexually demeaning children is bad.

Now tell me if, I don't know, banning the Internet or some shit will work


Am a middle-aged man, don't have kids, don't see it as a trifling problem, and I don't agree with the libertarian free-speech-at-all-costs angle.

Instead I think a) kids shouldn't be on the internet and b) the public school system is a barely supervised dumpster fire.


> is there _nothing_ we can do as a society to address this?

Well, I guess the argument goes that regardless of how much you lock down centralized platforms like Grok, these tools can run locally on a PC, so as long as people can do local inference with this tooling, it won't fully go away. With that said, limiting the centralized platforms from generating nudes from uploaded images/photos feels like an obvious limitation they should implement, if they haven't already.

So if we consider that we could probably limit most of the "off-hand" stuff that happens on the platforms, but we cannot fully limit the offline ones, I'm guessing that the best solution here is merely "education", together with laws. AFAIK, it's already illegal to create deepfakes and share them with others, but the education around this probably isn't great, as it's such a new thing that hardly any adults understand it, much less younger folks.

It's a societal problem that I'm afraid doesn't have a technological solution, as far as I can tell, because the cat is out of the bag, so whatever "solution" we come up with has to go beyond just technical capabilities/limitations.

One thing people might be failing to consider carefully is the whole "private" vs. "sharing" part of this. People have been fantasizing about, drawing pictures of, and doing similar things with others (even strangers) for probably as long as humans have been around. What's new is that it's effortless to share those now, and they spread far, wide and fast. I don't feel like we could possibly stop the whole "I don't like people fantasizing about me" part; it's just too human to get rid of. But what we do need to get rid of is the sharing part, which is what actively harms people.


Teenagers aren't running generative models on their home PCs, they're using online tools. Sure, some people will do it offline, but these are by and large crimes of opportunity.


Horny Teenagers Find A Way. If the online tools were all magically shut down today, by tomorrow, they'd all be running local models on the local computer nerd's gaming GPU rig.


I doubt it. Photoshop was already pretty accessible, more so than downloading sketchy models and getting them to run.

It's also a matter of scale. If a dozen teenage boys in a class are generating nudes, they feel safety in numbers. If it's only the weird computer kid who can do it, it's far easier to address and far more likely that it'll blow back on him.


> the whole "this wouldn't be a problem if america wasn't so puritanical maaaaaan" argument is total bullshit that completely ignores the young girls who get hurt by this

If society wasn't so puritanical there would be no harm from it.


> If society wasn't so puritanical there would be no harm from it.

What is "puritanical" then, in your opinion? Is it being uptight? Do we abandon our efforts to uphold modesty and dignity of the human person? Do we stop teaching our children about boundaries and propriety, and their responsibilities as they mature? How should boys and young men conduct themselves around women? Should they learn about the concept of consent, and respecting others' wishes?


Those who immediately jump to accusations of "puritanical!" at the mere criticism of any sexual indiscretion are usually sexual deviants of some kind. They may not necessarily partake in any actual sexual activity (they probably usually don't, hence the desire to lower the baseline of acceptable behavior), and they aren't fit for relationships as those require respect. Porn brain comes to mind.

It's time we got back to sexual ethics, as it is absolutely not the case that anything goes or even that "consent" suffices to make something okay.


In this particular type of abuse, the issue arises because humans value hiding reproductive organs. A similar social constraint applies to women's virginity. So if society was not so focused on hiding reproductive organs, this particular issue would not have arisen. Nobody laughs at naked hands; hands are just expected to be bare, and if you put on gloves it is because of the weather, not usually to hide your hands.


I was just pointing out the poor logic of the comment I was replying to.


Draw the boundary at physical abuse.


I love when the comments start out a little spicy, all the spicy comments get flagged or removed, and the top comment becomes "wow, I can't believe all these spicy comments, I thought HN was better than this."


> I shouldn't be downvoted for my English I think, but that is the reality.

How do you know? Is it possible the downvoters just didn't like what you said?


It's possible, of course, but reading all the comments from various non-native English speakers here, it seems like a common story. It may indicate a subliminal bias in readers (most of whom are presumably American).


Note that those comments are written in perfectly understandable English. Further note how often you come across comments written in perfectly understandable English, but they're downvoted anyway.

It suggests a bias in writers to assume that people would agree with them if only they could express their thoughts accurately.


This is what I can't get over - who in their right mind would _ever_ give an agent enough access to delete prod data?


Someone who should be immediately fired.


Fun fact: Sublime Text also has a vim mode (called "Vintage mode", which is just hilarious) that is built-in but disabled by default, rather than being an extension like in VS Code. Vim keybinds are just the best.
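For anyone who wants to try it: Vintage ships with Sublime Text but is listed in the ignored-packages setting out of the box, so enabling it is just a matter of removing it from that list in your user settings (Preferences → Settings). A minimal sketch of the relevant fragment:

```json
{
    // Sublime Text ignores the bundled "Vintage" package by default;
    // an empty list (or any list without "Vintage") enables vim mode.
    "ignored_packages": []
}
```

It should take effect as soon as the settings file is saved, no restart needed.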


It's funny that she mentions the horrible grammar in the leaked Sony emails, because that's what I remember most from them too. This one always gets a laugh from me.

https://www.reddit.com/r/marvelstudios/comments/33tkv6/actua...


why did you make a new account just to make this comment?


that's all that happens on this website


I think the percentage of people who drink alcohol and have never done any drugs is >50%, and I would bet that the percentage of people who smoke weed (not just once) who have also done other drugs is like 95%. There are plenty of people who will binge drink every weekend but think that smoking a joint is too far. I find that strange, but it is surprisingly common.

