Hacker News | new | past | comments | ask | show | jobs | submit | login | grondo4's comments

They do? Or at least IntelliJ, the JetBrains IDE I use the most, asks you the first time you start it up whether you want to use VS Code keybindings.


Wow, you're right. To be fair, it has been about 9-10 years since I last tried it, so either it didn't have that back then or I forgot that it did.


As others have pointed out, you should really do some research: in the United States, birth rates are inversely correlated with income. [1] If anything, if we wanted to dramatically raise birth rates, we would want a poorer population, not a wealthier one.

[1] https://www.statista.com/statistics/241530/birth-rate-by-fam...



Right, but we're able to scrub child porn from the internet relatively effectively, right? What makes deepfake porn different?


Child porn is a couple of clicks away for anyone with internet access. Any teen boy who wants to see pictures of similarly aged girls finds them quickly. Remember that such pictures, no matter how gross, are legal in many countries that share the internet with us.

I am not saying that is okay, but I am saying it is impossible to stop without shutting down the internet.

If anything, deepfake tech reduces the demand to abuse real humans to obtain some types of pictures, which might reduce harm, even if in a gross way.

Also, you can find copyrighted movies just as easily, no matter how many billions have been spent trying to stop it.


> Will you reach the same level of comfort and productivity?

Given time to adjust? Sure. This type of thing is a time-limited problem.


> Given time to adjust? Sure.

You cannot mechanically return to the same level of productivity with that tool degradation, so I don't think so.


Sugar releases dopamine in the brain.


Thus to Cosmopolitan; thus to being told that the celestial bodies will grant you true love.


Why does sugar release dopamine in the brain?


This is absolutely not the case. If you've spent much time in the Reddit ecosystem, you would know that mods frequently go rogue and Reddit staff have to step in to appoint new mods, and sometimes whole new mod teams.


I moderate tens of subreddits, some of the most prominent and significant in the Reddit ecosystem. So yes indeed, I would say I have "spent time".


Since you're definitely the same Steve as the one on Reddit, how do you feel posting here without the ability to ban people for asking if you've started paying your child support?


Yes, AI should be trained on every piece of information possible. Am I allowed to become a better programmer by looking at private, (illegally leaked) closed-source, proprietary code?


One motivation for artists to create and share new work is the expectation that most people won't just outright copy their work, based on the social norm that stealing is dishonorable. This social norm comes with some level of legal protection, but it largely depends on a common expectation of what is considered stealing or not.

Once we have adopted the attitude that we can just copy as we please without attribution, it would be much more difficult to find motivated artists, and we would have failed as a society.



I didn't ask if I can use other people's proprietary closed-source code; obviously they have the right to that code and how it's used.

I asked if I can learn from that code, which obviously I can. There is no license that says "You cannot learn from this code and take the things you learn to become a better programmer".

That's exactly what I do, and it's exactly what AIs do.


> I asked if I can learn from that code, which obviously I can.

Did you actually read the link you were given? Clean room design exists because you may inadvertently plagiarize copyrighted works from your memory of reading them.

i.e. the act of reading may cause accidental infringement when implementing the "things you learn"


> i.e. the act of reading may cause accidental infringement when implementing the "things you learn"

Surely you know this isn't the case, right? Maybe you're confused because we're talking about programming and not a different creative art form?

Great artists read, watch, and consume copyrighted works of art all day; if they didn't, they wouldn't be great artists. And yet the content they produce is entirely their own, free from the copyright of the works they learned from.

What's the difference then in programming? Why can an artist be trusted not to reproduce the copyrighted works that they learned from but not the programmer?


Artists get into trouble all the time for producing works very close to something that already exists. That's like the number one reason artists get shunned in the communities they were in.


Every filmmaker watches movies

Every author reads books

Every painter views paintings

Unless you're arguing that every single artist across every field of artistic expression is constantly being jeopardized by claims of copyright infringement, this is a nonsensical point to make.


But they’re not creating similar works, unlike AI which IS. Why is this so complicated for you?


I would seriously question whether this happens all the time these days. The whole copyright regime is way behind the digital and internet revolution. Look at what the Prince case did for transformative fair use in copyright.


The process of online artists shaming each other doesn't really have anything to do with the legal system, though they all act like it is.


> Why can an artist be trusted not to reproduce the copyrighted works that they learned from but not the programmer?

They can't, which is why the quote "Good artists copy, great artists steal" exists.

AI has already been shown to "accidentally" reproduce copyrighted work. You, too, can do the same.

It's likely no one (including yourself) will ever be aware of it, but strictly speaking it would still be copyright infringement. This is the relevance and context of the link you were given.


If everyone is infringing copyright, no one is infringing copyright. This is a dead-end thought.


Sure but the infringement is the problem, not the ideas themselves.

You're describing thought crime right now. It's not illegal to learn things.


And if you "learn" something and accidentally rewrite it verbatim? That's what clean room design is meant to protect against.


Rewriting the code verbatim and distributing it would be copyright infringement, yes; you do not have a right to distribute code written by other people.

That's completely different from reading and learning from code, which is what grondo described.

Clean room design relies on this, in a clean room design you have one party read and describe the protected work, and another party implement it. That first party reading the protected work is learning from closed-source IP.


> That's completely different from reading and learning from code, which is what grondo described.

AI (e.g. Copilot) has already been shown to break the copyright of material in its training set. That's the context of this whole thread.


Perhaps, but not of Grondo's point.

If an AI infringes on copyright then it infringes on copyright, that's unfortunate for the distributors of that code.

Humans accidentally infringe on copyright sometimes too. It's not a problem unique to machine learning. The potential to infringe on copyright has not made observing/learning/watching/reading copyrighted materials prohibited for humans, nor should it or (likely) will it become prohibited for machine learning algorithms.


> Perhaps, but not of Grondo's point.

Grondo said that AI should be given access to all code, including private and unlicensed code.

He was given a link to Clean Room Design demonstrating the problem with the same entity (the AI) reading and learning from the existing code and the risk of regurgitation when writing new code.

He goes on to say that's what he does, which doesn't change that fact.

> Humans accidentally infringe on copyright sometimes too.

Indeed we do, and it's almost entirely unnoticed, even by the author.

> nor should it or (likely) will it become illegal for machine learning algorithms.

If those machine learning algorithms are taking in unlicensed material and then later output unlicensed and/or copyrighted material, then they are a liability. Why would you want that when you can train it otherwise and be sure it NEVER infringes on others' IP? It's a no-brainer, surely. Or are you assuming there is some magic inherent in other people's private code?


> If those machine learning algorithms are taking in unlicensed material and then they later output unlicensed and/or copyrighted material, then they are a liability. Why would you want that when you can train it otherwise and be sure it NEVER infringes others IP?

Because it could produce a better model that produces better code.

You're now arguing a heavily reduced point. That a model that trained on proprietary code is at higher risk of reproducing infringing code is not a point under contention. The clean room serves the same purpose, it is a risk mitigation strategy.

Risk mitigation is a choice, left up to individuals. Maybe you use a clean room design, maybe you don't. Maybe you use a model trained on closed-source IP, maybe you don't. There are risks associated with these choices, but that is up to individuals to make.

The choice to observe closed source IP and learn from it shouldn't be prohibited just because some won't want to assume that risk.


If you study a closed source compiler (or whatever) in order to write a competitive product, and the company who wrote the original product sues you for copying it, as the parent suggests, you're on shaky legal ground. Which is why clean room design is a thing.


A clean room design ensures the new code is 100% original, and not a copy of the base code. That is why it is legally preferable, because it is easy to prove certain facts in court.

But fundamentally the problem is copyright, the copying of existing IP, not knowledge. grondo4 is completely correct that there is no legal framework that prevents learning from closed-source IP.

If such a framework existed, clean room design would not work. The initial spec-writers in a clean room design are reading the protected work.


>The initial spec-writers in a clean room design are reading the closed-source work.

Right. And they're only exposing elements presumably not covered by copyright to the developers writing the code. (Of course, this assumes they had legitimate access to the code in the first place.)

Clean room design isn't a requirement in the case of, say, writing a BIOS which may have been when this first came up. But it's a lot easier to defend against a copyright claim when it's documented that the people who wrote the code never saw the original.

Unlike with patents, independent creation isn't a copyright violation.


I don't understand what your point here is. The initial spec-writers learned from the original code. This is not illegal, we seem to be agreed on this point. grondo made the point that learning from code should not be prohibited.

What are you contesting?


My point was that, assuming access to the code was legit, and the information being passed from the spec-writers to the developers wasn't covered by copyright (basically APIs and the like), it's a much better defense against a copyright claim that any code written by the developers isn't a copyright violation given they never saw the original code.


I think you're missing the one big flaw here. How exactly do you have access to closed source code?

Did you acquire it illegally? That's illegal.

Was it publicly available? That's fine, so long as you aren't producing exact copies and violating normal copyright law.


You're obviously not


Is that a joke?

Yes you are allowed to read closed-source, proprietary code and become a better programmer for it.

I've decompiled games to learn how they structure their code, to improve the structure of the games that I program. I had no right to that code, and I used it to become a better programmer, just like AIs do.

That's not copyright infringement. You have a right to stop me from using your code, not learning from it.


This is a pretty extreme stance. There is a fine line between "learning from" proprietary code and outright stealing some of the key insights and IP. Sometimes it takes a very difficult conceptual leap to solve some of the more difficult computer science and math problems. "Learning" (aka stealing) someone's solution is very problematic and will get you sued if you are not careful.


If you think that's extreme, wait until you hear my stance that code shouldn't be something that you can own (and can therefore "steal") to begin with.


Granted, most EULAs and Terms of Service documents aren't legally enforced, but most software licenses explicitly prohibit decompiling or otherwise disassembling binaries.

So, yes: They have a right to stop you from "learning" from their code. If you want that right, see if they're willing to sell that right to you.


> They have a right to stop you from "learning" from their code.

They absolutely do not, and as pedantic as it may be I think it's very important that you and everyone else in this thread know what their rights are.

If you sign a contract / EULA that says you cannot decompile someone's code, then yes, you are liable for any damages promised in that contract for violating it.

But who says that I ever signed a EULA for the games I decompiled? Who says I didn't find a copy on a hard drive I bought at a yard sale or someone sent me the decompiled binary themselves?

Those people may have violated the contract but I did not.

There is no law preventing you from learning from code, art, film or any other copyrighted media. Nor is there any law (or should there be any law IMO) that stops an AI from learning from copyrighted media.

Learning from each other regardless of intellectual property law is how the human race advances itself. The fact that we've managed to automate that human progress is incredible, and it's very good that our laws are the way they are, allowing that to happen.


> Am I allowed to become a better programmer by looking at private code?

Your argument is based on the idea that you and AI should have the same rights?

I do not see how this works unless AI is going to be entitled to minimum wage and paid leave?

Otherwise it is just a money grab


He's not saying that he and the AI have the same rights, rather that he and the person running the AI have the same rights.


In Catholicism our consciousness comes from our souls which we inherited by being the literal descendants of Adam and Eve.

That would at least exclude sentient extra terrestrial life.


Christianity states that the only ensouled and conscious creatures are the descendants of Adam and Eve.


>But some else's child will not have to care for them.

Generally it's considered a burden on society to take care of its elderly. Thus 'someone else's child' will have to take care of the childless elderly in some form or another.


Depends on the ability of the elderly to obtain that labor. They typically have a decent amount of political power, but it is certainly possible for quality of care to decline in the event that the cost of their care rises too high.

