
They also have an image model that’s fallen behind, a coding model that’s fallen behind, a good video model, a social AI slop feed powered by that model, and an upcoming erotica mode(l)


Sex robot… classy stuff @sama.


She lost that contract after being found guilty of plagiarism. That’s why she avoids mentioning her considerable writing career at all


It's fiction; what is she plagiarizing?


What I just said is a fact. Look it up if you like


The similarities are intriguing but not compelling.

https://docs.google.com/document/d/1pPE6tqReSAXEmzuJM52h219f...

Stories of "asian face" actresses with eyes taped back, prominent pieces of anti asian grafitti on walls and drawn in bathrooms are common tropes in asian communities, etc.

The examples of plagiarism are examples of common story arcs, with an educated Asian female twist, and of material that multiple writers in a shared literary pool would all have been exposed to; e.g., it could be argued that they all drew from a similar well rather than that some were original and others copied.

There's a shocked article: https://www.halfmystic.com/blog/you-are-believed that may indeed be looking at more evidence than was cited in the Google Docs link above, which would explain the shock and the dismissal of R.W. as a plagiarist.

The evidence in the link amounts to what is common with many pools of proto writers though, lots of similar passages, some of which have been copied and morphed from others. It's literally how writers evolve and become better.

I'm on the fence here, to be honest. I looked at what is cited as evidence and I see similar stories from people with similar backgrounds sharing common social media feeds.


One of her publishers pulled her book from print, publicly accused her of plagiarism, and asked other publishers to denounce her for plagiarism.

That’s pretty damning evidence. If a publisher was on the fence they might pull her books quietly, but they wouldn’t make such a public attack without very good evidence that they thought would hold up in court. There was no equivocation at all.


Said publisher also claims Rona directly admitted plagiarism to them. That’s probably why they’re so confident.


That's a pretty damning response, sure.

The evidence, at least what I found cited as such, appears less damning.

Perhaps there is more damning evidence.

What I found was on the order of the cross-copying and shared themes found in many pools of young writers going back through literary history.

Rona Wang, whom I've never previously heard of, clearly used similar passages from her peers in a literary group and was called out for it after receiving awards.

I would raise two questions: A) was this a truly significant degree of actual plagiarism, and B) did any of her peers in this group use passages from any of Rona's work?

On the third hand, Kate Bush was a remarkable singer/songwriter/performer. Almost utterly unique and completely unlike any contemporary.

That's ... highly unusual.

The majority of writers, performers, singers, et al. emerge from pools that differ from their prior generations, but pools nonetheless that are filled with similarity.

The arc of the careers of those who rise from such origins is really what defines many creators.


It is evidence because a strong condemnation raises the likelihood that the accusation is true.

It doesn’t prove anything, but it supports the theory that they have seen additional evidence.

After researching this a bit, it looks like someone from the publisher says she admitted it to them. That certainly explains why they weren't afraid to publicly condemn her.


> Perhaps there is more damning evidence.

Do you consider the announcement from her publisher that she admitted that she plagiarized passages as a damning response or damning evidence?


>On the third hand

On the gripping hand


Thanks, I looked at some of those examples. Several I saw were suspiciously similar, and I wonder how they got that way. Others didn't look suspicious to me.

I wonder whether the similar ones were the result of something innocent, like a shared writing prompt within the workshop both writers were in, or maybe from a group exercise of working on each other's drafts.

Or I suppose some could be the result of a questionable practice, of copying passages of someone else's work for "inspiration", and rewriting them. And maybe sometimes not rewriting a passage enough.

(Aside relevant to HN professions: in software development, we are starting to see many people do worse than copy-and-revise passage plagiarism. Not even rewriting the text copy&pasted from an LLM, but simply putting our names on it internally, and company copyrights on it publicly. And the LLM is arguably just laundering open source code, albeit often with more obfuscation than a human copier would apply.)

But for a lot of the examples of evidence of plagiarism in that document, I didn't immediately see why that passage was suspect. Fiction writing I've seen is heavily full of tropes and even idiomatic turns of phrase.

Also, many stories are formulaic, and readers know that and even seek it out. So the high-powered businesswoman goes back to her small-town origins for the holidays, has a second-chance romance with a man in a henley shirt, and she decides to stay and open a bakery. Sprinkle with an assortment of standard subgenre trope details, and serve. You might do very original writing within that framework, but to someone who'd only ever seen two examples of that story, and didn't know the subgenre convention, it might look like one writer totally ripped off the other.


No, I'm literally saying - she writes fiction - how can you plagiarize a fiction book and make it work lol

(I have no knowledge / context of this situation - no idea if she did or what happened here)


You don't seem to know what plagiarism is.


I'm struggling to understand the circumstance in which you'd plagiarize fiction - you can literally write anything you want. Why steal someone else's writing and slap it in your book? It'll either stand out and be weird/stilted, or you took the time to make it work somehow, in which case you probably rewrote it, so why steal in the first place? Or, like, use allegory instead?

Obviously it shouldn't be done in any circumstance


You can't plagiarize fiction?

So if I copy paste Harry Potter that's ok?

What kind of argument is that


Absolutely not saying this or making this argument.

I just don't see how this could possibly work - how would slapping Harry Potter in the middle of the book you're writing work?


Instead of slapping Harry Potter in the middle of your book wholesale, imagine you lifted a few really good lines from Harry Potter, a few from Lord of the Rings, and more from a handful of other books.

Read the evidence document another poster linked for actual examples.


To me as a dumb reader, that would be fine; maybe the author could have mentioned that he likes these authors and takes them as inspiration. Also, you can't really forbid books from ever having references to pop culture. And at some level of famous-ness, passages and ideas lose their exclusive tie to the original book and become part of the list of common cultural sayings.


>could have mentioned

Well plagiarism by definition means passing the work off as your own without crediting the author, so in that case it isn’t plagiarism.

References to pop culture are not the same as lifting sentences from other books and pretending you wrote them.

> And at some level of famous-ness, passages and ideas lose their exclusive tie to the original book and become part of the list of common cultural sayings

In the actual case being examined the copied references certainly hadn’t reached any such level of famousness.

Also there’s a difference between having a character tell another “not all those who wander are lost” as a clear reference to a famous quote from LOTR and copying multiple paragraph length deep cuts to pass off as your own work.


> Well plagiarism by definition means passing the work off as your own without crediting the author, so in that case it isn’t plagiarism.

Of course, but I wrote 'could' and not 'should' for a reason; I don't expect it. A book isn't a paper, and the general expectation is that the book will be interesting or fun to read, not that it is original. That means the general expectation is not that it is never a rehash of existing ideas. I think every book, including all the good ones, is. A book that invents its world from scratch might be novel, but it's unlikely to be what people want to read.

> copying multiple paragraph length deep cuts to pass off as your own work.

If that is true, it certainly sounds fishy, but that is a case of violation of copyright and intellectual property, not of plagiarism.


> That means the general expectation is not that it is never a rehash of existing ideas.

There’s a different from rehashing existing ideas and copying multiple passages off as your own.

> If that is true, it certainly sounds fishy, but that is a case of violation of copyright and intellectual property, not of plagiarism.

What exactly do you think plagiarism is? Here’s one common definition:

“An instance of plagiarizing, especially a passage that is taken from the work of one person and reproduced in the work of another without attribution.”


> What exactly do you think plagiarism is? Here’s one common definition:

Both are about passing off something as your own. Plagiarism is about passing ideas or insights off as your own. It doesn't really matter whether you copy it verbatim, present it in your own words, or just use the concept. It does, however, matter how important that idea/concept/topic is in your work and in the work you took it from without attribution, and whether it is novel or generally available/common knowledge.

For violation of intellectual property it is basically the opposite. It doesn't matter whether the idea or concept is fundamental to your work or the other work you took it from, but it does matter whether it is a verbatim quote or only the same basic idea.

Intellectual property rights are something enforced by the legal system, while plagiarism is an issue of honor, one that affects reputation and that universities revoke titles over.

> There’s a different from rehashing existing ideas and copying multiple passages off as your own.

Yes and that's the difference between plagiarism and violating intellectual property/copyright.

But all this is arguing about semantics. I don't have the time to research whether the claims are true or not, and I honestly don't care. I have taken from the comments that she only rehashed ideas from other books, and I wanted to point out that, while this is a big deal for academic papers, it is not for books and is basically expected. (Publishers might have different ideas, but that is not an issue of plagiarism.) If it is indeed the case that she copied other authors verbatim, then that is something illegal she can be sued for, but whether this is the case is for the legal system to determine, not something for me to do.


>I have taken from the comments that she only rehashed ideas from other books, and I wanted to point out that, while this is a big deal for academic papers, it is not for books and is basically expected.

In addition to near verbatim quotes, she is also accused of copying stories beat for beat. That's much different than rehashing a few ideas from other works. It is not expected and it is very much considered plagiarism by fiction writers.

As for the quotes she copied: that is likely both a copyright violation and plagiarism.

Plagiarism isn't just about ideas but about expressions of those ideas in the form of words.

Webster's definition:

"to steal and pass off (the ideas or words of another) as one's own : use (another's production) without crediting the source"

"to commit literary theft : present as new and original an idea or product derived from an existing source"

Oxford learner's dictionary:

"to copy another person’s ideas, words or work and pretend that they are your own"

Copying verbatim or nearly verbatim lines from a work of fiction and passing them off as your own is both plagiarism and copyright violation.


So I won't defend what was done here; there doesn't seem to be much to argue.

> copying stories beat for beat. That's much different than rehashing a few ideas from other works. It is not expected and it is very much considered plagiarism by fiction writers.

Some operas are Greek plays. There are rehashes of Faust, the Beggar's Opera is a copy of a play from Shakespeare, there are modern versions of Pride and Prejudice, and there are tons of stories that are copies of West Side Story, which is itself a copy of Romeo and Juliet, which I think comes from an even older story. These often don't come with any attribution at all, although the listener is sometimes expected to know that the original exists. They change the settings, but the plot is basically the same. Do you consider all of that to be plagiarism? These would all be reasons to call it plagiarism when considering a paper, but for books nobody bats an eye. This is because authors don't sell abstract ideas or a plot; they sell concrete stories.


First, the stories you mentioned are very famous. The audience watching O Brother, Where Art Thou? is aware it’s an adaptation of the Odyssey. Therefore it’s not someone attempting to pass off the work as their own.

The stories this authors copied were either unpublished manuscripts she got access to in writers groups or very obscure works that it’s unlikely her readers had read.

Second, the examples you gave were extremely transformative. Just look at the differences between West Side Story and Romeo and Juliet. It’s a musical, for goodness’ sake. It subverts expectations by letting Maria live through it.

The writings at issue are short stories, so there’s less room for transformation in the first place. And there was clearly not even a strong attempt at transformation. The author even kept some of the same character names.

There was no attempt to subvert expectations, largely because the audience had no expectations to subvert, since they weren’t aware of the originals.

>change settings

She didn’t even do that.

> for books nobody bats an eye

If a popular book were revealed to be a beat for beat remake of an obscure novel with the same setting, similar dialogue, some of the same character names, and few significant transformative elements, you can bet your life there would be a scandal.


Like I wrote, I wanted to point out a difference in attitude between academic and entertainment writing. I don't think I disagree with you in this specific case (now). You seem to have looked into the actual case, while I didn't.


Sadly we do not have, and will never have, the resources to help everyone, even to a baseline of human dignity. Surely we can't give people unlimited talk therapy, MRIs, and cancer treatment for free. But some people sorely need these things.

Preventative/prophylactic care is orders of magnitude cheaper than treatment once a disease has manifested. It makes sense to me to punish people for not doing this care and thereby choosing to impose more strain on an already overburdened system.

Note that GP only mentioned things we have control over -- exercise, weight, not smoking. Of course I agree that it would be cruel to disadvantage pre-existing conditions.


That's the point of insurance. It's the idea that everyone pools money together, and when something bad happens to one person, the financial hit is mitigated by the contributions of others. Some will benefit more and others not at all. But no one can predict who ends up on which end.

Yes, if everyone gets cancer at the same time then Health Care is boned. But then again, so is society. So why worry about that worst case scenario?

>Note that GP only mentioned things we have control over -- exercise, weight, not smoking.

We couldn't pass laws to help control what companies put in food, and we failed to subsidize healthier food options. I wish you the best of luck with healthcare trying to pull off that endeavor with punishments for obesity. I'm guessing it wouldn't be politically popular.


> Sadly we do not have, and will never have, the resources to help everyone, even to a baseline of human dignity.

This is really a matter of choice. There is a level of treatment that most people could have with far less friction. We just have decided to organize our economy otherwise.

Broadly, we need to stop seeing our economy as a zero sum game. It's dehumanizing. So what if there are a few bad actors that abuse the situation? Most don't. If everyone is doing something harmful, eg smoking, then we need strong public education, etc.


Since when do we not have the resources? Nearly every developed country on earth manages it ok.


Does every other country have a border that "allows" millions of people in every year? EU countries are tightening hard on immigration as they have found that it crushes their social safety nets. Not to mention keeping defense spending adequate relative to allies.

If this was so easy, what is the plan?


Immigrants are easily exploitable labor. They aren't sucking up our resources. If anything, we're sucking up their resources.

I live in Texas. Look around me. Who's building these 500,000 dollar homes? Not fatass white people. And who is buying them? Not the people building them.


Several million people migrate to EU countries from outside the EU every year, yes. The number will be lower than a million for most individual EU countries, but then the individual countries also have smaller populations than the US.


Oh boy, here we go.


Maybe the downvotes are because of the jokes about assault?


It's too early to say if his risks "keep working out". Restructuring is not a risk. His, and others', original decision to make the company a non-profit was also not a calculated risk in this sense.

When he was fired from OpenAI, his use of employee manipulation to regain his position is not a risk; it is the only option he had. It was a bond maturing: carefully cultivated loyalty he had accrued over years. Gaining that loyalty was not really a risk. It was smart politics.

One risk he took is: signing away such a large portion of the company to Microsoft. I'm not sure whether that is working out.

Another risk he took is: neglecting and sidelining the "safety" portion of his organization. This caused a talent exodus and led to the formation of many competitors. I'm not sure whether that is working out either.


> Restructuring is not a risk...his use of employee manipulation to regain his position is not a risk; it is the only option he had

In both cases he had the option of accepting the status quo.


The reporting at the time said that he was Mark’s 5th choice or thereabouts. It is fairly clear he would have preferred Ilya, Murati, Mark Chen, and perhaps others, but they said no, and Alex Wang was the first one to say yes.


Why in the world would he want Murati? She has absolutely no technical chops and was not functionally CTO of OpenAI.


> was not functionally CTO of OpenAI.

Why do you say that?


Her history was entirely non-technical up until OpenAI.


I think that's total BS, based on this article about her, https://fortune.com/2025/10/03/mira-murati-career-ai-thinkin...

1. She has 2 BAs, one in math and one in mechanical engineering.

2. She was an "Advanced Concepts Engineer at Zodiac Aerospace from 2012 to 2013".

3. She was a product manager at Tesla on the Model X

4. She was VP of product and engineering at Leap Motion.

Going from the fact that she wasn't a deep learning researcher to "her history was entirely non-technical up until OpenAI" is plain false. Plus, the job of CTO is 90%+ people management, and she appears more than smart enough and experienced enough to evaluate the technical decisions of her team.


I think you haven't been in tech long enough to know what that resume is.


Because she was CTO of OpenAI.


Pretty ironic when access to trade secrets and people skills is seen as more important in a technical field than technical competence.


For the record, I doubt the CTO of OpenAI is the best person to fund if you're looking for trade secrets on training and deploying SOTA LLMs. They are two levels too far from reality to know anything useful.


What technical chops does Sam Altman have?


He started coding at age 8


The self-supervised mesa-optimizer strikes again


He was wryly communicating, "your argument was so stupid I don't even need to engage with it".

In my experience he has a horrible response to criticism. He's right on the AI stuff, but he responds to both legitimate and illegitimate feedback without much thoughtfulness, usually with a non-sequitur redirect or an ad hominem.

In his defense though, I expect 97% of the feedback he gets is from Sam Altman glazers, and he must be tired.


He's right on the AI stuff? How do you figure that? As far as I can tell, OpenAI is still operating. It sounds like you agree with him on the AI stuff, but he could be wrong, just like how he was wrong about remote work.

I'm actually more inclined to believe he's wrong if he gets so defensive about criticism. That tells me he's more focused on protecting his ego than actually uncovering the truth.


The fact that OpenAI is still operating and the argument that it is completely unsustainable are not two incompatible things.


Whether or not OpenAI is sustainable is a question that can only be answered in hindsight. If OpenAI is still around in 10 years, in the same sort of capacity, does OP become retroactively wrong?

My point is, you can agree that OpenAI is unsustainable, but it's not clear to me that is a decided fact, rather than an open conjecture. And if someone is making that decision from a place of ego, I have greater reason to believe that they didn't reason themselves into that position.


The fact that they are not currently even close to profitable, with ever-increasing costs and sobering scaling realities, is something you could consider. If you do believe they are sustainable, then you have to believe in (in my opinion, unlikely) scenarios in which they somehow become sustainable, which is also a conjecture.

Seems a little unreasonable to point out “they are still around” as a refutation of the claim that they aren’t sustainable when, in fact, the moment the investment money faucet keeping them alive is turned off, they collapse, and quickly.


No, it's a question answerable now. If you're losing twice as much money as you're making, the end of your company is an inescapable fact unless you turn that trend around.

What Zitron points out, correctly, is that there currently exists no narrative beyond wishful thinking which explains how that reversal will manifest.


I don't think he's right about everything. He is particularly weak at understanding underlying technology, as others have pointed out. But, perhaps by luck, he is right most of the time.

For example, he was the lone voice saying that despite all the posturing and media manipulation by Altman, that OpenAI's for-profit transformation would not work out, and certainly not by EOY2025. He was also the lone voice saying that "productivity gains from AI" were not clearly attributable to such, and are likely make-believe. He was right on both.

Perhaps you have forgotten these claims, or the claims about OpenAI's revenue from "agents" this year, or that they were going to raise ChatGPT's price to $44 per month. Altman and the world have seemingly memory-holed these claims and moved on to even more fantastical ones.

He has never said that OpenAI would be bankrupt, his position (https://www.wheresyoured.at/to-serve-altman/, Jul 2024) is:

I am hypothesizing that for OpenAI to survive for longer than two years, it will have to (in no particular order):

- Successfully navigate a convoluted and onerous relationship with Microsoft, one that exists both as a lifeline and a direct source of competition.

- Raise more money than any startup has ever raised in history, and continue to do so at a pace totally unseen in the history of financing.

- Have a significant technological breakthrough such that it reduces the costs of building and operating GPT — or whatever model that succeeds it — by a factor of thousands of percent.

- Have such a significant technological breakthrough that GPT is able to take on entirely unseen new use cases, ones that are not currently possible or hypothesized as possible by any artificial intelligence researchers.

- Have these use cases be ones that are capable of both creating new jobs and entirely automating existing ones in such a way that it will validate the massive capital expenditures and infrastructural investment necessary to continue.

I ultimately believe that OpenAI in its current form is untenable. There is no path to profitability, the burn rate is too high, and generative AI as a technology requires too much energy for the power grid to sustain it, and training these models is equally untenable, both as a result of ongoing legal issues (as a result of theft) and the amount of training data necessary to develop them.

He is right about this too. They are doing #2 on this list.


Is he right on the AI stuff? Like, on the OpenAI company stuff he could be? I don't know? But on the technology? He really doesn't seem to know what he's talking about.


> But on the technology? He really doesn't seem to know what he's talking about.

That puts him roughly on par with everyone who isn't Gerganov or Karpathy.


You and everyone else seem to assume on faith that OpenAI's ad revenue is going to dwarf their subscription revenue -- but you're being suckered. If you do the math, you'll find it's not nearly as clear-cut as you think.

Not impossible, but not a given.


>If you do the math

On the contrary, the math makes it very clear. They need a free-user ARPU of $11 to $12 per quarter to be profitable with billions to spare.

That's a low bar to clear for a platform with 700M+ Weekly Active Users who are more personal with it than any Google search.
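For what it's worth, here's the back-of-the-envelope arithmetic behind that as I read it, using only the figures in this thread (700M+ WAU, $11-12 per quarter). The annual figure is just what those assumptions imply, not a reported number:

    # Rough sketch: implied annual ad revenue if OpenAI earned
    # $11-12 per free weekly-active user per quarter.
    # Both inputs are the figures quoted in this thread, not reported data.
    weekly_active_users = 700_000_000
    arpu_low, arpu_high = 11.0, 12.0   # dollars per user per quarter

    low = weekly_active_users * arpu_low * 4 / 1e9    # ~30.8 (billions per year)
    high = weekly_active_users * arpu_high * 4 / 1e9  # ~33.6 (billions per year)
    print(f"Implied ad revenue: ${low:.1f}B to ${high:.1f}B per year")

Whether that ARPU is actually achievable, and whether ~$31-34B a year covers their costs "with billions to spare", is the part people disagree about.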


According to Karen Hao's Empire of AI, this is only half accurate. And I trust what Karen Hao says a lot more.

OpenAI mistakenly thought Anthropic was about to launch a chatbot, and ChatGPT was a scrappy, rushed-out-the-door product made from an intermediate version of GPT-4, meant to one-up them. Of course, they were surprised at how popular it became.


Do you mean an intermediate version of GPT-3? That's more the timeline I'm thinking.


Because it would be more work for a smaller audience and less money?


Maybe the US government could pay them to do it.


The government isn't going to fund a project that teaches English via the medium of glorifying crime.

