Hacker News | jsnell's comments

Try updating your Claude Code client. I believe it is a bad interaction between Opus 4.7 and older system prompts.

I don't see how that number could possibly be realistic.

An H100 costs ~$30k when new, and uses 500W of power.

500W for a year is about 4500kWh, which at $0.10/kWh is $450/year if run at full utilization (unrealistic).

TCO of an AI data center should be entirely dominated by capex depreciation.


In fairness, your calculation looks at the most expensive element of the DC but ignores all of the associated parts required to utilize the H100: CPU, memory, cooling, etc. Not to say that that flips the calculation (I don't have the answer), but it does leave a lot of power out.

Let's be generous and pretend the rest of the hardware is free but double the energy budget of the H100 to account for all of it along with cooling. You're still at only $1k/yr; $10k over 10 years, or 25% of the TCO (ignoring all other costs).
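A quick sanity check on that arithmetic, using the figures assumed above (a $30k H100, 500W doubled to 1kW to cover the rest of the hardware plus cooling, $0.10/kWh, a 10-year life; all of these are the comment's assumptions, not measured numbers):

```python
HOURS_PER_YEAR = 24 * 365  # 8760

# Assumptions from the comment, not measured figures:
gpu_price = 30_000    # USD, H100 when new
power_kw = 1.0        # 500 W for the H100, doubled for CPU/memory/cooling
usd_per_kwh = 0.10
years = 10

energy_per_year = power_kw * HOURS_PER_YEAR * usd_per_kwh  # ~$876/yr
energy_total = energy_per_year * years
tco = gpu_price + energy_total  # ignoring all other costs
print(energy_per_year, energy_total / tco)
```

Unrounded it comes out to about $876/yr and roughly 23% of TCO, consistent with the "$1k/yr, 25%" ballpark above.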

> Pick any two cypherpunks at random and you won't find that kind of overlap on non-technical quirks.

That could be a valid methodology if you pre-registered the list of quirks before doing the investigation.

But in this case the journalist clearly didn't do that; they tweaked the set of quirks until it produced the desired outcome.


> Do you have a link to that? A rather important piece of context.

It was a quote from your own link from the initial post?

https://www.freebsd.org/security/advisories/FreeBSD-SA-26:08...

> Credits: Nicholas Carlini using Claude, Anthropic


Oh wow, blind as a bat.

A write-up of that would have been interesting, to see just what Claude was used for.


Obviously no guarantees that it's exactly what was done in this case, but he talked about his general process recently at a conference and more in depth in a podcast:

https://www.youtube.com/watch?v=1sd26pWhfmg

https://securitycryptographywhatever.com/2026/03/25/ai-bug-f...

It pretty much is just "Claude find me an exploitable 0-day" in a loop.


Except it's not 100x revenues, and it's not 17% growth. I don't know where you got those numbers from?

The numbers OpenAI gave in the post would mean a 30x multiple pre-money. And the $20B -> $24B run-rate growth since the start of the year could plausibly mean anything from 110% to 200% annualized growth rate, depending on whether that happened over two or three months. The $24B is a lower bound as well, since they only gave us one significant digit for the monthly revenue.
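For reference, the annualization behind those bounds (assuming the run-rate went from $20B to $24B over either two or three months, as the comment discusses):

```python
start, end = 20e9, 24e9  # annualized run-rate, USD

def annualized_growth(months: int) -> float:
    """Compound the growth implied over `months` out to a full 12 months."""
    return (end / start) ** (12 / months) - 1

print(f"{annualized_growth(3):.0%}")  # over three months -> ~107%
print(f"{annualized_growth(2):.0%}")  # over two months  -> ~199%
```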


You're right, I was thinking about 100x revenues and forgot to confirm the math. Updated to reflect your point. ChatGPT itself provided the 17% number (its most recent available growth rate)...


What what? Are you surprised it's that low, that high, that they can tell what their revenue is, that they report it on a monthly rather than annual basis, or something totally different?

It's going to be pretty hard to get a good answer to whatever you're having difficulty understanding if you can't be bothered to write more than a word.


A theory that at least is consistent with the observed correlation seems vastly superior to a midbrow dismissal that doesn't. Your "raising kids is hard" theory would explain why people don't have a third child, but raising kids is hard universally. What was observed was that a third child was delayed for longer (even indefinitely) in states with higher age thresholds for mandatory car seats (even when controlling for demographics).

Their causal explanation relies on two additional observations that seem pretty hard to explain by other theories: the effect disappears for single-parent and carless households.


A new fab will need to be filled with advanced equipment like lithography machines. They are the most complex thing humanity has ever built.

There is one supplier of EUV lithography machines in the world, ASML. They are basically acting as an integrator for hundreds of highly specialized components manufactured to unimaginable levels of precision. Each of those components has roughly one eligible supplier in the world, which is operating at full capacity. To expand, those suppliers would need yet another set of specialized and almost-impossible-to-build equipment.

So the supply chain moves incredibly slowly, and the slowness is intrinsic due to the complexity and depth of the supply chain. It can't be fixed with just money. IIRC ASML is aiming to merely double their production of EUV lithography machines by 2030.


Sure, I didn't mean to suggest that it would be easy or fast to increase manufacturing capabilities, just that the confidence I'm seeing around AI should extend to the manufacturers (if that confidence for the future growth and success of OpenAI and Anthropic is warranted). That is, the business decision to increase RAM and GPU supply should be "easy".


Right, but the business decisions probably aren't the constraint at this point? (But were a year ago.)

Once the ability of the supply chain to grow has been saturated, no amount of extra confidence will make it grow faster.


It wasn't actually that exact amount. It was "about 12 tons", and somebody did the 12000 kg / 29g calculation and used the answer with way too many significant digits. Probably the reporter trying to make the 12 ton number relatable.

(You might object that KitKats usually weigh 40g. So these were probably the new KitKat Icon F1 chocolates, which weigh exactly 29g.)
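The suspiciously precise figure falls straight out of that division (assuming, per the comment, "about 12 tons" of chocolate and 29 g per bar):

```python
total_grams = 12_000 * 1_000  # "about 12 tons", in grams
bar_grams = 29                # weight of one KitKat Icon F1 chocolate

kitkats = total_grams / bar_grams
# One significant digit in, six significant digits out:
print(round(kitkats))  # 413793
```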


On Chrome, right-click brings up the context menu in addition to flagging a mine, which basically makes this unplayable.

Also, the ship is not explained at all (the graphics, the controls, the systems). I'd recommend at least a one-paragraph help section in the menu.


Long press instead of right click also marks mines.


fixed

