
You can also determine, by optimization, what the most cost effective energy system would look like. If you do this you will discover that nuclear may not have a place in it.

Of course you can set up a market system where nuclear is favored, but that would be a market system that would likely not be delivering the lowest overall cost solution to the optimization problem.
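For intuition, the kind of optimization I mean can be sketched as a toy linear program: pick the cheapest mix of capacities such that supply covers demand in every hour. All the costs and capacity factors below are made up purely for illustration; real models use full hourly weather data and many more constraints.

```python
# Toy capacity-expansion LP: choose the cheapest mix of solar, wind, and
# nuclear capacity that meets demand in a few representative hours.
# All numbers are illustrative, not real data.
from scipy.optimize import linprog

# Annualized cost per MW of capacity (arbitrary units)
cost = [60, 80, 150]  # solar, wind, nuclear

# Per-unit output of each technology in four representative hours
# (rows = hours; columns = solar, wind, nuclear)
avail = [
    [0.9, 0.3, 1.0],  # midday
    [0.0, 0.5, 1.0],  # night, windy
    [0.0, 0.1, 1.0],  # night, calm
    [0.4, 0.4, 1.0],  # morning
]
demand = [100, 70, 70, 90]  # MW needed in each hour

# Minimize cost @ x subject to avail @ x >= demand in every hour.
# linprog wants A_ub @ x <= b_ub, so negate both sides.
A_ub = [[-a for a in row] for row in avail]
b_ub = [-d for d in demand]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
solar, wind, nuclear = res.x
print(f"solar={solar:.1f} MW, wind={wind:.1f} MW, nuclear={nuclear:.1f} MW")
```

Whether nuclear shows up in the optimal mix depends entirely on the numbers you feed in, which is the whole point of the argument.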



"May." I haven't seen anyone actually do that, have you? All I see are researchers ignoring nuclear and proving that an all-renewables grid is possible, not that it's cheaper than a grid including nuclear.

At one point MIT had a US grid simulator online and I played around with it for a while. The cheapest fossil-free combination I found was solar and nuclear. Basically enough nuclear for nighttime demand, and enough solar for extra daytime demand. Wind+solar+storage was a good bit costlier.

And that was with conventional nuclear cost. We should probably find out whether, in practice, it's cheaper to mass-produce passively-safe small modular reactors in factories.


Look at this website

https://model.energy/

It models minimum-cost combinations of renewables, batteries, and hydrogen to supply synthetic baseload, given historical weather data (insolation, wind) in various places. It can also include nuclear (enable the "Dispatchable 2" option, which is based on the EPR). Twiddle the cost assumptions as you like to see what dominates.

Passively safe small reactors are the latest in nuclear magical thinking. Yes, this time, the it-will-be-cheaper promise will be real, unlike all the other times it was a lie. At some point Charlie Brown the Nuclear Stan needs to realize Lucy is going to pull away the football again.


I'll spend some time with that, thanks!

Edit: From some initial attempts, using their EPR costs does leave nuclear out entirely, in both the US and China. However, cutting the nuclear capital cost in half results in a grid that is completely nuclear.

And it's not all or nothing. If nuclear capital cost is reduced to 2/3 of the EPR value, the US gets a mix of sources that is 63% nuclear.

Whether such cost reductions are feasible, we'll see. You may be right, but I'll note that we have never actually tried mass-producing lots of small passively-safe reactors, or done more than early experiments with MSRs. I'm glad there are companies and countries giving it a shot.

Edit2: Actually, even current production nuclear technology can get us there, if we follow the lead of the most effective countries. See figure 12 in this study of global nuclear cost curves: https://www.sciencedirect.com/science/article/pii/S030142151...


> However, cutting the nuclear capital cost in half results in a grid that is completely nuclear.

With the 2020 assumptions? Do that with the 2030 numbers and renewables will still be on top. 2030 is probably what you want to use right now, since any nuclear plant started now isn't going to be in operation before about then.

Note that some assumptions for 2030 have already been superseded. For example, it assumes hydrogen electrolysers cost 600 EUR/kW; China is already selling them (domestically and for export) for half that.

The EPR numbers are rather optimistic. Flamanville 3 and Vogtle 3 and 4 are now costing around 11,000 EUR/kW, nearly twice what was assumed (and four times the capital cost after we halve that number). Yes, things went wrong in those projects. That tends to happen with nuclear and cannot be ignored. Renewables usually come in within 10% of promised cost.

Also, remember this is for synthetic baseload, not a grid with variable demand. The latter will favor renewables since that means some of the energy from nuclear will now be going into storage, just like from renewables. The synthetic baseload case is the best case for nuclear.
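A back-of-the-envelope calculation shows why variable demand hurts a flat-output plant: with flat demand it wastes nothing, but with a day/night swing, a plant sized to the peak strands a chunk of its output at night, which then needs storage or curtailment. Numbers here are made up for illustration.

```python
# Illustrative demand profile: 60 MW at night, 100 MW during the day.
hours = range(24)
demand = [100 if 8 <= h < 20 else 60 for h in hours]

peak = max(demand)
flat_output = peak  # a flat-output plant sized to cover the peak
surplus = sum(flat_output - d for d in demand)  # MWh with nowhere to go

total_demand = sum(demand)
print(f"daily demand: {total_demand} MWh")
print(f"surplus if sized to peak: {surplus} MWh "
      f"({100 * surplus / (flat_output * 24):.0f}% of output)")
# With these numbers, 480 of 2400 MWh (20%) is surplus.
```

With truly flat (synthetic baseload) demand, that surplus is zero, which is why it's the best case for nuclear.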


I've been using the 2030 scenario throughout.

Can you provide a source for the hydrogen electrolyzer cost? My quick Google search didn't show obvious results, though there's plenty to dig into.

Certainly nuclear looks bad if you use the worst costs available. My claim is that South Korea, for example, has achieved much lower costs in production, so our issues are more with mismanagement than the technology itself.

If I plug your hydrogen cost into the scenarios in my other new comment, it lowers the cost at which nuclear starts to appear on the US grid, but there's still a point within the range of the world's production nuclear costs where nuclear takes over.


https://www.rechargenews.com/energy-transition/will-us-and-e... (although there is some question about whether this is comparing apples to apples.)

South Korea's low costs seem to have been related to corruption. That will work only until the first accident, or until people start going to jail (as they did in South Korea).


Thanks.

South Korea isn't the only country with decent nuclear costs. See the study I linked.

US nuclear costs are related to the fact that we build reactors as occasional one-offs, throw in long political delays, and sometimes change regulations in the middle of construction. We're probably not going to change any of that, so it's good we do so well with wind and solar. But our situation is not the situation everywhere.


Most of the other ones in Figure 12 there are France. But France can't replicate what it did before (the EPR cost overruns have been frightful), so the validity of those earlier points is called into question. As I understand it, the accounting of the costs of that earlier buildout is sketchy, with military nuclear rolled in in a way that can't be untangled.


Also, their default nuclear lifetime is 25 years, and we have many reactors older than that running right now. In my tests above I changed the lifetime to 60 years. Justification for that here: https://www.energy.gov/ne/articles/whats-lifespan-nuclear-re...

For some countries, even the original capital cost of 6000 is viable with a 60-year lifetime. Two I tried are Thailand and, ironically, Germany, both of which went 100% nuclear that way.


Assuming a 60-year lifetime in an environment of rapidly declining costs (for the competition) is dubious. Note that we're already seeing some nuclear plants in the US that can't compete on their operating costs alone. TMI's remaining reactor was cash-flow negative for six years before it was shut down. It still worked fine; it just didn't make sense to keep running it. Plenty of other plants are cash-flow positive but aren't selling their output at a price that would justify replacing them with a new NPP.

One can view the strong position of gas-fired generating capacity in the US not just as a consequence of the low cost of natural gas, but also of the low capital cost, which limits the downside risk from future competition. A combined-cycle power plant might cost 10% or less of the capital cost of an NPP per unit of power output.


Nuclear today is competing against cheap, dispatchable natural gas. That's not relevant to a model of a carbon-free grid, where nuclear could displace the higher costs required to get dispatchable power from wind/solar.

Nuclear's main cost is capital, so using an artificially low lifetime biases heavily against nuclear. The model accounts for operating costs separately. Whether other tech will make nuclear uncompetitive is what we're trying to find out with this model; if we start by assuming that, and limit nuclear lifetime accordingly, then we're making a circular argument.

I did notice that fixed O&M costs are expressed as a percentage of capital cost, so if I take them at face value and halve the capital cost, to be conservative perhaps I should double the O&M from 3% to 6%, which I hadn't done before. That doesn't change the German and Thai results, since for those I only changed lifetime. But in the US, it means capital cost of 3000 still results in a grid without nuclear.

But is that number correct? Not quite. Across the US nuclear fleet, the average fixed O&M cost is about $132/kW-year, or roughly 125 EUR/kW-year: https://www.epa.gov/sites/default/files/2019-03/documents/at...

That's only 2.1% of 6000. If we lower capital cost without lowering O&M, we get 4.2% of 3000. That puts a small amount of nuclear back in the US market.

However, they default to a discount rate of 10% for nuclear, and only 5% for wind/solar/battery. I don't see any reason to use different numbers here. Setting nuclear to 5%, with the above changes, gets us back to a 100% nuclear US grid. Even if we take O&M back up to 6%, a capital cost of 3000 and a discount rate of 5% means a 100% nuclear US.
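The discount rate and lifetime matter so much because capital is annualized with a capital recovery factor (CRF), so they compound. Here's a sketch using the standard CRF formula (I'm assuming model.energy does something equivalent internally; the 6000 and 3000 EUR/kW figures are the ones from this thread):

```python
# Capital recovery factor: annual payment per unit of capital, amortized
# over `years` at discount rate `r`. Standard annuity formula.
def crf(r, years):
    return r / (1 - (1 + r) ** -years)

for capex in (6000, 3000):              # EUR/kW, figures from this thread
    for r, years in ((0.10, 25), (0.05, 60)):
        annual = capex * crf(r, years)
        print(f"capex {capex} EUR/kW, r={r:.0%}, {years} yr "
              f"-> {annual:.0f} EUR/kW/yr")
```

Going from 10%/25 years to 5%/60 years roughly halves the annualized capital cost on its own, before any change to the overnight cost, so stacking the two assumptions swings nuclear's competitiveness enormously.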

Comparing overall grid cost of two examples: With no nuclear (due to 10% discount), the US average grid cost is 53.7 EUR/MWh. With 100% nuclear (due to 5% discount and 3000 capital cost), the US grid cost is 48.6 EUR/MWh.

In any case, this has changed my view somewhat. I'd thought that nuclear was a clear winner over storage, but it looks like nuclear's place is at least marginal in the US. We have copious wind and solar, and nuclear only lowers overall grid cost if we can manage the sort of nuclear costs they've achieved in South Korea. But in countries less favorable to wind/solar, nuclear dominates.


Yes. I've been saying that for nuclear to survive in the US, up front costs have to be cut at least by a factor of 3. This is going to be tough. NuScale's SMR doesn't appear to be able to do this.

If we start seeing CO2 taxes here, the way forward will be displacing gas using increased renewables, and if there are enough times where gas goes to zero then adding storage to serve that. Getting to 100% will require additional storage (and likely hydrogen), but even before that the environment will become quite hostile to new NPPs.



