
I think on HN, people waste too much time arguing about the phrasing of the headline, whether it is clickbait, etc. and not enough discussing the actual substance of the article.

You're right, mostly, but the fact remains that the behavior we see is produced by training, and the training is driven by companies run by execs who like this kind of sycophancy. So it's certainly a factor. Humans are producing them, humans are deciding when the new model is good enough for release.

Do you honestly think an executive wanted a chat bot that confidently lies?

Do the lies look really good in a demo when you're pitching it to investors? Are they obscure enough that they aren't going to stand out? If so, no problem.

In practice, yes, though they wouldn't think of it that way because that's the kind of people they surround themselves with, so it's what they think human interaction is actually like.

"I want a chat bot that's just as reliable at Steve! Sure he doesn't get it right all the time and he cost us the Black+Decker contract, but he's so confident!"

You're right! This is exactly what an executive wants to base the future of their business off of!


You say that like it’s untrue, but they measurably prefer a lying but confident salesman over one who doesn’t act with that kind of confidence.

This is very slightly more rational than it seems because repeating or acting on a lie gives you cover.


Yes, that is in fact their revealed preference.

Did you have a point?


You use unfalsifiable logic. And you seem to argue that, given the choice, CEOs would prefer not to maximize revenue in favor of... what, affection for an imaginary intern?

Cute straw man.

You must be a CEO.

I'm not arguing anything. I'm observing reality. You're the one who is desperate to rationalize it.


You are declaring your imagined logic as fact. Since I do not agree with the basis on which you pin your argument, there is no further point in discussion.

You're hallucinating things I did not say.

Given the matrix 'competent/incompetent' / 'sycophant/critic' I would not take it as read that the 'incompetent/sycophant' quadrant would have no adherents, and I would not be surprised if it was the dominant one.

They may say they don't want to be lied to, but the incentives they put in place often inevitably result in them being surrounded by lying yes-men. We've all worked for someone where we were warned to never give them bad news, or you're done for. So everyone just lies to them and tells them everything is on track. The Emperor's New Clothes[1].

1: https://en.wikipedia.org/wiki/The_Emperor%27s_New_Clothes


No, but they like the sycophancy.

People with immense wealth, connections, influence, and power demonstrably struggle to not surround themselves with people who only say what the powerful person already wants to hear regardless of reality.

Putin didn't think Russia could take Ukraine in 3 days, with literal celebration by the populace, because he only works with honest folks, for example.

Rich people get disconnected from reality because people who insist on speaking truth and reality around them tend to stop getting invited to the influence peddling sessions.


Agreed. I used to review lots of submissions for IEEE and similar conferences, and didn't consider it my job to verify every reference. No one did, unless the use of the reference triggered an "I can't believe it said that" reaction. Of course, back then, there wasn't a giant plagiarism machine known to fabricate references, so if tools can find fake references easily the tools should be used.

Cute. But please don't use this, because in addition to making your text useless for LLMs it makes it useless for blind and vision impaired people who depend on screen readers.


And, conversely, it (presumably) has no effect on VLMs using captive browsers and screenshotting to read webpages.


> making your text useless for LLMs

It arguably doesn't even do this. If this is adopted widely, it would only be for current LLMs; newer models could (and would) be trained to detect and ignore zero-width/non-printable characters.
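
Even short of retraining, a preprocessing pass handles most of it. A minimal sketch in Python (the character list is illustrative, not exhaustive):

    import re
    import unicodedata

    # Zero-width and other invisible "format" characters often used for this kind
    # of poisoning (illustrative list): zero-width space, zero-width non-joiner/
    # joiner, word joiner, BOM/zero-width no-break space, soft hyphen.
    ZERO_WIDTH = re.compile("[\u200b\u200c\u200d\u2060\ufeff\u00ad]")

    def strip_invisible(text: str) -> str:
        text = ZERO_WIDTH.sub("", text)
        # Also drop anything else in the Unicode "Cf" (format) category.
        return "".join(ch for ch in text if unicodedata.category(ch) != "Cf")

    print(strip_invisible("he\u200bllo\u2060 wo\ufeffrld"))  # -> hello world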


6,000+, and those machines served many others (back then there were tens of thousands of machines on the Internet, but probably 10x as many that were connected to these by relays that handled email or Usenet traffic).


Also worth remembering that, especially with Internet-connected computers, almost everything was multiuser. You did work on the Internet from a shell on a shared Unix server, not from a laptop.


Serverless remote workspaces as you might call them now.


Well, sort of. RTM underestimated the effect of exponential growth, and thought that he would in effect have an account on all of the connected systems, without permission. He evidently didn't intend to use this power for evil, just to see if it could be done.

He did do us all a service; people back then didn't seem to realize that buffer overflows were a security risk. The model people had then, including my old boss at one of my first jobs in the early 80s, was that if you fed a program invalid input and it crashed, this was your fault, because the program had a specification or documentation and you didn't comply with it.


Interestingly, it took another 7 years for stack overflows to be taken seriously, despite a fairly complete proof of concept widely written about. For years, pretty much everybody slept on buffer overflows of all sorts; if you found an IFS expansion bug in an SUID, you'd only talk about it on hushed private mailing lists with vendor security contacts, but nobody gave a shit about overflows.

It was Thomas Lopatic and 8lgm that really lit a fire under this (though likely they were inspired by Morris' work). Lopatic wrote the first public modern stack overflow exploit, for HPUX NCSA httpd, in 1995. Later that year, 8lgm teased (but didn't publish --- which was a big departure for them) a remote stack overflow in Sendmail 8.6.12 (it's important to understand what a big deal Sendmail vectors were at the time).

That 8lgm tease was what set Dave Goldsmith, Elias Levy, San Mehat, and Pieter Zatko (and presumably a bunch of other people I just don't know) off POC'ing the first wave of public stack overflow vulnerabilities. In the 9-18 months surrounding that work, you could look at basically any piece of privileged code, be it a remote service or an SUID binary or a kernel driver, and instantly spot overflows. It was the popularization with model exploits and articles like "Smashing The Stack" that really raised the alarm people took seriously.

That 7 year gap is really wild when you think about it, because during that time period, during which people jealously guarded fairly dumb bugs, like an errant pipe filter input to the calendar manager service that ran by default on SunOS shelling out to commands, you could have owned literally any system on the Internet, so prevalent were the bugs. And people blew them off!

I wrote a thread about this on Twitter back in the day, and Neil Woods from 8lgm responded... with the 8.6.12 exploit!

https://x.com/tqbf/status/1328433106563588097


This was great to read. Related: Morris also discovered the predictable TCP sequence number bug and described it in his paper in 1985: http://nil.lcs.mit.edu/rtm/papers/117.pdf. Kevin Mitnick describes how he met some Israeli hackers with a working exploit only in 1994 (9 years later) in his book "Ghost in the Wires" (chapter 33). I tried to chronicle the events here (including Jon Postel's RFC that did not specify how the sequence number should be chosen): https://akircanski.github.io/tcp-spoofing


Mitnick's use of the sequence number spoofing exploit was a super big deal at the time; it's half of the centerpiece of his weird dramatic struggle with Tsutomu Shimomura, whose server he broke into with that exploit (the other half was Shimomura helping use radio triangulation to find him).

Mitnick didn't write any of this tooling --- presumably someone in jsz's circle did --- but it also wasn't super easy to use; spoofing tools of that vintage were kind of a nightmare to set up.


"Your security technique will be defeated. Your technique is no good"


I remember hearing the audio at the time and thinking it was pretty funny back before I realized racism was bad.


So this would be the first stack overflow after Morris's fingerd one (well, the first one that's widely publicized):

https://seclists.org/bugtraq/1995/Feb/109

> we've installed the NCSA HTTPD 1.3 on our WWW server (HP9000/720, HP-UX 9.01) and I've found, that it can be tricked into executing shell commands. Actually, this bug is similar to the bug in fingerd exploited by the internet worm. The HTTPD reads a maximum of 8192 characters when accepting a request from port 80.


Agreed. Some of the big companies seem to be claiming that by going with ReallyBigCompany's AI you can do this safely, but you can't. Their models are harder to trick, but they simply cannot be made safe.


Yes, natural gas is now cheaper than coal for power generation, and solar has become much cheaper. Even disregarding environmental concerns, it's just not a good choice any more.


I'm curious, in case anyone knows, how much of the economic disadvantage of coal is because of environmental reg compliance versus other, more fundamental costs?


You can't put coal in a pipeline.

You have to physically handle every piece of coal. Extract, load, ship, unload.

Natural gas is shooting out of the ground in North Dakota.

You can compare to wind and solar also.

The economics aren't favorable.


> You can't put coal in a pipeline.

Yes you can!

https://en.wikipedia.org/wiki/Coal_pipeline

A slurry is transported in one variety of these. The trick is to induce a swirl in the flow, so that even though particles are constantly trying to settle out of suspension, they keep getting stirred back up and stay suspended. It's mathematically analogous to how putting a twist in magnetic field lines in a tokamak (or stellarator) prevents ExB effects from driving the plasma into the wall, as they would in just a plain toroidal magnetic field.

Lots of water is needed and drying the coal before burning it adds cost.


Really interesting, thanks for the link!

Acre-feet as a unit just made me realize how many options you get for volume with multiple base units (a foot-mile-inch is ~12m³ in case anyone was wondering). The non-metric system continues to impress (not in a good way)!
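
For anyone who wants to check the arithmetic, a quick sketch (the conversion factors are exact by definition, only the printed results are rounded):

    # Mixed US-unit "volumes" converted to cubic metres.
    FOOT = 0.3048              # metres
    MILE = 1609.344            # metres
    INCH = 0.0254              # metres
    ACRE = 43560 * FOOT ** 2   # square metres (an acre is 43,560 sq ft)

    foot_mile_inch = FOOT * MILE * INCH   # ~12.46 m^3
    acre_foot = ACRE * FOOT               # ~1233.5 m^3

    print(f"1 foot-mile-inch = {foot_mile_inch:.2f} m^3")
    print(f"1 acre-foot      = {acre_foot:.1f} m^3")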


That's the wrong question. The question to ask is: what is the death rate per unit of electricity production? https://ourworldindata.org/grapher/death-rates-from-energy-p...

Coal beats everything else by a mile. We also get mercury pollution for free, so no more eating fish.


I'm not a coal proponent, just trying to understand the veracity of an argument that I think R's make sometimes (that coal isn't economically viable primarily because state regs make it too expensive).


Coal requires manual labour and mining, so even without environmental regulations it's expensive. In the US, the decrease in coal use had very little to do with environmental factors; most of it was because fracking brought in cheap natural gas and pushed coal out.


It does, but you should see the machines they use for coal mining these days, particularly in Wyoming. A giant strip mine seems to require like 20 workers, but with giant trucks and excavators.

Even for underground mines, check out the massive longwall mining machines they use- it's kind of astonishing. They pretty much take all the coal.

I think this is lost in the political talk about protecting mining jobs- the main original competition is more efficient mining operations.

Anyway, it really says something that natural gas and solar are cheaper than coal given this context.


You should see this one! https://youtu.be/azEvfD4C6ow

It's a bit dated (completed in 1978) and cost approximately $100 million. At one point it was the heaviest land vehicle, clocking in at 13,500 tons. It's since been succeeded, but this bucket-wheel excavator needed only five people to operate. Those five people could mine 240,000 tons of coal or 240,000 cubic metres of overburden (rock/soil on top of the coal) per day. That's 2400 coal wagons!

After totally tapping out all of the coal at the Tagebau Hambach mine, in 2001 it took a crew of 70 to move it 14 miles to the next mine. This move cost 15 million German Marks.

(Just watch the linked documentary.)


I think it's primarily a fundamental cost issue. It's simply far cheaper to get an equivalent amount of energy from fracking a natural gas formation than having to literally dig coal out of the ground.


(disclaimer that I manage a climate&energy research group)

Most of the comments here are speculative.

The TLDR is that coal plants have trouble ramping their production up/down quickly, unlike natural gas which can do so in minutes. So, if you have a grid that is being thrashed by variable production (renewables), this results in variable pricing and demand for baseload. Coal cannot economically compete in that market (and neither can nuclear, which has the same problem).


Given that renewable power is self-correlated (all the solar panels are producing at once - or they’re not; all the wind turbines are turning at once - or they’re not), renewable energy leads to low prices when it’s produced and high prices when it’s not.

Why not put massive, grid scale batteries “behind the meter” at a nuclear or coal plant to enable continual production but only sell power when prices are high and store power when prices are low?
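
As a toy illustration of that buy-low/sell-high idea (the prices, thresholds, and battery specs below are made up, not a real dispatch model):

    # Toy battery arbitrage: charge when the hourly price is below a threshold,
    # discharge when it is above. All numbers are illustrative assumptions.
    prices = [20, 15, 12, 18, 45, 90, 110, 60]   # $/MWh over 8 hours
    capacity_mwh, charge_rate = 100.0, 25.0       # hypothetical battery specs

    stored, revenue = 0.0, 0.0
    for p in prices:
        if p < 25 and stored < capacity_mwh:      # cheap hour: buy and store
            buy = min(charge_rate, capacity_mwh - stored)
            stored += buy
            revenue -= buy * p
        elif p > 70 and stored > 0:               # expensive hour: sell
            sell = min(charge_rate, stored)
            stored -= sell
            revenue += sell * p

    print(f"Net arbitrage revenue: ${revenue:,.0f}")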


Batteries are also highly useful for relaxing transmission constraints. I've seen a claim that sufficient storage (at various places in the network) could increase the energy transmittable over the existing grid by a factor of 3.

An analogy here is natural gas pipelines with intermediate storage caverns, which allow the pipelines to operate more steadily even if demand varies greatly over the year.


> Why not put massive, grid scale batteries “behind the meter” at a nuclear or coal plant to enable continual production but only sell power when prices are high and store power when prices are low?

Even better, if you have a functioning wholesale electricity market, you can put those batteries on the grid and benefit everybody.


No, because as the parent comment suggests, if you have solar+wind backed by natural gas and battery storage, if the battery storage isn't enough the natural gas plants can quickly fire up. But coal plants don't have this ability, so it doesn't work as well in this environment (which is today's environment).


Once you have batteries, you may do so with solar or wind.


Yeah, but sometimes the intermittency is pretty extreme, and you can get away with significantly less overpaneling and storage if you have a mix of power sources. Not many experts advocate for 100% renewables.


Right; batteries aren't really suitable for the low frequency part of the supply/demand mismatch. Daily storage, great, perhaps up to 1 week, but at lower frequencies they are increasingly expensive.

But there are other storage ideas that do much better for that. For example, burning an e-fuel like hydrogen, or ultra low capex thermal storage.
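
To put rough numbers on the frequency point (capex and lifetime below are assumptions for illustration, not quoted figures):

    # Amortizing storage capex over the energy it actually delivers.
    capex_per_kwh = 200.0    # $ per kWh of capacity (assumed)
    lifetime_years = 15      # calendar life (assumed)

    for cycles_per_year, label in [(365, "daily cycling"), (52, "weekly"), (1, "seasonal")]:
        delivered = cycles_per_year * lifetime_years   # kWh delivered per kWh of capacity
        print(f"{label:>13}: ~${capex_per_kwh / delivered:.2f} of capex per kWh delivered")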


> neither can nuclear

Nuclear can load follow to some extent (my previous comment): https://news.ycombinator.com/item?id=36254716

But regulations and economics don't encourage it. Also note that NuScale appears to be designed to be dispatchable.


Well, if you want to answer that question, you probably also need to figure out the hypothetical cost of the other power sources minus environmental regulations.

Nuclear would be (and used to be) massively cheaper, before regulations went wild against it.

I'm deliberately saying 'went wild', because the earlier nuclear power generation that was built to saner standards also has turned out to be incredibly safe already.

(Basically, anyone who avoided insane Soviet bullshit had safe nuclear power, as measured in eg fatalities per Joule of electricity generated.)


The regulatory costs of nuclear are mostly incurred in the design phase. Those costs are sunk and mostly irrelevant for new builds of old designs.

The fact that old designs like the AP1000 are crazy expensive to build has a lot more to do with the fact that the US sucks at building mega projects than anything else.


Interestingly, one of the reasons the design phase for nuclear is so onerous is the sheer amount of red tape involved due to compliance and other regulatory reasons. You wanna know something funny? You know what's really good at generating piles of convincing sounding bullshit that it's possible no one even actually reads, and looks like it's totally going to insert fuel rods into the nuclear power industry?

LLMs!


This will definitely make the nuclear industry perceptibly safer. How soon can we start?! /s


Nuclear power is arguably way too safe.


It’s not just the US that sucks at mega projects. It’s everywhere that’s not China.


Other countries are better at building particular types of mega projects. Some are better at transit, others are better at building tunnels, others are better at building massive ships, et cetera. But in regards to nuclear, I believe you're right.


I'm getting the impression the problem isn't any particular regulation, but rather that, because regulations exist, the design is fixed.

Getting a design approved means a specific design is approved. So, the power plant must be built as designed, no changes. And apparently ensuring you built exactly what the design specifies is really expensive.

What's needed to reduce this cost is having some way to get a whole cloud of closely related designs approved, so that reasonable deviations from the design are also approved. This is equivalent to saying only the most critical part of the design would need to be built as designed, everything else would be allowed some slop. With something like this, one might (for example) be able to build the confinement building with less tight control on the configuration of the reinforcing steel.

I don't know how one would get such a cloud of designs approved. Maybe this is a problem that could be solved by massive computation? Run billions of mutant designs through a simulation gauntlet to see how sensitive the design is to various perturbations? Or maybe add more defense in depth, like devices that scrub radioactive elements from steam (such things exist), so the tolerable chance of meltdown can be allowed to increase while keeping expected damage in check?


Sounds kind of like hyperparameter search - you're searching the design space for the bounds of the different parameters. I don't know if parametric design is possible on reactors, but would be neat if possible.
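
A toy sketch of what that kind of parameter-space search might look like (the parameter names, ranges, and acceptance check are entirely made-up stand-ins for a real simulation gauntlet):

    import random

    # Randomly perturb a few nominal design parameters and keep only the
    # perturbations that still pass a (made-up) acceptance check.
    nominal = {"rebar_spacing_cm": 15.0, "wall_thickness_m": 1.2, "steel_grade_mpa": 500.0}

    def acceptable(design: dict) -> bool:
        # Stand-in for a real simulation: crude margin checks.
        return (design["rebar_spacing_cm"] <= 18.0
                and design["wall_thickness_m"] >= 1.1
                and design["steel_grade_mpa"] >= 450.0)

    approved_cloud = []
    for _ in range(10_000):
        candidate = {k: v * random.uniform(0.9, 1.1) for k, v in nominal.items()}
        if acceptable(candidate):
            approved_cloud.append(candidate)

    print(f"{len(approved_cloud)} of 10,000 perturbed designs passed the toy checks")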


I'm mostly curious if there's any world in which coal beats at-scale solar production, or if it's totally moot. To be clear, I'm not rooting for that so much as looking for an ironclad case against.

Nuclear is a whole can of worms because of its PR problems.


The American nuclear industry was collapsing due to spiraling costs before TMI happened.

This "before regulations" time period seems to be based on feelings about a rosy picture of the past rather than actual data.


It wasn't just spiraling costs, but also the collapse in the steady 7%/year growth in electrical energy demand. Without transparency on demand growth, very long term investments become risky (a risk reflected in the bankruptcy of WPPSS). The passage of PURPA in 1978 also didn't help with this as it allowed a flood of non-utility generation onto the grid, helping soak up what demand growth there was.


Yes, regulations started tightening before TMI.

> This ”before regulations” time period seems to be made up on feelings about a rosy picture of the past rather than actual data.

No, we have data about costs in those earlier times, and we know what regulations came in when.


Given the cleanup costs of the early nuclear program, maybe they learned a thing or two about preventing the externalization of costs?

DOE report: Cost to finish cleaning up Hanford site could exceed $589 billion

https://www.ans.org/news/2025-04-17/article-6942/doe-report-...

Sellafield cleanup cost rises to £136bn amid tensions with Treasury

https://www.theguardian.com/business/2024/oct/23/sellafield-...


Except, of course, Fukushima. Or any nuclear plant that gets hit by a tsunami, earthquake, terrorist attack, or other disaster.


It looks like the top-end estimate is that the Fukushima disaster may have caused up to 500 additional total lifetime deaths from cancer. Roughly 23,000 people per year died of diseases attributed to coal power plants in the United States alone from 1999-2020.

Edit: Changed "linked to" to "attributed to", because this is the estimated count of people who would not have died of disease if coal power plants were not running.


500 deaths at $12M per life is $6B. This is a small fraction of the total cost of Fukushima.

People say LNT overestimates deaths, but what they don't realize is that even if you take LNT at face value the cost of deaths from a nuclear accident isn't really that high. A regulatory regime where reactor operators that have accidents are charged the inferred cost of the expected deaths could work.


Were there any deaths from the Fukushima nuclear disaster?

A large area was evacuated and "human costs" were great. But as I recall, no deaths from radiation.


People whose metabolic reserve is low often die when you stress them.

I saw a study claiming 440 excess deaths from the Los Angeles fires. I'll make an assumption that permanently moving old and health impaired people from the Fukushima exclusion zone had a similar increase in mortality. And then a bit of looking leads me to this.

"The evacuation itself also was not without severe consequences. The accident was in the winter, and the evacuation of 840 patients or elderly people in nursing homes and health-care facilities apparently resulted in 60 immediate deaths due to hypothermia, dehydration, trauma and deterioration of serious medical conditions (Tanigawa et al 2012) and upwards of 100 deaths in subsequent month"

https://iopscience.iop.org/article/10.1088/0952-4746/33/3/49...

Like the Grapes of Wrath where the family starts out for California and the grandparents both die on the way.


Yes, if there were 20x as many nuclear power stations, there would probably have been 20x as many Fukushima-scale incidents.

Murphy's law is real...


Fatalities per Joule of generated electricity is extremely low for nuclear power, even if you add not just Fukushima but also Chernobyl.

So if you'd scale up, and keep that level of safety, it would be fine. Even less safety would be fine. After all, we accept much less safety in eg natural gas or even solar power. (Solar power is extremely safe once running, but if you look at casualties over the lifecycle, you get a few people falling off roofs when installing residential solar power. It's a very small number, but nuclear is so safe, that the roof-fall incidents of solar are a big number by comparison.)


NE is starved for natural gas. They replaced the coal plants with gas and didn't bother to build enough pipelines (and even blocked them). We are getting a lot of our gas from tankers from Europe, and even from Russia before the war. So now NE has nearly the highest electricity costs in the country, even though we're not far away from some of the most abundant natural gas on the planet.


Including regulatory and litigation expenses, this is simply not true. That said, coal is very dirty and is not the best choice for power generation because of pollution. Coal is pretty cheap, and it's taken most of my lifetime to change the cost through regulation, litigation and lawmaking.


https://www.eia.gov/dashboard/newengland/overview

Solar makes up 4% of New England electricity. Not much sun there. Needs nuclear to succeed


Unlikely. The ISO has 3.5GW of solar and nuclear capacity equally (I’m aware of capacity factor of solar vs nuclear, but the ISO also reports ~6GW of behind the meter distributed solar which only manifests as reduced demand). The ISO needs more batteries, renewables, and transmission from hydro in Quebec, Canada (1.2GW), but nuclear is not needed to succeed long term and those two generators will eventually be decommissioned, as their license only extends their operating period to ~2050. Twenty five years is plenty of time to replace their 3.5GW of output; 18GW of battery storage, 17GW of wind, and 13.5GW of solar is in the ISO’s interconnect queue or has been proposed by developers.

https://www.mainepublic.org/climate/2025-01-03/central-maine... (“Central Maine Power aims to finish controversial western Maine power corridor in 2025”)

https://www.mainepublic.org/politics/2023-11-30/documents-re... (NextEra, which owns the Seabrook nuclear power plant in N.H., stands to lose tens of millions of dollars every year if the NECEC comes online and attempted to use political donations to scuttle the Quebec Hydro transmission line)

https://www.iberdrola.com/about-us/what-we-do/smart-grids/ne... (“The new transmission line between Quebec and Maine will provide 1,200 megawatts (MW) of renewable hydroelectric power to the New England power grid in Lewiston, Maine, sufficient to meet the demand of 1.2 million homes. Once built, NECEC will be New England's largest renewable energy source, saving customers $190 million per year.”)

https://www.iso-ne.com/about/government-industry-affairs/new...

https://www.canarymedia.com/articles/clean-energy/northeast-...

https://app.electricitymaps.com/map/zone/US-NE-ISNE/live/fif...

(Quebec, interestingly, has ~40GW of hydro generation capacity)


Yup, I've been watching the NECEC story unfold over the years. Boston's giant natural gas Mystic Generating Station was closed in anticipation of the power being replaced by NECEC.

There is an interesting podcast about Quebec hydropower, it's quite an accomplishment, not without controversy.

https://outsideinradio.org/powerline


Apropos of nothing, what would happen if that transmission line simply didn't get completed till, say, 2028? Hydro power sounds like something woke people might like, and dear leader says we can't have that.


Or it gets built, but NextEra convinces the dear leader to tariff energy imports. Or Canada just cuts us off for being jerks.


Canada may cut you off eventually because the population is rapidly growing and it will want the power for itself. Hopefully New England is preparing for that eventuality.


> Canada may cut you off eventually because the population is rapidly growing and it will want the power for itself. Hopefully New England is preparing for that eventuality.

Canada's population growth has stalled. Here's what it means for the economy - https://ca.finance.yahoo.com/news/canadas-population-growth-... - September 30th, 2025

Canada's population growth almost flat in 2nd quarter as number of non-permanent residents declines - https://www.cbc.ca/news/business/canada-q2-population-1.7642... - September 24th, 2025


And that problem was largely solved by robots.txt. AI scrapers are ignoring robots.txt and beating the hell out of sites. Small sites that have decades worth of quality information are suffering the most. Many of the scrapers are taking extreme measures to avoid being blocked, like using large numbers of distinct IP addresses (perhaps using botnets).


Rural stations relied heavily on CPB funding; urban stations get most of their funding from donations or corporate underwriting. So big city public TV and radio will survive, but those in less populated areas might go under unless some other source of funding is found.


Yeah but the shows that the urban stations are running and producing are all bought by the rural stations. So the whole ecosystem needs the rural stations to help fund the productions.


The urban stations raise a lot of money locally (through pledge drives, and by hitting up local companies for underwriting, which is basically advertising). The rural stations don't, too few people. The rural stations get CPB money, and some of that goes back to fund shows that they carry, but mostly it's the cost to operate the stations. The urban stations aren't being propped up by the rural stations, there's too little money, even including the money that they get from CPB.


I agree overall that this is not a good thing, as it also furthers the knowledge gap between rural and urban areas. But in the age of internet streaming, wouldn't rural areas still have access to stream public radio? Genuinely asking.

I tried looking for sources on station audience sizes, alternatives they might have, etc. But it was difficult to find.


> But in the age of internet streaming, wouldn't rural areas still have access to stream public radio?

Sometimes streaming isn't an option. When Helene hit WNC we lost power, cell, internet, and water all at the same time. The local NPR stations were the only ones broadcasting updates on a regular cadence so we could learn what in the world was going on. And we're not far from downtown Asheville.

Some extremely rural areas only have spotty internet or no internet or cell at all and public radio is the only thing they have.


Local reporting is basically dead outside of metro areas.

Sure, you can stream, but the content will be focused on another locale or won't address local issues.


When I'm not busy worrying about everything else, I worry that there's assuredly an explosion of local corruption, especially outside of cities large enough to still have something resembling actual local news media, that we can't even begin to get a handle on because it's... well, it's invisible now, that's why it's (surely—I mean, we can't possibly think corruption is dropping or even remaining steady, with the death of the small town paper and small-market TV news rooms, right?) happening in the first place.

I think it's, quietly and slowly, the thing that's going to doom our country to decline if something else doesn't get us first (which, there are certainly some things giving this one a run for its money). The Internet killed a pillar of democracy, replaced it with nothing that serves the same role, and we didn't even try to keep it from happening, so here we are, we doomed ourselves by embracing the Internet quickly and not trying to mitigate any harm it causes.


To some your comment might even sound comic, but it is damn true. It saddens me that the dangerous spiral is not seen by many others.

After all, the millennia-old adage "bread and circuses" silences too many.


It's pretty dead even in metro areas.

My local NPR broadcasts rarely actually cover anything that's happening in like city or county politics. Heck, even talking about state politics is pretty rare.


In the SF bay area, KQED (NPR affiliate) has a lot of coverage of local SF and Bay Area politics. The Pacifica station, KPFA (public radio but not an NPR affiliate) has more.


Yes, all the rural PBS markets will retain streaming access, which, again, is how most people under the age of 60 get access to PBS today.


Public radio and local broadcasting have been gobbled up by right-wing sources, including Sinclair.

Watch this clip:

https://youtu.be/xwA4k0E51Oo?feature=shared


As a long-time listener of AM radio, I can say literally nothing has changed from a programming perspective. The only noticeable difference is who supplies the on-the-hour news.


Well, a counterargument would be: how would you know if anything has changed? If you're not one of the editors in a newsroom, how would you know which stories are cut and which make the broadcast?


Are there many rural-only districts?

Having moved around, my PBS districts have always seemed to be metro+rural zones.


There are vast open spaces, out of FM radio range of a really big city. Some of the worst-hit public radio stations are on reservations.

