> Dependabot has some value IME, but all naïve tools that only check software and version numbers against a vulnerability database tend to be noisy if they don’t then do something else to determine whether your code is actually exposed to a matching vulnerability.
For non-SaaS products it doesn’t matter. Your customer’s security teams have their own scanners. If you ship them vulnerable binaries, they’ll complain even if the vulnerable code is never used or isn’t exploitable in your product.
This is true and customers do a lot of unfortunate things in the name of security theatre. Sometimes you have to play the cards you’ve been dealt and roll with it. However, educating them about why they’re wasting significant amounts of money paying you to deal with non-problems does sometimes work as a mutually beneficial alternative.
We had a Python "vulnerability" that only existed on 32-bit platforms, which we don't use in our environment, but do you think we could get the cyber team to understand that?
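This kind of platform triage can sometimes be scripted instead of argued about. A minimal Python sketch, with a made-up advisory format (the CVE IDs, field names, and the `only_on` flag are all hypothetical; real scanner output rarely carries platform applicability, which is exactly why this filtering usually has to happen manually):

```python
import sys

# Hypothetical advisory records -- illustrative only.
advisories = [
    {"id": "CVE-0000-0001", "pkg": "cpython", "only_on": "32-bit"},
    {"id": "CVE-0000-0002", "pkg": "libfoo", "only_on": None},
]

def is_32_bit() -> bool:
    # CPython on a 32-bit platform reports sys.maxsize == 2**31 - 1
    return sys.maxsize <= 2**32

def relevant(adv: dict) -> bool:
    # Drop findings whose affected platform doesn't match ours.
    if adv["only_on"] == "32-bit" and not is_32_bit():
        return False
    return True

applicable = [a["id"] for a in advisories if relevant(a)]
```

On a 64-bit host this filters out the 32-bit-only finding before it ever reaches a ticket queue, which is the argument you'd otherwise have to have with the cyber team.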
It's not just open source though. Many high quality sources of information are being (over-)exploited and hurt in the process. StackOverflow is effectively dead [0], the internet archive is being shunned by publishers [1], scientific journals are bombarded by fake papers [2] (and anecdotally, low-effort LLM-driven reviews), projects like OpenStreetMap incur significant costs due to scraping [3], and many more.
StackOverflow was well on its way to death even without ChatGPT; just look at the graph from [0]. It has been in steady, consistent decline since 2014 (minus a very transient blip from COVID).
Then the ChatGPT effect is a sudden drop in visitors. But the rate of decline after that looks more or less the same as pre-ChatGPT.
StackOverflow was killed by its toxic moderators. I hope it stays online though, because it's a massive source of knowledge, although in many cases already outdated.
The overzealous-moderator issue was probably the main reason, but I think Google's direct answers and summaries also had a significant impact on StackOverflow. They took away potential contributors and reduced the incentives for active contribution.
In a way it was a trial and glimpse of what was coming with the AI revolution
I agree it was a moderation issue, but for me it's Reddit that largely replaced my SO usage starting some years ago. Reddit is pretty similar to SO in design, but the more decentralized nature of the moderation means that questions rarely get "closed as duplicate" and answers tend to be more up-to-date as a result. There's not always a consensus answer and I'm often looking across multiple threads on the same thing, but that's still better than an outdated SO post.
There were multiple times I wanted to contribute to SO but couldn't because I didn't have sufficient "reputation", or something. I shrugged and moved on.
I always thought StackOverflow was meant to fizzle out over time as more questions get answered and don't need to be asked again. Perhaps the decline is just a necessary part of their rule of having no duplicate questions - keeping it as a clean repository of knowledge rather than a messy forum.
Just the other day a question I asked about 10 years ago got flagged as a duplicate. It turns out somebody else had asked the same question several years later and got a better answer than my question got, so that other one is the canonical one and mine is pushed away. It feels kind of offensive but it makes complete sense if the goal is to provide useful answers to people searching.
Unfortunately, the rule of no duplicate questions also destroyed lots of questions that weren't duplicates... because _someone_ couldn't be bothered to read them and realize it wasn't the same.
Plus, there were a lot of fun questions that were really interesting to begin with, and they stopped allowing them.
Yes, this. I've asked a couple of questions where the only responses are from people saying "possible dupe of x" where x is something that has a couple of the same words but no relation to what I'm asking.
Turns out if you design a forum where a high effort, high quality post can be devalued by a low effort response, you discourage high effort, high quality posters.
The major trouble with StackOverflow is that nominally duplicate questions may have different answers if asked in 2011 vs 2026, and answer rankings (the thing that determines which answers are at the top) don't decay over time. So if someone tries to answer an old question with up-to-date info, they won't garner enough upvotes to overcome the old, previously correct but now outdated accepted answer at the top. (Even with a ranking decay, there is little incentive to give a new, up-to-date answer to an established thread; people are more likely to contribute to brand new threads.)
It would be better to allow duplicates in this specific case, but mark the old thread as outdated and link the questions in such a way that one can see the old thread and compare it to the new thread.
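A decay rule like the one described above is easy to sketch. This is purely hypothetical scoring logic, not anything StackOverflow implements; the half-life parameter would presumably need tuning per topic (long for stable tech like C/POSIX, short for fast-moving ecosystems):

```python
from datetime import datetime, timezone

def decayed_score(votes: int, answered_at: datetime,
                  now: datetime, half_life_days: float = 365.0) -> float:
    """Exponentially decay an answer's vote score by its age.

    Every `half_life_days`, the effective score halves, so stale
    accepted answers eventually lose the top spot to fresher ones.
    """
    age_days = (now - answered_at).total_seconds() / 86400.0
    return votes * 0.5 ** (age_days / half_life_days)

# A fresh answer with 10 votes outranks a decade-old one with 100:
now = datetime(2026, 1, 1, tzinfo=timezone.utc)
old = decayed_score(100, datetime(2016, 1, 1, tzinfo=timezone.utc), now)
new = decayed_score(10, datetime(2025, 7, 1, tzinfo=timezone.utc), now)
assert new > old
```

The tricky part, as noted above, is incentives: even with decay in the ranking, someone still has to bother writing the new answer on an old thread.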
This is something I saw all the time. I’d look something up, knowing that there was probably an easy way to do <basic programming task> in modern c++ with one function call.
Find the stack overflow thread, answer from 10+ years ago. Not modern C++. New questions on the topic closed as duplicate. Occasionally the correct answer would be further down, not yet upvoted.
“Best practice” changes over time. I frequently saw wrong answers with outdated install instructions, commands that don't function on newer OS versions, etc.
You raise an interesting point about decay. I have thought about similar systems myself. One flaw in a simple decay rule would be that some technologies are very stable, e.g., C & POSIX API programming, while other tech is very fast-moving, like Python, Ruby, Java, C#, C++, Rust, etc.

One idea to overcome this flaw might be to have moderators (who are specialists in the subject matter) provide a per-question decay rule. Something like struct layout or pointer manipulation in C, or the POSIX fopen() function, might never decay. But a question about parsing JSON in a fast-moving language might decay its answers over a one-year period to encourage people to revisit the topic.

I would like to hear Jeff Atwood and Joel Spolsky debate this topic with other "Internet points" experts for an hour-long podcast. They might brainstorm some very interesting ideas. I would also love to hear what they think about the "moderator problem". Some of the topics had incredibly toxic moderators who scared away newcomers and women. (Women are much less likely to participate in public software forums where public shaming is common.)
> One idea to overcome this flaw, might be to have moderators (...)
> I would also love to hear what they think about the "moderator problem". Some of the topics had incredibly toxic moderators (...)
Yeah, having bad moderators and arguably a bad, dysfunctional community is perhaps an even worse handicap. If you go to threads on meta.SE (Meta Stack Exchange, for meta discussions on the whole ecosystem) you will see that people mostly believe the site policies are okay, and that's because everyone who didn't believe that left years ago.
Maybe better ideas on how to run a Q&A site will emerge on a brand-new site. Unfortunately, I think that SO, and perhaps the wider Stack Exchange network, is done.
Great point. As the knowledge evolves, the ranking might need to evolve too, by allowing some versioning and somehow down-ranking or marking the outdated answers.
The problem with this, and why SO’s downfall was completely self-inflicted, is that the correct answer from 2013 is only occasionally still the correct answer in 2018. There are a lot of other issues with SO’s general moderation policy but well and truly it was as idiotic and myopic as it was toxic.
They treated subjective questions about programming methods as if they were universal constants. It was completely antithetical to the actual pursuit of applied knowledge, or to collecting and discussing best practices and patterns of software design. And it was painfully obvious for years that this was a huge problem, well before LLMs.
That said, I will say after being traumatized by having my threads repeatedly closed, I got so good at boiling down my problem to minimal reproducible examples that I almost never needed to actually post, because I’d solve it myself along the way.
So I guess it was great for training me to be a good engineer in the abstract sense, but absolutely shit at fostering any community or knowledge base.
> that the correct answer from 2013 is only occasionally still the correct answer in 2018
Exactly! They should have added proper structuring to questions/replies so that it could specifically apply for Language/library version X. Later, such a question could be answered again (either by proving it's still correct for version X+1, or by giving a new answer) - that way people wouldn't have to look at a new reply with 2 votes vs an older, possibly outdated one with 100 and make a decision which to prefer.
> StackOverflow was well on its way to death even without ChatGPT, just look at the graph from [0]. It has been in steady consistent decline since 2014.
"Monthly questions asked" is a weird metric for measuring the decline of StackOverflow, though. How many times are people going to ask how to compare 2 dates in Python, or how to efficiently iterate an array in JavaScript? According to the duplicates rule on SO, each should be asked once anyway. So it's just inevitable that "monthly questions asked" will forever decrease after reaching its peak, since everything has already been asked. That didn't mean the site was dead, though; people still needed to visit it to read the responses.
A better metric to measure its decline would be "monthly visits", which I guess was still pretty high pre-LLM (100s of millions per month?), even if "monthly questions asked" was declining. But now I imagine their "monthly visits" is closer to zero than 1M. I mean, even if you don't use Claude and its friends, searching anything about programming on Google returns a Gemini answer that probably comes from StackOverflow, removing any reason to ever visit the site…
Your first point only holds if nothing ever changes in the programming world. People write new languages and frameworks all the time. How do you compare dates in pandas? How about polars? Duckdb? Etc.
Mods made asking questions a very hostile experience since they had a flawed ideal of SO becoming some form of encyclopedia. So no wonder people jumped on another train as quickly as possible, especially since it so often was a mistake to close a question whose next best answer was a long deprecated solution.
It still has some corners where people are better, but this is mostly the smaller niches.
Even if we completely avoid the worst case scenarios where AI obliterates the job market or evolves into a paperclip maximizer, it has a good shot of being the most destructive technology in generations. The tech industry has already done a lot of harm to our social fabric with social media, gambling, and other addictive innovations replacing real life experiences and personal connections. This has led to well documented increases in depression, loneliness, and political extremism.
Now it seems AI is poised to eliminate most of the good innovations that tech brought about, and will probably crank social strife up to 11. It already feels like the foundations of the developed world have gotten shaky; I shudder to think what a massive blow will bring about.
I've read enough history to know that I really, really don't want to live through a violent revolution, or a world war, or a great depression.
That would've happened regardless. But the alternative (zero moderation, 100% free speech) is how you get the flamewars and spam that Slashdot and tons of other forums before it suffered from.
I don't think an alternative exists. Reddit was very unique. The last great BBS (in a sense) that non-Internet natives "got".
Before astroturfing on Reddit at scale was possible, it was an extremely reliable place to get perspectives from real people about loads of things. It's still useful for this purpose, but the same level of trust isn't there.
Now that short-form video is the dominant form of social networking, I'm not sure if something text-based will thrive again like Reddit did. (People have been trying to make Lemmy the thing, and it's less popular than Mastodon.)
>Before astroturfing on Reddit at scale was possible
It has become so difficult to tell what is karma farming and what is people not bothering to search before asking.
In a strange way, what started happening to the "other side" of Reddit six or so years ago, with the emergence of OnlyFans turning it into a place where people just want to sell you something, was a precursor to this.
Everything kind of fractured apart and now those niche communities are building up again elsewhere.
Discord has a lot (looking at my discord I see, gaming, programming, clothing/fashion/aesthetic, language, dnd, music, keyboard / hardware, dance, etc... communities).
I've noticed a lot of the major reddit communities have matching communities in the fediverse, specifically the ones with old reddit-like UIs. (lemmy.ml, lemmy.world, sh.itjust.works, mander.xyz, etc...).
I've also noticed a lot of web-standards / browser developers and some gamedevs moved to twitter-style fediverse sites (e.g. mastodon.social, indieweb.social, infosec.exchange, hackyderm.io, floss.social, fosstodon.org, etc...).
---
I think the fediverse is working well for the niche communities for three reasons:
- Having that little bit more initial friction to learning how the fediverse works has made it better since it keeps out the low quality spamming users.
- Niche communities can only grow organically within their own spaces (since forcing them makes them seem inauthentic).
- The big plus of the fediverse is being able to follow/interact with users/communities across the boundary of being on another website. So it doesn't matter if a niche community you want to follow springs up on another website, you can follow them and participate from the website you already use.
For example: the old reddit-like communities that I follow (listed above) appear in a single feed in my programming.dev account (since that's the first one I joined), and the old twitter-like communities I follow appear in a single feed in my mastodon.social account (since that's the first twitter-like one I joined).
Society is a Ship of Theseus: each generation ripping off planks and nailing their own in place.
Having been online since the late 80s (am only mid 40s...grandpa worked at IBM, hooked me and my siblings up with the latest kit on the regular) I have read comments like this over and over as the 90s internet, 00s internet, now the 2010s state of the "information super highway" has been replaced.
Tbh things have felt quite stagnant and "stuck" the last 20 years. All the investment in and caretaking of web SaaS infrastructure and JS apps and jobs for code camp grads made it feel like tech had come to a standstill relative to the pace of software progress prior to the last 15-ish years.
Stack Overflow is an interesting case because these days most people ask questions on Discord instead. The data isn't public, and the search functionality is terrible. It makes no sense, but somehow companies still prefer it even though it's inefficient and the same questions keep getting asked over and over.
Looks like Discord has at least recently decided to finally fix the issues caused by having users, and is trying very hard not to have any going forward, through insane identity verification mandates enforced by the most toxic partner companies ever. :)
The problem isn't the back door. Every telecom company in every country provides access for "lawful intercept". Phone taps have been a thing for decades and as far as I know, require a warrant.
The problem is that telecoms are very large, very complex environments, often with poor security controls. Investing in better controls is hard, time-consuming, and expensive, and many telecoms are reluctant to do it. That's not great, since telcos are prime targets for nation-state hackers, as Salt Typhoon shows.
Hacking the lawful intercept systems is very brazen, but even if the hackers didn't go that far and "only" gained control of normal telco stuff like call routing, numbering, billing, etc., it still would have been incredibly dangerous.
This really buries the lede. Telecoms are reluctant to do it because 'doing' it isn't aligned with their priorities.
Why would a telecom risk bankruptcy by investing heavily into a system that their competitors aren't?
If you want a back door to exist (questionable), then the government either needs strong regulatory compliance, where telecoms that don't invest in a secure implementation receive fines in excess of the investment cost, or it needs to fund the implementation itself.
Yes, telecoms should be forced to invest in their own security if they're not doing it. But the focus on the back door misses the point in my opinion. Even if the back door wasn't there, you wouldn't want nation state hackers anywhere near telecoms since they're critical infrastructure.
> Even if the back door wasn't there, you wouldn't want nation state hackers anywhere near telecoms since they're critical infrastructure.
This is only because of the design defect that "lawful intercept" requires.
Telecoms should be completely untrusted because everything is end-to-end encrypted. Compromising a telecom shouldn't allow you to do anything other than bring about a denial of service, and even that would only be effective against anyone who didn't have a redundant link with a different provider, which all actually critical infrastructure should. And a denial of service is conspicuous, as opposed to spying on required-to-be-unencrypted traffic which can continue undetected indefinitely and is a significant national security risk.
Our need to not be spied on is greater than our need to spy on ourselves and requiring designs that assume the opposite of that is a major self-imposed security vulnerability.
Even if let's say lawful intercept is done away with and calls are end-to-end encrypted, the telco would still be in control of key management and distribution... and if those clowns can't secure lawful intercept, why do you think the key distribution infrastructure would fare any better?
Why should they be in charge of key management? They should be in charge of physical plant and leave all of that to someone else. We should be discontinuing the legacy PSTN and making "phone" an IETF protocol where your "phone number" is user@domain.
Yes there is a lawful intercept system that operates inside telecoms networks, that is an issue.
The other issue is that there is no real security inside said telecoms networks. (side note, there is still fucking SS7 floating about)
Salt Typhoon is not "just hijacking lawful intercept"; it's the ability to fuck with the network in a way that is largely undetected. Sure, the intercept stuff might help, but they don't actually need it. In the same way we learnt about state actors taking complete control of Middle East telecoms systems, we can be fairly sure that other state actors have taken control of US telecoms systems.
Both the executive and Congress have done shit-all about it, and will continue to ignore it until something happens.
This. The lawful intercept infrastructure is one facet of their network. The rest of their infra is also a deep concern: call records, SS7 signaling, the IP network, mobile infra and its back end (SIM swapping).
How am I confusing the two? My whole point was the same as yours - that the existence of lawful intercept is a separate issue and that the focus should be on securing telecoms.
Even if the back door wasn't there, you wouldn't want nation state hackers anywhere near telecoms since they're critical infrastructure. Telecoms should be highly secure. Period.
I get that you don't like lawful intercept. That's fine. But focusing on only that aspect of telcos derails the conversation and prevents us (in the very broad sense of "us") from making progress on things we all agree on. Can we stop bikeshedding and agree that telcos are critical infrastructure and need to be highly secure in general?
A hacker in control of a telco can do as they please regardless of any backdoors or lawful intercept systems. They can just use regular network functions to route calls wherever they want.
> Can we stop bikeshedding and agree that telcos are critical infrastructure and need to be highly secure in general?
Yes, because the solutions to both are the same. Decentralized and trustless systems solve both problems, in my opinion. I agree the pathway from where we are now to there is complex, but it's not "bikeshedding" to believe there are fundamentally different and better ways to organize and secure a network that change the attack surface entirely.
(Think of the IP layer being replaced with a PKI, as a small example.)
Nice website, but I feel like calling it "wire wiki" is quite ambitious. Currently, it's a (beautiful) DNS lookup tool, but that's about it. I expected something like RIPE Stat [0], or something like the undersea cable map [1] (based on the "wire" in the name). Also, if you're doing DNS, take a look at resolve.rs [2], they have some nice DNS tools, though not as pretty as yours :)
And since you mentioned scanning the IPv4 address space for DNS servers: I did that as well at some point, for a product I built (and even have a patent on). The list of servers you're going to get with a naive scanning approach is not what you want. It won't include the servers you probably want (such as the customer-facing DNS servers of ISPs) and will include an insane amount of junk like home routers or weird IoT devices that expose their port 53. Hit me up via the email in my profile if you want to chat.
You're right that it doesn't do the name Wirewiki justice yet. I've got so many things planned to add at some point, much more than just DNS. Check again in 2 years' time ;)
> The list of servers you're going to get with a naive scanning approach is not what you want.
Absolutely right. I'm doing uptime monitoring and a handful of checks (udp/tcp, nxdomain, dnssec, dns filtering) before listing them, but I feel like it could definitely be improved. Would love to talk! I'll send you an email.
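For anyone curious, the core of a UDP liveness/recursion probe like this can be sketched with just the Python stdlib (wire-format constants per RFC 1035; the classification here is my own simplification, not what the parent actually runs):

```python
import socket
import struct

def build_query(name: str, txid: int = 0x1234) -> bytes:
    """Build a minimal DNS A-record query with the RD (recursion desired) bit set."""
    # Header: ID, flags (RD=0x0100), QDCOUNT=1, ANCOUNT/NSCOUNT/ARCOUNT=0
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    qname = b"".join(bytes([len(l)]) + l.encode() for l in name.split("."))
    return header + qname + b"\x00" + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN

def parse_flags(resp: bytes):
    """Return (recursion_available, rcode) from a DNS response header."""
    flags = struct.unpack(">H", resp[2:4])[0]
    return bool(flags & 0x0080), flags & 0x000F  # RA bit, RCODE nibble

def probe_resolver(ip: str, timeout: float = 2.0):
    """Send one UDP query; classify the host as a plausible recursive resolver."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.settimeout(timeout)
    try:
        s.sendto(build_query("example.com"), (ip, 53))
        resp, _ = s.recvfrom(512)
    except OSError:
        return None  # no answer: filtered port or dead host
    finally:
        s.close()
    ra, rcode = parse_flags(resp)
    return {"recursion_available": ra, "rcode": rcode}
```

A host that answers but never sets RA, or that returns NOERROR for a random nonexistent name (instead of NXDOMAIN, rcode 3), is likely one of those home routers or filtering middleboxes rather than a usable resolver.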
Tangentially, the fact that we're still using gigabit connections in our homes and especially offices in 2026 is weird. Gigabit Ethernet is over two decades old, but it's still the most common standard. Both 2.5 and 10 Gbps are effectively niche technologies.
I get it; it's "good enough" in most cases, like USB 2.0. But it still sucks we haven't moved past it.
Isn't frozen something you do to a set or dictionary to say "I'm not going to add any more values; please give me a version of this which is optimized for lookup only"?
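Assuming this is about Python: `frozenset` is close, but it buys immutability and hashability rather than faster lookups (membership tests are the same average O(1) as a plain set; there is no `frozendict` in the stdlib, though `types.MappingProxyType` gives a read-only dict view):

```python
from types import MappingProxyType

# frozenset: an immutable, hashable set; contents fixed at creation.
s = frozenset({"read", "write"})
assert "read" in s  # same O(1) average membership test as a regular set

# Hashability is the real win: a frozenset can key a dict (a plain set cannot).
perms = {frozenset({"read"}): "viewer",
         frozenset({"read", "write"}): "editor"}
assert perms[frozenset({"write", "read"})] == "editor"

# Read-only *view* of a dict (writes through the proxy raise TypeError):
cfg = MappingProxyType({"retries": 3})
assert cfg["retries"] == 3
```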
You don’t see a difference between a major news outlet from a democratic country which has freedom of speech and an outlet from a religious monarchy which has no notion of free speech or even human rights?
That's usually not the bar, though; many who refuse Saudi media due to Saudi ownership would be completely okay with Al Jazeera regardless of Qatari ownership, even though both countries have very dubious intentions and government systems.
Saudi Arabia is one of the world leaders by number of death sentences. They have no qualms with putting you to death or giving you life imprisonment for all sorts of things, including "wrong-speech" in the form of leaving the state religion, or opposing the government. The UK isn't some shining beacon of freedom by Western standards, but it's not even in the same universe as Saudi Arabia.