Hacker News | ENGNR's comments

Banks remain with COBOL because it's unsexy and stable. And then they say... let's just YOLO some vibe code into the next release sight unseen! Logic checks out.


They stick with COBOL because it runs well on the mainframe. The mainframe and sysplex architecture gives them an absurd level of stability and virtualization that I don't think the rest of the market has nearly caught up to yet. Plus having a powerful and rugged centralized controller for all of this is very useful in the banking business model.


This is the reason. IBM's mainframe business grew 60%. The modern mainframe is the state of the art in computing platforms, for both reliability and efficiency.


Also, IBM mainframes are wonderfully isolated from physical hardware. They could change processors in the next model, and users would notice only a small delay as binaries were recompiled on-the-fly the first time they were run.

They surely could extract more performance from the hardware by shedding layers, but prioritized stability and compatibility.


> Also, IBM mainframes are wonderfully isolated from physical hardware. They could change processors in the next model, and users would notice a small delay as binaries were recompiled on-the-fly the first time it was used.

This was with AS/400's move from their own CISC processors to POWER. While you could pull that off with mainframes, it'd be recompiling actual native binary code. IBM mainframe architecture is very well defined and documented (sadly, unlike AS/400).

This [0] describes it in depth.

0- https://www.ibm.com/docs/en/SSQ2R2_15.0.0/com.ibm.tpf.toolki...


I stand corrected. I experienced that change, and it was seamless, but AS/400 was a midrange, not a mainframe.


At this point in time it can have more cores and more memory than a Z, and likely higher performance in benchmarks, but the architecture is closer to a minicomputer than a mainframe.

It’s an odd lineup.


Banks remain on COBOL because of the quantity of code that's been tested in real life situation for decades.

Yes, IBM mainframe licenses are expensive, but it never fails.

I worked on a migration project where only the tests would take a few thousand days.

Yes, they could be automated, but the regulations in place required a human to sign off that every test had been executed at least once by a human.


What do you mean "only the tests would take a few thousand days"?

Did running the test suite take 10 years? Like literally what exactly do you mean?


If you have an app where the user must give their address when subscribing, what are the tests that can be done?

User doesn't exist, invalid character in a field, user exists, wrong street name for the zip, wrong state for zip, wrong house number in the street, age below threshold, age above threshold,...

Each of these examples must be run manually at least once to prove that the logic is correct, and the tester must keep a report of it.

But for each of these basic tests, the data must be in a specific state (especially for the "name already exists" case), so between tests you usually have a data-preparation phase.

When you have a lot of these tests because it's spanning logic from decades, it takes time, especially when dealing with investments or insurance.

And usually for these tests, you hire people who are focused on correctness, not speed.

Now imagine what happens when you're at step 89 of your test and it fails.

The dev fixes the code, fixes the automated tests... and the tester restarts from step 1.
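For flavor, the address cases above could be sketched as a table-driven suite. Everything here is hypothetical (made-up validation rules and toy reference data); the point of the comment stands, since regulation still required a human to execute each case by hand:

```python
# Hypothetical sketch of the address-validation cases listed above.
# The rules and reference data are made up for illustration.

ZIP_TO_STATE = {"10001": "NY", "94105": "CA"}  # toy zip->state table

def validate(form):
    """Return the first validation failure, or 'ok'."""
    if not form.get("name"):
        return "missing name"
    if any(ch in form["name"] for ch in "<>;"):
        return "invalid character in field"
    if ZIP_TO_STATE.get(form["zip"]) != form["state"]:
        return "wrong state for zip"
    if not 18 <= form["age"] <= 120:
        return "age out of range"
    return "ok"

# Each case pairs prepared input data (the "data preparation phase")
# with the expected outcome.
CASES = [
    ({"name": "",    "zip": "10001", "state": "NY", "age": 30}, "missing name"),
    ({"name": "a<b", "zip": "10001", "state": "NY", "age": 30}, "invalid character in field"),
    ({"name": "Ann", "zip": "10001", "state": "CA", "age": 30}, "wrong state for zip"),
    ({"name": "Ann", "zip": "10001", "state": "NY", "age": 12}, "age out of range"),
    ({"name": "Ann", "zip": "10001", "state": "NY", "age": 30}, "ok"),
]

for form, expected in CASES:
    assert validate(form) == expected, (form, expected)
print("all cases pass")
```

Multiply this by decades of accumulated logic and the "few thousand days" figure stops sounding strange.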


...Sir/madam, we have test environments, and comprehensive banks of (over time, hopefully idempotent) test data. When your schtick is software verification, one of the first things you develop a damn-near fetishization for is test data management infra, and a place to store it. It's... a weird fixation, admittedly. But getting data to land in just the right place at the right time so your testcase can run in a maximally parallelized fashion... *chef's kiss* ...is perfection.

Did you read the last sentence?


I read it twice; also have no idea what it’s implying


I imagine it's a case where you hire a dozen or so test attendants.


You couldn't score any higher on the risk factors. The training corpus for COBOL can't be all that large so the models won't understand it that well. Humans are largely out of the loop and the tooling guardrails are insufficient. Causing a billion dollar disaster with the help of a "shotgun surgeon"? Priceless.


It'll be sooo fun to watch...


Banks are slowly moving away from their old COBOL systems. It's about cost as much as it's about catching up with the neo-bank competition.

The main thing that makes this difficult is that in most cases the new system is supposed to be more capable. Transactional batch processing systems are replaced with event-based distributed systems. Much more difficult to get right.


Banks remain with COBOL because they have a fuckton of COBOL code and 4 people can write it. There's nothing more to it.

Now, 4 million people can write it.


I don't think learning how to write COBOL was ever a problem. Knowing that spaghetti codebase and how small changes in one place cause calamity all over the place is. Those 4 people's job is to avoid outages, not to write tons of code, or fix tons of bugs.


Honestly, there are companies that have lost the source code for some of their applications. Or they depend on components from vendors that have long ceased to exist. I remember there being a lot of consternation around being able to compile and link against binary components that have just been around forever and could never be recompiled themselves. More people "learning COBOL" was never going to be a solution to that ball and chain. And yeah, LLMs are good in the reverse engineering space too, so maybe we'll finally see movement on that in the next decade.


You're probably right, no disagreement there. But in the context of my previous comment, I don't think the people who write COBOL today spend much time reverse engineering native code back to COBOL because the source is lost. You make a really good point, though: if AI can assist with lost-code recovery, perhaps it will also help them migrate away from it, or get rid of the workarounds and complexities implemented to match that previously opaque binary's behavior.


I would say more significantly, 4 million people can read it. The changes required for any given quarter are probably minuscule, but the tricky part is getting up to speed on all those legacy patterns and architectural decisions.

A model being able to ingest the whole codebase (maybe even its VCS history!) and take you through it is almost certainly the most valuable part of all.

Not to mention the inevitable "now one-shot port that bad boy to rust" discussion.


You also need "make no mistakes"


Run both systems side by side for 9 months. Banks have patience.


In my experience, learning COBOL takes you a week at most, learning the "COBOLIC" (ha ha) way of your particular source base will take you a couple of months, but mastering it all including architecture will take you a year, half a year if you're really good.

One year from zero to senior doesn't sound that hard, does it? Try that with a Java codebase.


I would have you nowhere near my production


Or production for the bank with my savings.


It's got to be a lot more than 4. Global Shop (ERP) is written in Visual COBOL.


They also launched dummy satellites from the "pez dispenser", directly simulating the actual mission payload, about 4 months ago.


The trick is that the USA steps up the cost basis of an asset when you pass away. So if you live on cheap loans your whole life, you can defer capital gains tax until it goes away.
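As a rough sketch with made-up numbers (and a stand-in 20% long-term capital gains rate):

```python
# Illustrative "buy, borrow, die" arithmetic. All figures are
# hypothetical; 0.20 is a stand-in capital gains rate.
cost_basis = 1_000_000       # what the asset was originally bought for
value_at_death = 10_000_000  # market value when the owner dies
cap_gains_rate = 0.20

# Selling during life: tax is due on the full unrealized gain.
tax_if_sold = (value_at_death - cost_basis) * cap_gains_rate

# Held until death: the heirs' basis is stepped up to market value,
# so the gain accrued during the owner's life is never taxed.
heirs_basis = value_at_death
tax_for_heirs = (value_at_death - heirs_basis) * cap_gains_rate

print(tax_if_sold, tax_for_heirs)  # 1800000.0 vs 0.0
```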

Instead of two certainties in life being death and taxes, it's now death or taxes.


I have to congratulate you on the quip at the end there, which I'll steal! Great way to summarise this strategy. Is it an ENGNR original?


Lol thank you. Original as far as I know, most welcome to steal!


How does that work in practice?

If you're bootstrapped, borrow a bunch of money to pay tax because your company got to $10M val. But then the market shifts and it goes back down to $0 in later years, do you get the money back?

Even if you do, it sounds weird taxing someone for the right to create something, especially when they're still in the middle of creating it.


Yes, you do get the money back, that's the point of the paper.

This is better for many founders that otherwise wouldn't cash out at all. VCs will be forced to cover your unrealized capital gains taxes.


Yes! I rewound the video to double check

But honestly at this point I’m destined to buy a Steam Machine despite having a hefty Mac that could do gaming if only it were possible. Valve have been amazing about open computing and Apple are basically the enemy at this point.

It makes me wonder what using a Steam Machine for all computing might look like, as the new home of open computing and gaming.


I wonder if the video team uses Mac, and just shot a quick clip with the closest USB port on hand.


Had some very weird behaviour from CloudFront, used purely to serve images from S3. Mostly huge slowdowns and outright failures on endpoints. Was about 15 hours ago that I noticed it by chance.

Was nothing on the aws status pages and no alerts/errors in my console. Eventually it sped up again.


We noticed massive latency from CloudFront; I spent the first part of my day migrating services out.


Nakamoto = central origin

Origin = CIA?


Satoshi = intelligent


Omfg you're right. I was half joking before, but yeah that's... quite a coincidence



Vision Pro is an excellent example

What Apple really needs to do is mimic their old policy of no fees except for games. Let everyone develop for it, and then rug pull by making the fees apply to everything

But they can’t do it twice. So the Vision Pro ends up with no ecosystem


Ewwwwwww


There’s too many hackers on hacker news!

