Looks like that still has downtime for a Postgres migration: you're suggesting going into maintenance mode and just doing a dump/restore. I've seen that take hours once you hit the terabyte scale, depending on hardware.
I've had pretty good luck setting up logical replication from Heroku to the new provider, then taking a 10-15 minute maintenance window for the final catch-up and cutover once it's in sync. Might be worth considering.
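For anyone who hasn't done it, the core of that approach is just a publication on the source and a subscription on the target. A minimal sketch (names and connection details are placeholders, and this assumes the source has `wal_level = logical` and grants you replication privileges, which on Heroku may require their support):

```
-- On the source database:
CREATE PUBLICATION migration_pub FOR ALL TABLES;

-- On the target database (schema must already be loaded; logical
-- replication copies rows, not DDL):
CREATE SUBSCRIPTION migration_sub
    CONNECTION 'host=source.example.com dbname=mydb user=replicator password=...'
    PUBLICATION migration_pub;

-- Monitor lag on the source until it's near zero:
SELECT * FROM pg_stat_replication;
```

Then for the cutover window: stop writes on the source, wait for lag to hit zero, resync sequences (logical replication doesn't advance them), and point the app at the target.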
You might also want to add a warning about Postgres versions. There are some old bugs around primary key hash functions that can cause corruption on a migration. I've seen it twice when migrating from Heroku to other vendors.
Sorry, but telling people to take a logical backup of their database and then download it onto their local workstation is insane for a production application. First, a logical backup at any decent scale will fail, and second, I don't even have enough local storage to do that -- even ignoring the compliance issues with downloading a full copy of production data onto a workstation.
For a company like Northflank, I'd expect actual production-grade documentation for migrating, not instructions that are only applicable to a toy app.
Some folks want to do that, others want to import a backup directly, some want to spawn a read replica and sync their DB. Different strokes for different folks, all supported on Northflank.
Crunchy Bridge will help you migrate. They did a great job for us. We had a minute or so of downtime to let the read replica catch up and cut across. The team knows Heroku well, and some of them built it. (No affiliation, just a happy customer.)
Spend a few days on Threads or on Instagram and you'll see the majority of viral posts on Threads are generated by AI, and the majority of descriptions on Instagram are generated by AI. It makes me incredibly sad, because I've used the internet for human connection my entire life. I always loved meeting new people, or reading other's perspectives. But it all just feels so ... empty ... now. I hope the pendulum swings the other way soon.
I'm not sure I agree with the doom and gloom here, but I do understand the sentiment. Especially after spending a few days writing something that I feel should've taken me 3 weeks. I do struggle with having a knee-jerk reaction to that change. I mean I just told my wife a couple nights ago that the future of programming feels shaky, even though I know that's not the reality right now. But she was in my office, so I showed her a bug, and then I asked Claude to fix it, and the bug was gone in 30 seconds, with regression tests.
So yes, AI writes code faster than I can, but it usually doesn't write better code. And you still need to know how to program to produce good code; it's very easy for Claude to write unmaintainable code, especially as it continues to write more code. You really have to put time into refactoring, using prior programming experience to know how to do so.
My current workflow is to prototype with Claude, and then refactor with Claude by giving clear instructions on what needs to be refactored and why. This works relatively well. But even then, at the end of the session, even the refactorings don't quite meet my high bar. So I hand-code towards the end. Maybe that last bit will go away? But it hasn't.
To be honest, code is too personal and artistic for me to fully give up control. I enjoy sweating the details, like doing a squint test on the code as I would with any art to check composition, or renaming variables and classes and methods until I hit the spot where I say "now that's beautiful code." Passing tests doesn't mean the code is good, or that the tests are.
Thankfully my job allows me to spend time doing that. I think it's human to spend time doing that.
Don't blame AI for this correction, then.