How..? Was there no local code on any of the dev machines? No git? I'm asking because, for example, if GitHub were vaporized today, my product would lose roughly a day or two's worth of work, since something like 30 of our computers have a copy of the repository.
Of course, redeploying every single thing wouldn't be seamless, since there might be some configuration stored in services or something similar, but I'd say that ~90% of our automation is stored in Git.
I mean, after all, if you don't have your own copy of the MS Access database then when your team scales beyond about 5 people that database is going to get harder to access. So really everyone should have a copy of all important PII. :P
Only if you're willing to stake your company's digital existence on the reliability of another company's cloud service.
If anything, it increases the need for 3-2-1 backups: the original copy of all of your files is on somebody else's computer that you have no control over. Hopefully they're keeping it backed up, and hopefully they don't go belly up and pull the plug all of a sudden. So you can use a primary backup in another cloud service from another company that hopefully won't kill their product at the same time as the first one (again, you have very little knowledge or control of the way they run their data center). Ultimately, it's a good idea to have a copy of your data that you control yourself, maybe on a big drive (or set of drives, tapes, etc.) in the safe, rotated daily/weekly/however long your company can cope with losing in a major SHTF situation.
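The 3-2-1 idea above (three copies, two different media, one offsite) can be sketched in a few lines of shell. Everything here is a placeholder assumption on my part: the paths, the "second medium" directory standing in for a NAS/USB drive, and the "offsite" directory standing in for a second cloud or the drive in the safe.

```shell
#!/bin/sh
# Sketch of a 3-2-1 backup: 3 copies, 2 media, 1 offsite.
# All paths are hypothetical stand-ins; swap in real mounts/remotes.
set -eu

SRC="${1:-/tmp/live-data}"          # the live data (copy #1)
SECOND="${2:-/tmp/second-medium}"   # stand-in for a NAS or USB drive (copy #2)
OFFSITE="${3:-/tmp/offsite}"       # stand-in for a second cloud / safe (copy #3)
STAMP=$(date +%Y%m%d)

# Demo data so the sketch runs standalone; a real run backs up real files.
[ -d "$SRC" ] || { mkdir -p "$SRC"; echo "sample" > "$SRC/sample.txt"; }
mkdir -p "$SECOND" "$OFFSITE"

# Copy #2: archive onto the second medium.
tar czf "$SECOND/backup-$STAMP.tar.gz" -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Copy #3: in real life this would be rsync/scp to a machine you control,
# or physically rotating the drive into the safe.
cp "$SECOND/backup-$STAMP.tar.gz" "$OFFSITE/"
```

The point of keeping the media different (and one copy offline in a safe) is that a single failure mode, whether that's ransomware, a bad cloud account, or a fire, can't take out all three copies at once.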
Excessive? Maybe. For what it's worth my shop is locally hosted with both local and cloud backups. I have never regretted having at least one backup of anything and it's saved my bacon (or my coworkers', boss', etc.) a number of times. I've been fortunate to never need to rely on a secondary backup, but I sure wouldn't bet the company on it.
I would also like to know the answer. Would it be a good idea for the company to keep _encrypted_ backups on their machines/HDDs? Not a laptop somewhere, but something just a bit more involved.
It would make sense to keep a backup on a hard drive stored in a safe at the office. Doing it weekly would be reasonable, but you'd have to accept losing up to a week's worth of data.
The main problem is that you'd eventually outgrow a single hard drive, so you'd need a NAS. Transfer speed could also become an issue as the database grows. Even if you don't store all the customer data, it makes sense to store all the configuration, keys, and secrets.
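For the encrypted-drive-in-a-safe idea, here's a minimal sketch using `openssl enc` with symmetric AES-256 and a PBKDF2-derived key. The file names and passphrase location are hypothetical; a real setup would encrypt an actual database dump and keep the passphrase somewhere that isn't the drive in the safe.

```shell
#!/bin/sh
# Sketch: encrypt a weekly dump before it goes onto the drive in the safe.
# All file names and the passphrase location are hypothetical.
set -eu

DUMP="${1:-/tmp/weekly-dump.sql}"
OUT="${2:-/tmp/weekly-dump.sql.enc}"
PASSFILE="${3:-/tmp/backup-pass}"

# Demo inputs so the sketch runs standalone; a real run would use
# pg_dump/mysqldump output and a properly secured passphrase.
[ -f "$DUMP" ] || echo "CREATE TABLE demo (id int);" > "$DUMP"
[ -f "$PASSFILE" ] || echo "correct horse battery staple" > "$PASSFILE"

# Symmetric AES-256 encryption, key derived with PBKDF2.
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -pass "file:$PASSFILE" -in "$DUMP" -out "$OUT"

# Restore test: a backup you haven't test-decrypted isn't a backup.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -pass "file:$PASSFILE" -in "$OUT" -out /tmp/restore-check.sql
cmp "$DUMP" /tmp/restore-check.sql
```

Symmetric encryption keeps the scheme simple enough that a stressed person can restore it at 3 a.m.; the trade-off is that the passphrase itself now needs a recovery plan.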
I think for company-critical databases, the best you can do without giving your security officer a terrible headache is going multi-cloud: one big-tech cloud, and one smaller firm that is completely disconnected from the other.
Maybe they could even use a relatively inexpensive colo/bare-metal provider to simply mirror the big-tech deployment on a smaller scale (it would need to be quite flexible/vendor-agnostic to make that work...).
Ah, that makes more sense, I can't read. I thought that the project stopped working altogether, hence the startup was finished. I didn't realize it meant that they simply lost enough customers to go under.
A company's source code is mostly valueless. A company's customer data is priceless.
As Fred Brooks said in Mythical Man-Month: Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won’t usually need your flowcharts; they’ll be obvious.