VitaminD wrote: ↑
Sat Aug 01, 2020 11:43 am
Imagine not having a backup of data newer than 2018 and not having a colocation as either backup and/or to balance traffic.
Ploki wrote: ↑
Sat Aug 01, 2020 12:01 pm
nobody in their right mind would publicly say "we got hacked and got our data stolen and possibly lost a lot of your shit" before assessing the damage and preparing damage control...
And i would never hold it against anyone for buying some time to do that.
I've bought a load of stuff from JRR over the years and I really hope that he recovers from this. I've been in IT for many years (well over 30), and here are my observations:
- Your website is your business, lose it and you lose your business. Every day it's down is a day of revenue lost and you'll never get that back.
- Backups are king. Not just one backup: multiple backups, backups of your backups, and then more backups. Data lost without a backup can never be recovered; once it's gone, it's gone. In the office, we backed up critical data (customer data, source code, etc.) every night to DVD, to an array of hard drives (one per day), and to an online backup provider.
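The "one drive per day" rotation above is easy to sketch in code. This is purely illustrative: the names (`backup_slot`, `backup_filename`, the `shop` prefix) are my own, not from any real tool, and in practice you'd bolt this onto whatever actually dumps the database.

```python
# Hypothetical sketch of a day-of-week backup rotation:
# seven slots, each nightly backup overwrites its weekday's slot,
# so you always hold roughly the last seven days.
import datetime

DAYS = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]

def backup_slot(when: datetime.date) -> str:
    """Return the day-of-week slot tonight's backup should overwrite."""
    return DAYS[when.weekday()]  # Monday == 0 in Python's calendar

def backup_filename(prefix: str, when: datetime.date) -> str:
    """Build a slot-relative filename, e.g. 'mon/shop-2020-08-03.sql.gz'."""
    return f"{backup_slot(when)}/{prefix}-{when.isoformat()}.sql.gz"
```

The date in the filename matters: if the rotation silently stops running, the stale dates are the first clue.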
- I was an admin for a number of mission critical dedicated web servers (not VPS) that we hosted in the US, with a primary dedicated SQL Server. We had an extra server that sat idle 99% of the time but ran its own copy of SQL Server (Microsoft SQL Server is not cheap!), and we synced the data to it every night. We also took nightly SQL backups. If we lost the primary SQL Server, we could just flip the application servers over to the standby while we rebuilt it. And if that failed, we had multiple backups to fall back on.
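The "flip the application servers over" step can be as simple as a reachability check before picking a connection target. A minimal sketch, assuming a warm standby synced nightly; the hostnames are deliberately fake (`.invalid` never resolves) and a real setup would use proper health checks, not just a TCP connect:

```python
# Hedged sketch of primary/standby selection for an application server.
# Hosts are placeholders; 1433 is SQL Server's default TCP port.
import socket

PRIMARY = ("sql-primary.invalid", 1433)   # hypothetical primary host
STANDBY = ("sql-standby.invalid", 1433)   # hypothetical nightly-synced standby

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failure, refusal, and timeout
        return False

def choose_server() -> tuple:
    """Prefer the primary; fall back to the standby if it's unreachable."""
    return PRIMARY if reachable(*PRIMARY) else STANDBY
```

The catch with a nightly-synced standby is that flipping over can lose up to a day of writes, which is exactly why we kept the separate SQL backups as well.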
- This was not cheap, it probably cost us upwards of £3,000 to £4,000 a month (we were only a small company) but for our customers, it was their business.
- As far as JRRShop goes, it's possible that overloading the site corrupted the SQL (or more likely, the MySQL) database. It seems incredible that the website couldn't handle the traffic, but whether it was overloaded or hacked, when it came time to rebuild the site it was discovered that the most recent SQL backup was two years old. Not good!
- If my business was reliant on my web store, I'd make damn sure that if something brought it down, I could get back online pretty damn quick without losing any data. Something catastrophic happened here, and it sure looks like the backups that I'm sure Uncle E thought he was taking were either corrupt or non-existent. Or maybe he thought his web provider was taking backups and they weren't. It's all too easy to buy some VPS or web space somewhere that comes with admin, MySQL and backups as part of the package, but when push comes to shove, you discover they are sorely lacking, and by then it's too late and the damage is done. When everything's running smoothly, you don't plan your disaster recovery (but you should)!
Like I say, we don't really know what happened and we may never be told. I sincerely hope Uncle E makes a full recovery from this and learns how to host his storefront better next time, so he never experiences a repeat of this disaster.