It is hard to believe there was such a thing as “the good old days of ransomware,” but we might be forgiven for looking back nostalgically. While ransomware was a bloody nuisance, law firms generally felt protected if they had a well-engineered backup system to facilitate recovery.
With multiple backups, usually in the cloud or (for small firms) on two or more external USB drives, you could ignore the badgering ransom demands and the clocks counting down to the moment your data would become permanently inaccessible.
The trick was always to keep multiple backups so that a single backup solution didn’t leave you exposed: if ransomware struck while a lone backup was connected or in progress, that backup could be encrypted along with everything else. Having a “virgin” backup meant you could restore the data. This, of course, assumed that you regularly performed test restores to confirm you could actually recover data from your backups.
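For the technically inclined, a test restore can be as simple as a short script. The sketch below is a minimal, hypothetical illustration (the directory names and tar-based backup are assumptions, not a prescription): it backs up a folder, restores it to a separate location, and verifies the restored copy byte for byte.

```shell
#!/bin/sh
# Minimal sketch of a scripted test restore (hypothetical paths).
set -e

workdir=$(mktemp -d)
mkdir -p "$workdir/live" "$workdir/restore"
echo "client matter 001" > "$workdir/live/matter.txt"

# Take the backup as a compressed archive.
tar -czf "$workdir/backup.tar.gz" -C "$workdir/live" .

# Test-restore into a separate directory, never over the live data.
tar -xzf "$workdir/backup.tar.gz" -C "$workdir/restore"

# Verify the restored copy matches the original byte for byte.
cmp "$workdir/live/matter.txt" "$workdir/restore/matter.txt" \
  && echo "restore verified"
```

The point is not the particular tools but the habit: restore to a scratch location and compare against the original on a regular schedule, so you learn a backup is bad before you need it.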
Good guys, 1 — bad guys, 0.
The exception was often the health-care industry, where lives were at stake and the time required to restore data could cost them. Those entities frequently paid up, and once cybercriminals discovered that, health-care organizations became prime targets.
If you are scratching your head about all the state and city governments that were brought to their knees by ransomware in the last two years, you should know that their backups were not properly engineered. In fact, they were a mess. The cleanup took forever and cost millions of dollars. Many local and state government agencies never understood what constituted properly engineered backups—nor did they budget for it. Even now, they are more likely to get cyber insurance to cover the risk than to adequately address the baseline problems.