Had an idea I want to run by forum regarding ransomware


dsteinschneider

Question

I do small business IT support and I've seen my share of ransomware. I was just thinking it would be hard to recover all the Austin City Limits episodes I've recorded in WMC7. My main remedy for ransomware is backup, but I don't really want to back up 20 TB of TV recordings. I do have them duplicated, so hardware failure is unlikely to be an issue; user error is possible, though, as is ransomware encryption. It would be neat if the duplicated copy in DrivePool were hidden from the copy that shows up in shares, so it couldn't be encrypted by ransomware.

What are people using for backing up large data stores like this?

Thanks,


3 answers to this question


This is just my personal way of doing things, but I likewise have tens of TB of random archive material, where losing something would be really annoying but neither critical nor devastating. For each of a couple of sets of things, I keep two separate pools with one copy on each, instead of letting DrivePool duplicate within a single pool.

(Naturally, more important data is treated differently: it's stored with some combination of multiple drives (or arrays), duplication (or RAID 1), more than one machine, and/or cloud storage, as appropriate.)

There are then some basic top-level folder rules, so that if a drive were to fail it'd be very easy to know what's been lost and needs copying from the sister pool onto a replacement. More usefully, being able to pull a drive and connect it via a USB dock is handy when the capacity needs increasing for a specific subset of data, as the data only has to move once to remove one drive and add the other.

It also means that, short of two drives with identical data failing simultaneously, I'd have to manually delete the same thing twice before anything was really lost. And in your situation, only one of the two pools would be the share.

I also rely heavily on checksums, so if there were any data corruption on a drive I could identify which of the two versions of a file was correct. With the exception of checking things when something's added to a subfolder (before updating that folder's checksum), everything only gets checked through perhaps twice a year.

It's not that I have to sit and hand-crank the thing to check hundreds or thousands of checksums and report any issues, but it's not worth the effort to do it constantly.

Okay, it took a bit of time to settle on an organisational system I was happy with, but it works for me as my version of what you're talking about.

On 3/18/2019 at 12:16 PM, dsteinschneider said:

I've seen my share of ransomware

The most foolproof solution I've found is to move your archival storage to a dedicated machine and create a network share with WORM (write-once, read-many) access, so clients on the network can read the archive but can never modify or encrypt what's already there.

I fooled around with Windows 10's 'Controlled Folder Access' and it was a constant headache, having to add allowances for every new piece of software.

I re-purposed an ageing Gigabyte Brix mini-PC as a headless NAS, and it's powerful enough to run DrivePool and a handful of other web-based tools.

 

 

