Got a question I figure maybe some people in here have thought about. So there's the age-old debate of duplication vs. replication - and I get it.
I've got family photos in a DrivePool with 3x Global Duplication (around 1TB total). Now this is all replicated in the pool.
While this protects against hard drive failures, it does not protect against mistakes. My father could delete the Photo directory and it's effectively game over (yes, there are undelete tools, but let's ignore those for the sake of the argument).
What I am wondering is whether people have any tried-and-tested, minimal-effort solutions to this kind of problem?
My initial ideas:

1. Remove deletion privileges from the main user used to access the NAS. Destructive actions could then only be performed via a dedicated "Deletion" Windows user, with a different login and password.

2. Create a second DrivePool for data I want to designate as "mistake-proof", and use some form of incremental/differential/<something else> backup tool to routinely (every week?) mirror from the original DrivePool to the secondary pool. The idea being that, if files are ever deleted on Pool A, their "ghost" will always live on Pool B (i.e. Pool B should be forever growing, never shrinking).

3. Some sort of hybrid solution between 1 and 2?
The second solution assumes the data rarely changes - which is true in the case of my family photos.
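For what it's worth, the weekly mirror in the second idea can be sketched in a few lines of Python - strictly additive, so nothing on Pool B is ever deleted or overwritten. This is just a sketch under my own assumptions (the pool paths would be placeholders you fill in); robocopy without the /PURGE flag behaves similarly and could be scheduled with Task Scheduler instead:

```python
import shutil
from pathlib import Path

def additive_mirror(src: Path, dst: Path) -> list[Path]:
    """Copy every file under src into dst that dst doesn't already have.

    Nothing on dst is ever deleted or overwritten, so deleting a file on
    the source pool leaves its "ghost" on the destination pool intact.
    (Trade-off: an edited photo won't be re-copied either.)
    """
    copied = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if target.exists():
            continue  # strictly additive: never touch existing copies
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)  # copy2 preserves timestamps
        copied.append(target)
    return copied

# Hypothetical pool paths - substitute your own:
# additive_mirror(Path(r"P:\Photos"), Path(r"Q:\PhotosGhost"))
```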
Now I'm not saying any of my solutions are right - I am still very much in the brainstorming phase. I want my family to have confidence that I can keep their data safe (the DrivePool is around 40TB now, but for this post only 1TB applies to my 'problem') - and that effectively means I need to protect them... from themselves.
And who knows, maybe I'll write the wrong command in the CLI one day and accidentally nuke everything...
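The first idea - stripping delete rights from the everyday user - should be doable with NTFS deny entries via icacls, which would also guard against a fat-fingered CLI command run as that user. A minimal sketch that just builds the command (the pool path and user name are placeholders, and I'm going from memory on the icacls rights letters, so double-check them before relying on this):

```python
def deny_delete_cmd(folder: str, user: str) -> list[str]:
    """Build an icacls command that denies delete rights on a folder tree.

    (OI)(CI) makes the entry inherit to files and subfolders; DE is
    "delete" and DC is "delete child". Deny entries take precedence
    over allow entries for that user.
    """
    return ["icacls", folder, "/deny", f"{user}:(OI)(CI)(DE,DC)"]

# Hypothetical path and user; on Windows you'd run it with
# subprocess.run(cmd, check=True)
cmd = deny_delete_cmd(r"P:\Photos", "FamilyUser")
```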
Question asked by vfsrecycle_kid
Thanks folks!