
fattipants2016
Posts: 80 · Days Won: 2
Reputation Activity
-
fattipants2016 got a reaction from Shane in Cleaning up empty folders?
I run double parity, also. It's 1/2 for peace of mind and 1/2 because I'm a nerd and multiple parity is seriously cool stuff.
Worked great! I've simulated a few scenarios in the past by moving files to another disk, but this was the first time I needed it.
I couldn't leave well enough alone, though, and finally found a solution that works even with write-protected folders.
From an administrator-level command prompt type: "Robocopy C:\Test C:\Test /B /S /Move"
I don't understand exactly why it works, but you're basically moving the folder tree onto itself: /S skips empty folders when copying, and /Move then deletes them from the source.
/B allows Robocopy to properly set attributes and permissions for system folders like Recycle Bin and .covefs. Bad stuff happens if you don't use it.
I wrote a batch file to run this command on all 7 of my DrivePool disks, and it completes in under a minute. Good stuff.
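For anyone who'd rather script the cleanup directly, here's a minimal cross-platform sketch of what the self-move trick accomplishes. This is my own illustration, not the Robocopy method itself: it deletes empty folders bottom-up, but it does not preserve attributes and permissions the way Robocopy /B does, so don't point it at system folders like .covefs.

```python
import os

def remove_empty_dirs(root):
    """Delete any directory under root that ends up empty,
    walking bottom-up so freshly emptied parents are caught too."""
    removed = []
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue  # never remove the drive/poolpart root itself
        if not os.listdir(dirpath):  # empty once children are handled
            os.rmdir(dirpath)
            removed.append(dirpath)
    return removed
```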
-
fattipants2016 reacted to Christopher (Drashna) in Junction points & symbolic links keep being recreated. How to clean reparse points / folder metadata?
Reparse point information is stored in the .covefs folder in the root of the pool. Worst case, delete the link, remove the contents of the .covefs folder, and then reboot.
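Before deleting anything, it can help to see which links actually exist under the pool. A hypothetical helper like this (the name is mine, it's not part of DrivePool) enumerates symbolic links; on Windows, junctions and symlinks are both reparse points, and Python's `os.path.islink` covers symlinks while `os.walk` won't descend into them by default:

```python
import os

def find_links(root):
    """Collect every path under root that is a symbolic link."""
    links = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            p = os.path.join(dirpath, name)
            if os.path.islink(p):
                links.append(p)
    return links
```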
-
fattipants2016 reacted to Shane in NTFS Permissions and DrivePool
Spend long enough working with Windows and you may become familiar with NTFS permissions. As an operating system intended to handle multiple users, Windows maintains records that describe which user owns each file and folder and how much access each user has to those files and folders. These records are kept on the same volumes as those files and folders. Unfortunately, in the course of moving or copying folders and files around, Windows may fail to properly update these settings for a variety of reasons (e.g. bugs, bit errors, power interruptions, failing hardware).
This can leave you with files and folders that you can no longer delete, access or even take ownership of, sometimes with no obvious cause when you check via the Security tab of the file or folder's Properties (they can look fine but actually be broken inside).
So, first up, here’s what the default permissions for a pool drive should look like:
And now here’s what the default permissions for the hidden poolpart folder on any drive added to the pool should look like:
The above are taken from a freshly created pool using a previously unformatted drive, on a computer named CASTLE that is using Windows 10 Professional. I believe it should be the same for all supported versions of Windows so far.
- Any entries that are marked Deny override entries that are marked Allow. There are limited exceptions for SYSTEM.
- It is optional for a hidden poolpart folder to inherit its permissions from its parent drive.
- It is recommended that the Administrators account have Full control of all poolpart folders, subfolders and files.
- It is necessary that the SYSTEM account have Full control of all poolpart folders, subfolders and files.
- The permissions of files and folders in a pool drive are the permissions of those files and folders in the constituent poolpart folders. Caveat: duplicates are expected to have identical permissions (because in normal practice, only DrivePool should be creating them).
My next post in this thread will describe how I go about fixing these permissions when they go bad.
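As a rough illustration of the duplicates-should-match caveat, here's a hedged sketch that compares the permission bits of each copy of a file across a list of poolpart roots. It uses POSIX mode bits as a stand-in for NTFS ACLs (on Windows you'd compare icacls output instead), and the function name is mine:

```python
import os
import stat

def duplicate_modes_match(relpath, poolparts):
    """Return True if every existing copy of relpath across the given
    poolpart roots carries identical permission bits."""
    modes = []
    for root in poolparts:
        p = os.path.join(root, relpath)
        if os.path.exists(p):
            modes.append(stat.S_IMODE(os.stat(p).st_mode))
    return len(set(modes)) <= 1
```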
-
fattipants2016 reacted to Shane in Permissions Confusion?
The poolpart folders do not need to inherit their permissions from their respective volume roots (though they default to doing so on a newly created pool using previously unformatted drives). The SYSTEM account must have full control of the poolpart folder, subfolders and files. The Administrators account is recommended to have full control of same. For more details, I have just created this thread.
-
fattipants2016 got a reaction from silk in Manualy delete duplicates?
Each physical disk that's part of the pool contains a hidden folder named with a unique identifier.
Inside these folders is the same folder structure as the pool itself.
Your duplicated files and folders simply exist on the appropriate number of disks; they're not actually marked as duplicates in any way.
If files are now duplicated that shouldn't be, it may be enough to simply re-check duplication.
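The upshot is that you can locate duplicates yourself just by comparing relative paths across the hidden poolpart folders. A minimal sketch under that assumption (the names are mine, not part of DrivePool):

```python
import os

def find_duplicates(poolparts):
    """Map each relative path to the poolpart roots holding a copy;
    paths found under more than one root are the duplicates."""
    seen = {}
    for root in poolparts:
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                rel = os.path.relpath(os.path.join(dirpath, name), root)
                seen.setdefault(rel, []).append(root)
    return {rel: roots for rel, roots in seen.items() if len(roots) > 1}
```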
-
fattipants2016 got a reaction from Christopher (Drashna) in Integrate Snapraid with Drivepool - into the product
You can run snapraidhelper (on CodePlex) as a scheduled task to test, sync, scrub and e-mail the results on a simple schedule.
If you like, you can even use the "running file" drivepool optionally creates while balancing to trigger it. Check my post history.
-
fattipants2016 got a reaction from bob19 in Integrate Snapraid with Drivepool - into the product
-
fattipants2016 got a reaction from Jaga in Scheduled balancing >24 hours apart?
I only have experience with your second question. If you figure out Q1 you could probably just use it to trigger my solution to Q2.
1.) Enable DrivePool's runningfile option.
2.) Use bigteddy's FileSystemWatcher script (available on TechNet) to monitor for the removal of the runningfile you've configured, and write an event.
3.) Use the event-log entry you set up to trigger SnapRAID via a scheduled task.
(In a nutshell, DP will create a dummy file while it's balancing and remove it when it's done. You can use the removal of this file to trigger SR.)
Some notes:
1. ) the filesystem watcher eats up some i/o, so I still recommend you schedule it and define a max. runtime
- If you let it run all the time, it will also trigger snapraid every time DP does any routine checks, not just balancing
2.) I recommend configuring snapraid-helper (from CodePlex) rather than calling SnapRAID from the command line
- it will check for a user-defined number of missing files prior to sync, and e-mail you so you can decide what to do
- you can also have it email you with a list of added / removed / updated / restored files after every sync if you so desire.
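If you'd rather not wire up the FileSystemWatcher script, the same "trigger on removal of the running file" idea can be sketched with a simple poll. This is my own stand-in, not bigteddy's script; the function and parameter names are hypothetical:

```python
import os
import time

def wait_for_removal(path, action, poll=1.0, timeout=None):
    """Poll until the DrivePool running file disappears, then call
    action() (e.g. kick off a SnapRAID sync). Returns True if action
    was triggered, False if timeout expired first."""
    start = time.monotonic()
    while os.path.exists(path):
        if timeout is not None and time.monotonic() - start > timeout:
            return False  # give up; file is still there
        time.sleep(poll)
    action()
    return True
```

Defining a timeout mirrors the "schedule it and define a max runtime" advice above, so the watcher doesn't sit eating I/O forever.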
I'd never touched PowerShell prior to configuring the scenario above, and now I use it for all kinds of cool stuff. It's worth giving it a go.
I made quite a few posts here while trying to get it working; they might be useful.