
[HOWTO] File Location Catalog


Quinn

Question

I've been seeing quite a few requests about knowing which files are on which drives, in case unduplicated files ever need to be recovered.  I know dpcmd.exe has some functionality for listing all files and their locations, but I wanted something I could "tweak" a little better to my needs, so I created a PowerShell script to get exactly what I need.  I decided on PowerShell, as it allows me to do just about ANYTHING I can imagine, given enough logic.  Feel free to use this, or let me know if it would be more helpful "tweaked" a different way...

 

Prerequisites:

 

  1. You gotta know PowerShell (or be interested in learning a little bit of it, anyway) :)
  2. All of your DrivePool drives need to be mounted as a path (I chose to mount all drives as C:\DrivePool\{disk name})
  3. Your computer must be able to run PowerShell scripts (I set my execution policy to 'RemoteSigned')

I have this PowerShell script set to run each day at 3am, and it generates a .csv file that I can use to sort/filter all of the results.  Need to know what files were on drive A? Done.  Need to know which drives are holding all of the files in your Movies folder? Done.  Your imagination is the limit.
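
For reference, one way to set up that kind of daily 3am task from an elevated PowerShell prompt looks roughly like this (the script path and task name are just examples, and the ScheduledTasks cmdlets need Windows 8 / Server 2012 or newer):

# Example path/name only - point this at wherever you saved the script
$script = 'C:\Scripts\DPFileList.ps1'

# Run the catalog script every day at 3:00 AM under the SYSTEM account
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument "-ExecutionPolicy RemoteSigned -File `"$script`""
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'DrivePool File Catalog' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest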

 

Here is a screenshot of the .CSV file it generates, showing the location of all of the files in a particular directory (as an example):

 

[Screenshot attachment: post-1373-0-10480200-1458673106_thumb.png]

 

Here is the code I used (it's also attached in the .zip file):

# This saves the full listing of files in DrivePool
$files = Get-ChildItem -Path C:\DrivePool -Recurse -Force | where {!$_.PsIsContainer}

# This creates an empty table to store details of the files
$filelist = @()

# This goes through each file, and populates the table with the drive name, file name and directory name.
# Note: the Substring offsets below are specific to my mount layout (C:\DrivePool\{disk name}); adjust them to match your own path lengths.
foreach ($file in $files)
    {
    $filelist += New-Object psobject -Property @{Drive=$(($file.DirectoryName).Substring(13,5));FileName=$($file.Name);DirectoryName=$(($file.DirectoryName).Substring(64))}
    }

# This saves the table to a .csv file so it can be opened later on, sorted, filtered, etc.
$filelist | Export-CSV F:\DPFileList.csv -NoTypeInformation

Let me know if there is interest in this, if you have any questions on how to get this going on your system, or if you'd like any clarification of the above.

 

Hope it helps!

 

-Quinn

 

 

gj80 has written a further improvement to this script:

 

DPFileList.zip

And B00ze has further improved the script (Win7 fixes):

DrivePool-Generate-CSV-Log-V1.60.zip

 

22 hours ago, APoolTodayKeepsTheNasAway said:

Is there any way this could be used to restore only the missing files in the event of a drive failure?

Good day.

Of course, this is kinda the whole point. Do you have Excel? You can load a CSV from before the loss of a drive and a CSV from after the loss, and compare them. Excel has a function called VLOOKUP: load both files into the same workbook as sheets, add a column to one of them, and VLOOKUP the file paths in that one against the paths in the other; whatever's missing is what you've lost. You could set up conditional highlighting to do the same thing (I think.) Once you've got a list in Excel, you can sort it and then copy/paste it into a text file. You can then automate the recovery by writing a small batch script that reads the text file and copies the missing files from backup back onto the pool.

If you do not use duplication at all, then it's even easier: just sort by disk, and whatever was on the lost drive is what you need to recover.
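
If you'd rather script the comparison than do it in Excel, here's a rough PowerShell sketch of the same idea (it assumes CSVs produced by the script in the first post, with its FileName/DirectoryName columns; the file names below are just examples):

# Hypothetical file names - point these at a catalog from before and after the failure
$before = Import-Csv 'F:\DPFileList-before.csv'
$after  = Import-Csv 'F:\DPFileList-after.csv'

# Build a lookup of full paths that still exist, then list everything that's gone
$stillThere = @{}
foreach ($row in $after) { $stillThere["$($row.DirectoryName)\$($row.FileName)"] = $true }

$missing = $before | Where-Object { -not $stillThere["$($_.DirectoryName)\$($_.FileName)"] }
$missing | Export-Csv 'F:\DPMissingFiles.csv' -NoTypeInformation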

Regards,


Nice. Personally I just run tree to a .txt file in the Google Drive folder, but I script it like yours.

 

Edited because I thought I might as well share what is in my script, in case anyone prefers text files.  I have that folder regularly backed up to Google Drive.  I happen to like the format, but yeah, the fancier format allows some nice things too.  I run it weekly, as an 80 MB file is not insignificant.

 

I just do the folders currently, because otherwise it becomes huge.  I've been thinking about moving it off the SSD though, and if I do I will make it produce two files, one with the files included, because trying to go through the entire structure with files is a bit intimidating honestly and hard to glance through.

 

# tree lists folders only by default; add /F after the path to include files as well
$date = (Get-Date -format "yyyy-MM-dd")
tree C:\Drives > "C:\Users\Administrator\Documents\Trees\trees-$date.txt"

Big fan of this, and certainly what I needed after a failed drive refused to give up the last 8GB or so of data after it was evac-ed.

 

Only problem I saw was that the file/folder paths were too long? Basically line after line of essentially 'path too long'.

 

Without redoing the whole directory structure, is there another way around this?


wolfpac,

 

If the cause is what I think it is (your folder/file name paths are over 255-260 characters long), there may not be a lot that can be done, except to fix that issue directly by renaming the file names/folders so they are shorter.

 

You could use a utility like Bulk Rename Utility (http://www.bulkrenameutility.co.uk/Main_Intro.php) to rename files (if there are too many to do manually).  You could also use the 'subst' command to temporarily create a drive letter shortcut to the file path, then use that drive letter to move/adjust as necessary.
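
For example (the folder path below is only an illustration):

# Map a deep folder to a temporary drive letter, work on it, then remove the mapping
subst X: "C:\DrivePool\Disk1\Some\Very\Deep\Folder"
# ...rename/move files under X:\ as needed...
subst X: /D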

 

Hope that helps...maybe someone else can chime in if there's a better way?

 

-Quinn


Big fan of this, and certainly what I needed after a failed drive refused to give up the last 8GB or so of data after it was evac-ed.

 

Only problem I saw was that the file/folder paths were too long? Basically line after line of essentially 'path too long'.

 

Without redoing the whole directory structure, is there another way around this?

Were you seeing the "path too long" in the DrivePool UI/log? 


Just a warning about something that got me recently - using -Recurse on Get-ChildItem makes it follow any symbolic links as well, in case you have any. It might even follow through to .lnk file targets; I haven't tested that.
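
For anyone who wants to guard against that, here is a rough sketch of one way to skip reparse points while enumerating (it assumes the same C:\DrivePool layout as the first post):

# Recurse manually so directories that are reparse points (symlinks/junctions) are skipped
function Get-PoolFiles {
    param([string]$Path)
    foreach ($item in Get-ChildItem -LiteralPath $Path -Force) {
        if ($item.PSIsContainer) {
            if (-not ($item.Attributes -band [IO.FileAttributes]::ReparsePoint)) {
                Get-PoolFiles -Path $item.FullName
            }
        }
        else {
            $item   # plain files only
        }
    }
}

# Usage: $files = Get-PoolFiles -Path 'C:\DrivePool'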

 

symbolic links on the underlying disks, or on the pool itself?


Hi,

 

I really liked the idea of doing this, but as others had also mentioned, I had issues with the maximum path limitation. I also wanted to log other information, like the drive model, serial number, etc. I also wanted to schedule it as a daily task, and to have it automatically compress the resulting files to save disk space.

 

I wrote a powershell script to do all of that. It relies on the dpcmd utility's output (for which you will need a recent version). DrivePool itself isn't limited by the maximum path limitation, and thus the dpcmd log output also isn't constrained by the path limitation. The script takes this log output, parses it with regex matches, retrieves associated disk information, and then writes out to a CSV. It then compresses it. The header of the file has a command you can paste into a CMD prompt to automatically schedule it to run every day at 1AM.
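
To give a rough idea of the parsing (this is illustrative, not the exact regex from the script), each file-part line that dpcmd prints can be pulled apart along these lines:

# Illustrative only - the shape of a dpcmd file-part line
$line = '-> \Device\HarddiskVolume22\PoolPart.1408546a-e189-4f79-b03a-f66c1d241de3\ServerFolders\Backup\Capture.PNG [Device 16]'

if ($line -match '->\s+\\Device\\HarddiskVolume\d+\\PoolPart\.[^\\]+\\(?<path>.+)\s\[Device (?<device>\d+)\]$') {
    $relativePath = $Matches['path']     # path relative to the pool root
    $deviceNumber = $Matches['device']   # disk number as shown in Disk Management
}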

 

Please edit the file. Two variables need to be customized before using it, and the file describes requirements, where to put it, how to schedule it, what it logs, etc.

 

If you want to do a test run, you can just edit the two variables and then copy/paste the entire file into an elevated powershell window. The .CSV (generated once the .LOG is finished) can be viewed as it's being produced.

 

Also, if you're not familiar with PowerShell scripting/programming in general, you might want to hold off playing with this until a few other people report that they're making use of it without any issues.

 

@Christopher - If you want me to make this a separate post, just let me know. Thanks

 

DrivePool-Generate-Log-V1.51.zip

 


gj80,

 

Nice solution to the path issue!  I also like the idea of creating an "archive" location for past reports and zipping them up to save on HDD space.  Since I now know there is at least some interest in keeping this going, what do you think of bringing this to Phase Two: a GUI?  I've been meaning to get going on it for quite a while, but have had no time...

 

Here's my idea:

 

GUI with three tabs

 

Tab 1 (Drive Contents):

  • Drop-Down List of all CSV reports saved to archive folder (pick your date/time)
  • Drop-Down List of all drives identified from the above report (pick your drive, maybe the one that died last night?)
  • Submit button, which shows a filtered list from the CSV of all files, locations, sizes, etc. from that drive.
  • Export button, to create a CSV of just those findings (this is your restore list, if needed)

Tab 2 (File Locations):

  • Drop-Down List of all CSV reports saved to archive folder (pick your date/time)
  • Search Bar (type in a filename, and it auto populates in a results box underneath) - select the file, then -
  • Submit button, which shows the file information and known drives it was found on

Tab 3 (Log Configuration):

  • Input Box for entering in a log folder path
  • Input Box for entering the DrivePool drive letter
  • Scheduling Input/Drop-Downs to select when/how often to run the log
  • Drop-Down List of how many logs to keep (maybe two weeks' worth?)
  • Submit button, which creates the scheduled task based on the parameters above

Everything "under the hood" would derive from PowerShell and/or DPCMD...

 

Any thoughts or interest?  I know this can be done with WinForms, but I have had no time to teach myself such things :)

 

-Quinn


It certainly sounds good. I'm not sure if it'll be worth us putting in the time required as end-users, though... If it were built into the application itself, it'd provide more utility to the entire userbase, but as end-users writing an accessory utility, only a small portion of the overall userbase will end up using it. And, being honest, I'm sure not everyone is even as paranoid about all of this as we are :)

 

The only UI work I've done is in AutoIT, but from my limited experience with it, doing those first two tabs would be a lot of work. UIs, at least in AutoIT, are kind of a pain. Or, maybe I'm just less comfortable/experienced with it. Opening the CSVs directly and filtering on the headers with a spreadsheet program are probably things that anyone who would be using an accessory utility like this would be comfortable doing. I could be wrong though...

 

The third tab (and a UI for an installer which would put the file in a fixed path, run the task scheduler command, etc) would probably bring more utility to people for the time invested, though - I know some won't feel comfortable editing a script file, even to just modify a few variables. Even that would still probably be a fair amount of work, though... I work with powershell, regex stuff, etc a fair amount, so that was pretty easy. I'm not comfortable with trying to tackle a UI for it personally. If you feel inspired, though, then go for it! :)

 

The idea about retention is a good one - I think I'll update the script to add that as a variable and a routine to clean up old files.

 

Edit - I added a day-based retention setting and added the updated "V1.1" attachment to the post above.
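
Roughly the shape of that cleanup routine (the folder path and retention value here are placeholders, not necessarily what the script uses):

# Delete archived logs older than the retention window
$ArchiveFolder = 'D:\DPAuditLogs'   # placeholder path
$RetentionDays = 14                 # placeholder value

Get-ChildItem -Path $ArchiveFolder -Filter '*.zip' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$RetentionDays) } |
    Remove-Item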


OK boys, it's running on my 2012 R2 Essentials.

 

I changed a couple of paths so I could put it on my D drive, and have created a separate directory for the logs.

 

I have initiated it from within Task Scheduler and it's off to the races.

 

Now it has 40+ TB to run through (duplicated pool) which is balancing at present - so it will be interesting to see if I get files in more than one location in the log :)

 

Also, will I get two instances at 1:00, as I guess it will not have finished by then? :)

 

Couple of thoughts

 

1. Yes, a UI would be nice but not essential.

2. More pressing with large pools is the size of the file it's going to create - I'm thinking of putting the info into a free DB program, e.g. SQLite - or something more powerful, depending on performance - rather than using Excel.

3. Some sort of summary file at the end - or one that's generated from the DB.

4. I've not looked at what's required, but there is an API for doing plugins for DrivePool - it could be integrated that way perhaps?

 

Will report back when the machine breaks or finishes  :P

 

I have experience of the above - not a big task, it just needs a bit of time to code it up.


Well, it's generated a 150 MB file in 20 minutes, so not huge, but it has a million or more files to run through  :blink:

 

Did your tests have the full UNC path including the volume name? I don't have my drives lettered, as I have more than 26.

 

@gj80 also, will the CSV have the UNC path, or will it drop that to device 1, 2, 3 etc.?


 

Also, will I get two instances at 1:00, as I guess it will not have finished by then?

 

Since you initiated it via Task Scheduler, it shouldn't run a second time if it's still running.

 

I've done some work with SQLite via powershell before. It definitely allows for more flexibility and performance, and I'm comfortable running SQL queries, but without a UI, I think not being able to just open a CSV and filter in excel would limit the usefulness for a lot of people... it certainly might be a good alternate "mode" to have, though. Worst case, there's probably an easy way to convert a CSV over to a SQLite DB with one table.

 

It will definitely be interesting to see how this turns out for you, since it seems like you have an extremely large pool. If you get the resulting CSV open in excel when it's done, please jump to the end and let us know the number of rows, and how long it took to produce it (the difference between the timestamp on the "DPAuditLog-Current.log" and "DPAuditLog-Current.csv") along with the CSV file size (and the .ZIP file size). Thanks!

Well, it's generated a 150 MB file in 20 minutes, so not huge, but it has a million or more files to run through  :blink:

 

Did your tests have the full UNC path including the volume name? I don't have my drives lettered, as I have more than 26.

 

@gj80 also, will the CSV have the UNC path, or will it drop that to device 1, 2, 3 etc.?

 

It doesn't rely on having any of the drives mounted. The path it records in the CSV is relative to the base of the pool, like: "BACKUPS\365\CLIENT\data\filename.file"

 

It then records, per line, the disk number (what you see in disk management), along with the disk model and serial number (to help physically identify it) and the file size in bytes.

 

...I guess one way of cutting down on the size would be to just record the disk number, and then record the associated disk models and serial numbers for each number as a separate file that is zipped up together with the CSV.


Not seeing any disk models or serial numbers in the output.

 

Volume, poolpart and device number though - is that what you meant?



dpcmd - StableBit DrivePool command line interface

Version 2.2.0.734

Detail level: File Parts

Listing types:

  + Directory
  - File
  -> File part
  * Inconsistent duplication
  ! Error

Listing format:

  [{0}/{1} IM] {2}
    {0} - The number of file parts that were found for this file / directory.
    {1} - The expected duplication count for this file / directory.
    I   - This directory is inheriting its duplication count from its parent.
    M   - At least one sub-directory may have a different duplication count.
    {2} - The name and size of this file / directory.

Scanning...

+ [28x/2x M] E:\
  -> \Device\HarddiskVolume1\PoolPart.24a10d83-5eb3-45d8-a126-fb04f6e3d1a2\ [Device 0]
  -> \Device\HarddiskVolume2\PoolPart.09244e60-99ae-4044-867c-c729d26ea73b\ [Device 1]
  -> \Device\HarddiskVolume3\PoolPart.4e8eb030-e003-4345-808d-8c916b0631ab\ [Device 2]
  -> \Device\HarddiskVolume4\PoolPart.ba54be14-6f03-488e-b8df-92623fba5e0a\ [Device 3]
  -> \Device\HarddiskVolume5\PoolPart.c3a46a7c-287f-4dd8-85c6-0059c52a227a\ [Device 4]
  -> \Device\HarddiskVolume6\PoolPart.a22c77ba-1cb8-4733-8d03-7f6782da4241\ [Device 5]
  -> \Device\HarddiskVolume7\PoolPart.0aed21a3-07d5-4b20-adfc-385626021e54\ [Device 6]
  -> \Device\HarddiskVolume8\PoolPart.711da57f-1c69-445c-b925-b2de11d9311b\ [Device 7]
  -> \Device\HarddiskVolume9\PoolPart.158beb49-b7f2-4635-97ff-e51bedbc6a0f\ [Device 8]
  -> \Device\HarddiskVolume10\PoolPart.5a8a5b05-62c8-4c91-9d04-6e69420e1e5b\ [Device 9]
  -> \Device\HarddiskVolume11\PoolPart.2613d588-9056-43f9-8410-31c4f267e2ca\ [Device 10]
  -> \Device\HarddiskVolume13\PoolPart.0486ff3a-604f-4c04-ada8-d7d2851cf13f\ [Device 11]
  -> \Device\HarddiskVolume15\PoolPart.bd847b99-74f9-4723-bed8-aba3a7739d93\ [Device 12]
  -> \Device\HarddiskVolume17\PoolPart.23cd6461-b6c9-4bff-b566-3ac9282ca68a\ [Device 13]
  -> \Device\HarddiskVolume19\PoolPart.f118843a-fc50-4494-bb97-7f31ce6565b0\ [Device 14]
  -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ [Device 15]
  -> \Device\HarddiskVolume22\PoolPart.1408546a-e189-4f79-b03a-f66c1d241de3\ [Device 16]
  -> \Device\HarddiskVolume23\PoolPart.272e4bde-7f23-4bbb-9981-2d3123b86381\ [Device 17]
  -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ [Device 18]
  -> \Device\HarddiskVolume25\PoolPart.656efc36-5b4f-417c-a7c7-e811697d3506\ [Device 19]
  -> \Device\HarddiskVolume27\PoolPart.e95092e8-fbdf-4701-9e2e-33d190be8786\ [Device 20]
  -> \Device\HarddiskVolume29\PoolPart.f6a548a2-8d41-4144-8537-1909ce861df1\ [Device 21]
  -> \Device\HarddiskVolume31\PoolPart.242cea5e-859c-4959-ae4a-0bf7e255623c\ [Device 22]
  -> \Device\HarddiskVolume33\PoolPart.640e49c6-45ca-4768-8544-0bfb3fbb1277\ [Device 23]
  -> \Device\HarddiskVolume39\PoolPart.13adb443-d50e-413a-bc4a-d251a7e68463\ [Device 25]
  -> \Device\HarddiskVolume40\PoolPart.2680be39-1dbe-4451-8c57-1a0996a98aa5\ [Device 26]
  -> \Device\HarddiskVolume41\PoolPart.aa108639-369f-42a7-9aef-cf6573cdfd81\ [Device 27]
  -> \Device\HarddiskVolume42\PoolPart.6bd8f30d-cc74-4355-ad74-e1afbdeed0d6\ [Device 28]
+ [28x/2x I] E:\ServerFolders
  -> \Device\HarddiskVolume1\PoolPart.24a10d83-5eb3-45d8-a126-fb04f6e3d1a2\ServerFolders [Device 0]
  -> \Device\HarddiskVolume2\PoolPart.09244e60-99ae-4044-867c-c729d26ea73b\ServerFolders [Device 1]
  -> \Device\HarddiskVolume3\PoolPart.4e8eb030-e003-4345-808d-8c916b0631ab\ServerFolders [Device 2]
  -> \Device\HarddiskVolume4\PoolPart.ba54be14-6f03-488e-b8df-92623fba5e0a\ServerFolders [Device 3]
  -> \Device\HarddiskVolume5\PoolPart.c3a46a7c-287f-4dd8-85c6-0059c52a227a\ServerFolders [Device 4]
  -> \Device\HarddiskVolume6\PoolPart.a22c77ba-1cb8-4733-8d03-7f6782da4241\ServerFolders [Device 5]
  -> \Device\HarddiskVolume7\PoolPart.0aed21a3-07d5-4b20-adfc-385626021e54\ServerFolders [Device 6]
  -> \Device\HarddiskVolume8\PoolPart.711da57f-1c69-445c-b925-b2de11d9311b\ServerFolders [Device 7]
  -> \Device\HarddiskVolume9\PoolPart.158beb49-b7f2-4635-97ff-e51bedbc6a0f\ServerFolders [Device 8]
  -> \Device\HarddiskVolume10\PoolPart.5a8a5b05-62c8-4c91-9d04-6e69420e1e5b\ServerFolders [Device 9]
  -> \Device\HarddiskVolume11\PoolPart.2613d588-9056-43f9-8410-31c4f267e2ca\ServerFolders [Device 10]
  -> \Device\HarddiskVolume13\PoolPart.0486ff3a-604f-4c04-ada8-d7d2851cf13f\ServerFolders [Device 11]
  -> \Device\HarddiskVolume15\PoolPart.bd847b99-74f9-4723-bed8-aba3a7739d93\ServerFolders [Device 12]
  -> \Device\HarddiskVolume17\PoolPart.23cd6461-b6c9-4bff-b566-3ac9282ca68a\ServerFolders [Device 13]
  -> \Device\HarddiskVolume19\PoolPart.f118843a-fc50-4494-bb97-7f31ce6565b0\ServerFolders [Device 14]
  -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders [Device 15]
  -> \Device\HarddiskVolume22\PoolPart.1408546a-e189-4f79-b03a-f66c1d241de3\ServerFolders [Device 16]
  -> \Device\HarddiskVolume23\PoolPart.272e4bde-7f23-4bbb-9981-2d3123b86381\ServerFolders [Device 17]
  -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ServerFolders [Device 18]
  -> \Device\HarddiskVolume25\PoolPart.656efc36-5b4f-417c-a7c7-e811697d3506\ServerFolders [Device 19]
  -> \Device\HarddiskVolume27\PoolPart.e95092e8-fbdf-4701-9e2e-33d190be8786\ServerFolders [Device 20]
  -> \Device\HarddiskVolume29\PoolPart.f6a548a2-8d41-4144-8537-1909ce861df1\ServerFolders [Device 21]
  -> \Device\HarddiskVolume31\PoolPart.242cea5e-859c-4959-ae4a-0bf7e255623c\ServerFolders [Device 22]
  -> \Device\HarddiskVolume33\PoolPart.640e49c6-45ca-4768-8544-0bfb3fbb1277\ServerFolders [Device 23]
  -> \Device\HarddiskVolume39\PoolPart.13adb443-d50e-413a-bc4a-d251a7e68463\ServerFolders [Device 25]
  -> \Device\HarddiskVolume40\PoolPart.2680be39-1dbe-4451-8c57-1a0996a98aa5\ServerFolders [Device 26]
  -> \Device\HarddiskVolume41\PoolPart.aa108639-369f-42a7-9aef-cf6573cdfd81\ServerFolders [Device 27]
  -> \Device\HarddiskVolume42\PoolPart.6bd8f30d-cc74-4355-ad74-e1afbdeed0d6\ServerFolders [Device 28]
+ [25x/2x I] E:\ServerFolders\Backup
  -> \Device\HarddiskVolume1\PoolPart.24a10d83-5eb3-45d8-a126-fb04f6e3d1a2\ServerFolders\Backup [Device 0]
  -> \Device\HarddiskVolume3\PoolPart.4e8eb030-e003-4345-808d-8c916b0631ab\ServerFolders\Backup [Device 2]
  -> \Device\HarddiskVolume4\PoolPart.ba54be14-6f03-488e-b8df-92623fba5e0a\ServerFolders\Backup [Device 3]
  -> \Device\HarddiskVolume5\PoolPart.c3a46a7c-287f-4dd8-85c6-0059c52a227a\ServerFolders\Backup [Device 4]
  -> \Device\HarddiskVolume6\PoolPart.a22c77ba-1cb8-4733-8d03-7f6782da4241\ServerFolders\Backup [Device 5]
  -> \Device\HarddiskVolume7\PoolPart.0aed21a3-07d5-4b20-adfc-385626021e54\ServerFolders\Backup [Device 6]
  -> \Device\HarddiskVolume8\PoolPart.711da57f-1c69-445c-b925-b2de11d9311b\ServerFolders\Backup [Device 7]
  -> \Device\HarddiskVolume9\PoolPart.158beb49-b7f2-4635-97ff-e51bedbc6a0f\ServerFolders\Backup [Device 8]
  -> \Device\HarddiskVolume10\PoolPart.5a8a5b05-62c8-4c91-9d04-6e69420e1e5b\ServerFolders\Backup [Device 9]
  -> \Device\HarddiskVolume15\PoolPart.bd847b99-74f9-4723-bed8-aba3a7739d93\ServerFolders\Backup [Device 12]
  -> \Device\HarddiskVolume17\PoolPart.23cd6461-b6c9-4bff-b566-3ac9282ca68a\ServerFolders\Backup [Device 13]
  -> \Device\HarddiskVolume19\PoolPart.f118843a-fc50-4494-bb97-7f31ce6565b0\ServerFolders\Backup [Device 14]
  -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup [Device 15]
  -> \Device\HarddiskVolume22\PoolPart.1408546a-e189-4f79-b03a-f66c1d241de3\ServerFolders\Backup [Device 16]
  -> \Device\HarddiskVolume23\PoolPart.272e4bde-7f23-4bbb-9981-2d3123b86381\ServerFolders\Backup [Device 17]
  -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ServerFolders\Backup [Device 18]
  -> \Device\HarddiskVolume25\PoolPart.656efc36-5b4f-417c-a7c7-e811697d3506\ServerFolders\Backup [Device 19]
  -> \Device\HarddiskVolume27\PoolPart.e95092e8-fbdf-4701-9e2e-33d190be8786\ServerFolders\Backup [Device 20]
  -> \Device\HarddiskVolume29\PoolPart.f6a548a2-8d41-4144-8537-1909ce861df1\ServerFolders\Backup [Device 21]
  -> \Device\HarddiskVolume31\PoolPart.242cea5e-859c-4959-ae4a-0bf7e255623c\ServerFolders\Backup [Device 22]
  -> \Device\HarddiskVolume33\PoolPart.640e49c6-45ca-4768-8544-0bfb3fbb1277\ServerFolders\Backup [Device 23]
  -> \Device\HarddiskVolume39\PoolPart.13adb443-d50e-413a-bc4a-d251a7e68463\ServerFolders\Backup [Device 25]
  -> \Device\HarddiskVolume40\PoolPart.2680be39-1dbe-4451-8c57-1a0996a98aa5\ServerFolders\Backup [Device 26]
  -> \Device\HarddiskVolume41\PoolPart.aa108639-369f-42a7-9aef-cf6573cdfd81\ServerFolders\Backup [Device 27]
  -> \Device\HarddiskVolume42\PoolPart.6bd8f30d-cc74-4355-ad74-e1afbdeed0d6\ServerFolders\Backup [Device 28]
    - [2x/2x] E:\ServerFolders\Backup\Capture.PNG (126 KB - 129,380 B)
      -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup\Capture.PNG [Device 15]
      -> \Device\HarddiskVolume22\PoolPart.1408546a-e189-4f79-b03a-f66c1d241de3\ServerFolders\Backup\Capture.PNG [Device 16]
    - [2x/2x] E:\ServerFolders\Backup\Capture2.PNG (54.2 KB - 55,484 B)
      -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup\Capture2.PNG [Device 15]
      -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ServerFolders\Backup\Capture2.PNG [Device 18]
    - [2x/2x] E:\ServerFolders\Backup\Capture3.PNG (6.11 KB - 6,261 B)
      -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup\Capture3.PNG [Device 15]
      -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ServerFolders\Backup\Capture3.PNG [Device 18]
    - [2x/2x] E:\ServerFolders\Backup\Capture4.PNG (8.58 KB - 8,784 B)
      -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup\Capture4.PNG [Device 15]
      -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ServerFolders\Backup\Capture4.PNG [Device 18]
    - [2x/2x] E:\ServerFolders\Backup\Capture5.PNG (40.7 KB - 41,643 B)
      -> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup\Capture5.PNG [Device 15]
      -> \Device\HarddiskVolume24\PoolPart.b1d8fa8c-1210-40c0-b3d3-91c79ac81084\ServerFolders\Backup\Capture5.PNG [Device 18]

 

Not seeing any disk models or serial numbers in the output.

 

Volume, poolpart and device number though - is that what you meant?

 

The script takes the trailing "[Device XX]" from the dpcmd output, matches it up to the "DeviceId" property from "Get-PhysicalDisk" in powershell, and then pulls the disk's serial numbers and models from other corresponding properties. So those are added to the CSV (which the script produces once the dpcmd output is finished to the .LOG file), rather than being logged directly by dpcmd.
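
In outline, the lookup works roughly like this (paraphrased, not the script's exact code):

# Build a lookup table: disk number -> friendly name plus serial number
# (FriendlyName and SerialNumber are standard Get-PhysicalDisk properties)
$diskInfo = @{}
foreach ($disk in Get-PhysicalDisk) {
    $diskInfo[[string]$disk.DeviceId] = '{0} ({1})' -f $disk.FriendlyName, $disk.SerialNumber
}

# Pull the trailing device number off a dpcmd line and look it up
$line = '-> \Device\HarddiskVolume21\PoolPart.00baa63a-cf5f-4ed2-af44-c19624546738\ServerFolders\Backup\Capture.PNG [Device 15]'
if ($line -match '\[Device (\d+)\]$') {
    $diskDescription = $diskInfo[$Matches[1]]
}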

 

You'll know the dpcmd portion is finished when you don't see it running any longer in task manager. At that point, you could stop the task, and you should have an incomplete .CSV to examine (if you don't want to wait until it's completely finished to check the format/etc).

Summary:

  Directories: (14,957)

  Files: (719,212) 21.8 TB (23,990,698,777,117 B)

  File parts: (1,438,423) 43.6 TB (47,979,629,929,044 B)

 

  * Inconsistent directories: 0

  * Inconsistent files: 1

  * Missing file parts: 1 1.65 GB (1,767,625,190 B)

 

  ! Error reading directories: 1

  ! Error reading files: 0

 

Error was sys vol info 

 

Produced a 706 MB file in approx. 1h 25min.

 

I think PowerShell is not liking the task you have set it, though - it keeps climbing to 1.5 GB of memory, falling back to 200 MB, and climbing to 1.5 GB and back to 200 MB again - I've watched it do this several times already. It's only taking about 6% of CPU though - I'll leave it for a while to see if it's actually doing anything.

Well, I got bored waiting and killed the PowerShell process - it was just cycling between high and low memory every 20 seconds or so.

 

I guess it was doing something, but very slowly.

 

Could you change your script to write as it goes, for testing, so I can see if it's doing what we expect?


Well, I got bored waiting and killed the PowerShell process - it was just cycling between high and low memory every 20 seconds or so.

 

I guess it was doing something, but very slowly.

 

Could you change your script to write as it goes, for testing, so I can see if it's doing what we expect?

 

Odd - I watched for any memory issues on my end, but usage only seemed to grow slightly as the array in memory increased. Maybe PowerShell is handling memory allocation differently between our PowerShell versions.
 
Anyway, I switched it over to use a streamwriter method instead. I guess I should have done that to begin with. Now that I'm avoiding constructing objects for each line in an array, it's gone from 50 lines per second to over 700. No memory buildup should happen now, either. Plus, the .CSV file can now be read while it's being written.
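
For anyone curious, the streamwriter approach is roughly this shape (paths are placeholders and the parsing step is elided):

# Write CSV rows as they are produced instead of building a big array of objects in memory
$writer = New-Object System.IO.StreamWriter('D:\DPAuditLogs\DPAuditLog-Current.csv')  # placeholder path
$writer.WriteLine('"Path","Device","Size"')   # header row

foreach ($line in [System.IO.File]::ReadLines('D:\DPAuditLogs\DPAuditLog-Current.log')) {
    # ...match $line against the dpcmd regex, then write one CSV row straight to disk...
    # $writer.WriteLine('"{0}","{1}","{2}"' -f $path, $device, $size)
}

$writer.Close()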
 
I've updated the attachment above to V1.2

Hi

 

Well, it ran all the way through - I left it running. It took slightly longer to create the CSV than the log took to create??? 1h 25m vs 1h 33m.

 

It needs a couple of bug fixes:

1. It only lists one file location per file - it should have 2, as I have 2x duplication and a few files have 3x.

2. It does not list directory locations - useful if you want to find all your files and copy them to a new disk, etc.

3. The zip does not work - empty file.

 

The total number of lines appears to match the file totals.

 

Going to look at some alternative ways of processing the log file :)


 

1. It only lists one file location per file - it should have 2, as I have 2x duplication and a few files have 3x.

 

The idea is that you'd sort by the path column in Excel or another spreadsheet program, and then you'd see all the disks that a particular file resides on. I did only test this on a pool without duplication, though... I just thought the "dpcmd" output was going to work like that, since it listed specific disks associated with each file line by line. Did you sort by that column in Excel and still only see one copy of each file listed?

Edit: Oh! My bad... I just looked at the example output you posted earlier and saw how it's handling duplication. I'll have to revise the script. Thanks for catching that.

 

 

2. It does not list directory locations - useful if you want to find all your files and copy them to a new disk, etc.

 

You mean that the path doesn't start with the drive letter?

 

 

 

3. The zip does not work - empty file.

 

I thought this might not work for some... can you let me know:

* What version of Windows you're using

* What version of PowerShell you're using (paste $PSVersionTable.PSVersion in a PowerShell window)

* What's the newest version of .NET you have installed? You can find this either by looking in the registry or running a script.
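
For the .NET check, this registry one-liner reports the installed 4.x release number (e.g. 378389 = 4.5, 393295 = 4.6):

# The Release DWORD identifies the installed .NET Framework 4.x version
Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release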

 

Thanks!


Okay, uploaded new V1.3 that fixes the bug where it wasn't recording all the duplicated entries for each file. In my test pool I've now got some 2x duplicated files (1 spare copy per file) and those are all being logged properly. If anyone has a pool with 3x or more, I'd appreciate feedback... it should scale, but I couldn't test it. Thanks!


Aaaand added V1.4 just now. I realized it wasn't properly zipping files on my side either, and it was due to something I had added after I tested the zip functionality originally. That has been fixed, and it's zipping files again now for me. Let me know if it doesn't work for you now with v1.4, thanks
