Spider99: CSV isn't going to work with more than ~500k files, as Excel 2007 drops all records above row 1,048,576.
Just a million? That's surprising. And unfortunate... Hrmmm.
Maybe it would be better if a list of files was just dumped out per disk. I guess I could have it make a folder structure corresponding to all the drives, with a text file inside containing the disk information and a CSV with the full list of files. I could split out to a second CSV if it exceeds 1 million files on a single disk (not likely, but with an 8TB drive and some file types, maybe...). Having the separate text file with the disk info would reduce the size of the CSV data, as a nice side benefit.
...this is starting to reduce the appeal, though, since it's getting more fiddly and makes it less easy to play "what if" with a master list of all your files. I guess it would still accomplish the overall goal of "a simple list of all the files that were on a particular drive", though.
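A minimal sketch of what that per-disk layout could look like (variable names like $diskName, $diskInfo and $files are just placeholders for whatever the script already builds elsewhere):

```powershell
# Rough sketch of the per-disk layout; $diskName, $diskInfo and $files are
# assumed to be produced elsewhere in the script.
$outRoot = Join-Path $PSScriptRoot "Catalog"
$diskDir = Join-Path $outRoot $diskName
New-Item -ItemType Directory -Path $diskDir -Force | Out-Null

# Disk details go in a small text file instead of being repeated on every CSV row
$diskInfo | Out-File (Join-Path $diskDir "DiskInfo.txt")

# Split the file list into chunks that stay under Excel's 1,048,576-row limit
$chunkSize = 1000000
for ($i = 0; $i -lt $files.Count; $i += $chunkSize) {
    $part = [math]::Floor($i / $chunkSize) + 1
    $files[$i..([math]::Min($i + $chunkSize, $files.Count) - 1)] |
        Export-Csv (Join-Path $diskDir "Files_$part.csv") -NoTypeInformation
}
```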
Writing it out to an SQLite database would make the most sense. PowerShell can export to SQLite... the only issue would be writing a front-end, and a front-end isn't something I want to tackle for this.
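For reference, the export side could look roughly like this, assuming the community PSSQLite module is acceptable as a dependency (table and column names here are just examples); the missing front-end would still be the real gap:

```powershell
# Minimal sketch of an SQLite export using the community PSSQLite module
# (Install-Module PSSQLite). Database path, table and column names are examples.
Import-Module PSSQLite
$db = Join-Path $PSScriptRoot "Catalog.sqlite"

Invoke-SqliteQuery -DataSource $db -Query @"
CREATE TABLE IF NOT EXISTS files (
    Disk     TEXT,
    Path     TEXT,
    Size     INTEGER,
    Modified TEXT
)
"@

# $files is assumed to be an array of objects with matching property names;
# Out-DataTable ships with the module and feeds the bulk copy in one pass.
Invoke-SqliteBulkCopy -DataTable ($files | Out-DataTable) -DataSource $db -Table files
```

Bulk-copying a DataTable is considerably faster than inserting row by row for lists of this size.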
I guess the per-disk thing would probably make the most sense. Realistically, we're not going to be using this for anything other than figuring out what was on a drive in an emergency... thoughts?
Can the Unicode stuff have a switch in the script? I don't need it and would like to keep the files smaller; they are big enough already.
How big is the uncompressed .csv and how big is the .log?
When you zip the CSV (try doing it manually if 1.51 still doesn't work), what is the size of the compressed file?
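For the Unicode request above, one possibility is a script switch along these lines (a sketch only; the parameter and variable names are made up, and any non-ASCII characters in filenames would be lost with a non-Unicode encoding):

```powershell
# Sketch of an optional switch to write ASCII output instead of Unicode.
# $files and $csvPath stand in for whatever the script already uses.
param(
    [switch]$NoUnicode
)

$encoding = if ($NoUnicode) { 'ASCII' } else { 'Unicode' }

$files | Export-Csv -Path $csvPath -NoTypeInformation -Encoding $encoding
```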
Christopher: @gj80, if you're familiar with other programming languages, it may be possible to bypass the dpcmd utility completely.
Thanks Christopher. dpcmd's output works fine, though, and since I already wrote the regex stuff to parse it (and I'm lazy), I'm fine just sticking with that.