blueman2 · Members · Posts: 41 · Days Won: 7

Everything posted by blueman2

  1. Thanks! Unchecking all the balancing options and removing all file placement rules solved the problem. The issue was, as you said, two conflicting balancers. The Scanner balancer was set to move unduplicated files off any 'damaged' drive. All drives show fine, but one does have 8 reallocated sectors; I guess that is enough to mark it 'damaged'? StableBit Scanner says it is fine and predicts no failure, so I am confused. There was also a conflicting file placement rule, and I guess the two balancers fought each other back and forth (a toy simulation of this tug-of-war appears after this list). I also changed the check boxes in the settings as follows, under File Placement Settings: unchecked "File placement rules respect limits..."; checked "Balancing plug-ins respect..."; unchecked "Unless the drive is being emptied." Thanks for the tip!
  2. Well, I have continued to battle some very odd issues with DrivePool after years of a stable experience. It all started when I removed a drive and added a larger replacement. I have a total of 4 drives: 1) SSD OS drive (no change), 2) 2TB drive (no change), 3) 2TB drive (no change), 4) 8TB drive (replaced an older 1.5TB). After replacing the 1.5TB with the 8TB, the balancer just went crazy. It has gone through the process of completely emptying the 2TB drives to zero contents, then filling them back up to about 50%, then emptying them again, about a dozen times in the past 3 weeks. The balancer never stops; it is ALWAYS processing files in some sort of infinite loop. Here is a screen shot of the main screen. As you can see, it is in the process of emptying the 2TB drives to zero. When it is done, it will fill them back up once again. Here are my other settings: Any clue as to what the heck is going on??
  3. I have been using DP for many years. I decided to replace a drive recently, and the entire pool just went crazy in terms of where it puts files!! 1) I set file placement to put certain files (such as VMs) only on my fast drives. It promptly put them all on the very slowest drive, which I had explicitly excluded! 2) I set it to never put unduplicated files on the large, slow drive. It promptly started to put all the unduplicated files on that drive! Here are some screen shots. Starting with VM location, here is what is set: Yet after doing this, all VMs were moved to the 8TB drive that I specifically de-selected. And here is the selection showing not to put unduplicated files on the 8TB drive. Yet the drive is now full of them. I did not change anything, yet now the balancer is doing the opposite of what it was set to do before. I must be doing something really dumb here!
  4. Sorry for letting my frustration come out! Yes, you are right. I did run a file monitor program to see what DrivePool was doing, and it was going one by one, checking every file, accessing the copies on the disk I was removing and on the other disk that held a copy. I just did not think about the reason for this until you mentioned it. I guess it just ensures, one last time, that there really are copies before allowing removal. Good safety check. Happy after the switch, though. Everything is back to running very smoothly. I love how well thought out DrivePool is in terms of handling removal and addition of drives. Nothing else comes close. Thanks for the great product!!
  5. UPDATE: Well, I just pulled the drive, replaced it with the new drive, and rebooted. I had to remove the 'missing' drive afterwards, but it worked just great. So anyone who has a drive that they have allowed only duplicated data on (or have manually removed non-duplicated data from) can save themselves an entire day of wasted time by ignoring the 'remove' command and pulling the drive anyway, as if it were a failed drive. DrivePool seems quite smart about how to handle this situation. Just not very smart about knowing what it needs to do if you remove that drive using the remove command! Lesson learned.
  6. 9 hours now. Enough of this crap. I am just pulling the drive out. Someday I would love to learn what the heck DrivePool thought it was doing for over 9 hours when it needed to do absolutely nothing.
  7. OK, this is just silly. 7 hours and counting, and there is NOTHING that DrivePool needs to do but remove the drive. No unduplicated files are on it. What the heck does it think it needs over 7 hours to do!!!
  8. I have a 1.5TB drive that I have been using only for duplicated files. DrivePool shows only duplicated files on it. I plan to replace it with a 6TB drive. I selected remove, with 'duplicate later', but after 6 hours it is still "in process"! Shouldn't removal be almost instantaneous in this case? It contains no unduplicated files, so what is taking so long? Should I abort and just do a 'dirty' remove?? I am on the latest beta, 2.2.0.651.
  9. I use offsite backup, Crashplan. This is purely for disaster recovery, and I did it mostly for my photos and home videos. Relatively inexpensive and definitely allows me to sleep better.
  10. RFOneWatt, I just have to know. What are you using all that space for?? Video?
  11. Re: "Which HDD" — I avoid giving advice based on a small sample size, but of the 3 external drives I removed and put in my server, all 3 have failed within 2 years, versus not a single failure from my regular internal drives. Maybe not statistically valid, but I agree with Drashna on this one.
  12. Just to chime in with my own opinion of which version to use with WHS 2011. I started with V1 and moved to V2. From my perspective, V2 is the right choice. 1) I like the V2 remote view/control scheme much better. The WHS Dashboard is soooo sloooow to load, whereas the V2 remote app for DP is super fast. 2) I like being able to place certain files on a specific drive (I put my VM images on a very fast drive, for example). 3) I have yet to see any benefit from V1's tighter integration. So no downside with V2 over V1, and several areas of upside. Frankly, I am surprised to see Covecube still supporting a separate product for WHS. IMHO, V1 is really not needed.
  13. You don't have a button directly below the "Activation ID" box that says "Start 30-day Trial"????
  14. Well, Avast is up to its nastiness again. It killed my ability to run DP on my remote system once again. So I added back the exclusion for the c:\program files\stablebit\drivepool\ directory, turned off Avast while doing a re-install, and turned Avast back on. All is working fine again now. Last time the file was drivepool.comm.dll, but this time it was DrivePool.Service.exe. Oh well, I will just leave the directory excluded from now on. I am running Beta 2.2.0.572_x64 on my client. The server, as always, is running fine (no Avast there!). No action needed; I just wanted to report in case others hit this problem.
  15. I just wanted to confirm that with beta 2.2.0.572 I no longer have any issues with Avast. I have removed the exclusions for the StableBit directory from Avast and everything is still working fine. Driver signing seems to have solved the issue. Thanks!
  16. This is on my Win 7 64-bit client, running the latest 2.x version of DP. Note, my server, which actually runs DP, does not use Avast, so no problems there; just the clients I use to monitor and adjust DP. I reinstalled, but Avast killed it again. I have Avast set to ask me what to do when it finds a suspicious file, but oddly it does not give an 'ignore' option, just delete/quarantine/fix. I ended up adding the entire StableBit directory as an exclusion, reinstalled again, and now all is well. Damn Avast. They should always allow the user to "Ignore". How do I access the beta? I will give it a try just to test whether it helps.
  17. Running the latest version of Avast, and it flags c:\program files\stablebit\drivepool\drivepool.comm.dll as "Win32:Evo-gen[susp]". I turned Avast off, but now I can no longer see my server. I can only see the client (which of course has nothing, since there is no license on the client). Anyone else seeing this? I will try a re-install to see if that solves it.
  18. Using 2.x, you just install another copy on another PC (no need for another license) and that copy will find the server and give the exact same interface as if you were directly on the server. Very neat. And it starts instantly (compared to the WHS 2011 Dashboard).
  19. +1. Go with 2.x. I run WHS 2011 and started with 1.x of DrivePool. I moved to 2.x and never looked back. I like the stand-alone remote DP monitor/control app much better: it starts instantly and you can control everything on your WHS server from any PC. This is far better than having to start the WHS Dashboard, which, for me, takes too long to start and requires me to put in the password every time. I also recall that you have more file placement options with 2.x than 1.x of DP. Go with 2.x, no doubt.
  20. Maybe "Product 3", which is currently in development, will address this. I had the same thought. The great thing about DrivePool is its ability to use a mix of fast (i.e. local) and slow (possibly remote) drives, plus the intelligence to serve reads from the fastest drive via the 'read striping' option if you are using duplication (a sketch of the read-striping idea appears after this list).
  21. Thanks. And yes, my choice of words was bad. Nothing simple at all about this as I think more deeply about how to implement it (without causing other issues).
  22. I think File Placement is a way to do this, but it is not elegant or easy at all, and ongoing maintenance of the locations will be required as my storage changes. I will play with that. I would love it if this feature could be added to DP in the future. It could even be as simple as adding a rule option for every drive that says: "Use this drive exclusively to keep a copy of every file that is targeted for duplication". If you do this for 1 drive, it allows 2x duplication; if for 2 drives, it allows 3x duplication (the arithmetic is sketched after this list). And in all cases, the selected drives will contain only duplicate data, and the other drives will not contain multiple copies of any single file. I have to wonder if this functionality might be available in "Product 3"? Since that product apparently plans to enable pools located on remote (i.e. cloud) devices, it would be logical to assume you would need to be able to set some of your data to local only, and some to local + remote. You would use the remote pool only for duplication, and always keep 1 copy local for quick access. What is the process for making such a feature request?
  23. Are you seeing a lot of disk activity on your server? I use Procmon on the server to see which files are being accessed and by which programs. Excess disk activity might be causing some of the issue. I know that for the first full day after installing DP there was a lot of activity, for various reasons. But now that it has settled down, my access speed and transfer rate are actually higher with DP. Before, I would get about 60 MB/s at best; now I am getting 90-100 MB/s regularly (a simple way to measure this yourself is sketched after this list).
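
On the two-balancer conflict in post 1: here is a toy Python simulation of why two rules with opposite goals never converge. The drive names and rule behavior are hypothetical stand-ins, not DrivePool's actual engine; it only illustrates the tug-of-war.

```python
# Toy model of two balancing rules with opposite goals (hypothetical names;
# this is not DrivePool's real logic, just an illustration of the loop).

def scanner_evacuate(location):
    """Scanner balancer: move unduplicated files OFF the 'damaged' drive."""
    return "2TB_drive" if location == "8TB_drive" else location

def placement_rule(location):
    """File placement rule: pin the same files ONTO the 8TB drive."""
    return "8TB_drive" if location == "2TB_drive" else location

location = "8TB_drive"
for pass_num, rule in zip(range(1, 7), 3 * [scanner_evacuate, placement_rule]):
    location = rule(location)
    print(f"pass {pass_num}: {rule.__name__:16s} -> file now on {location}")
# Each pass undoes the previous one, so balancing never finishes.
```

Disabling one of the two rules, as described in post 1, breaks the cycle.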
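On read striping (post 20): a minimal sketch of the underlying idea, serving a read from whichever duplicate responds fastest. The file paths are hypothetical, and real read striping happens inside the driver per request; this one-shot probe is only an illustration.

```python
import time

def probe_latency(path, size=4096):
    """Time a small read at the start of one replica (plain file I/O)."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        f.read(size)
    return time.perf_counter() - start

def fastest_replica(replicas):
    """Return the duplicate whose probe read completed quickest."""
    return min(replicas, key=probe_latency)

# Hypothetical duplicate locations of one pooled file:
# best = fastest_replica([r"D:\PoolPart.xxx\vm.vhdx", r"E:\PoolPart.xxx\vm.vhdx"])
# Subsequent reads would then be directed at `best`.
```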
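On the duplicate-only-drive idea (post 22): the arithmetic from that post written out, assuming one "primary" copy lives on the normal drives and each dedicated drive holds exactly one extra copy.

```python
def duplication_level(dedicated_drives):
    """One primary copy on the normal drives + one copy per dedicated drive."""
    return dedicated_drives + 1

for n in (1, 2, 3):
    print(f"{n} dedicated duplicate-only drive(s) -> {duplication_level(n)}x duplication")
# 1 drive -> 2x and 2 drives -> 3x, as described in the post.
```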
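On the transfer rates in post 23: a simple way to measure sequential read throughput yourself. Plain Python, nothing DrivePool-specific; the path is hypothetical, and on repeat runs the OS cache will inflate the number, so use a file larger than RAM or a freshly written one.

```python
import time

def sequential_read_mbps(path, chunk_bytes=8 * 1024 * 1024):
    """Read `path` start to finish and report throughput in MB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk_bytes)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    return total / (1024 * 1024) / elapsed

# print(sequential_read_mbps(r"P:\videos\large_test_file.mkv"))  # hypothetical pooled file
```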