Covecube Inc.

Leaderboard

Popular Content

Showing content with the highest reputation since 01/25/21 in all areas

  1. The auto-refresh of Windows File Explorer has been a long-standing issue with Windows. There are a number of fixes for this issue posted on the internet, but every version of Windows 10 seems to have a different solution. I personally just live with the inconvenience, because the next Windows update could "un-fix" the problem you thought you had solved. I don't have enough time in my day to manually apply "fixes" to Windows that might be wiped out next week by an update. So, if I need to refresh File Explorer, I just hit F5 and it will refresh. I also use a third-party file manager and it has…
    1 point
  2. Yes. But YMMV. Mostly, having the cache on a different drive means that you're separating out the I/O load, which can definitely improve performance. Though, this depends on the SSDs you're using. Things like AHCI vs NVMe, and the IOPS rating of the drive make more of a difference.
    1 point
  3. I'd take a wild stab and say there's probably a negligible difference so long as Windows is idle and behaving. Checking Task Manager, my file server's boot SSD is currently averaging ~1% active time, ~450 KB/s transfer rate, ~1 ms response time and <0.01 queue length over 60 seconds at idle, and it's a rather old SATA SSD. I think you're much more likely to run into other bottlenecks first. But YMMV, as it depends on how many simultaneous streams you're planning for, what other background programs you have running, etc. You could test it? Open Task Manager / Resource Monitor, set to show the drive's… (a small monitoring sketch follows this post)
    1 point
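    To put a rough number on that yourself, here's a minimal sketch (my own illustration, not from the post above) that samples a drive's I/O counters with Python's psutil; the drive name "PhysicalDrive0" is a placeholder and will differ per system.

      # Sample a drive's I/O over a fixed window to gauge idle load.
      # Requires psutil (pip install psutil).
      import time
      import psutil

      def sample_disk(name: str, seconds: int = 60) -> None:
          before = psutil.disk_io_counters(perdisk=True)[name]
          time.sleep(seconds)
          after = psutil.disk_io_counters(perdisk=True)[name]
          read_rate = (after.read_bytes - before.read_bytes) / seconds
          write_rate = (after.write_bytes - before.write_bytes) / seconds
          print(f"{name}: ~{read_rate / 1024:.1f} KB/s read, ~{write_rate / 1024:.1f} KB/s write over {seconds}s")

      if __name__ == "__main__":
          # List the available per-disk counters first, then sample one of them.
          print(list(psutil.disk_io_counters(perdisk=True)))
          sample_disk("PhysicalDrive0", 60)  # placeholder drive name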
  4. Sorry, I haven't used CloudDrive nearly enough to be able to tell you whether your plan is actually sound (I use DrivePool and Scanner a lot more), though that may change in the near-ish future.
     "Drive Usage set to B only having duplicate files." - Have you tested that this works with a pair of local pools? If so, it should also work with a local+cloud pool pairing.
     "Is read striping needed for users of Pool C to always prioritize Pool A resources first?" - According to the manual the answer is no; regardless of whether Read Striping is checked, it will not stripe reads between local a…
    1 point
  5. Well, this is embarrassing. I finally realized I was logged into the wrong Gmail account for the Google Drives *slaps head*. I wanted to delete this post but can't seem to find the delete option, but maybe it's best it stays up so my error will help someone else whose brain was running on autopilot and signed in with the wrong Gmail account, hah.
    1 point
  6. Yeah, the goal is to not need to pull a 20TB recovery when you could have just done a 10TB recovery; it would literally take twice as long, and you also have a higher probability of failure. In my case:
     A+B = Duplication Pool 1
     C+D = Duplication Pool 2
     E+F = Duplication Pool 3
     Duplication Pool 1+2+3 = Pool of Pools
     In the above example you can sustain more than 2 drive failures before you lose data, depending on which drives fail. In the event of a failure where data loss occurred, you will need to restore 1 drive's worth of data. Next example: A+B+C+D+E+F = On… (a small failure-tolerance sketch follows this post)
    1 point
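    To make the "depending on which drives fail" point concrete, here's a small sketch (my own illustration, not from the post) that enumerates failure combinations for the A+B / C+D / E+F mirrored-pair layout; data is lost only when both halves of the same pair fail.

      # Enumerate drive-failure combinations for three mirrored pairs and count
      # how many are survivable.
      from itertools import combinations

      PAIRS = [("A", "B"), ("C", "D"), ("E", "F")]
      DRIVES = [d for pair in PAIRS for d in pair]

      def loses_data(failed: set) -> bool:
          # Data is lost only when both members of some mirrored pair have failed.
          return any(set(pair) <= failed for pair in PAIRS)

      for n in range(1, len(DRIVES) + 1):
          combos = list(combinations(DRIVES, n))
          survived = sum(1 for c in combos if not loses_data(set(c)))
          print(f"{n} failed drive(s): survive {survived}/{len(combos)} combinations")

      # One failure is always survivable; two or three failures are survivable
      # unless they happen to take out both halves of the same pair.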
  7. Solved from another post where a user had a similar issue; I missed it before posting this.
    1 point
  8. Hi Newbie! ;D DP needs very little; I had it running on an Intel Celeron G530 and could stream 1080p to at least one device. So a cheap build with, say, a Ryzen 3 3200G, 8GB of RAM, a decent 350W PSU and W10 would work like a charm as a file/stream server. The main thing you'd probably look for is SATA connectors (cheap boards often have only 4), although you could get a cheap SAS card (an IBM 1015 or somesuch, used) which would provide plenty of expandability. The other thing is the network connection. 1Gb Ethernet, I think, should be "enough for anybody". It is a bit of a bad time as C…
    1 point
  9. Another option to consider may be a product like PrimoCache, which can use both your system RAM as a level 1 cache and your SSD as a level 2 cache for all of your computer's reads/writes - so, not just limited to DrivePool. DrivePool will use the SSD cache for writes only, but does not use it for caching reads. PrimoCache is try-before-you-buy software, and I think they offer a 30-day trial period. It worked well for me, but I chose just to use the DrivePool SSD plugin, which is all I needed for my DrivePool home media server. If you decided to go with something like PrimoCache, then you woul… (a toy caching sketch follows this post)
    1 point
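    For anyone curious what the RAM-as-L1 / SSD-as-L2 idea looks like in the abstract, here's a toy sketch (my own illustration, not PrimoCache's implementation) of a two-level read cache that promotes L2 hits into L1 and demotes the oldest L1 entries into L2.

      # Toy two-tier cache: a small fast tier backed by a larger slower tier.
      from collections import OrderedDict

      class TwoLevelCache:
          def __init__(self, l1_size: int, l2_size: int):
              self.l1 = OrderedDict()  # small and fast (stands in for RAM)
              self.l2 = OrderedDict()  # larger, slower (stands in for the SSD)
              self.l1_size, self.l2_size = l1_size, l2_size

          def get(self, key):
              if key in self.l1:            # L1 hit: cheapest path
                  self.l1.move_to_end(key)
                  return self.l1[key]
              if key in self.l2:            # L2 hit: promote into L1
                  value = self.l2.pop(key)
                  self.put(key, value)
                  return value
              return None                   # miss: caller reads the backing disk

          def put(self, key, value):
              self.l1[key] = value
              self.l1.move_to_end(key)
              if len(self.l1) > self.l1_size:       # demote oldest L1 entry to L2
                  old_key, old_value = self.l1.popitem(last=False)
                  self.l2[old_key] = old_value
                  if len(self.l2) > self.l2_size:   # evict oldest L2 entry entirely
                      self.l2.popitem(last=False)

      cache = TwoLevelCache(l1_size=2, l2_size=4)
      cache.put("block-1", b"...")
      print(cache.get("block-1") is not None)  # True: served from L1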
  10. From what I can figure out from trawling the manual and changelogs:
      CoveFs_WaitForVolumesOnMountMs (default: "10000") - How long to wait, in milliseconds, for all the volumes to arrive. This is usually for older pools (pools that don't initially know how many poolparts they should be waiting for); DrivePool will wait at most this long between detecting the first poolpart folder and mounting the corresponding pool drive.
      CoveFs_WaitForKnownPoolPartsOnMountMs (default: "10000") - (Newer pools may be aware of their parts.) If a pool is aware of how many pool parts it's made of, at mou… (a small timeout sketch follows this post)
    1 point
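    The mount-wait behaviour those two settings appear to describe amounts to a wait-with-timeout loop; here's a minimal sketch (my own illustration, not DrivePool's code, with made-up function names) of that pattern.

      # Wait up to timeout_ms for pool parts to appear; mount early if the pool
      # already knows how many parts it has and they all arrive.
      import time

      def wait_for_pool_parts(detect_parts, expected_count=None, timeout_ms=10000, poll_ms=250):
          """detect_parts() returns the set of poolpart folders currently visible."""
          deadline = time.monotonic() + timeout_ms / 1000
          parts = detect_parts()
          while time.monotonic() < deadline:
              if expected_count is not None and len(parts) >= expected_count:
                  return parts        # newer pools: all known parts present, mount now
              time.sleep(poll_ms / 1000)
              parts = detect_parts()
          return parts                # older pools: timeout reached, mount with what we have

      # Example: pretend two of three expected parts are visible.
      print(wait_for_pool_parts(lambda: {"PoolPart.1", "PoolPart.2"}, expected_count=3, timeout_ms=1000))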
  11. The short answer is that - so long as you're disconnecting ALL of the drives in a pool - yes you can, and it shouldn't cause any problems as far as DrivePool itself is concerned. The slightly longer answer is that you would stop the StableBit DrivePool Service* while disconnecting/reconnecting, to avoid DrivePool briefly** complaining about missing drives and to avoid it potentially re-scanning the entire pool when the drives are reconnected, but the software is designed to "fail safely", so even that wouldn't be strictly necessary if you were ever in a rush. Do make sure that all drives in th… (a small service stop/start sketch follows this post)
    1 point
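    If you want to script the stop/reconnect/start routine, here's a minimal sketch; it assumes Windows, an elevated prompt, and that the service's display name is "StableBit DrivePool Service" as in the post (check services.msc for the exact name on your system).

      # Stop the DrivePool service, wait while the drives are swapped, then restart it.
      import subprocess

      SERVICE = "StableBit DrivePool Service"  # assumed display name; verify in services.msc

      def set_service(running: bool) -> None:
          action = "start" if running else "stop"
          # net start/stop accepts the service display name on Windows.
          subprocess.run(["net", action, SERVICE], check=True)

      if __name__ == "__main__":
          set_service(running=False)
          input("Service stopped. Disconnect/reconnect the pool drives, then press Enter... ")
          set_service(running=True)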
  12. Good to hear. Once everything else is sorted out, if you still can't find the source of the "Other" files then feel free to post here and we'll follow up.
    1 point
  13. This is a topic that comes up from time to time. Yes, it is possible to display the SMART data from the underlying drives in Storage Spaces. However, displaying those drives in a meaningful way in the UI, and maintaining the surface and file system scans at the same time, is NOT simple. At best, it would require a drastic change to, if not an outright rewrite of, the UI. And that's not a small undertaking. So, can we? Yes. But do we have the resources to do so? Not as much (we are a very small company). (A small SMART-reading sketch follows this post.)
    1 point
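    For reference, reading SMART from the physical drives underneath a Storage Spaces pool can also be done outside the UI; here's a minimal sketch (my own illustration, not StableBit Scanner's code) that shells out to smartctl from smartmontools, assuming it's installed, on PATH, and run with admin rights.

      # Query SMART status for each physical drive via smartctl's JSON output.
      import json
      import subprocess

      def list_devices() -> list:
          scan = subprocess.run(["smartctl", "--scan", "--json"],
                                capture_output=True, text=True, check=True)
          return [d["name"] for d in json.loads(scan.stdout).get("devices", [])]

      def smart_summary(device: str) -> dict:
          out = subprocess.run(["smartctl", "-a", "--json", device],
                               capture_output=True, text=True)
          data = json.loads(out.stdout)
          return {
              "device": device,
              "model": data.get("model_name"),
              "healthy": data.get("smart_status", {}).get("passed"),
              "temperature": data.get("temperature", {}).get("current"),
          }

      if __name__ == "__main__":
          for dev in list_devices():
              print(smart_summary(dev))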
  14. Alex

    check-pool-fileparts

    If you're not familiar with dpcmd.exe, it's the command-line interface to StableBit DrivePool's low-level file system and was originally designed for troubleshooting the pool. It's a standalone EXE that's included with every installation of StableBit DrivePool 2.X and is available from the command line. If you have StableBit DrivePool 2.X installed, go ahead and open up the Command Prompt with administrative access (hold Ctrl + Shift from the Start menu), and type in dpcmd to get some usage information. Previously, I didn't recommend that people mess with this command because it wasn't rea… (a small wrapper sketch follows this post)
    1 point
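    If you want to drive dpcmd from a script, here's a minimal wrapper sketch (my own, not part of DrivePool). Only running bare dpcmd for usage text is confirmed above; the check-pool-fileparts argument form shown in the comment is an assumption to verify against that usage output. Run from an elevated prompt.

      # Thin wrapper around dpcmd that captures its output, optionally to a log file.
      import subprocess
      from pathlib import Path

      def run_dpcmd(*args, log=None):
          result = subprocess.run(["dpcmd", *args], capture_output=True, text=True)
          output = result.stdout + result.stderr
          if log is not None:
              Path(log).write_text(output, encoding="utf-8")
          return output

      if __name__ == "__main__":
          print(run_dpcmd())  # bare dpcmd prints usage information
          # Hypothetical example; verify the real syntax against the usage output first:
          # run_dpcmd("check-pool-fileparts", "P:\\", log="check-pool-fileparts.log")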
