Showing results for tags 'question'.

Found 8 results

  1. Hello guys, I'm new to DrivePool! I've just made a pool and now I need to copy my data into it (from the same drive), and it seems I have two choices: one, copy and paste the data onto the pool drive (which will take a lot of time); two, drag and drop it into the hidden PoolPart folder (almost instant). I've looked around here a bit, and it seems most people use the first method (maybe it's safer?). Testing with dummy files, the second method works well, but I'm curious whether something bad could happen down the line. I also found this manual: https://wiki.covecube.com/StableBit_DrivePool_Q4142489. It seems it should be followed when using the second method, but I want to know why, because without stopping and restarting the service everything looks OK... Thanks in advance!
  2. Hi, since it seems Windows Server 2019 onwards can't put disks into standby, but the Scanner GUI can, would it be possible to call the binary via the CLI with parameters to specify disks and put them to sleep? Thanks!
  3. Is there any way to see exactly which files DrivePool thinks are open? And perhaps, by which process? If I check the Windows Resource Monitor, the only files that show disk activity are on the OS drive and not part of the pool. I've been turning off services, terminating running programs, and uninstalling apps in an attempt to figure out which one is holding access to these 42 files. I have also scanned for malware and couldn't find anything. Another strange symptom: if I force disk activity on the pool by opening a bunch of files, Resource Monitor shows those files being accessed; however, the 42 "Open Files" reported by DrivePool remains unchanged. This was never an issue until I recently rebuilt my server and reinstalled Windows. Are newer installs of Windows behaving oddly?
  4. Hello all, I have a Windows 2012 R2 server with a 4TB RAID1 mirror (2 x 4TB HDDs, using basic Windows software RAID, not Storage Spaces, since you can't boot from a Storage Spaces pool) and some other non-mirrored data disks (1 x 8TB, 2 x 3TB, etc.). This is configured with the OS and "Important Data" on the 4TB mirror, and "Less Important Data" on the other non-redundant disks. The 4TB mirror is now nearly full, so I intend to either replace it with a larger mirror, or replace it with two larger non-mirrored drives and use DrivePool to duplicate some data across them. I'm evaluating whether the greater flexibility of DrivePool is worth the move, compared to the basic mirror I use currently.

     Current config:
     • C: redundant bootable partition mirrored on Disk 1 and Disk 2
     • D: redundant "Important Data" partition mirrored on Disk 1 and Disk 2, and backed up by CrashPlan
     • E:, F:, etc.: non-redundant data partitions for "Less Important Data" on Disks 3, 4, etc.

     If I moved to DrivePool, I guess the configuration would be:
     • C: non-redundant bootable partition on Disk 1
     • D: DrivePool "Important Data" pool duplicated on Disk 1 and Disk 2, and backed up by CrashPlan
     • E: DrivePool "Less Important Data" pool spread across Disks 3, 4, etc.

     or:
     • C: non-redundant bootable partition on Disk 1
     • D: DrivePool pool spread across all disks, with "Important Data" folders set to be stored and duplicated only on Disk 1 and Disk 2, and backed up by CrashPlan (or something similar using hierarchical pools)

     I have a few questions about this:
     • Does it make sense to use DrivePool in this scenario, or should I just stick with a normal RAID mirror?
     • Will DrivePool handle real-time duplication of large in-use files such as 127GB VM VHDs, and if so, is there a write-performance decrease compared to a software mirror?
     • All volumes use Windows data deduplication. Will this continue to work with DrivePool? I understand the pool drive itself cannot have deduplication enabled, but will the drive(s) storing the pool continue to deduplicate the pool data correctly? Related to this, can DrivePool be configured so that if I downloaded a file and then copied it to multiple different folders, it would most likely store all the copies on one physical disk, so that the OS can deduplicate them?
     • CrashPlan is used to back up some data. Can it back up the pool drive itself (it needs to receive update notifications from the filesystem)? I believe it also uses VSS to back up in-use files, but as this is mostly static data storage, files should not usually be in use, so I may be able to live without that. Alternatively, could I back up the data on the underlying disks themselves?
     • How does DrivePool handle long-path issues? Server 2012 R2 doesn't have long-path support, and I do occasionally run into path-length issues. I can only assume this gets worse when pooled data is stored in a folder, effectively increasing the path length of every file?
     • Are there any other issues or caveats I haven't thought of here?

     Thanks!
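The path-length concern in the post above can be checked ahead of time. A rough sketch (the helper name and 260-character default are made up here; 260 is the classic Windows MAX_PATH limit):

```python
import os

def long_paths(root, limit=260):
    """Return full paths under `root` that exceed `limit` characters."""
    hits = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if len(path) > limit:
                hits.append(path)
    return hits
```

Running this against a data drive before pooling would show which files are already close to the limit, and how much headroom an extra folder level (like a PoolPart prefix) would consume.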
  5. I have a DrivePool with x2 folder duplication for my projects. I was initially blaming VS 2019 for possible bugs; however, after I moved the project to my SSD C: drive, which is not in the DrivePool, everything works again. Things I found broken:
     • Slow to no detection of syntax errors
     • Quick Actions for code refactoring do nothing

     System Information:
     • Microsoft Windows 10 Pro, Version 2004, 10.0.19041 Build 19041
     • DrivePool 2.2.3.1019
     • Microsoft Visual Studio Community 2019, Version 16.7.1 (VisualStudio.16.Release/16.7.1+30406.217)
     • Microsoft .NET Framework Version 4.8.04084

     Update: I have found a workaround: share the drive on the network and work from the network share.
  6. Hello, I'm a total noob and I haven't even built my pool yet, but I have a question that will determine whether DrivePool will work for me or not. I know that hardlinks are not supported by DrivePool because of how hardlinks work: the files must be on the same drive and the same volume for the links not to break. My question is this: is it possible to use the Ordered File Placement plugin to ensure that my files and their hardlinks stay on the same physical drive until it fills up, then move on to the next drive and do the same thing, and so on? Most of the time the files and the hardlinks get created at the same time, so I don't see why this wouldn't work, but I want to make sure that it will, and exactly how to set it up. Thanks
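On the hardlink constraint mentioned above, a minimal sketch of why links are volume-bound (file names here are just illustrative): a hardlink adds a second directory entry for the same underlying file data, which only works within a single filesystem.

```python
import os
import tempfile

# Two names, one file: both entries share the same data and metadata,
# so both must live on the same volume.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "episode.mkv")
    hardlink = os.path.join(d, "episode-link.mkv")
    with open(original, "w") as f:
        f.write("payload")
    os.link(original, hardlink)             # succeeds: same volume
    assert os.path.samefile(original, hardlink)
    print(os.stat(original).st_nlink)       # → 2
```

If a balancer later moved one of the two names to a different disk, it could only do so by copying the data, which silently turns the hardlink into an independent file; that is the breakage the poster is trying to avoid.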
  7. Hey everyone, new to the DrivePool game here, and absolutely loving this software. It has made storage and backup so much easier. Sorry if these are stupid questions. 1. I'm a bit confused about how the "folder duplication" process works. I have one of my movie folders set to duplicate; it's around 2.5TB. However, when I turn on duplication, the duplicated figure across all my drives is around ~5.7TB. I don't have any other folders set for duplication, so why is this figure so high? I was expecting the duplicated size to be about the same as the folder I was duplicating (2.5TB). Is the duplication figure the size of the original folder plus the size of the duplicate? 2. Can I safely add my C: drive to my pool? It's a 1TB SSD, and I was hoping to harness some of its speed as a cache drive, as it's mostly empty right now. Is this safe/possible? Thanks again for the phenomenal product; probably the happiest I've been with a software purchase in a long time.
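On question 1 above, the reported figure is consistent with counting both copies. A quick sanity check, assuming x2 duplication simply doubles the stored size (the exact accounting is an assumption here):

```python
# With x2 folder duplication the pool keeps two complete copies,
# so a per-pool "duplicated" figure that counts both would show
# roughly twice the folder's size.
folder_tb = 2.5              # size of the folder being duplicated
copies = 2                   # x2 duplication = original + one duplicate
duplicated_total_tb = folder_tb * copies
print(duplicated_total_tb)   # → 5.0
```

Under that assumption, a reading meaningfully above 5.0TB (such as the ~5.7TB the poster sees) would mean something else is also being counted, which is worth checking in the UI.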
  8. lateworm

    macOS port?

    Could there be a macOS version of this at some point? I could see it being really useful to a lot of developers who are on MacBooks. Even with Fusion Drives and SSDs, I often hear complaints about lack of space and speed, and about having to carry external storage, because people have big, complex dependency trees that use absolute paths and are very tedious to sync.