Covecube Inc.
Showing results for tags 'drivepool'.



Found 54 results

  1. denywinarto

    server reboots when removing a pool

    Removing a pool apparently causes my server to reboot, so I have to repeat the removal process. I'm positive the removal is the cause, because it has happened three times now. I have checked all three options in the remove menu, but it still reboots my PC, so what should I do? I also tried searching for and deleting files inside the pool, but that doesn't help. The drive currently looks like this (this is after the reboot). Since posting this I have also unticked both options in the Drive Usage Limiter plugin, but I haven't tried removing again.
  2. Painted

    Error Removing Drive

    Hi, I've got a drive in my pool for which Scanner is reporting that the file system is not healthy and repairs have been unsuccessful. So I added another drive of the same size to the pool and attempted to remove the damaged drive (thinking this would migrate the data, after which I could reformat the original drive and re-add it to the pool). I'm getting "Error Removing Drive" though, and the detail info says "The file or directory is corrupted and unreadable". Yet I can reach into the PoolPart folder and copy at least some of the files (every test I've done copies fine). How do I migrate whatever I can recover manually? Or is there a step I'm missing to get DrivePool to do this for me automatically?
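    My test copies so far look like this (PoolPart name shortened; X: is the damaged disk and P: is my pool drive letter):

        # Copy whatever is readable out of the damaged disk's hidden PoolPart
        # folder into the pool. /E includes subfolders, /R:1 /W:1 retries each
        # bad file only once, /XJ skips junctions, and /LOG records what failed
        # so I know exactly which files were lost.
        robocopy X:\PoolPart.xxxx P:\ /E /R:1 /W:1 /XJ /LOG:C:\rescue.log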
  3. I've just finished coding a new balancing plugin for StableBit DrivePool; it's called the SSD Optimizer. This was actually a feature request, so here you go. I know that a lot of people use the Archive Optimizer plugin with SSD drives, and I would like this plugin to replace the Archive Optimizer for that kind of balancing scenario. The idea behind the SSD Optimizer (as it was with the Archive Optimizer) is that if you have one or more SSDs in your pool, they can serve as "landing zones" for new files, and those files will later be automatically migrated to your slower spinning drives. Thus your SSDs serve as a kind of super-fast write buffer for the pool. The new functionality of the SSD Optimizer is that it's now able to fill your archive disks one at a time, like the Ordered File Placement plugin, but with support for SSD "feeder" disks. Check out the attached screenshot of what it looks like. Notes: http://dl.covecube.com/DrivePoolBalancingPlugins/SsdOptimizer/Notes.txt Download: http://stablebit.com/DrivePool/Plugins Edit: Now up on stablebit.com; link updated.
  4. Hello everyone! I plan to integrate DrivePool with SnapRAID and CloudDrive and require your assistance regarding the setup itself. My end goal is to pool all data drives together under one drive letter for ease of use as a network share, and also to have it protected from failures both locally (SnapRAID) and in the cloud (CloudDrive). I have the following disks:
     - D1 3TB, D: drive (data)
     - D2 3TB, E: drive (data)
     - D3 3TB, F: drive (data)
     - D4 3TB, G: drive (data)
     - D5 5TB, H: drive (parity)
     Action plan:
     - create LocalPool X: in DrivePool with the four data drives (D:-G:)
     - configure SnapRAID with disks D:-G: as data drives and disk H: as parity
     - do an initial sync and scrub in SnapRAID
     - use Task Scheduler (Windows Server 2016) to perform daily syncs (snapraid.exe sync) and weekly scrubs (snapraid.exe -p 25 -o 7 scrub)
     - create CloudPool Y: in CloudDrive, with a 30-50GB local cache on an SSD, to be used with G Suite
     - create HybridPool Z: in DrivePool and add X: and Y:
     - create the network share pointing at X:
     Is my thought process correct in terms of protecting my data in the event of a disk failure or disks going bad (I will also use Scanner for disk monitoring)? Please let me know if I need to improve the above setup or if there is something I am doing wrong. Looking forward to your feedback, and thank you very much for your assistance!
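     For reference, the Task Scheduler jobs I have in mind would be created like this (C:\SnapRAID\ is just where I plan to put the binary):

         # Daily sync at 03:00
         schtasks /Create /TN "SnapRAID Sync" /SC DAILY /ST 03:00 /RU SYSTEM /TR "C:\SnapRAID\snapraid.exe sync"
         # Weekly scrub (25% of blocks older than 7 days), Sundays at 04:00
         schtasks /Create /TN "SnapRAID Scrub" /SC WEEKLY /D SUN /ST 04:00 /RU SYSTEM /TR "C:\SnapRAID\snapraid.exe -p 25 -o 7 scrub"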
  5. Hello. I think I might have encountered a bug in DrivePool's behavior when using torrents; shown here on a 5x1TB pool. When a new file is created on a disk that is part of the pool, and that file reserves X amount of space in the MFT but does not preallocate it, DrivePool adds that X amount of space to the total space available on the disk. DrivePool reports the correct HDD capacity (931GB) but the wrong volume size (larger than is possible on that particular HDD). To be clear, the file is not created "outside" of the pool and then moved onto it; it is created on the virtual disk (in my case E:\Torrent\...) on whichever HDD DrivePool decides to put it. The reported capacity goes back to normal after deleting the file.
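     The file shape is easy to recreate without the torrent client, from PowerShell (E:\Torrent\ is just my test folder):

         # Create an empty file, mark it sparse, then set a 10 GB logical length.
         # NTFS records the size without allocating clusters, which seems to be
         # what the torrent client does when it creates a new download.
         fsutil file createnew E:\Torrent\repro.bin 0
         fsutil sparse setflag E:\Torrent\repro.bin
         $f = [System.IO.File]::Open('E:\Torrent\repro.bin', 'Open')
         $f.SetLength(10GB)   # logical size 10 GB, allocated size ~0
         $f.Close()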
  6. Hey, I've set up a small test with 3 physical drives in DrivePool: 1 SSD and 2 regular 4TB drives. I'd like a setup where these three drives can be filled to the brim and all contents are duplicated only onto a fourth drive: a CloudDrive. No regular writes or reads should be done from the CloudDrive; it should only function as parity for the 3 drives. Am I better off making a separate CloudDrive and scheduling an rsync-style mirror of the DrivePool contents to it, or can this be done with a DrivePool (or DrivePools) + CloudDrive combo? I'm running the latest beta of both. What I tried so far didn't work too well: some files I was moving were immediately written to the parity drive, even though I set it to only contain duplicated content. I got that to stop by going into File Placement and unticking the parity drive for every folder (but this is an annoying thing to have to maintain whenever new folders are added).
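     If I go the separate-drive route, I'd schedule something like this robocopy in place of rsync (P: and Q: stand in for my pool and CloudDrive letters):

         # Mirror the pool onto the CloudDrive; /MIR also deletes files on Q:
         # that no longer exist on P:, so the parity copy tracks the source.
         robocopy P:\ Q:\ /MIR /R:2 /W:5 /LOG:C:\mirror.log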
  7. Varanus

    Trial license issue

    Hi, I decided to give DrivePool a try to see how I like it, and almost immediately after installing, setting up a trial activation, and fixing an issue I had with my BIOS settings, I get a notification saying I must activate to continue. All it says is "Get a trial license or enter your retail license code" and "Get a license", with all other UI blocked. Clicking "Get a license" shows me a screen with "Your trial license expires in 29.7 days" and asks for an Activation ID. I tried uninstalling and reinstalling, and got the same thing. Uninstalling failed to remove the pooled drive, so I'm not really sure what's going on there.
  8. I have a bunch of older 3, 4 and 6TB drives that aren't currently in my main system and that I want to make use of. What are some suggested options for adding these to a drive pool externally? I saw the IB-3680SU3 in one post, and have seen something like the Mediasonic H82-SU3S2 before as well. Is there anything I should be aware of to make this work optimally? I'll also have a few drives in my main system that would be part of the pool.
  9. DataHoarder18

    Migrating old harddrives to DrivePool

    I recently added two 8TB HDDs to my DAS, which already had a 5TB and an 8TB drive in it. I've used DrivePool on another machine, but not this one. I added all 4 HDDs to a new pool and moved the old data into the hidden "PoolPart" folders. The problem now is that the balancer won't pick up those "Other" files and redistribute them amongst the new drives. How can I get this data to be a part of the new pool?
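    For reference, the move I did per disk was roughly this (D: stands in for each old disk):

        # Find the hidden PoolPart folder on the disk, then move everything
        # else on that disk into it, skipping system folders. As I understand
        # it, DrivePool then needs a re-measure to count the files as pooled.
        $poolPart = Get-ChildItem 'D:\' -Directory -Force |
            Where-Object { $_.Name -like 'PoolPart.*' }
        Get-ChildItem 'D:\' -Force |
            Where-Object { $_.Name -notmatch '^(PoolPart\.|System Volume Information|\$RECYCLE\.BIN)' } |
            Move-Item -Destination $poolPart.FullName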
  10. joebeem11

    very strange upload behavior

    Hi, I am posting here to see if anyone has had any similar experiences when trying to upload data to GDrive via CloudDrive using a similar setup. First, I have two hard drives connected, which are used as my cache disks. Not sure if the size matters, but one is 1 TB and the other is 1.5 TB. They were created about 1 year ago using the default allocation sizes, so the maximum each cloud drive can be is 32 TB. I also use DrivePool to pool the cloud drives created on GDrive into one combined 64 TB drive. This is on an i5 2600k at 3.6 GHz with 16 GB RAM, on Windows 10 Home.

    I have been having trouble recently while trying to upload from the local disks: it seems that the PC needs to have the user logged in and active in order to upload anything. This isn't ideal because while I have a "decent" connection at 1 Gbps down, I only have 50 Mbps up. I can usually achieve close to 50 Mbps during non-peak times, but throughout the day / busy hours I usually max out around 30-35 Mbps. For this reason (and others), I typically try to let any data upload overnight while I'm sleeping. However, recently, as in the last 3 to 4 weeks or so, I've noticed that even though my connection appears to be uploading throughout the night, the "To Upload" figure remains pretty close to what it was before going to sleep. For example, last night it said I had 199 GB left to upload to the cloud. I left my computer unattended and uploading for 12+ hours (combined sleep and running errands throughout today), and when I checked just now, it says there is 184 GB left to upload. It seems kind of odd that I would have uploaded only around 15 GB in over 12 hours. My connection was steady throughout the night and no errors were reported by CloudDrive.

    Something that may be related: before going to sleep, I had cleared the cache on the drive, and the "To Upload" figure (199 GB) and the "Local" figure (201 GB) were very close, around a 2 GB difference. Upon returning to my computer, as mentioned, "To Upload" was 184 GB, but "Local" was now at 534 GB. It almost seems as if the difference between 534 GB and 201 GB is what my connection should have uploaded during the 12 hours, but for some reason it did not happen. This is not the first time something like this has happened either; it seems much more prevalent over the last several weeks, but I am not 100% sure when this behavior started.

    At first I thought I was doing something wrong somehow, which still might be true. However, I think I found the pattern of this potential issue. It seems that when I "lock" my computer and am away from it, the issue occurs: my connection appears to upload, but not much data actually gets uploaded. The issue does NOT seem to occur if I log on to the PC and just leave it unlocked and "active". I am doing a test now to see if this is true. To be clear, I am NOT logging the user out of Windows before letting it upload; I am simply locking the computer so it prompts for a password when someone tries to use it. I am also aware of the 750 GB per 24 hours upload limit on GDrive. I don't think my upload speed is fast enough for that to come into play, and I have also tried waiting a few days by pausing the upload, and the same thing happened.

    Has anyone experienced anything remotely similar to this? If so, is there anything that can be done? I don't understand why it would upload while I am active on the PC, but not while it's locked. Thanks for taking the time to read this.
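    PS: in case it's relevant, the power settings I'm double-checking while testing are these (elevated prompt; making sure nothing sleeps on AC power while the console is locked):

        powercfg /change standby-timeout-ac 0    # never sleep on AC power
        powercfg /change hibernate-timeout-ac 0  # never hibernate on AC power
        powercfg /query                          # dump the active power scheme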
  11. Quinn

    [HOWTO] File Location Catalog

    I've been seeing quite a few requests about knowing which files are on which drives, in case you need a recovery of unduplicated files. I know dpcmd.exe has some functionality for listing all files and their locations, but I wanted something that I could "tweak" a little better to my needs, so I created a PowerShell script to get me exactly what I need. I decided on PowerShell, as it allows me to do just about ANYTHING I can imagine, given enough logic. Feel free to use this, or let me know if it would be more helpful "tweaked" a different way...

    Prerequisites:
    - You gotta know PowerShell (or be interested in learning a little bit of it, anyway).
    - All of your DrivePool drives need to be mounted as a path (I chose to mount all drives as C:\DrivePool\{disk name}). Details on how to mount your drives to folders can be found here: http://wiki.covecube.com/StableBit_DrivePool_Q4822624
    - Your computer must be able to run PowerShell scripts (I set my execution policy to 'RemoteSigned').

    I have this PowerShell script set to run each day at 3am, and it generates a .csv file that I can use to sort/filter all of the results. Need to know what files were on drive A? Done. Need to know which drives are holding all of the files in your Movies folder? Done. Your imagination is the limit. Here is a screenshot of the .csv file it generates, showing the location of all of the files in a particular directory (as an example):

    Here is the code I used (it's also attached in the .zip file). Note that the two Substring offsets are hard-coded for my layout: 13 characters for "C:\DrivePool\" plus my 5-character disk names, and 64 to skip past the PoolPart.{guid} folder. Adjust them for your own mount paths.

        # This saves the full listing of files in DrivePool
        $files = Get-ChildItem -Path C:\DrivePool -Recurse -Force | where {!$_.PsIsContainer}
        # This creates an empty table to store details of the files
        $filelist = @()
        # This goes through each file, and populates the table with the
        # drive name, file name and directory name
        foreach ($file in $files) {
            $filelist += New-Object psobject -Property @{
                Drive         = $(($file.DirectoryName).Substring(13,5))
                FileName      = $($file.Name)
                DirectoryName = $(($file.DirectoryName).Substring(64))
            }
        }
        # This saves the table to a .csv file so it can be opened
        # later on, sorted, filtered, etc.
        $filelist | Export-CSV F:\DPFileList.csv -NoTypeInformation

    Let me know if there is interest in this, if you have any questions on how to get this going on your system, or if you'd like any clarification of the above. Hope it helps! -Quinn

    gj80 has written a further improvement to this script: DPFileList.zip
    And B00ze has further improved the script (Win7 fixes): DrivePool-Generate-CSV-Log-V1.60.zip
  12. p4nts

    Newbie setup

    Sorry about the n00bness of this question, but I'm not a Windows Server setup expert and any help/guidance would be much appreciated. I've got a new MicroServer: 16GB RAM, 4 cores, a 512GB SSD (OS) and 2x10TB NAS drives. I was hoping to set this up with Windows Server 2016 (with Desktop Experience) to act as a file server for a Linux-based Kodi box. I've got various other Windows machines which would be looking to access shares on the server. Once 3-6 months have gone by and hopefully disk prices have fallen a bit, I was planning on getting a third disk and hooking up SnapRAID. I'm looking to run Hyper-V so that I can fire up VMs to use my MSDN sub to set up some test labs for work/community stuff, as well as using the improved Docker support to dig into containers. I was hoping to avoid having to set the box up as a DC; or does DrivePool require "Server Essentials Experience" to be enabled? Or would I be better going with Win 10 Pro and just turning everything off? Finally, is ReFS still proving problematic? (SnapRAID covers that anyway.)
  13. ericksilvas

    HybridPool confusion

    So I'm a bit confused by the HybridPool concept. I have a LocalPool with about 6 hard drives totalling 30TB, of which about 25TB is populated, and with this new hybrid concept I'm trying to figure out if I can use another pool (a CloudPool) for duplication. The part where I'm a little confused: when I create a HybridPool from my CloudPool and my LocalPool, I have to add all 25TB of my data to the new HybridPool drive, and on the LocalPool all my data appears in the "Other" section. Do I really have to transfer all the data from the LocalPool to the HybridPool, or am I missing something? Thanks
  14. Painted

    Can't write to pool

    Suddenly I cannot write to my pool. I can write to the individual drives, and can create folders etc. in the pool, but any copy says there is zero space, even though there is plenty. I have installed the latest DrivePool, restarted, and tried a number of the usual tricks. Halp!
  15. Hi, referring to this post about how DrivePool acts with different-size drives: I found myself copy/pasting a file, and DrivePool used the same drive for the entire operation (as both the source and the destination). This resulted in really poor performance (see attachments; just ignore drive E:, there are some rules that exclude it from general operations). The same thing happens when extracting some .rar files. Is there a way to make DrivePool prioritise performance over "available space" rules? I mean, if there is available space on the other drives, it could use those instead of the "source" one. Thanks
  16. MetalGeek

    Down sizing my server

    I'm designing a replacement for my current server. Goals are modest: less heat and power draw than my existing server. Roles are file server with DrivePool, and Emby for media with very light encoding needs. Local backups are with Macrium Reflect, off-site backups with SpiderOak. OS will be Win 10 Pro. My current rig is here:

    New build (PCPartPicker part list: https://pcpartpicker.com/list/yhdVqk ; price breakdown by merchant: https://pcpartpicker.com/list/yhdVqk/by_merchant/ ):
    - Memory: Crucial 8GB (2 x 4GB) DDR3-1600 ($63.17 @ Amazon)
    - Case: Fractal Design Node 304 Mini ITX Tower ($106.72 @ Newegg)
    - Power Supply: EVGA B3 450W 80+ Bronze Certified Fully-Modular ATX ($53.73 @ Amazon)
    - Operating System: Microsoft Windows 10 Pro OEM 64-bit ($139.99 @ Other World Computing)
    - Other: ASRock J3455B-ITX motherboard & CPU combo ($73.93 @ Newegg)
    - Total: $437.54 (prices include shipping, taxes, and discounts when available; generated by PCPartPicker 2018-01-16 10:30 EST-0500)
    - Storage controller: LSI SAS 2-port HBA flashed to IT mode
    - Boot: PNY 120GB SSD
    - Data pool: 4 x 2TB WD Red
    - Backup pool: 1 x 5TB Seagate (shelled USB drive)

    When this is built my old rig goes up for sale. My only doubt is the CPU. Could I go with a dual-core Celeron for about 80 bucks? Will it handle a single encoding stream and not draw more power than the i3T? The Celeron will do fine as a file server.
  17. blackhawk8863

    Notifications during Remote Control

    I have two networks, one which is connected to the internet (1) and one which is not (2). I have a machine (A) running DrivePool and Scanner on (2), and a machine (B) connected to both (1) and (2). Machine (B) is set up to remotely monitor (A). Will any notifications generated on (A), which is being remotely monitored by (B), be sent from machine (B) to my email? Sending a test message from Scanner on machine (A), which is not connected to the internet, creates the error "Test email could not be sent to blank@blank.com. Object reference not set to an instance of an object." Sending a test message from DrivePool on machine (A), DrivePool claims to have successfully sent the test email. In both instances I do not receive an email. When I attempt to send a test message from machine (B), I get the same responses and error: DrivePool successful, Scanner "object reference" error, and I do not receive an email for either attempt. Aside from the "object reference" error, should I be able to funnel notifications through another machine in the way I am attempting? DrivePool: 2.1.1.561 x64; Scanner: 2.5.1.3062. Thank you.
  18. Frank1212

    Onedrive drivepool using clouddrive

    Hi, I have an Office 365 account where I get five 1TB OneDrives. I am trying to link 3 of them together using CloudDrive and DrivePool. I have the following storage drives in my PC:
    - 4x Toshiba HDWE160 6TB
    - 1x Seagate ST3000DM001-9YN166 3TB
    I have all the drives pooled together using DrivePool. When creating the OneDrive pool, how should I create the CloudDrive cache? Should I put all 3 cache partitions on a single, dedicated cache disk? Can I put a single cache partition on 3 different drives in the storage pool? Or do I need a dedicated cache drive for each of my OneDrive cloud connections? What are your recommendations? I've tried putting the cache partitions on the same dedicated cache disk, and I get a BSOD every time I write a lot of files to it. Thank you.
  19. In the advanced settings of Plex Media Server, setting the transcoder temporary directory to a folder in a DrivePool causes Plex to show an error on playback. I wanted to check if anyone else sees that behaviour; if so, maybe include it in the limitations. Also, perhaps a general guideline would be to put "cache" folders outside of DrivePool? EDIT: It looks like it works after all; the problems were due to another error that caused the DrivePool to go into read-only mode (which a reboot fixed).
  20. stc

    Blockchain Data OK on DrivePool?

Is blockchain data OK to be stored on DrivePool? As opposed to directly on the hard disk (metal). Examples: Bitcoin Core, Bitcoin Armory, Ethereum Mist.
  21. tgod

    Google Drive & Cache Errors

    Hi, I'm kind of new to CloudDrive but have been using DrivePool and Scanner for a long while and never had any issues at all with them. I recently set up CloudDrive to act as a backup to my DrivePool (I don't even care to access it locally, really). I have a fast internet connection (Google Fiber Gigabit), so my uploads to Google Drive hover around 200 Mbps, and I was able to successfully upload about 900GB so far. However, my cache drive is getting maxed out. I read that the cache limit is dynamic, but how can I resolve this? I don't want CloudDrive taking up all but 5 GB of this drive. If I understand correctly, all this cached data is basically data that is waiting to be uploaded? Any help would be greatly appreciated! My DrivePool settings: My CloudDrive errors: The cache size was set to Expandable by default, but when I try to change it, the option is grayed out. The bar at the bottom just says "Working..." and is yellow.
  22. When using DrivePool in conjunction with CloudDrive, I have a problem when uploading a large amount of data. I have the following setup: 3x 3TB drives as cache for CloudDrive, with 500GB dedicated and the rest expandable, in an EU datacentre with 1Gb/s internet connectivity. I have a DrivePool with 3 different accounts: 2x Google Drive and 1x Dropbox. I have another DrivePool where my downloads go, consisting of the space left over on the drives listed above. When I attempt to copy/move files from the downloads pool into the CloudDrive pool, one drive always drops off, randomly, never one in particular. DrivePool then marks the drive as read-only, and I can't move media to it. I would have thought the cache would handle this temporary outage; I would also expect the local drive cache to handle the sudden influx of files and not knock off the CloudDrives. And I would think DrivePool would still be usable and not mark the drive as read-only. What am I doing wrong, and how do I fix this behaviour?
  23. korso

    NTFS compression and failed drive

    Hello everyone! Another happy user of DrivePool with a question regarding NTFS compression and file evacuation. A few days ago, one of my drives started showing a reallocated sectors count. StableBit Scanner ordered DrivePool to evacuate all the files, but there was not enough space, so some of them remained. I bought another drive, which I added to the pool, and tried to remove the existing one, getting an "Access is Denied" error. Afterwards, I tried to force evacuation of files from the damaged drive using the appropriate option in StableBit Scanner. This triggered a rebalance operation which was going very well, but then I noticed several hundred GB marked as "Other" not being moved. Then it struck me that the new drive has some files without NTFS compression, whereas the old drives in the pool had it enabled. I think that, since the checksums are not the same for compressed and uncompressed files, this is somehow confusing the scanner. What I have done so far (for consistency at least; I hope this doesn't make things worse!!!) is to disable compression on all the folders where I had it enabled (on the old drives, including the faulty one) and wait for the rebalance to complete. Is this the right approach? Is this also expected to happen when using NTFS compression? Is it actually not worth the hassle to have it enabled in DrivePool? (I was getting not-fantastic savings, but hey, every little helps, and I wasn't noticing performance degradation.) Hope the post somehow makes sense, and I also hope my data is not compromised by having taken the wrong steps! Thanks! DrivePool version: 2.2.0.738 Beta; Scanner: 2.5.1.3062; OS: Windows 2016. Attached DrivePool screenshot as well.
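    For the record, the decompression I ran on each old drive was roughly this, from an admin prompt (PoolPart name shortened to fit):

        # /U = uncompress, /S = recurse into subfolders,
        # /I = continue past errors (useful on the failing disk), /Q = quiet
        compact /U /S:D:\PoolPart.xxxx /I /Q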
  24. Hi all. I'm testing out a setup under Server 2016 utilizing DrivePool and SnapRAID. I am using mount points to folders. I did not change anything in the snapraid conf file for hidden files:

         # Excludes hidden files and directories (uncomment to enable).
         #nohidden

         # Defines files and directories to exclude
         # Remember that all the paths are relative at the mount points
         # Format: "exclude FILE"
         # Format: "exclude DIR\"
         # Format: "exclude \PATH\FILE"
         # Format: "exclude \PATH\DIR\"
         exclude *.unrecoverable
         exclude Thumbs.db
         exclude \$RECYCLE.BIN
         exclude \System Volume Information
         exclude \Program Files\
         exclude \Program Files (x86)\
         exclude \Windows\

     When running snapraid sync, it outputs that it is ignoring the covefs folder:

         WARNING! Ignoring special 'system-directory' file 'C:/drives/array1disk2/PoolPart.23601e15-9e9c-49fa-91be-31b89e726079/.covefs'

     Is it important to include this folder? I'm not sure why it is excluding it in the first place, since nohidden is commented out. But my main question is whether covefs should be included. Thanks.
  25. jjross

    Files not distributing to new drive

    I've added a new drive to my pool and I'd like to spread the files in the pool over to the new drive. I realize this isn't the default behaviour, so I've added the "Disk Space Equalizer" balancer to my system. At the moment it's the only balancer I have enabled, and the system STILL isn't moving the files (and therefore isn't balancing the space). Any idea what I'm doing wrong? Is there a log or something I can check to see why it's ignoring the only balancer present?