Search the Community
Showing results for tags 'backup'.
-
Dear Support,

I have some trouble understanding what the best method would be, in case an SSD fails, to restore the files and rebuild the pool from an external hard disk backup. I would like to briefly explain two backup methods that I understand might work, not knowing which method is more reliable. In both cases I would use a program like Synergy, which copies all files and changes from a source folder to a destination folder (my external backup drive). I only need an occasional backup (once a week). In this setup I would not use any folder duplication or pool file duplication feature from StableBit.

Method 1 for backup:

Source -> Destination (Backup Drive, 14 TB)
D: SSD1 -> Folder SSD1
E: SSD2 -> Folder SSD2
F: SSD3 -> Folder SSD3
G: SSD4 -> Folder SSD4
H: (DrivePool combining SSD1 to SSD4)

As I understand it, all the pool files are stored in NTFS format in the hidden pool folder on each of SSD1 to SSD4. My first method would be to back up these "hidden" files of each SSD (SSD1 to SSD4) into a separate folder on the backup drive. In case SSD1 fails, I would recover the files from "Folder SSD1" of the backup drive onto a new SSD, and that new SSD would be automatically recognised as belonging to the pool when enabled. I would just need to give the new SSD the same label as the old SSD, or I would need to use the "identify" feature of StableBit when I see the "Drive is missing" info. Would this work?

Method 2 for backup:

Source -> Destination (Backup Drive, 14 TB)
D: SSD1
E: SSD2
F: SSD3
G: SSD4
H: DrivePool combining SSD1 to SSD4 -> Entire Pool Backup

This second method would be easier to set up, because I would just back up the entire file structure from the virtual DrivePool drive (H:) to the backup drive. In case SSD1 fails, I would use the "remove" feature of DrivePool and add a completely new, empty SSD of the same or larger size to the pool. Then I would recover the files from the backup drive to the pool drive H:, and StableBit would fill up this new SSD1 in whatever manner it wants to store data on it. In this case I would not have to worry about what file content each SSD has, because I already have a backup of the entire pool drive and StableBit would manage everything else in the background...

——————————————————————————

Would both methods work for having a safe backup to restore the files from? If both work, I would tend to use Method 2, as it is easier to back up one virtual pool drive instead of 4 individual SSDs. Are there any disadvantages to using Method 2? Is there an easier method using the StableBit pool file duplication feature, for example? (Although I really like how Synergy handles backing up files.) As I understand it, I don't need the StableBit folder duplication feature because I use a separate backup on an external hard drive, is that correct?

Thanks a lot,
Cheers, nostaller
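As a rough illustration of Method 2, here is a minimal sketch of how the weekly copy could be scripted with robocopy instead of a dedicated sync program (the post uses Synergy; robocopy is only a generic stand-in). The backup drive letter is not given above, so X: is assumed here purely as a placeholder:

# Mirror the pool drive (H:) into a folder on the external backup drive (X: is hypothetical).
# /MIR makes the destination match the source, /R and /W limit retries on locked files,
# and /XD skips the volume's system folder.
robocopy H:\ X:\PoolBackup /MIR /R:1 /W:1 /XD "System Volume Information" /LOG:X:\PoolBackup.log

Note that /MIR also deletes files from the destination that no longer exist in the pool, which matches a "mirror" style backup rather than an archive with version history.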
-
Hi, I've got a question. OK, here's what I am starting with:

Local machine:
D: 2 GB
E: 5 GB
F: 3 GB

Local network server: Windows file share, lots of space.
Network: 10 Gbit.

Now, what I want to do (already tried, actually):

1st pool: Local Pool. D: E: F:, no duplication, just 1 big 10 GB pool. "Local Pool Drive" is the name.
2nd pool: CloudDrive, Windows File Share provider, 10 GB disk, no encryption. "File Share Drive" is the name.
3rd, final pool: "Local Pool Drive" + "File Share Drive".

Some folders, like my Steam games, I don't want to duplicate because Steam is my 'backup'. I set this up, but something went backwards, as it seems to be trying to put the un-duplicated files on File Share Drive, which is not what I want. I am currently taking my files back off so I can rebuild this. I think I've set some stuff up wrong; I understand the basics, but I am missing something in the details.

Now, my thinking was to allow me to have a local 10 GB drive and a 10 GB mirror on my file server, so I get the best of both worlds. It seems it can "technically" do this. What I would like to know is: when reading files, will I get local copy reads, or is everything going to grind to a halt with server reads? Is there a way to force local reads? I expect a cache would hide the uploads unless I went crazy on it (hint: I copied all my files onto it, it went crazy), but I was hoping for a quasi-live backup. So I get full use of my local disk sizes and have a live backup if a drive ever fails or I want to upgrade a disk. Is this one of those "yes, it can, but it doesn't work like you think" or "yes, you're just doing it wrong" things?

What are the best options for CloudDrive (pinning, network I/O settings, cache size, drive setup settings)? What are the best options for the local disks D:, E:, F: and the local pool disk?

Oh, and as a nutty side note: I tend to play games that like SSDs. I have a C: SSD but it's limited in size (yes, I know, buy a new one... eventually I will), so I would like to set up those games alongside my others, add C: to D:, E: and F:, and tell it no files go on C: except for the ones I say, and EVERYTHING just looks like one D: drive because the actual drive letters will be hidden in Disk Management. So, if I want to move a game onto or off of my C:, DrivePool SHOULD make it as easy as clicking a couple of boxes, right?

That's a lot, so summary:
Can I force local reads when using a pool made of a local pool and a CloudDrive pool together?
Can I force certain files onto specific drives only? (Seems I can, just want to confirm.)
Can specific folders be excluded from duplication? (Again, I think I know how, but there's a larger context here.)
Can I do the above 3 at once?
Thanks!
4 replies
Tagged with: drivepool, cloud drive (and 3 more)
-
I've been seeing quite a few requests about knowing which files are on which drives, in case a recovery is needed for unduplicated files. I know dpcmd.exe has some functionality for listing all files and their locations, but I wanted something that I could "tweak" a little better to my needs, so I created a PowerShell script to get me exactly what I need. I decided on PowerShell, as it allows me to do just about ANYTHING I can imagine, given enough logic. Feel free to use this, or let me know if it would be more helpful "tweaked" a different way...

Prerequisites:
- You gotta know PowerShell (or be interested in learning a little bit of it, anyway).
- All of your DrivePool drives need to be mounted as a path (I chose to mount all drives as C:\DrivePool\{disk name}). Details on how to mount your drives to folders can be found here: http://wiki.covecube.com/StableBit_DrivePool_Q4822624
- Your computer must be able to run PowerShell scripts (I set my execution policy to 'RemoteSigned').

I have this PowerShell script set to run each day at 3am, and it generates a .csv file that I can use to sort/filter all of the results. Need to know what files were on drive A? Done. Need to know which drives are holding all of the files in your Movies folder? Done. Your imagination is the limit.

Here is a screenshot of the .CSV file it generates, showing the location of all of the files in a particular directory (as an example):

Here is the code I used (it's also attached in the .zip file):

# This saves the full listing of files in DrivePool
$files = Get-ChildItem -Path C:\DrivePool -Recurse -Force | where {!$_.PsIsContainer}

# This creates an empty table to store details of the files
$filelist = @()

# This goes through each file, and populates the table with the drive name, file name and directory name
foreach ($file in $files) {
    $filelist += New-Object psobject -Property @{Drive=$(($file.DirectoryName).Substring(13,5));FileName=$($file.Name);DirectoryName=$(($file.DirectoryName).Substring(64))}
}

# This saves the table to a .csv file so it can be opened later on, sorted, filtered, etc.
$filelist | Export-CSV F:\DPFileList.csv -NoTypeInformation

(Note: the hard-coded Substring offsets match my own mount-path lengths, so adjust them if your paths differ.)

Let me know if there is interest in this, if you have any questions on how to get this going on your system, or if you'd like any clarification of the above. Hope it helps!

-Quinn

gj80 has written a further improvement to this script: DPFileList.zip
And B00ze has further improved the script (Win7 fixes): DrivePool-Generate-CSV-Log-V1.60.zip
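For reference, a minimal sketch of one way such a daily 3am task could be registered from PowerShell (the script path C:\Scripts\DPFileList.ps1 is only an assumed example; the post does not say where the script is saved):

# Register a scheduled task that runs the file-listing script every day at 3am (sketch; paths assumed).
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy RemoteSigned -File C:\Scripts\DPFileList.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'DrivePool File List' -Action $action -Trigger $trigger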
50 replies
Tagged with: powershell, drivepool (and 3 more)
-
Hey all, I figured I would ask here because smart people use DrivePool! I just got rid of WHS 2011 as my home server software due to unacceptable issues, unreliability and boot problems. Obviously it did have a great client PC backup system that worked fairly well, which Windows 10 does not. I am running a drive pool with first-level folders that I map various users to on their PCs, so we are not talking high tech here!

I gave Veeam Endpoint Free a go, but it will not 'see' the network mapped drive I allocated for the backups. The permissions are all OK, it's just being a pain. I am now onto EaseUS ToDo Backup, which does back up to my 'home server' mapped network drive and seems to be OK. But what else is out there that you rate, free or paid if it is worth the investment, and are there any other Windows 10 friendly ones that would allow me to centrally administer the backups like my WHS software did? If I don't find anything better I will probably just stay with the EaseUS solution as it is pretty neat, unless Veeam support can fix their issue. What-cha-got?
-
Drive pool backups/Keeping files in folders together
APoolTodayKeepsTheNasAway posted a question in General
Hi, I apologize if this is too general or not the right place to post this, but I haven't been able to find an answer thus far, so I'm posting this in the hope I can get answers without testing myself.

I currently have quite a few hard drives in a JBOD collection. What I hope to do is take one pool's worth of 15 TB of in-use drives, combined with a total of 28 TB in new storage, and end up with a pool of 21 TB in always-on drives backing up periodically to a pool of 22 TB of drives (TBD), which would be used either for file duplication in a RAID 1 fashion or for backup. What I'm unsure of is how I can accomplish this with DrivePool. I imagine I would create the first, always-on pool, then create the second one, simply turning off and disconnecting all the drives it uses when it has finished backing up, then plugging them back in to refresh said backup. What I'm hoping would be the case is that in the event of a failure, I would be able to see which files and folders were on the failed drives and restore them from the backup, hopefully automatically and without disturbing anything else.

----------------------------------------------------------------------------------------------------

Separately to that, as it seems I can't test it on some USB drives I have lying around (can USB flash drives work?!), I would love to know how granular the pooling system is. Say, for instance, I have 1000 differently named folders filled with multiple files; can I then set DrivePool to only move around folders with certain attributes, keeping all their child files and perhaps child folders within? The way that would practically work is that upon placing a new file into a folder, instead of rebalancing the location of the file based on size, the whole folder would be moved. Reading through the documentation in the form of the FAQ, I can't seem to find a way to accomplish this, which would be great for restoring from backups in the case of a drive failure or accidental deletion. The File Placement options seem very close to, but just short of, this, unless I'm missing a way to configure them to act this way.

----------------------------------------------------------------------------------------------------

Additional information: I plan on filling some sort of external multi-bay USB enclosure to hold the additional drives for my Windows 10 computer, using StableBit DrivePool and Scanner to manage all of the drives.
7 replies
Tagged with: backup pool, file placement (and 2 more)
-
I have tried to search for this but haven't seen an answer. What files do I need to back up if I need to reinstall the OS? I have done this once before and I lost all the information on drive statuses, bays, etc. I would like to preserve that information on my reinstall. I'm sure I could figure it out if I looked long enough, but I'm hoping you have this information handy.
-
Hi, I'm setting up a new home server and I just downloaded the DrivePool and Scanner apps to test out. I am trying to figure out the best setup to maximize my storage space but also have solid "real" backups and not just duplicated files. Duplication of files is great, but it isn't a real backup, since if I accidentally delete a file the duplicate is also automatically deleted. Duplication also doesn't allow for file versioning.

I have six 8 TB drives and I was thinking of creating two 24 TB pools: a "Data" pool for live storage and a "Backup" pool to store backups of everything on the Data pool. I was also thinking of using the free version of CrashPlan to handle the backups. My question is this: would it be better to have CrashPlan back up the individual drives in the Data pool, or just have it back up the entire shared Data pool as a whole? My thinking was that if I backed up the individual drives separately and one of the drives in the Data pool failed, I could just replace it with a new drive and restore the backup of the old drive onto it to reconstitute the Data pool. However, not knowing all the technical details of how DrivePool works, I wasn't sure if that would actually work. Any thoughts or comments on this?
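As background for the per-drive option: DrivePool keeps pooled files inside a hidden PoolPart folder on each member drive, so a per-drive backup would need to include those hidden folders. A minimal sketch of how to see them in PowerShell (the drive letters D:, E:, F: are placeholders, not taken from the post):

# List the hidden PoolPart folders that DrivePool creates on each pooled drive (drive letters assumed).
'D:\', 'E:\', 'F:\' | ForEach-Object { Get-ChildItem -Path $_ -Directory -Force -Filter 'PoolPart.*' }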
-
Hello all, I am just coming here to get specific instructions on how to safely and relatively easily format my current Plex media server PC. I use StableBit DrivePool and Scanner on that main machine to monitor my Western Digital Red 4 TB HDDs (x4, in a pool using StableBit). Long story short, that PC is acting very weird and slow, not loading half the time, and it's freaking me out because my entire media collection is on those drives. It seems that the motherboard and PC do not fully work right with Windows 10, so I guess I have no other choice but to format the Windows drive (C: ONLY) and put Windows 7 Ultimate on it; that way it will be more stable with Windows 7 drivers, since Windows 10 hardly has any drivers for my old motherboard. (The PC is about 6 years old, with a new GPU and the WD HDDs.) Hopefully RDP will be compatible between Win 7 and my other Win 10 machine, but that's another issue for another day.

I need to safely protect my data and do a fresh install of Windows 7 Ultimate 64-bit. How do I do this without losing my data or causing issues? Hopefully I can reinstall everything (StableBit, Plex server, etc.) with ease, and I want the drives and letters to stay exactly the same. I would like to end up with the exact same setup, except installed on Win 7. Thanks.

PS: Side note, I have an external HDD for backup purposes. How do I back up my media to it without creating duplicates? Some of the files already exist on there; I just have not updated my collection on the backup hard drive in a while, so I don't know where I'm at, and I don't want to waste space by manually trying to put only the new media on there (it would take ages to figure out).
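On the PS question, a minimal sketch of one way the external backup could be refreshed without re-copying what is already there: robocopy skips files whose size and timestamps are unchanged at the destination, so only new or modified media gets copied. The paths are assumptions, not taken from the post (P: standing in for the pool, X: for the external HDD, and a Media folder):

# Copy only new or changed media to the external backup drive; unchanged files are skipped automatically.
robocopy P:\Media X:\Media /E /R:1 /W:1 /LOG:X:\media-refresh.log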
-
I've searched but cannot find (perhaps I missed it) whether it is possible to back up a WHS 2011 server to a DrivePool pool. I created a pool and it was listed in the Server Backup options as a virtual disk. But when I selected it, I received a dialog box telling me that I had selected a drive that spans multiple disks, which would be reformatted with all data lost. Since I had existing data on it, I stopped. Is there some change in procedure needed? I'm trying to reduce how often I physically carry my backups off site by creating a larger drive to receive the backup files. Thanks.
-
Hey all! I'm still on my trial of DrivePool but am overall liking it quite a bit. However, I've run into one major problem I don't know how to solve, and I'm hoping someone here knows a solution.

I use the paid version of Macrium Reflect to back up my entire gaming PC. It writes a main image that's a full backup (it usually comes out to about 1-1.3 TB after compression) and then writes up to 4 incremental backups after that, then rotates those out, always keeping no more than 5 backups. My DrivePool lives on my home server under my desk, and I run the Macrium Reflect backup set to a share I have on that server. When I tried to start my backup, my pool looked like the attached screenshot: about 4 TB of space left between a few drives. When I execute my Macrium Reflect script, it also reports the same amount of space remaining. However, multiple times today I've tried to run the full backup and it's failed between 20% and 40%. According to the Reflect logs, it's because it... ran out of space. Even without compression, the total backup from my gaming PC is less than 2 TB, so it most certainly is not running out of space. As you can see from the screenshot, I do have duplication turned on, but even with that taken into account, there's still more than enough space available.

I'm not sure why this would be happening or what to do about it. I can engage Macrium support as well, but I figured someone here might have a better idea. I can set Reflect to split the backup into multiple smaller files and may try that, but I'm not sure why that would be necessary; I'm not sure why writing one large file over the network would keep failing this way. Does anyone have an idea why this might be happening and whether there's some kind of solution? Thanks all!
-
What are you guys using for an off-site incremental backup solution for your drive pool?
13 replies
Tagged with: Off-Site, incremental backup (and 1 more)
-
Hello, I'm currently receiving SMART warnings for my main SSD, so I'm looking at purchasing a new one ASAP. I've already made a backup of my main drive using Acronis True Image 2015, so I'm ready to go there. Should I expect any wonky effects on my pool itself upon restoring my main drive from the backup? Are there any precautionary measures I should take before I make the switch or while restoring? Currently using DrivePool x64 v2.1.1.561 on Windows 8.1. Thanks!
-
Hello, I have been using StableBit DrivePool since May of 2012 and have been happy as a clam, no problems to speak of. I finally have a reason to register and post. I have an HP N40L with 5 drives: a 250 GB drive for the OS, two 750 GB drives and two 1 TB drives. Recently I removed a 750 and added a 3 TB drive. I formatted the drive over Remote Desktop as one large 3 TB volume, added it to the pool, and it was working great. Now I see that my server backups are failing (I back up to a 2 TB external drive). The failure shows as \\?\volume{e and a bunch of numbers (assuming that's the drive ID). Do I have to remove the drive and let WHS 2011 format it as a 2 TB and a 1 TB volume? I hate splitting disks like that. I thought there was a KB that fixed the >2 TB issue. Thanks for any help, and keep up the great work on a great product. Oh, and I am using 1.3.7585.
-
Hello there, I am using the registered Scanner v2.5.1.3062 under Windows Server 2012 R2 x64, running some Hyper-V VMs. Whenever Windows Server Backup performs backups (during the shadow copy phase, I think), I get lots of errors like this in the Windows logs: The hard disk and volume numbers keep changing, and do not correspond to a real HDD in my setup. From what I read, WSB temporarily mounts virtual drives to perform backups. Those errors (~10 per backup) are raised by the Scanner.Service.exe process from StableBit Scanner. I suspect that the Scanner sees those virtual drives and tries to check them, but fails because (1) the drives are virtual and (2) they are suddenly unmounted by WSB. Finally, I get 2 errors from StableBit Scanner: All of this does not seem very harmful, but it is really annoying. Is there a way to avoid those errors? Thanks
11 replies
Tagged with: Server 2012, backup (and 1 more)