
asdaHP

Members
  • Posts: 46
  • Joined
  • Last visited

Posts posted by asdaHP

  1. 18 hours ago, Spider99 said:

    I use Veeam to back up the server OS drive and any non-pool drives.

    I use SyncBackPro to back up my pool - it has millions of options - but I back up pool A to pool B as a simple mirror (it overwrites, deletes, or copies new files, etc.) - basically a set-and-forget option.

    I also get it to do a backup to my QNAP NAS once a week across my network (10G), and it runs as fast as the DP disks can provide the data.

    If it has an issue when running, it will pop up an HTML report on the server - something you can't miss - for me it's usually because I removed/renamed something while it was running.

    Possibly the free version of SyncBack may be enough - but I bought it a while ago and it works very well with DP.

    Thank you!

  2. On 12/1/2020 at 7:56 AM, Spider99 said:

    2012 R2 and Scanner work fine - it picks up bad sectors when the scan is run, or when SMART reports them.

    But it's not going to find dead drive sectors in real time - if that's what you want, I don't think anything does that, as it would kill the drives in weeks/months because you would have to read each sector again and again.

    Maybe you need better backup software that does a hash check / compares the files before it overwrites?

    Thanks. Sorry, I didn't see your reply in my email inbox yesterday. Just found it now.

    I see what you mean. By the way, I wasn't referring to corrupt sectors on the drives in DrivePool itself. DrivePool is in my server, which backs up the client PCs, and one of these PCs presumably developed corruption from bad sectors. The bad files were probably being copied onto the server drives. I didn't know there were corrupt files till I tried restoring the drive and Windows stated they couldn't be copied to the new drive because of corruption. I am glad I had multiple versions of backups and the older ones were fine.

    By the way, are there backup programs that you like more? I tried Veeam before, but it wouldn't work with DrivePool's duplication setup.
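    Spider99's hash-check suggestion above can be sketched minimally in Python. This is an illustrative sketch, not any particular product's behavior, and the function names are made up for the example:

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so large files don't load into RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def mirror_file(src: Path, dst: Path) -> bool:
    """Copy src over dst only when the content hashes differ.

    Returns True if a copy happened. A corrupt source still gets copied:
    hashing only prevents pointless overwrites of identical files, so
    versioned backups remain essential.
    """
    if dst.exists() and sha256_of(src) == sha256_of(dst):
        return False  # identical content; leave the backup alone
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)
    return True
```

    Note the limitation: a hash compare cannot tell a corrupt source file from a legitimately changed one, which is exactly why keeping multiple backup versions mattered in the situation described above.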

     

  3. 8 hours ago, Spider99 said:

    I would go with option 1 - as you only have two drives with duplication, they are the "same" - it does not matter which one you choose - VSS data will not be copied, etc.

    3 would work, but it will be slower than 1.

    2 - avoid; cloning is liable to give you problems.

    And yes, shut down any service that's writing to the pool before you start - more for maintaining the best speed.

    Internal copy will be approx 1TB per 3 hrs, give or take - remember speed will vary by file size (lots of small files: very slow; large files: quick) and by where the data is on the disk - I suspect that the new 8TB will be quicker than the 4TB, so the speed will depend on the 4TB disk....
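    As a back-of-envelope check on the figure above, 1 TB per 3 hours corresponds to roughly 93 MB/s sustained, so moving about 3 TB per drive is an overnight job:

```python
TB = 10**12  # drive makers use decimal terabytes

def copy_hours(data_tb: float, tb_per_hour: float = 1 / 3) -> float:
    """Estimated wall-clock hours to move data at a given sustained rate."""
    return data_tb / tb_per_hour

# ~1 TB per 3 hours is roughly 93 MB/s sustained
rate_mb_s = (TB / 3600 / 3) / 10**6
print(f"{rate_mb_s:.0f} MB/s")          # → 93 MB/s
print(f"{copy_hours(3.0):.0f} hours")   # ~3 TB of pool data → 9 hours
```

    Real throughput varies with file sizes and disk position as noted, so treat this as a planning estimate only.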

    Thanks @Spider99. I am glad you are still here on this board. Hope you are well. It's been a while since I posted.

    OK, I will plan on option 1. Now just to get some Red Plus 8TB drives.

    If I remember correctly, you used to run WS 2012 R2 with DP and Scanner. I probably should post this question in the Scanner section, but since you probably have a similar setup, I meant to ask you: does Scanner not pick up corrupt sectors/files from a failing drive? This weekend, coincidentally, one of my home PCs had a data drive start clicking. Before I got to clone and replace it, it was all gone. I thought no big deal, I have a nightly backup. Well, apparently bare metal restore is a little more complicated on WS 2016 Essentials than it was on my old EX490 WHS and needs a bunch of steps (Windows PE etc). So I went with recovering selected folders using the restore function. Here I ran into issues, with some folders and files in the backup apparently corrupt (per the restore software). I was able to pull a much older version from August and those were not corrupted. Luckily the recently modified folders in the more recent version of the backup were not corrupted. The only ones I couldn't get out were ones that claimed I didn't have permission (probably encrypted).

    But I wondered: if a client PC drive was developing corruption, and these files were being backed up nightly onto server hard drives where StableBit Scanner was constantly running, why didn't Scanner notify me? Or does Scanner only find problems when it runs its full scan? To this minute, Scanner isn't reporting any corruption on either of the duplicated server hard drives.

    Have a great day.


  4. Hi and hope everyone had a good weekend (and thanksgiving in the US).

    I have had StableBit DrivePool and Scanner for the past 3-plus years. They have worked great. I have a mirrored setup of two 4TB drives in my pool. This is housed in a Windows server (Essentials 2016) that backs up my home client PCs nightly and also has server folders for the family. The server is also backed up to an external 8TB drive every day. I am planning on replacing the two internal 4TB drives with two 8TB drives, as I am almost out of space. I do not, however, have an available spot inside the computer to add one or two more drives during the upgrade. So I was wondering what my best (fastest, or safest if they differ) options for migration are?

    1. Remove one of the two HDDs, probably via the DP UI (I presume I would need to force it to cancel duplication when that happens?). Which of the two should I start with, given that I believe one of them tends to have slightly less free space, maybe from the VSS? Then I pop the first new 8TB drive in its spot, add it via DP, and wait for the over 3TB of data to be copied over. Then repeat with the second drive? Or would this be too many writes - A to B and then back to A?

    2. Switch off the PC, pull one drive out and clone it to an 8TB drive, then do the same for the other. When I put the two new 8TB cloned drives back, will DP recognize that these have the same content as the previous pool, except for the size? Or does it use the serial numbers of the old drives, and would that be an issue?

    3. Connect one or both of the new 8TB drives to the PC via an external USB enclosure, such as the one I have (a Unitek two-bay enclosure), then ask DP to add them to the same pool and do its magic and move the files to the two new drives? This may take a while, given that in my experience external enclosures are slow even if they claim to be USB 3.0.

    For options 1 and 2, I presume I should stop all activity on the server, like client backups and server backup, before I start?

     

    Thank you for your time.

    Best

    AK

     

  5. Hi,

    Love DrivePool; I have been using it for 18-plus months. Other than the periodic "server folders missing" message in the Windows Essentials dashboard, which goes away when I reboot the server, it's been great.

    I have been running for some time with two 4TB drives with duplication for the entire pool, including client backups. The OS is on a separate SSD. I want to add an 8TB drive to the server (I have physical space for it). Can I add the drive to the above pool but selectively deactivate duplication for this drive (as it is impossible to duplicate nearly 7-plus TB of files when I have just 1.5TB free on the current pool)? Or should I make a second pool for just the 8TB drive, or leave it entirely out of the pool? The main purpose of this 8TB drive is movies and TV shows that I don't need to duplicate (they are from my discs and I have the originals). I run Plex on my server and I'd like to point it to this additional 8TB drive in addition to some of the files already in the pool. Trying to find the cleanest solution.

    thank you

     

    EDIT: I can't seem to reply to your answer @Christopher (Drashna). Am I missing the reply button, or does this thread only accept one reply? Using Firefox. In any case... thanks for the suggestion on the missing server folders. Saves me the reboot :-) On the adding-a-drive part, I gather you mean I can add a drive to a duplicated pool, but that after doing so I should, under Balancers -> Drive Usage Limiter, uncheck the "Duplicated" box and leave the "Unduplicated" box checked for this 8TB drive? Thanks

  6. @drashna

     

    Thanks. Yes, I had to uninstall the previous beta version and reinstall the version you sent. Now all sectors are green!! So it was just a bug in the previous beta version.

     

    Curiously, this version is an exe file, so it runs as a separate program rather than within the Essentials dashboard (I am running Windows Server 2016 Essentials). Not really an issue, as I can run both the dashboard with DrivePool and this program at the same time. BTW, does the exe run as a service when the computer boots up, or do I have to start it?

     

    Uninstall the software, reboot the system and then reinstall.  

     

    That should fix the issue in this case. 

  7. This is most likely a "bug".  Specifically, some drives ... have some weird issues with how the data is reported.  It causes issues for the surface scan in released versions, unfortunately.

     

    Please try this version and rescan: 

    http://dl.covecube.com/ScannerWindows/beta/download/StableBit.Scanner_2.5.2.3126_BETA.exe

     

    To rescan, double click on the drive in question, click on the fourth button on the left side (with a green circle), and select "mark all as unchecked" (or something like that). 

     

    This should complete "properly". 

    Hi @drashna

    Thanks. When I tried installing it, a message that "The specified account already exists" comes up. I went to Task Manager and stopped Scanner, and still the same result. Must I uninstall the previous beta version 2.5.2.3103 first? Would that cause it to lose the info on the other drives? Thanks

    I bought an 8TB WD easystore external HD which has (and yes, pretty much everyone knows this) a Red drive, EFAX (WDC WD80EFAX-68LHPN0). I ran the short and extended tests in HD Sentinel and everything was fine.

     

    Now when I left it attached to my server, which has StableBit Scanner, I received an email stating:

    "One or more disks are damaged and have data on them that is unreadable:

    WD easystore 25FB USB Device - 344 KB unreadable (688 sectors) - The rest of the disk is still being scanned

    • Model: WDC WD80EFAX-68LHPN0"
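    As a sanity check, the two numbers in the alert agree with each other, assuming the usual 512-byte logical sectors:

```python
SECTOR_BYTES = 512  # logical sector size reported by most external drives

unreadable_sectors = 688
unreadable_bytes = unreadable_sectors * SECTOR_BYTES  # 352,256 bytes
print(unreadable_bytes // 1024, "KB")  # → 344 KB, matching the alert
```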

     

    When I look at the sectors, it is part of the very first block that is unreadable (red block). I did a file scan and that was normal. EDIT: The rest of the drive checked out completely normal on the surface scan.

     

    Given that I didn't find anything with HD Sentinel, is this potentially a false alarm, especially given it's the first block? I vaguely remember having a similar issue nearly 9 months back on another external drive, and it turned out to be a false alarm. There was something about using a beta rather than the stable version of Scanner; I am currently running 2.5.2.3103 beta. I forget what the return policy for BB is, and if this is a real issue I'd rather return it now than have to set up a warranty claim with WD.

     

    thank you

  9. Thank you!! I was away and for some reason did not realize you had responded. Thank you for explaining the basis for the issue.

     

    So to summarize: I can run dedupe on the disks managed by DrivePool provided I use the beta version of the software. Interestingly, I am on version 2.2.0.651 beta as you suggested, and the 'bypass file system filters' option is already checked. Should I uncheck it before I try running dedupe, even though this is the beta version?

     

    Lastly, just to be sure I understood you correctly: since I currently have only two drives in the pool (which are copies of each other), I should run dedupe on one of these two drives and DrivePool will take care of the other drive?

     

    Could an option at some point be added to the software to allow only this dedupe filter but still bypass other filters, so as not to slow down processing (as you mentioned that bypassing filters typically speeds things up)? EDIT: Rereading your response, it sounds like that is exactly what the beta version already does, sorry.

     

    thank you again.

     

    No.  It will copy the entire file to the other disk.  However, that file may get dedup-ed in the same way.

     

    The issue here is how deduplication and our software work.  And this is part of why the beta version is "required", or you need to enable an advanced option.

     

    Normally, our software bypasses all file system filters when accessing the underlying data.  This is to boost performance (filters can cause serious slowdown), and for compatibility (some filters freak out when the same file is being accessed repeatedly, and when one request isn't finished yet.... I'm looking at you, Avast).

     

    This is fine, usually.  However, the deduplication feature splits the contents. It creates a special reparse point out of the original file. It leaves non-redundant data attached to this file/reparse point hybrid object, and it puts all of the duplicate data into the "System Volume Information" folder (that same one used by VSS).   

     

    Then when accessing files, it uses a file system filter to splice this data back together, in the right order. 

     

     

    Now, as to why this is an issue:

    1. Deduplication can't access the blocks of data on the Pool drive. So it just doesn't work. 
    2. StableBit DrivePool bypasses the file system filters on pooled disks, so you would only get partial (or no) data when accessing deduplicated data.   This is why you MUST disable the "bypass" option.  This way, the dedup filter can splice the data back together, properly. 

     

    The beta version looks for the "dedup" filter, and automatically disables this "bypass file system filter" option on the pool, to prevent this from being an issue.

    Additionally, the latest internal betas include some special handling when measuring the drive. 

     

     

    Also, when balancing or duplicating the data, it will grab the spliced together data, as well.

  10. Deduplication is a block based technology.  You CANNOT run it on the pool itself.   You must run it on the underlying disks.

     

    The downside is that you won't save as much space this way, but you should still see a good chunk of savings.
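    A toy sketch of what "block based" means here (fixed-size blocks for simplicity; Windows Server's Data Deduplication actually uses variable-size chunking and reparse points, as described elsewhere in this thread):

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative only

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into fixed blocks, keep one copy of each unique block,
    and return the list of block hashes needed to rebuild the data."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks stored only once
        recipe.append(digest)
    return recipe

def rebuild(recipe: list, store: dict) -> bytes:
    """Splice the blocks back together, as the dedup file system filter does."""
    return b"".join(store[h] for h in recipe)
```

    The `rebuild` step is the part a file system filter performs on every read, which is why a pool that bypasses filters would see only the raw, spliced-apart blocks.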

    Thank you. So if I run dedup on one of the disks, DrivePool should duplicate those reduced parts onto the mirrored disk? Since I currently have only two drives (duplicated by DrivePool), I guess I just need to run dedup on one of them, not both, correct?

     

     

    That said, we don't officially support deduplication here, but we do accommodate it.  The software should automatically disable the "bypass file system filters" option on systems with the Deduplication feature installed.  But it's worth checking to make sure it's disabled (Pool options -> Performance).

    Will do. Should i use a beta version to get best results?

     

     

     

    As for Shared Folders themselves, you'd want to set them up on the pool itself.

     

    As for the user defined folders being missing, I'm not sure about that.  If I had to guess, it was a timing issue.

    Thank you.

  11. Hi,

    I was planning on trying dedupe (Server 2016 Standard with the Essentials experience). ddpeval showed a 27% saving as of now. Should I run dedup on the underlying drives or on the DrivePool virtual drive? Any recommendations?

     

    Also, for that matter, is it recommended to point the server shared folders (set up via the Server Essentials dashboard) to the DrivePool drive, or should I instead point them to one of the underlying physical drives? I currently have them pointing to the DrivePool pool drive and it's working fine, except for the one time I got a message that the user-defined folders were missing, which resolved itself (may have been coincidental).

     

    thank you

  12. I am curious as to what backup software/methodology that most of the people here use?

     

    WHS 2011 has failed with the upgrade to Windows 10, and Windows Essentials, while barely cost-effective before, has gone over the edge with the upgrade to the 2016 version.

     

    Is there a definitive software package that works with DrivePool as the pool size goes into the terabyte realm?  Or are most people using NAS solutions?

     

    I've turned on file duplication for now, but that's not a true solution.

     

    Any thoughts, comments, or pointers to other conversations would be gratefully appreciated!

    I have been using Windows Server Essentials client backups (Server 2012 R2 and now 2016) for my client PCs without a hitch. None of my clients have DP; rather, I have DP on my server, which houses these client backups. In fact, the primary reason I stuck with Microsoft after WHS v1 was the bare metal client backups.

     

    On my server, I have tried using the built-in Server Backup, Veeam, and CrashPlan. Basically, since the DP pool drive does not support VSS, you have to run backup on the native drives, which means that if you are duplicating data, the server backup also gets twice the size!! Veeam, despite claiming that it dedupes, was still making massive backups when I asked it to run on the duplicated drives. CrashPlan was so slow that I gave up on it. So what I did was run Server Backup on only one of the two drives that make up my DP pool, and thus I get backups but keep the size under control.

  13. I just use the UI to do this. 

     

    On client OSes (Windows 7, 8, 10), run "control sysdm.cpl" and open the "System Protection" tab.

    From there, you can toggle the state, and delete the shadow copies.

     

    On Server OSes, run "fsmgmt.msc" and find the "Configure Shadow Copies" option in the menus.

    Another option: on the server, I right-click on any drive and there is a choice called "Configure Shadow Copies...". From there I can change the shadow storage size.

     

    On a different matter: I installed Server 2016, reinstalled DrivePool, and moved my drives (not physically, as this is the same server box where 2012 R2 Essentials was) to the new server. I was able to redirect the Essentials dashboard server folders to the DrivePool drive. I set up backups, and three computers are backing up fine.

    However, after I added a standard user account to access the server, though it created a user folder with that name, I am unable to access the shared folders from the client computer's Launchpad. I get "Windows cannot access \\xxxxserver\shared folders. Check the spelling of the name..... Error code 0x80070035". When I looked up the network path on the server for the shared folder, it is "xxxserver\users2\xxxname" (xxxname is the name I created for the user). The reason it is users2 is probably that the DrivePool drives already had a Users folder with subfolders from my previous Server 2012 R2 installation, I suppose.

    So my question is: how do I force Launchpad to go to the users2\xxxname folder when I click on Launchpad? Or is this issue because I connected the client PC as a workgroup (using the skip-domain function)? Thanks

  14. *wave* me again :)

     

    Just wanted to check if you think this is a safe buy or not? Seems quite cheap considering it is the same card as the LSI branded one. I assume you would just crossflash this with the LSI firmware and you would have an LSI card for half the price?

    Take a look here: https://forums.servethehome.com/index.php?threads/lsi-raid-controller-and-hba-complete-listing-plus-oem-models.599/

    It looks like it is listed in the same category as the Dell H310, which I flashed to LSI, so it should be fine. I do not, however, have personal experience with the Fujitsu.

     

    Also, as @spider99 said, maybe post in that forum, as there are a bunch of very experienced guys on servethehome.

    best

  15. Hi @asdaHP

     

    I went with 2012 R2 because of lots of issues with RAID cards under Win 10 and wanted something else to try - 2016 is a tad too shiny and new - and usually that means some issues with it, being MS :) - and after six weeks of faffing around I did not fancy any more :)

     

    Also, if you look on eBay you can find Essentials 2012 for as low as £20 for the licence number - I got my first one at £80, downloaded the TechNet preview version, and installed and activated with no problem :)

     

    First impression is it's stable and finds most things out of the gate with working drivers - what I don't like is the interface (part Win 8) and the way it forces you to the dashboard and makes finding some tools kind of difficult - i.e. several clicks to find Computer Management, to get to Disk Management, to map a drive. Could be I have not found it in the dashboard, but.....

     

    Overall it's a server version of Windows, and it will do what I need once I get used to the interface or change it so it's more Win7-ish rather than Win 8.

     

    If you do go Win 2016, come back and start a thread so we can see if you find it any better - I have not looked in detail, but I did not see a huge feature-set change?

     

    T

    Hi @spider99

    Thanks. Yes, I too find 2012 R2 stable. And there is not much benefit in 2016, except that, being newer software, I expect longer support. eBay does have much better prices; I just worry about whether the copies being sold are valid, as many of the sellers have only a small number of ratings. But if it works for you, maybe I'll take a closer look. I asked the same question on HomeServerShow a while back but did not hear from anyone. I am glad you have tried it and it has worked.

     

    I installed Classic Start; that will make things a bit easier to find. I also set it up to stay in desktop mode and not the tiles.

    I remember hearing about that with Windows 8 but never got around to it. Is Classic Start a software add-on for 2012 R2 too?

     

    .... I just right click the start menu button 90% of the time. :)

     

    That is more or less what I do with 2012 and Windows 8.1!! (I actually got somewhat used to right-clicking on Start and working from there... so much so that I sometimes right-click on the Windows 7 Start button and only get Properties and Explorer!)

     

  16. Hi

     

    Proud owner  :P  of a shiny new Essentials server.

     

    It's a long time since I played with Windows Server - some things are the same but a lot has changed, so....

     

    Any recommendations on what to read/watch on how to set it up and use some of the features etc

     

    There appears to be a lot of info, but where to start? A good guide, etc.?

     

    Thanks

    T

    Hi,

    Congrats on the new server :-) I was curious whether you picked 2012 R2 over 2016 because it came prebuilt, or because it was a better price, or you just like it better!

     

    I am on the eval version of 2012 R2 Essentials but am tempted to look at 2016 Essentials, as they are more or less priced the same (399 on Newegg).

  17. Thanks for the offer :) 

     

    I'll probably use the Dell T20 till I run out of space, and then consider moving the drives and CPU into another case and motherboard. The price of the server (no OS) from Dell was just a little more than the cost of the CPU itself. But you are right, building with parts you pick is best. That's how I assembled my home PCs. It is just that nowadays the prebuilt prices are remarkably cheap, especially for NAS or server grade.

  18. could you swap the drives around and say put your ssd up top and the HGST at the bottom of the case?

     

    or even better get a better case!  :P

    Yes, I might do the first... I had the SSD originally in the bottom, since that SATA connection is SATA 3 while the top ones are SATA 2. I'll probably get a slightly longer SATA data cable and then do that.

     

    Agreed... but that's just it... the Dell T20 in the US, when on sale, is an amazing price, but all these OEMs come with some compromise, don't they?

    No, I was referring to the cage, as it has an adjustable 120mm fan built in - but as I said, it probably won't fit.

     

    If the drive is an HGST 7200 - I have several - they run 4-5 degrees hotter than, say, a WD 5400 drive, so I would put them in the best cooling location in your case.

     

    I am going to split mine up so they are not close/next to each other, as they tend to heat-soak each other - and with Scanner you can see them slowly climb in temp with use - but nothing to worry about - I prefer to keep them in the high 30s when in use and in the low 30s when idle.

     

    [attachment: Capture.PNG]

    Hi spider99

    So I added a 120mm rear case fan and the temps on the lower drives came down dramatically, but the upper cage not so much. There's no way to put a fan by the upper drive cage, as the front is a DVD drive.

     

    [attachments: two images]

  20. Not currently. 

     

    Some drives do include historical data built in.  If the drive does, then we will display it.

    EDIT..

     

    Ok, thanks,

     

    I presume then that the HGST CoolSpin (4TB) and Ultrastar don't? Or where should I look in case they do? I see a lifetime maximum and the current temperature listed, so I am presuming it is logging some historical information.

     

    I just looked at HD Sentinel and noticed that even the free version offers temp charting like below. Maybe something you could look into for future upgrades of Scanner, along with temp reporting in the taskbar (or whatever that bottom-right area of the desktop screen is called).

     

    Thanks.

    [attachment: HD Sentinel temperature chart screenshot]
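    Until Scanner grows such a feature, a crude DIY version of temperature charting is to poll a temperature source on a schedule and append samples to a CSV for plotting later. The `read_temp_c` function below is a placeholder; in practice you would parse a real SMART reading (e.g. smartctl output):

```python
import csv
import time
from pathlib import Path

def read_temp_c() -> float:
    """Placeholder: substitute a real SMART query here."""
    return 36.0

def log_temperature(csv_path: Path, now=None) -> None:
    """Append one (unix_time, temp_C) sample; any charting tool can plot the CSV."""
    ts = time.time() if now is None else now
    new_file = not csv_path.exists()
    with csv_path.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["unix_time", "temp_c"])
        writer.writerow([f"{ts:.0f}", f"{read_temp_c():.1f}"])
```

    Run it from a scheduled task every few minutes and you get exactly the kind of history requested here, including temperature spikes during backup windows.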

  21. Hi drashna (and other forum members).

    Is there a way to view temperature charts for individual drives in Scanner? I am trying to see whether the temperature climbs at certain times (relative to server backups, client backups, etc.). I am also trying to see what benefit I achieved by adding a second fan today.

     

    There was a temperature chart function in HD Sentinel (the paid version, I think) and I was checking whether there is a way to do this here too.

     

    thanks

  22. Without the software? No, not at this point. 

     

    However, this is a common request and something that is on our mind. 

     

    The "best" solution right now, is to somehow make it available from the system hosting StableBit CloudDrive.  There are a number of solutions to do this (VPN into the network, SeaFile, a web server using WebDAV, etc). 


    As chilling as this is, pretty much every consumer cloud provider does the exact same thing. 

     

    They have access to your data, and can give it to Alphabet Agencies (FBI, CIA, DHS, ATF, etc.) without notifying you that they have done so.

     

    As for deduplication, this is also normal, and happens for a very good reason.  If the same file is being uploaded by hundreds of people, then either you need to store this same file hundreds of times, or store a single copy that is referenced by these different accounts.

     

    This also makes finding and removing copyrighted material much easier for these companies, as well.  If the file matches a known hash, just remove it.

     

    And yes, this has actually happened to us in the past.  That's why we obfuscate even unencrypted data. 


    If this disturbs or chills you, ... well, maybe it should.  

    And that's a big part of StableBit CloudDrive's reason for being.  "Trust no one" storage and encryption.  


    And that's part of the trade off: the more secure something is, the harder it is to access.  

     

    We do try to make it as easy as possible, but for now ...

    Thanks. Yes, ease of access vs privacy. Personally, the info I am putting up unencrypted is the kind where I am not too worried about copyright, agency access, etc., but rather about someday finding family pics being used for an Amazon ad without my permission.

  23. Sorry, yeah, even the unencrypted drives do not use just the raw files. 

     

    There are a number of reasons for this, but the main one is that we store raw blocks of disk data on the cloud provider, not the files.  This allows for greater flexibility in what you can do with the drive.  

    For instance, the drive supports VSS, hard links, formatting as ReFS, data deduplication, etc. 

     

    These wouldn't be possible if we used a different approach to how the software works. 

     

     

    As for accessing the files, even the un-encrypted data is encrypted, due to the fact that various providers will analyse and index the files based on the file contents.    

    OK, I thought as much when I saw all those strange files. So is there a way, if I am away from my server that has the CloudDrive software (say, at work), to download those chunks and then recreate the files I want without needing to install CloudDrive on that computer? Of course, there is no way I'd know which files are in which chunks, so it could be difficult.

     

    Amazon has 100% access to your files; it even says so in the TOS.  They can and WILL browse your media.  They also use dedupe software to delete your stuff and replace it with links to one centralized version.  Everything you upload to ACD should be 100% encrypted, unless you like other people having access to your data.

    Chilling to hear that. I don't remember reading the TOS that closely, but it worries me if they have an option to, say, use my family pics in advertisements or similar. I don't really care if they dedupe, as my original copies are at home in multiple locations.

     

    If I encrypt the files, then I lose the ability to access them remotely. After all, that is one of the reasons I keep certain files on the cloud.
