t-s's Achievements
  1. Just to add something relevant... I tested the same setup on Windows Server and Win 10 1709, and the problem is already present. As I wrote in my first post, deduplication changed as of the (publicly released) version 1709, so it was easy to guess that the problem with DrivePool started there. I hadn't used those versions, so I couldn't confirm the problem in my first post. Now I've spent some time installing Windows Server 1709 on my test machine and I can confirm it. So there isn't any excuse anymore: DP on deduplicated disks has been broken for more than a year!!! I guess no one noticed because Server 1709 (and 1803) come only in GUI-less flavours and very few SOHO users relied on them.
  2. Try sharing the same disk/folder via FTP: if the network speed over FTP jumps to decent levels (as I guess it will), the virtual NIC has nothing to do with your slowness. As a further test you may want to check the SMB speed from an Android device. Just use ES File Explorer, AndSMB (or something like that) to access your Windows CIFS share and see if the speed is slow there too. Given the ever-changing SMB/CIFS protocol, the problem may lie in password encryption, the lack of SMB1 compatibility, or other novelties introduced in the latest Windows releases.
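The SMB timing test above can be sketched in PowerShell; the file and share paths are placeholders, so adjust them for your setup:

```powershell
# Hypothetical paths -- point $dst at the share that feels slow.
$src = "C:\temp\testfile.bin"
$dst = "\\server\share\testfile.bin"

# Create a 1 GB test file (fsutil ships with Windows).
fsutil file createnew $src (1GB)

# Time the copy over SMB and derive MB/s; compare against the same file sent via FTP.
$t = Measure-Command { Copy-Item $src $dst -Force }
"SMB throughput: {0:N1} MB/s" -f (1024 / $t.TotalSeconds)
```

If the FTP number is several times higher than this one, the bottleneck is the SMB stack, not the NIC.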
  3. Whatever; that sentence wouldn't fix the fact that you didn't read what I wrote for 4/5 messages... :D
  4. You still don't get the point. Windows Storage Server 2008 R2 Essentials is a 2011 server. Actually it was released AFTER WHS 2011, and it is a very small variation of it. When I wrote "almost unknown" I was serious; you are further evidence of that.
  5. It's pretty clear why 2008 R2 is called 2008 R2; what's amazing is releasing four Colorado brothers (WHS, SBS, MultiPoint and WSSE), calling three of them 2011 (as they should be) and just one 2008 R2. Especially taking into account that while SBS and MultiPoint are different products, WHS and WSSE are almost exactly the same thing. You can even make WSS look exactly like WHS (and vice versa) by replacing just a single DLL.
  6. So? Are you really saying that after a user discovered a bug that makes your product unusable (for the average user), on an already released product, you refuse to test it just because a publicly available link was removed from a single MS page? O_o That's masochism. I spent hours trying to bisect the problem (and I shared my findings), I spent more time collecting the links you need to test YOUR product safely, and you just want to sit and wait for an MS webmaster to re-add that link to that page? I'm sorry, but that sounds a bit incredible to me. Treating any rule the way a Jehovah's Witness treats the Bible has never been a good idea. W10 and Win Server are the same OS (assuming we are talking about the same build, and we are): a few things are allowed or disallowed by kernel policies (e.g. the maximum RAM available, or the ability to launch DCPromo.exe), and some things are just added or removed packages (like the dedup packages). In this specific case there is absolutely no difference between Win Server 2019 and W10 1809. There isn't a single hacked/changed bit, there isn't any circumvention of a software protection, there isn't any safety risk. From a legal point of view you are right; theoretically no one would be allowed to use even Notepad.exe taken from Win 7 on Win 8, but you really should be a bit flexible about that (just as MS has been since the DOS days). It's not the case here (given the Server ISO is there), but assuming you were going to test your product on W10 + the "hacked" packages, what would the MS position be on that? Do you think they would feel you're "stealing" something? Or would they consider it something meant to improve the OS's usefulness and the user experience?
  7. Maybe this is still written somewhere in the EULA, but in practice it isn't true anymore. Once you get your digital entitlement, moving it to new hardware through your MS account is possible without much hassle. Remember that using W10 is more a favor you're doing MS than vice versa, so MS facilitates the process in every way. LTSB/LTSC are practically the new Win 7 (but given the reduced spyware and the lack of the Store, MS wants real money for them, just like for older OSes).
  8. The purge issue is already fixed by update KB4464330 (which moves the build from 17763.1 to 17763.55); obviously it is the same for Win Server 2019, Win 10 1809 and Win Server 1809. So the EVAL ISOs (which are still there) can be safely downloaded and updated (especially on a test machine, where you have nothing to lose). Eval Standard/Datacenter (English US): https://software-download.microsoft.com/download/pr/17763.1.180914-1434.rs5_release_SERVER_EVAL_X64FRE_EN-US.ISO And the mentioned cumulative update: http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/10/windows10.0-kb4464330-x64_e459c6bc38265737fe126d589993c325125dd35a.msu Well, calling them "hacked" may be misleading; they are the official signed packages which are missing from W10, Server Essentials and the (free) Hypercore (for the record, both DP and deduplication worked well on Hypercore 2012/2016; I used it to provide [de]duplication to WHS). No need to wait anymore.
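A minimal sketch of applying the downloaded .msu from an elevated PowerShell prompt (the local path is an assumption; download the package linked above first):

```powershell
# Hypothetical download location for the KB4464330 package.
$msu = "C:\updates\windows10.0-kb4464330-x64.msu"

# wusa.exe is Windows' built-in standalone update installer.
Start-Process wusa.exe -ArgumentList "`"$msu`" /quiet /norestart" -Wait

# After the next reboot the UBR value should read 55, i.e. build 17763.55.
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion').UBR
```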
  9. I mean, not officially. W10 Pro is "free" in the same sense as W10 Home. You can buy it, and surely MS is happy if you do, but you can also upgrade from Win 7 and you will get a digital license for that machine. Upgrade from Win 7 Home and you will get a W10 Home license; upgrade from Win 7 Pro and you will get a W10 Pro digital license, no matter whether Win 7 was bought or not. MS silently backs this operation because it is desperately trying to switch from being a razor vendor to a blade vendor: the more W10 installs there are, the more chances that applications and multimedia stuff will be bought. Where the OS license still matters for MS is with Servers and LTSB/LTSC, which come free of the Metro crap and the Store. Also, Pro becomes less Pro with each W10 upgrade; there is "Pro for Workstations" now, which is the new real Pro.
  10. Well, RAM and storage space are just like money, they are never too much, but frankly calling 64 GB "a limit" is a bit exaggerated. A 64-bit machine takes on average 1.5-2 GB of RAM (which is managed dynamically by Hyper-V anyway); for personal usage, 8 GB of RAM is more than enough to run the host system and a couple of 64-bit guests, unless you need to run SETI@home in one of those machines. I never installed W10 Home in my life; I didn't see the point of running any Windows Home when they were paid products, and even less now that Pro and Home are both "free"... (LTSB 2015/LTSB 2016 and LTSC are the only worthwhile W10 flavours.) WSE 2016 comes with the full Hyper-V role, so the console is included as well.
  11. Obviously the underlying disks; like I said, I've used both deduplication and DP since the days of Win 8/Server 2012, so I know how to deal with it; I'm anything but a newbie on that matter. See the previous reply. (Anyway, as you know, DP has that option disabled by default in recent releases.) 2019 is already available; there are the 3-month evaluation ISOs, but you can also install the deduplication cabs on any W10 x64 1809 (obviously the same story as Server 2019 and Server 1809; tested personally). P.S. I know that resources are always limited, but the prerelease builds are released by MS for a reason; my suggestion is to test your products on them, so you're ready when a new "final" release is launched. P.S.2: WSL was also vastly reworked since the days of Server 2016 (where it was installable only unofficially); I know it uses some kind of FS filters as well, and I smell problems from that direction too (I had a reboot during a Linux update, but since then I haven't had much time to investigate further).
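Once the dedup cabs are staged, enabling and checking deduplication looks roughly like this; the cab path and name are placeholders (the actual cabs come from a Server 2019 image), and D: stands for the data volume:

```powershell
# Hypothetical path -- the dedup cab file(s) taken from a Server 2019 image.
dism /online /add-package /packagepath:C:\cabs\dedup-package.cab

# The standard Deduplication cmdlets should then be available.
Import-Module Deduplication
Enable-DedupVolume -Volume D: -UsageType Default
Start-DedupJob -Volume D: -Type Optimization

# Report savings and status for the volume.
Get-DedupStatus -Volume D:
```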
  12. The Hyper-V console is present in any W8+ installation (even x86 ones); just enable it from Programs and Features, then connect to a remote host, no matter whether the HV host has a GUI or not, paid or free (HyperCore Server). If you need to manage Server 2008[R2] from Win 8+ or vice versa, there's a [partly] free SW called 5nine which makes remote administration possible across different generations of Windows. Other server bits can be managed graphically via RSAT (Remote Server Administration Tools), a package available for free for any Windows client OS (except W7/8 Embedded). Or you can just use PowerShell. No, I don't use any old-school backup solution at all; I even stopped enabling the Essentials role. Once one understands how handy native VHDs are, most of the paradigms from the past look outdated.
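Enabling just the management tools (no hypervisor) can also be done from PowerShell; a sketch, assuming a Win 10 1809+ client where RSAT ships as Features on Demand:

```powershell
# Hyper-V GUI and PowerShell management tools only, without the hypervisor itself.
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Management-Clients
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Management-PowerShell

# On 1809+ RSAT is a Feature on Demand rather than a separate download.
Add-WindowsCapability -Online -Name Rsat.ServerManager.Tools~~~~0.0.1.0
```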
  13. Yep, that's an option, but there are many variants, limited only by your skill and imagination. My old setup was WHS 2011, or its almost identical brother Windows Storage Server 2008 R2 Essentials (the latter is better, and it's still supported today), installed on a native VHD (booted directly on the bare metal); then I had Hyper-V installed on it (I made the packages for it). Inside Hyper-V I had a minimal recent machine, Server Core 2008/2012/2016, with the large storage (and DrivePool) installed on it; the Core VM accessed the real, physical (large) HDDs, which were deduplicated by their native feature (and made redundant by DrivePool). WHS or WSS was also connected to my main TV, running Media Center (I made the package for that as well) and acting as a TV server via DVBLink (and/or ServerWMC). Then we succeeded in porting WMC to W10/Win Server 2012/2016 as well, so I switched to a much simpler infrastructure using Win Server 2016 (now 2019). Hyper-V is there just to run a Win 2008 server (very small, being the last x86 server), which acts only as a backup domain controller. The storage is now natively managed by Win Server 2019, which deduplicates it; it is then duplicated by DP for resiliency (as I wrote elsewhere, it sounds like a joke but it really isn't). So nowadays I don't use the WHS-like backup at all. All my OSes are native VHDs (including the server) and the large storage is duplicated by DrivePool. All I do is copy 4/5 vhdx files from the SSD to the Pool (usually once per month, after a cumulative update is released and installed).
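The native-VHD boot part of the setup above can be sketched roughly like this; the paths, size, drive letters and WIM index are all assumptions for illustration:

```powershell
# Create and attach a dynamic VHDX for the OS (requires the Hyper-V module).
New-VHD -Path C:\VHDs\server.vhdx -SizeBytes 80GB -Dynamic
Mount-VHD -Path C:\VHDs\server.vhdx
# ...initialize and format the new disk; assume it is mounted as V: ...

# Apply a Windows image into the VHDX, then add a native-boot BCD entry.
dism /apply-image /imagefile:D:\sources\install.wim /index:1 /applydir:V:\
bcdboot V:\Windows
```

On the next boot the firmware loads the OS directly out of the .vhdx file, which is why backing up the whole system reduces to copying a handful of vhdx files.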
  14. Sorry, but what's the point of having 3.5 TB of data stored locally when you have a home server? Usually you have 20 to 60 GB dedicated to the OS and its main programs (Office, Photoshop, whatever), perfect for a small SSD or a VHD; then the essential data you need locally on a mechanical HDD, which is rarely more than a further 100 GB or so; then everything else is stored safely on a server/NAS/cloud and accessed when you need it. But even when needed, 3.5 TB of data is better managed by sync software than by a traditional backup. Traditional backup software is a perfect fit for restoring an operating system to its original state (you can't use Explorer for that, unless you use a VHD, as discussed earlier). Copying/syncing a bunch of random, large files can be done in a huge number of different ways, starting from the good old PowerToys or the Offline Files feature. Well, VDI is a term used by the marketing mumbo-jumbo to define a simple VM accessed via a kind of remote desktop, not necessarily the MS Remote Desktop. But by extension it also defines what's behind it: how the machines are managed, how they are backed up, the deduplication of their storage, their live migration (you can move a RUNNING VM across different Hyper-V servers), and last but not least, how the RAM is managed (a Hyper-V server can dynamically allocate to its VMs more RAM than is effectively present on the server, just like a bank does with its clients' money). BTW, most of those arguments are interesting for a company that wants to virtualize 10 or more PCs, rather than for the single user who runs one or a couple of VMs at a time.
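The dynamic RAM behaviour mentioned above is configured per VM; a minimal sketch, where "TestVM" and the sizes are placeholders:

```powershell
# Let Hyper-V balloon the guest's RAM between the min and max as demand changes,
# which is how a host can oversubscribe its physical memory across many VMs.
Set-VMMemory -VMName TestVM -DynamicMemoryEnabled $true `
    -StartupBytes 1GB -MinimumBytes 512MB -MaximumBytes 4GB
```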
  15. After all that, it would be nice if @Alex , @Christopher (Drashna) or someone else from Covecube would confirm they have read this thread and are aware of the problem.