Posts posted by srcrist

  1. If you haven't uploaded much, go ahead and change the chunk size to 20MB. You'll want the larger chunk size both for throughput and capacity. Go with these settings for Plex:

    • 20MB chunk size
    • 50+ GB Expandable cache
    • 10 download threads
    • 5 upload threads, turn off background i/o
    • upload threshold 1MB or 5 minutes
    • minimum download size 20MB
    • 20MB Prefetch trigger
    • 175MB Prefetch forward
    • 10 second Prefetch time window
  2. Again, there is a service log option in the technical details window that can be adjusted to record various tracing levels for numerous aspects of the application. The logs can be found in your service directory. That directory is %programdata%\StableBit CloudDrive\Service by default. 

    You will need to apply the tracing levels to record the information you're looking for, and then load the logs to parse for your purposes. Note that verbose tracing of the sort you're requesting will generate very lengthy logs. 
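Since the log format itself isn't documented in this thread, here is a minimal sketch (in Python) for gathering the service logs before parsing them. The default path is the one mentioned above; the `*.log` extension is an assumption, so check your own Service folder for the actual layout.

```python
# Sketch: collect StableBit CloudDrive service logs for parsing.
# Assumptions: logs live under the default Service directory and use a
# .log extension; verify against your own installation.
import os
from pathlib import Path
from typing import List, Optional

def find_service_logs(service_dir: Optional[str] = None) -> List[Path]:
    """Return service log files, newest first; [] if the directory is missing."""
    if service_dir is None:
        service_dir = os.path.join(
            os.environ.get("PROGRAMDATA", r"C:\ProgramData"),
            "StableBit CloudDrive", "Service")
    root = Path(service_dir)
    if not root.is_dir():
        return []
    return sorted(root.rglob("*.log"),
                  key=lambda p: p.stat().st_mtime, reverse=True)
```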

  3. I'm pretty sure that the service log under technical details can be adjusted to show this information, but Chris or Alex would probably need to chime in about what tracing levels need to be set to which settings to get the result you're looking for. 

  4. When you say that you created multiple partitions from the same google drive, I'm not exactly sure what that means. Are you saying that you are using CloudDrive to make a drive and partitioning that? If so, it might simply be your performance settings. I can help you, but this is probably a less helpful place on that front than the CloudDrive forums. 

  5. My first thought is that if you cleared out the DB, it's going to need to reindex the chunks from the cloud. For 53TB, that will take at least 6 hours. Are you saying you've let it go that long and it's just dismounting and remounting? I'd be glad to remote in and take a look if you'd like, but I'm guessing that this is what is going on. Note that this is not normal drive recovery, and you can see that this process is underway by opening the service log in the technical details. You'll see hundreds of numbers scroll by as it indexes all of the chunks. 

  6. Your prefetcher settings are effectively useless, and your minimum download is far too low. Your upload threads are probably too high as well. 

    Change your minimum download to 20MB, drop your upload threads to no more than 5. Use more reasonable prefetcher settings like a 20MB trigger in 10 seconds with a 500MB prefetch. 

    All of that should make a huge difference. 

  7. 8 hours ago, Zanena said:

    Oh ok, thanks for the suggestion. Regarding the GSuite account, is it okay if I register one with my real info? 

    Yeah, it's perfectly fine. If you fake it, you'll risk having it terminated for fraud. Google doesn't care what you put on your space as long as it isn't facially illegal (read: child pornography), and CloudDrive is completely encrypted anyway. The only thing I've ever heard of Google getting mad at is uploading raw copyrighted video files and sharing them with others right off of your space. Don't do that. Using CloudDrive is perfectly fine, though. And CloudDrive respects all of Google's throttling requests etc., so the account won't get API banned unless you exceed the 750GB/day limit, in which case you'll be locked out of the upload API until it resets. 

  8. Quote

     I've read on Reddit that some users got their $10 GSuite terminated without any notice for exceeding the 1TB limit; that is something I'd like to avoid,

    That is absolutely not why those accounts were terminated. Google isn't going to terminate an account for exceeding a limit that they could easily enforce with technical measures if they actually wanted to. Plenty of people have GSuite for business accounts well in excess of 1TB and they aren't getting terminated. The people who said that are either mistaken or lying. I don't know what threads you're talking about, but I can tell you from personal experience that Google does not terminate accounts for using too much space. They could easily simply stop you from uploading anything else, but they don't. Don't worry about it. Of course, anyone who is using an account via another institution, like a university, is at the whims of the policies of their institution, but that's a different story. 

    Quote

    I tried copying the StableBit folder to another drive, then unmounted the drive and created a new one linked to the second drive account and it worked. I had uploaded the files just once, but was able to use them on 2 different drives (not mounted at the same time, though).

     

    I think I might have misunderstood your desire here. You can duplicate the content on one google account on another google account and mount the drive, but I'm not sure what you'd gain by doing it this way. If you simply share the content from one account to another, and the original account gets terminated, the shared content will still vanish. So you'd still have to upload all of the data twice, whether by downloading it all from one drive and uploading it again, or simply by just making a second CloudDrive and mirroring it with DrivePool or some other utility. Personally, I would suggest the latter. It's an automated process and you don't need to worry about moving the data yourself. The different UUIDs generated for the two drives are irrelevant if you're mirroring it at a filesystem level. 

  9. Don't use or pay for the fake .edu accounts. They will absolutely, 100%, get terminated. You can simply pay for your own GSuite for business account and still have access to unlimited storage (they advertise that you need at least 5 users, but even a single user has unlimited as of now). It's only $10 a month, it's unlimited storage, and, to my knowledge, not a single one of these accounts has been terminated. The .edu fakes, on the other hand, are terminated all the time. Not to mention that they're run by shady people willing to do shady things, and they shouldn't be trusted at all. 

    Your university account is subject to your institution's policies, but I have never heard of an institution imposing an arbitrary data cap on its users. I don't even know why they would. The terms of Google for Education don't place any such limits on them, and there's really no reason that they should care how much data anyone uses. Google takes care of it all, and it's actually completely free for the university no matter how much data the students use. You'd have to have some draconian and completely unnecessary IT policies at your institution for them to arbitrarily enforce limits that do not affect them in any way and do not cost them any more money. They have effectively no responsibility for the data or its maintenance at all. It's just a service that Google provides to educational institutions because of their background as a university research project. 

    Now, that being said, not all institutions let alumni keep GSuite accounts after they graduate, or vice versa. My institution only uses GSuite for alumni, for example, which is what I use for my Plex. Our student accounts were with another email provider entirely. I've also heard of schools that will provide you with a *different* GSuite account for alumni. In any case, you can contact your school's IT department to verify their policies around the accounts. 

    Now, as far as your questions go:

    Quote

    1) In case I want to change machine, is it possible to create a virtual drive on it by linking the old google drive? If so how would that work?

    It's important to understand that your CloudDrive is completely attached to the account that you upload your data to. Again: Do not use a purchased, fake .edu account for your CloudDrive. Once the data is uploaded to that location, it is gone as soon as you lose the account. For good. You would have to set up multiple CloudDrive drives on multiple accounts and mirror them to have any sort of redundancy to protect against this, and you'll be doing X times the uploading of the same data for each account you need to do this for, as well as increasing the overhead on your server. Just don't do it. Use a legit .edu account or pay for GSuite for business. 

    But CloudDrive drives are portable. As long as the integrity of the GSuite account remains intact, you can simply detach it from one machine and reattach it to a new machine. Takes less than 5 minutes to do. 

    Quote

     2) Would it be possible to upload the files through CloudDrive on my main drive account and then copy those files over to other drive accounts? This way if some account gets banned I don't have to upload many TBs of data again to the new .edu account I'd buy (which wouldn't be an issue if there wasn't the 750GB daily cap). 

    As covered above, no. And simply do not buy .edu accounts. Why are you even supporting those people and their shady fraud? If you're willing to pay, GSuite for business isn't that expensive. You would have to either upload the data twice from a mirrored drive (which you *could* do with DrivePool, if you wanted), or reupload the data from a local backup when (not if) the account gets nuked. 

    Ultimately, though, CloudDrive works wonderfully for Plex, and I've been using it for exactly that purpose for over two years now. Just stay as far away as you can from the people setting up fraudulent .edus for free and then selling them on the internet. For the record, I have around 215 TB on my legit .edu account and I haven't heard a peep from either Google or my institution. 

  10. On 12/9/2018 at 4:22 PM, Soaringswine said:

    heh.. every link was already purple on that search for me as I've been trying to figure out a cause over the year.. only recourse is to use logical data recovery software like R-Studio, but even then, good luck getting it to fully scan a 200 TB ReFS partition hosted on a cloud service without crashing or taking 3 months..

     

    honestly, ReFS support could work with Clouddrives better if you utilized Storage Spaces underneath with Integrity Streams enabled, but then I have no idea how the Clouddrive cache would handle that.

    Count me among the unfortunate souls who also lost a drive. I had a 90TB ReFS volume and a Windows update killed it over two years ago, and I ditched ReFS right then. My solution is to use 55TB NTFS volumes (so that they stay below the 60TB limit for chkdsk). I have one large CloudDrive divided up into 55TB partitions, and then I combine them with DrivePool into one unified storage pool. The end effect is identical to having one giant drive, but I can repair volumes individually. I've been using this setup for a few years now without issue. Never had any unresolvable file corruption at all. 
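The arithmetic behind that layout is simple enough to sketch. The 60TB figure here is the practical chkdsk ceiling mentioned above, not a hard NTFS limit:

```python
# How many fixed-size volumes a pooled CloudDrive needs, keeping each one
# under the practical ~60TB chkdsk repair ceiling discussed above.
import math

CHKDSK_CEILING_TB = 60

def volumes_needed(total_tb: float, volume_tb: float = 55) -> int:
    """Count of volume_tb-sized volumes required to provide total_tb."""
    assert 0 < volume_tb < CHKDSK_CEILING_TB, "volumes must stay repairable"
    return math.ceil(total_tb / volume_tb)

print(volumes_needed(275))  # five 55TB volumes for a 275TB drive
```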

     

    I just expand my CloudDrive and add another 55TB volume when and if I need more space. 

  11. 8 hours ago, Soaringswine said:

    Clouddrive seems to have major issues with ReFS drives, I just lost my fifth drive (have lost something close to 50 TB from this happening this year) that only shows as RAW - even with upload verification enabled.

    Going to move back to NTFS, at least you can run chkdsk on that.

    That's actually a problem with ReFS and not CloudDrive. See the many examples on google: https://www.google.com/search?q=windows+10+reFS+RAW&oq=windows+10+reFS+RAW&aqs=chrome..69i57.7817j0j7&sourceid=chrome&ie=UTF-8

    The Fall Creators Update even removed support for ReFS because of the stability issues, signaling that even MS doesn't necessarily see it as an NTFS successor any longer. At least not any time in the near future. 

    Going with NTFS is advisable, for now. 

  12. On 12/4/2018 at 6:28 AM, Bazzu85 said:

    thx for the reply..

    So this is not the product I'm looking for..

    I actually use rclone to back up my data, and all data deleted from the source is moved into specific directories on the destination..

    CloudDrive is not sync software like rClone. It is a virtual drive. Though they can be used to accomplish similar things, this is a very important distinction. You could use some sort of backup tool on top of CloudDrive to easily accomplish the same thing, but the concept of a "source" and a "destination" really has no relevance with CloudDrive. CloudDrive simply *is* the destination. There is no local copy of the data (outside of the cache) unless you want to set up some sort of cloning or sync software to provide one. Once the data is on your CloudDrive drive you can do anything with it that you could do if it was on a physical hard drive, so you could easily use a tool to copy data from a local directory to the server and leave it there even if you delete it locally--if that's what I'm understanding your desire is here. As Christopher mentioned above, you can even use data recovery on the drive just like a physical disk if you were to *accidentally* delete something from the CloudDrive drive--which is something that *cannot* be done with a file-based solution like rClone. 

    But as far as what happens to the data on the drive relative to some local copy, that's entirely up to you. CloudDrive doesn't operate based on interacting with actual copies of individual files stored on your cloud storage, so it doesn't really know or care what you do with those files. All it does is whatever you tell it to do, just like your hard drive. If you want redundancy or deletion protection, any tool that provides such features on a physical drive can easily be used to do the same for a CloudDrive drive. 
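As a concrete illustration of that "copy it there and leave it" workflow, here is a minimal one-way pass in Python. The destination path would be whatever drive letter CloudDrive mounted; to the script it is just an ordinary disk, which is the whole point:

```python
# Sketch: one-way "copy and keep" onto a mounted CloudDrive volume.
# Nothing here is CloudDrive-specific; the virtual drive is plain file I/O.
import shutil
from pathlib import Path

def backup_tree(src: str, dst: str) -> int:
    """Copy files missing from dst. Never deletes anything on dst, so
    files removed locally remain on the cloud drive. Returns copies made."""
    copied = 0
    src_root, dst_root = Path(src), Path(dst)
    for f in src_root.rglob("*"):
        if not f.is_file():
            continue
        target = dst_root / f.relative_to(src_root)
        if not target.exists():
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # preserves timestamps as well
            copied += 1
    return copied
```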

  13. 4 hours ago, Carltron said:

    Hi, I'm not sure how to explain this but I'll try my best. I have a ton of movie and music files that I want to back up to my cloud account, but I want to save the complete files as they are on my PC, not chunks with random letters and numbers, so that I can identify them and download them from the cloud without needing StableBit CloudDrive, in case I need access to them without having access to StableBit. Is this possible?

     

    CloudDrive is not the solution you're looking for, in that case. It fundamentally operates in a way incompatible with your goals. There are some other solutions that might meet your needs. See this post for some more information: 

     

     

  14. 20 hours ago, schudtweb said:

    The German translation is not very helpful :)

    The error says something like: Error while reading, see internal exception. But what exception?

     

    The internet connection is not the problem. I can download or upload from/to other sites at full speed (100mbit/50mbit). Only CloudDrive has problems...

    Today I started the PC and immediately got two blue and one orange error: Cancelled thread, error while reading, and error while writing. But everything else works. I am connected via FastViewer to my office PC and the connection is still alive.

     

    While your internet connection may be fine, that error signifies that the connection between you and Google's servers is not. That is what that error means. CloudDrive is not able to contact Google's servers and upload and download your data. The causes of such a problem can vary, but the fact that you can connect to *other* sites reliably has no bearing on this problem.

    I'm sure Christopher can walk you through troubleshooting the connection in your ticket.

     

    19 hours ago, schudtweb said:

    This folder doesn't exist...

     

    Check for it at C:\ProgramData\StableBit CloudDrive\Service\Logs. The ProgramData directory will be hidden. 

  15. I don't speak German, so I'm not sure what the first error is telling you, but the second error is simply telling you that Google Drive was unreachable for a little while. If you're sure that everything is fine with your connection, it was probably just a temporary network hiccup. They happen. I get that error every once in a while. Sometimes Google Drive itself even has network problems. You can check that status here: https://www.google.com/appsstatus#hl=en&v=status

    Completely unrelated to the issue discussed in that other thread you linked. 

    Log files are located in %ProgramData%\StableBit CloudDrive\Service\Logs

    EDIT: Upon a second look, it appears that the first error is just the equivalent upload error to the second download error. Both simply indicate a network problem between you and Google. Neither are long-term concerns as long as they are temporary. 

  16. Again, I'm simply not even sure how that would be possible from a technical perspective. I suspect that any high-traffic application might cause the same problem for you. CloudDrive doesn't do anything to modify the WAN settings on your router--nor could it. If your router is dropping the connection to your modem, that's almost certainly your router, your modem, or your ISP. 

    Stranger things have happened, I suppose, but I have no idea what could cause that problem. No application on your PC should ever be able to cause that problem. 

    I know this probably isn't what you want to hear, but it's almost *certainly* not CloudDrive. CloudDrive is just demanding the data required to trigger the problem. 

  17. 10 hours ago, rotors14 said:

    With dropping I mean none of my devices can reach any website. All of them (by wired ethernet and Wi-Fi) have an IP address and I can access the router configuration (192.168.1.1). The only software I am currently using is Plex and CloudDrive. 
    Sorry, but I do not know what other details I could tell you :(

    Does the router still have a WAN IP when this happens? You'll have to look at the router's config page when it drops out to see. It sounds to me like the router is losing its connection to the modem when this happens, and I have no idea how any application could possibly cause that unless there is a problem with the router, the modem, or your ISP. 

    10 hours ago, rotors14 said:

    Thank you very much! I use different Google Drive accounts, so I would need the best config for different purposes. What config would you use only for Plex and only for torrents? I would prefer to use a config only for Plex (3-4 bdremux streams) and a config only for torrents.

    Thank you again! You are really good!

     

    That setup works fine for both. I don't use torrents very much, but that setup works fine when I do. If you're hosting a drive specifically for torrents, you can probably adjust the prefetcher down somewhat. Torrent pieces are only around 4-8MB apiece even for a very large torrent. So caching 500MB at a time is unnecessary. Honestly, with a minimum download size of 10MB, I'd probably just disable the prefetcher altogether for torrents. Torrents grab pieces in a somewhat arbitrary order, based on the availability of pieces being seeded. So there isn't a real advantage to caching a lot of contiguous data. It won't likely need the data contiguously. If you're seeding, torrents are also comparatively slow. You'll rarely need more than 1MB/sec or so, and your connection is fast enough to handle that on the fly--as is mine. 

  18. On 11/12/2018 at 2:06 PM, rotors14 said:

    Thank you for your help. I will read this article. I have used two different routers because I moved to another house, and I even tested a third router (Linksys EA8500) as a neutral router with the ISP routers in bridge mode, but the problem persists.

    OK. In that case, I think my follow-up question is for some more detail about what you mean when you say that the router is "dropping" your internet connection. Is it simply not responding? Are you losing your DHCP to your modem? What specifically is it doing when it drops? 

    I'm trying to think of things that *any* application on your PC could be doing that would cause a problem with the connection between your router and your WAN without there being an underlying router issue, but I'm having trouble thinking of anything. Some more detail might help.

     

    On 11/12/2018 at 5:33 PM, rotors14 said:

    By the way, I read almost all your messages about CloudDrive for Plex and torrents. Thank you for all of them. They are very useful. I do not have good upload speed. My best upload speed has been 1-2 MBps. My prefetch settings are 1MB, 100MB, 180s. I do not know how I can improve it. (1000/100 mbps connection)

     

    You're welcome. Some of them are probably out of date compared to what I'm using today. I had written up a little tutorial on reddit to help people, but I ended up deleting that account and wiping all of the posts. 

    Nothing will influence your upload more than the number of upload threads. Personally, I throttle my upload to 70 mbps so that I never exceed Google's 750GB/day limit, but my current settings easily saturate that. My settings are as follows, and I highly recommend them:

    • 10 Download Threads
    • 5 Upload Threads
    • No background I/O (I disable this because I want to actually prioritize the reads over the writes for the Plex server)
    • Uploads throttled to 70mbps
    • Upload Threshold set to 1MB or 5mins
    • 10MB minimum download size (seems to provide a relatively responsive drive while fetching more than enough data for the server in most cases)
    • 20MB Prefetch Trigger
    • 500MB Prefetch forward
    • 10 second prefetch window
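The throttle figure can be checked against the daily cap with plain arithmetic (decimal units assumed, 1 GB = 1000 MB; since real uploads are bursty rather than continuous for a full 24 hours, a 70 mbps ceiling keeps an actual workload under the limit):

```python
# Relating the upload throttle to Google's 750GB/day limit.
# Decimal units assumed; plain arithmetic, not a CloudDrive API.

def daily_upload_gb(throttle_mbps: float) -> float:
    """GB uploadable per day at a sustained rate of throttle_mbps."""
    return throttle_mbps / 8 * 86400 / 1000  # mbps -> MB/s -> MB/day -> GB/day

def rate_for_daily_gb(limit_gb: float) -> float:
    """Sustained mbps that would exactly hit limit_gb per day."""
    return limit_gb * 1000 * 8 / 86400

print(daily_upload_gb(70))     # 756.0 GB/day if saturated 24h straight
print(rate_for_daily_gb(750))  # ~69.4 mbps hits the cap exactly
```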

    I actually calculated the prefetch settings based on the typical remux quality video file on my server. 20MB in 10 seconds is roughly equivalent to a 16mbps video file, if I remember the standard that I used correctly. That's actually a very low bitrate for a remuxed bluray, which is typically closer to 35mbps. The 10MB minimum download seems to handle any lower quality files just fine. 
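That calculation can be reproduced directly; the "standard" involved is just the megabytes-to-megabits conversion:

```python
# Reproduce the prefetch arithmetic above: MB fetched per window -> mbps.

def bitrate_mbps(mb: float, window_s: float) -> float:
    """Video bitrate (megabits/s) that mb megabytes per window_s seconds covers."""
    return mb * 8 / window_s

print(bitrate_mbps(20, 10))   # 16.0 -- the 20MB trigger over a 10s window
print(bitrate_mbps(500, 10))  # 400.0 -- the forward prefetch, ample headroom
```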

    I still play with the settings once in a while, but this setup successfully serves up to 6 or 7 very high quality video streams at a time. I've been extremely happy with it. I'm using one large drive (275TB) divided up into five 55TB volumes (so I can use chkdsk), and it has about 195TB of content on it. I combine all of the volumes using DrivePool and point Plex at the DrivePool drive. Works wonderfully. 

    The drive structure is 20MB chunks with a 100MB chunk cache, running with 50GB expandable cache on an SSD. 

     
