Kraevin

Members
  • Posts: 54
  • Joined
  • Last visited
  • Days Won: 7

Posts posted by Kraevin

  1. On 10/15/2020 at 6:32 PM, srcrist said:

    It looks like Google's Enterprise Standard plan still has unlimited for even a single user for $20 a month. So this change appears to be little more than a price increase for people who were previously using their unlimited business plan. That remains, to my knowledge, the only provider comparable to Google's own previous plan.

    I can confirm you can upgrade to the Enterprise plan with 1 user for 20 bucks, and it includes unlimited storage. You can do it through your admin console > Add subscription; it will show your old plan and let you upgrade to any of the new ones, including Enterprise.

    Enterprise also appears to have removed the user limit.

    https://workspace.google.com/pricing.html

    "Business Starter, Business Standard, and Business Plus plans can be purchased for a maximum of 300 users. There is no minimum or maximum user limit for Enterprise plans."

     

  2. I know this post is a couple of months old, but I just wanted to give an update: G Suite Business is increasing its price from 10 dollars a month to 12 dollars a month beginning April 2. Still worth it, though, for unlimited space.

    Zanena, yes, you can register with your real info if you are registering yourself; you only need 1 user for unlimited even though it says 5 are required. You will need a domain name to link the account, though. You can either use your own or purchase one from them during setup, which is cheap; I believe it's around 12 dollars a year through them.

  3. In my experience, I have yet to hit the API ban using Google Drive and StableBit, and I have a very large library. My Movies library is set up into subfolders as you suggested. I have done full library scans specifically to test for an API ban.

    To answer your cache question: Plex would of course have to rescan your library again because the path location changed, so yes, I would guess that it would update the cache to reflect that.

    As I said above, I have yet to hit any kind of ban with Google and StableBit. If you do somehow manage to hit a ban, please post and let us know so we can keep track =).

     

  4. On 11/15/2017 at 2:27 PM, keithdok said:

    I have a 10TB expandable drive hosted on Google Drive with a 2TB cache. Whenever I add a large amount of data to the drive (for example, I added 1TB yesterday), CloudDrive will occasionally stop all read and write operations to do "metadata pinning". This prevents Plex from doing anything while it does its thing, and it took over 3 hours yesterday. I don't want to disable metadata pinning, but I would like to be able to schedule it, if necessary, for 3AM or something like that.

    In the meantime, is there a drawback to having such a large local cache? Would it improve operations like metadata pinning and crash recovery if I decreased it?

    How did you bypass the 750GB daily upload limit if you uploaded 1TB yesterday? I am curious about this =)

  5. 6 hours ago, Christopher (Drashna) said:

    Welcome! 

    And just keep in mind that not all cloud providers are equal! 

    Very true, Chris. So far, in my experience, Google seems to perform the best; I just wish the daily upload limit were higher. I will be sticking with Google, but I was curious to see how the other providers handled it.

  6. 4 hours ago, Christopher (Drashna) said:

    I think it's just for the upload, but I'm not sure. I've seen conflicting data about that.

     

    For now, just the upload. But if that doesn't help, then both. 

    From what I have seen, it's just the upload.

  7. Just to add: while the Team Drive does sound like a good idea, I do not see it really being possible, as a Team Drive has a limit of 100k files per drive. I could see hitting that pretty quickly with the way CloudDrive uses chunks; the rough bit of arithmetic below shows why.
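    A quick back-of-the-envelope sketch (the chunk sizes here are example values for illustration only, not confirmed CloudDrive defaults) of how fast a one-chunk-per-file layout eats into a 100,000-file cap:

        # Rough arithmetic: how much data fits under a Team Drive's 100,000-file cap
        # if every CloudDrive chunk is stored as one file. Chunk sizes are
        # illustrative examples only, not confirmed CloudDrive defaults.
        TEAM_DRIVE_FILE_LIMIT = 100_000

        for chunk_mb in (10, 20, 100):
            capacity_tb = TEAM_DRIVE_FILE_LIMIT * chunk_mb / 1_000_000
            print(f"{chunk_mb} MB chunks -> ~{capacity_tb:.0f} TB before hitting the file cap")

        # 10 MB chunks  -> ~1 TB
        # 20 MB chunks  -> ~2 TB
        # 100 MB chunks -> ~10 TB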

  8. I have been testing several providers (Google Drive, Dropbox, and Box.com) and am curious whether anyone else here has been testing Box.com and what their upload results look like. I noticed that with Google and Dropbox I can pretty much max out my upload connection (gigabit), 700-1100 Mbps, and this is with just 5 upload threads.

    With Box.com I seem to get a different story: uploads only max out at about 50-70 Mbps. Yes, I have adjusted the upload threads, and there is still no difference, and the upload bandwidth throttle isn't on. Box.com claims they do not throttle. Is anyone else here having slow uploads with them? If it helps, I am uploading from the UK area.

    I have a feeling this is probably more of an issue with Box.com and their servers.

  9. 5 minutes ago, srcrist said:

    I've never really tested the limits, but I have 5 to 8 regularly without any issues. The server is an E3 1245v2 with 32GB of RAM on a 250 Mbit connection, and media is stored predominantly in 20000 mbit/s files or higher. I've been quite happy with it.

    Awesome, what cloud provider are you using? 

  10. Can't get Google Drive unlimited (I'm not a student and my company has fewer than 5 users); it's only 1TB per user

     

    This is incorrect. 

     

    Yes, the site does list that, but they don't enforce it. Just sign up for the unlimited plan as one user and you will be charged 10 bucks a month for unlimited. You will also need to pay for a domain name for the year; for me it was 12 bucks a year.

     

    I have had Google Drive Unlimited as a single user with unlimited space for 3 years now.

  11. Kraevin,

     

    Do you have more than one StableBit CloudDrive on Google running at a time? I believe there is a 20-thread limit for Google as a provider, and if you have more than one CloudDrive running and each drive is set to use more than 10 threads, then you would get the userRateLimitExceeded error.

     

    I would also like Christopher to weigh in and address some of these issues.

     

     

    Google is limiting you for making too many API calls per second; it's not CloudDrive, and there is no way for them to get around this. You could raise your chunk size so that you are making fewer API calls per second. Google does not seem to care how much bandwidth you are pushing/pulling, just how many API calls you make. With larger chunks you will obviously make fewer API calls to upload/download the same amount of data (see the rough arithmetic sketch at the end of this post).

     

     

    I am only using one CloudDrive. I will try raising the chunk size and see if it helps. Thanks =)
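    A rough illustration of the chunk-size arithmetic above, assuming (as a simplification) one API request per chunk; actual CloudDrive internals may differ:

        # Rough illustration: larger chunks mean fewer API requests for the same
        # amount of data. Assumes one API request per chunk, which is a
        # simplification of whatever CloudDrive actually does internally.
        upload_gb = 100                   # example amount of data to move
        chunk_sizes_mb = (10, 20, 100)    # example chunk sizes, not CloudDrive defaults

        for chunk_mb in chunk_sizes_mb:
            requests_needed = upload_gb * 1000 / chunk_mb
            print(f"{upload_gb} GB in {chunk_mb} MB chunks -> ~{requests_needed:,.0f} API requests")

        # 100 GB in 10 MB chunks  -> ~10,000 API requests
        # 100 GB in 20 MB chunks  -> ~5,000 API requests
        # 100 GB in 100 MB chunks -> ~1,000 API requests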

  12. There is definitely something going on. I am still getting the [ApiGoogleDrive:39] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded message and am also getting "server is being throttled" errors.

    It's been 2 weeks since I've seen a post from Christopher on this thread; I'm hoping they are just busy trying to figure out the issue.
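    For reference, Google's documented guidance for userRateLimitExceeded is for the client to retry with exponential backoff. A generic sketch of that pattern follows; upload_chunk and RateLimitError are hypothetical stand-ins, not CloudDrive's actual internals:

        import random
        import time

        class RateLimitError(Exception):
            """Hypothetical: raised when Drive responds with 403 userRateLimitExceeded."""

        def upload_chunk(chunk: bytes) -> None:
            """Hypothetical stand-in for an actual Drive API upload call."""
            raise NotImplementedError

        def upload_with_backoff(chunk: bytes, max_retries: int = 5) -> None:
            # Exponential backoff with jitter, the handling Google recommends
            # for userRateLimitExceeded responses.
            for attempt in range(max_retries):
                try:
                    upload_chunk(chunk)
                    return
                except RateLimitError:
                    time.sleep((2 ** attempt) + random.random())
            raise RuntimeError("still rate limited after retries")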
