Posts posted by ksnell

  1. On 12/12/2019 at 3:36 AM, srcrist said:

    My read of this is that Google's API keys have a query limit per key and they are approaching that limit, and their requests to have the limit raised have thus far been unanswered. There is quite a bit of overreaction here, though, since you can simply use your own API key regardless (see: http://wiki.covecube.com/StableBit_CloudDrive_Q7941147). If you're using your own key, the limit will not affect you. The worst-case scenario here is that everyone will simply have to provide their own personal API keys. You won't lose access to your data. Note that anyone can get an API key for Google Cloud, and the default limits on personal keys are more than enough for a personal CloudDrive installation.

    I think that it is a rather frustrating move on Google's part not to simply raise the limits when their developers are successful, though.

    In any case, unless I am mistaken here, this isn't a major problem. It just removes some ease of use. 

     

    @srcrist I followed the wiki you posted to get my key, modified the JSON file, etc., but how do I know it's actually working (i.e., that connections are using my new key)?

     

    The bottom of the guide says: "The next connection that you make to Google Drive from StableBit CloudDrive's New Drive tab will use your new API keys." So does that mean I need to detach my drive, delete the connection, and completely set it up again before it picks up the new key? (A quick way to sanity-check the settings file is sketched below.)
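
    As a rough sanity check that the override made it into the settings file, a short script like the one below can look for the client ID/secret entries. Note that the file path and key names here are assumptions for illustration only; the wiki article linked above is the authority on the actual file location and field names for your CloudDrive version. Independently of the file, the APIs & Services dashboard for your own Google Cloud project only shows Drive API traffic once your key is actually being used, so the request graph there is another way to confirm it.

        import json
        from pathlib import Path

        # Both the path and the key names are assumptions for illustration --
        # follow the linked wiki article for the real file and field names.
        SETTINGS_FILE = Path(r"C:\ProgramData\StableBit CloudDrive\ProviderSettings.json")
        OVERRIDE_KEYS = {"GoogleDrive_ClientId", "GoogleDrive_ClientSecret"}

        def find_keys(node, wanted):
            """Recursively collect values for the wanted keys, whatever the nesting."""
            found = {}
            if isinstance(node, dict):
                for k, v in node.items():
                    if k in wanted and isinstance(v, str) and v.strip():
                        found[k] = v
                    found.update(find_keys(v, wanted))
            elif isinstance(node, list):
                for item in node:
                    found.update(find_keys(item, wanted))
            return found

        data = json.loads(SETTINGS_FILE.read_text(encoding="utf-8-sig"))
        found = find_keys(data, OVERRIDE_KEYS)
        for key in sorted(OVERRIDE_KEYS):
            print(f"{key}: {'set' if key in found else 'not found'}")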

  2. 1 hour ago, srcrist said:

    It is a Google server-side limitation. Google will not allow you to upload more than 750GB/day, and will lock you out of the upload API for a day if you hit that threshold. CloudDrive cannot do anything to evade that limitation. You will either have to throttle your upload to 70-75Mbps so that you don't need to worry about it, or you'll have to personally monitor your upload so that you upload less than 750GB of data per day. rclone has the same limitation, with the same solution. The --bwlimit flag in rclone is the same as setting the upload throttling in CloudDrive.

    To be clear, though, this post was not about an inability to reach over 70Mbps upload. If that's your problem, you aren't talking about the same thing. CloudDrive can easily achieve over 70Mbps upload, but there isn't generally any reason to.

    Thanks a bunch for the thorough response. I guess I never realized that was the limitation until now. I typically would not hit it except during an initial upload, and I have limited my upload speed to rectify the problem. (The quick arithmetic below shows where the ~70Mbps figure comes from.)
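
    For reference, the 70-75Mbps range falls straight out of the 750GB/day cap. The short calculation below (back-of-the-envelope math, not from the thread) gives the average line rate that keeps a full day of continuous uploading under the cap, computed both ways depending on whether the cap is counted in decimal gigabytes or binary gibibytes.

        # Average upload rate that stays under Google's 750 GB/day cap.
        SECONDS_PER_DAY = 24 * 60 * 60
        DAILY_CAP_GB = 750

        cap_bits_decimal = DAILY_CAP_GB * 1000**3 * 8   # if "GB" means decimal gigabytes
        cap_bits_binary  = DAILY_CAP_GB * 1024**3 * 8   # if "GB" really means gibibytes (GiB)

        print(f"decimal GB: {cap_bits_decimal / SECONDS_PER_DAY / 1e6:.1f} Mbps")  # ~69.4 Mbps
        print(f"binary GiB: {cap_bits_binary / SECONDS_PER_DAY / 1e6:.1f} Mbps")   # ~74.6 Mbps

    Either way the answer lands in the 69-75Mbps band, which is why throttling the upload to about 70Mbps keeps a continuously running CloudDrive under the daily limit.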

  3. Wondering about the theoretical feasibility of, as well as best practices for, this scenario using Google Drive.

     

    My backup software (Acronis True Image) creates very large individual files for each backup. Generally speaking, I am wondering what happens if the file you are transferring to CloudDrive is larger than your cache size (assuming the cache type is "Fixed"). Will it just transfer the single file in pieces until a whole file exists in the cloud (keeping in mind the upload could take days), or do I need to tell the Acronis software to break the backup up into more manageable chunks of data? (A rough sketch of what I mean by "in pieces" follows below.)
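
    For what it's worth, here is a minimal sketch of the chunked-transfer idea the question is asking about: a file much larger than a fixed staging/cache area can still be moved piece by piece, as long as only one piece has to be held locally at a time. This is purely illustrative and is not CloudDrive's actual implementation; upload_chunk() and the Acronis file path are hypothetical placeholders.

        # Illustrative sketch only -- not CloudDrive's real block-level behavior.
        CHUNK_SIZE = 20 * 1024 * 1024   # 20 MiB per piece (arbitrary for the example)

        def upload_chunk(data: bytes, index: int) -> None:
            # Hypothetical stand-in for "send this piece to the cloud provider".
            print(f"uploaded chunk {index} ({len(data)} bytes)")

        def transfer_large_file(path: str) -> None:
            """Read a large backup file in fixed-size pieces so that only one
            chunk ever has to fit in local staging space at a time."""
            with open(path, "rb") as f:
                index = 0
                while True:
                    chunk = f.read(CHUNK_SIZE)
                    if not chunk:
                        break
                    upload_chunk(chunk, index)
                    index += 1

        if __name__ == "__main__":
            transfer_large_file(r"D:\Backups\full_backup.tib")  # hypothetical Acronis file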

  4. I am having the same issue with Google Drive. Is 70Mbps a Google Drive or StableBit CloudDrive limitation, or something else? My connection is 150Mbps and I can fully saturate it with rclone, so I am not sure what the issue is.
