
[.901] Google Drive - server side disconnects/throttling


borez

Question

I've been getting lots of throttling/server-side disconnect messages from Google Drive recently; see below. I've been using the same settings for some time, so it seems there have been some changes to their throttling mechanism?

 

If this helps, some comments:

 

1) I had an unsafe shutdown, and as a result I need to re-upload 18GB of data. From the tech page, it seems that I'm re-downloading chunks for partial re-writes. That seems to be fairly intensive vs. a straight upload?

2) The errors started after I upgraded to the .900/.901 builds.

3) I typically run 6 threads, with a max upload speed of 400 mbps. Scaling this down to 3-4 threads doesn't help.

0:10:29.5: Warning: 0 : [IoManager:32] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.5: Warning: 0 : [ReadModifyWriteRecoveryImplementation:62] [W] Failed write (Chunk:564, Offset:0x00000000 Length:0x01400500). Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.9: Warning: 0 : [TemporaryWritesImplementation:62] Error performing read-modify-write, marking as failed (Chunk=564, Offset=0x00000000, Length=0x01400500). Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.9: Warning: 0 : [WholeChunkIoImplementation:62] Error on write when performing master partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.9: Warning: 0 : [WholeChunkIoImplementation:62] Error when performing master partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.9: Warning: 0 : [WholeChunkIoImplementation:61] Error on write when performing shared partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.9: Warning: 0 : [IoManager:62] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:15:35.9: Warning: 0 : [IoManager:61] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.3: Warning: 0 : [ReadModifyWriteRecoveryImplementation:95] [W] Failed write (Chunk:2743, Offset:0x00000000 Length:0x01400500). Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.6: Warning: 0 : [TemporaryWritesImplementation:95] Error performing read-modify-write, marking as failed (Chunk=2743, Offset=0x00000000, Length=0x01400500). Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.6: Warning: 0 : [WholeChunkIoImplementation:95] Error on write when performing master partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.6: Warning: 0 : [WholeChunkIoImplementation:95] Error when performing master partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.6: Warning: 0 : [WholeChunkIoImplementation:96] Error on write when performing shared partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.6: Warning: 0 : [IoManager:95] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:51.6: Warning: 0 : [IoManager:96] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
0:16:56.1: Warning: 0 : [ApiGoogleDrive:102] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:16:56.1: Warning: 0 : [ApiHttp:102] HTTP protocol exception (Code=ServiceUnavailable).
0:16:56.1: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,517ms and retrying.
0:16:56.2: Warning: 0 : [ApiGoogleDrive:79] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:16:56.2: Warning: 0 : [ApiHttp:79] HTTP protocol exception (Code=ServiceUnavailable).
0:16:56.2: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,873ms and retrying.
0:16:56.3: Warning: 0 : [ApiGoogleDrive:100] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:16:56.3: Warning: 0 : [ApiHttp:100] HTTP protocol exception (Code=ServiceUnavailable).
0:16:56.3: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,561ms and retrying.
0:16:56.3: Warning: 0 : [ApiGoogleDrive:84] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:16:56.3: Warning: 0 : [ApiHttp:84] HTTP protocol exception (Code=ServiceUnavailable).
0:16:56.3: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,664ms and retrying.
0:16:57.3: Warning: 0 : [ApiGoogleDrive:96] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:16:57.3: Warning: 0 : [ApiHttp:96] HTTP protocol exception (Code=ServiceUnavailable).
0:16:57.3: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,546ms and retrying.
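
For what it's worth, the "Server is throttling us, waiting N ms and retrying" lines look like the client backing off and retrying when Google answers with a throttling error (the 403 userRateLimitExceeded surfacing as ServiceUnavailable). Roughly this pattern, as I understand it (my own sketch of the general idea, not CloudDrive's actual code; the names here are made up):

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

static class ThrottledRetry
{
    static readonly Random Jitter = new Random();

    // Send a request and, if the server answers with a throttling status
    // (403, 429 or 503), wait a randomized, growing delay and try again.
    public static async Task<HttpResponseMessage> SendWithRetryAsync(
        HttpClient http, Func<HttpRequestMessage> makeRequest, int maxAttempts = 5)
    {
        for (int attempt = 1; ; attempt++)
        {
            HttpResponseMessage response = await http.SendAsync(makeRequest());

            bool throttled =
                response.StatusCode == HttpStatusCode.Forbidden ||
                response.StatusCode == (HttpStatusCode)429 ||
                response.StatusCode == HttpStatusCode.ServiceUnavailable;

            if (!throttled || attempt >= maxAttempts)
                return response;

            // Back off with jitter, in the same ballpark as the 1,500-2,000ms
            // waits shown in the log above.
            int delayMs = (int)(Math.Pow(2, attempt - 1) * 1000) + Jitter.Next(0, 1000);
            Console.WriteLine($"Server is throttling us, waiting {delayMs}ms and retrying.");
            response.Dispose();
            await Task.Delay(delayMs);
        }
    }
}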

Recommended Posts


I've been getting those since this morning as well. It basically started when Windows decided to reboot for an update; my drive got closed non-cleanly and started re-uploading my entire cache again. It's getting flooded with these:

Exception:

CloudDriveService.Cloud.Providers.Apis.GoogleDrive.GoogleDriveHttpProtocolException: User rate limit exceeded. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at System.Net.HttpWebRequest.GetResponse()
   at CloudDriveService.Cloud.Providers.Apis.Base.Parts.HttpApi.HttpApiBase.GetJsonResponse[T](HttpWebRequest request)
   at CloudDriveService.Cloud.Providers.Apis.GoogleDrive.Files.<>c__DisplayClass14_0.<UploadNew>b__1(HttpWebRequest request)
   at CloudDriveService.Cloud.Providers.Apis.Base.Parts.HttpApi.OAuth2HttpApiBase`1.<>c__DisplayClass6_0`1.<RequestBlock>b__0(HttpWebRequest request)
   at CloudDriveService.Cloud.Providers.Apis.Base.Parts.HttpApi.HttpApiBase.<>c__DisplayClass19_0`1.<RequestBlock>b__0()
   --- End of inner exception stack trace ---
   at CoveUtil.RetryBlock.Run[TException,TResult](Func`2 Func, Action`1 Control, Boolean ThrowOnOperationCanceled)
   at CoveUtil.RetryBlock.Run[TException,TResult](Func`1 Func, Action`1 Control)
   at CloudDriveService.Cloud.Providers.Apis.Base.Parts.HttpApi.HttpApiBase.RequestBlock[TResponse](String fullUrl, Func`2 func)
   at CloudDriveService.Cloud.Providers.Apis.Base.Parts.HttpApi.OAuth2HttpApiBase`1.RequestBlock[TResponse](String fullUrl, Func`2 func)
   at CloudDriveService.Cloud.Providers.Apis.GoogleDrive.Files.UploadNew(String fileName, Stream buffer, String parentFolderId, Action`1 generatedId)
   at CloudDriveService.Cloud.Providers.Io.GoogleDrive.GoogleDriveIoProvider.#bjg(#UBf #rxd, ChunkInfo #fAe, #Xtf #gb)
   at CloudDriveService.Cloud.Providers.Io.ChunkIo.ChunkIdIoProvider`2.#4af.#8tf(#UBf #rxd)
   at CloudDriveService.Cloud.Providers.Io.ChunkIo.Helpers.ChunkId.ChunkIdHelper.#Gqe(UInt64 #HAe, Action`1 #wBf)
   at CloudDriveService.Cloud.Providers.Io.ChunkIo.ChunkIdIoProvider`2.#Gqe(ChunkInfo #fAe, #Xtf #gb)
   at #Iqe.#PAe.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#UEf.#Saf.#9Lf(Stream #oYb)
   at #AKf.#zKf.#yKf(ChunkInfo #fAe, Stream #gb, Action`1 #wBf)
   at #Iqe.#UEf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#9Qg.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#6Uf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#3Pf.#jbf.#9Lf(Stream #oYb)
   at #AKf.#zKf.#yKf(ChunkInfo #fAe, Stream #gb, Action`1 #wBf)
   at #Iqe.#3Pf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#NRf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#LCf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#1Sf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hzf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#CAe.#Saf.#9Lf(Stream #oYb)
   at #AKf.#zKf.#yKf(ChunkInfo #fAe, Stream #gb, Action`1 #wBf)
   at #Iqe.#CAe.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#vzf.#U9e.#9Lf(Stream #EQf)
   at #AKf.#zKf.#yKf(ChunkInfo #fAe, Stream #gb, Action`1 #wBf)
   at #Iqe.#vzf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#UAe.#adf.#9Lf(Stream #oYb)
   at #AKf.#zKf.#yKf(ChunkInfo #fAe, Stream #gb, Action`1 #wBf)
   at #Iqe.#UAe.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hzf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hzf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#TEf.#OLf(ChunkInfo #fAe, Stream #gb)
   at #Iqe.#Hqe.#Gqe(ChunkInfo #fAe, Stream #gb)
   at #Kqe.#Tqe.#Wc(UInt64 #Y, Int32 #59f, Stream #vAe, IoType #NOe)
   at CloudDriveService.Cloud.IoManager.#Wc(WriteRequest #89f, RetryInformation`1 #7tf)
   at CloudDriveService.Cloud.IoManager.#iTf.#3bf(RetryInformation`1 #7tf)
   at CoveUtil.RetryBlock..(RetryInformation`1 )
   at CoveUtil.RetryBlock.Run[TException,TResult](Func`2 Func, Action`1 Control, Boolean ThrowOnOperationCanceled)


+1, I've been getting the same errors for the past couple of days.

 

2:26:57.5: Warning: 0 : [ApiHttp] Server is throttling us, waiting 4,029ms and retrying.
2:27:02.6: Warning: 0 : [ApiGoogleDrive:143] Google Drive returned error (userRateLimitExceeded): User rate limit exceeded.
2:27:02.6: Warning: 0 : [ApiHttp:143] HTTP protocol exception (Code=ServiceUnavailable).
2:27:02.6: Warning: 0 : [ApiHttp] Server is throttling us, waiting 8,437ms and retrying.

I mean, the good news is that the drives are still working. Or, at least, mine are. But maybe there are going to be additional throttling concerns.

 

Anyone noticed if it's on reads and writes? As far as I can tell it's only writes, no?

That's my experience as well.

 

 



Reading through the rclone thread and some other forums, it looks like it might just be a new upload limit. I'm seeing a lot of people say that it kicks in at around 800GB/day or so. That's a pretty reasonable limit, IMO, but it's probably going to change some things for CloudDrive and other solutions. We'll have to see what Christopher has to say when he has a chance to hop on the forums.

 

Sadly, for me, I was in the middle of transferring data from one large drive to two smaller ones (to delete the large one), so that's going to take a lot longer now. 


The error in question is a throttling-related error, and I believe it is account-wide, not user-based.

 

Try reauthorizing the drive, as this may help.

 

 

Also, this won't go away right away after upgrading; it may take a bit before it starts working properly again.

 

And the .901 build should significantly reduce the number of API calls and the amount of data requested. It does this by changing the "minimum download size" code, making it more intelligent and efficient.
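
To illustrate the general idea, roughly (a simplified sketch of the concept, not the actual implementation; the names and numbers are illustrative):

using System;

static class MinimumDownload
{
    // Expand a small read (offset, length) within a chunk of chunkSize bytes so that
    // at least minimumDownload bytes are fetched in a single provider call.
    // Nearby follow-up reads are then served from the cache instead of the API.
    public static (long Start, long Length) Expand(
        long offset, long length, long chunkSize, long minimumDownload)
    {
        long desired = Math.Max(length, minimumDownload);
        long start = Math.Min(offset, Math.Max(0, chunkSize - desired));
        long end = Math.Min(chunkSize, start + desired);
        return (start, end - start);
    }
}

Fewer, larger downloads means fewer API calls per byte read, which is what helps with the rate limiting.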


It seemed to fix it for about 5 minutes and then went back to:

 

0:04:22.1: Warning: 0 : [ApiGoogleDrive:8] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:04:22.1: Warning: 0 : [ApiHttp:8] HTTP protocol exception (Code=ServiceUnavailable).
0:04:22.1: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,294ms and retrying.
0:04:22.6: Warning: 0 : [ApiGoogleDrive:57] Google Drive returned error (userRateLimitExceeded): User Rate Limit Exceeded
0:04:22.6: Warning: 0 : [ApiHttp:57] HTTP protocol exception (Code=ServiceUnavailable).
0:04:22.6: Warning: 0 : [ApiHttp] Server is throttling us, waiting 1,075ms and retrying.
0:04:32.9: Warning: 0 : [OAuth2TokenBase:11357795] HttpProtocolException refreshing token. HTTP Status BadRequest
0:04:32.9: Warning: 0 : [ValidateLogins] Error instantiating management provider. Security error.

To be clear: this is not going to go away by upgrading. The cutoff is server-side and transfer-based. Once you upload somewhere around 750-800GB of data in a 24-hour period, you will be API locked until the following day. CloudDrive can certainly add some features to help us manage that total, but the throttling will not go away unless Google decides to change their policies again.



This simply appears to be a new limitation being imposed by Google. Uploads are now limited, server-side, to around 750GB or so per day. As far as anyone can tell, reads are not limited; they seem to function as they always have.


To be clear: this is not going to go away by upgrading. The cutoff is server-side and transfer-based. Once you upload somewhere around 750-800GB of data in a 24-hour period, you will be API locked until the following day. CloudDrive can certainly add some features to help us manage that total, but the throttling will not go away unless Google decides to change their policies again.



This simply appears to be a new limitation being imposed by Google. Uploads are now limited, server-side, to around 750GB or so per day. As far as anyone can tell, reads are not limited; they seem to function as they always have.

 

 

Correct.

 

Once the issue is triggered, only time will fix it.

 

 

That said, the newer version should help you avoid hitting this issue, as it is more efficient about using what has already been downloaded and keeping it in the cache. This should reduce the likelihood of triggering the limit.

 

 

That said ... it may be a good idea to throttle your speeds. Setting it to 50/25 mbps should allow you to (narrowly) avoid the apparent 800-900GB/day limit.


Are you sure it's the "User rate limit exceeded" issue?

 

If it is, then ... that will be normal, as it takes time for this status to be removed from your account.  

 

 

And for reference, 75mbps for 24 hours will hit around 800-900GBs.  So that may/will hit this limit.  It's ... disturbingly low, IMO. 
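
(Rough math: 75 mbps ÷ 8 ≈ 9.4 MB/s, and 9.4 MB/s × 86,400 seconds ≈ 810 GB over 24 hours, which is right in that range.)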


 

 

I asked them if there is a limit for uploading XXX GB per 24 hours. This is the answer: "I have double checked and there is no limitations regarding that, as per this article: https://support.google.com/drive/answer/37603?visit_id=1-636381266873867728-2759172849&rd=1"

 

 

Google does not publish their API limits. So you won't see a public statement on this. Many people have tested it though. The limit is around 750GB/day. This is the same way we discovered the API limits that rClone previously ran into as well. You can always test it yourself if you want to verify. Stop uploading data for at least 24 hours and then track how much you upload before you start getting 403 errors. 
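
If you want to script that test instead of eyeballing it, something along these lines would do it. This is only a rough sketch: it assumes you already have a valid OAuth2 access token with Drive scope (the AccessToken constant and the 4 MB chunk size are placeholders I picked), and it creates junk files you would want to clean up afterwards.

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class UploadLimitProbe
{
    const string AccessToken = "ACCESS_TOKEN"; // placeholder: OAuth2 token with Drive scope
    const string UploadUrl =
        "https://www.googleapis.com/upload/drive/v3/files?uploadType=media";
    const int ChunkBytes = 4 * 1024 * 1024;    // 4 MB of dummy data per simple upload

    static async Task Main()
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", AccessToken);

        long uploadedBytes = 0;
        var payload = new byte[ChunkBytes];

        while (true)
        {
            using var content = new ByteArrayContent(payload);
            content.Headers.ContentType =
                new MediaTypeHeaderValue("application/octet-stream");

            HttpResponseMessage response = await http.PostAsync(UploadUrl, content);
            string body = await response.Content.ReadAsStringAsync();

            // Stop once Google starts returning 403 userRateLimitExceeded.
            if (response.StatusCode == HttpStatusCode.Forbidden &&
                body.Contains("userRateLimitExceeded"))
            {
                Console.WriteLine($"Hit the cap after roughly {uploadedBytes / 1e9:F0} GB.");
                break;
            }

            response.EnsureSuccessStatusCode();
            uploadedBytes += ChunkBytes;
        }
    }
}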


Google does not publish their API limits. So you won't see a public statement on this. Many people have tested it though. The limit is around 750GB/day. This is the same way we discovered the API limits that rClone previously ran into as well. You can always test it yourself if you want to verify. Stop uploading data for at least 24 hours and then track how much you upload before you start getting 403 errors. 

 

 

Sadly, that's not entirely correct.

 

Google has been very clear with their published API limits, in how many requests in a day, etc.

 

This new behavior ... is, well, new, and doesn't appear to be documented. This is definitely not "typical Google behavior" and doesn't bode well for consumers ... (or us, for that matter).


I just know that there is an unpublished 10TB/day download limit as well, and limits on drive-to-drive transfers. I know they publish other details, but it seems like their fair-use policies (if you want to call them that) are more obscure for some reason. Their own support folks don't even seem to be aware of them. It is, in any case, a concerning practice.


Maybe, but 10TB/day is a huge amount, and most won't hit that. In fact, you'd need near gigabit internet to hit that. 

 

But that's not really the point here.

 

As for the "fair use", the blunt term will be "bandwidth cap", Because that is what it is.  It may be rated per day, but at the end of the day, that's what this is.

 

And it sucks, all around.


Are you sure it's the "User rate limit exceeded" issue?

 

If it is, then ... that will be normal, as it takes time for this status to be removed from your account.  

 

 

And for reference, 75mbps for 24 hours will hit around 800-900GBs.  So that may/will hit this limit.  It's ... disturbingly low, IMO. 

 

That was what I was seeing in the error logs. The errors kicked in after around 20-30 minutes of operation, so I'm definitely way below the bandwidth caps.

 

A couple of observations:

  • The issues arose when CD was working on partial re-read/write chunk operations. In that instance the GUI showed I was running way more threads than I had stipulated in the settings. For example, I was downloading with 6-7 threads when my settings indicated 4.
  • The errors seemed to stop when I was doing full writes to CD.

So far only the WebUI works; I can't do anything else. CloudDrive doesn't even connect anymore. Not sure why; it had been solid since I first started using it, and now all of a sudden it's unusable. I tried Google Drive File Stream (their own product); same thing.

 

I have a G Suite account of which I'm the only user.


So is there really any difference between restricting upload to 70 Mb/s (~750 GB/day) vs. letting it upload uncapped until you hit the user rate limit and letting it auto-resume once the limit goes away? It seems you can upload the same amount in either case, just faster in the latter case, right?

 

 

The main difference would be the flexibility of being able to address unforeseen situations wherein it might be nice to have the ability to write to the cloud. I think, on balance, it's probably better for the drive to have the ability to read and write at any time than to require it to wait until it has connectivity again. But that probably depends on your caching situation. It will not, in any case, affect the operation of the drive in the long run.


So is there really any difference between restricting upload to 70 Mb/s (~750 GB/day) vs. letting it upload uncapped until you hit the user rate limit and letting it auto-resume once the limit goes away? It seems you can upload the same amount in either case, just faster in the latter case, right?

 

 

Yes.

Capping in the software limits the upload and download rate that the software will use.  

 

While the "user rate limit" will block you from uploading OR downloading. And if you can't download data .... it will cause the drive to unmount after 3 of these errors in a 120 second period.

 

 

Obviously, neither is a good solution, but better to have slow access than no access.... 

But ideally, Google would lift these limits (though that's a wish/dream).
