[.901] Google Drive - server side disconnects/throttling

35 replies to this topic

#21 borez (Member, 20 posts)

Posted 13 August 2017 - 04:09 AM

Are you sure it's the "User rate limit exceeded" issue?

If it is, then ... that will be normal, as it takes time for this status to be removed from your account.

And for reference, 75 Mbps sustained for 24 hours works out to around 800-900 GB, so that may well hit this limit. It's ... disturbingly low, IMO.

 

That was what I was seeing in the error logs. The errors kicked in after around 20-30 minutes of operation, so I was definitely way below the bandwidth caps.

 

A couple of observations:

  • The issues arose when CD was working on partial re-read/write chunk operations. In that instance, the GUI showed I was running more threads than I had stipulated in the settings. For example, I was downloading with 6-7 threads where my settings indicated 4.
  • The errors seemed to stop when I was doing full writes to CD.


#22 JohnKimble (Member, 15 posts)

Posted 15 August 2017 - 05:32 PM

So far only the WebUI works; I can't do anything else. CloudDrive doesn't even connect anymore. Not sure why: it had been solid since I first started using it, and now all of a sudden it's unusable. I tried Google Drive File Stream (their own product): same thing.

 

I have a GSuite account which I'm the only user of.



#23 Soaringswine (Member, 10 posts)

Posted 16 August 2017 - 12:08 AM

So is there really any difference between restricting upload to 70 Mb/s (~750 GB/day) and letting it upload uncapped until the user rate limit kicks in, then auto-resuming once the limit goes away? It seems you can upload the same amount in either case, just faster in the latter case, right?
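As a sanity check on that conversion, here's a quick back-of-the-envelope calculation in Python (a minimal sketch; the ~750 GB/day figure is the quota discussed in this thread, not an officially published number):

    # Convert a sustained transfer rate in megabits per second (Mbps)
    # to gigabytes per day, using decimal units (1 GB = 10^9 bytes).

    def mbps_to_gb_per_day(mbps: float) -> float:
        bytes_per_second = mbps * 1_000_000 / 8    # megabits -> bytes
        bytes_per_day = bytes_per_second * 86_400  # seconds per day
        return bytes_per_day / 1_000_000_000       # bytes -> GB

    print(mbps_to_gb_per_day(70))  # ~756 GB/day, right at the reported cap
    print(mbps_to_gb_per_day(75))  # ~810 GB/day, matching the 800-900 GB estimate above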



#24 srcrist (Advanced Member, 79 posts)

Posted 16 August 2017 - 04:07 AM

So is there really any difference between restricting upload to 70 Mb/s (~750 GB/day) and letting it upload uncapped until the user rate limit kicks in, then auto-resuming once the limit goes away? It seems you can upload the same amount in either case, just faster in the latter case, right?

 

 

The main difference would be the flexibility to address unforeseen situations where it might be nice to have the ability to write to the cloud. I think, on balance, it's probably better for the drive to be able to read and write at any time than to require it to wait until it has connectivity again. But that probably depends on your caching situation. It will not, in any case, affect the operation of the drive in the long run.



#25 Christopher (Drashna) (Customer and Technical Support, Administrators, 8,208 posts, San Diego, CA, USA)

Posted 17 August 2017 - 10:38 PM

So is there really any difference between restricting upload to 70 Mb/s (~750 GB/day) and letting it upload uncapped until the user rate limit kicks in, then auto-resuming once the limit goes away? It seems you can upload the same amount in either case, just faster in the latter case, right?

 

 

Yes.

Capping in the software limits the upload and download rate that the software will use.

The "user rate limit", on the other hand, will block you from uploading OR downloading. And if you can't download data, it will cause the drive to unmount after 3 of these errors in a 120-second period.

Obviously, neither is a good situation, but it's better to have slow access than no access....

But ideally, Google would lift these limits (though that's a wish/dream).
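To illustrate the unmount behavior described above (3 errors inside a 120-second window), here is a minimal sketch of a sliding-window error counter in Python. The class and method names are hypothetical, for illustration only; this is not CloudDrive's actual implementation:

    import time
    from collections import deque

    class ErrorWindow:
        """Trips once a threshold of errors occurs inside a rolling
        time window (here: 3 errors within 120 seconds)."""

        def __init__(self, max_errors: int = 3, window_seconds: float = 120.0):
            self.max_errors = max_errors
            self.window_seconds = window_seconds
            self.timestamps = deque()

        def record_error(self) -> bool:
            """Record one error; return True if the unmount threshold is hit."""
            now = time.monotonic()
            self.timestamps.append(now)
            # Discard errors that have fallen out of the rolling window.
            while now - self.timestamps[0] > self.window_seconds:
                self.timestamps.popleft()
            return len(self.timestamps) >= self.max_errors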




#26 Soaringswine (Member, 10 posts)

Posted 17 August 2017 - 11:27 PM

Does it actually block downloading? I don't think I've noticed that, and I definitely haven't had the drive unmount when uploads are failing.



#27 Christopher (Drashna) (Customer and Technical Support, Administrators, 8,208 posts, San Diego, CA, USA)

Posted 18 August 2017 - 07:14 PM

Does it actually block downloading? I don't think I've noticed that, and I definitely haven't had the drive unmount when uploads are failing.

 

If you're getting the "User rate limit exceeded" error, then yes, it will block ALL API calls to the provider (that aren't from its official app), and that will prevent you from downloading from the provider.




#28 Soaringswine (Member, 10 posts)

Posted 18 August 2017 - 07:44 PM

According to some posts on the rclone thread (https://forum.rclone...ceeded/3469/148) about this issue:

 

 

 

With my Enterprise account I received this message 40 minutes ago from the Google Cloud Platform support, API team:

Recently the Drive engineering team implemented new daily upload quotas on a user account basis; once these limits are hit they take 24 hours to reset. The Drive engineering team do not make their limits public, and limits are subject to change at their discretion. Support have not been informed of the limit; however, recent cases indicate that it is indeed a 750GB daily upload limit per user account.

 

but who knows?



#29 borez (Member, 20 posts)

Posted 19 August 2017 - 02:56 AM

 

That was what I was seeing in the error logs. The errors kicked in after around 20-30 minutes of operation, so I was definitely way below the bandwidth caps.

A couple of observations:

  • The issues arose when CD was working on partial re-read/write chunk operations. In that instance, the GUI showed I was running more threads than I had stipulated in the settings. For example, I was downloading with 6-7 threads where my settings indicated 4.
  • The errors seemed to stop when I was doing full writes to CD.

 

 

To recap my issue: I have been hitting the API call limits even though I've not hit the proverbial 750GB cap.

 

To provide some context: I'm trying to create hierarchical pools in DrivePool (backing up my local pool to the cloud). I manually replicated my data in the cloud using GoodSync, relocated the files into the PoolPart folder, created a combined pool, and ran a duplication check.

 

This triggered a series of reads and partial writes (and re-uploads) on CD. Based on the technical details, I see the following:

 

  • Partial re-writes are extremely slow (< 130 kbps). I'm not sure if it's the same chunk that's being written (and re-written) again. See below for a screenshot and error logs.
  • Once the partial write clears, upload speeds are back up to 100-200 Mbps. Still not as fast as what I used to get, but I'll leave that for another day.

So apart from the bandwidth caps, it seems there are caps on the number of calls you can make for a specific file. It's just gut intuition, but I'd appreciate some thoughts. Pulling my hair out here!

2:09:05.8: Warning: 0 : [ReadModifyWriteRecoveryImplementation:96] [W] Failed write (Chunk:34168, Offset:0x00000000 Length:0x01400500). Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [TemporaryWritesImplementation:96] Error performing read-modify-write, marking as failed (Chunk=34168, Offset=0x00000000, Length=0x01400500). Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [WholeChunkIoImplementation:96] Error on write when performing master partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [WholeChunkIoImplementation:96] Error when performing master partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [WholeChunkIoImplementation:4] Error on write when performing shared partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [WholeChunkIoImplementation:27] Error on write when performing shared partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [WholeChunkIoImplementation:104] Error on write when performing shared partial write. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [IoManager:4] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [IoManager:96] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [IoManager:27] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
2:09:06.0: Warning: 0 : [IoManager:104] Error performing read-modify-write I/O operation on provider. Retrying. Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host. 
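The "Retrying" lines in the log suggest the client retries failed chunk writes. As a rough sketch (hypothetical helper, not CloudDrive's actual code), retrying a transient transport failure with exponential backoff and jitter might look like this:

    import random
    import time

    def retry_with_backoff(operation, max_attempts=5, base_delay=1.0):
        """Retry a flaky I/O operation, doubling the delay after each
        failure and adding jitter so parallel threads don't retry in
        lockstep against the provider."""
        for attempt in range(max_attempts):
            try:
                return operation()
            except (ConnectionError, OSError) as exc:
                if attempt == max_attempts - 1:
                    raise  # out of attempts; surface the error
                delay = base_delay * (2 ** attempt) + random.uniform(0, 1)
                print(f"Transport error ({exc}); retrying in {delay:.1f}s")
                time.sleep(delay)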

[screenshot]



#30 Soaringswine (Member, 10 posts)

Posted 20 August 2017 - 12:28 AM

I'm getting User Rate Limit Exceeded on downloads (prefetches) and there's no way I've uploaded 750 GB within 24 hours.



#31 JohnKimble (Member, 15 posts)

Posted 21 August 2017 - 11:05 AM

Well, it works again here, but my download speed went from 200 Mbit to around 20 Mbit max. Gonna be fun restoring a backup at this speed.



#32 Soaringswine (Member, 10 posts)

Posted 23 August 2017 - 04:38 PM

I'm getting User Rate Limit Exceeded on downloads (prefetches) and there's no way I've uploaded 750 GB within 24 hours.

AH HA! It's because I have Arq running backups to the same GSuite account. I must be hitting the limit there.



#33 srcrist (Advanced Member, 79 posts)

Posted 24 August 2017 - 04:14 PM

If you're getting the "User rate limit exceeded" error, then yes, it will block ALL API calls to the provider (that aren't from its official app), and that will prevent you from downloading from the provider.

 

To be clear, Christopher: that actually is not the case here. When you hit the threshold it will begin to give you 403 errors ("User rate limit exceeded"), but ONLY on uploads. It will still permit downloads to function as normal. It isn't actually a wholesale API limit; they're just using the API to enforce an upload quota. Once you exceed the quota, all of your upload API calls fail, but your download calls will succeed (unless you hit the 10TB/day download quota as well).
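A sketch of how a client might act on that distinction, assuming the 403 carries the "userRateLimitExceeded" reason string the Drive API uses for rate errors (the handler itself is hypothetical, not CloudDrive's code):

    # Hypothetical 403 handling based on the behavior described above:
    # the quota 403s only affect uploads, so downloads can proceed
    # while uploads are paused and requeued.

    def handle_403(reason: str, operation: str) -> str:
        if reason == "userRateLimitExceeded" and operation == "upload":
            return "pause-uploads"  # requeue; quota resets after ~24 hours
        if operation == "download":
            return "proceed"        # downloads still work under this quota
        return "backoff"            # other rate errors: back off and retry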

 

AH HA! It's because I have Arq running backups to the same GSuite account. I must be hitting the limit there.

 

Right. For the people who think they aren't hitting the 750GB/day limit, remember that it's a total limit across *every* application that is accessing that drive, so your aggregate cannot be more than that. But enough people have confirmed that the limit is somewhere right around 750GB/day at this point that I think we can call that settled.
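As a rough illustration of the aggregate nature of the quota, here is a sketch of tallying uploads across applications against a daily budget (the 750 GB figure and application names come from this thread; the bookkeeping itself is hypothetical):

    DAILY_QUOTA_GB = 750  # reported per-account Google Drive upload quota

    # Hypothetical per-application upload totals for one 24-hour window, in GB.
    uploads_today = {
        "CloudDrive": 500,
        "Arq": 300,
    }

    total = sum(uploads_today.values())
    remaining = DAILY_QUOTA_GB - total
    if remaining < 0:
        print(f"Over quota by {-remaining} GB; expect 403 'User rate limit "
              f"exceeded' on uploads until the quota resets (~24 hours)")
    else:
        print(f"{remaining} GB of upload quota remaining today")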



#34 borez (Member, 20 posts)

Posted 25 August 2017 - 11:45 PM

To be clear, Christopher: that actually is not the case here. When you hit the threshold it will begin to give you 403 errors ("User rate limit exceeded"), but ONLY on uploads. It will still permit downloads to function as normal. It isn't actually a wholesale API limit; they're just using the API to enforce an upload quota. Once you exceed the quota, all of your upload API calls fail, but your download calls will succeed (unless you hit the 10TB/day download quota as well).

 

 

Right. For the people who think they aren't hitting the 750GB/day limit, remember that it's a total limit across *every* application that is accessing that drive, so your aggregate cannot be more than that. But enough people have confirmed that the limit is somewhere right around 750GB/day at this point that I think we can call that settled.

 

BTW, is the 750GB/day cap enforced on a per-user-account basis, or is it enforced across the whole user group?



#35 srcrist (Advanced Member, 79 posts)

Posted 27 August 2017 - 06:05 AM

BTW, is the 750GB/day cap enforced on a per-user-account basis, or is it enforced across the whole user group?

 

That actually seems to depend. I use an edu account (a legit edu account via my alma mater), for example, and I seem to get my own personal 750GB/day quota that exists independently of any other users. But *most* people, who are using their own personal GSuite accounts on their own domain, seem to get 750GB/day for all accounts on that domain. So it would seem that Google treats different organizations differently. I'm sure they're just smart enough to recognize that 750GB/day is very low for larger organizations, but I don't know at what threshold they start treating it differently.



#36 Christopher (Drashna) (Customer and Technical Support, Administrators, 8,208 posts, San Diego, CA, USA)

Posted 27 August 2017 - 08:15 PM

srcrist, thank you for that clarification!

 

It would be nice if Google outright documented this rather than letting the community find out the hard way.

 

 

AH HA! It's because I have Arq running backups to the same GSuite account. I must be hitting the limit there.

 

And that would do it. If you're hitting the limit between the two different products... I can imagine that not taking long.






