
CloudDrive disconnects when trying to pull data


jik0n

Question

When I try to copy files from my CloudDrive over the network to another computer, the cloud drive starts the download on its end and begins the transfer. After a few minutes it disconnects and errors out, forcing a re-mount. Is there a fix for this? I tried Googling and checking around the forums and couldn't find much.

 

Thanks,

 

 

Edit:
 

I tried the fix posted in:

http://community.covecube.com/index.php?/topic/2997-automatically-re-attach-after-drive-error/

 

I set the threshold to about 35

 

So far this is working; I'll transfer more files over the next few hours/days and report back.

 

Edit: Okay, so it's been about a day, and what I'm finding is that it will still stall, but this time it doesn't disconnect. A pending transfer will stop/pause and ask me to retry, but StableBit won't disconnect. I'm going to find transfer software that auto-retries.
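For anyone else hitting the same thing, a couple of ways to get automatic retries: on Windows, robocopy can already do it (the /R and /W switches set the retry count and the wait between retries), or a small script can wrap the copy in a retry loop. Here's a rough sketch of the loop idea in Python, just as an illustration; the mount point, destination share, and retry numbers are placeholders for whatever your setup uses:

```python
import shutil
import time
from pathlib import Path

SRC = Path(r"D:\CloudDrive\Media")     # placeholder: the CloudDrive mount point
DST = Path(r"\\OtherPC\Share\Media")   # placeholder: the network destination
RETRIES = 5                            # attempts per file before giving up
WAIT_SECONDS = 30                      # pause between attempts

def copy_with_retries(src: Path, dst: Path) -> bool:
    """Copy one file, retrying if the read stalls or errors out part way through."""
    for attempt in range(1, RETRIES + 1):
        try:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            return True
        except OSError as err:
            print(f"{src}: attempt {attempt}/{RETRIES} failed ({err}), retrying in {WAIT_SECONDS}s")
            time.sleep(WAIT_SECONDS)
    return False

for path in SRC.rglob("*"):
    if path.is_file():
        copy_with_retries(path, DST / path.relative_to(SRC))
```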


8 answers to this question


35 is waaaaaaay too high. I wouldn't go over 10. 

 

It worked up until this point. Still hasn't locked up. It just disconnected again. Sigh... Is there any fix for this currently? Is there a way to tell StableBit to download files into the cache without trying to access them? Or, if they are video files and I try to play one, will it download the whole thing if I play it for just 5 seconds and then close it?
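One thing I might try, assuming that reading data through the mounted drive is enough to pull it into the local cache (which is my understanding of how the cache works), is to warm the cache with a small script that just reads the files sequentially before copying or playing them. A rough sketch, with the mount path and chunk size as placeholders:

```python
from pathlib import Path

MOUNT = Path(r"D:\CloudDrive\Media")   # placeholder: the CloudDrive mount point
CHUNK = 4 * 1024 * 1024                # read 4 MB at a time

# Read every file front to back so its blocks get downloaded into the local cache.
for path in sorted(MOUNT.rglob("*")):
    if not path.is_file():
        continue
    size = 0
    with path.open("rb") as fh:
        while chunk := fh.read(CHUNK):
            size += len(chunk)
    print(f"warmed {path} ({size / 1_048_576:.0f} MB)")
```

This would only help if the local cache is big enough to hold whatever gets warmed, of course.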


CloudDrive doesn't typically do this. So a fix isn't really something that is needed. It's likely a network issue. But I would open a ticket. 

 

In any case, if you're changing that setting to anything over 10 or so and still seeing disconnects, that's a much larger problem.

The disconnect has only happened once since I changed it. I'm not convinced it's a network issue, since when I'm not running the cloud drive I have zero issues. I can transfer terabytes of files over the network or download/upload tons of files via P2P and never have issues.

 

I'll try doing the ticket route via the troubleshooter.


OK, yeah. I think that's probably your best bet. 

 

My main point, in any case, is that the application doesn't have those problems for most people. So, network issue or not, it's probably something to tweak for your specific setup. I'm sure a ticket can help to troubleshoot that though.

Something interesting to note, and I don't know if I can mention it here, but other software similar to StableBit (though I think it's mainly used for Linux) has been getting errors while trying to pull from Google Drive / G Suite: "403: rate limit exceeded." I don't know if Google is doing something on their end, but it may come up at some point with StableBit, since the two programs act the same.
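For reference, Google's usual guidance for "rate limit exceeded" style errors (403/429) is to back off and retry rather than keep hammering the API, so presumably any client has to do something like this on its end. A generic sketch of the pattern, not tied to either program; RateLimitError and the request callable here are placeholders:

```python
import random
import time

MAX_ATTEMPTS = 6

class RateLimitError(Exception):
    """Placeholder: raised when the provider answers with a 403/429 rate-limit error."""

def call_with_backoff(request, *args, **kwargs):
    """Retry an API call on rate-limit errors, backing off exponentially with jitter."""
    for attempt in range(MAX_ATTEMPTS):
        try:
            return request(*args, **kwargs)
        except RateLimitError:
            if attempt == MAX_ATTEMPTS - 1:
                raise                                     # give up after the last attempt
            time.sleep((2 ** attempt) + random.random())  # 1s, 2s, 4s, ... plus jitter
```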


It's fine to mention rClone. They aren't really competitors. They don't even achieve the same ends. I think there is room for both.

 

In any case, it looks like Google implemented some new limits on writes per day (somewhere around 800GB/day). There is some discussion of it here: http://community.covecube.com/index.php?/topic/3050-901-google-drive-server-side-disconnectsthrottling/

 

Honestly, depending on the cache setup, it will likely affect CloudDrive less than rClone, since CloudDrive can still operate and make changes (reads AND writes) to the cache when it cannot upload new data. It may present the opportunity for new tweaks, I would guess, but, overall, the drives should continue working fine, assuming we understand the changes that have been made.
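To put the number above in perspective (taking the roughly 800GB/day figure at face value), that cap works out to about 9 MB/s of sustained upload, so a bandwidth limit somewhere under that should, in principle, keep a drive below the daily ceiling:

```python
LIMIT_BYTES_PER_DAY = 800e9          # the ~800GB/day figure mentioned above
SECONDS_PER_DAY = 24 * 60 * 60

rate_mb_s = LIMIT_BYTES_PER_DAY / SECONDS_PER_DAY / 1e6
print(f"~{rate_mb_s:.1f} MB/s (~{rate_mb_s * 8:.0f} Mbit/s) sustained upload")
# prints: ~9.3 MB/s (~74 Mbit/s) sustained upload
```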

Okay, good to know.  I've actually let Alex (the developer) know about this, so we can look into it.

 

 

That said, we don't mind competitors being discussed. There are people here who can attest that I've recommended competitors' products, so we definitely don't mind. We would rather help people find the right solution than pigeonhole them into ours.

 

 

 

But this means that scheduling and bandwidth limiting code will be implemented sooner rather than later.
