I have a few TB to transfer from an external source to CloudDrive, and very little free space available on my local hard drive.
So I used rclone copy from my external server to my Google Drive CloudDrive disk, with a bandwidth limit of around 7 MB/s to avoid a quota ban from Google.
My upload speed allows CloudDrive to upload quickly enough to keep my local cache at an acceptable level.
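For reference, the transfer described above could look something like this; the remote name, paths, and drive letter are placeholders I made up, not the actual values:

```shell
# Copy from the external server into the CloudDrive mount, capped at
# ~7 MB/s to stay under Google's daily upload quota.
# "extserver:" and "X:" are placeholder names (assumptions).
rclone copy extserver:/data/to-upload "X:/incoming" --bwlimit 7M --progress
```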
But I was still getting banned, so I ran a test:
I transferred a 1 GB file to my CloudDrive at full speed while monitoring my bandwidth, and CloudDrive uploaded around 1.2 GB.
Next, I transferred the same 1 GB file, but with a bwlimit of 5M. And that's the issue: CloudDrive uploaded around 6 GB.
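In other words, throttling the write makes the overhead explode. A back-of-envelope calculation of the upload amplification from the two tests, using the figures as reported:

```shell
# Upload amplification = bytes uploaded to the provider / bytes written.
awk 'BEGIN {
  printf "full speed : %.1fx\n", 1.2 / 1.0;  # 1 GB written, ~1.2 GB uploaded
  printf "bwlimit 5M : %.1fx\n", 6.0 / 1.0;  # 1 GB written, ~6 GB uploaded
}'
```

A plausible mechanism (my assumption, not confirmed anywhere in the post) is that CloudDrive uploads a chunk as soon as it crosses the threshold, then re-uploads that same chunk every time the slowly growing file touches it again.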
Is there an option to avoid this wasted upload traffic and wait for the file to be complete before sending it to Google Drive?
I saw the upload threshold setting, but it doesn't help here, since it can't go above 10 MB; I would need something like 30 GB.
And in my view, the threshold logic is backwards: instead of uploading once a few minutes have elapsed since the data was first written, there should be an option to wait until the data has stopped changing for a few minutes (e.g. if a file hasn't grown for one minute, it's good to go and the upload can start).
Or maybe, as a workaround, something like an API to pause/resume uploads from an external script?
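The "wait until the file stops changing" idea could be approximated today, outside CloudDrive, with a staging script: write each incoming file to a scratch directory first, and only move it into the CloudDrive mount once its size has been stable for a while. A rough sketch (Unix shell for illustration; the function name, paths, and the 60 s default interval are all assumptions):

```shell
# Return only once the file's size has been stable for $interval seconds,
# so CloudDrive never sees a still-growing file.
wait_until_stable() {
  local f="$1" interval="${2:-60}" prev=-1 size
  size=$(stat -c %s "$f")
  while [ "$size" -ne "$prev" ]; do
    prev=$size
    sleep "$interval"
    size=$(stat -c %s "$f")
  done
}

# Usage sketch: stage the file locally, wait, then move it into the mount.
# wait_until_stable /staging/bigfile.bin && mv /staging/bigfile.bin /mnt/clouddrive/
```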
Question
Malgos