Cloud Drive + G Suite = Backup disk


Bigsease30

Question

Hello All,

I have been using DrivePool for a long time now and love it. I am now looking to further protect my files, so I purchased CloudDrive and G Suite Business today. I currently have 34.6TB of total space on my server with 8.71TB free, and duplication is turned on (2x). I connected my Google Drive to CloudDrive and created a 50TB drive. That may be too much, now that I've figured out I would only actually be using 13TB of space if I removed duplication from my current setup. I configured the download to use 10 threads and the upload to use 5, with an upload limit of 70 Mbps to avoid being banned. To further add fuel to the fire, most of my data is movies and TV shows that are currently being served through Plex.
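The space figures above can be sanity-checked with a quick calculation (a rough sketch; values are taken from the numbers quoted in this post, using decimal TB):

```python
# Sanity check: with 2x duplication, unique data is roughly half the used space.
total_tb = 34.6   # total pool capacity
free_tb = 8.71    # free space reported
duplication = 2   # 2x duplication

used_tb = total_tb - free_tb        # space actually consumed on the pool
unique_tb = used_tb / duplication   # data left once duplication is removed

print(f"used: {used_tb:.2f} TB, unique: {unique_tb:.2f} TB")
# unique_tb works out to roughly 12.9 TB, matching the ~13TB mentioned above
```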

What I would like to achieve is to have a local copy of my files and have them duplicated to my CloudDrive. I would like Plex to only use the local files and not attempt to re-download anything. Doing this would let me reclaim the duplicated space on my local server. I have tried to figure it out myself but have been struggling for the last few hours, and I don't want to accidentally delete my pool and lose everything. Any help would be appreciated. I have included screenshots of my DrivePool and CloudDrive. Please let me know if you need additional information to assist me.

Thanks in advance.

(Attached screenshots: drivepool.PNG, clouddrive.PNG)

13 hours ago, srcrist said:

Sure thing. Two other things to note:

1) You'll also need to MOVE all of the existing data to the PoolPart folder created on Z: when you made the new pool. Otherwise it will not show up in the new pool for duplication to CloudDrive. 

2) The old pool no longer needs a drive letter. Move your Z: label to the NEW pool now, and applications will access the data via that. After you've moved the data to the new PoolPart folder, of course. 

OK, I moved all of the data over to the new pool and changed the drive letter back to Z:. I changed the option on the pool to allow duplication to both new drives. This seemed to work well, and data was uploading fine, until it gave me an error telling me I was being throttled because my C:\ drive was full. Sure enough, it had created a hidden folder called CloudPart.XYXYX that was consuming 50GB of space, maxing out the drive.

I checked the Cache settings and it was set to 5GB expandable. I have since changed it to Fixed, and space appears to be coming back very slowly. What is the appropriate setting so this doesn't consume my entire C: drive?


That's OK. That just means you're out of space on your cache drive. It's normal when copying tons of data, and all it will do is slow down writes to the CloudDrive. You can only upload 750GB/day to Google anyway. (Which reminds me: throttle your upload to 70-75 Mbps or you'll get banned from the upload API for 24 hours.) The expandable cache will grow to use your entire drive for writes only; the read cache is still limited to the amount you set. If it's important that it not take up the entire drive, you'll have to use a fixed cache, with the caveat that it will throttle writes to the drive as soon as the cache space fills up, instead of once the drive is out of room. So if you choose a fixed cache, make it fairly generous.
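The rate-versus-cap arithmetic behind that throttle advice can be sketched as follows (an illustrative calculation only; `gb_per_day` is a hypothetical helper, not a CloudDrive setting, and it assumes a fully sustained upload over 24 hours, which real uploads rarely are):

```python
# How much data a sustained upload rate pushes per day, compared against
# Google's 750 GB/day upload cap mentioned above. Decimal units throughout.
def gb_per_day(mbps: float) -> float:
    # megabits/s -> bytes/s -> bytes/day -> GB/day
    return mbps * 1e6 / 8 * 86400 / 1e9

for rate in (70, 75, 80):
    print(f"{rate} Mbps -> {gb_per_day(rate):.0f} GB/day")
# 70 Mbps lands right around the cap (~756 GB/day if sustained all day),
# which is why rates much above 75 Mbps risk tripping the 24-hour ban.
```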

What are your CloudDrive performance settings? We might want to make some tweaks there to speed up the overall process. 

2 hours ago, Bigsease30 said:

Hello again srcrist,

Thanks again for all the assistance you have provided thus far. This community alone is one of the biggest reasons I keep referring StableBit to friends and colleagues.

My performance settings are as follows.

Change your minimum download size to match your chunk size (probably 20MB, if you used the largest chunk possible). If you're just using this for backup, you really only need the prefetcher when making large copies off the drive, so set the trigger to 10MB in 10 seconds and have it grab maybe 100MB at a time. You can even disable it if you want; keeping it enabled will mainly smooth out network hiccups and make copies off the drive run more smoothly. Other than that, you look good.
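To see what that trigger means in practice, the suggested prefetcher numbers can be converted into a read rate (a rough sketch using the values suggested above; the Mbps threshold is derived arithmetic, not a setting CloudDrive displays):

```python
# A trigger of 10 MB within 10 seconds corresponds to a sustained read rate
# of about 8 Mbps, so steady streams (e.g. Plex playback or large copies)
# trip the prefetcher while small random reads do not.
trigger_mb, window_s, forward_mb = 10, 10, 100

trigger_mbps = trigger_mb * 8 / window_s  # MB over seconds -> megabits/s
print(f"prefetch engages above ~{trigger_mbps:.0f} Mbps of sustained reads, "
      f"then fetches {forward_mb} MB ahead")
```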

Glad to help. Hope everything works out well for you. 
