Covecube Inc.

Build 854: What are the optimal Amazon and GDRIVE settings?




I just updated to Build 854. I had an existing Amazon drive that took a day to connect because it was rebuilding the drive (2.41 TB), and I plan to add a Google Drive as well.

I would like to know the best settings to optimize performance.


For Amazon (Encrypted), I have it set up this way:




I/O Performance

I/O Manager

Download threads: 4

Upload threads: 2 (Background I/O checked)

Download throttling: 50 Mbps

Upload throttling: 20 Mbps (to still be able to use my internet while uploading)

Upload threshold: 1.00 MB or 0 minutes


Enable prefetching: Checked

Prefetch trigger: 5 MB

Prefetch forward: 100 MB

Prefetch time window: 300 seconds
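As a back-of-the-envelope sanity check on those numbers (my own math, nothing from the software itself): a 50 Mbps download throttle is 6.25 MB/s, so filling the 100 MB prefetch takes about 16 seconds, well inside the 300-second window:

```python
# Rough prefetch math for the settings above (illustrative only).
download_throttle_mbps = 50   # Mbps (megabits per second)
prefetch_forward_mb = 100     # MB to prefetch ahead
prefetch_window_s = 300       # seconds

throughput_mb_s = download_throttle_mbps / 8   # convert Mbps to MB/s
prefetch_time_s = prefetch_forward_mb / throughput_mb_s

print(f"Throughput: {throughput_mb_s} MB/s")       # 6.25 MB/s
print(f"Prefetch fill time: {prefetch_time_s} s")  # 16.0 s
assert prefetch_time_s < prefetch_window_s
```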


Now I am about to create an unencrypted Google Drive. What would be the best options for that one?


Sector Size: 4.00 KB or 512 B?

Storage chunk size: Various options there, which one would you recommend?

Chunk cache size: Again, various options, which one would you recommend?

Minimum download size: Should I leave it to "OFF" or should I select an option?

Verify chunk integrity: Required for GDRIVE?

Cluster size: Again, various options, which one would you recommend?


Once it's created, should I use the same settings I used for Amazon?




Amazon Cloud Drive (aka Amazon Drive) needs "upload verification" enabled. If you don't have it enabled, weird issues can sometimes occur.


For sector size, this is the "physical" sector size: 4096 bytes is "Advanced Format", while 512 bytes is the traditional size.

The Storage chunk size is the size of the chunks stored on the provider.  Larger can be better, but it depends on a number of factors. 

Minimum download size can be changed, as this affects the size of the partial "units" that we grab. Setting this higher may increase speed but may also increase disk latency. 

Verify chunk integrity means that it stores a checksum for the chunks, so that you can be more certain that the data isn't getting corrupted. 
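The idea is simply a hash stored per chunk that gets re-checked on download. A minimal sketch of that pattern (this is illustrative only; the hash algorithm and storage format here are my assumptions, not CloudDrive's actual implementation):

```python
import hashlib

def chunk_checksum(data: bytes) -> str:
    # Hash the chunk's contents; SHA-256 here is an assumption for illustration.
    return hashlib.sha256(data).hexdigest()

def verify_chunk(data: bytes, stored_checksum: str) -> bool:
    # On download, recompute the hash and compare it to the checksum
    # that was stored alongside the chunk at upload time.
    return chunk_checksum(data) == stored_checksum

chunk = b"example chunk payload"
stored = chunk_checksum(chunk)             # saved with the chunk at upload
print(verify_chunk(chunk, stored))         # True: data intact
print(verify_chunk(chunk + b"x", stored))  # False: corruption detected
```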

Cluster size is the NTFS cluster size, and it affects the maximum volume size. The default is fine if the drive is 10 TB or smaller, but if you plan on making the drive larger, the default changes. If you plan on eventually making a very large drive, setting it to 64 KB may be a good idea.
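The cluster-size/volume-size relationship follows from NTFS supporting at most 2^32 − 1 clusters per volume, so the maximum volume size scales with cluster size (this is the general NTFS rule, not a CloudDrive-specific limit):

```python
# Max NTFS volume size = cluster size * (2**32 - 1) clusters.
CLUSTERS_MAX = 2**32 - 1

for cluster_kb in (4, 8, 16, 32, 64):
    max_bytes = cluster_kb * 1024 * CLUSTERS_MAX
    max_tb = max_bytes / 1024**4
    print(f"{cluster_kb:>2} KB clusters -> ~{max_tb:.0f} TB max volume")
```

So the default 4 KB cluster tops out around 16 TB, while 64 KB clusters allow roughly 256 TB.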
