ndr
Members · 7 posts

Everything posted by ndr

  1. Hi,

    TL;DR: For some reason all of my GDrive disks are now dynamic disks and I don't know why this happened. Is it possible to fix this without data loss, or is my 60TB of data gone?

    Longer story: I have been using GDrive for a while now to store all my media content, and everything has worked wonderfully for several months. I started with CloudDrive (Google Drive) and a single 60TB drive, and after some trial and error I grew this to three 60TB drives and added DrivePool to the setup to pool everything. I chose not to use the duplication option in DrivePool (all plugins disabled), since that would burn through the daily upload limit (according to what I've read). Some sector settings etc. might not have been optimal from the beginning either, but everything worked without any issues so I didn't worry about it.

    But something broke today and I don't know why. I had one physical drive (with multiple partitions) which I only used for VMs on my PC, and today I decided to format it. As soon as I pressed the format button, DrivePool started warning me about missing disks. This is a mystery to me, because that drive and its partitions were never part of the GDrive pool. So I started troubleshooting CloudDrive/DrivePool, reinstalled DrivePool, etc., but nothing helped. After some Google searches on missing volumes in DrivePool I came across "dynamic disks", and when I looked at my partitions in Windows I realized that yes, all of the GDrives are dynamic. I don't know when or why this happened, but now the whole setup is broken because dynamic disks seem to be unsupported by DrivePool. I can still see the content if I manually add a drive letter to the GDrives (so I haven't lost everything yet), but since it is split across three volumes the content is not the same on each of them, and I don't really know how to repair this without messing it up even more.

    So my questions are:

    Is there any way to solve this easily? I have read a bit about converting the disks, but most of the guides involve deleting the partitions. I assume that would destroy all my data too, even though it's stored in the cloud? (A small sketch for checking which disks Windows considers dynamic follows after this post list.)

    Should I create a new GDrive disk (with optimal settings from the start), partition it, and then move all data to this new drive? This will probably take ages because of the daily upload cap, but at least I won't lose anything. If this is the solution, please give me some tips on which settings I should use in CloudDrive/DrivePool. I only stream Plex media, the content is 1080p or lower, and my connection is 1Gbit/1Gbit.

    Could I maybe solve this by dropping DrivePool and just keeping CloudDrive? That would make things more complicated, since I would have to switch drive paths whenever a drive gets full, so I hope this is the last resort.

    Any other options? I appreciate all the help I can get. Thanks!
  2. Hi,

    When I started with CloudDrive I created a 60TB volume and thought that would be enough, but I will soon be out of space. I need to increase this volume but I'm not sure what options I have. What is the easiest way to solve this? I tried the "resize" option, but it seems the sector size setting (4KB) I chose for the drive won't let me increase it any further. Is it possible to recreate the drive from scratch without removing the content already uploaded to GDrive?
  3. ndr

    Question about my setup

    Thanks for the explanation @srcrist!
  4. Hi,

    Does anyone have experience streaming remux movies from GDrive through CloudDrive/Plex? I have a 500Mbit/500Mbit connection, and 720p/1080p and even 2160p movies stream without any issues, but remuxes don't: the buffering is constant from the start and never gets better. I have tried streaming on my Apple TV 4K (ethernet cable) and on my iPad (5GHz WiFi), and both have the same issue. Are there any settings I need to change in Plex or CloudDrive to make it work?
  5. ndr

    Question about my setup

    Hi @srcrist, thanks for the reply. One thing I'm wondering about is the "temp folder" location. Since everything is first downloaded to a temp folder that sits on my GDrive before being handled, does that increase the data usage? Since there is a cap of 750GB/day, I want to avoid unnecessary traffic between my PC and GDrive. For example, which of these two scenarios sends the least data to GDrive?

    #1
    1. Download files directly to the GDrive temp folder
    2. Extract the files in the same folder
    3. Move the .mkv file to the \Movies\ folder on the same GDrive
    4. Delete the .rar files

    #2
    1. Download files to a local temp folder on my PC
    2. Extract the files in the same folder
    3. Move the .mkv files to the \Movies\ folder on GDrive
    4. Delete the locally stored .rar files

    My first guess is that #1 sends more data, but I don't know how Google/CloudDrive handles steps 2-4 and what impact they have on data usage. (A rough back-of-the-envelope comparison follows after this post list.) Sorry for my stupid questions, but I am not that good at this.
  6. Hi,

    I'm going to evaluate DrivePool together with CloudDrive. My goal is to use the two of them to work around the 750GB/day limit, and on the backend I have Plex running to consume the data. I have two GSuite accounts and have mapped two drives with CloudDrive. Currently only one of the GDrives has data on it (2TB). I installed DrivePool, left everything at the defaults, and added the two GDrives to one pool.

    My question is: since the pool is empty from the start, what should I do from here? Copy/paste all data from the GDrive to the pool drive? And are there any default settings I should tweak or turn on/off before I start using this?

    EDIT: What happens to the data if I don't want two drives in the future? Can you just delete one of the virtual GDrives and still keep all the data, or do I need to transfer it in some way first?

    Thanks!
  7. Hi,

    I just started using CloudDrive for syncing data to Google Drive, and so far it works great, uploading around 750GB/day. I tried tweaking some settings based on user recommendations here, but I have some questions about my workflow and whether I should change it for optimal performance.

    My current setup looks like this:

    Hardware and connection:
    - 2x M.2 SSDs with the following volumes:
      - Disk 1 - C: 500GB (OS)
      - Disk 2 - E: 500GB (CloudDrive cache: 30GB, expandable)
    - CloudDrive D: 60TB (Google Drive)
    - 500/500Mbit fiber connection

    I/O performance settings:
    - Download threads = 10
    - Upload threads = 5
    - Download throttling = 200 Mbit/s
    - Upload throttling = 75 Mbit/s
    - Upload threshold = 1.00 MB or 5 min
    - Minimum download size = 20 MB
    - Prefetch trigger = 10 MB
    - Prefetch forward = 100 MB
    - Prefetch time window = 10 sec

    My current workflow looks like this:
    > Radarr/Sonarr/qBittorrent send data to D:\Temp\
    > CloudDrive is set to use E: as the cache volume.
    > Radarr/Sonarr/qBittorrent do their magic, sort the data, and move it from D:\Temp\ to different subfolders like D:\Movies etc.

    Then I have Plex on the same machine, installed on the C: drive, grabbing all the data (you know the drill).

    My question is: as you can see from the workflow above, everything lands directly on the virtual drive and gets cached on a different drive. Is this a good way to transfer files to GDrive, or should I change where my temp data is stored before sending it to Google Drive? Or is the application smart enough to handle it? I haven't run into any issues so far, but maybe I'm doing it wrong and there are more "optimal" ways to do it. (A quick arithmetic sketch of what these prefetch numbers mean for streaming follows after this post list.)
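
Sketch for post 1: a minimal way to double-check which disks Windows reports as dynamic before deciding how to migrate anything. This assumes Windows and an elevated (administrator) prompt; it only drives the built-in diskpart tool in script mode, and the file name and helper function are made up for illustration. Nothing here converts or modifies anything. Note that diskpart's own "convert basic" only works once every volume on the dynamic disk has been deleted, which is why most guides say to delete the partitions first.

    # check_dynamic_disks.py - a minimal sketch, assuming Windows and an
    # elevated (administrator) prompt. It runs the built-in diskpart tool via
    # its /s script mode and prints the "list disk" table; dynamic disks are
    # marked with '*' in the "Dyn" column of that output.
    import os
    import subprocess
    import tempfile

    def diskpart_list_disks() -> str:
        """Run 'diskpart /s <script>' with a one-line 'list disk' script."""
        with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
            f.write("list disk\n")
            script_path = f.name
        try:
            result = subprocess.run(
                ["diskpart", "/s", script_path],
                capture_output=True,
                text=True,
                check=True,
            )
            return result.stdout
        finally:
            os.remove(script_path)

    if __name__ == "__main__":
        print(diskpart_list_disks())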
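
Sketch for post 5: a rough back-of-the-envelope comparison of the two scenarios, assuming that every byte written to the CloudDrive volume is eventually uploaded once, that a move within the same volume does not re-upload the data, and that deleting the .rar files afterwards does not undo an upload that already happened. The sizes are made-up example numbers; CloudDrive's cache and chunking are not modeled.

    # Rough upload-traffic estimate for the two scenarios in post 5.
    # Assumption: every byte written to the CloudDrive volume is uploaded once,
    # a move/rename within the same volume does not trigger a re-upload, and
    # deleting the .rar files afterwards does not undo the upload that already
    # happened. These are simplifications, not a model of CloudDrive itself.

    GIB = 1024 ** 3

    def scenario_1_upload(rar_size: int, mkv_size: int) -> int:
        """Download .rars straight to the cloud drive, extract there, move, delete."""
        # The .rar set is written to the cloud drive (step 1) and the extracted
        # .mkv is written to the cloud drive as well (step 2): both get uploaded.
        return rar_size + mkv_size

    def scenario_2_upload(rar_size: int, mkv_size: int) -> int:
        """Download and extract locally, then move only the .mkv to the cloud drive."""
        # The .rar set stays on the local disk; only the final .mkv is uploaded.
        return mkv_size

    if __name__ == "__main__":
        rar = 31 * GIB   # example size of the downloaded .rar set (made up)
        mkv = 30 * GIB   # example size of the extracted .mkv (made up)
        print(f"#1 uploads about {scenario_1_upload(rar, mkv) / GIB:.0f} GiB")
        print(f"#2 uploads about {scenario_2_upload(rar, mkv) / GIB:.0f} GiB")

Under these assumptions, #1 costs roughly twice the upload traffic of #2 for the same release, which matches the first guess in the post; whether extracting inside the cloud drive also causes extra download traffic depends on whether the .rar chunks are still in the local cache when the extraction reads them.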
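
Sketch for post 7: plain arithmetic on the numbers already listed in that post (100 MB prefetch forward, 200 Mbit/s download throttle), just to show how much playback time the prefetched data covers at a few assumed video bitrates. The bitrates are illustrative assumptions, and this models nothing about how CloudDrive actually schedules its prefetcher.

    # Arithmetic only: how far does 100 MB of prefetched data get you, and how
    # long does it take to fetch at the 200 Mbit/s download throttle from post 7?
    # The bitrates below are assumed example values, not measurements.

    PREFETCH_FORWARD_MB = 100       # "Prefetch forward" from the post
    DOWNLOAD_THROTTLE_MBIT = 200    # "Download throttling" from the post

    def seconds_of_playback(prefetch_mb: float, bitrate_mbit: float) -> float:
        """Seconds of video covered by the prefetched data at a given bitrate."""
        return prefetch_mb * 8 / bitrate_mbit

    def seconds_to_fetch(prefetch_mb: float, throttle_mbit: float) -> float:
        """Seconds needed to pull that much data at the download throttle."""
        return prefetch_mb * 8 / throttle_mbit

    if __name__ == "__main__":
        for label, bitrate in [("1080p", 10), ("2160p", 40), ("UHD remux", 80)]:
            covered = seconds_of_playback(PREFETCH_FORWARD_MB, bitrate)
            print(f"{label:>10} @ {bitrate:3d} Mbit/s: "
                  f"100 MB covers ~{covered:.0f} s of playback")
        print(f"Fetching 100 MB at {DOWNLOAD_THROTTLE_MBIT} Mbit/s takes "
              f"~{seconds_to_fetch(PREFETCH_FORWARD_MB, DOWNLOAD_THROTTLE_MBIT):.0f} s")

At the assumed remux bitrate the 100 MB window only covers about ten seconds of playback, which may also be relevant to the remux buffering described in post 4; treat this purely as arithmetic, not as a CloudDrive tuning recommendation.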