Posts posted by kird

  1. 49 minutes ago, Jellepepe said:

    Why do you say that? I am many times over the limit and never had any issues until a few weeks ago.
    I was also unable to find any documentation or even a mention of it when I first encountered the issue, and I even contacted Google support, who were totally unaware of any such limits.

    Fast forward two weeks, more people start having the same 'issue', and suddenly we can find documentation on it from Google's side.

    Sorry, but this rule already existed when I started paying for my current gdrive, roughly 3 or 4 months ago; it was being discussed in Telegram groups at the time. I'm not making anything up, mate.

  2. Personally, if I'm not forced to do it, I'll stay exactly as I am now. I don't want to even think about doing this with 121 TB of data... even if everything went well with no loss along the way, I'm sure the whole process would take months.

    A note to the developers: for those of us who aren't having any problems, please don't let future stable releases of SCD force this migration because of the API issue. Please keep releasing versions where this step isn't mandatory; we have no need for it at the moment, we've been on gdrive for years, and that won't change if there's no need to use our own API.

  3. Look at how many clients have this problem with their gdrive and StableBit CloudDrive: everyone here (and there aren't many) reporting the error has something in common, they all had their own API key configured. That is what really changed on Google's side to produce the error being reported. I haven't seen anyone say they hit this bug with the standard API. Without getting into technical details, it seems clear this is not an SCD incompatibility but an API incompatibility; in my case I never needed my own API key to work with my gdrive day to day. Perhaps others have needed it. If you feel safer upgrading to the beta when you say you aren't having problems yourself... go ahead and good luck.

  4. 47 minutes ago, darkly said:

    I mean, I'd imagine I'd have to sooner or later. The child directory/file limit still exists on google drive, and 1178 isn't equipped to handle that in any capacity. I'm just wondering why I haven't gotten any issues yet considering how much data I have stored.

    Here's one user with 121 TB used and no problems at all, so I don't need anything, and I certainly don't need to risk losing my data with an upgrade I don't need. Perhaps the secret lies in never having configured a personal API key (I've always been on the standard/default Google API).

  5. 2 hours ago, darkly said:

    Soo..... I'm still on version 1.1.2.1178 (been reluctant to upgrade with the various reports I keep seeing), but my gdrive has over 100TB of data on it and I've never seen the error about the number of children being exceeded. I use my own API keys. Have I just been extremely lucky up until this point?

    My advice: don't even think about upgrading.  :D

  6. Have the developers made any formal announcement? Why the hurry to migrate to a new system on a beta version when you don't even know where it leads? If that's what it comes to, I don't intend to adopt this system without an official announcement from the developers first; with 130 TB I don't think it's a good idea. That said, with my current version, which isn't even the latest officially released stable one, I have no problem with Gdrive at the moment; everything is perfect. For the record, I'm on the standard Google API.

  7. 3 minutes ago, JulesTop said:

    When I tried adding a file to the clouddrive content folder manually (via the gdrive web interface)

    I have never been able to download or upload anything via the gdrive web interface; all the content is encrypted. How do you do it? In my case I always upload/download content through the mounted network drive or Air Explorer (great for managing clouds), never from the Google web interface.

    Yep, hopefully they can fix this nasty error or at least give us some information.

     

  8. Yes, I was of course referring to the recycle bin (trash) of your gdrive account; the Windows one plays no role here. As I understand it, the point is not the amount of GB/TB stored on the drive but the number of items that can sit directly inside a single folder of the account.

    If I understand correctly, according to this information the error occurs when a single folder on the drive reaches 500,000 direct children (folders, files, shortcuts...):

    "A numChildrenInNonRootLimitExceeded error occurs when the limit for a folder's number of children (folders, files, and shortcuts) has been exceeded. There is a 500,000 item limit for folders, files, and shortcuts directly in a folder. Items nested in subfolders do not count against this 500,000 item limit."

  9. 4 hours ago, JulesTop said:

    I doubt I hit my file and folder limit. The reason I say that is that I deleted a few folders and their files, and nothing changed. I still get this error and my uploads are hanging.

    I'm wondering if this is happening to anyone else...

    Just to rule it out: have you deleted the files from the recycle bin (trash)? Unless you have configured it otherwise, deleted files go there by default, and if they are not purged they still count.
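
    If emptying the trash through the web interface is tedious, it can also be done via the Drive v3 API. A minimal sketch in Python (google-api-python-client), where token.json is again a placeholder credentials file:

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # placeholder credentials file; emptying the trash requires the full "drive" scope
    creds = Credentials.from_authorized_user_file(
        "token.json", ["https://www.googleapis.com/auth/drive"])
    service = build("drive", "v3", credentials=creds)

    # permanently removes everything currently in the account's trash
    service.files().emptyTrash().execute()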

  10. Thanks Christopher, unfortunately it didn't work out; it threw this:

    from admin cmd -->   sdelete64 -s I:UNT\oue

    Error deleting I:UNT\oue:
    The file or directory is damaged or unreadable.

    Any other suggestions?

    Thanks for your time.

    PS: on the other hand, if I create a normal directory with files in it inside the Gdrive, it is able to delete that one.

     
