Title: "list_bucket max-keys 200" error Post by: OutThere on July 15, 2008, 09:56:17 AM Hi
I'm trying to back up a couple hundred gigabytes from a NAS drive on our company's network using s3sync. Some (most?) of the data shows up in S3, so it is being backed up, but the script I run always gives this error:

S3 command failed: list_bucket max-keys 200 prefix Restores/Restores/restore_2008_04_03_12_32_255159/C/TeamCity/ delimiter /
With result 500 Internal Server Error
99 retries left, sleeping for 30 seconds
S3 command failed: list_bucket max-keys 200 prefix Restores/Restores/restore_2008_04_03_12_32_255159/C/TeamCity/bin/ delimiter /
With result 500 Internal Server Error
98 retries left, sleeping for 30 seconds

And on and on... What does the "list_bucket max-keys 200" part mean? How can I fix it? Is it due to a limitation on the length of the file path?

Title: Re: "list_bucket max-keys 200" error
Post by: ferrix on July 15, 2008, 01:55:42 PM

It's just part of the command that "failed". 500 errors are normal with S3 and mean "try again shortly".
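To unpack the command in that error: list_bucket is s3sync's bucket-listing request, i.e. S3's "GET Bucket" (ListObjects) operation, and max-keys, prefix, and delimiter are its query parameters, so each failing line is just an attempt to list up to 200 keys under one directory-like prefix. It has nothing to do with path length; S3 allows keys up to 1024 bytes, and the paths in the log are well under that. Here is a minimal sketch of the equivalent request in Python with boto3 (a modern client, not anything s3sync itself uses; the bucket name is made up):

import boto3

s3 = boto3.client("s3")

# The same listing s3sync was attempting: up to 200 keys under one
# prefix, one directory level at a time. The bucket name is made up.
resp = s3.list_objects(
    Bucket="my-backup-bucket",
    Prefix="Restores/Restores/restore_2008_04_03_12_32_255159/C/TeamCity/",
    Delimiter="/",
    MaxKeys=200,
)
for obj in resp.get("Contents", []):
    print(obj["Key"])

A transient 500 on this request means S3's server side hiccuped; retrying the identical request after a pause, as s3sync does, is the intended handling.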
Title: Re: "list_bucket max-keys 200" error Post by: OutThere on July 15, 2008, 02:52:40 PM Good to know, thanks. Is it a bug that S3sync actually waits for way way more than 30 seconds sometimes?
Title: Re: "list_bucket max-keys 200" error Post by: ferrix on July 16, 2008, 12:12:30 PM Good to know, thanks. Is it a bug that S3sync actually waits for way way more than 30 seconds sometimes? It would be a bug, but I bet it's really just moving on and no other lines get printed. You can also decrease the delay by setting a parameter (see readme) |