S3Sync.net
  Show Posts
1  General Category / Closed Bugs / Re: Local Directory Structure on: November 20, 2007, 09:32:24 PM
I see that I didn't do enough research on this issue.  Sorry for being a jerk.
2  General Category / Closed Bugs / Re: Downloading Not Required on: November 20, 2007, 09:24:38 PM
I think this situation is tied to my other issue.  If I sync a single folder with files but no sub folders, it works.  If I sync a folder that has sub folders, it fails because those sub folders don't already exist locally, and THEN I get this result.  Sorry for not identifying this earlier.

This should duplicate the results:
1. On AWS, create folders 100, 200, 300, 400
2. On AWS, put files in each of these folders
3. Locally create folder 100
4. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET (the files in 100 will download, and then this will fail because the other local folders don't already exist)
5. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET (the files in 100 will download again and it will fail again)
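
A possible workaround is to pre-create the local folder layout before step 4.  A minimal Ruby sketch (not part of s3sync; the prefix list and local path are hard-coded from the steps above):

    # Workaround sketch: pre-create the local folders so s3sync has
    # somewhere to write the downloaded files.
    require 'fileutils'

    local_root = '/usr/local/backup/s3_aws/BUCKET'
    %w[100 200 300 400].each do |prefix|
      FileUtils.mkdir_p(File.join(local_root, prefix))
    end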

3  General Category / Closed Bugs / Re: Local Directory Structure on: November 20, 2007, 03:24:27 PM
ferrix:  I appreciate your reply.  The folder nodes DO exist on the S3 side.  Just not on the local side.  I want to duplicate the layout I have on S3 locally.
4  General Category / Closed Bugs / Re: Downloading Not Required on: November 20, 2007, 10:03:07 AM
Here are the steps to duplicate.

1. Download v1.1.4 of s3sync
2. Create a bucket with a couple of large files (large files make it easier to see that they are downloading)
3. (In the bucket, I have these large files in sub directories)
4. Create the same local directory structure (you have to do this because of another _bug_ I have posted about)
5. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET
6. Watch the files download:
       Create node /100/Another_test.m4v
       Progress: 5114057b  1273489b/s  98%
7. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET
8. Watch the files download again
       Create node /100/Another_test.m4v
       Progress: 5114057b  1273489b/s  98%
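
To confirm that the first run really did leave a usable copy on disk, here is a quick check sketch in Ruby (the path is just the example file from step 6):

    # Quick check sketch: the file is already present locally after the
    # first run, so the second run should have nothing left to download.
    require 'digest/md5'

    path = '/usr/local/backup/s3_aws/BUCKET/100/Another_test.m4v'
    if File.exist?(path)
      puts "size: #{File.size(path)} bytes"
      puts "md5:  #{Digest::MD5.file(path).hexdigest}"
    else
      puts 'file missing locally'
    end
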
5  General Category / Closed Bugs / Re: Local Directory Structure on: November 20, 2007, 09:57:54 AM
ferrix: The result of your "design decision" is that the tool only does half the job.  I did search, and I did see your earlier post.  No one has clearly articulated the problem before, so I am trying to.  This tool _should_ be able to copy files from S3 to local, but it can't.  As it is, it is only useful for uploading.  You might as well remove the download function, so that you aren't setting the wrong expectation.
6  General Category / Closed Bugs / Re: Local Directory Structure on: November 20, 2007, 09:57:23 AM
maelcum: I don't want to sync TO S3.  I am backing up the files FROM S3.  In fact, my application would be totally broken if I ever synced to S3.

I am running version 1.1.4

Here is the error:
# ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET
Create node /100/test.jpg
./s3sync.rb:614:in `initialize': No such file or directory - /usr/local/backup/s3_aws/BUCKET/100/test.jpg.s3syncTemp (Errno::ENOENT)
        from ./s3sync.rb:614:in `open'
        from ./s3sync.rb:614:in `updateFrom'
        from ./s3sync.rb:378:in `main'
        from ./s3sync.rb:708
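
For what it's worth, the ENOENT goes away once the parent directory exists before the temp file is opened.  A rough Ruby sketch of the idea (not the actual s3sync.rb code; the path is just the one from the error above):

    # Sketch of the fix idea, not real s3sync.rb code: make sure the local
    # parent directory exists before opening the .s3syncTemp file.
    require 'fileutils'

    temp_path = '/usr/local/backup/s3_aws/BUCKET/100/test.jpg.s3syncTemp'
    FileUtils.mkdir_p(File.dirname(temp_path))  # creates .../BUCKET/100 if missing
    File.open(temp_path, 'wb') do |f|
      # ... write the downloaded bytes here ...
    end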


7  General Category / Closed Bugs / Local Directory Structure on: November 19, 2007, 01:00:51 PM
I am trying to use s3sync to back up the AWS files locally.

My AWS bucket has a directory structure and I want to mirror that locally.  But s3sync fails if the local directory doesn't already exist.

If I create the directory by hand, it downloads the files and then complains about the next directory.
8  General Category / Closed Bugs / Downloading Not Required on: November 19, 2007, 12:57:54 PM
I want to use s3sync to back up the AWS files locally.

s3sync seems to download the files every time, even when they already exist locally.  It should check whether the file exists and skip it if it does, or at least offer this as an option.
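
Something along these lines is what I have in mind, as an illustrative Ruby sketch only (not s3sync's internals; the ETag value would come from the S3 listing, and this assumes a plain, non-multipart ETag):

    # Illustrative sketch of the skip check: compare the local file's MD5
    # against the ETag reported by S3 and only download when they differ.
    require 'digest/md5'

    def needs_download?(local_path, remote_etag)
      return true unless File.exist?(local_path)
      Digest::MD5.file(local_path).hexdigest != remote_etag.delete('"')
    end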

Thanks.