S3Sync.net
Author Topic: Downloading Not Required
timdau
« on: November 19, 2007, 12:57:54 PM »

I want to use s3sync to back up my AWS files locally.

s3sync seems to download the files every time, even when they already exist locally. It should check whether a file already exists and skip it if it does, or at least offer an option to do so.
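
Something like this pre-check is what I have in mind. This is only a rough Ruby sketch of the idea (the method name and the size-based comparison are my own illustration, not s3sync's actual code):
Code:
# Skip the download when a local copy already exists with the same size.
def needs_download?(local_path, remote_size)
  return true unless File.exist?(local_path)  # no local copy yet: fetch it
  File.size(local_path) != remote_size        # re-fetch only on a size mismatch
end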

Thanks.
ferrix
(I am greg13070 on AWS forum)
« Reply #1 on: November 19, 2007, 08:30:31 PM »

Quote from: timdau on November 19, 2007, 12:57:54 PM
> I want to use s3sync to back up my AWS files locally.
>
> s3sync seems to download the files every time, even when they already exist locally. It should check whether a file already exists and skip it if it does, or at least offer an option to do so.

If the data was already synced down to local once with s3sync, then I agree that subsequent syncs should not re-copy the files unless they have changed.  If you post a detailed set of steps, including the directory and file names and the order you ran things, I may be able to help more.
timdau
« Reply #2 on: November 20, 2007, 10:03:07 AM »

Here are the steps to duplicate:

1. Download v1.1.4 of s3sync.
2. Create a bucket with a couple of large files; larger files make it easier to see that they are being downloaded (see the sketch after these steps).
3. (In my bucket, the large files are in subdirectories.)
4. Create the same local directory structure (you have to do this because of another _bug_ I have posted about).
5. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET
6. Watch the files download:
       Create node /100/Another_test.m4v
       Progress: 5114057b  1273489b/s  98%
7. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET
8. Watch the files download again:
       Create node /100/Another_test.m4v
       Progress: 5114057b  1273489b/s  98%
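
For step 2, one quick way to make a large test file to upload (assuming GNU dd; the file name is just an example):
Code:
dd if=/dev/urandom of=Another_test.m4v bs=1M count=5    # ~5 MB of random data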
ferrix
« Reply #3 on: November 20, 2007, 05:47:02 PM »

I don't see the same behavior.  Although I used small files, the results below plainly show it does not re-get:
Code:
C:\Users\Gregory\s3sync>s3sync.rb -r -v --progress ServEdgeTest: ./restore
Create node 100/ClearSigning.pfx
Create node 100/chain.cer

C:\Users\Gregory\s3sync>s3sync.rb -r -v --progress ServEdgeTest: ./restore

C:\Users\Gregory\s3sync>

Please try yours again, but add -d so I get some idea what the generators and comparator are thinking.
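
That is, with the paths from your earlier post, something like:
Code:
./s3sync.rb -r -v -d --progress BUCKET: /usr/local/backup/s3_aws/BUCKET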
timdau
« Reply #4 on: November 20, 2007, 09:24:38 PM »

I think this is tied to my other issue.  If I sync a single folder with files but no subfolders, it works.  If I sync a folder that has subfolders, the sync fails because those local subfolders don't already exist, and THEN I get the re-downloading.  Sorry for not identifying this earlier.

This should duplicate the results (a workaround sketch follows the steps):
1. On AWS, create folders 100, 200, 300, 400.
2. On AWS, put files in each of these folders.
3. Locally, create only folder 100.
4. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET (the files in 100 download, then the run fails because the other local folders don't exist yet)
5. ./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET (the files in 100 download again and it fails again)
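
Until that other bug is fixed, the workaround I mentioned is to pre-create the local tree before syncing, e.g. (using bash brace expansion, paths as above):
Code:
mkdir -p /usr/local/backup/s3_aws/BUCKET/{100,200,300,400}
./s3sync.rb -r -v --progress BUCKET: /usr/local/backup/s3_aws/BUCKET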

ferrix
« Reply #5 on: November 20, 2007, 11:59:22 PM »

Then I'm moving this issue to closed, since it's covered by your other report.