Hello,
I managed to get s3sync to upload my test folder to Amazon S3, and I can see it in the AWS Management Console.
Downloading the data back to a test folder results in the following error message:
root@mybucketname:/var/s3sync# ./week_download.sh
s3Prefix backups/weekly
localPrefix /var/s3sync/testdown/weekly
s3TreeRecurse mybucketname backups/weekly
Creating new connection
Trying command list_bucket mybucketname prefix backups/weekly max-keys 200 delimiter / with 100 retries left
Response code: 200
prefix found: /
s3TreeRecurse mybucketname backups/weekly /
Trying command list_bucket mybucketname prefix backups/weekly/ max-keys 200 delimiter / with 100 retries left
Response code: 200
S3 item backups/weekly/
s3 node object init. Name: Path:backups/weekly Size:0 Tag:d41d8cd98f00b204e9800998ecf8427e Date:Fri Oct 29 14:21:53 UTC 2010
local node object init. Name: Path:/var/s3sync/testdown/weekly/ Size: Tag: Date:
source:
dest:
Update node
s3sync.rb:638:in `initialize': No such file or directory - /var/s3sync/testdown/weekly/.s3syncTemp (Errno::ENOENT)
from s3sync.rb:638:in `open'
from s3sync.rb:638:in `updateFrom'
from s3sync.rb:393:in `main'
from s3sync.rb:735
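
If I'm reading the traceback right, the open at s3sync.rb:638 fails because the local directory /var/s3sync/testdown/weekly doesn't exist yet, so there is nowhere to write the .s3syncTemp file. The same Errno::ENOENT can be reproduced by hand (just a sanity check on my part, not anything from the s3sync docs):

touch /var/s3sync/testdown/weekly/.s3syncTemp
# touch: cannot touch '/var/s3sync/testdown/weekly/.s3syncTemp': No such file or directory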
I am using the following download script:
#!/bin/bash
# script to download the weekly backup from S3 to a local test directory
cd /var/s3sync/
export AWS_ACCESS_KEY_ID=nothing to see here
export AWS_SECRET_ACCESS_KEY=nothing to see here
export SSL_CERT_DIR=/var/s3sync/certs
ruby s3sync.rb -r -v -d --progress --make-dirs mybucket:backups/weekly /var/s3sync/testdown
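
Given that, one workaround I'm considering is to pre-create the destination tree before the sync runs. This assumes --make-dirs only creates directories below the destination prefix and not the top-level weekly directory itself, which is a guess on my part:

# Pre-create the local destination so s3sync's temp file has somewhere to go
# (assumption: --make-dirs does not create this top-level directory itself).
mkdir -p /var/s3sync/testdown/weekly

# then run the same sync command as in the script above
ruby s3sync.rb -r -v -d --progress --make-dirs mybucket:backups/weekly /var/s3sync/testdown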
Any ideas? Does the download script need to download to the same folder that was the source of the upload to Amazon S3, i.e. the testup folder? I was hoping that in the event of a complete failure, when the original folders no longer exist, it would simply download everything for me.
Note: I changed my bucket name to "mybucketname" so that it is not public!