1  General Category / Questions / Re: Last Hurdle - Won't Download on: October 30, 2010, 05:06:17 PM
Edited the above post: I had pasted in my upload script example rather than my download script!
2  General Category / Questions / Last Hurdle - Won't Download on: October 29, 2010, 09:33:56 AM

I managed to get s3sync to upload my test folder to Amazon S3 and can see it in the AWS Management Console.

Downloading the data back to a test folder results in the following error message:

root@mybucketname:/var/s3sync# ./week_download.sh
s3Prefix backups/weekly
localPrefix /var/s3sync/testdown/weekly
s3TreeRecurse mybucketname backups/weekly
Creating new connection
Trying command list_bucket mybucketname prefix backups/weekly max-keys 200 delimiter / with 100 retries left
Response code: 200
prefix found: /
s3TreeRecurse mybucketname backups/weekly /
Trying command list_bucket mybucketname prefix backups/weekly/ max-keys 200 delimiter / with 100 retries left
Response code: 200
S3 item backups/weekly/
s3 node object init. Name: Path:backups/weekly Size:0 Tag:d41d8cd98f00b204e9800998ecf8427e Date:Fri Oct 29 14:21:53 UTC 2010
local node object init. Name: Path:/var/s3sync/testdown/weekly/ Size: Tag: Date:
Update node
s3sync.rb:638:in `initialize': No such file or directory - /var/s3sync/testdown/weekly/.s3syncTemp (Errno::ENOENT)
from s3sync.rb:638:in `open'
from s3sync.rb:638:in `updateFrom'
from s3sync.rb:393:in `main'
from s3sync.rb:735

I am using the following download script:

# script to download the weekly backup from s3 to a local directory
cd /var/s3sync/
export AWS_ACCESS_KEY_ID=nothing to see here
export AWS_SECRET_ACCESS_KEY=nothing to see here
export SSL_CERT_DIR=/var/s3sync/certs
ruby s3sync.rb -r -v -d --progress --make-dirs mybucket:backups/weekly /var/s3sync/testdown
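One guess at a workaround (not confirmed): the stack trace shows s3sync failing to open /var/s3sync/testdown/weekly/.s3syncTemp, which suggests the local destination directory does not exist yet when the temp file is created. A minimal sketch of pre-creating the destination before running the sync, using /tmp/s3sync-demo as a hypothetical stand-in for /var/s3sync/testdown:

```shell
#!/bin/sh
# Sketch only: pre-create the local destination directory so s3sync's
# updateFrom can open "$DEST/.s3syncTemp". The path below is a stand-in
# for /var/s3sync/testdown/weekly from the error message.
DEST=/tmp/s3sync-demo/testdown/weekly

mkdir -p "$DEST"        # create the full directory tree if it is missing
ls -ld "$DEST"          # confirm the directory now exists
```

If this is the cause, running mkdir -p /var/s3sync/testdown/weekly before invoking the download script might get past the error.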

Any ideas? Does the download script need to download to the same folder that was the source of the Amazon S3 upload, i.e. the testup folder? I was hoping that in the event of a complete failure, where the original folders no longer exist, it would just download everything for me.

Note: I changed my bucket name to "mybucketname" so that it is not public!