I am new to AWS and I have been attempting to set up automated backups from my OS X computer to S3 using s3sync.
I tried to create a new bucket with s3cmd.rb and was finally able to figure out how to get it to work.
For others like me who didn't know where to start, run the following commands:
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export SSL_CERT_DIR=
ruby s3cmd.rb createbucket bucketname
(Remember the bucket name has to be globally unique; it is usually suggested that it be accesskey.bucketname.)
The problem I had was that my bucket name wasn't unique, so the command kept failing, but that is solved now.
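In case it helps anyone, here is roughly how I ended up wrapping those setup steps in one small script. It just refuses to run if the credentials were never exported, which was my first stumbling block; the function name and usage are my own, not part of s3sync:

```shell
#!/bin/sh
# Sketch of a wrapper around the bucket-creation step above.
# create_bucket is a name I made up; it bails out early if the
# AWS credentials were never exported into the environment.
create_bucket() {
    if [ -z "$AWS_ACCESS_KEY_ID" ] || [ -z "$AWS_SECRET_ACCESS_KEY" ]; then
        echo "error: export AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY first" >&2
        return 1
    fi
    ruby s3cmd.rb createbucket "$1"
}

# Usage (after exporting the credentials):
#   create_bucket accesskey.mybucket
```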
My problem now is that I get an error when I try to sync a directory using this command:
ruby s3sync.rb -r --ssl --delete /home/john/localuploadfolder/ mybucket:/remotefolder
Is it correct for mybucket here to be the same accesskey.bucketname as before?
And do I need to create the remotefolder or will it be created for me?
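Since my end goal is automated backups, my plan is to put that sync command in a script and run it from cron. Here is the sketch I have so far; the paths and bucket name are the same placeholders as above, and the dry-run guard is just my own safety net because --delete makes mistakes expensive:

```shell
#!/bin/sh
# Sketch of wrapping the sync command above for a nightly cron job.
# SRC, DEST, and RUN_FOR_REAL are my placeholders, not s3sync options.
SRC="/home/john/localuploadfolder/"
DEST="mybucket:/remotefolder"
CMD="ruby s3sync.rb -r --ssl --delete $SRC $DEST"

# Print the command instead of running it unless RUN_FOR_REAL=1 is set,
# so a typo in the paths can't wipe out the remote folder via --delete.
if [ "$RUN_FOR_REAL" = "1" ]; then
    $CMD
else
    echo "dry run: $CMD"
fi
```

Then a crontab line along these lines (the script path is hypothetical) would run it nightly at 2am: `0 2 * * * RUN_FOR_REAL=1 /Users/john/bin/s3backup.sh`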
I tried a 300 MB file and it was uploading (I checked the outbound bandwidth and the upstream was solid for about 20 minutes), and then it issued the following error:
Broken pipe: Broken pipe
99 retries left
SSL Error: SSL_write:: bad write retry
98 retries left
I killed the script before the retries counted down any further. Is there something wrong with the method above?
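One workaround I am considering, in case the SSL connection simply can't survive a single 300 MB transfer: break the big file into smaller pieces with the standard `split` tool before syncing, so a dropped connection only costs one chunk instead of the whole upload. This is just my own idea, not anything from the s3sync docs, and all the names below are placeholders:

```shell
#!/bin/sh
# split_for_upload FILE DIR [BYTES_PER_CHUNK]
# Breaks FILE into fixed-size pieces inside DIR so that each piece can be
# uploaded (and retried) independently. Defaults to 50 MB per piece.
split_for_upload() {
    _bytes="${3:-52428800}"   # 50 MB in bytes
    mkdir -p "$2"
    split -b "$_bytes" "$1" "$2/$(basename "$1").part."
}

# Usage: split, sync the chunk directory instead of the original file,
# then reassemble on the way back down with:
#   cat bigfile.dmg.part.* > bigfile.dmg
```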
Thanks