S3Sync.net
Author Topic: Problems with s3cmd.rb and s3sync.rb  (Read 4771 times)
juster3 (Newbie, Posts: 1)
« on: February 27, 2007, 11:00:45 PM »

I am new to AWS and I have been attempting to set up automated backups from my OS X computer to S3 using s3sync.
I tried to create a new bucket with s3cmd.rb and eventually figured out how to get it to work.
For others like me who don't know where to start, run the following commands:

export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export SSL_CERT_DIR=
ruby s3cmd.rb createbucket bucketname 

(Remember the bucket name has to be globally unique; it is usually suggested that it be your accesskey.bucketname.)
The problem I had was that my bucket name wasn't unique, so it was failing... but that is solved now.
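For example, the full sequence might look something like this (the key, secret, cert path, and "backups" suffix below are just placeholder values to show the shape, not real ones):

export AWS_ACCESS_KEY_ID=YOURACCESSKEYID
export AWS_SECRET_ACCESS_KEY=YOURSECRETACCESSKEY
export SSL_CERT_DIR=/path/to/ssl/certs
ruby s3cmd.rb createbucket YOURACCESSKEYID.backups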

My problem now is with syncing a directory using this command:
ruby s3sync.rb -r --ssl --delete /home/john/localuploadfolder/ mybucket:/remotefolder

Is it correct to have mybucket be the same as the accesskey.bucketname as before?
And do I need to create the remotefolder or will it be created for me?
I tried a 300 MB file and it was uploading (I checked the output bandwidth and it was solid upstream for about 20 minutes), and then it issued the following error:

Broken pipe: Broken pipe
99 retries left
SSL Error: SSL_write:: bad write retry
98 retries left

I killed the script before it retried any further.  Is there something wrong with the above method?

Thanks
thatsmymouse (Newbie, Posts: 1)
« Reply #1 on: April 06, 2007, 08:29:35 AM »

I'm also getting this problem; it only seems to occur with large files (I'm trying to transfer a 4 GB tar). Whenever I use s3sync for anything else, it works fine.

Anyone have any ideas? Should I just not use SSL?
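One thing worth trying, just as a sketch to narrow it down: rerun the exact command from the first post with the --ssl flag dropped and see whether the large file makes it through:

ruby s3sync.rb -r --delete /home/john/localuploadfolder/ mybucket:/remotefolder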
lowflyinghawk (Jr. Member, Posts: 52)
« Reply #2 on: April 08, 2007, 10:35:56 AM »

Key content larger than 2 GB is currently not supported by S3.  This issue has been around for nearly a year, and AWS just keeps saying "we are working on it".
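Until that changes, one possible workaround (a rough sketch only; the 1900m chunk size and the bigfile.tar name are just placeholders) is to split anything over the limit into sub-2 GB pieces before syncing, then stitch them back together after download:

split -b 1900m bigfile.tar bigfile.tar.part.
cat bigfile.tar.part.* > bigfile.tar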
ferrix (Sr. Member, Posts: 363)
(I am greg13070 on AWS forum)
« Reply #3 on: April 08, 2007, 08:31:55 PM »

Keep complaining (to AWS).  This needs fixing!!