S3Sync.net
February 02, 2014, 01:28:49 PM
  Show Posts
Pages: [1]
1  General Category / Questions / Re: Connection refused - connect(2) (Errno::ECONNREFUSED) on: February 20, 2008, 05:16:43 PM
OK, I'm officially an idiot :)
  The 'Connection refused' error that I mentioned above turned out to be caused by a different part of
my script, not by the use of s3sync.

- cheers
frank
2  General Category / Questions / Connection refused - connect(2) (Errno::ECONNREFUSED) on: February 16, 2008, 07:56:43 PM
I've been seeing this error regularly during the last 2 or 3 days:

/usr/local/lib/ruby/1.8/net/protocol.rb:206:in `initialize': Connection refused - connect(2) (Errno::ECONNREFUSED)

Anyone have any idea what this is about?

- cheers
Frank
3  General Category / Questions / Re: initialize error. on: January 08, 2008, 11:03:26 PM

I'm using ruby 1.8.6 (2007-09-24 patchlevel 111) on both boxes. Things are working
fine again at the moment, so I'm guessing it was a network glitch of sorts.

- Frank
4  General Category / Questions / initialize error. on: January 08, 2008, 12:28:10 PM
Here's an error that I saw for the first time early this morning. It occurred on two of my boxes: one
was running the latest version of s3sync (1.2.4), while the other was running version
1.2.3 but with the 'dodo' modification mentioned in this thread: http://s3sync.net/forum/index.php?topic=123.0

/usr/local/lib/ruby/1.8/net/http.rb:564:in `initialize': Invalid argument - connect(2) (Errno::EINVAL)
        from /usr/local/lib/ruby/1.8/net/http.rb:564:in `open'
        from /usr/local/lib/ruby/1.8/net/http.rb:564:in `connect'
        from /usr/local/lib/ruby/1.8/timeout.rb:48:in `timeout'
        from /usr/local/lib/ruby/1.8/timeout.rb:76:in `timeout'
        from /usr/local/lib/ruby/1.8/net/http.rb:564:in `connect'
        from /usr/local/lib/ruby/1.8/net/http.rb:557:in `do_start'
        from /usr/local/lib/ruby/1.8/net/http.rb:552:in `start'
        from ./S3_s3sync_mod.rb:55:in `make_http'
         ... 7 levels...
        from ./thread_generator.rb:76:in `initialize'
        from ./s3sync.rb:266:in `new'
        from ./s3sync.rb:266:in `main'
        from ./s3sync.rb:724
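For anyone wondering what retrying around these transient connect errors looks like, here is a minimal sketch of a generic retry wrapper. This is my own illustration, not s3sync's actual s3try code; the method name `with_retries` and the exact exception list are assumptions.

```ruby
# Illustrative sketch only -- NOT s3sync's actual s3try logic.
# Retries a block when it raises one of the transient socket errors
# seen in this thread, giving up after `limit` attempts.
def with_retries(limit)
  attempts = 0
  begin
    yield
  rescue Errno::EINVAL, Errno::ECONNREFUSED, Errno::EPIPE
    attempts += 1
    raise if attempts >= limit   # out of retries: re-raise the error
    retry
  end
end

# Simulate a connection that fails twice, then succeeds.
calls = 0
result = with_retries(5) do
  calls += 1
  raise Errno::ECONNREFUSED if calls < 3
  "connected"
end
```

The real s3try also handles S3-level errors (like the 500s discussed elsewhere on this board), but the rescue/retry shape is the same idea.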

5  General Category / Questions / Re: Broken Pipe: solved? on: January 07, 2008, 01:26:06 AM
I've been running for several days without any problems since implementing the 'dodo' hack. I'm going to give the new version a try now and see whether I also get problem-free operation. Thanks, Ferix, for providing this new update.



6  General Category / Report Bugs / S3SYNC_RETRIES in s3config.yml on: December 29, 2007, 08:25:09 PM
I'm trying to set the number of retries to 500, so I entered
S3SYNC_RETRIES: 500 in my s3config.yml file, but the default
of 100 was still used. I then tried lower case, entering
s3sync_retries: 500 in my s3config.yml, and still no luck.

The only other settings that I have in the s3config.yml file are:
aws_access_key_id: <mykeywhichiwontsharewithyouhere>
aws_secret_access_key: <mysecretaccesskeywhichiwontshareeither>
ssl_cert_dir: /home/s3sync/certs

Wondering if anyone else has encountered this?

- thanks,
Frank
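In case it helps anyone hitting the same thing: s3sync can also pick its settings up from environment variables, so exporting the retry count directly in the shell may sidestep the config-file parsing. This is a guess at a workaround, not a confirmed fix; whether the variable is honored is exactly what's in question here.

```shell
# Possible workaround: export the setting as an environment variable
# before invoking s3sync, instead of relying on s3config.yml.
export S3SYNC_RETRIES=500
echo "S3SYNC_RETRIES is $S3SYNC_RETRIES"
# ruby s3sync.rb ...   (your usual invocation goes here)
```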
7  General Category / Questions / Re: Broken Pipe: solved? on: December 28, 2007, 08:51:04 PM
There is only one instance where s3sync fails for me, and that is when a DNS query fails; it seems s3try doesn't catch that kind of exception. But that happens almost never, so I don't bother.

But broken pipes are no longer a problem for me... my hack works, and I consider this solved.

Thx Dodo, I'm going to give your hack a try, as I've been having my fair share of these broken pipes. I'll report back later on whether my results concur with yours.
8  General Category / Questions / Re: Broken Pipe: solved? on: December 26, 2007, 04:40:34 AM
Hey dodo,
  are you still finding that your solution has helped with the broken pipe issue? I'm also seeing
many failures due to broken pipes, so I'm thinking of giving your solution a try.

- cheers
Frank
9  General Category / Questions / backing up large amounts of data on: December 23, 2007, 06:51:03 PM
I've been trying to use s3sync to back up about 100 GB of data from a server to S3. Here are some observations so far:

- In my case s3sync seems to transfer data at roughly 1 GB per hour but usually runs out of retries within a few hours (3-12 hours); I'm
using the SSL option. So every few hours I have to restart the process. Sometimes the reported error
relates to SSL, sometimes it's "Broken pipe: Broken pipe", but most of the time it's "500 Internal Server Error" repeated until the retries run out.

- I also tried using rsync from my server to a small EC2 instance and was able to achieve about 10 GB per hour. From EC2 I then used s3sync and was able to transfer at roughly 15 GB per hour.

Conclusion: an initial dump of 100 GB from my server to S3 will take about 4 days, and I will have to restart the process several times due to s3sync running out of retries. If, on the other hand, I rsync to EC2 and then s3sync from EC2 to S3, the whole process will take about 17 hours.
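For what it's worth, here is the arithmetic behind that conclusion, using the rough rates quoted above (a quick back-of-envelope sketch, not new measurements):

```ruby
# Rough transfer rates reported above (GB per hour).
total_gb       = 100.0
direct_rate    = 1.0   # s3sync straight from the server to S3 (SSL)
rsync_rate     = 10.0  # rsync from the server to an EC2 instance
ec2_to_s3_rate = 15.0  # s3sync from EC2 to S3

direct_hours = total_gb / direct_rate                             # 100 h
staged_hours = total_gb / rsync_rate + total_gb / ec2_to_s3_rate  # ~16.7 h

puts "direct: %.0f h (about %.0f days)" % [direct_hours, direct_hours / 24]
puts "staged: about %.0f h" % staged_hours
```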

Are my observations consistent with what everyone else is experiencing?

I'm using version 1.2.3 on a FreeBSD 6.x system with ruby 1.8.6



- cheers
Frank
