S3Sync.net
  Show Posts
1  General Category / Questions / Re: Full TCP buffer problem on: March 19, 2009, 08:36:23 PM
I tried with the --progress option:
Code:
ruby /usr/emanuele/s3sync/s3sync.rb -r -v --progress "/usr/emanuele/s3sync/archivi/dati/integramenti.it/" backup-server-integramenti1:backup-dati/integramenti.it/

This is the response:

Create node integramenti.it.tar.gzaa
Progress: 557056b  557044b/s  5%       Connection reset: Connection reset by peer
Skipping backup-dati/integramenti.it/integramenti.it.tar.gzaa: No buffer space available - connect(2)




I tried making split tar files of 10 MB each (I've also tried 4 MB). After compression the domain consists of about 1,500 files (with 10 MB chunks).
Sometimes the error comes on the 1st file (integramenti.it.tar.gzaa), sometimes on the 10th or 20th... there's no regularity to it!

This is the line in the Plesk panel (as in the other post):
tcpsndbuf   2,712,477   2,867,477   bytes       Total size of buffers used to send data over TCP network connections
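For reference, I can also read the same counter live from inside the container (this assumes a Virtuozzo/OpenVZ VPS where /proc/user_beancounters is readable, usually as root):
Code:
# Live beancounter row for TCP send buffers; after the resource name the columns
# are held, maxheld, barrier (soft limit), limit (hard limit) and failcnt.
# A growing failcnt means an allocation was actually refused at the limit.
grep tcpsndbuf /proc/user_beancounters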

In the Plesk help I found this (Virtuozzo section):
tcpsndbuf is the total size of send buffers for TCP sockets, i.e. the amount of kernel memory allocated for the data sent from an application to a TCP socket, but not acknowledged by the remote side yet

"not acknowledged by the remote side yet" -> so is it an S3 problem, i.e. S3 refusing my files?
2  General Category / Questions / Full TCP buffer problem on: March 19, 2009, 12:59:15 PM
Hi,
after some problems with a TCP buffer error (when uploading big files), I now split my data into 4 MB tar chunks with this command before using s3sync:
Code:
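# tar the site directory to stdout and let split cut the stream into 4 MB
# pieces named $website.tar.gzaa, $website.tar.gzab, ... under $destdir/$website/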
tar -czf - "$domaindir/$website" | split -b 4m - "$destdir/$website/$website.tar.gz" &> /dev/null

but when my bash script runs:
Code:
ruby /usr/emanuele/s3sync/s3sync.rb -r $destdir/$website/ backup-server-integramenti1:backup-dati/$website/

the TCP buffer becomes full after a few seconds.
My Plesk panel normally shows (name, current, soft limit, unit, description):

tcpsndbuf   163,228   2,867,477   bytes       Total size of buffers used to send data over TCP network connections

during s3sync uploads:

tcpsndbuf   2,712,477   2,867,477   bytes       Total size of buffers used to send data over TCP network connections


why?!

- maybe all the s3sync connections run at the same time?
- if that's the case, is there a way to limit the number of simultaneous s3sync connections?
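In the meantime, a workaround I'm thinking about (just a sketch and only my own assumption, not an s3sync option): move the split chunks into numbered subdirectories and upload them one batch at a time, pausing so the send buffer has time to drain:
Code:
# Workaround sketch (assumption, not an s3sync feature): upload the split
# chunks in batches of 50, sleeping between batches so tcpsndbuf can empty.
i=0
for f in "$destdir/$website"/*.tar.gz??; do
    dir="$destdir/$website/batch_$(printf '%03d' $((i / 50)))"
    mkdir -p "$dir" && mv "$f" "$dir/"
    i=$((i + 1))
done
for batch in "$destdir/$website"/batch_*/; do
    ruby /usr/emanuele/s3sync/s3sync.rb -r "$batch" backup-server-integramenti1:backup-dati/$website/
    sleep 30   # pause between batches
done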
3  General Category / Questions / Re: Ruby script fails if called by cron (but works in shell) on: March 19, 2009, 08:54:41 AM
yesssss! it works now!

I've compared the "set" output from the shell and from cron, and in the shell I found this interesting line:
Code:
S3CONF=/usr/emanuele/s3sync

The only thing I did was add a line at the beginning of my bash script:
Code:
export S3CONF=/usr/emanuele/s3sync

and now it's ok
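(I guess I could probably also have set it directly at the top of the crontab instead of inside the script, something like the lines below, though I haven't tested that; the schedule shown is just a made-up example for a weekly run:)
Code:
S3CONF=/usr/emanuele/s3sync
3 19 * * 0 root sh /usr/emanuele/s3sync/script/backup_dati.sh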

thanks for your help!
4  General Category / Questions / Ruby script fails if called by cron (but works in shell) on: March 18, 2009, 03:21:33 PM
Hi,
I have a bash script that I want to run every week (backup_dati.sh).
The script does something like this (but a bit more complex):

Code:
mkdir /usr/emanuele/s3sync/archivi/dati/officineparon.com

tar -czf - "/var/www/vhosts/officineparon.com" | split -b 4m - "/usr/emanuele/s3sync/archivi/dati/officineparon.com/officineparon.com.tar.gz"

/usr/emanuele/s3sync/s3sync.rb -r "/usr/emanuele/s3sync/archivi/dati/officineparon.com/" backup-server-integramenti1:backup-dati/officineparon.com/

rm -f -r /usr/emanuele/s3sync/archivi/dati/officineparon.com



It works if I run it from the shell:
sh /usr/emanuele/s3sync/script/backup_dati.sh

but if I set it up as a cron job, the 3rd command (where I call the ruby script) is skipped.
I tried editing the main crontab with:

crontab -e

and also with:

vi /etc/crontab

adding the line

3 19 18 3 * root sh /usr/emanuele/s3sync/script/test.sh



Then I tried changing the line to:
Code:
ruby /usr/emanuele/s3sync/s3sync.rb -r -v "/usr/emanuele/s3sync/archivi/dati/officineparon.com/" backup-server-integramenti1:backup-dati/officineparon.com/ >> "/usr/emanuele/s3sync/archivi/log/backup_test_log2.html"

but the result is the same...
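One more thing I want to try (just a sketch): also redirect stderr with 2>&1, so that any error message the ruby script prints under cron ends up in the log file instead of being lost:
Code:
ruby /usr/emanuele/s3sync/s3sync.rb -r -v "/usr/emanuele/s3sync/archivi/dati/officineparon.com/" backup-server-integramenti1:backup-dati/officineparon.com/ >> "/usr/emanuele/s3sync/archivi/log/backup_test_log2.html" 2>&1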

The "tar" command works fine, but the ruby one doesn't... does anyone know why?!
Do I need to configure something?


thanks...