Title: STDOUT maxes out on 16K?
Post by: mrvanes on November 22, 2007, 09:59:31 AM

This is funny.
Since the backup takes a lot of time, but I want to be able to monitor the backup process at any given moment, I redirect stdout and stderr to two files in the backup script:

./s3sync.rb -rv /media/fotos [somebucket]: >backup.log 2>backup.err

I noticed that the process seemed to stop for no apparent reason (backup.log didn't grow), so I restarted it a couple of times and was considering a bug report, until I extended the stderr output with -d:

./s3sync.rb -rvd /media/fotos [somebucket]: >backup.log 2>backup.err

Now this is interesting, because I can monitor the progress of s3sync in backup.err (and I see that it does progress!) but backup.log suddenly stops. That's when I noticed the file sizes of both redirect files:

4364 -rw-r--r-- 1 root root 4452913 Nov 22 15:55 backup.err
  16 -rw-r--r-- 1 root root   16384 Nov 22 15:41 backup.log

16384 is exactly 16K. I would have thought this had something to do with my shell or with Ruby, but stderr continues way beyond 16K, so I suspect a hard limit somewhere in s3sync? Anyway, I thought it would be worth mentioning.

Update: it doesn't max out at 16K, it just stalls a _very_ long time at 16K...

Martin

Title: Re: STDOUT maxes out on 16K?
Post by: ferrix on November 24, 2007, 06:11:09 PM

Wow, that's news to me! I can say that my software does not contain any special code or limits here; it just prints to stderr and stdout a line at a time, as you'd expect.
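[Editor's note, not part of the original thread: the symptom above — backup.err flowing while backup.log stalls at a round buffer-sized mark — is the classic signature of output buffering rather than a limit in s3sync. stderr is unbuffered by default, while stdout becomes fully buffered when redirected to a file, so stdout bytes sit in memory until the buffer fills. A minimal Ruby sketch of that asymmetry, using two temp files to stand in for backup.log and backup.err; the file names and strings are illustrative only:]

```ruby
require 'tempfile'

out = Tempfile.new('out')   # stands in for >backup.log
err = Tempfile.new('err')   # stands in for 2>backup.err

out.sync = false   # Ruby's default for files: writes are buffered
err.sync = true    # $stderr is sync by default: effectively unbuffered

out.write("verbose progress line\n")
err.write("debug line\n")

# The err file already contains its line on disk, while the out
# file is still empty: its bytes sit in Ruby's internal buffer
# until the buffer fills or the IO is flushed/closed.
err_on_disk = File.size(err.path)
out_on_disk = File.size(out.path)
```

Scaled up, the buffered stream only hits the disk every few kilobytes, which looks exactly like a process that "stalls a very long time" at a round number.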
Title: Re: STDOUT maxes out on 16K?
Post by: mrvanes on November 28, 2007, 04:49:59 PM

I have been thinking about this, and it might have to do with buffering. This could be the responsibility/flaw of the shell (bash), but also of Ruby. Is there a way to flush buffers in Ruby, like there is in PHP?
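[Editor's note, not part of the original thread: Ruby does expose this. The direct counterpart of PHP's flush() is IO#flush, and setting IO#sync = true disables buffering on that stream entirely, which is the usual one-line fix for a redirected log that lags behind. A sketch (the printed strings are illustrative, not actual s3sync output):]

```ruby
# Explicit flush, the equivalent of PHP's flush():
$stdout.print "uploading one file ... "
$stdout.flush            # push the buffered bytes to the log file now

# Or disable buffering once, near the top of the script, so every
# write is flushed immediately (stderr already behaves this way):
$stdout.sync = true
$stdout.puts "done"
```

With `$stdout.sync = true` near the top of a script like s3sync.rb, a redirected backup.log would grow line by line just as backup.err does, at a small throughput cost since every write then goes straight to the file.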