S3Sync.net
Author Topic: Crash when file is deleted during sync
bbrown
« on: April 28, 2007, 12:09:33 AM »

Hi,

I'm new to s3sync. As I write this, I'm running my first full backup; it has been going for a couple of hours already.

During the sync, I've been cleaning up a few files here and there. This exposed an apparent problem with s3sync: if a file is deleted while s3sync is working, it crashes. Here's the error:

Create node web/users/docs/CC-Document-13
/usr/lib/ruby/1.8/delegate.rb:158:in `method_missing': undefined method `length' for nil:S3sync::ProgressStream (NoMethodError)
        from /usr/lib/ruby/1.8/net/http.rb:1507:in `send_request_with_body'
        from /usr/lib/ruby/1.8/net/http.rb:1496:in `exec'
        from /usr/lib/ruby/1.8/net/http.rb:1044:in `_HTTPStremaing_request'
        from ./HTTPStreaming.rb:45:in `request'
        from ./S3_s3sync_mod.rb:50:in `make_request'
        from ./S3.rb:152:in `put'
        from ./s3try.rb:57:in `S3try'
        from ./s3sync.rb:499:in `updateFrom'
        from ./s3sync.rb:370:in `main'
        from ./s3sync.rb:690


The file web/users/docs/CC-Document-13 was one of the ones I had deleted before s3sync got to it.

Can s3sync be modified so that it simply ignores the file (or prints a warning) if it has been deleted?
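Something along these lines is what I'm imagining. This is only an illustrative sketch, not s3sync's actual code; upload_file here is a made-up stand-in for whatever the real upload call is:

  # Stand-in for the real upload: read the file and pretend to send it to S3.
  def upload_file(path)
    data = File.read(path)
    puts "uploaded #{path} (#{data.length} bytes)"
  end

  # Skip with a warning instead of crashing when the file vanished
  # between listing and upload.
  def safe_upload(path)
    unless File.file?(path) && File.readable?(path)
      warn "skipping #{path}: missing or unreadable"
      return
    end
    begin
      upload_file(path)
    rescue Errno::ENOENT
      warn "skipping #{path}: deleted during the sync"
    end
  end

  safe_upload("web/users/docs/CC-Document-13")
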
lowflyinghawk
« Reply #1 on: April 29, 2007, 04:12:25 PM »

I don't know the internals, but presumably it is building a list and then processing it? If it opened each file before processing it, then deleting the file wouldn't matter (on unix-like systems): it would get backed up anyway, because the file doesn't go away until its reference count drops to zero. Of course it shouldn't barf if opening a file fails ;-).
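You can see that open-then-delete behavior with a few lines of plain Ruby on a unix-like box (nothing to do with s3sync internals, just the filesystem semantics):

  f = File.open("demo.txt", "w+")   # holding this handle keeps the inode alive
  f.write("still readable after unlink")
  f.rewind

  File.unlink("demo.txt")           # the directory entry is gone...
  puts File.exist?("demo.txt")      # => false
  puts f.read                       # ...but the data can still be read here
  f.close                           # last reference dropped, space reclaimed
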
bbrown
« Reply #2 on: April 29, 2007, 05:52:14 PM »

No doubt s3sync is compiling a list of files to copy before it actually begins copying. That would account for the long pause before any files start being transferred.

However, I doubt that it is keeping each and every file open. More likely it is trying to open a file that no longer exists.
ferrix
(I am greg13070 on AWS forum)


« Reply #3 on: April 29, 2007, 06:05:52 PM »

"compile a list" is a bit misleading.  It only keeps a partial list, the minimum of things needed in order to make sure the ordering will match up correctly.  Creating a whole list on drives with a lot of small files sometimes uses up all memory on the machine, so I don't do that.
ferrix
« Reply #4 on: June 03, 2007, 12:04:55 AM »

By the way, this error should have been caught in 1.1.3. The NoMethodError happens when using --progress and the file is not readable on the local side.

That happens when the file has been removed, or when opening it just errors out, such as when it's a unix pipe or what not.
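For the curious, the trace above boils down to the progress delegator being handed a stream that is effectively nil. A stripped-down reproduction of the same kind of failure (this ProgressStream is just a stand-in class, not the real one from s3sync):

  require "delegate"

  # Stand-in delegator, playing the role of the wrapper that --progress
  # puts around the local file stream.
  class ProgressStream < SimpleDelegator
  end

  stream = ProgressStream.new(nil)   # roughly what you get when the local open failed
  stream.length                      # raises NoMethodError, like the trace above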