goomba
Newbie
Posts: 1
« on: May 22, 2007, 02:22:06 AM »
This program is very good at backing up, and it's one of the easiest I've found for working with S3.
I have just one question.
Say for example, I run this command:
s3sync.rb -r --ssl --delete --progress -v s3bucket:/home /root
The s3bucket contains something like /home/test/files.tar.gz
When I run the command, I get an error like:
Could not mkdir /root/home/test: No such file or directory - /root/home/test
Could not change owner/permissions on /root/home/test: No such file or directory - /root/home/test
But if I do this:
s3sync.rb -r --ssl --delete --progress -v s3bucket:/home/test /root
There are no errors.
I assume s3sync.rb can't handle it when there's a nested subdirectory?
I'm just wondering if there's a way to restore all the subdirectories under s3bucket:/home at once, instead of manually restoring each directory under it.
ferrix
« Reply #1 on: May 22, 2007, 09:49:33 AM »
Are you using the latest version? It used to have some problems with directory creation, but that should be long gone by now.
The only other problem is if the directory node for "home" is missing from the S3 side. This can happen if you selectively backed up only certain subdirectories of "/home/..." but not "/home" itself.
In this case, on the local machine you'll need to create any top-level dirs that don't exist on the S3 side. So, for example, mkdir /root/home, and then your first command would run fine.
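A minimal sketch of that workaround in Ruby (the /root paths are from the thread; a temp directory stands in for them here so the sketch is safe to run):

```ruby
require "fileutils"
require "tmpdir"

# Stand-in for the local restore target (/root in the thread).
target = File.join(Dir.tmpdir, "restore-demo")

# Pre-create the top-level directory whose node is missing on the S3 side,
# mirroring "mkdir /root/home":
FileUtils.mkdir_p(File.join(target, "home"))

puts File.directory?(File.join(target, "home"))  # => true
# With the directory in place, the original command runs cleanly:
#   s3sync.rb -r --ssl --delete --progress -v s3bucket:/home /root
```

FileUtils.mkdir_p also creates any missing intermediate directories, which is why it covers the "Could not mkdir /root/home/test" error from the first post.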
« Last Edit: May 22, 2007, 09:52:30 AM by ferrix »
nickt
Newbie
Posts: 4
« Reply #2 on: May 26, 2007, 03:56:32 PM »
Hi, I seem to be having a similar problem - can you tell me what I'm doing wrong? I'm using s3sync.rb version 1.1.2. So, I've got this in MYBUCKET:
./s3cmd.rb list MYBUCKET
--------------------
category_images/
category_images/2
category_images/2/ring.jpg
photos/
Note: the photos dir is empty.
So if I run s3sync.rb -r MYBUCKET: /root/testing/
I get the output:
S3 command failed: head MYBUCKET category_images
With result 404 Not Found
S3 command failed: get_stream MYBUCKET photos #<File:0x2acc74f57150>
With result 404 Not Found
I get the files, but end up with bad permissions:
d--x-wx--T 4 root root 4096 May 26 21:51 category_images
How should I sync the entire content of an s3 bucket back down?
Thanks,
Nick
ferrix
« Reply #3 on: May 26, 2007, 10:11:10 PM »
I doubt that the permissions thing is related to the directory thing.
Can you create a tar.gz of your test data (with correct permissions) and send it to me? (My email address is in the README.)
nickt
Newbie
Posts: 4
« Reply #4 on: June 01, 2007, 01:23:46 PM »
Hi - I sent you an email with steps to reproduce and an example tar file, as requested. I haven't heard anything - did you get it OK?
Thanks
ferrix
« Reply #5 on: June 02, 2007, 05:43:10 PM »
I was busy during the week, but I'll take a look today!
ferrix
« Reply #6 on: June 02, 2007, 11:31:05 PM »
nickt
Newbie
Posts: 4
« Reply #7 on: June 05, 2007, 12:44:36 AM »
OK, so a really stupid question - where do I get it from? I've just downloaded the version linked as "latest", and that identifies itself as 1.1.2 (and still has the same problem). Thanks!
ferrix
« Reply #8 on: June 05, 2007, 03:02:47 AM »
Oops, I forgot to update the version number in the s3sync.rb file. Done.
Notice too that the new release puts its files in a subdir (unlike previous ones).
And in case it wasn't clear from the release notes: in order to "fix" this, you have to re-sync from local sources to S3 targets. The bug was in that direction, not in the retrieval. So if you were just trying to test by re-downloading your bucket, it would still appear broken.
« Last Edit: June 05, 2007, 03:28:49 AM by ferrix »
nickt
Newbie
Posts: 4
« Reply #9 on: June 05, 2007, 09:46:05 AM »
Hmmm, OK. My problem is that the file structure is actually generated by another application (in my case the Rails plugin attachment_fu), so re-syncing the source isn't really a fix. Is it possible to make the retrieval more "lenient" in this situation? I reckon that'd be worth a pint. Thanks.
ferrix
« Reply #10 on: June 05, 2007, 10:52:53 AM »
It's a simple request, but it opens a can of nightmares.
maelcum
Newbie
Posts: 43
« Reply #11 on: June 05, 2007, 04:31:18 PM »
Sorry if I ask, but I need to understand this 100%. You said: "In order to 'fix' this, you would have to re-sync from local sources to s3 targets."
As long as I can live with the way things worked in the past (version < 1.1.3), I don't need to do anything - no resync, no different way of restoring things. Right?
Using s3sync.rb version 1.1.3 will only behave differently (read: fail) as long as I don't prepare the receiving end (my local machine) as I did before (making certain all top-level folders are created before starting the restore). Correct?
If I want the restore to be easier - which version 1.1.3 would offer - I would need to resync things to S3 again. Okay?
Thanks for your patience.
ferrix
« Reply #12 on: June 05, 2007, 09:24:25 PM »
I can't tell you specifics about your other tool because I haven't used it.
1.1.2 and earlier had a bug where the top-level directory node, in certain cases, was stored with a trailing slash in its name. This is fixed in 1.1.3.
If you want to interop with s3sync, then the other tool should store directory nodes in a compatible fashion (unlikely, since no one else has realized how cool it is to do it the way I do).
Alternately, you could write a patch for s3sync that makes it work the way you want. Beware though... slash-parsing semantics are A BEAR.
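To illustrate the ambiguity (a hypothetical sketch, not s3sync's actual code): keys that differ only by a trailing slash can name the same directory node, so any comparison has to normalize first. The key names below are taken from nickt's bucket listing:

```ruby
# Hypothetical illustration of the trailing-slash problem; not s3sync's
# actual implementation.
keys = ["category_images", "category_images/", "category_images/2/ring.jpg"]

# Strip at most one trailing slash so directory-node names compare equal.
def normalize(key)
  key.sub(%r{/\z}, "")
end

normalized = keys.map { |k| normalize(k) }.uniq
puts normalized.inspect  # => ["category_images", "category_images/2/ring.jpg"]
```

Without a normalization step like this, "category_images" and "category_images/" look like two distinct nodes even though they describe one directory - which is roughly the class of bug fixed in 1.1.3.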