321
|
General Category / General Discussion / Re: linux/windows filesystem ruby portability
|
on: March 06, 2007, 11:13:07 AM
|
Sure, but your scenario still works as long as one of the systems is S3. Why would you move [platform A] => S3 => [platform B]? Why not just rsync between the two platforms? The point of s3sync is that one of your endpoints is S3, not that you're using S3 as a cache between two other endpoints.
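To put it concretely, here's a sketch of the two cases (hostnames, paths, and the bucket name are made up for illustration; check your s3sync.rb version for exact invocation syntax):

```shell
# Two ordinary filesystems: sync directly, no S3 involved.
rsync -av /local/data/ user@otherhost:/backup/data/

# s3sync only makes sense when one endpoint is an S3 bucket.
ruby s3sync.rb -r /local/data/ mybucket:backup/data
```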
|
|
|
326
|
General Category / General Discussion / Re: Amazon object limit of 5 GB causing problem for rsync (s3sync)
|
on: March 05, 2007, 07:39:18 AM
|
They're not especially "compatible," as the following caveats show. s3cmd can't create the "folder nodes" that s3sync expects, and there's no facility for automatically adding s3sync-style metadata for permissions and ownership. That's not a defect, per se... s3cmd is not intended as a "lightweight s3sync" or anything of the kind. It is strictly meant to cover low-level S3 operations, like other S3 "shells": listing and creating buckets, poking at individual keys, and so on.
Let me say that again and clarify: s3cmd is LOW LEVEL... direct access to the keys and objects on S3. You can't think of them as directories and files, because they're not, and they don't behave as such. You can't use s3cmd to "get a sub-directory" because there's no such thing.
Having said that, s3cmd should be able to list the keys in your bucket (whether they were added by s3sync or anything else), and then you can do a "get" for whatever key you want.
In general, I'd say you shouldn't be poking at s3sync'd data with s3cmd; it's just not intended for that. If anything, it's more likely that I'd someday enhance s3sync to handle single files. The code for that is mostly there already, but the initial-conditions setup is brittle, and I'm disinclined to kick it around any more than necessary.
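As a sketch of that low-level list-then-get workflow (the bucket, prefix, and key names here are hypothetical, and the exact s3cmd.rb command syntax may differ between versions):

```shell
# List all keys in a bucket, or only those under a prefix.
ruby s3cmd.rb list mybucket
ruby s3cmd.rb list mybucket backup/photos

# Fetch one specific key to a local file.
ruby s3cmd.rb get mybucket:backup/photos/img001.jpg img001.jpg
```

Note there's no recursion here: each key is fetched individually, which is exactly why "get a sub-directory" isn't a meaningful operation at this level.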
|
|
|
329
|
General Category / Closed Bugs / Re: s3sync performance testing: Slow Downloads
|
on: March 03, 2007, 04:27:23 PM
|
Yes, I was testing with your file, lfh.
The total time to get it isn't of interest; it just reflects my maximum available bandwidth to my home Windows machine here. In addition to having a cheap connection, I'm also running Freenet, which is a nearly constant drain on part of my available bandwidth.
|
|
|