S3Sync.net
  Show Posts
1  General Category / Feature Requests / Re: Show Upload Time in s3cmd's list command on: March 23, 2007, 06:08:46 PM
I managed to get s3cmd.rb to print the file's last_modified (upload time) by adding the
following line after line no. 118 in s3cmd.rb.
puts Time.parse(item.last_modified).to_i
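
For anyone trying the same tweak, here is a standalone illustration of what that one line computes (the last_modified string and the local file are just example values); the resulting unix timestamp can be compared directly with a local file's mtime:

require 'time'                                # Time.parse lives here

last_modified = "2007-03-23T18:08:46.000Z"    # example value from an S3 listing
puts Time.parse(last_modified).to_i           # => 1174673326 (unix timestamp)
puts File.mtime(__FILE__).to_i                # local mtime, directly comparable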

2  General Category / Feature Requests / Create empty folder on: March 23, 2007, 06:02:14 PM
Hi Greg Ferrix

Thank you for coming out with this excellent s3sync.rb.

However, I am more interested in s3cmd.rb, and need a 'create empty folder' feature.

I found something here:
http://developer.amazonwebservices.com/connect/thread.jspa?messageID=36210
...  could create an empty object for each directory and use a meta tag to flag it as directory ...
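
Based on that thread, here is a rough, untested sketch of the idea done outside of s3cmd.rb with plain Net::HTTP -- a zero-byte PUT with an x-amz-meta header, signed the way the S3 REST interface expects. The bucket, key, and meta header name are just placeholders I made up:

require 'net/http'
require 'openssl'
require 'base64'
require 'time'

access_key = ENV['AWS_ACCESS_KEY_ID']        # credentials from the environment
secret_key = ENV['AWS_SECRET_ACCESS_KEY']
bucket     = 'pa_svn'                        # placeholder bucket
key        = 'testdir2/'                     # trailing slash marks the "folder"

date         = Time.now.httpdate
content_type = 'binary/octet-stream'
amz_header   = 'x-amz-meta-type:directory'   # the "meta tag" idea from the thread (name made up)
resource     = "/#{bucket}/#{key}"

# Sign the request as the S3 REST API expects: HMAC-SHA1 over the canonical string.
string_to_sign = "PUT\n\n#{content_type}\n#{date}\n#{amz_header}\n#{resource}"
signature = Base64.encode64(
  OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha1'), secret_key, string_to_sign)
).strip

http = Net::HTTP.new('s3.amazonaws.com', 443)
http.use_ssl = true
request = Net::HTTP::Put.new("/#{bucket}/#{key}")
request['Date']            = date
request['Content-Type']    = content_type
request['x-amz-meta-type'] = 'directory'
request['Authorization']   = "AWS #{access_key}:#{signature}"
request.body = ''                            # zero bytes: the "empty folder" itself
puts http.request(request).code              # "200" means the folder object was created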

I was wondering if you could drop some direct hints on how to tweak your s3cmd.rb script to do this?

The reason why I want to create an empty folder:

I noticed that when I used s3cmd.rb to put something on Amazon, e.g.,
s3cmd.rb  put  pa_svn:testdir2/testfile2b localtestdir002/localtestfile2b,
s3cmd.rb will create the folder testdir2 containing the file testfile2b on Amazon.

But when I used Firefox S3 organizer to delete the file testfile2b on Amazon,
the folder testdir2 on Amazon also disappeared.
It seems that the folder testdir2 created by s3cmd.rb cannot be empty.
This is unlike in Firefox S3 organizer where you can create an empty folder.

When I do this,
s3cmd.rb  put  pa_svn:testdir4  localtestdir4  ,
s3cmd.rb will create something that looks like a file on Amazon (Firefox S3 organizer sees a file 'testdir4').
Then I do this,
s3cmd.rb  put  pa_svn:testdir4/testfile4  localtestdir4/localtestfile4,
Firefox S3 organizer still sees the same thing: something that looks like a file 'testdir4',
and when I download it using Firefox S3 organizer, 'testdir4' is not a directory but an unreadable file.

I guess s3cmd.rb is not fully compatible with Firefox S3 organizer.
But if there were a command in s3cmd.rb to create an empty folder on Amazon, I think
that creating the folder first on Amazon before putting the file there would
be compatible with Firefox S3 organizer.

Any direct hints on how to create an empty folder on Amazon S3?

Thank you very much.

3  General Category / General Discussion / Re: Amazon object limit of 5 GB causing problem for rsync (s3sync) on: March 23, 2007, 05:21:20 PM
Thanks ferrix

I managed to get s3cmd.rb to print the file's last_modified (upload time) by adding the
following line after line no. 118 in s3cmd.rb.
puts Time.parse( item.last_modified ).to_i

Now, I want to create an empty folder in Amazon S3, but that is another post.
http://s3sync.net/forum/index.php?topic=39.0

Hints:
- Learn how S3 REST interface works by reading the amazon documentation on it.
- Get a working knowledge of Ruby with http://www.rubycentral.com/book/index.html
- Look at s3sync.rb and s3cmd.rb code and comments
4  General Category / General Discussion / Re: Amazon object limit of 5 GB causing problem for rsync (s3sync) on: March 05, 2007, 05:08:10 PM
Thank you once again Greg.

Do you think you can come up with a command in s3cmd to show the Upload Time of the uploaded data on Amazon?

Or the unix timestamps (modified times) of the files, if those are preserved on the Amazon server after being uploaded.

If I can use s3cmd to see the upload time of the files on Amazon, I can then use s3cmd's put command to upload only the files which are newer, and won't need to use s3sync.

s3cmd may be useful in instances where users want to be able to access the Amazon data using other tools like Firefox S3 Organizer.

Or maybe you can drop some hints on how to modify your scripts to do this? If it isn't too difficult, I might want to give it a shot myself -- although I'm a newbie.


5  General Category / Feature Requests / Re: Show Upload Time in s3cmd's list command on: March 05, 2007, 12:18:58 AM

Hi Greg

Thank you for the reply.  I get it now.

One question -- are s3cmd and s3sync compatible?

I've posted this question in  http://s3sync.net/forum/index.php?topic=18.msg84#msg84

When I upload a folder, say /data/folder1, to Amazon using s3sync.rb,
I cannot use s3cmd's get command to download subfolders or individual files inside /data/folder1 from Amazon.

When I upload individual files one by one to Amazon using s3cmd's put command -- duplicating
the same directory structure /data/folder1 on Amazon -- I cannot use s3sync to download anything
from /data/folder1 at all.
6  General Category / General Discussion / Re: Amazon object limit of 5 GB causing problem for rsync (s3sync) on: March 02, 2007, 02:23:23 PM
Hi lowflyinghawk

Thank you for the clarification.

When I upload a folder, say /data/folder1, to Amazon using s3sync.rb,
I cannot use s3cmd's get or Firefox S3 organizer to download subfolders or individual files inside /data/folder1 from Amazon.
In fact, Firefox S3 organizer sees the /data/folder1 as a single file on Amazon and I cannot get anything from it using Firefox S3.
I know the author of s3sync.rb had said that s3sync.rb may not be compatible with other tools,
but s3cmd's get command also cannot download subfolders and individual files inside /data/folder1.
This led me into thinking that /data/folder1 is uploaded as a single object.
However, s3sync.rb is able to download subfolders inside /data/folder1.

When I upload individual files one by one to Amazon using s3cmd's put command -- duplicating
the same directory structure /data/folder1 on Amazon -- I cannot use s3sync to download anything
from /data/folder1 at all.
So it seems that s3cmd and s3sync are not compatible?

7  General Category / General Discussion / Re: Amazon object limit of 5 GB causing problem for rsync (s3sync) on: March 02, 2007, 02:18:12 AM
hi lowflyinghawk

So are you saying that if you have a folder of 20GB (containing ten 2GB files),
you can do this:
./s3sync.rb -r --ssl /local/20GBFolder  bucket1:prefix1/data1

The entire 20GB /local/20GBFolder can be uploaded to Amazon's bucket1:prefix1/data1 in one go?
As long as none of the files inside the 20GB folder are bigger than 2GB, the above will work?

The Amazon 5GB limit is actually a limit on the file size and not object size?
Or one single file is an object?

I thought that if I do this -- ./s3sync.rb -r --ssl /local/20GBFolder  bucket1:prefix1/data1 --
then /local/20GBFolder would become a single object on Amazon?


Thank you.



no, it's one key per file, so a folder containing 10 2G files maps to 10 separate keys each of which is PUT separately.  the only limit is on the individual keys, i.e. foo:bar/baz can't be over 2G, but the total of foo:/* doesn't matter.  remember, S3 is not a file system on a disk, it is a name/value database, so only the individual keys matter.  /foo/bar and /foo/bar/baz are not contents of the folder "/foo" in the way you may be used to thinking of it, each one is just a key.
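
Restating that as a toy illustration (just a plain Ruby hash with made-up key names, not S3 code), S3 behaves like a flat name/value store where "folders" are only naming conventions inside the keys:

bucket = {
  "prefix1/data1/file01.bin" => "...2GB of data...",
  "prefix1/data1/file02.bin" => "...2GB of data...",
  # one key per file, each PUT separately; only each value's size is limited
}

# There is no object called "prefix1/data1", so deleting the last
# "prefix1/data1/..." key leaves nothing that looks like the folder behind.
bucket.delete("prefix1/data1/file01.bin")
bucket.delete("prefix1/data1/file02.bin")
puts bucket.keys.grep(/^prefix1\/data1/).inspect   # => []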
8  General Category / Feature Requests / Re: Show Upload Time in s3cmd's list command on: March 02, 2007, 02:11:51 AM

No, I cannot use s3sync.rb to put my folders to Amazon because my folders are much bigger than 5GB, and I cannot divide my folders into smaller parts for s3sync.

I have to use s3cmd.rb's put to put the files one by one.
I want to be able to read the upload time of the files already on Amazon, and then upload newer files from my server.

Is there a HEAD command in s3sync or s3cmd?
How do I use it?  Thanks.

s3sync puts the files in a folder one by one already right?  so if you can s3cmd put multiple individual files, then can't you use s3sync on the folders?

if you do a HEAD before PUT it is possible to check a file for changes and not do the PUT at all if it's the same, i.e. approximately the way rsync behaves.  if s3sync does this already (and I assume it does), then cm_gui could just put the whole folder and only the changed files will actually be transferred.  this is the whole idea isn't it?
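
For reference, here is a rough, untested sketch of that HEAD-before-PUT check done directly against the S3 REST interface with plain Net::HTTP (this is not how s3sync is implemented internally; the bucket, key, and local file are placeholders). For a simple PUT the ETag returned by HEAD is the object's MD5, so matching it against the local file's MD5 is the rsync-like "skip if unchanged" test:

require 'net/http'
require 'openssl'
require 'base64'
require 'digest/md5'
require 'time'

access_key = ENV['AWS_ACCESS_KEY_ID']
secret_key = ENV['AWS_SECRET_ACCESS_KEY']
bucket     = 'pa_svn'                        # placeholders
key        = 'testdir2/testfile2b'
local_file = 'localtestdir002/localtestfile2b'

date = Time.now.httpdate
string_to_sign = "HEAD\n\n\n#{date}\n/#{bucket}/#{key}"
signature = Base64.encode64(
  OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha1'), secret_key, string_to_sign)
).strip

http = Net::HTTP.new('s3.amazonaws.com', 443)
http.use_ssl = true
head = Net::HTTP::Head.new("/#{bucket}/#{key}")
head['Date'] = date
head['Authorization'] = "AWS #{access_key}:#{signature}"
response = http.request(head)

remote_md5 = (response['ETag'] || '').delete('"')   # ETag comes back quoted
local_md5  = Digest::MD5.file(local_file).hexdigest

if response.code == '200' && remote_md5 == local_md5
  puts "#{key} is unchanged, skipping the PUT"
else
  puts "#{key} is missing or changed, this is where the PUT would happen"
end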
9  General Category / General Discussion / Re: Amazon object limit of 5 GB causing problem for rsync (s3sync) on: March 01, 2007, 01:08:07 PM
Hi Ferrix

Thank you for the reply.

Are you saying that I can run
./s3sync.rb -r --ssl /local/test1 bucket1:prefix1/data1
even if the test1 folder is greater than 5GB ?

Wouldn't the local/test1 become a single object on Amazon?

Gui


The size of the folder is irrelevant, only the size of each node.  s3sync maps one file per node.  So if you have a file > 5G then you can't use s3sync.  Otherwise it should be OK.

Note however I think there may still be an S3 bug about not being able to send a file that is >2GB because of some .. hardware issues.  But I'm too lazy to look up the details right now.  AWS forum should be swarming with stuff about it.
10  General Category / Feature Requests / Show Upload Time in s3cmd's list command on: March 01, 2007, 01:00:21 PM
Hi Greg

Thank you for the excellent s3sync script.

Is it possible to have the s3cmd's list command show the Upload Time of the files?

This is the reason why we need this feature:
Because of Amazon's 5GB limit on object size (2GB actually, due to an Amazon bug), we cannot use s3sync.rb.
The data folders we want to s3sync are bigger than 5GB, and we cannot split them up into smaller parts.

So I am using s3cmd's put command to put our files one by one onto Amazon S3 every night, putting only the files that were modified within the last 24 hours (i.e. since the time of the last backup). However, this nightly script will cause problems if it doesn't run for a day or two
-- in which case the data from those days would not be backed up to Amazon.

It would be better if s3cmd's list command could show the Upload Time of the files -- so that we can compare the Upload Time of the files on Amazon with the Modified date of the files on our server, and then put the newer files onto Amazon.
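
In case it helps to explain what I am after, here is a rough sketch of the nightly decision I would like to build on top of such a listing (the bucket name, key naming, and local path are placeholders, and it assumes s3cmd.rb is on the PATH):

# Remote upload times collected beforehand into a hash, e.g. from a listing
# that prints each key with its upload time as a unix timestamp.
remote_upload_time = {
  "folder1/somefile.dat" => 1174673326,
}

local_root = "/data/folder1"

Dir.glob("#{local_root}/**/*").each do |path|
  next unless File.file?(path)
  key = "folder1/" + path.sub("#{local_root}/", "")
  if File.mtime(path).to_i > remote_upload_time.fetch(key, 0)
    # newer locally (or not on Amazon yet), so put it up
    system("s3cmd.rb", "put", "bucket1:#{key}", path)
  end
end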

Thank you very much.
11  General Category / Feature Requests / command to move objects? on: February 20, 2007, 01:33:41 PM
Hi Greg

Is it possible to have a command to move objects from one location to another on the Amazon S3 server?

S33r is able to do this, but I prefer s3cmd/s3sync.
http://s33r.rubyforge.org/


Thank you.

Gui
12  General Category / General Discussion / Amazon object limit of 5 GB causing problem for rsync (s3sync) on: February 20, 2007, 01:16:29 PM
Hi All

Right now, we are using rsync to back up our data to our own backup servers.
We are thinking of using Amazon S3 with the s3sync.rb tool.

But Amazon's limit of 5 GB per object is causing a problem for us because
many of the folders we are rsync-ing to our backup servers now are very much
bigger than 5 GB. And we cannot break up the folders into smaller chunks,
as that would be very messy and would in fact negate the benefits of rsync.

Does anybody else have this problem?
And how do you workaround it?

Right now, we are thinking of using s3cmd's put command to put modified files
individually onto Amazon, but we don't think this is as good as rsync.

Thank you.

Gui



An explanation of why we cannot break up our data into smaller chunks:
For example, right now, we can run
rsync --delete -aruvze ssh /usr/ftp 192.168.1.2:/back/
and the entire ftp folder gets backed up onto our backup server 192.168.1.2
If we delete some subfolders in /usr/ftp, the same in 192.168.1.2:/back will get deleted.

If we were to s3sync.rb each sub-sub-subfolder in /usr/ftp individually onto the Amazon S3 server,
we would have to write scripts to delete those sub-sub-subfolders on the Amazon S3 server which
have been deleted on our server. We would have to work at the sub-sub-subfolder level
because the 1st-level subfolders in /usr/ftp are also very much larger than 5 GB.

13  General Category / Feature Requests / s3sync.rb to run without current directory defined to where s3try.rb, S3.rb are. on: February 17, 2007, 05:28:36 PM
Hi Greg

Thank you for the excellent s3sync.rb script.
It is working great for us.

But we are running s3sync.rb from a bash script, and it would be nice if s3sync.rb could run without having to set the working directory to where s3try.rb, S3.rb, etc. are.

This was actually discussed before -
http://developer.amazonwebservices.com/connect/message.jspa?messageID=46318#46318
http://developer.amazonwebservices.com/connect/message.jspa?messageID=46312#46312

If the working directory is not set to where the s3sync scripts are, error messages like these appear:

/usr/local/etc/s3sync/s3sync.rb:18:in `require': no such file to load -- thread_generator (LoadError) from /usr/local/etc/s3sync/s3sync.rb:18
/usr/local/etc/s3sync/s3cmd.rb:12:in `require': no such file to load -- s3try (LoadError) from /usr/local/etc/s3sync/s3cmd.rb:12

I tried changing the scripts. For example, changing line 18 in s3sync.rb from "require 'thread_generator'" to "require '/usr/local/etc/s3sync/thread_generator'", etc. But this introduced other errors.
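
One possible workaround (an untested sketch, not something already in the scripts): instead of hard-coding absolute paths in the require lines, add the script's own directory to Ruby's load path near the top of s3sync.rb and s3cmd.rb, so the companion files are found regardless of the current working directory:

$LOAD_PATH.unshift(File.dirname(__FILE__))   # look next to the script itself first

require 'thread_generator'                   # now resolves relative to the script's directory
require 's3try'

That way a bash script could call /usr/local/etc/s3sync/s3sync.rb directly without cd-ing there first.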

If you can modify the scripts so that they can be run from anywhere, that would be great.
Thank you.
