S3Sync.net
Author Topic: Show Upload Time in s3cmd's list command
cm_gui (Newbie, Posts: 13)
« on: March 01, 2007, 01:00:21 PM »

Hi Greg

Thank you for the excellent s3sync script.

Is it possible to have s3cmd's list command show the upload time of the files?

This is why we need the feature:
Because of Amazon's 5GB limit on object size (2GB actually, due to an Amazon bug), we cannot use s3sync.rb.
The data folders we want to s3sync are bigger than 5GB, and we cannot split them up into smaller parts.

So I am using s3cmd's put command to put our files onto Amazon S3 one by one every night, putting only the files that were modified within the last 24 hours (i.e. since the time of the last backup). However, this nightly script will cause problems if it doesn't run for a day or two -- in which case the data from those days would never be backed up to Amazon.

It would be better if s3cmd's list command could show the upload time of the files -- then we could compare the upload time of the files on Amazon with the modified date of the files on our server, and put only the newer files onto Amazon.
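Something like this rough sketch is what I have in mind (the remote_upload_times input is hypothetical -- a map of S3 key to Last-Modified string pulled from the list output):

# Rough sketch: re-put any local file whose mtime is newer than its
# recorded upload time on S3. remote_upload_times is a hypothetical
# map of S3 key => Last-Modified string taken from the list output.
require 'time'

def files_to_upload(local_dir, remote_upload_times)
  Dir.glob(File.join(local_dir, '**', '*')).select do |path|
    next false unless File.file?(path)
    uploaded = remote_upload_times[path]
    uploaded.nil? || File.mtime(path) > Time.parse(uploaded)
  end
end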

Thank you very much.
lowflyinghawk (Jr. Member, Posts: 52)
« Reply #1 on: March 01, 2007, 07:05:39 PM »

s3sync puts the files in a folder one by one already, right? So if you can s3cmd put multiple individual files, then can't you use s3sync on the folders?

If you do a HEAD before the PUT, it is possible to check a file for changes and not do the PUT at all if it's the same, i.e. approximately the way rsync behaves. If s3sync does this already (and I assume it does), then cm_gui could just put the whole folder and only the changed files would actually be transferred. This is the whole idea, isn't it?
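A rough sketch of that HEAD-before-PUT check (not s3sync's actual code -- the helper names are made up, and it assumes the old-style Signature Version 2 signing that S3 requests use):

# Sketch only: HEAD the object first, and skip the PUT when the stored
# ETag already matches the local file's MD5 -- roughly rsync-style
# change detection against S3.
require 'net/http'
require 'openssl'
require 'base64'
require 'digest/md5'
require 'time'

def s3_head(bucket, key, access_key, secret_key)
  date = Time.now.httpdate
  # Canonical string for a bare HEAD request under Signature Version 2.
  string_to_sign = "HEAD\n\n\n#{date}\n/#{bucket}/#{key}"
  hmac = OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha1'), secret_key, string_to_sign)
  signature = Base64.encode64(hmac).strip

  http = Net::HTTP.new("#{bucket}.s3.amazonaws.com", 443)
  http.use_ssl = true
  req = Net::HTTP::Head.new("/#{key}")
  req['Date'] = date
  req['Authorization'] = "AWS #{access_key}:#{signature}"
  http.request(req)
end

# True when the object is missing or its ETag differs from the local MD5.
def needs_put?(path, bucket, key, access_key, secret_key)
  res = s3_head(bucket, key, access_key, secret_key)
  return true unless res.is_a?(Net::HTTPSuccess)
  res['ETag'].to_s.delete('"') != Digest::MD5.file(path).hexdigest
end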
cm_gui (Newbie, Posts: 13)
« Reply #2 on: March 02, 2007, 02:11:51 AM »


No, I cannot use s3sync.rb to put my folders to Amazon, because my folders are much bigger than 5GB and I cannot divide them into smaller parts to s3sync.

I have to use s3cmd.rb's put to upload the files one by one.
I want to be able to read the upload time of the files already on Amazon, and then upload the newer files from my server.

Is there a HEAD command in s3sync or s3cmd?
How do I use it?  Thanks.

Quote from: lowflyinghawk on March 01, 2007, 07:05:39 PM
s3sync puts the files in a folder one by one already, right? So if you can s3cmd put multiple individual files, then can't you use s3sync on the folders?

If you do a HEAD before the PUT, it is possible to check a file for changes and not do the PUT at all if it's the same, i.e. approximately the way rsync behaves. If s3sync does this already (and I assume it does), then cm_gui could just put the whole folder and only the changed files would actually be transferred. This is the whole idea, isn't it?
lowflyinghawk (Jr. Member, Posts: 52)
« Reply #3 on: March 02, 2007, 06:45:08 AM »

You are mistaken: the size of your folders is irrelevant, only the size of the individual files matters. See your other thread.

The HEAD comment was intended for the developer, not for you.
ferrix (Sr. Member, Posts: 363)
(I am greg13070 on AWS forum)
« Reply #4 on: March 02, 2007, 06:47:44 PM »

cm_gui: I'm sorry you're having trouble, but posting the same request in another way without understanding the first response isn't going to get us anywhere.

As we have repeatedly said, the object size limit, as well as the 2GB bug, applies to an individual S3 node (in this case, a file), not to collections of them (folders). s3sync does make a node representing the folder, but it is simply a small placeholder to record the permissions etc. It in no way impacts the size limitations.
cm_gui (Newbie, Posts: 13)
« Reply #5 on: March 05, 2007, 12:18:58 AM »


Hi Greg

Thank you for the reply.  I get it now.

One question -- are s3cmd and s3sync compatible?

I've posted this question at http://s3sync.net/forum/index.php?topic=18.msg84#msg84

When I upload a folder, say /data/folder1, to Amazon using s3sync.rb,
I cannot use s3cmd's get command to download subfolders or individual files inside /data/folder1 from Amazon.

When I upload individual files one by one to Amazon using s3cmd's put command, duplicating
the same directory structure /data/folder1 on Amazon, I cannot use s3sync to download anything
from /data/folder1 at all.
ferrix (Sr. Member, Posts: 363)
(I am greg13070 on AWS forum)
« Reply #6 on: March 05, 2007, 07:31:40 AM »

Oops, I missed that. I'll answer it over there.
cm_gui (Newbie, Posts: 13)
« Reply #7 on: March 23, 2007, 06:08:46 PM »

I managed to get s3cmd.rb to print each file's last_modified (upload time) by adding the
following line after line no. 118 in s3cmd.rb (note that Time.parse needs a require 'time'
at the top of the script if it isn't already there):

puts Time.parse(item.last_modified).to_i
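With that in place, the nightly job can capture the printed times and re-put only the newer files. A rough sketch (the upload_times.txt file and its key<TAB>epoch line format are my own convention, not something s3cmd produces by itself):

# Sketch: read "key<TAB>epoch" lines captured from the patched s3cmd.rb
# list output, then re-put only local files modified after their upload.
uploaded = {}
File.readlines('upload_times.txt').each do |line|
  key, epoch = line.chomp.split("\t")
  uploaded[key] = epoch.to_i
end

Dir.glob('data/folder1/**/*').each do |path|
  next unless File.file?(path)
  if uploaded[path].nil? || File.mtime(path).to_i > uploaded[path]
    # Adjust the bucket:key form to match how you invoke s3cmd.rb put.
    system('ruby', 's3cmd.rb', 'put', "mybucket:#{path}", path)
  end
end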
