  Show Posts
Pages: [1] 2 3
1  General Category / Questions / Re: Bandwidth transfer for unchanged files on: November 23, 2009, 09:30:16 AM
1. Is there a bandwidth transfer (and resulting Amazon charge) for files that exist in both the source and destination and are unchanged?
Yes. To compare files, s3sync fetches some information about each file from Amazon S3.

If you have a zillion files, s3sync needs to compare the zillion Amazon S3 files with the locally stored ones (to find out that nothing has changed; obvious, but how else could it be sure?). Little information flows back to your machine, but each file is touched at least once, resulting in a zillion accesses. Amazon will bill you for those requests, and for the little traffic that is generated.

The total amount obviously depends on how often you care to check. If you are sure things change little, check less frequently. If you use S3 as a backup and have fewer files, it might not make much of a difference to do it more often.
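To get a rough feel for the numbers, you can count the files on your side; each one means at least one comparison request per run. A back-of-the-envelope sketch (the backup path and run frequency are placeholders, not anything from s3sync itself):

```shell
# Count local files; each needs at least one S3 comparison per sync run.
FILES=$(find /path/to/backup -type f | wc -l)
RUNS_PER_MONTH=30   # e.g. one sync per day
echo "files: $FILES, approx. comparison requests per month: $((FILES * RUNS_PER_MONTH))"
```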
2  General Category / Questions / Re: s3sync works from shell but not cron job on: November 23, 2009, 09:20:10 AM
>  Is there any way to find out where you got the quest from?

I beg your pardon? Which quest?
3  General Category / Questions / Re: IllegalSequence on: May 22, 2009, 02:51:27 AM
> Can I assume that IllegalSequence is relating to a filename, probably a song file??
Yes. Most definitely. It looks as if you are trying to sync something like "Björk"... :-)

Have you set
somewhere in your script (or yml or wherever)?

Which platform are you working on? Are you feeding s3sync the files it should sync via some other program?
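One quick, generic check (not necessarily the exact setting referred to above) is to look at what character encoding your shell session advertises; a non-UTF-8 locale is a common culprit when non-ASCII filenames like "Björk" trip things up:

```shell
# Print the locale settings the environment advertises; look for a UTF-8
# charset if your filenames contain non-ASCII characters:
locale
echo "LANG=$LANG LC_ALL=$LC_ALL"
```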
4  General Category / Questions / Re: s3sync works from shell but not cron job on: May 22, 2009, 02:42:44 AM
> This unfortunately will not work, simply because when you export your key, you are setting them for a shell environment.
> Cronjobs run without any shell environment per se, which is also why you need to fully qualify the path of your ruby installation.
I beg to differ. I have been running s3sync scripts for years directly from crontab in exactly the way svittal suggests. I wouldn't go so far as to say that cron jobs have exactly the same environment as normal users, but it is certainly sufficient to set the appropriate environment variables to get s3sync to work. There is no need for any .yml file (which was introduced to s3sync at a later development stage, when people were already happily syncing from crontab). And you don't have to use the fully qualified path to the ruby installation either, if the PATH variable is properly set as well.

That brings us back to svittal's problem, which might be solved by setting the path the way it's done for the user.
That's how my crontab environment looks without doing anything to it:

This is what it looks like in my scripts. I am using a ruby and s3sync installation in the /opt directory, so I have added that to the PATH variable. Done.
export AWS_ACCESS_KEY_ID=...
export SSL_CERT_FILE=/opt/bin/s3sync/ca-certificates.crt
export PATH=/opt/bin:/opt/bin/s3sync:/opt/sbin:$PATH

Probably the easiest way to find out what's missing is to make a cron job that does nothing but "printenv >some_filename_where_the_output_should_go". By comparing that output with what printenv gives as a logged-in user, you'd know what to add.
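Sketched out (file paths are placeholders; `env -i` only approximates cron's stripped-down environment, so a real one-off cron entry running printenv remains the authoritative check):

```shell
# Approximate the near-empty environment a cron job sees and diff it against
# your login shell's environment; lines only on the right are what cron lacks.
env -i /bin/sh -c 'printenv | sort' > /tmp/cron-env.txt
printenv | sort > /tmp/shell-env.txt
diff /tmp/cron-env.txt /tmp/shell-env.txt || true
```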

5  General Category / Questions / required module not found on: February 26, 2009, 03:15:50 PM
I'd like to second this.

It seems (though I am not sure of that) that upgrading ruby caused this error.

Error Message is:
/.../s3sync.rb:23:in `require': no such file to load -- md5 (LoadError)
   from /.../s3sync.rb:23:in `<module:S3sync>'
   from /.../s3sync.rb:11:in `<main>'

Can anybody give a hint as where to look?

I've tried to find an md5 executable, but none is installed. Does it have to be there? Or is that some ruby-internal thing?
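For what it's worth, md5 here is not an executable but a Ruby library: Ruby 1.9 dropped the old top-level md5 library in favour of digest/md5. A possible local workaround, sketched below (not an official s3sync fix, and it may be incomplete if the code also uses the old MD5 constant elsewhere):

```shell
# Back up s3sync.rb, then swap the removed library for its replacement:
cp s3sync.rb s3sync.rb.bak
sed "s|require 'md5'|require 'digest/md5'|" s3sync.rb.bak > s3sync.rb
grep -n "require 'digest/md5'" s3sync.rb
```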
6  General Category / General Discussion / Re: Getting CA certificates, re-examined on: February 15, 2008, 07:36:07 AM
Works like a charm.
Thanks for providing the ca-certificates.crt directly! That makes things so much easier.


7  General Category / Questions / Re: Problems with underscores "_" in filenames on: January 27, 2008, 03:50:20 PM
Not exactly encouraging while looking for errors, but then, I'm glad those are just/still network errors.
Thanks again.

8  General Category / Questions / Re: Problems with underscores "_" in filenames on: January 27, 2008, 06:13:35 AM
I didn't say it is a bug in s3sync. If it were, I'd have had the problem from the very beginning, just as you said. That's why I ignored it in the first place.
But the errors still crop up, so I'm trying to find something else to check or investigate. Maybe you could elaborate a little on the "delim" and "cf" messages. What do they mean?

9  General Category / Questions / Problems with underscores "_" in filenames on: January 26, 2008, 05:08:22 PM

For a very long time (maybe 10 months of continuous use) I have been syncing files day in, day out, without a single retry or error. For maybe a month now, I have been getting retries, more and more of them, on files that haven't changed for ages. Since they hadn't changed, I ignored the retries for a while, putting them down to network problems. Recently I was able to check the network (the machine, its load, my internet provider, DNS and ARP caches and the like) and make sure it's *not* the reason.

Digging through the logs, it turns out the retries only happen with files or directories that have an underscore "_" in them. I know how exotic that sounds, but just to clarify: could that be a problem?

What are the "delimiter" and "Content-Length" error reasons? What could I investigate?
Running the current version, v1.2.4.
Here's a sample output:
S3 command failed:
list_bucket max-keys 200 prefix el_GRANDE/Treiber/Sony VAIO VGC-RA104/XP/06 High Definition Audio/06b Conexant HDA SoftV92 Data Fax Modem with SmartCP0, 28.04.2004/ delimiter /
With result 500 Internal Server Error
99 retries left

S3 command failed:
put el_GRANDE/console-Tools/vim 7.1.239/_QuickInstall/ftplugin.vim #<S3::S3Object:0x106ba830> Content-Length 971
With result 500 Internal Server Error
98 retries left

S3 command failed:
put el_GRANDE/console-Tools/vim 7.1.239/_QuickInstall/syntax/pinfo.vim #<S3::S3Object:0x10530c60> Content-Length 5284
With result 500 Internal Server Error
97 retries left

S3 command failed:
put el_GRANDE/console-Tools/vim 7.1.239/_QuickInstall/syntax/spice.vim #<S3::S3Object:0x10568418> Content-Length 2566
With result 500 Internal Server Error
96 retries left

S3 command failed:
list_bucket max-keys 200 prefix el_GRANDE/Mobiles/Palm Treo/_Win/Palm Emulator/Scripting/Perl/ delimiter /
With result 500 Internal Server Error
99 retries left
10  General Category / Questions / Re: backing up large amounts of data on: January 06, 2008, 01:45:22 PM
> Are my observations consistent with what everyone else is experiencing?

No. Not at all. s3sync has been most stable for me. I haven't had a single retry in about 4 months of continuous transfer, and that in a very constrained environment.

See here: http://s3sync.net/forum/index.php?topic=55.0

This was with an older version of s3sync (1.1.4, if I remember correctly), though I haven't had any problems with the newer versions either. And yes, the script is still running once a week (as it takes about a day to upload anything new).

11  General Category / Questions / Re: No Problems using SSL anymore with 1.2.3 on: November 25, 2007, 04:53:16 AM

Just synced about 2,500 objects, old and new ones, without a hitch. In a US bucket, that is; I'm not yet using the European ones (and probably will only for sensitive data, not the bulk of things, because of the pricing).

Thank you, ferrix. Thank you very much.


12  General Category / Questions / Re: Problems using SSL on: November 24, 2007, 03:49:41 PM
And now it hit me, too.
Switching from 1.1.4 to 1.2.2 broke the scripts I've been using without a change for months.

Just for the record:
I have done nothing but replace the .rb files in the directories that once held the 1.1.4 version. Everything else was kept as it worked before (scripts that call s3sync.rb, certificates, everything).

Because it's no big deal, I'll revert to 1.1.4 for the moment; there's a lot to sync tonight, even on the backup machine I'm testing this on. But I'll be happy to try out any ideas. Anybody?



PS: berlin, seengee: You might want to check out the older version (available on the front page of www.s3sync.net); it works for me again, now that I've reverted.
13  General Category / Questions / Re: Problems using SSL on: November 20, 2007, 03:32:56 PM
Did you run the shar file, or just download it?

You'd need to:
> wget http://mirbsd.mirsolutions.de/cvs.cgi/~checkout~/src/etc/ssl.certs.shar
and then:
> sh ssl.certs.shar

That is, if you are running under some Un*x OS.
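A shar file is just a self-extracting shell script, which is why downloading alone isn't enough; sh has to run it to write out the bundled files. A toy illustration of the mechanism (not the actual certificate shar, and the filenames are made up):

```shell
# Build a minimal shar-style archive, then unpack it the same way as above:
cat > demo.shar <<'EOF'
cat > demo-output.txt <<'PAYLOAD'
bundled file contents would go here
PAYLOAD
EOF
sh demo.shar
cat demo-output.txt
```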

14  General Category / Closed Bugs / Re: Local Directory Structure on: November 20, 2007, 03:29:40 PM
> maelcum: I don't want to sync TO s3. I am backing up the files FROM s3. In fact, my application would be totally broken if I ever sync'ed to S3.
I was trying to help: checking whether things were uploaded correctly, because they do download fine here at my place.
But I guess the help wasn't appreciated. Sorry I tried; waste of time.
15  General Category / Questions / Re: Problems using SSL on: November 20, 2007, 03:14:40 AM
No, not really. I'm using an older version of ruby (1.8.4), but I'll try your version tonight (European time, that is).

Could it be your certificate? Did you use your own, or follow the procedures floating around here?