S3Sync.net
Topic: how to skip large files? (Read 3233 times)
GetuWired
Newbie
Posts: 2
how to skip large files?
« on: March 11, 2010, 02:28:42 PM »
I know there is a 2 GB issue with AWS.
The goal is to use s3sync to back up our backups to S3. It's working fine until it gets to backups larger than 2 GB, and then we run into:
Broken pipe: Broken pipe
99 retries left, sleeping for 30 seconds
Connection reset: Connection reset by peer
98 retries left, sleeping for 30 seconds
This will take forever to work itself out. If we don't care about the files larger than 2 GB, how can I lower the retry counter to something like 2, or just have it skip those files altogether while still uploading the rest?
We want this to be an automated process, and since the backups change size daily there is no way to know in advance which ones will be over 2 GB. Almost all are under 2 GB.
Any suggestions?
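One way to attack the retry counter directly: the s3sync README describes retry behaviour as configurable through environment variables. The names below (`S3SYNC_RETRIES`, `S3SYNC_WAITONERROR`) are taken from that README and are not verified against this particular version, so treat this as an assumption to check first:

```shell
# Assumption: this copy of s3sync.rb reads retry settings from the
# environment, as the s3sync README describes. Verify against your version.
export S3SYNC_RETRIES=2       # attempts before giving up (default is 100)
export S3SYNC_WAITONERROR=5   # seconds to sleep between attempts (default 30)
# then run the usual command:
# ruby s3sync.rb -r --ssl --delete /home/folder/localuploadfolder/ mybucket
```

Note this only makes the failure fast; whether s3sync then skips the oversized file or aborts the whole run depends on the version, so test it on a non-critical run first.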
GetuWired
Newbie
Posts: 2
Re: how to skip large files?
« Reply #1 on: March 15, 2010, 08:49:22 AM »
If there is no way to modify the .rb to check for this, is there a better way to push the files than:
ruby s3sync.rb -r --ssl --delete /home/folder/localuploadfolder/ mybucket
Can you suggest a shell script that would loop through localuploadfolder and upload only the files smaller than 2 GB?
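A sketch of one way to do that loop, using `find -size` to select files strictly under 2 GB and staging them into a separate directory that s3sync then uploads. The paths and bucket name are placeholders; the demo uses temp directories and a sample file so it runs standalone:

```shell
#!/bin/sh
# Sketch: stage files smaller than 2 GB into a flat directory, then
# point s3sync at the staging directory instead of the original tree.
SRC=$(mktemp -d)    # stands in for /home/folder/localuploadfolder
STAGE=$(mktemp -d)  # directory s3sync would actually upload
printf 'small backup\n' > "$SRC/db-small.tar"   # sample file for the demo

# -size -2147483648c matches regular files strictly under 2 GB (c = bytes).
# Hard links avoid copying; note this flattens the tree, so files with the
# same basename in different subdirectories would collide.
find "$SRC" -type f -size -2147483648c -exec ln -f {} "$STAGE"/ \;

ls "$STAGE"
# Then sync the staging directory instead of the original:
# ruby s3sync.rb -r --ssl --delete "$STAGE"/ mybucket
```

If the staging directory lives on a different filesystem than the backups, swap `ln -f` for `cp -p` (hard links cannot cross filesystems).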