S3Sync.net
Author Topic: Create node _$folder$  (Read 5425 times)
sascha (Newbie, Posts: 6)
« on: January 07, 2008, 10:39:46 AM »

My backups via s3sync are not working and this is the output I get: Create node _$folder$

Anyone have any idea what this means?
ferrix (Sr. Member, Posts: 363)
(I am greg13070 on AWS forum)
« Reply #1 on: January 07, 2008, 02:25:46 PM »

Means that you created the S3 side with some other tool and expected s3sync to pull it back down to local. But since the folder nodes were not stored by s3sync, it is horking now.

"$folder$" is a giveaway: s3sync doesn't build nodes like that.

You can try the option to force local folder creation and see if that helps.  You might also try including your command line and the -d output here.
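For reference, the folder-creation option would be added to the normal command line. This is a hedged sketch: the flag name (`--make-dirs` in some s3sync versions) and the bucket/path names are assumptions, so check your version's usage output for the exact spelling:

```shell
# Sketch, not verified against your s3sync version: --make-dirs (if your
# release supports it) forces local directory creation even when the bucket
# lacks s3sync-style folder nodes. Bucket and path names are placeholders.
ruby s3sync.rb -r -v --ssl --make-dirs bucket:folder /path/to/files/
```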
sascha (Newbie, Posts: 6)
« Reply #2 on: January 08, 2008, 09:17:09 AM »

Ran it with the -d option and got a bunch of "Source does not have" messages for the files I am trying to back up. I am confused: my intention is to upload files to S3, but it sounds like it is trying to copy them from S3 to my local machine??

This is what I'm running: ruby s3sync.rb -r -v --ssl bucket:folder /path/to/files/
ferrix (Sr. Member, Posts: 363)
« Reply #3 on: January 08, 2008, 11:47:22 AM »

Well, there is a problem: your arguments are backward. Putting the S3 path first and the local path after means s3 -> local. :)
Try switching the order.
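In other words, taking the command posted above and swapping the two arguments — source first, destination second, rsync-style:

```shell
# Upload (local -> S3): the local source path comes first, the bucket second.
ruby s3sync.rb -r -v --ssl /path/to/files/ bucket:folder

# The original order meant the opposite direction (S3 -> local):
# ruby s3sync.rb -r -v --ssl bucket:folder /path/to/files/
```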
sascha (Newbie, Posts: 6)
« Reply #4 on: January 10, 2008, 11:49:21 AM »

Hah! That seems like it was the problem. Thanks, ferrix!
sascha (Newbie, Posts: 6)
« Reply #5 on: January 11, 2008, 10:00:18 AM »

Hey ferrix, that definitely worked. When I executed the command with the source/dest in the proper order, all my files went right up to S3. The only problem is that when I executed it a second time, I got a bunch of "Source does not have" messages. Is there an option to allow overwriting, or something that I am missing?

Thanks again!!
ferrix (Sr. Member, Posts: 363)
« Reply #6 on: January 11, 2008, 10:08:11 AM »

They aren't errors; you probably have -v or -d on.
sascha (Newbie, Posts: 6)
« Reply #7 on: January 11, 2008, 10:36:34 AM »

I did have -v and -d. I turned those off and executed the command, which had no output this time. However, it ran very quickly (it should have taken a lot longer to upload all those files), and when I check the data on S3 the upload time is dated yesterday (when I successfully ran it for the first time).

Any thoughts? Your help is much appreciated.
ferrix (Sr. Member, Posts: 363)
« Reply #8 on: January 12, 2008, 01:19:48 PM »

Why should it upload again when the files haven't changed? It checks MD5 and size first and skips anything that is still equivalent.
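The skip logic described above can be sketched in Ruby. This is a hypothetical illustration of the idea (not s3sync's actual code); the helper name `needs_upload?` is invented for the example:

```ruby
require 'digest/md5'
require 'tempfile'

# Hypothetical sketch of the comparison: a local file is skipped when both
# its size and its MD5 match the values recorded on the remote side.
def needs_upload?(local_path, remote_size, remote_md5)
  return true unless File.size(local_path) == remote_size  # cheap check first
  Digest::MD5.file(local_path).hexdigest != remote_md5     # then the hash
end

# Demo: a file compared against its own recorded size/MD5 is skipped,
# so a second sync run finds nothing to transfer.
file = Tempfile.new('demo')
file.write('hello world')
file.flush
md5 = Digest::MD5.file(file.path).hexdigest

puts needs_upload?(file.path, File.size(file.path), md5)       # false -> skip
puts needs_upload?(file.path, File.size(file.path), '0' * 32)  # true -> upload
```

This also explains why the second run finished so quickly: nothing had changed, so every file was skipped.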