In previous articles, we installed s3cmd on Linux and Windows systems and learned how it works. There you can find how to create buckets, upload/download data between buckets, and more. This article will show you how to sync files between an S3 bucket and a local directory in both directions.
Before you start syncing files, make sure s3cmd is installed on your system, or use the following articles to install it. A quick configuration check is shown after the links below.
How to Install s3cmd in Linux and Manage s3 Buckets
How to Install s3cmd in Windows and Manage S3 Buckets
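If you are not sure whether s3cmd is already configured with your AWS access keys, a quick check like the following can confirm it (the bucket names will of course depend on your own account):

# s3cmd --configure
# s3cmd ls

The first command runs the interactive configuration wizard if keys are not set up yet; the second lists your buckets and confirms that the credentials work.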
1. Syncing Files from Local => S3 Bucket
For example, I want to sync my local directory /root/mydir/ to the S3 bucket directory s3://tecadmin/mydir/, where tecadmin is the bucket name. I have created some new files in /root/mydir/ and synced them to the S3 bucket using the following command.
# s3cmd sync /root/mydir/ s3://tecadmin/mydir/

[Sample Output]
/root/mydir/index.php -> s3://tecadmin/mydir/index.php  [1 of 2]
 397 of 397   100% in    0s     4.02 kB/s  done
/root/mydir/readme.html -> s3://tecadmin/mydir/readme.html  [2 of 2]
 9202 of 9202   100% in    0s   103.62 kB/s  done
Done. Uploaded 9599 bytes in 0.3 seconds, 27.92 kB/s
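If you want to preview what sync would transfer before actually uploading anything, s3cmd supports a dry-run mode. A quick sketch using the same paths as above:

# s3cmd sync --dry-run /root/mydir/ s3://tecadmin/mydir/

This only prints what would be uploaded and does not change anything on the bucket.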
Note: Do not forget to add a trailing slash (/) to the local directory path when specifying the S3 bucket with a full directory path.
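To illustrate why the trailing slash matters (assuming rsync-like behavior, where a trailing slash on the source means "copy the contents of the directory" rather than the directory itself):

# s3cmd sync /root/mydir/ s3://tecadmin/mydir/
# s3cmd sync /root/mydir s3://tecadmin/mydir/

The first command uploads the files inside /root/mydir/ directly under s3://tecadmin/mydir/, while the second would create an extra mydir level under the destination path.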
To preserve file attributes like date/time, use the -p or --preserve parameter as below:
# s3cmd sync /root/mydir/ --preserve s3://tecadmin/mydir/
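To check that the attributes were carried over, you can inspect one of the uploaded objects; a quick check on a file from the sample output above (assuming --preserve stores the filesystem attributes as object metadata):

# s3cmd info s3://tecadmin/mydir/index.php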
If we want to sync only newly created files from the source, use the --skip-existing parameter. It skips all files that already exist on the destination, even if they have been modified on the source.
# s3cmd sync /root/mydir/ --skip-existing s3://tecadmin/mydir/
If you want to delete files from the S3 bucket that have been removed from the local directory, use the --delete-removed parameter.
# s3cmd sync /root/mydir/ --delete-removed s3://tecadmin/mydir/
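Because --delete-removed deletes files from the destination, it is a good idea to preview the deletions first with a dry run. A sketch using the same paths:

# s3cmd sync --dry-run --delete-removed /root/mydir/ s3://tecadmin/mydir/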
2. Syncing Files from S3 Bucket => Local Directory
For this example, I am again using the same folder and bucket as above. To test this, I put some extra files in the S3 bucket (s3://tecadmin/mydir/) and executed the following command to sync all files to the local directory.
# s3cmd sync s3://tecadmin/mydir/ /root/mydir/

[Sample Output]
s3://tecadmin/mydir/logo.jpg -> /root/mydir/logo.jpg  [2 of 3]
 7219 of 7219   100% in    0s   125.28 kB/s  done
s3://tecadmin/mydir/user.php -> /root/mydir/user.php  [3 of 3]
 40380 of 40380   100% in    0s   596.33 kB/s  done
Done. Downloaded 47599 bytes in 0.3 seconds, 184.40 kB/s
We can also use the --preserve, --skip-existing and --delete-removed parameters when syncing files from the S3 bucket to the local directory, as follows:
# s3cmd sync s3://tecadmin/mydir/ --preserve /root/mydir/
# s3cmd sync s3://tecadmin/mydir/ --skip-existing /root/mydir/
# s3cmd sync s3://tecadmin/mydir/ --delete-removed /root/mydir/
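These options can also be combined in a single command. For example, to mirror the bucket to the local directory while preserving attributes and removing local files that no longer exist in the bucket (same paths as above):

# s3cmd sync --preserve --delete-removed s3://tecadmin/mydir/ /root/mydir/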
Read more about s3cmd sync at http://s3tools.org/s3cmd-sync
What is the data copy speed from a local server to S3? I have more than 8 TB on my local disk, but only about 1.2 TB has been copied to S3 over the last two weeks. The amount of data on S3 has not changed since, yet the sync process is still running in the background. Please help.
Hi,
The above command worked perfectly, but I am unable to make the folder public.
Hi Kirthan,
I think you want to access your S3 files in a web browser? To do this, make sure you have enabled the S3 bucket to serve content as a static website...
Use --acl-public.
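For example, a sketch of making the objects public, either at upload time or afterwards, using the bucket path from the article:

# s3cmd sync --acl-public /root/mydir/ s3://tecadmin/mydir/
# s3cmd setacl --acl-public --recursive s3://tecadmin/mydir/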
It would be nice if there were a feature that syncs only files that are new on S3 to the local directory, even if the older files in the local directory have been deleted, i.e. only keep syncing new incoming files from S3. Some parameter like "sync-new". And possibly also a parameter to specify a date/time, so you could say sync before or sync after a given date/time.
Will it download all the files again, or will it append the new log lines to the existing files while syncing from the S3 bucket?
Thanks for sharing. Very useful.