The AWS CLI is a command line utility provided by the Amazon Web Services team for managing AWS infrastructure. It also lets us create and manage S3 buckets directly from our computers. Before you start syncing files, make sure the AWS CLI is installed and configured on your system.
In this tutorial, you will learn about synchronizing files between the local file system and s3 buckets.
1. Sync files from Local => S3 Bucket
For example, I want to sync my local directory /root/mydir/ to the S3 bucket path s3://tecadmin/mydir/, where tecadmin is the bucket name. I created some new files in /root/mydir/ and synced them to the S3 bucket using the following command.
aws s3 sync /root/mydir/ s3://tecadmin/mydir/
upload: mydir/index.php to s3://tecadmin/mydir/index.php
upload: mydir/readme.html to s3://tecadmin/mydir/readme.html
Note: Do not forget to add a trailing slash (/) to the local directory path when specifying the S3 bucket with a full directory path.
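If you are unsure what a sync will transfer, the AWS CLI supports a --dryrun flag that prints the planned operations without uploading anything. A minimal sketch, using the same example bucket and directory as above:

```shell
# Preview the sync without transferring anything; each planned
# operation is printed with a "(dryrun)" prefix.
# (Bucket and path are the examples used throughout this article.)
aws s3 sync /root/mydir/ s3://tecadmin/mydir/ --dryrun
```

Once the preview looks right, run the same command without --dryrun to perform the actual transfer.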
By default, sync only transfers files that are new or have changed, comparing each file's size and modification time against the copy in the bucket. To compare by size alone and ignore timestamps, use the --size-only parameter:
aws s3 sync /root/mydir/ s3://tecadmin/mydir/ --size-only
No extra parameter is needed to sync only newly created files: sync already skips every file that exists on the destination with a matching size and modification time, and re-uploads a file only when it has changed on the source. Running the same command a second time therefore transfers nothing unless something changed locally.
If you want files that have been removed locally to also be deleted from the S3 bucket, use the --delete parameter.
aws s3 sync /root/mydir/ s3://tecadmin/mydir/ --delete
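The sync command also accepts --exclude and --include filters, which are useful when only part of a directory should be mirrored. A short sketch, assuming (hypothetically) the directory contains log files you want to leave out:

```shell
# Sync everything except *.log files. Filters are evaluated in order,
# so a later --include can re-add files an earlier --exclude removed.
aws s3 sync /root/mydir/ s3://tecadmin/mydir/ --exclude "*.log"

# Exclude everything, then re-include only PHP files
aws s3 sync /root/mydir/ s3://tecadmin/mydir/ --exclude "*" --include "*.php"
```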
2. Sync files from S3 Bucket => Local
For this example, I am again using the same directory and bucket as above. To test it, I added some extra files to the S3 bucket (s3://tecadmin/mydir/) and ran the following command to sync them down to the local directory.
aws s3 sync s3://tecadmin/mydir/ /root/mydir/
download: s3://tecadmin/mydir/logo.jpg to mydir/logo.jpg
download: s3://tecadmin/mydir/user.php to mydir/user.php
The --size-only, --dryrun and --delete parameters work the same way when syncing files from an S3 bucket to a local directory:
aws s3 sync s3://tecadmin/mydir/ /root/mydir/ --size-only
aws s3 sync s3://tecadmin/mydir/ /root/mydir/ --dryrun
aws s3 sync s3://tecadmin/mydir/ /root/mydir/ --delete
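Because sync only transfers changed files, it is cheap to run repeatedly, so a common pattern is to schedule it with cron. A minimal sketch, using the paths from this article; the log file location is an assumption, adjust it to taste:

```shell
# Example crontab entry: pull down bucket changes every hour and
# append the CLI output to a log file (/var/log/s3sync.log is an
# assumed path, not something the article specifies)
0 * * * * aws s3 sync s3://tecadmin/mydir/ /root/mydir/ >> /var/log/s3sync.log 2>&1
```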
Comments
What is the data copy speed from a local server to S3? I have more than 8 TB on local disk, but only about 1.2 TB has been copied to S3 over the last two weeks and the data on S3 has not changed since, although the sync process is still running in the background. Please help.
Hi,
The above command worked perfectly, but I am unable to make the folder public.
Hi Kirthan,
I think you want to access your S3 files in a web browser? To do this, make sure the bucket is configured to serve content as a static website, and upload the objects with a public ACL:
aws s3 sync /root/mydir/ s3://tecadmin/mydir/ --acl public-read
It would be nice if there were a feature to sync only files that are new on S3 down to the local directory, even when older files have been deleted locally, i.e. keep syncing only new incoming files. Some parameter like "sync-new". It would also be useful to have a parameter for a date/time, so you could sync only files modified before or after that point.
Will it download all the files again, or will it append the new log lines to the existing files while syncing from the S3 bucket?