Push all the files in a directory to an s3 bucket
UPDATE: Of course, as soon as I hit publish, a coworker mentions that s3cmd has this functionality built in (and more!), including md5 and filesize checking so it doesn't push files that are already in s3. Check out the s3cmd sync command.
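For reference, a basic sync invocation looks something like this (the local path and bucket name are placeholders):

s3cmd sync ./ s3://<REPLACE BUCKET NAME HERE>/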
Say you've got a bunch of files sitting on a web server and you'd like to push them all to a bucket on s3. Maybe you want to set this up as a cron task so that the bucket is updated, say, every night.
find . -type f | sed 's|^\./||' | xargs -I {} s3cmd put {} s3://<REPLACE BUCKET NAME HERE>/{}
The find command recursively lists all of the files nested somewhere inside the current directory. Then we use sed to strip the ./ from the beginning of each file's path so that we don't create an extra directory called . at the top of our s3 bucket.
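For example, with a couple of hypothetical files in the directory, the first two stages of the pipeline transform the paths like so:

find . -type f
# ./css/site.css
# ./images/logo.png

find . -type f | sed 's|^\./||'
# css/site.css
# images/logo.png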
Then we use xargs to run the s3cmd put command against each of the files' names. We're using the -I {} placeholder syntax that I learned earlier this week.
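If you do want to run this nightly from cron as mentioned above, a crontab entry might look something like this (the 2 a.m. schedule and the /var/www/mysite path are just placeholders, and depending on your cron environment you may need the full path to s3cmd and its config file):

0 2 * * * cd /var/www/mysite && find . -type f | sed 's|^\./||' | xargs -I {} s3cmd put {} s3://<REPLACE BUCKET NAME HERE>/{}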
s3cmd is a great little utility. Directions for installing s3cmd are here.