If you run Linux servers in AWS, try sending your server logs to an S3 bucket. I’ll show you how!
A Linux command line utility called s3cmd features a sweet syncing option that works much like rsync.
1. First, download and install s3cmd.
Before using it, run ‘s3cmd --configure’ from the command line as the user that will do the syncing; it prompts for your AWS access and secret keys and saves them to ~/.s3cfg.
2. Next, I suggest creating a dedicated S3 bucket for your logs (‘s3cmd mb s3://[bucketname]’ will do it).
3. Finally, run s3cmd sync to copy your logs or log directory over to the S3 bucket.
$ s3cmd sync /var/log/coollogs/ s3://[bucketname]/server1/
The sync command works much like rsync, so it can be run periodically to push updated logs to S3. More info can be found on the s3cmd project site. Or consider tossing that command into a cron job.
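If you go the cron route, an entry like this would push logs every 15 minutes. This is a sketch using the bucket and path from the example above; adjust both for your setup, and check where s3cmd lives on your system.

```shell
# Hypothetical crontab entry: sync logs to S3 every 15 minutes.
# Paths and bucket name are from the example above -- change them for your setup.
*/15 * * * * /usr/bin/s3cmd sync /var/log/coollogs/ s3://bucketname/server1/ >> /var/log/s3sync.log 2>&1
```

Logging the output (as above) is handy for catching failed syncs, since cron jobs fail silently otherwise.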
Here’s an example I use for AWS instances that come and go. Tomcat and nginx logs are dumped into a directory for the web app, organized by date and then hostname.
#!/bin/bash
# send logs to s3!
s3cmd sync /var/log/tomcat7 s3://company-ec2instance-logs/application_name/`date +%F`/`hostname`/
sleep 10
s3cmd sync /var/log/nginx s3://company-ec2instance-logs/application_name/`date +%F`/`hostname`/
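The destination in that script is just a prefix keyed by today’s date and the instance’s hostname, so each short-lived instance lands in its own folder. Here’s a minimal sketch of how the prefix expands (bucket and app names are carried over from the example above; the echo is only for illustration):

```shell
# Build the same S3 destination prefix the script above uses:
# bucket / app name / today's date (YYYY-MM-DD) / this host's name
DEST="s3://company-ec2instance-logs/application_name/$(date +%F)/$(hostname)/"
echo "$DEST"
```

On a host named web1, this prints something like s3://company-ec2instance-logs/application_name/2024-01-15/web1/, which is why you can tell at a glance which instance produced which logs even after the instance is gone.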