When running a lot of EC2 instances, keeping tabs on usage can become extremely cumbersome. To better visualize the data, I’ve written a simple bash script that pipes it into a Graphite server; I can then bookmark and share the graph URL so anyone who cares (i.e. the person who pays the invoices) can easily see what our load has been like.

This turns hundreds of lines like:

Service, Operation, UsageType, StartTime, EndTime, UsageValue
AmazonEC2,RunInstances,DataTransfer-In-Bytes,07/01/12 00:00:00,07/01/12 01:00:00,20950718
AmazonEC2,RunInstances,USW2-BoxUsage:m1.large,07/01/12 00:00:00,07/01/12 01:00:00,1
AmazonEC2,RunInstances,BoxUsage:m1.medium,07/01/12 00:00:00,07/01/12 01:00:00,1
AmazonEC2,RunInstances,USW2-BoxUsage,07/01/12 00:00:00,07/01/12 01:00:00,1
AmazonEC2,RunInstances:S0001,USW2-SpotUsage:m1.medium,07/01/12 00:00:00,07/01/12 01:00:00,1
AmazonEC2,RunInstances:S0001,SpotUsage:m1.medium,07/01/12 00:00:00,07/01/12 01:00:00,6
AmazonEC2,RunInstances:S0006,SpotUsage:m1.medium,07/01/12 00:00:00,07/01/12 01:00:00,18
AmazonEC2,RunInstances:S0012,SpotUsage:m1.medium,07/01/12 00:00:00,07/01/12 01:00:00,17
...

… into a nice, manager-friendly picture like this: [graph]

Simply download your usage report from your AWS Account Activity page and pass it to the following script:

#!/bin/bash
# Usage: ./aws-usage-to-graphite.sh <usage-report.csv>
GRAPHITE_HOST="your.graphite.server"
GRAPHITE_PORT=2003
CSVFILE="$1"
OS=$(uname)

# Only rows with an instance type (e.g. BoxUsage:m1.large) match "Usage:"
grep Usage: "$CSVFILE" | while IFS=, read -r svc opr type start end usage
do
  # Convert the StartTime column to a Unix timestamp
  if [ "$OS" == "Darwin" ]; then
    # BSD date needs an explicit input format; swap ':' for '-' so the
    # format string below parses cleanly
    start=$(echo "$start" | tr ':' '-')
    epoch=$(date -jf "%m/%d/%y %H-%M-%S" "$start" +%s)
  else
    epoch=$(date -d "$start" +%s)
  fi

  # Usage types without a region prefix belong to us-east-1
  if [ "${type:0:2}" != "US" ]; then
    type="USE1-$type"
  fi

  # Graphite plaintext protocol: "metric.path value timestamp"
  DATA="EC2.$(echo "$type" | tr ':-' '.') $usage $epoch"

  echo "$DATA"
  echo "$DATA" | nc "$GRAPHITE_HOST" "$GRAPHITE_PORT"
done

NOTE: replace the GRAPHITE_HOST and GRAPHITE_PORT variables with your server’s values.
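For a concrete sense of what gets sent to Graphite, here’s the metric-name transformation applied to one of the sample rows above (a minimal sketch; the actual timestamp the script appends depends on your machine’s timezone, so it’s omitted here):

```shell
# Sample row: AmazonEC2,RunInstances,BoxUsage:m1.medium,07/01/12 00:00:00,...,1
type="BoxUsage:m1.medium"

# No region prefix, so the script assumes us-east-1
if [ "${type:0:2}" != "US" ]; then
  type="USE1-$type"
fi

# Both ':' and '-' become '.' to form a Graphite metric path
metric="EC2.$(echo "$type" | tr ':-' '.')"
echo "$metric"   # EC2.USE1.BoxUsage.m1.medium
```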

This script only pulls out box usage, as that’s what I currently care about, but it wouldn’t be too difficult to graph other stats like S3 and EBS usage, data transfer, etc.
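If you do want one of those other stats, the grep filter is the only part that needs to change. A rough sketch (assuming the same CSV layout as above) that grabs data-transfer rows instead; the sample rows are inlined here just for illustration:

```shell
# Hypothetical variation: filter DataTransfer rows instead of box usage;
# the date conversion and nc pipeline from the main script stay the same.
CSVFILE=$(mktemp)
cat > "$CSVFILE" <<'EOF'
AmazonEC2,RunInstances,DataTransfer-In-Bytes,07/01/12 00:00:00,07/01/12 01:00:00,20950718
AmazonEC2,RunInstances,BoxUsage:m1.medium,07/01/12 00:00:00,07/01/12 01:00:00,1
EOF

# Only the grep pattern differs from the original script
grep DataTransfer "$CSVFILE" | while IFS=, read -r svc opr type start end usage
do
  echo "$type $usage"
done
```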


To Do:

As is, I am still having to log in, download the csv file and run the script manually - I plan to automate that process and will post a followup when I do so.




Published

09 July 2012
