Amazon recently introduced the AWS Reserved Instances Marketplace. The idea is great – allow people to sell reserved instances they no longer need, for whatever reason, instead of losing the reservation money (or, if you are on heavy utilization, the complete running cost of the instance 24 x 7 x the number of years you reserved).
Before you can sell a reserved instance you need to set up various details for the account to which Amazon will wire the money – however, if you are not located in the US or don't have a US bank account, you are out of luck. Unfortunately for me, I'm located in Israel with no US bank account.
Instead of messing with various taxation issues, I would like to suggest that AWS simply give back AWS credits. That is, if I sell my reserved instance for $100, I should have the option of directly crediting my AWS account with $100, which I can then use on various AWS services.
I know AWS already has a mechanism for this, since they give out gift/trial credits all the time. I also know that the Amazon Associates program for referring customers to Amazon can pay you in Amazon gift certificates instead of actual money.
Just a thought that would keep the money inside the AWS ecosystem while making non-US customers happy.
Continuing my post about the JSON files used in the Amazon EC2 Page, I’ve created a small Python library that also acts as a command line interface to get the data.
The data in the JSON files does not use the same values as the EC2 API for things like region names and instance types, so the library/CLI translates these values to their EC2 API equivalents.
You can filter the output by region, instance type and OS type.
The command line output supports a human-readable table format, JSON, and CSV.
To use the command line you’ll need to install the following Python libraries:
- argparse – only if you are using Python < 2.7 (argparse is included in Python 2.7 and 3.x)
- prettytable – if you want to print the human-readable, good-looking ASCII table output
Both libraries can be installed using pip.
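As a rough illustration of the three output formats, here is a stdlib-only sketch. The row fields and values are made up for the example, and the real tool renders its table with prettytable rather than this plain stand-in:

```python
import csv
import io
import json

# Illustrative pricing rows (field names and values are made up, not real AWS data)
ROWS = [
    {"region": "us-east-1", "type": "m1.small", "os": "linux", "price": 0.08},
    {"region": "eu-west-1", "type": "m1.small", "os": "linux", "price": 0.09},
]

def to_json(rows):
    """Serialize the rows as pretty-printed JSON."""
    return json.dumps(rows, indent=2)

def to_csv(rows):
    """Serialize the rows as CSV with a header line."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["region", "type", "os", "price"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def to_table(rows):
    """Plain stand-in for the prettytable output: aligned columns plus a rule."""
    fmt = "{:<12} {:<10} {:<6} {:>6}"
    header = fmt.format("region", "type", "os", "price")
    lines = [header, "-" * len(header)]
    for row in rows:
        lines.append(fmt.format(row["region"], row["type"], row["os"], row["price"]))
    return "\n".join(lines)
```

Each formatter takes the same list of row dicts, so adding another output format is just one more function.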
Grab the code from GitHub
Don’t forget to send feedback! Enjoy!
Have you ever wanted a way to access the Amazon Web Services EC2 pricing data from code?
It seems Amazon uses predefined JSON files on the EC2 page to display pricing per region, per instance type, and per utilization type.
You can easily access these JSON files, load them, and use them in your own apps (at least until Amazon changes these URLs).
The naming in these JSON files sometimes differs from the naming used in the API. For example, a small instance (m1.small) appears as “size” : “sm”, and its type is “stdODI” (or “stdResI” for reserved instances).
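A tiny sketch of what that translation layer might look like. Only the “sm” → m1.small and “stdODI”/“stdResI” pairings come from the example above; the "on-demand"/"reserved" labels are my own shorthand, and the real files contain many more entries:

```python
# Guessed-at translation tables from the pricing-page JSON naming to
# EC2 API naming, seeded only with the example values mentioned above.
SIZE_TO_API = {
    "sm": "m1.small",
}

TYPE_KIND = {
    "stdODI": "on-demand",
    "stdResI": "reserved",
}

def translate(size, instance_type):
    """Map a (size, type) pair from the JSON files to API-style values,
    falling back to the raw value when no mapping is known."""
    return (SIZE_TO_API.get(size, size),
            TYPE_KIND.get(instance_type, instance_type))

print(translate("sm", "stdODI"))  # → ('m1.small', 'on-demand')
```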
Below are the links to the relevant files:
On Demand Instances
Data Transfer Pricing
Cloud Watch Pricing
Elastic IPs Pricing
Elastic Load Balancer (ELB) Pricing
Crunch these files and enjoy them while they're there 🙂
In a recent post on the AWS blog, Jeff Barr and Matt Wood showed the architecture and code they wrote, which lists the most interesting AWS-related jobs from the Amazon Jobs site.
It serves as a rather good example of how service components such as the ones AWS provides (SNS, SQS, and S3, to name a few that are application agnostic) form a great set of building blocks that can easily help you focus on writing the code you really need to write.
I found the auto scaling policy for spinning machines up and down just to tweet a bit of an overkill at first (Jeff could have easily run the code on the same instance running the cron job); however, thinking about it a bit more and considering the various pricing strategies, it actually makes a lot of sense.
I just read a post on Slashdot about a poor guy getting a huge chunk of Netflix traffic to his server.
The problem seems to have been caused by the nature of IP addresses in EC2, which are quite fluid and get reassigned when you spin machines up and down. The same goes for Elastic Load Balancers (ELBs), which are managed by Amazon and may switch IP addresses as well (that's why Amazon asks you to map a CNAME record to the ELB instead of using its IP).
In the Slashdot post, there is a link to this article, which describes the problem and lists some possible implications and possible ways of avoiding leaking data such as passwords and session ids when such a problem occurs.
The article mostly talks about what happens if someone hijacks your ELB, but the original problem reported was accidentally getting someone else's traffic. This can lead to some other severe consequences:
- Your servers crashing (in which case you should probably notice that rather quickly. Duh!)
- If you are running some kind of content site that depends on SEO and crawlers pick up the wrong IP, you might end up with a HUGE SEO penalty because another site's content will be crawled under your domain
There is a very simple and quick solution for the problem I am describing above: make sure you configure your web server to answer only to YOUR hostnames. Your servers will return a response ONLY for a preconfigured set of hostnames, so if you get Netflix traffic, which probably carries a netflix.com hostname, your server will reject it immediately.
You can easily configure that in Nginx or Apache, or in a caching proxy such as Varnish or Squid.
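In Nginx, for example, the setup could look something like this (the hostnames are placeholders – substitute your own):

```nginx
# Catch-all server: rejects any request whose Host header doesn't match
# one of our configured server names. 444 is an Nginx-specific code that
# closes the connection without sending a response.
server {
    listen 80 default_server;
    server_name _;
    return 444;
}

# The real site answers only for its own hostnames.
server {
    listen 80;
    server_name example.com www.example.com;  # replace with YOUR hostnames
    # ... normal site configuration ...
}
```

With this in place, stray traffic addressed to netflix.com (or anyone else) hits the catch-all block and is dropped before it ever reaches your application.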
A better solution for this problem is to add hostname checks support to ELB itself. I’ve posted a feature request on the AWS EC2 forum with the hopes that it will get implemented.
Ever since Amazon introduced tags in EC2 I felt such a relief that I can name my instances and actually remember which one is which.
It took a while for Name tags to find their way into various parts of the AWS Console; however, connecting to machines still requires looking up the IP address in the console or via the command line using the EC2 API Tools.
I thought that it would be much easier for me and others to utilize the Name tags to connect more easily to the machines.
Initially, the script was a simple bash script that used the ec2-describe-instances command with a filter matching the Name attribute. However, it had to manage various other command line parameters, such as the user name to connect with (Ubuntu images, for example, use the ‘ubuntu’ user instead of ‘root’; the Amazon Linux AMI uses ‘ec2-user’; etc.), so I decided to rewrite it in Python using the magnificent Boto library.
This allowed better argument handling and removed the need to install Java and the EC2 API Tools to access the EC2 API.
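A minimal sketch of the core idea with Boto – look up a running instance by its Name tag and build the ssh command line. The tag filter and connection calls follow Boto's EC2 API; the "web-1" tag value and the default region are assumptions for the example:

```python
def build_ssh_command(user, host):
    """Compose the ssh invocation for a resolved instance address."""
    return ["ssh", "%s@%s" % (user, host)]

def find_host_by_name(name, region="us-east-1"):
    """Return the public DNS name of the first running instance whose
    Name tag matches, using Boto's EC2 tag filters."""
    import boto.ec2  # imported lazily so build_ssh_command works without AWS set up
    conn = boto.ec2.connect_to_region(region)
    reservations = conn.get_all_instances(filters={"tag:Name": name})
    for reservation in reservations:
        for instance in reservation.instances:
            if instance.state == "running":
                return instance.public_dns_name
    return None

# Typical use (requires boto installed and AWS credentials configured):
#   host = find_host_by_name("web-1")   # "web-1" is a made-up Name tag value
#   if host:
#       subprocess.call(build_ssh_command("ubuntu", host))
```

Wrapping the lookup and the ssh invocation separately keeps the user-name logic (ubuntu vs. ec2-user, etc.) in one easily extended place.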
Grab the code here
Don’t forget to send some feedback!