Categories
Amazon Web Services AWS Bucket Policy JSON S3

S3 Bucket Policy to Restrict Access by Referrer, Yet Allow Direct Access to File(s)

Recently Amazon rolled out S3 Bucket Policies (see Access Policy Language) to control access to S3 buckets, or to individual resources within buckets, more finely than ACLs alone allow.  This was very timely, as I had a need arise for a bucket policy just after the feature came out.  Basically I needed to block access to a single file, let's call it xyz.htm, from certain referrers, yet allow all others.  After a little research and some trial-and-error I was able to define a policy which did just this:

{
  "Version": "2008-10-17",
  "Id": "mydomain-widgettest",
  "Statement": [
    {
      "Sid": "1",
      "Effect": "Deny",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::widgettest.mydomain.com/xyz.htm",
      "Condition": {
        "StringLike": {
          "aws:Referer": [
            "http://blockedreferer1.com/*",
            "http://blockedreferer2.net/*"
          ]
        }
      }
    },
    {
      "Sid": "2",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::widgettest.mydomain.com/xyz.htm",
      "Condition": {
        "StringLike": {
          "aws:Referer": [
            "*",
            "http://widgettest.mydomain.com/*"
          ]
        }
      }
    }
  ]
}

However, this had the undesired effect of blocking direct access to the file, i.e. http://widgettest.mydomain.com/xyz.htm, where there is no referrer, or the referrer is null.  This one took me a little longer to figure out, and a key piece of it was found in the Amazon developer forums.  I was then able to write a bucket policy which behaves as desired:

{
  "Version": "2008-10-17",
  "Id": "mydomain-widgettest",
  "Statement": [
    {
      "Sid": "1 - Allow direct access to xyz.htm - i.e. no referrer.",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::widgettest.mydomain.com/xyz.htm",
      "Condition": {
        "Null": {
          "aws:Referer": true
        }
      }
    },
    {
      "Sid": "2 - Allow all referrers to xyz.htm except those listed.",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::widgettest.mydomain.com/xyz.htm",
      "Condition": {
        "StringNotLike": {
          "aws:Referer": [
            "http://blockedreferer1.com/*",
            "http://blockedreferer2.net/*"
          ]
        }
      }
    }
  ]
}

This policy effectively allows direct access to xyz.htm (null or "no" referrer), and allows access from all referrers except those explicitly listed in the Sid 2 statement.  Note there is no explicit Deny: requests from the blocked referrers simply match neither Allow statement, so they are denied by default.  One important note is that "public" read access must not be set in the ACL for this file, as that would allow anyone access, effectively bypassing this policy.

NOTES:

  • Amazon S3 bucket policies use JSON.  If you aren't familiar with JSON, as I wasn't, you can read more here.
  • I found a handy JSON Formatter and Validator, which does just that. . .
  • Since Amazon doesn't provide an easy method for us non-programmers to apply bucket policies, I found CloudBerry S3 Bucket Explorer Pro essential – and simple to use – for applying them.
  • Sometimes as I applied a policy to test I would receive the message "invalid aspen elements," which basically means something is wrong – usually one of the required elements was either missing or incorrect.  Interestingly, Google turned up no results for the message.
Categories
Amazon Web Services AWS EC2

Recover From 120 Day Terminal Services Eval Time Bomb in Windows Servers on EC2

I’ve always been frustrated by Windows messages like, “please see your administrator. . .”  I AM the administrator, I don’t need to see myself, I need useful information to lead me in the right direction to troubleshoot and correct a problem.

Here's a new one that really frustrated me this week.  I have several Amazon EC2 servers, most of which run Windows 2003 or Windows 2008.  Often when I start a server for our development team I will install Terminal Services (120-day eval) so more than two developers can connect at a time with RDP.  Usually those servers are in use for a few weeks to a couple of months.  Every so often they are used over four months.  Well, as that time approaches Windows kindly displays reminders as to how many days remain in the trial.  We see this so often it just gets ignored.

Well, after the 120 days we can’t login to the box any longer, which sucks in and of itself, especially since we can only use RDP (unless we installed something else) to connect to these servers and cannot log on to the console.  The first few times this happened I had to scrap the server and start a new instance.  I have since figured out a work-around. . .

It used to be that after the 120 days was up a nice, informative message was displayed (don’t remember the exact wording) that basically said, “time is up you cheap bastard.  You cannot log in to this server any longer and must pay the mighty Micro$oft.”  Or something like that.

Now for some reason I'm getting the message, "To log on to this remote computer, you must have Terminal Server User Access permissions on this computer.  By default, members of the Remote Desktop Users group have these permissions.  If you are not a member of the Remote Desktop Users group or another group that has these permissions, or if the Remote Desktop Users group does not have these permissions, you must be granted these permissions manually."

When I first saw this new message it scared me.  Recently we had some employees leave under less-than-ideal circumstances.  And while I was careful to disable their accounts on our production servers I missed a couple of the dev servers.  My first thought was that one of these guys had removed my account and all the others from the Administrators group.  After all, that's what the message indicated.  I was able to connect to the crippled server from another EC2 server with Computer Management, where I reviewed the security event logs and found nothing amiss.  I also checked the date on NTUSER.DAT for all users.  Again, no smoking gun.

When the same thing happened to another server last night I began to get more worried.  What in the world was happening?  After some crack investigation on my part I was also not able to find anything on this other server which would lead me to the culprit.

I did discover though that both servers had been started initially about four months ago.  This really got me thinking that perhaps the 120 day terminal services time bomb might be the problem.

As I mentioned earlier, I discovered how to reset this 120 days on a Windows server running on EC2 – image the machine.  While imaging an EC2 server has a couple of annoying side-effects, like resetting the timezone to Pacific time and creating a new certificate, it does whatever is required by Windows to reset the clock to zero for things like the 120-day TS eval.
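
For the record, "imaging" here just means creating a new AMI from the running server.  As a sketch of the command line route (the instance ID, bucket and prefix below are placeholders): an instance-store Windows instance is bundled with ec2-bundle-instance, while an EBS-backed instance can be imaged in a single call with ec2-create-image.

ec2-bundle-instance i-xxxxxxxx -b mybucket -p ts-reset -o <access key id> -w <secret key>
ec2-create-image i-xxxxxxxx -n "ts-reset"

You can do the same thing from ElasticFox or the AWS Management Console if you'd rather not touch the command line.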

Not trying to cheat the system here, just pointing out a way I found to logon to a server I thought was toast.  We are actually done with both of these servers and can terminate them now anyway.

Categories
Amazon Web Services AWS ElasticFox

Copying ElasticFox Tags from One Browser to Another

The ElasticFox Firefox extension allows you to tag EC2 instances, EBS volumes, EBS snapshots, Elastic IPs, and AMIs. ElasticFox’s tags are stored locally within Firefox, so if you use ElasticFox from more than one browser your tags from one browser are not visible in any other browser.  Also, if your browser crashes you may lose your tags – so back them up.

Manually Copy ElasticFox Tags
Use this method to manually copy ElasticFox tags for backup or copy to another machine.

  1. Open about:config (enter about:config in the address field of Firefox).
  2. In the filter field enter the word, “tags.”
  3. Copy entries with data in the value field such as:
    1. ec2ui.eiptags…
    2. ec2ui.instancetags…
    3. ec2ui.volumetags…
    4. ec2ui.snapshotTags…
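
Since these are just Firefox preferences they live in prefs.js in your Firefox profile folder, so you can also grab them all at once from a command prompt.  Here's a sketch assuming a default profile location – the xxxxxxxx.default folder name varies per machine (close Firefox first so prefs.js is current):

findstr /c:"ec2ui." "%APPDATA%\Mozilla\Firefox\Profiles\xxxxxxxx.default\prefs.js" > elasticfox-tags-backup.txt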

See full article, which covers:

  • How to Export ElasticFox Settings
  • How to Import ElasticFox Settings
  • Copying ElasticFox Tags to Another Browser Manually
  • Copying ElasticFox Tags to Another Browser with OPIE
  • Copying ElasticFox Tags to Another Browser with Shell Scripts
Categories
Amazon Web Services AWS ElasticFox

Maximum EBS Volumes on EC2 Windows EBS-backed Instances – EBS Volume Limit

Last week I wrote about The Maximum (EBS) Drives/Volumes for an EC2 Windows Instance.  In that post I discussed that the max I had discovered was 12.  While that is accurate, it is important to understand it was based on "instance-store" (or S3-backed) instances.  Since I've been working recently with EBS-backed Windows (2003 and 2008) instances I wanted to see what they could handle.

I started by creating about a dozen EBS volumes, then using ElasticFox I attached them to my Windows instance.  ElasticFox will auto-assign the device name – xvdh, xvdi, xvdj, and so on up to xvdp.  After that it will begin using xvdg, xvdf, xvde, etc.  After I had 14 total drives (Amazon calls them volumes, Windows calls them drives – therefore I am using the terms interchangeably) attached I received the message, “The request must contain the parameter device” and couldn’t attach any more.

Next I turned to Amazon’s AWS Management Console which displays a list of available devices.  However, it only displays xvdf – xvdp.

Since xvdd – xvdp were already in use, and I reasoned that the root volume was using xvda, I tried to manually use xvdc, and it worked.  I was also able to manually assign device xvdb.
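
In ElasticFox you do this by simply typing over the auto-assigned device name in the attach dialog.  The command line equivalent would be something like this (volume and instance IDs are placeholders):

ec2-attach-volume vol-xxxxxxxx -i i-xxxxxxxx -d xvdc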

At this point I had 16 EBS volumes (drives 0-15) attached to my Windows instance.

I was able to successfully reboot the instance and everything worked fine, unlike when I lost connectivity to the instance-store instance as described in my previous post.

Just for fun I tried device names outside the specified range.  For example, when I tried to use xvdq I received the message, "Value (xvdq) for parameter device is invalid.  xvdq is not a valid EBS device name."

This all makes sense as “p” is the 16th letter in the alphabet.  Therefore, devices xvda – xvdp are available and usable on Windows 2003 and Windows 2008 Amazon EBS-backed instances.

Categories
Amazon Web Services AWS EC2

What is the Maximum Drives for an EC2 Windows Instance? – EBS Volume Limit

Yesterday while I was doing some performance testing on Amazon EBS (Elastic Block Storage) volumes attached to a Windows AMI (Amazon Machine Image) I ran into an unanticipated issue – the maximum number of drives attached to an EC2 Windows server was lower than I expected.  The max connected drives is 12 – this includes both ephemeral drives and EBS volumes.  This is a bit of a surprise, especially since Linux instances are supposed to handle 16.

NOTE: This article is about instance-store instances.  For information about drive limitations on EC2 Windows EBS-backed instances see, “Maximum EBS Volumes on EC2 Windows EBS-backed Instances.”

I haven’t run across this little tidbit anywhere, nor could I find it today when specifically searching for it, so I thought I’d post a few details about my findings.

First off, yesterday I spun up an Extra Large Instance (AKA m1.xlarge) and created a dozen or so 5GB EBS volumes and began attaching them to the instance.  Since I was creating several I used the EC2 command line tool ec2-create-volume:

ec2-create-volume -s 5 -z us-east-1d

In the preceding command the “-s 5” creates a 5GB volume and “-z us-east-1d” creates the volume in the specified Amazon Availability Zone, which by the way, has to match that of the instance to which you will attach the volume.
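
By the way, a quick way to check which zone an instance is in is ec2-describe-instances – the INSTANCE line of the output includes the availability zone:

ec2-describe-instances i-999919f2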

I attached some volumes using ElasticFox. . .

. . . then attached some with the EC2 command line tool ec2-attach-volume:

ec2-attach-volume vol-0d62c264 -i i-999919f2 -d xvdk
ec2-attach-volume vol-0362c26a -i i-999919f2 -d xvdl
ec2-attach-volume vol-0562c26c -i i-999919f2 -d xvdm

Doing this particular task isn’t for the faint of heart as you have to specify the device name (-d xvdm, for example) which has to be unique for each volume attached to a server instance.  You may find it easier generally to use ElasticFox or the AWS Management Console.

Let me take just a moment to point out that, depending on the instance type, you will already have two or more drives.  For example the instance I used here, m1.xlarge, has a 10GB “C drive,” and four 420GB drives, D, E, F & G (by default) in Windows.  In Windows Disk Management these will be disks 0-4.  As you add an EBS volume it will be Disk 5, and so on.

I actually attached five EBS volumes in one fell swoop from the command line, and much to my chagrin I immediately lost connectivity to my instance – I had an RDP session open at the time which immediately quit responding.

Since I lost connectivity to the instance and couldn’t re-establish a Remote Desktop connection I manually rebooted the instance with ElasticFox.  However, this didn’t work.  Initially I thought I had overlapped a device name which the instance couldn’t handle so I detached the five EBS volumes previously attached from the command line and rebooted the instance.  I was overjoyed when I was able to login again.

Next I set about to more carefully attach the volumes, which I did one at a time with ElasticFox.  Again, after attaching the five additional volumes my instance stopped responding.  At this point I wasn’t sure if I had reached the limit of attached volumes, if one or more volumes had some sort of problem, or if someone at Amazon was messing with me.  I had to find out so I did some testing. . .

I started running a continuous ping (tcping actually) to the instance (so I would know if/when it crapped out and when it was back online after rebooting) and set about testing connecting EBS volumes to the instance.  Sure enough, every time I connected too many EBS volumes the instance would hang.  I wanted to test this against instances with more (and fewer) ephemeral drives so I also started up a Small Instance (AKA m1.small) and the mack-daddy of them all, a High-Memory Quadruple Extra Large Instance (AKA m2.4xlarge).  These two instance types come "out-of-the-box" with two and three drives, respectively.
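
If you haven't used it, tcping works like ping but tests a TCP port instead of ICMP, which is handy since EC2 instances often don't answer ICMP anyway.  Something like this kept an eye on the RDP port for me (the hostname is a placeholder):

tcping -t ec2-xx-xx-xx-xx.compute-1.amazonaws.com 3389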


So, with all three server types, m1.small, m1.xlarge and m2.4xlarge running Windows the magic number of (total) drives was 12 before I started having problems.  An interesting note is that you can actually add a 13th drive and everything appears to be fine.  It’s when you add the 14th drive that all hell breaks loose & you instantly lose access to the instance.  Once this happens you have to detach two volumes then forcibly reboot the instance before it starts to respond.  It certainly is good that you can at least regain access.

Remember how I said everything appears to be fine after adding the 13th drive?  Well, appearances aren’t everything. . .  What I found was that although you could connect the 13th drive/volume & the instance seems fine, when you reboot it the instance doesn’t come back online.  I had to detach the 13th drive then forcibly reboot the instance before I could connect.

Another interesting note is that the device names went up to xvdp (which is actually displayed as the highest device letter when attaching volumes in ElasticFox) then started back at xvdf.

(Screenshot: device range when attaching volumes in ElasticFox.)

(Screenshot: attached EBS volumes.)

The bottom line is that through a little work yesterday and today I was able to determine definitively that Windows instances (at least instance-store, or S3-backed instances running Windows 2003 – not sure about Windows 2008 on EBS-backed storage) cannot have more than 12 total drives attached.


Categories
Amazon Web Services AWS ElasticFox

Elasticfox Firefox Extension for Amazon EC2

I’ve been using ElasticFox for a while now and thought I’d jot down a few notes about it.  I don’t use it exclusively to manage my Amazon Web Services (AWS) EC2 instances, EBS volumes, etc., but usually I do go there first.  It’s worth mentioning that I also use the AWS Management Console and EC2 command line tools (both in Windows and Linux).  Typically I keep ElasticFox open as I access it several times a day.

One of the best things about ElasticFox is that you can add "Tags" to instances, EBS volumes, etc.  Tags are your own notes about the instance, like a friendly name.  I really wish Amazon had a field like this that was tied to the instance so it would be available in all tools.  One minor drawback to tags is that they only apply to the particular browser in which they are created.  They aren't available, for example, if you log on to your machine as a different user, or on other machines.

With ElasticFox you can connect to various regions, even with different credentials.  It allows you to manage the following by selecting the appropriate tab:

  • Instances
  • Images
  • KeyPairs
  • Security Groups
  • Elastic IPs
  • Volumes and Snapshots
  • Bundle Tasks
  • Reserved Instances
  • Virtual Private Clouds
  • VPN Connections
  • Availability Zones

Read more about and download the Elasticfox Firefox Extension for Amazon EC2 and enjoy.

See also Copying ElasticFox Tags from One Browser to Another.

UPDATE:

ElasticFox doesn't support versions of Firefox above 3.x, so get Elasticfox (for EC2 Tag), which adds a few more nice features.

Categories
Amazon Web Services CloudBerry EC2 S3 S3.exe

Amazon S3 Command Line Utilities for Windows

I've searched high and low for a good all-around command line utility to interact with Amazon S3 buckets from Windows.  While I'm still searching for just the right utility for me, here are a few which I use from time to time.  Why use more than one, you ask?  Well, since I haven't found just the right one for all occasions I use the one that works best for the particular task at hand.

S3.exe
S3.exe is a Windows command-line utility for Amazon’s S3 & EC2 web services that requires no installation, is a single .EXE file with no DLLs, and requires only .NET 2.0 or Mono, so it will work on a plain Windows installation.

Key Features

  • Efficiently uploads and downloads large numbers of files (or whole directories) between Amazon S3 and Windows PCs.
  • Everything is in one .EXE. Nothing to install or configure, just download it where it’s needed and run it.
  • Doesn’t require anything except .NET 2.0 or Mono.
  • Works well in an automated backup solution or as an ad-hoc system administration tool.
  • Can split large files into chunks for upload without creating any temporary files on disk.
  • Can use HTTP HEAD command to quickly determine which files don’t need to be uploaded because they haven’t been updated (/sync).
  • Support for various EC2 operations as well.
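
I won't reproduce the documentation here, but to give you a feel for it, basic usage runs along these lines.  Treat this as a sketch from my notes – check the tool's built-in help for the exact switches (bucket name and path are placeholders):

s3 auth <AccessKeyId> <SecretAccessKey>
s3 put mybucket c:\backups\ /sync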

CloudBerry Explorer PowerShell Snap-in
CloudBerry Explorer offers a PowerShell extension to manage file operations across Amazon Simple Storage Service (Amazon S3) and the file system.  The CloudBerry Explorer PowerShell Snap-in exposes the majority of Amazon S3 functionality, and you can combine CloudBerry Explorer commands with standard PowerShell commands.  PowerShell is designed to operate on .NET objects, so you are not limited to command syntax – you can write complicated scripts with loops and conditions, and schedule periodic tasks like data backup or cleanup.

#Sh3ll (Amazon S3 command shell for C#)
#Sh3ll (pronounced sharp-shell) is a C# based command shell for managing your Amazon S3 objects.  It is open source and provided by SilvaSoft (click to download #sh3ll and for more information). #Sh3ll is built upon the Amazon S3 REST C# library, and it runs on both .NET 1.1 and .NET 2.0.

Also from SilvaSoft:

  • Sh3ll – Amazon S3 command shell for Java
  • rSh3ll – Amazon S3 command shell for Ruby
Categories
Amazon Web Services AMI AWS EC2 AMI EC2 API ELB tools Linux SSH WGET Windows

Installing EC2 Command Line Tools on Windows

UPDATE (12-2016): See HowTo: Install AWS CLI on Both Windows and Linux for updated information on installing, configuring and using the AWS CLI unified tools.

NOTE: This tutorial contains information for both AMI and API command line tools along with ELB tools. Most users will need the API tools, some the ELB tools, and not many will need the AMI tools.

There are a number of GUI tools for working with Amazon EC2 services, such as ElasticFox, RightScale and the AWS Management Console.  However, often you need to use the command line tools because you want to script a task, or access features that a GUI tool doesn't provide.

There are several guides and tutorials on installing and configuring the command line tools on Linux, but not much for Windows.  So this aims to be THE GUIDE to setting up the EC2 API, ELB and EC2 AMI command line tools on Windows.

Prerequisite
The first requirement is to have Java 5 or later installed.  If you don't already have it, download and install it from here.
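
To confirm Java is installed and on your path, open a command prompt and run:

java -version

If you get "'java' is not recognized as an internal or external command" instead of version information, fix your Java installation before continuing.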

AWS Command Line Tools Directory
I like to organize my programs a certain way so I installed the tools to c:\admin\aws.  You can install the tools wherever you like.  Note, this is where you may store your certificates, the services' API files, etc.

Download Amazon command line tools
I used wget (for Windows) to download the files:

wget http://s3.amazonaws.com/ec2-downloads/ec2-api-tools.zip
wget http://ec2-downloads.s3.amazonaws.com/ElasticLoadBalancing-2009-05-15.zip
wget http://s3.amazonaws.com/ec2-downloads/ec2-ami-tools.zip

Alternatively you could download them directly from your browser:  EC2 API Tools.  ELB Tools.  EC2 AMI Tools.

Unzip all three files.  Each will unzip to a separate directory, usually named with the version number of the tool.  To simplify things I moved all files from their respective locations to the following directories:

c:\admin\aws\ec2-api-tools
c:\admin\aws\ec2-elb-tools
c:\admin\aws\ec2-ami-tools

Retrieve and Store AWS Certificates
Authentication to AWS uses a certificate and private key.  You will have to retrieve these files from AWS.

Logon to the AWS Console and scroll down to the X.509 area.  You may have to create a new certificate.  Once you do, Amazon will provide you with a Private Key File (pk-<your key name>.pem) and a Certificate (cert-<your key name>.pem).

KEEP THESE FILES PRIVATE.  Possession of these two files gives you (or anyone else with them) access to your AWS account.

Configure Environment Variables
You need to configure your command line environment with a few environment variables. 

Method 1
This method is used to launch a command prompt with required settings.  These settings are available only for this session.  If you’d like to configure your system to have these settings available always and system-wide use method 2.

Create a batch file in c:\admin\aws called awsTools.bat.  Edit this file with the following text:

REM Path should contain bin\java.exe
set JAVA_HOME="C:\Program Files (x86)\java\jre6"

REM Paths to the private key and certificate retrieved from AWS
set EC2_PRIVATE_KEY=C:\Admin\AWS\pk-<Insert your key name here>.pem
set EC2_CERT=C:\Admin\AWS\cert-<Insert your key name here>.pem

REM Path to the EC2 API tools, with subfolders bin and lib
set EC2_HOME=C:\Admin\AWS\ec2-api-tools
set PATH=%PATH%;%EC2_HOME%\bin

REM Path to the ELB tools, with subfolders bin and lib
set AWS_ELB_HOME=C:\Admin\AWS\ec2-elb-tools
set PATH=%PATH%;%AWS_ELB_HOME%\bin

REM Path to the EC2 AMI tools, with subfolders bin and lib
set AWS_AMI_HOME=C:\Admin\AWS\ec2-ami-tools
set PATH=%PATH%;%AWS_AMI_HOME%\bin

cls
cmd

Note: Make sure none of the path statements in this file end with a trailing slash.

Configure Environment Variables – Method 2
This method adds the necessary variables to either your profile or system-wide and makes them available anytime you launch a command prompt.  Open the environment variables dialog (right-click on My Computer, select Properties, click the Advanced tab, then the Environment Variables button).  Add the following to either your user account or system variables section, depending on your needs.

  • JAVA_HOME – C:\Program Files (x86)\java\jre6
  • EC2_PRIVATE_KEY – C:\Admin\AWS\pk-<Insert your key name here>.pem
  • EC2_CERT – C:\Admin\AWS\cert-<Insert your key name here>.pem
  • EC2_HOME – C:\Admin\AWS\ec2-api-tools
  • AWS_ELB_HOME – C:\Admin\AWS\ec2-elb-tools
  • AWS_AMI_HOME – C:\Admin\AWS\ec2-ami-tools
  • Add ;C:\Admin\AWS\ec2-api-tools\bin;C:\Admin\AWS\ec2-elb-tools\bin;C:\Admin\AWS\ec2-ami-tools\bin to your Path
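
If you'd rather not click through the dialog, on Vista and Windows 7 you can set these from a command prompt with setx (add /M from an elevated prompt for system-wide variables).  A couple of examples – note that setx only affects command prompts opened afterward:

setx EC2_HOME "C:\Admin\AWS\ec2-api-tools"
setx EC2_PRIVATE_KEY "C:\Admin\AWS\pk-<Insert your key name here>.pem"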

Explanation of System Variables

JAVA_HOME needs to be set to the appropriate path for your machine.

For example, on my (64-bit Windows 7) system java.exe is located at "C:\Program Files (x86)\java\jre6\bin\java.exe" so I set JAVA_HOME to "C:\Program Files (x86)\java\jre6".

EC2_PRIVATE_KEY and EC2_CERT are the locations of the private key and certificate that you retrieved from the AWS website in the previous step.  You could rename the key and certificate for simplification.  If you have multiple AWS accounts, all you need to do is modify these lines to switch between accounts.

EC2_HOME and AWS_ELB_HOME both point to the folders you unzipped the APIs into.  Both folders should have two subdirectories called bin and lib.  The bin folder contains the .cmd files for that API's commands.  You add the bin folders to your path so that you do not have to be in those directories to run the commands.

Now you only need to run the batch file to get a command prompt with the environment variables set.  You could also permanently set these variables and have them available in any command window if you choose (method 2).  If you want to get fancy you could even put in logic to set the paths based on the current directory of the batch file, and then put the folder on a thumb drive and carry it around.

Testing Your Setup
If you run awsTools.bat you should have a command prompt where you can run the EC2 tool.  A simple command to test is “ec2-describe-regions”:

c:\admin\aws>ec2-describe-regions

Results:
REGION  eu-west-1       ec2.eu-west-1.amazonaws.com
REGION  us-east-1       ec2.us-east-1.amazonaws.com
REGION  us-west-1       ec2.us-west-1.amazonaws.com

If you receive an error running this command then you need to go back and verify your installation.
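
To check the ELB tools as well, elb-describe-lbs is a good candidate – it lists your load balancers, or returns nothing if you don't have any yet:

c:\admin\aws>elb-describe-lbs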

UPDATE: Recently I had to change my Amazon access credentials and created a new X.509 certificate.  When I tried to run any commands from the command line I received the message, "Client.AuthFailure: AWS was not able to validate the provided access credentials."  So I just downloaded my new Private Key File (pk-<your key name>.pem) and Certificate (cert-<your key name>.pem), replacing my existing ones, and, voila, I was back in action.

UPDATE (12-2016): See HowTo: Install AWS CLI on Both Windows and Linux for updated information on installing, configuring and using the AWS CLI unified tools.

Commands Documentation
Amazon documentation.


Categories
Amazon Web Services EC2 Linux Windows 2008

Windows 2008 Server on Amazon’s EC2 – a First Look

Within a couple of hours of Amazon's announcement of the availability of Windows 2008 machine images (AMIs) on their EC2 (Elastic Compute Cloud) platform a few days ago I had to give it a try – see my previous post, "Amazon EC2 Now Offers Windows Server 2008 – Finally!"

I used RightScale to locate and launch a Windows 2008 instance.

Now that I know the AMI (ami-5a07e533) I can easily launch instances in the future from the command line using Amazon’s command line tools:

ec2-run-instances ami-5a07e533 -n 1 -g <group1> -g <group2> -g <group3> -k <My AWS Key> -t m1.small -z us-east-1a

Once it was up and running I got the administrator password in ElasticFox, launched RDP & went to work checking it out and setting it up to suit my needs.

Here are a couple of things I noticed:

  • I knew the 10GB "C" drive partition Windows 2003 instances have wouldn't be big enough for Windows 2008 so I started there.  I was pleasantly surprised to see a 30GB partition, however that's all; it didn't have a "D" drive like other instances do (both Linux and Windows 2003 have a 340GB partition, at least on m1.small instances; larger instances have larger data drives).  Needless to say I was a little disappointed the Windows 2008 instance didn't have an additional drive for data.  Guess I'll just have to use EBS (Elastic Block Storage) volume(s).
  • Looks like the Windows 2008 instances are priced the same as the Windows 2003 instances, albeit with a little less disk space – probably have to squeeze a little more $$$ out of us to pay the mighty Microsoft.  When you break it down it could actually cost you quite a bit more for Windows 2008 than 2003.

Windows 2003 small instance: 720 hours/mo. * $.12 = $86.40 per month.
Windows 2008 small instance: 720 hours/mo. * $.12 = $86.40 per month + $36 (to make up for the lost disk space) = $122.40 per month, or about 40% more.

Of course I had to check the Windows Update status & found it needed 14 "recommended" or critical updates, which I promptly installed.  Probably half of these were released two days ago by Microsoft on Patch Tuesday.  But even still, I had hoped the image would be a little more up-to-date.

While the updates were downloading and installing I tweaked my desktop a little so it would be set up the way I like.

Another feature Amazon announced recently, “Booting From Amazon EBS,” is being used by the Windows 2008 instances. This is what enables the larger system partition, or “C” drive. This also enables the ability to “shutdown” the instance, then you can start it back up at a later time & it will pick up where it left off. While the machine is shutdown you won’t be charged for computing resources time, but you will still be charged for the EBS volume(s) on which the server is based.
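
The stop/start operations are exposed in the command line tools too – something like this, with a placeholder instance ID:

ec2-stop-instances i-xxxxxxxx
ec2-start-instances i-xxxxxxxx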

Bundling an Instance Backed by Amazon EBS
One common use case is the desire to make a point-in-time copy of the contents of the root device so that another instance can boot off of that image. Images are typically created for backup purposes or to make clones of the existing instance. Previously, this process on Linux required you to create an image of your instance on the instance itself, and no APIs were available to assist. On Windows, there was an API you could call to create an image of the instance, but you had to make another subsequent call to register the AMI. Now, there is one API for both Linux/UNIX and Windows that allows you to bundle your AMI backed by Amazon EBS and register it.
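
With the command line tools that single call looks something like this for an EBS-backed instance (instance ID and image name are placeholders):

ec2-create-image i-xxxxxxxx -n "win2008-base"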

After setting up and playing with my new EC2 Windows 2008 server for a while I shut it down so I could start it up in the future when I’m ready to dive a little deeper into it. Right now it’s costing me $3 a month to sit there – not bad.

All in all I’m glad Amazon finally supports Windows 2008 & it seems to function just fine. My existing tools, from ElasticFox, to RightScale, to Amazon Command Line Tools, all work with Windows 2008 without any upgrade or modification, which is a definite plus. I was a little disappointed my small instance has 320GB less storage than a small Linux or Windows 2003 instance, which means I’ll have to pay $32/mo. more to get that back – the more I think about it I’m a lot disappointed about the hidden price increase.

Categories
Amazon Web Services AWS EC2 Linux Windows 2008

Amazon EC2 Now Offers Windows Server 2008 – Finally!

I opened my email this morning and much to my pleasure I found this announcement from Amazon:

Amazon EC2 Now Offers Windows Server 2008
Starting today, Amazon EC2 now offers Microsoft Windows Server 2008 and Microsoft SQL Server® Standard 2008 instances in all Amazon EC2 Regions. This new announcement extends Amazon EC2's existing Microsoft-based offerings that include Windows Server 2003 and SQL Server 2005 instances. Like all services offered by AWS, Amazon EC2 running Windows Server or SQL Server offers a low-cost, pay-as-you-go model with no long-term commitments and no minimum fees. Please visit the Amazon EC2 service page for more information on using Amazon EC2 running Windows.

It's about time!

Of course, I had to give it a try.  I accessed my account with Elasticfox and browsed through the images, but didn't immediately find a Windows 2008 image, so I headed over to RightScale, found what I was looking for and immediately launched an instance.  I launched it through RightScale (which I do sometimes anyway) because I was in a hurry to get to a meeting and didn't see the AMI ID.  So I started it and headed to my meeting while it spun up.

Now that I know the AMI (ami-5a07e533) I can launch instances in the future from the command line using Amazon's command line tools:

ec2-run-instances ami-5a07e533 -n 1 -g <group1> -g <group2> -g <group3> -k <My AWS Key> -t m1.small -z us-east-1a


Once it was up and running I got the administrator password in ElasticFox, launched RDP & went to work checking it out and setting it up to suit my needs.

Here are some of the things I noticed:

  • I knew the 10GB "C" drive partition Windows 2003 instances have wouldn't be big enough for Windows 2008 so I started there.  I was pleasantly surprised to see a 30GB partition, however that's it.  No "D" drive like with other instances (both Linux and Windows 2003 have a 340GB partition, at least on m1.small instances; larger instances have larger data drives).  Needless to say I was a little disappointed the 2008 instance didn't have an additional drive for data.  Guess I'll just have to use EBS (Elastic Block Storage).
  • Looks like the Windows 2008 instances are priced the same as the Windows 2003 instances, albeit with a little less disk space – probably have to squeeze a little more $$$ out of us to pay the mighty Microsoft.  When you break it down it could cost you quite a bit more for Windows 2008 than 2003.
    • Windows 2003 small instance: 720 hours/mo. * $.12 = $86.40 per month.
    • Windows 2008 small instance: 720 hours/mo. * $.12 = $86.40 per month + $36 (to make up for the lost disk space) = $122.40 per month, or about 40% more.

Of course I had to check the Windows Update status & found it needed 14 "recommended" or critical updates, which I promptly installed.  Probably half of these were released two days ago by Microsoft on Patch Tuesday.  But even still, I had hoped the image would be a little more up-to-date.


While the updates were downloading and installing I tweaked my desktop a little so it would be set up the way I like.


Next I bundled the instance and shut it down.  I used RightScale for the bundling because their interface is easy to use and does it all in one step.  Now I have my own "customized" image to start from when I'm ready to work with Windows 2008 on Amazon EC2 in the future.

All in all I'm glad Amazon finally supports Windows 2008 & it seems to function just fine.  My existing tools, from ElasticFox, to RightScale, to Amazon Command Line Tools, all work with Windows 2008 without any upgrade or modification, which is a definite plus.  I was a little disappointed my small instance has 320GB less storage than a small Linux or Windows 2003 instance, which means I'll have to pay $32/mo. more to get that back – the more I think about it I'm a lot disappointed about the hidden price increase.