Posts Tagged 'Amazon Web Services'

Image Caching Headers and Amazon S3

A while ago I wrote an article about using Amazon S3 to serve images for my site, cocktailsrus.com. One thing I hadn’t noticed – until it was pointed out by my colleague, Richard Howells, who is very hot on caching and efficiency in general – was that the images were not being served with an expiry header to set client-side caching. That leads to two questions: why should I care, and what do I do about it?

1. Why should I care?

If you don’t set an expiry date, every single request for your page will lead to a request to the server to (a) check whether the image has changed, and (b) download it if it has. That potentially means lots and lots of requests – like this:

Even though they aren’t downloading the images, all those requests take time. By adding a caching policy, we can make sure that those requests never happen. What we need is an HTTP header that tells the client to save the image until some specified date in the future:
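The screenshot from the original post is missing, but the header in question is a single line – the date value here is just an example:

Expires: Mon, 01 Jan 2024 11:11:11 GMT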

With this header in place, a visit to the page (but not a browser refresh: that behaves differently) will lead to many fewer requests:

2. What do I do about it?

Different web servers will offer their own mechanisms for setting cache policies – but if you’re using Amazon S3, you have three options:

  1. Add the policy programmatically when you insert the image
  2. Set caching rules individually through the Amazon AWS console
  3. Use a tool to set the policy on all images in an existing bucket

1. Setting the cache value programmatically:

The Amazon SDK provides the PutObjectRequest object. This has an AddHeader() method that accepts two arguments – the header key and value, both as strings. You could calculate the date as so many days/months/years in the future, but the upshot needs to be something along these lines:

.AddHeader("expires", "Mon, Jan 1 2024 11:11:11 GMT");
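The post showed only that one call, so here is a minimal sketch of how it might fit into an upload, assuming the v1 AWS SDK for .NET – the request setup follows the SDK samples rather than the original article:

// Sketch only: v1 AWS SDK for .NET (Amazon.S3 and Amazon.S3.Model namespaces).
// client is an AmazonS3 instance (see the Serving Images from Amazon S3 post below).
PutObjectRequest request = new PutObjectRequest();
request.WithBucketName("cocktailsrus")    // hypothetical bucket name
       .WithKey(keyName)                  // unique file name for the image
       .WithInputStream(imageStream);     // image data to upload
// Cache for ten years; ToString("R") produces the RFC 1123 date format HTTP expects.
request.AddHeader("expires", DateTime.UtcNow.AddYears(10).ToString("R"));
client.PutObject(request);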

2. Set cache policies directly on individual images:

You can do this via the AWS Management Console. Just select the individual image:


Then add the metadata rule you want and click Save:


3. Use a tool to set caching on an entire bucket:

There’s a free tool – CloudBerry Explorer, which you can download here – that will do this for you. You need to give it your AWS keys, and then you can connect to your buckets. Right-click on the bucket to bring up the Set HTTP Headers option:

Then tell it to add the header you want:

You’ll need to make sure your security policy is set to allow you to make the changes from your machine for this to work. I temporarily removed my policy and re-added it after the changes were complete – and now my images no longer require a 304 check every single time someone visits the page. Since S3 charges by usage, that could add up to significant savings over time as well as the obvious performance advantages.

Kevin Rattan

For related information, check out these courses from Learning Tree:

Cloud Computing with Amazon Web Services

Building ASP.NET Web Applications: Hands-On

Working with SSL at Design Time with IIS Express

One issue that arose from my planned switch to serving images from Amazon S3 was: how to deal with HTTPS?

Hitherto, that’s been a non-issue. I used a relative path for images so the switch between HTTP and HTTPS happened automatically and painlessly. Now, of course, there is a potential problem. I have to use fully qualified paths for the images – and if an HTTPS page tries to serve images over HTTP, the browser will give the end user a warning about mixed content. So what to do?

The solution, of course, is to switch the images over to HTTPS along with the rest of the page. S3 supports HTTPS, so that’s fine – but there are still a couple of questions:

  1. How to manage switching over to HTTPS?
  2. How to test it in the development environment?

I’ll deal with the second one first. One of the nice things about Visual Studio 2012 is that it comes with IIS Express as the development server. That means you can use and test HTTPS/SSL during development. All you have to do is select the web project in Solution Explorer and then set its SSL Enabled property to True in the Properties window.

[Screenshot: the Properties window]

That’s it. Now, if you browse to the alternative URL you have HTTPS. You’ll get a browser warning because the development certificate is self-signed, but the functionality is all there.

So now we can write code to switch to HTTPS for images and test whether it actually worked.

I decided the easiest thing to do was have two configuration settings – one for standard images and one for HTTPS images. Here is the main configuration for the development setup (I left HTTP as relative, and just switched to a full path for HTTPS):

[Screenshot: development configuration]

And here is the configuration transform for deployment to the live setup:

[Screenshot: configuration transform]
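Neither configuration screenshot survives, so here is a plausible reconstruction based on the description above – the key names, port, and bucket URL are all hypothetical:

<!-- Web.config (development): HTTP stays relative, HTTPS gets a full path -->
<appSettings>
  <add key="ImagePath" value="/images/" />
  <add key="SecureImagePath" value="https://localhost:44300/images/" />
</appSettings>

<!-- Web.Release.config transform: both keys point at the live image host -->
<appSettings>
  <add key="ImagePath" value="http://s3.amazonaws.com/cocktailsrus/"
       xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  <add key="SecureImagePath" value="https://s3.amazonaws.com/cocktailsrus/"
       xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
</appSettings>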

Now the question is – where to pick this up? In the current/old version, I set a ViewBag variable inside the base controller’s constructor. I can’t do that now, because I need to find out whether the request uses HTTPS… and the context is not available inside the constructor. So it’s back to the drawing board. I don’t want to repeat the code, and I can’t use inheritance to get what I want… so it’s time for attribute-based programming – in this case, with an action filter. As the View is about to execute, I check whether the Request is over HTTPS and switch the path appropriately.

[Screenshot: code sample – OnResultExecuting]
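The code itself only survives as that screenshot, so here is a sketch of the kind of filter described – the attribute name, configuration keys, and ViewBag property are hypothetical stand-ins:

using System.Configuration;
using System.Web.Mvc;

public class ImagePathAttribute : ActionFilterAttribute
{
    public override void OnResultExecuting(ResultExecutingContext filterContext)
    {
        // Only act on views; other result types don't render image paths.
        if (filterContext.Result is ViewResult)
        {
            // Hypothetical key names matching the configuration sketch above.
            string key = filterContext.HttpContext.Request.IsSecureConnection
                ? "SecureImagePath"
                : "ImagePath";
            filterContext.Controller.ViewBag.ImagePath =
                ConfigurationManager.AppSettings[key];
        }
        base.OnResultExecuting(filterContext);
    }
}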

I’m checking that this is a ViewResult, so I can safely add this as a global filter without having to worry about methods that don’t return views:

[Screenshot: code sample – registering global filters]
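And the corresponding registration in Global.asax.cs – the standard MVC pattern, using the hypothetical filter from the sketch above:

public static void RegisterGlobalFilters(GlobalFilterCollection filters)
{
    filters.Add(new HandleErrorAttribute());
    // Safe as a global filter: the attribute itself checks for ViewResult.
    filters.Add(new ImagePathAttribute());
}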

So now when I switch over to HTTPS, my path switches appropriately, and I don’t get annoying messages about delivering mixed content:

[Screenshot: working SSL]

Kevin Rattan

For other related information, check out these courses from Learning Tree:

Building Web Applications with ASP.NET MVC

Building Web Applications with ASP.NET and Ajax

Configuring Amazon S3 to Serve Images

In my last post, I looked at how you can use the AWS SDK to upload images to Amazon S3 (Simple Storage Service). The only problem is that the storage bucket is private by default. If you try and access your image, even through the AWS interface, you get the following error:

[Screenshot: access denied XML]

You can make individual files public – but that is hardly a practical solution for a web site with dynamically uploaded images. I want all the images to be public by default, not to have to write code to make them public one by one:

[Screenshot: the Make Public option]

Fortunately, it is possible to make your bucket public by default. Right-click on the bucket and select Properties.

[Screenshot: the bucket Properties link]

This will open a new pane showing the permission options. Click on Add Bucket Policy to open the bucket policy window:

[Screenshot: the Properties detail pane]

Then, inside the modal window, click on the AWS Policy Generator link:

[Screenshot: the bucket policy editor]

The generator requires you to fill out a form and then writes a policy for you. The only complications are 1) deciding which permissions to allow and understanding what the options mean (for reading your images you want to expose ‘GetObject’), and 2) knowing your ARN (Amazon Resource Name). It takes the form arn:aws:s3:::<bucket_name>/<key_name>. In my case, that means arn:aws:s3:::cocktailsrus/*.

Here is the form:

[Screenshot: the bucket policy generator]

Once you click “Add Statement” you’re given a button to Generate Policy – and that generates the policy text for you to paste into your bucket policy:

[Screenshot: the generated bucket policy]
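The generated policy only survives as a screenshot; for a public-read bucket, the generator’s output comes out along these lines (the Sid is arbitrary and the version date may differ):

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::cocktailsrus/*"
    }
  ]
}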

Once you’ve saved this policy, clicking on the link in the online tool shows you the image. All the images in your bucket are now public.

Unfortunately, that means EVERYONE can see them – including search engines and other web sites. I know S3 is cheap, but I still don’t like the idea of paying the fees for search engines or other websites showing my images. So how do I stop them?

We can stop the search engines easily enough with a standard robots.txt file in the bucket telling them not to index its contents:

User-agent: *
Disallow: /

Now all we have to do is stop hot-linking, so we’re not paying for someone else’s use of our images. The answer is to refine the policy file so that it only serves images to people who are coming from (in my case) www.cocktailsrus.com. Sadly, the AWS policy generator isn’t much help with this, as it doesn’t include an option to test against the referring web site. But while it’s not in the generator, the policy language does support such a condition. I found the solution here and implemented it on my bucket.

[Screenshot: the refined bucket policy]
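Again, only the screenshot remains, but the refinement amounts to adding a Condition on the aws:Referer key – a sketch:

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "AllowByReferer",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::cocktailsrus/*",
      "Condition": {
        "StringLike": { "aws:Referer": "http://www.cocktailsrus.com/*" }
      }
    }
  ]
}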

Now, if I try and access my Ajax loader image via the link in the AWS online tool (i.e. NOT via cocktailsrus), I get the familiar XML message:

[Screenshot: access denied for a non-referred request]

But if I access it via a test page on cocktailsrus.com, I get the image:

[Screenshot: the working image]

So, the images are now public, but only work if accessed via cocktailsrus.com – and I have a solution that will work when I move over to a web farm, unlike storing images to the file system.

Kevin Rattan

For related information, check out these courses from Learning Tree:

Cloud Computing with Amazon Web Services

Building Web Applications with ASP.NET MVC

Serving Images from Amazon S3

One issue that’s been nagging me as I refactor www.cocktailsrus.com from a RAD site focused on jQuery Mobile to a properly architected site that just happens to use jQuery Mobile is the way I’ve been storing images on the file system. The file system approach is quick and easy and doesn’t clog the database with lots of BLOB data… but it’s also not at all future-proofed for a move to a web farm environment. I don’t want to have to deal with synchronizing files across multiple servers, so what to do?

In the old days (like, maybe last year) I’d probably have bitten the bullet and saved the images into the database. But these days we have so much more choice – and since I’m hosting on Amazon EC2 and using Amazon’s Simple Email Service, the obvious step is to use Amazon S3 (Simple Storage Service).

S3 isn’t just for images – you can store anything you want – but it makes a natural choice for image hosting. (You can also store your private files there if you want – S3 is private by default). With your images in S3, all your web farm servers are saving to the same place so there’s no longer any need to synchronize between servers – and, as usual with AWS, it’s very cheap.

You need to sign up with AWS to get an account. Then pick S3 from the bewildering array of services available.

[Screenshot: Amazon AWS services]

Once you’ve signed up for S3, you can create a bucket (don’t pick a name with a dot in it – among other things, a dot breaks the wildcard SSL certificate if you later serve the bucket’s contents over HTTPS, so it just makes life more complicated later on).

[Screenshot: creating a bucket]

And once the bucket is created, you can use the AWS web interface to upload files.

[Screenshot: uploading files]

So – you have an online storage bucket and you can add and remove files. Now for the next step – doing so programmatically.

The first thing you need is the AWS SDK. The easiest thing to do is install it via NuGet.

[Screenshot: the AWS SDK NuGet package]

The SDK comes with samples, so it’s easy to get up and running. You need to add a reference to the SDK in your project, and then work with the AmazonS3 object, which is created for you by the Amazon.AWSClientFactory.

You can upload files from your hard drive or file streams. In my case, I resize images and create thumbnails from uploaded files so I use the stream approach. Here is my code creating the AmazonS3 object and passing it through to a method that does the actual writing (note the using block – the AmazonS3 object implements IDisposable):

[Screenshot: code sample – creating the AmazonS3 object]
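The code only survives as a screenshot; with the v1 SDK, the calling pattern looks like this – WriteImage is a hypothetical name for the method shown next:

// Requires the Amazon and Amazon.S3 namespaces (plus System.Configuration).
// AmazonS3 implements IDisposable, hence the using block.
using (AmazonS3 client = AWSClientFactory.CreateAmazonS3Client(
    ConfigurationManager.AppSettings["AwsAccessKey"],
    ConfigurationManager.AppSettings["AwsSecretKey"]))
{
    WriteImage(client, keyName, imageStream);
}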

And here is the code doing the actual write to Amazon S3. (The bucketName variable in the code sample is a private static string variable, "cocktailsrus". The keyName is your unique filename for the new image.)

[Screenshot: code sample – PutObject]
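And a sketch of the write itself, following the v1 SDK samples – the method shape is assumed, not copied from the missing screenshot:

// Requires the Amazon.S3 and Amazon.S3.Model namespaces.
private static string bucketName = "cocktailsrus";

private static void WriteImage(AmazonS3 client, string keyName, Stream imageStream)
{
    PutObjectRequest request = new PutObjectRequest();
    request.WithBucketName(bucketName)
           .WithKey(keyName)              // unique file name for the new image
           .WithInputStream(imageStream); // resized image or thumbnail stream
    client.PutObject(request);            // the object stays private by default
}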

(In order for the above code to work, you will also need to have set up two configuration keys – one for your AwsAccessKey, the other for AwsSecretKey).

So now I have a bucket and I have code to write my images to the cloud. I’m all set, right? Well, almost – because there’s the small issue of the bucket being private by default. I’ll deal with that issue and a few other niceties of setting up S3 to serve images in my next post.

Kevin Rattan

For related information, check out these courses from Learning Tree:

Cloud Computing with Amazon Web Services

Building Web Applications with ASP.NET MVC

Sending SMTP Email with Amazon SES and C#

For a while now I’d been meaning to add a ‘contact us’ page to www.cocktailsrus.com and I finally got around to it this week.

Sending SMTP email is very straightforward – but you do need access to an SMTP server. I suppose I could have just turned on the SMTP component in Windows Server 2008, but I’m hosting on Amazon’s Elastic Compute Cloud (EC2) infrastructure and thought there had to be a better way… And there is: the Amazon Simple Email Service. And, as it turns out, you can use it even if you don’t host on EC2.

First, you have to go to Amazon and sign up for the Simple Email Service – which means signing up to use Amazon Web Services. Once you do so, you’re given a sandbox in which to play… or, in this case, send emails. You need to verify the email addresses you send from (to make sure that you’re not an evil spammer), and then you can send test messages either from the online interface or programmatically. Once you’re through with testing, you can ask for production access – and once that’s granted, you can send up to 2,000 emails a day (and up to 1GB of data transfer per month) for free. If you want to send more, you will start incurring costs – but, as ever with AWS, those costs are more than reasonable.

So – the big picture: you get to send lots of emails for free, and someone else is responsible for maintaining and securing the SMTP server.

And best of all – it works when you’re testing on your development machine as well as on the live box. Nice.

The only remaining problem is that you can’t just change the server name in your existing .NET code 😦. Now that you’re using Amazon’s service, you need to download the AWS SDK (it’s available as a NuGet package) and program against the Amazon objects. The following code relies on using statements for Amazon.SimpleEmail and Amazon.SimpleEmail.Model, and also the following keys in the Web.config:

<add key="FromEmailAddress" value="[Enter verified email address here]" />
<add key="AwsAccessKey" value="[Enter your access key here]" />
<add key="AwsSecretKey" value="[Enter your secret key here]" />

Once you have those set, the minimum code to send emails is along the following lines:

using System.Collections.Generic;
using System.Configuration;
using Amazon.SimpleEmail;
using Amazon.SimpleEmail.Model;

public static void SendEmail(EmailViewModel email, List<string> toAddresses)
{
    AmazonSimpleEmailServiceConfig amazonConfiguration =
        new AmazonSimpleEmailServiceConfig();
    AmazonSimpleEmailServiceClient client =
        new AmazonSimpleEmailServiceClient(
            ConfigurationManager.AppSettings.Get("AwsAccessKey"),
            ConfigurationManager.AppSettings.Get("AwsSecretKey"),
            amazonConfiguration);
    // Where the message is going.
    Destination destination = new Destination();
    destination.ToAddresses = toAddresses;
    // Build the HTML body and subject, then wrap them in a message.
    Body body = new Body() { Html = new Content(email.Body) };
    Content subject = new Content(email.Subject);
    Message message = new Message(subject, body);
    // The From address must be one you have verified with SES.
    SendEmailRequest sendEmailRequest =
        new SendEmailRequest(
            ConfigurationManager.AppSettings.Get("FromEmailAddress"),
            destination,
            message);
    client.SendEmail(sendEmailRequest);
}

The secret key is sent securely by default, by the way, so you don’t need to worry about compromising it. There are, of course, other things you can do (like checking the response to see if you had any problems sending the email), but the only complication I’ve had to deal with so far is that I can’t just hit Reply to emails from users. The problem, of course, is that you can only send from verified addresses. The solution is to add a mailto: link inside the email. As problems go, it’s pretty trivial – and very much outweighed by the benefits of not having to set up and use my own SMTP server.
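That response check is a one-liner if you want it – assuming the v1 response shape, where the result is wrapped in a SendEmailResult:

// Sketch: capture the response instead of discarding it.
SendEmailResponse response = client.SendEmail(sendEmailRequest);
string messageId = response.SendEmailResult.MessageId; // set once SES accepts the message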

Kevin Rattan

For other related information, check out these courses from Learning Tree:

Building Web Applications with ASP.NET MVC

Building Web Applications with ASP.NET and Ajax

Cloud Computing Technologies: A Comprehensive Hands-On Introduction

Cloud Computing with Amazon Web Services

Amazon Web Services Tools For Visual Studio

Recently I’ve been needing to make a few tweaks to the hosting for my cocktailsrus project. It’s hosted on Amazon EC2, and until now I have been logging on to the Amazon console whenever I need to make changes. That’s not a great hardship, as it is (unsurprisingly) a well-constructed and user-friendly interface.

But it turns out there’s a better way. Amazon have created a set of tools – the AWS Toolkit for Visual Studio – that integrate with Visual Studio and make it easy to manage your instances directly from within the IDE.

The tools give you a new set of project templates for working with Amazon Web Services (including EC2):

[Screenshot: the project templates dialog]

And you also get a new set of tools for managing your instances. The AWS Explorer window is available via View | AWS Explorer.

[Screenshot: the View drop-down]

When you open it, you need to select the account and region you want to manage:

[Screenshot: the account and region drop-downs]

These only become available after you’ve clicked on the Add Account icon (highlighted in red above). The icon brings up a dialog allowing you to enter your credentials. If you don’t know your secret key, you can get it via the AWS console.

[Screenshot: the credentials dialog]

Once you’ve completed the form, you can manage your instances (including security groups etc.) directly from inside Visual Studio.

Here’s the big picture view of a management screen:

[Screenshot: Visual Studio with the AWS tools]

And here’s a close up (with a few details snipped out for obvious reasons):

[Screenshot: detail of the Visual Studio tools]

The toolkit integrates with both Visual Studio 2008 and 2010, and allows you to manage AWS through a familiar, user-friendly Visual Studio interface – without having to log on to the console every time you want to make a change.

Kevin Rattan

For other related information, check out these courses from Learning Tree:

Cloud Computing Technologies: A Comprehensive Hands-On Introduction

Cloud Computing with Amazon Web Services

