AWS or Azure

History

So, way back circa 2008, I registered for the AWS free tier. Back then I was working in an industry that didn’t have much need for ‘the cloud’; I played with a few Linux VMs during that year, but nothing came of it and my trial expired.

Fast forward two years and Azure was born, at least in public. I was immediately sold and was all-in. I’ve used, abused, and consulted on more Azure projects than I can remember, and any time the subject of AWS came up I dismissed it as an inferior pioneer of cloud tech. I mean, just look at the console; it’s offensive to the eye, is it not?

Fast forward another two years and, while heavily swallowing the PaaS Kool-Aid, I found myself recommending AWS over Azure to a client. Why? Simply because AWS has a managed offering for Oracle, and that particular client did not have the knowledge or appetite to manage their own Oracle server.

This did open my eyes to the possibility that there might be a bit more to AWS than an ugly console. An opportunity presented itself to become AWS certified and I jumped at it; now, as I write this article, I can put the lovely Solutions Architect Associate logo on my business card.


Learnings

So what have I learned about AWS in my quest for certification? Well, the console is not nearly as offensive as I once believed it to be; in fact I think it’s more practical than that sexy-looking new Azure portal. It’s faster to get things done in than constantly sliding those Azure portal blades around the place, that’s for sure. As for feature parity, for the most part both platforms tend to support the same features in the general sense, but once you drill down, differences do start to emerge.

I’ve also decided it’s about high time that I get certified in Azure (underway). This should give me the street cred I need for what I’m going to try to achieve, and hopefully my findings will be as impartial as possible.

Cloud Wars

Starting with my next blog post I’m going to compare features on both platforms and outline the pros and cons of each… Stay tuned for what should be a very interesting blog series. Obviously the topics are vast, so if anyone has any requests please send me an email: b at briankeating.net.

Azure: SQL user invalid from an Azure Website

Problem

I was faced with a problem this morning that took me a good 30 minutes to figure out.

I had created a website and an associated SQL database; however, I changed that database as part of some development work. The problem was that even though my publish profile was overriding the Release connection string with my new database, the override was getting ignored!

[Screenshot: the publish profile overriding the Release connection string]

I knew that the connection string I was supplying was correct as I could log in with Visual Studio and SSMS.

Cause

The reason is that the website already had a connection string (under the Configure tab) and this was taking precedence. It’s there so that you don’t have to store the Azure connection string in the publish profile, which is quite nice; the same goes for a lot of other Azure features.

[Screenshot: the website’s connection string under the Configure tab]
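For context, here’s a hedged sketch of the sort of Web.config entry involved; the DefaultConnection name and server details are hypothetical. An Azure Websites connection string with the same name, set under the Configure tab, replaces this value at runtime, which is exactly the precedence that caught me out:

```xml
<!-- Web.config (sketch): the connection string the publish profile tries to override. -->
<!-- "DefaultConnection" is a hypothetical name; an Azure connection string of the
     same name (set under the Configure tab) wins at runtime. -->
<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;User ID=youruser;Password=yourpassword;Encrypt=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```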

Solution

I removed that connection string and then it worked. (Fixing it properly is another option, but this code is in a private Git repository so it’s not a concern for me just now.)

Writing to an Azure Queue

If you’ve seen my previous post then this post is quite similar; this time, however, I write to an Azure queue and not to a blob.

Code

First of all you need an Azure storage account as before; once this is set up, consider the following code…

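The original code screenshot hasn’t survived, so here’s a minimal sketch of it, assuming the classic WindowsAzure.Storage package and Json.NET; the FxRate POCO and the fx-rates queue name are stand-ins of mine:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

// Hypothetical POCO standing in for the object from the other project.
public class FxRate
{
    public string Currency { get; set; }
    public decimal Rate { get; set; }
}

class Program
{
    static void Main()
    {
        // 1. Connect to the storage account (swap in your real connection string).
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var queueClient = account.CreateCloudQueueClient();

        // 2. Create the queue if it doesn't exist. Queue names must be lowercase
        //    letters, digits and hyphens, or the service returns 400 Bad Request.
        var queue = queueClient.GetQueueReference("fx-rates");
        queue.CreateIfNotExists();

        // 3. Serialize the POCO to JSON and add it to the queue as a message.
        var json = JsonConvert.SerializeObject(new FxRate { Currency = "EUR", Rate = 1.3657m });
        queue.AddMessage(new CloudQueueMessage(json));

        Console.WriteLine("Message added.");
    }
}
```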

What I’m doing in the code above is:

  1. Connecting to my storage account.
  2. Creating the queue if it doesn’t exist (remember, you’ll get a bad request if you don’t name the queue correctly!).
  3. Creating a simple message; I’m using a POCO object from another project and serializing it to JSON.

Did it work?

Let’s use VS2013 U3 to check!

[Screenshot: the Visual Studio Server Explorer]

Open your Server Explorer and select the queue under the storage account you’ve chosen in your connection string, then double-click it.

[Screenshot: the message in the queue, with its dequeue count and expiry time]

Above you see the message added to the queue; you can see how many times it was dequeued and when it’s set to expire. If we use a competing-consumers pattern, that count may be more than 1!

Next

I’m a little torn about my next post. I’ve been writing a post on C# expression trees which is nearing completion; however, to keep in line with the current trend, I think I’ll post how this queue can be read and fed to an Azure Service Bus topic (pub/sub)… stay tuned ;-)

Azure WebJob triggering on BlobUpload

Tonight I’m going to follow up on my previous post, where I promised to show you how to react to someone (or something) uploading a blob.

Please read http://azure.microsoft.com/blog/2014/06/18/announcing-the-0-3-0-beta-preview-of-microsoft-azure-webjobs-sdk/ first, as there is a lot of old information lying about on the web regarding v0.2.0 which will not work in v0.3.0. I’m not ashamed to say it’s now the early hours of the morning before I finally managed to get this working, as most of the documentation I was reading was for v0.2.0.

Let’s get started by creating a console application.

NuGet

[Screenshot: installing the WebJobs SDK prerelease package from NuGet]

Code

In this code you can see that I’m just appending worked… to the input file. The important parts to consider are the BlobTrigger and Blob attributes: the trigger starts processing when a blob updates on the reuters-input container, and the Blob attribute identifies reuters-gdmx as the output container.

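The code screenshot is gone, so here’s a minimal sketch of the function described above, assuming the v0.3.0-beta Microsoft.Azure.WebJobs namespace; the method name ProcessReutersFile is my own:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        // The JobHost scans this assembly for triggered functions and blocks
        // while it waits for work.
        var host = new JobHost();
        host.RunAndBlock();
    }

    // Fires when a blob arrives in (or changes on) reuters-input; the Blob
    // output binding writes a blob with the same name to reuters-gdmx.
    public static void ProcessReutersFile(
        [BlobTrigger("reuters-input/{name}")] TextReader input,
        [Blob("reuters-gdmx/{name}")] TextWriter output)
    {
        output.Write(input.ReadToEnd() + " worked...");
    }
}
```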

There are a few options for running the job: on a schedule, on demand, or continuously.

[Screenshot: the WebJob run-mode options]

For the automatic trigger I’m setting the job to be On Demand; however, I know that in the current version of WebJobs, on-demand blob triggers are polled every 10 minutes (I hope this can be more real-time once WebJobs exits preview).

[Screenshots: creating the WebJob in the portal and setting it to run on demand]

Connection Strings

You need to add two connection strings pointing to your blob storage account (you can get the connection string with the Visual Studio Azure explorer); I’ve set both to the same storage account. These names are new in v0.3.0.

[Screenshot: the connection strings in the app.config]
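For reference, here’s roughly what that app.config section looks like, assuming the v0.3.0 connection string names AzureWebJobsDashboard and AzureWebJobsStorage (the account name and key are placeholders):

```xml
<connectionStrings>
  <!-- Both point at the same storage account in my case. -->
  <add name="AzureWebJobsDashboard"
       connectionString="DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT;AccountKey=YOUR_KEY" />
  <add name="AzureWebJobsStorage"
       connectionString="DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT;AccountKey=YOUR_KEY" />
</connectionStrings>
```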

Test

I’m going to use the Azure Storage Explorer to upload a file; once this file gets uploaded, the WebJob will run and create the associated blob in reuters-gdmx.

[Screenshots: uploading the file with Azure Storage Explorer and the WebJob picking it up]

Output

Here you can see the result of the WebJob (appending worked…).

[Screenshot: the output blob with worked… appended]

Uploading a Blob to an Azure container

[Screenshot: a storage account in Azure containing the ecbfx container]

In the picture above you see a storage account in Azure; in the storage account we have an ecbfx (European Central Bank FX rates) container. Now let’s see how to upload some data to this container using a C# console application.

NuGet

Given that we’re going to work with C#, the best option is to use the .NET client library, which can be retrieved from NuGet.

[Screenshot: installing the Azure Storage client library from NuGet]

Code

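The code screenshot is gone; here’s a minimal sketch of the upload, assuming the WindowsAzure.Storage client library, with eurofxref-daily.xml as a stand-in file name:

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static void Main()
    {
        // Connect to the storage account (swap in your real connection string).
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var blobClient = account.CreateCloudBlobClient();

        // Get a reference to the pre-created ecbfx container.
        var container = blobClient.GetContainerReference("ecbfx");

        // Upload a local file as a block blob.
        var blob = container.GetBlockBlobReference("eurofxref-daily.xml");
        using (var stream = File.OpenRead("eurofxref-daily.xml"))
        {
            blob.UploadFromStream(stream);
        }
    }
}
```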

The code above connects to the pre-created container. Notice that my storage account has built-in geo-redundancy (primary storage in Dublin, secondary in Amsterdam), so after running there will be six copies of this blob: three in Dublin and three in Amsterdam. This is the storage package I’ve chosen.

[Screenshot: the storage account’s geo-redundant replication settings]

View Blob

The easiest way to view the newly uploaded blob is to use the Windows Azure Server Explorer in Visual Studio; it’s also the easiest way of getting the connection string for the storage account.

[Screenshot: the uploaded blob in the Server Explorer]

In the next post I’m going to show you how to react to someone uploading a blob with an automatic trigger.