Posts Tagged ‘Azure’

Azure Blobs is a great, inexpensive, and scalable cloud storage service that you can use to host any static file. In particular, I find it extremely useful for hosting progressive download videos (e.g. .mp4 files). Note: if you need the uber-solution (encoding, adaptive streaming, …etc), check out Windows Azure Media Services, which offers a full range of services for delivering high quality video to a range of different platforms.

While Azure Blobs is very easy to set up and manage, there’s one essential, not-so-obvious configuration setting you’ll want to change before serving progressive download video: byte-range support.

Let’s start with the problem. Out of the box, if you throw a .mp4 file into your Azure Blobs container and point your video player at it, you’ll see the video play and buffer just fine. But what happens when you try to seek to an unbuffered portion of the content?

[image: seeking to an unbuffered portion of the video]

Result: Buffering, waiting, annoyance, time for a coffee break?

[image: the player stuck buffering]

The reason is simple: by default, Azure Blobs does not enable support for byte-range requests. Described with less jargon: in order to jump to a portion of the video that hasn’t downloaded yet, you can either:

1) wait until the download (buffer) catches up to the new position of playback (the result seen above), or

2) the solution: tell the server to skip all the data yet to be buffered prior to the new position (identified below in red) and instead start buffering from the new position.

[image: the data between the current buffer and the new position, shown in red, gets skipped]

Under the hood, this is supported automatically by most modern video players AS LONG AS the server serving the video supports returning byte ranges and returns the response header “Accept-Ranges: bytes”.

The good news is that Azure Blobs does support this; you just need to enable it by setting the default version of the service to a newer one.

The bad news is that this setting is not exposed in the Azure Portal, nor in most tools I’ve used (e.g. CloudBerry).

So how do you change it? Send a Set Blob Service Properties REST API request.

Seriously? Isn’t there an easier way? Well, there are free tools out there to do this for you. For example: Plasma AzureBlobUtility.

BlobUtility.exe -k AccessKey -a AccountName -c ContainerName --setDefaultServiceVersion 2012-02-12

Or, if you’re paranoid like me and don’t like giving the key to your house to a stranger (no offense, Plasma), here are the steps and code to do it yourself in VS2012 in under a minute:

[image: New Project dialog]

Create a new C# project. (I chose a WPF application, which is probably not the best choice, but I’m just going to throw this away when I’m done.)

[image: Manage NuGet Packages menu item]

Add a NuGet package for Azure Storage by right clicking on your project and choosing Manage NuGet Packages.

[image: NuGet package search results]

Search for “azure storage” and install the “Windows Azure Storage” NuGet Package by Microsoft.

Finally, add the following code to your app, replace the account name and secret key with those of your Azure Blobs account, and run it…

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

public MainWindow()
{
    InitializeComponent();

    // Authenticate against the storage account (true = use HTTPS).
    var credentials = new StorageCredentials("myaccountname", "mysecretkey");
    var account = new CloudStorageAccount(credentials, true);
    var client = account.CreateCloudBlobClient();

    // Fetch the current service properties, bump the default REST API
    // version to one that supports byte-range requests, and save.
    var properties = client.GetServiceProperties();
    properties.DefaultServiceVersion = "2012-02-12";
    client.SetServiceProperties(properties);
}

Notice that we didn’t need to specify the container. This change will apply to all containers for that account.

DONE! Now all videos served from your Azure Blobs account will support seeking into the unbuffered area of the video without significant delay.
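If you want to verify the change took effect, here’s a quick sanity check I’d suggest (just a sketch; the account, container, and file names are placeholders). A 206 PartialContent response means byte-range requests are being honored; a plain 200 with the full file means they’re not:

using System;
using System.Net;

// Ask for only the first kilobyte of the video.
var request = (HttpWebRequest)WebRequest.Create(
    "http://myaccountname.blob.core.windows.net/videos/myvideo.mp4");
request.AddRange(0, 1023);

using (var response = (HttpWebResponse)request.GetResponse())
{
    Console.WriteLine(response.StatusCode);               // expect: PartialContent
    Console.WriteLine(response.Headers["Accept-Ranges"]); // expect: bytes
    Console.WriteLine(response.ContentLength);            // expect: 1024
}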

Read Full Post »

This week Microsoft finally revealed its pricing structure for Windows Azure hosting services. Using Azure to host the simplest website in the world costs a minimum of $0.12 / hour. Work out the math: 0.12 * 24 * 30 = $86.40 / month.

While this might sound reasonable to a large organization with tons of traffic, or to anyone currently using Amazon EC2 or Rackspace’s Mosso, it is way out of reach for the majority of developers and organizations who are just trying to create a useful web service or website that could scale on the off chance their idea took off or got mentioned by the press.

Based on this pricing it’s obvious that Microsoft is trying to compete with Amazon and targeting the same market. Nevertheless, I personally had high hopes that Microsoft was actually trying to compete with Google App Engine by offering the first and only affordable and scalable Windows hosting option… which raises the point (in case anyone from Microsoft is listening): if Microsoft wants .NET to compete long-term as a server-side platform (which is essential for Windows to thrive as a server-side OS), someone is going to have to solve this problem soon, or .NET will find itself playing catch-up.

I love Windows Azure and I believe it is a great, simple and affordable option for the big boys. But as Windows Azure leaves beta and the world says hello, I say goodbye before I have to start coughing up ~$100/mo for my personal websites. Back to shared hosting at GoDaddy ($4/mo for Windows + SQL).

Read Full Post »

I just finished one of the coolest and most exciting apps I’ve ever written and the client-side was done in 100% native Silverlight 2. Fortunately, I was able to get it done just in time to debut for the NewCloudApp Windows Azure contest.

It’s an online jigsaw puzzle and it was as fun to write as it is to play. You can choose from over a hundred images or use your own photo, select practically any number of pieces (I recommend 12 to start), and even send puzzles to your friends.

Click the link below to check it out and tell your friends if you like it!

PuzzleTouch Online Jigsaw Puzzles

I can’t wait to write more about why I chose Silverlight for this project (really, why Silverlight was the only platform up for the job) and all the new things I learned along the way. But for now, go forth and play puzzles.

Read Full Post »

I’ve tried to avoid commentary blogging, but in this case I can’t help offering a few opinions and predictions on the 3 biggest scalable hosting options for Windows servers today.

Over the last year I’ve spent a lot of time thinking about scalable hosting and working with both Mosso Cloud Sites and Amazon EC2. I’ve recently also been able to get down to business with Windows Azure hosting and have developed some opinions about how the three stack up against each other. This is by no means meant to be an exhaustive comparison, merely my impressions from a 30,000-foot view of the choices for scalable Windows hosting today.

Control: Amazon EC2 offers the most control. Aside from the internal guts of EC2 and the ability to deploy multiple instances, Amazon EC2 is for all practical purposes a dedicated server. You control every aspect of it and are responsible for it as you would be for any other dedicated server. The upside is that if you need to install 3rd-party software on your hosting environment, impersonate users, or do any number of other rare and unusual things that you can’t do on anything but your own box, you have all the control you could want with EC2. The downside, of course, is that you have a lot of rope to hang yourself with, and Amazon isn’t going to come to your rescue when you do.

Ease: Mosso and Azure definitely share the prize here. Who wants to worry about installing security patches, deploying new server instances, user permissions, …etc.? As a developer first and an IT administrator second, I want to spend my time developing, not configuring and maintaining servers. Let the Mosso staff (same company as Rackspace) or the folks at MS (the company that wrote Windows Server) do this work for you. Upload and scale without worrying about much more than your code.

Price: Both Mosso and Amazon EC2 are around $100/mo. Windows Azure pricing has yet to be announced. The big question in my mind is whether Azure will be like Amazon S3, where if you have super low usage you get charged practically nothing (with S3, I once saw a bill for literally 1 cent!), or whether the pricing is going to be like Amazon EC2, where hosting a site that is practically never hit still costs a minimum of almost $100/month.

Conclusion: Why use EC2 unless you need the extra control? (Which, by the way, I did: I needed the control and settled on EC2 for my latest project.) But as time goes on, as we rely less and less on legacy code, as server component vendors start using best practices to write their components, and as web services become standardized, more control will be necessary less often. Amazon may eventually offer something that competes with Mosso and Azure in terms of ease of use, but for now the two stand alone in this niche, rivaled only by shared hosting (which of course lacks the scalability).

Azure has the potential to dominate the market, but this will all depend on pricing. Azure can go down the road of Mosso and EC2 and compete purely on features, or it can be priced in a way where you pay for only the horsepower that you use, opening the cloud computing floodgates to those with tight budgets. This won’t matter to many medium and large companies who need at least one dedicated server, but it matters tremendously to everyone else who needs less than a server to host their applications and sites. If MS wants this audience (presumably the majority of those that need hosting), it must make its price scale like its hardware.

Some examples of who would be left out if MS takes the EC2/Mosso pricing model: imagine an entrepreneurial developer with a great SaaS idea. Either they keep their costs low until it proves itself by hosting at a place like GoDaddy, or they shell out $1200/year to make sure it can scale. Or how about the local restaurant that isn’t shooting for an international web presence? GoDaddy can get you hooked up with ASP.NET and SQL for $4/month, last time I looked. Why would you ever pay 25 times that for something like Mosso or EC2? And last, imagine the programmer who writes a simple little web service to perform some small function for an in-house app. Either they piggyback on some other server (probably one being used for mission-critical functions) or they create a special server for themselves. If you’re a developer, ask yourself: how many times have you wondered or asked, “which server should I put this on?”

Here’s a chance for MS and Azure to really change the world of software and cloud computing. By choosing a pricing model that scales at the low end, they could essentially eliminate cost as a constraint to launching an application in the cloud. Never again would a developer abandon an idea because it costs too much to make sure it could scale. If they go with the Mosso/EC2 pricing model, where you get charged a relatively large amount for having a nearly idle server, then the majority of programmers will be left to suffer traditional, unscalable shared hosting a little longer, while those of us with bigger budgets will have 3 great choices for scalable Windows hosting.

Read Full Post »

I finally found some time to try out my preview account on Windows Azure and the new January CTP of the SDK and VS tools, and thought I’d share some of my impressions and some hurdles I ran into while getting up and running.

1) To debug your application locally you need to be running a local instance of IIS, which (I didn’t realize until trying to run my project in VS) I hadn’t actually added to Windows. I guess I’ve been so spoiled by VS.NET’s built-in localhost server that I haven’t needed a local instance of IIS until now. I remember the days when installing IIS was one of the first things I did after installing Windows. Looks like a return to those days.

2) To run the development fabric (the thing that allows you to simulate and debug Windows Azure on your workstation), you have to run VS.NET as an administrator. So far I’ve forgotten every time I’ve gone into my project, and I’m sure it won’t be the last. It’s kind of a bummer when you launch VS, load your project, hit F5, and… argh! I have to start all over. Yes, I get impatient when it comes to repeating my own mistakes 🙂

[image: choosing ‘Run as administrator’]

Note: Turning off UAC does not eliminate this.

3) I ran my app locally and all I got was a blank white page: no SL app and no error. Fortunately, I’ve run into this more than once on Windows Server, so the problem and solution were still lingering somewhere in the back of my head: the .xap MIME type wasn’t added to IIS. Once I realized this, a quick search on Google yielded the solution, and a minute in IIS was all it took to move to the next problem…
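(If you’d rather script the fix than click through IIS Manager, something like the following should also work. This is just a sketch using the Microsoft.Web.Administration assembly, run elevated, and the site name is an assumption:)

using Microsoft.Web.Administration;

// Register the Silverlight .xap MIME type for the site.
using (var manager = new ServerManager())
{
    var config = manager.GetWebConfiguration("Default Web Site");
    var mimeMaps = config.GetSection("system.webServer/staticContent").GetCollection();
    var xap = mimeMaps.CreateElement("mimeMap");
    xap["fileExtension"] = ".xap";
    xap["mimeType"] = "application/x-silverlight-app";
    mimeMaps.Add(xap);
    manager.CommitChanges();
}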

4) Next, I added a reference to my WCF service via the ‘discover’ feature in service references, and it was added as http://localhost:12404/Service1.svc. However, the Azure development fabric actually runs the app under http://127.0.0.1:81/. It only took a quick glance at the address bar in IE to discover this and realize that my service was probably running on port 81 too. Changing the service URL in ServiceReferences.ClientConfig was all it took.
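(Alternatively, you can override the address on the generated proxy in code instead of editing the config. Here’s a sketch, where Service1Client is whatever name ‘Add Service Reference’ generated for you:)

// Point the generated proxy at the development fabric’s address.
var proxy = new ServiceReference1.Service1Client();
proxy.Endpoint.Address =
    new System.ServiceModel.EndpointAddress("http://127.0.0.1:81/Service1.svc");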

5) Last, I received HTTP error 403.3 when trying to hit my local .svc file. This time I was prepared because of the “xap incident” (#3 above): I needed to add a MIME type for .svc files as well. As with the .xap file extension problem, it only took a few seconds on Google and I was up and running with the fix.

Finally, I was in business running locally and ready to deploy! I wanted to see my app and service running in the cloud… no time for reading documentation, right!? Well, the publish experience for Azure was made for people like me. I right-clicked on my startup project, chose ‘Publish’ not entirely sure what to expect, and was pleased to find the whole process very intuitive. Up came a web page to upload your package (.cspkg) and configuration (.cscfg) files to, along with a window showing the folder where those two files resided.

Simply upload the two files, start your server instance (staging or production), and away you go. Publishing wasn’t quite as easy as publishing to an FTP site, but I had no trouble figuring out what to do, and in no time I had my app running in a staging environment and, moments later, running from my vanity URL. Very cool! There was a little confusion for a few moments because, even after the management console reported my instance as “Started”, it still took a minute or two before it worked in my browser. In the words of Axl Rose and Yoda, I just needed a little patience.

All in all, I was a little disappointed with the experience in Visual Studio and worry about the first impressions of those not as familiar with VS development. Then again, VS.NET 2008 was out the door long before Azure hit the scene, so I’d expect a little retrofitting to be required to get VS to play nice with Azure and the development fabric. Hopefully in VS2010 it will all be as integrated as ASP.NET apps are in VS today.

P.S. You can see the fruits of my labor on my previous post where I created an application to peer into Silverlight’s BrowserInfo and ASP.NET ServerVariables collection.

Read Full Post »

ASP.NET allows you to get at some great information about the client and the server via the HttpContext.Current.Request.ServerVariables collection. Likewise, Silverlight allows you to get at a few local variables of its own through the System.Windows.Browser.HtmlPage.BrowserInformation object.

But, to use these variables, we often need to know what kind of values to expect. For example, let’s say you’re going to create a condition based on which browser the user is using. You would use BrowserInformation.Name. But Name is a string, not an enum. So what are the various values that this property can return? This might be documented somewhere for the officially supported browsers, but the only foolproof way is to actually try it: write a dummy Silverlight app that spits out the variable and run it in all the different browsers to see what comes back. The same applies to ServerVariables, but even more so: it’s just a big dictionary, so you don’t even know which variables are going to be present, let alone what their values will be.
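To give you an idea (a rough sketch of my own, not the actual source of the utility below), dumping the two collections looks something like this:

// Server side (ASP.NET code-behind): dump every ServerVariable.
foreach (string key in HttpContext.Current.Request.ServerVariables.AllKeys)
{
    Response.Write(key + " = " +
        HttpContext.Current.Request.ServerVariables[key] + "<br/>");
}

// Client side (Silverlight): read the BrowserInformation properties.
var info = System.Windows.Browser.HtmlPage.BrowserInformation;
string browser = string.Format("{0} {1} on {2}",
    info.Name, info.BrowserVersion, info.Platform);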

Here’s a utility I wrote that will help you look at all the BrowserInfo properties and ServerVariables. Hit this page from any machine to see what values it is sending up to the server. Bookmark it; it will probably come in handy someday when you’re scratching your head wondering what user agent you’re sending up to the server.


BrowserInfo and ServerVariables

Another cool part is that it not only shows you which ServerVariables are available at the time your web page is requested, but also which are available when you hit a WCF service from Silverlight. There are some subtle differences.
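For what it’s worth, getting at ServerVariables from inside a WCF service requires ASP.NET compatibility mode, so that HttpContext.Current is populated (a sketch below with placeholder service names; it also needs aspNetCompatibilityEnabled="true" in web.config):

using System.ServiceModel.Activation;
using System.Web;

[AspNetCompatibilityRequirements(
    RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
public class Service1 : IService1
{
    // Returns the names of the ServerVariables visible to this WCF call.
    public string[] GetServerVariableNames()
    {
        return HttpContext.Current.Request.ServerVariables.AllKeys;
    }
}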

Also note that it’s hosted on Azure, so you can also get a glimpse of which ServerVariables Azure provides access to. At first glance it looks the same as Windows Server, but I haven’t done a variable-by-variable comparison.

Enjoy, I hope this comes in handy!

Download the source code here to see how it works or to host on your own server.

Read Full Post »