Backing up SQL databases to Azure

I recently read a blog post by Pinal Dave about how you can back up straight to Azure Storage. The procedure he described is only available for SQL Server 2014 or later.

I won't go into detail on this method as Pinal describes it better than I can, but the basics of it involve setting up credentials and then running a backup command that includes the URL of the storage container on Azure.

Unfortunately I am running SQL Server 2005, so this process will not work for me, but it did get me thinking about what other options might be available.

The next thing I tried was the Microsoft SQL Server Backup to Microsoft Azure Tool. Unfortunately I could not get this tool to work correctly on my setup, although it sounds like a flexible one that allows compression and encryption of your backup files. The tool redirects your backup files to Azure Storage, so even if I had got it working it would not have been an ideal solution, as I want local copies of my backup files as well.

After this I started to look at PowerShell again. Following on from my recent success with PowerShell, I already knew how to connect to my Azure account, so all I needed to script was copying a file from my server to Azure.

Get-ChildItem *.bak -File -Recurse | Set-AzureStorageBlobContent -Container $DestContainer -Force

This command gets all the backup files in a directory (the Recurse switch looks in subdirectories as well) and pipes them to the Set-AzureStorageBlobContent command, which uploads them to the storage container defined in $DestContainer. I have added the Force switch so that it will replace any files on Azure which have the same name.
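For completeness, here is a minimal sketch of how the surrounding script might look, assuming the Azure PowerShell module is installed and your subscription has already been imported. The storage account name, key, container and backup path are all placeholder values.

# Placeholder values - replace with your own storage account details
$StorageAccount = "mystorageaccount"
$StorageKey = "<storage account key>"
$DestContainer = "sqlbackups"

# Build a storage context so the upload knows which account to talk to
$Context = New-AzureStorageContext -StorageAccountName $StorageAccount -StorageAccountKey $StorageKey

# Upload every .bak file under the backup folder, overwriting existing blobs
Get-ChildItem "D:\Backups\*.bak" -File -Recurse | Set-AzureStorageBlobContent -Container $DestContainer -Context $Context -Force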

I have only been using this script for the last few days, but so far it has been working well. Now if I completely lose all data from the office I can restore from any other location using the data saved on Azure. A great improvement to my disaster recovery policy.

Copying settings to an Azure Website

The Software as a Service (SaaS) website that I work on has been sold to lots of clients now, which is great news.

However, the more Azure websites we have, the more websites we have to administer, especially if, like us, you take advantage of Traffic Manager, which requires multiple websites in different regions. Azure has some great options for making this administration easier. One job is adding all the settings in the Azure portal; so far I have been adding these manually, but a quicker way is to use PowerShell.

PowerShell

PowerShell is everywhere these days. You can use it to control servers, Active Directory and Exchange. So it is no surprise that you can use it to control Azure.

Open a PowerShell window and run the following command.

Get-AzurePublishSettingsFile

This command opens an IE window in which you can log in to Azure and download a file containing the settings PowerShell needs to manage your subscriptions. Save the *.publishsettings file and run the following command.

Import-AzurePublishSettingsFile "C:\MyPublishSettings\mysubscriptions.publishsettings"

This imports your Azure settings so that PowerShell can do clever things.

Select-AzureSubscription -Default "mysubscription"

This selects which of your Azure subscriptions to use. Now run the following to define the settings in PowerShell.

$s = @{"DebugEmailAccount"="test@example.com";"SiteWarningBannerText"=""}

And finally run the following to apply these settings to the Azure website you specify.

Set-AzureWebsite azure-websitename -AppSettings $s
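To check that the settings have been applied, you can read them straight back. A quick sketch, using the same placeholder website name:

# Read back the app settings to confirm the import worked
(Get-AzureWebsite azure-websitename).AppSettings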

Sounds easy, doesn't it? Well, it is. The hardest part is getting the settings into the correct format to be imported, but this is only string manipulation.

For my project I already have a build script which populates a settings.config file with all these settings, so I have just duplicated this to create a settings.config.importtoAzure file. Next time I have a website to create, I can create it on Azure and run the above script, pasting in the settings file that my build has already produced for me.
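If the settings file is standard appSettings XML, even the string manipulation can be scripted away. Here is a minimal sketch, assuming the file contains <add key="..." value="..."/> elements; the file and website names are the placeholders from above.

# Read the settings file produced by the build
[xml]$config = Get-Content "settings.config.importtoAzure"

# Build the hashtable from each add element
$s = @{}
foreach ($setting in $config.appSettings.add) {
    $s[$setting.key] = $setting.value
}

# Push the settings to the website
Set-AzureWebsite azure-websitename -AppSettings $s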

This only scratches the surface of what you can do with Azure and PowerShell, hopefully I will do far more in the future.

Visual Studio

I recently replaced my installation of Visual Studio 2013 with Visual Studio 2015 RC.

I like the new version. I am not a Visual Studio expert, so it will probably take me a while to find all the good stuff, but here are some initial thoughts.

As my MSDN subscription is still valid, I have installed the Professional version to take advantage of its extra features like CodeLens.

One of the first things I spotted was that the integration with Azure has been improved. In the last version it was difficult to sign in to Azure with the correct credentials, especially if you had more than one Azure account. Now you can add multiple accounts and subscriptions.

CodeLens is a feature that has been around since Visual Studio 2013, but in 2015 it is available in the Professional edition, meaning more people have access to it.

CodeLens gives coders useful information at a glance. Above each class or method it lists how many references there are. If you click on the number of references you can see where that class or method is referenced in the rest of your code. It is useful to be able to see which methods or classes are not being used.

Next, CodeLens shows who (according to Git) last changed the class or method and how many days ago that was. Clicking on it shows a cool graph of when changes have happened and who made them.

Next you can see the number of changes that have been made, basically a source control history, but without having to load up your Git client.

For code that doesn't contain classes or methods, such as T-SQL, the last two pieces of CodeLens information appear at the bottom of your code window to help you track down what changes have happened recently.

The last new feature that I have noticed is the Light Bulbs that keep showing up all over my code. I think the Light Bulbs might be called Quick Actions, but whatever they are called, they are suggestions on how to improve your code. So far they have suggested that I get rid of using directives that are not needed, simplify a fully qualified name and drop unneeded this keywords. I am sure more will pop up as I do more coding.

I like these improvements to Visual Studio, and I am sure there are many more that I haven't noticed. I expect support for the vNext .NET Framework is also in there somewhere, which hopefully I can play around with soon.

As A Service

In cloud computing there are a lot of terms that end in aaS, short for As a Service. Most of these I hadn't heard of until I started writing this list.

Any service that is delivered over the internet, instead of hosted locally on your network or PC, could in theory be described as an As a Service offering.

PaaS Platform As a Service

This is one of the big ones. Microsoft Azure provides a Platform as a Service which I am familiar with. Platform as a Service is where a provider supplies a platform on which you can build apps or websites.

SaaS Software As a Service

Another popular one. Software as a Service can be as simple as a website that runs a service that a customer wants to use. I work for a company that provides a SaaS product.

IaaS Infrastructure As a Service

The last of the big ones. A good example of IaaS is a Virtual Machine, which can either be hosted on a server somewhere (a private cloud) or on the internet via a provider such as Microsoft Azure (a public cloud). The Pizza as a Service diagram illustrates the differences between SaaS, PaaS and IaaS.

NaaS Network As a Service

This is just a type of IaaS that specializes in providing networking. Anything that provides network connectivity could be included in this category.

CaaS Communications As a Service

Another subtype of IaaS, this time specializing in communications; this could include Voice over IP or other similar technologies.

MONaaS Monitoring As a Service

If you have a SaaS, PaaS or IaaS you will most likely want to monitor that it is working; I certainly do. This is something that is often included in your IaaS or PaaS package. Azure has various tools for monitoring which could be included in this category.

BaaS Backup As a Service

With the growth of cloud storage and the decrease in its cost, backing up to the cloud is a very attractive option. Any service that allows you to back up and restore from the internet can be included in this category. Your provider needs to manage your backups for it to be truly BaaS rather than just another place to store your files.

DaaS Desktop As a Service

This is where your desktop is virtualized and stored in the cloud. I know very little about this as I have never used it, but I would imagine you need a strong internet connection for it to work reliably.

DBaaS Database As a Service

This is a simple one: if your database is hosted in the cloud, like Azure SQL Database, then it fits into this category. If you run your own SQL Server installation on a VM then it doesn't fit here, as you are still managing it yourself, which makes it IaaS.

HaaS Hardware As a Service

This is another subcategory of IaaS, one which concentrates on hardware.

IDaaS Identity As a Service

This is where the management of who you are is handled in the cloud. Single Sign-On can be achieved by a website delegating the job of verifying that you are who you say you are to a particular IDaaS. Azure Active Directory is an example of this.

STaaS Storage As a Service

You get the idea by now: storing files on a remote cloud service is an example of Storage as a Service. Dropbox and OneDrive are good examples of this.

FoaaS F Off As a Service

.NET Rocks! mentioned this a few weeks ago; it is a joke As a Service: http://foaas.herokuapp.com/ The idea is that you can use this service to tell people to F off.

Writing this blog post has given me a better understanding of all the aaS terms that are out there. I am sure I haven't explained some of these very well, and no doubt I have missed some out.

Monitoring Screens

We all know that it is important to monitor your servers and services so you can spot issues before they become problems. I personally have spent a lot of time configuring Nagios to email me about issues, and I have recently been configuring various alerts in Azure.

My old boss has this idea that I should have a big monitor screen displaying all the vital stats of my servers and services. I personally disagree with this idea and think that notifications on my phone and email alerts are sufficient. He will no doubt correct my thinking when he reads this, but I believe part of his thinking is to make the monitoring of your infrastructure more visible and make it obvious to anyone that walks past that you have your eye on everything.

For the purpose of this blog post let's assume he has convinced me and I have convinced my actual boss to spend money on the required technology to do this (no easy feat). What exactly would I display on this screen?

I have a Google Chromecast that I use for streaming various things to my TV. This is a relatively cheap bit of technology that could allow a TV or monitor to display a web page with the required stats.

The two main sources of information that I want to display are New Relic, for monitoring my Azure websites, and Nagios, for monitoring my internal servers. New Relic allows you to easily export live performance data as iframes, so I quickly threw together a web page full of these graphs. However, if you have a static screen on the wall you don't want to have to scroll to see different information, so I needed to come up with another way to display it.

My first thought was a slideshow. There are lots of JavaScript scripts that cycle through a series of images like a slideshow; one of these could be adapted to cycle through a series of iframes and display everything I want.

My script goes something like this and requires jQuery. First of all the script waits for the page to load completely with the ready function, then defines the URLs which will be put into the iframe one at a time and counts how many of them there are. It loads the first URL straight away, then loops round, changing the iframe's src attribute every few seconds; in my example it changes every 9 seconds, but in production you may want to increase this. It assumes the page contains an iframe with the id frame.

<script type="text/javascript">
// Assumes the page contains <iframe id="frame"></iframe> and jQuery is loaded
$(document).ready(function () {
    // The pages to cycle through - replace with your dashboard URLs
    var locations = ["URL1", "URL2", "etc"];
    var len = locations.length;
    var iframe = $('#frame');
    var i = 0;

    // Show the first page immediately rather than waiting for the first tick
    iframe.attr('src', locations[0]);

    // Every 9 seconds move on to the next URL, wrapping back to the start
    setInterval(function () {
        iframe.attr('src', locations[++i % len]);
    }, 9000);
});
</script>

Now, what information should be included in a script like this? Showing too much performance data can be almost as bad as showing none at all, as problems get drowned out in the noise. For me it is the performance of my websites, followed by Nagios problems, the Azure status page, memory usage of all my servers and, lastly, the number of connections to my databases. Another question to consider is what time scale to graph over: too long and you don't see what is happening now, but too short and you may worry unnecessarily about an intermittent issue.

Azure Traffic Manager

I have spent most of the day tweaking my Azure websites. Lots of fun!

Unfortunately, last week Azure had some problems and many websites running in the North Europe data centre were unavailable for several hours. And, you guessed it, my websites were hosted there.

All hosting providers are going to have downtime from time to time, and this is just something you have to take on the chin. The important thing to do in times like these is to communicate with your customers about what is going on and show that you are doing everything you can to restore service.

However Azure has some amazing features that you can configure to help manage when downtime occurs.

Azure is Microsoft's global cloud platform. And it really is global: there are data centres in North Europe, West Europe, Brazil, Japan, two more in Asia and five in the US. In the event of problems it is highly unlikely that more than one of these would go down at once. If all of them are unavailable, I expect planet Earth is facing some kind of cataclysmic event and the fact that my website is down is not a priority.

To take advantage of these multiple data centres, Azure has something called Traffic Manager.

Traffic Manager has various settings but I am using it in failover mode. This means that if one website goes down, the next one is used.

All you need to do is create a Traffic Manager profile, add two or more websites to it (called endpoints) and choose a page to be monitored so Azure knows which websites are up and which are down.
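This can be scripted too. Here is a minimal sketch using the Azure PowerShell cmdlets; the profile name, domains and website addresses are placeholders, so adjust them to your own setup.

# Create a failover profile that monitors each endpoint over HTTP
# (names and domains here are hypothetical placeholders)
$tmProfile = New-AzureTrafficManagerProfile -Name "MyTrafficManager" -DomainName "mywebsite.trafficmanager.net" -LoadBalancingMethod "Failover" -Ttl 30 -MonitorProtocol "Http" -MonitorPort 80 -MonitorRelativePath "/"

# Add the primary and secondary websites as endpoints, in failover order,
# then save the profile - the changes only take effect once it is set
Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $tmProfile -DomainName "mywebsite-north.azurewebsites.net" -Type "AzureWebsite" -Status "Enabled" | Add-AzureTrafficManagerEndpoint -DomainName "mywebsite-west.azurewebsites.net" -Type "AzureWebsite" -Status "Enabled" | Set-AzureTrafficManagerProfile

In failover mode the order the endpoints are added matters, as traffic goes to the first healthy one in the list.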

If you are using SSL or custom domain names, there are a few extra steps to take. Your custom domain name needs to point at the Traffic Manager, not the individual websites. The websites themselves then have three domain names: the Traffic Manager address, the Azure address and the custom domain name. The SSL certificate can then be assigned to each website that you have added to the Traffic Manager.

That was easy, wasn't it? Now if a website goes down, Traffic Manager will use the next one. While testing this, the transition to the next website was almost immediate. I did notice that if you had a browser with the website already open during a problem you sometimes got an error page; I think this was probably due to browser caching, and reopening the tab or browser fixed the issue.

Cloud Computing

I work for a company that provides a cloud computing website, but what is Cloud Computing?

Wikipedia defines it as “Cloud computing involves deploying groups of remote servers and software networks that allow centralized data storage and online access to computer services or resources.”

Erm. What does that mean? Well, in simple terms it means storing your data on a remote server. Any time you submit information to a website, you are submitting your information to the "Cloud".

But there is more that the Cloud can do for you. The search giant Google, which also develops the Android mobile phone operating system, has developed a range of cloud computing products. The Google Drive product allows you to upload and even create files, and these files are saved into the Cloud; once there, they can be accessed from any device that has an internet connection. No longer do you run the risk of your favourite file being on a computer that won't turn on or on a dead hard drive: save it to the cloud and you will always have access to it.

Google isn't the only one that does this. Microsoft has OneDrive, which it is incorporating into the heart of its operating system, and Apple and many other companies have released similar products.

This age of cloud computing allows a clearer separation between your data and your device. It won't be long before, whatever device you are using, your files will be there to be worked on. You are in the office working on a spreadsheet and you save it to the Cloud; you can then continue working on it on your tablet while on the road. Just before bed you want to check whether you added something to that doc, so you quickly check your phone. How cool is this!

But if I put my system administrator hat on, cloud computing goes far further than that. Microsoft Azure allows many of the services you would typically have running on your own server to be hosted in the cloud. This has many advantages and allows you to concentrate on other problems while leaving the cloud to look after these services.

Looking back at 2014

As today is New Year's Eve I thought it would be a good chance to look back over the last year.

At the start of the year I finally got agreement for the installation of a 30Mb leased line. This was an amazing achievement, as I had been really struggling to keep everything working on our ADSL connections, and the complexity of the connections was starting to cause problems itself.

However, it was not a smooth process: the site survey revealed that a substantial installation cost would be required, although the Council was about to launch a voucher scheme that would give us £3000 off. So most of this was just waiting around for the Council to get things going. By April we had it installed, and I have been very happy with the ISP we chose, York Data Service (YDS).

April also brought the end of support for Windows XP and Office 2003. This didn't impact us too much, mostly due to my constant nagging of the directors to replace our oldest machines. We now only have a couple of machines still running XP and no machines running Office 2003. Most machines are now running Windows 7; I have one director running Windows 8 for presentations and a few surveyor tablets running Windows 8 as well. Like many businesses, I am waiting for Windows 10 next year before doing any more substantial operating system replacements.

In May it was decided that I would move more into development work, so I got a new desktop PC and Visual Studio. I am very happy with my new machine (subject to Dell fixing a problem later today). So far I have only dipped my toe into development work; this is largely due to needing more time to train up others to do what I do.

In June I made another huge achievement: I got agreement for a new server. This was our first Windows Server 2012 machine and allowed us to make huge strides towards virtualizing our services. Once we had this server up and running I was able to decommission two of our Windows Server 2003 machines, well before that OS goes out of support next year.

I am a huge fan of this new server; so far I have only used a fraction of its memory. Hyper-V is really great and I hope to make progress next year with Hyper-V replication.

Also during 2014 we made huge progress with our cloud software. Within the last few weeks it has started running on the Azure platform, which will not only make it far easier to expand and create new features but will also provide us with cost savings. I am looking forward to learning more about Azure as I get deeper into development.

Wow, I achieved a lot, didn't I? On top of this I hired a new IT person and trained him up (still more to do). I have also fixed lots of printers, websites and servers (I had a server motherboard die!), and made hundreds of changes to our internal databases.

I don't know what is in store for 2015. I expect a lot of the same, with hopefully more focus on developing in Visual Studio, training myself and my department, and getting the last few tweaks into our internal servers.

I would also like to thank everyone that I have worked with over the last 12 months (you know who you are), as you made a lot of what I have just described possible.