Looking to establish a presence in Azure?

Direct connection to Azure through Microsoft ExpressRoute.

Written by Aram Sadeghi.

One of the key items in the shopping basket of businesses considering a move to the cloud, or expanding their existing tenancy, is the connecting link between their infrastructure and the cloud.

In today’s post, I am going to briefly cover a dedicated circuit used to connect Azure customers to the Microsoft cloud.

One of the most commonly used and easiest ways to connect is a VPN.

We all know VPN and the numerous advantages it has brought to the IT world by making end-to-end communication safer and cheaper to run.

Can we therefore conclude that a VPN connection to Azure is the right solution for a business?

In my view, it certainly can and does provide some great benefits; however, as with anything in life, it also comes with some disadvantages.

Whilst some might argue that the internet is reliable, and hence that a VPN can be seen as a reliable connection, in my view this is not the case. As a home user, or a typical user browsing the internet, you may not notice the issues that commonly occur, yet these can be a nightmare for an organisation running business processes.

An example of this would be streaming a video: most streaming platforms progressively download the video in advance, so if there is a hiccup, the end user might not notice it.

Compare this with a business running hundreds of SQL queries or other latency-sensitive tasks. What happens when there is a hiccup?

Well, loss of revenue, damage to reputation and the list goes on.

One of the key benefits of ExpressRoute for most organisations is the predictability of the circuit.

You can expect stable and reliable latency, as well as throughput, using ExpressRoute, also known as ER.

This becomes even more important when businesses have a hybrid model, where part of the infrastructure in the cloud needs to communicate with an on-premises, latency-sensitive application.

This is in contrast to the traditional model of connecting to Azure using a VPN, where there is no guarantee on the reliability of the circuit because it crosses the public network.
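The value of this predictability can be made concrete with a quick jitter calculation. The latency samples below are invented purely for illustration – this is not measured data:

```javascript
// Illustrative only: compares latency "jitter" (standard deviation) for two
// hypothetical sets of round-trip samples in milliseconds -- one over the
// public internet, one over a dedicated circuit. The numbers are made up.
function jitter(samplesMs) {
  const mean = samplesMs.reduce((a, b) => a + b, 0) / samplesMs.length;
  const variance =
    samplesMs.reduce((acc, s) => acc + (s - mean) ** 2, 0) / samplesMs.length;
  return Math.sqrt(variance);
}

const internetVpn = [28, 95, 31, 210, 29, 60]; // spiky, unpredictable
const expressRoute = [24, 25, 24, 26, 25, 24]; // flat, predictable

console.log(jitter(internetVpn).toFixed(1));  // high jitter
console.log(jitter(expressRoute).toFixed(1)); // low jitter
```

A latency-sensitive workload cares far more about the second number than about the average: it is the variance, not the mean, that breaks tight request deadlines.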

Another key item on the list is support. As an enterprise, you will need to ensure that you can escalate any issues between the two endpoints, and that the underlying infrastructure is covered by a business Service Level Agreement.

Often, if you have any problems with a VPN, it is quite difficult to troubleshoot due to a lack of visibility into, or indeed supportability of, the underlying infrastructure.

You can still troubleshoot and debug issues up to the egress point of your own infrastructure, which is fine, but what about the hops between that egress point and Microsoft?

This can quite easily be a blocker for many businesses: it is not possible to move to the cloud without proper support on the transit link, which underlines how critical it is to implement an ER circuit.

There are also other advantages to deploying ER:

  • Dynamic bandwidth scaling
  • Robust failover options
  • Dynamic routing

I am going to cover other aspects of ER in future posts, so watch this space.

If you are interested to know more, or have any questions, drop us a message via the contact us section of the website and we will come back to you as soon as we can. Alternatively, drop Aram an e-mail directly.

Aram Sadeghi, Network Practice Lead, CoreAzure.


Microsoft Azure – Single Instance SLA

Microsoft announced on Monday a significant improvement to their Service Level Agreements in respect of a single instance Virtual Machine in Azure – now offering 99.9% availability!

Many organisations with legacy line of business applications are nervous about moving those workloads to the Cloud owing to the SLA provided by the cloud vendor. Often those legacy applications are unable to take advantage of scale-out features and therefore reside on a single instance server, which in turn suffers from relatively poor SLA commitments.

Microsoft have done extensive work to improve availability of the Azure infrastructure, including innovative machine-learning to predict failing hardware early and offering premium storage to help improve reliability and performance of attached disks – the net effect of this work is that they can now offer single instance virtual machines in Azure with 99.9% availability, allowing organisations to take advantage of the agility of the cloud without compromising on their expectations of availability.

To qualify for the single instance virtual machine SLA, all storage disks attached to the VM must use Azure Premium Storage which offers both high availability and performance (80,000 IOPS and 2,000 MBps throughput).

To put that SLA into context, that’s less than 9 hours of downtime per year – there are very few organisations capable of running their own infrastructure with single instance line of business applications that can provide that level of availability.
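The arithmetic behind that figure is straightforward – 99.9% availability leaves at most 0.1% of the year as permitted downtime:

```javascript
// 99.9% availability => at most 0.1% downtime. Over a 365-day year:
const hoursPerYear = 365 * 24;                    // 8760 hours
const maxDowntimeHours = hoursPerYear * (1 - 0.999);
console.log(maxDowntimeHours.toFixed(2));         // ~8.76 hours, i.e. under 9
```

By the same formula, 99.95% would allow roughly 4.4 hours per year, and 99.99% under an hour – useful when comparing SLA tiers.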

Of course, if your legacy line of business applications do support scale-out then you can continue to build multi-machine high availability by having two or more virtual machines deployed in the same Availability Set or by utilising VM Scale Sets – both of which provide machine isolation, network isolation, and power unit isolation across multiple virtual machines.

If you want to know more about how you can migrate applications and workloads safely, securely and efficiently to Azure then please feel free to contact us here at CoreAzure.

The Cloud changes everything

The single biggest business opportunity to impact government is the arrival of the ubiquitous internet.

Take an example of an organisation that ignored the business implications of the internet: Blockbuster Video.  One of the last actions of its board before it went into receivership was to order a revamp of its website and stores: a new front end on an old business model.  For Netflix, the arrival of the internet meant an entirely new business – Netflix didn’t duck the business implication of the internet.  Public services are just the same: in my view, the single biggest failing would be to duck the business implication.  So what is it?

Just as in the private sector, what the cloud (which is just utility, rather than private, internet) means for large vertically-integrated public corporates is progressive dis-integration & separation of business from technology layers.  As a result, the business focuses increasingly on outcomes, and consumes standard processes from the cloud; and because this stuff is standard, it attracts lots of customers, and so gets lots of investment and innovation.  Take a look at the annual Meeker report (published on the internet – where else).  Slack (comms); Stripe (online payments); Square (offline payments); Docusign (transaction mgmt.); Zenefits (HR); Directly (customer service); Anaplan (ERP); Greenhouse (recruiting); Checkr (background checks); increasingly these are becoming low-code and avoiding the need for ‘IT people’ altogether.  In fact, TechCrunch reckoned in 2015 that pretty much all of the software and services a reasonably-sized organisation needs can be consumed from the cloud for approximately $75 per month.

Organisations using these are cheap to run, agile, data-driven, and responsive to customers – all the things we want our public services to be.

Industry agrees: here’s the TechMarketView editor on 26 September:

We believe infrastructure services providers will play a crucial role in enabling the digitisation of the enterprise. The underlying infrastructure (e.g. networks, processing, storage) is the platform for the emerging software and services that will fuel the digital enterprise and will therefore be an investment priority for many CIOs.

However, cloud and the digitisation of processes and services will both drive and suppress overall IS market growth at the same time. On the one hand it will draw new investment into the market but by the same token we will see cannibalisation of traditional outsourcing services and an increase in automation leading to cheaper IT services and consequently lower revenue.

So what does all this mean for government?  Three simple things:

  1. We need to start rethinking our business models to be ‘of the net’, not just ‘on the net’; that means taking cloud-first and PaaS much more seriously.
  2. Local service providers need to evolve business models, and procure tech, based on accessibility, re-usability, and consumption of data and capability over that cloud/PaaS. That means proportionate approaches to risk, open data assets, open business logic, open APIs, opening all those organisational black boxes to expose those uncomfortably inefficient value chains, all those ‘commercial-in-confidence’ clauses that hide some of those really embarrassing contracts.
  3. Local public services (health, social care, housing, third sector, and local government) need, somehow, to evolve consistent approaches to consumption & re-use; we’re not competing orgs with different shareholders, yet we behave as if we are. So we need proactive initiatives from the centre that make consumption and re-use easy to specify, procure, & bolt together. Let’s start with commodity tech; then move to commodity business capability.

Imagine this: a place where you could see the ‘live DNA’ of local services: drill into a local authority’s service to see its service architecture: the actual value chain, with its common service patterns, consumed modules, the supplier, and the cost.  Then imagine, at a click, looking horizontally across other local service providers: exposing everyone that’s currently consuming that service pattern and underlying modules, whose price lowers with each new additional customer.  Imagine the innovation and investment such a consolidated, accessible service architecture would trigger across public, private, and third sectors.  Imagine, too, the democratic impact such accountability would have on our local services.

In fact, probably the single biggest test of the public value of most of us in this room as technologists in government is whether we’ve enabled cloud-based service architectures that make ourselves redundant within a generation or so! Let’s divert more resources to the front line; as far as the public are concerned, management and admin is little more than a necessary evil.  This would be a reasonable indicator of whether we’ve achieved Netflix, or Blockbuster, local government; whether we’re lots of public corporates polishing our websites or whether we’ve the courage to rethink our business model – and even our own future.


Hybrid cloud storage for the Enterprise – What’s your strategic storage plan?

We love the way technology can help underpin business change. Take Microsoft StorSimple as an example, an enterprise storage solution with true hybrid cloud storage capabilities.

StorSimple delivers enterprise storage with true hybrid cloud storage capabilities, enabling you to take advantage of economical cloud storage for your ‘historical’ data while keeping your frequently accessed data on-premises for the highest levels of performance – and all of this happens automatically.

It still amazes me how many organisations continue to store vast quantities of duplicate historical data on-premises, most of which will probably never see the light of day again but will continue to increase the cost of the IT service.

CoreAzure can help you to assess the best way to deploy StorSimple within your organisation to achieve the best value and ensure the following business benefits are realised:

Primary storage solution

Use StorSimple as primary storage for a range of workloads: collaboration, file sharing, databases, virtual machines.

Automated archive

Archive your cold data from on-premises to the cloud—automatically.

Cost-effective backup

Take advantage of automated cloud snapshots and back up data based on software policies without costly backup infrastructures.

Reliable disaster recovery

Get integrated data protection and location-independent disaster recovery – and only recover the data your business needs, when it’s required.

Microsoft Azure Active Directory: Preview

For customers who are struggling with federating Active Directory and other directory stores with Microsoft Online Services (Windows Azure and Office 365), Microsoft has made a confession: “integrating your on premises identities with Azure AD is harder than it should be” and requires “too many pages of documentation to read, too many different tools to download and configure, and far too much on premises hardware required.”

The good news? It has done (and is continuing to do) something about it, in the form of a new, “four-clicks-and-you’re-done” tool: Azure Active Directory Preview.

The tool is currently in Beta and is billed as “a single wizard that performs all of the steps you would otherwise have to do manually for connecting Active Directory and local directories to Azure Active Directory.”

That means it installs all the required bits of the .NET Framework, the Azure Active Directory PowerShell Module and the Microsoft Online Services Sign-In Assistant, then gets DirSync up and running between your on-premises environment and Microsoft Azure.

For now, the tool only supports synchronising a single Active Directory forest with Windows Azure Active Directory, but Microsoft promises to bring more forests into the cloud in future.

Customers wishing to join the program will find the following information useful:

To join the program through Microsoft Connect:

For more information about AADSync:

Azure DocumentDB is finally here

At last Microsoft have taken the wraps off their fully managed, JSON document database service. As of today (21st August), Microsoft has placed Azure DocumentDB into public preview and made it available in the following regions:

  • US West
  • Europe North
  • Europe West

There is an increasing demand for NoSQL databases, but invariably developers crave features and capabilities inherent to relational database systems. Unfortunately, NoSQL means tough choices:

  • strong OR eventual consistency
  • schema-free with limited query capability OR schematised and rich query capability
  • transactions OR scale

Wouldn’t it be great if you could have a massively scalable, schema-free database with rich query and transaction processing using the most ubiquitous programming language (JavaScript), data model (JSON) and transport protocol (HTTP)? That is exactly what Microsoft have given us with DocumentDB.

DocumentDB can automatically index documents without requiring any schema or secondary indices, issue SQL-based relational and hierarchical queries over heterogeneous JSON values, integrate database transactions with JavaScript exceptions, and operate seamlessly over JSON documents.
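To make the schema-free querying idea concrete, here is a minimal local simulation of the semantics. The SQL string in the comment shows the shape a DocumentDB query might take; the documents and field names are hypothetical, and the filter below stands in for the database engine:

```javascript
// Local simulation of querying heterogeneous, schema-free JSON documents.
// In DocumentDB's SQL dialect, the equivalent query might look like:
//   SELECT * FROM docs d WHERE d.type = "order" AND d.total > 100
// Note the documents need not share a schema for the query to work.
const docs = [
  { id: "1", type: "order",    total: 250, customer: "Contoso" },
  { id: "2", type: "order",    total: 40 },             // no customer field
  { id: "3", type: "customer", name: "Fabrikam" },      // different shape
];

const results = docs.filter(d => d.type === "order" && d.total > 100);
console.log(results.map(d => d.id)); // ids of matching documents
```

The point is that no table definition, index declaration or schema migration was needed before the predicate could run over documents of completely different shapes.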

Above all they’ve provided a multi-tenant database service which is blazingly fast and (via tenant isolation) safe and secure.


DocumentDB supports SQL queries without forcing developers to create explicit schema or secondary indices or views. DocumentDB is able to efficiently index, query and process heterogeneous documents thanks to its deep commitment to the JSON data model.

DocumentDB’s SQL language is based on the JavaScript type system, expression semantics and the ability to invoke JavaScript UDFs, giving the query grammar a familiar SQL dialect and creating an efficient, natural way for you to query over JSON documents.

There is (of course) a downloadable .NET SDK which includes a LINQ provider – there is even a rumour that Microsoft are considering native JavaScript mapping to their SQL query language.

Here is a great link to understanding how to query DocumentDB:

JavaScript as a modern day T-SQL

As we adopt NoSQL systems for their simplicity, speed and scalability, we’re often forced to give up the transactional processing capabilities of traditional RDBMS systems. Support for transactions provides a performant and robust programming model for dealing with concurrent changes, resulting in faster apps that are easy to maintain. JavaScript is an obvious choice when considering that you want application code execution within the database, but you don’t want to invent yet another procedural language.

DocumentDB has JavaScript execution deeply embedded within the database engine. All application JavaScript execution is sandboxed, resource governed, and fully isolated. As a developer you can write stored procedures and triggers natively in JavaScript, allowing you to write application logic which can be shipped using HTTP POST and executed directly on the storage partition within a transaction boundary. JSON can be materialised as JavaScript objects and transactions can be aborted by throwing an exception. This approach frees application developers from the complexity of OR mapping technologies.
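The abort-by-exception model is easy to sketch in plain JavaScript. This is a local, in-memory simulation of the semantics only – the real server-side stored procedure API differs – but it shows why throwing an exception is a natural rollback mechanism:

```javascript
// In-memory sketch of DocumentDB's transaction model: a stored procedure
// runs inside a transaction boundary, and throwing an exception aborts
// (rolls back) all of its writes. Names here are hypothetical.
function runInTransaction(store, proc) {
  const snapshot = [...store];       // copy the state before the "transaction"
  try {
    proc(store);                     // the procedure mutates the store directly
    return true;                     // no exception => commit the mutations
  } catch (e) {
    store.length = 0;                // exception => restore the snapshot
    store.push(...snapshot);
    return false;
  }
}

const store = [{ id: "a" }];

// A "stored procedure" that inserts a document but fails midway:
const committed = runInTransaction(store, (docs) => {
  docs.push({ id: "b" });
  throw new Error("constraint violated"); // aborts the whole transaction
});

console.log(committed, store.length); // false 1 -- the insert was rolled back
```

Because the procedure is ordinary JavaScript, the same error-handling idiom developers already use (`throw`) doubles as transaction control, with no separate ROLLBACK verb to learn.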

DocumentDB Open and Approachable

We really don’t need more data formats, languages or protocols – let’s face it, the learning curve for new systems can be steep, and we’re all pressed for time. The DocumentDB product team claim their mantra was to resist the urge to be inventive where it didn’t deliver real value to the developer.

Programming against DocumentDB is simple and approachable, and doesn’t require you to buy into a specific tool-chain or adopt custom encodings or extensions to JSON or JavaScript. All functionality, including CRUD, query and JavaScript processing, is exposed over a RESTful HTTP interface. By offering simple, scalable JavaScript and JSON over HTTP, DocumentDB doesn’t invent in the area of data models, application models or protocols.
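To illustrate how plain the REST surface is, here is a sketch of addressing a document collection. To the best of my recollection the resource path convention is `dbs/{db}/colls/{coll}/docs`, but treat this as illustrative; authentication headers and the account host name are deliberately omitted:

```javascript
// Sketch of DocumentDB's hierarchical REST resource paths: databases,
// collections and documents are addressed as plain path segments.
function docsPath(database, collection) {
  return `dbs/${database}/colls/${collection}/docs`;
}

// Creating a document (the "C" in CRUD) is then just an HTTP POST of a
// JSON body to that path. The database and collection names are hypothetical.
const request = {
  method: "POST",
  path: docsPath("mydb", "orders"),
  body: JSON.stringify({ id: "1", total: 250 }),
};

console.log(request.method, request.path);
```

No custom wire format, no binary protocol: any HTTP client that can send JSON can talk to the service.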

In my opinion DocumentDB is unique in how it embraces and builds on standards that are already abundantly available and established, yet adds huge value and capabilities on top – it feels like DocumentDB gives the developer the very best of all worlds!

You can find out lots more about Azure DocumentDB at the product page here – please bear in mind though that at present DocumentDB is in preview (online cloud speak for Beta).

Feel free to drop me a line if you have any questions.