Setting up a custom URL shortening service using Bitly.Pro

This content is 14 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

If, like me, you’ve been reading the stories about Libya taking back control of certain .ly addresses, then you might be thinking “what about all those short URLs I’ve been sharing?”. For those who haven’t read the story, the Libyan authorities have been reclaiming addresses from sites whose content breaches Sharia law. Bit.ly, the American link shortener that’s become very popular since the demise of tr.im, doesn’t think it’s at risk, even though some of the content it signposts might offend those of a Muslim faith, because it doesn’t actually host the content, and bit.ly addresses are also accessible using the slightly longer bitly.com URL (for example, bit.ly/markwilsonit resolves to the same address as bitly.com/markwilsonit).

Even so, I decided to implement my own custom domain for link shortening – one that I have control over.

I decided to stick with a top-level domain from a country that’s not likely to take back the address and, even though Italy (.it) is a slight risk for me as I don’t live or work there, .it domains are officially available to anyone who resides within one of the European Union member states (and that includes the UK), giving me at least some legitimate claim to the domain. Unfortunately, I found that I couldn’t register any two- or three-character domains, but Matt Marlor suggested I go for mwil.it and, yesterday, I successfully registered that domain.

Incidentally, if you’re looking for a short domain name, DomainR is a website that will work through various permutations of your name/company name and flag those that are valid/available.

Grabbing the domain is only the first step though – I also needed a link shortening service. I could have implemented my own (indeed, I may still do so) but decided to use the Bitly.Pro service instead, thinking that I can still migrate the links at a later date, should that become necessary.

One of my friends, Alex Coles, asked why I selected the Bitly.Pro route rather than using a script like YOURLS and, aside from the fact that it would be something else for me to manage on my webspace, the Bit.ly API is widely supported by many of the other services that I use – like TwitterFeed and TweetDeck – reducing the effort involved in generating new short links.

At the time of writing, Bitly.Pro is still in beta but I completed the form to apply to join and, shortly afterwards, received an e-mail to say I was in. At that time I hadn’t registered my domain but, once that was done, it was a simple case of:

  • Creating a DNS A record (actually, I created two – one for * and one for @) on the short domain (mwil.it) to point to Bit.ly’s servers (168.143.174.97).
  • Adding a DNS CNAME record (3bae9d57b0bf.markwilson.co.uk. CNAME cname.bit.ly) to my tracking domain (markwilson.co.uk) to prove ownership (other options included uploading a file or adding some metadata to the site).
  • Waiting for DNS propagation (which didn’t take long for me but could have taken up to 24 hours) and verifying the details in my Bitly.Pro account settings. A quick way to check the records yourself is sketched below.
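
For the curious, here’s a minimal sketch of how those records can be checked once they’ve been created. It uses the third-party dnspython library and the record values described above – if your own names and addresses differ, substitute accordingly.

```python
# Quick sanity check that the Bitly.Pro DNS records have propagated.
# A minimal sketch using the third-party dnspython library (pip install dnspython);
# the names and values below are the ones described in this post.
import dns.resolver

def check_a_record(name, expected_ip):
    answers = dns.resolver.resolve(name, "A")
    addresses = [r.address for r in answers]
    print(f"{name} A -> {addresses}")
    return expected_ip in addresses

def check_cname(name, expected_target):
    answers = dns.resolver.resolve(name, "CNAME")
    targets = [str(r.target).rstrip(".") for r in answers]
    print(f"{name} CNAME -> {targets}")
    return expected_target in targets

if __name__ == "__main__":
    ok = check_a_record("mwil.it", "168.143.174.97")
    ok &= check_cname("3bae9d57b0bf.markwilson.co.uk", "cname.bit.ly")
    print("All records in place" if ok else "Still waiting for DNS propagation")
```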

With these steps completed, I had everything in place to start generating short URLs using Bitly.Pro, but there was one more step for my client applications – TwitterFeed and TweetDeck both needed to be provided with an API key in order to use the Bit.ly API with my account (TweetDeck even gives the link to go and get the key). After entering those details, I sent a test tweet and was pleased to see it using the mwil.it domain, with no additional work required on my part. The same API key can be used to shorten links from a script too, as the sketch below shows.
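
Here’s a hedged sketch of what that API usage looks like outside of a client application. It assumes the bit.ly version 3 “shorten” endpoint and its login/apiKey/longUrl parameters as documented at the time of writing; the account name and key shown are placeholders.

```python
# Shortening a link programmatically with the same API key that TwitterFeed and
# TweetDeck use. A sketch only: it assumes the bit.ly v3 "shorten" endpoint and its
# login/apiKey/longUrl parameters; the credentials below are placeholders.
import json
import urllib.parse
import urllib.request

BITLY_LOGIN = "markwilsonit"         # placeholder account name
BITLY_API_KEY = "R_xxxxxxxxxxxxxxx"  # placeholder key from bit.ly account settings

def shorten(long_url):
    params = urllib.parse.urlencode({
        "login": BITLY_LOGIN,
        "apiKey": BITLY_API_KEY,
        "longUrl": long_url,
        "format": "json",
    })
    with urllib.request.urlopen(f"http://api.bit.ly/v3/shorten?{params}") as response:
        data = json.load(response)
    # Once the custom domain is active on the account, the short URL uses it.
    return data["data"]["url"]

print(shorten("http://www.markwilson.co.uk/blog/"))
```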

So, what’s left to do? Well, I still don’t know why sites like the New York Times and TechCrunch get custom URLs when I link to them, even without an API key (I suspect for that I would need an Enterprise account) and it may still be prudent to keep an offline copy of my short-to-long URL mappings, just in case Bit.ly should ever cease to exist (a simple approach is sketched below). There are also some client applications that don’t use my custom shortener (for example, Twitter’s own app for the iPad uses another Bit.ly domain, j.mp, and doesn’t appear to have any options to enter an API key) but at least my auto-posted tweets (i.e. links to my blog posts, etc.) now use a domain that’s under my control.
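
On the offline copy point: one low-tech option is to log every mapping to a local CSV file as links are shortened. A minimal sketch using only the Python standard library; the file name and the example URLs are arbitrary.

```python
# Keep an offline copy of the short-to-long URL mappings, in case the shortening
# service ever disappears: append each mapping to a local CSV file.
# The file name and example URLs below are arbitrary illustrations.
import csv
from datetime import datetime, timezone
from pathlib import Path

MAPPING_FILE = Path("short-url-mappings.csv")

def record_mapping(short_url, long_url):
    new_file = not MAPPING_FILE.exists()
    with MAPPING_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp", "short_url", "long_url"])
        writer.writerow([datetime.now(timezone.utc).isoformat(), short_url, long_url])

record_mapping("http://mwil.it/example", "http://www.markwilson.co.uk/blog/example")
```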

Live Mesh reaches out to the Mac

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

On the same day that I published my recent post about Windows Live FolderShare, I heard that the current Live Mesh beta is now available on a Mac.

Despite already being a Mesh user, I tried to add my Mac as a device but was disappointed to read that the Live Mesh Tech Preview was out of invitations so I tried again this evening and was pleased to find that it accepted me and let me install the software.

First impressions were good, with a really straightforward installation and good client support – working like a Mac application (not a Windows application running on OS X) and with support for both Safari and Firefox.

Then I realised that Mac-PC synchronisation in Mesh still needs to go via the Live Desktop (i.e. out to the ‘net and back), as evidenced when I tried to sync a folder that was not fully replicated:

The current version of Live Mesh cannot synchronize a folder with a Mac computer unless the folder is also synchronized with your Live Desktop.

This lack of LAN-based peer-to-peer support, combined with Mesh’s 5GB storage limit, means that FolderShare is still the sync option for my work in progress (be prepared for a long wait if you’re syncing via the web and an ADSL connection – ADSL downloads are fine, but uploads are s…l…o…w…).

Predictably, some features are Windows-only too (like the remote desktop capability). There’s mobile device support too but it does depend on the phone – for example my Apple iPhone 3G was recognised as a Mac, after which Safari refused to install anything (I didn’t expect it to work but I just had to try!).

I don’t want to sound negative – Live Mesh has so much potential and it is still a beta – over time new features will be added and it will be fantastic. Right now it’s still a little confusing – with the feature sets of Windows Live SkyDrive, Mesh, FolderShare and Office Live Workspaces all overlapping slightly, it’s sometimes difficult to fathom out the best tool to use – and those are just the Microsoft options! Hopefully this will all shake down over the coming months and the vision of my digital life being available wherever I am will become a reality.

Windows Live FolderShare – an example of Microsoft’s cloud computing platform that’s here to use today

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I started off writing this post on the train, as the staycation (taking a break from work but staying at home) part of last week became the vacation part (a few days by the seaside with my wife and sons – the fact that I woke up to snow in Buckinghamshire didn’t seem to put the boys off wanting to build sandcastles in Dorset… even with their winter coats on).

The point of this is that I wanted to use the time on the train to good effect – and that meant catching up on my writing. Despite having spent a few days decommissioning my old file server in favour of a new NAS box, I still have a certain amount of local data that I need to access – spread across multiple Windows and Macintosh PCs. This is where Microsoft’s web services platform comes in. I’ve been using the Live Mesh CTP for a while now, but the current version of Mesh is just a starting point and there is another Live service in beta that I’m using here – Windows Live FolderShare.

FolderShare is a web service for synchronising folders – either personal or shared – across multiple devices. If you’ve used Windows Live SkyDrive as file storage in the cloud, then imagine if that data was hosted on your PCs (phones, and other devices) rather than in cyberspace – and replicated automatically.

Over time, I expect to see FolderShare move into Live Mesh and, in my coverage of the recent PDC keynote, I wrote about how:

Live Mesh bridges [islands of information] with a core synchronisation concept but Mesh is just the tip of the iceberg and is now a key component of Live Services to allow apps and websites to connect users, devices, applications and to provide data synchronisation.

My personal file data may not be on the scale of the enterprise services Microsoft plans for Windows Azure, but Windows Live FolderShare does nicely demonstrate the concept in a way that most of us can appreciate. Here I am, creating content on the train using my Macintosh PC and I know that, when I hook up to a network, FolderShare will sync this (via Windows Live Services) to the people/devices that I want to share the data with – for example my Windows PC in the home office. Then, whichever device I’m using, I can continue my work without worrying about where the master copy is. Add a phone into the mix and one would expect me to be able to access that data wherever I am, as well as creating additional content – for example photos, or location-specific data.

Jasdev Dhaliwal has an interesting article about Microsoft’s cloud computing announcements over at the Web Pitch. Jas’ post includes: Microsoft’s “Overnight Success” video, which talks about the greater sum of software plus services “moving beyond devices and across borders to capture the imagination of the world… a world where the richness of software and the ubiquity of services are rapidly converging”; a BBC interview with Ray Ozzie, where he talks about how it has become burdensome to manage the computer we’ve got at work, the computer we have in the den, the children’s PCs, a cellphone with contacts, photos and information, cable boxes with recorded movies, and how “Windows in the sky” can bring all of those devices together and make it easier to manage – more than just applications in the cloud but a total computing infrastructure; another BBC film where Rory Cellan-Jones visits one of Microsoft’s vast datacentres; and finally Microsoft’s “Synchronizing Life” video, where a Mum takes a picture of a child at play using her mobile phone and that picture appears on a display many miles away in Dad’s office, on his PC, on his Mac, and how the Live Mesh extends to his media player, phone, into the car and to the children’s games console.

I started this post on the train, using a Mac. Now I’m ending it in the office, on a Windows PC – and I haven’t had to think about which copy of the data is current – it just works. That’s what connected synchronicity is about – it’s not about uploading everything I do to some website but about a mesh of devices working together to make my local data available globally… synchronising my life.

PC, phone and web: How Microsoft plans to build the next generation of user experiences

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I’m supposed to be taking a week off work, but the announcements coming out of Microsoft’s PDC have the potential to make a huge impact on the way that we experience our IT. So, it’s day 2 of PDC and I’ve spent the afternoon and evening watching the keynote and blogging about new developments in Windows…

Yesterday I wrote about Ray Ozzie’s PDC keynote during which the Windows Azure services platform was announced. Today, he was back on stage – this time with a team of Microsoft executives talking about the client platform, operating system and application innovations that provide the front end user experience in Microsoft’s vision of the future of personal computing. And, throughout the presentation, there was one phrase that kept on coming back:

PC, phone and web.

Over the years, PCs have changed a lot but the fundamental features have been flexibility, resilience and adaptability to changing needs. Now the PC is adapting again for the web-centred era.

Right now, the ‘net and a PC are still two worlds – we’ve barely scratched the surface of how to extract the most value from the web and the personal computer combined.

PC, phone, and web.

Ozzie spoke of the most fundamental PC advantage being the fact that the operating system and applications are right next to the hardware – allowing the user experience to take advantage of multiple high-resolution screens, voice, touch, drag and drop (to combine applications), and storage (for confidentiality, mobility, and speed of access) so that users may richly create, consume, interact with, and edit information. The PC is a personal information management device.

The power of the web is its global reach – using the ‘net we can communicate with anyone, anywhere, and the Internet is “every company’s front door” – a common meeting place. The unique value of the web is the ability to assemble the world’s people, organisations, services and devices – so that we can communicate, transact and share.

Like PCs, phone software is close to the hardware and it has full access to the capabilities of each device – but with the unique advantage that it’s always with the user – and it knows where they are (location) and at what time – providing spontaneity for capture and delivery of information.

Microsoft’s vision includes applications that span devices in a seamless experience – harnessing the power of all three access methods.

PC, phone and web.

“We need these platforms to work together and yet we also want access to the full power and capabilities of each”

[Ray Ozzie, Chief Software Architect, Microsoft Corporation]

I won’t cover all of the detail of the 2-and-a-half hour presentation here, but the following highlights cover the main points from the keynote.

Steven Sinofsky, Senior Vice President for Microsoft’s Windows and Windows Live Engineering Group, spoke about how Windows 7 and Windows Server 2008 R2 share the same kernel but today’s focus is on the client product:

  • Sinofsky brought Julie Larson-Green, Corporate Vice President, Windows Experience on stage to show off the new features in Windows 7. Windows 7 is worth a blog post (or few) of its own, but the highlights were:
    • User interface enhancements, including new taskbar functionality and access to the ribbon interface for developers.
    • Jump lists (menus on right click) from multiple locations in the user interface.
    • Libraries, which allow for searching across multiple computers.
    • Touch capabilities – for all applications through mouse driver translation, but enhanced for touch-aware applications with gestures and a touch-screen keyboard.
    • DirectX – harnessing the power of modern graphics hardware and providing an API for access, not just to games but also to 2D graphics, animation and fine text.
    • And, of course, the fundamentals – security, reliability, compatibility and performance.
  • Windows Update, music metadata and online help are all service-based. Windows 7 makes use of Microsoft’s services platform, with Internet Explorer 8 to access the web. Using technologies such as those provided by Windows Live Essentials (an optional download with support for Windows Live or third party services via standard protocols), Microsoft plans to expand the PC experience to the Internet with software plus services.

PC, phone and web.

“We certainly got a lot of feedback about Windows Vista at RTM!”

[Steven Sinofsky, Senior Vice President, Microsoft Corporation]

  • Sinofsky outlined the key lessons learned from the Windows Vista experience:
    • Readiness of ecosystem – vendor support, etc. Vista changed a lot of things and Windows 7 uses the same kernel as Windows Vista and Server 2008 so there are no ecosystem changes.
    • Standards support – e.g. the need for Internet Explorer to fully support web standards and support for OpenXML documents in Windows applets.
    • Compatibility – Vista may be more secure but UAC has not been without its challenges.
    • Scenarios – end to end experience – working with partners, hardware and software to provide scenarios for technology to add value.
  • Today, Microsoft is releasing a pre-beta milestone build of Windows 7, milestone 3, which is not yet feature complete.
  • In early 2009, a feature complete beta will ship (to a broader audience) but it will still not be ready to benchmark. It will incorporate a feedback tool which packages the context of what is happening along with the feedback, alongside the opt-in customer experience improvement program, which provides additional, anonymous telemetry to Microsoft.
  • There will also be a release candidate before final product release and, officially, Microsoft has no information yet about availability but Sinofsky did say that 3 years from the general availability of Windows Vista will be around about the right time.

Next up was Scott Guthrie, Corporate Vice President for Microsoft’s .NET Developer Division who explained that:

  • Windows 7 will support .NET or Win32 client development with new tools including new APIs, updated foundation class library and Visual Studio 2010.
  • Microsoft .NET Framework (.NET FX) 3.5 SP1 is built in to Windows 7, including many performance enhancements and improved 3D graphics.
  • A new Windows Presentation Foundation (WPF) toolkit for .NET FX 3.5 SP1 was released today for all versions of Windows.
  • .NET FX 4 will be the next version of the framework with WPF improvements and improved fundamentals, including the ability to load multiple common language runtime versions inside the same application.
  • Visual Studio 2010 is built on WPF – more than just graphics but improvements to the development environment too and an early CTP will be released to PDC attendees this week.
    In a demonstration, Tesco and Conchango showed a WPF client application for tesco.com aiming to save us money (every little helps) but also to get us to spend more of it with Tesco! This application features a Tesco at home gadget with a to-do list, delivery and special offer information, and access to a “corkboard”. The corkboard is the hub of family life, with meal planning, calendar integration, the ability to add ingredients to the basket, recipes (including adjusting quantities) and calorie counts. In addition, the application includes a 3D product wall to find an item among 30,000 products, look at the detail and organise products into lists, and the demonstration culminated with Conchango’s Paul Dawson scanning a product barcode to add it to the shopping list.
  • Windows 7 also includes Internet Explorer 8 and ASP.NET improvements for web developers. In addition, Microsoft claims that Silverlight is now on 1 in 4 machines connected to the Internet, allowing for .NET applications to run inside the browser.
  • Microsoft also announced the Silverlight toolkit – additional controls and features from WPF for Silverlight 2, available free of charge – and Visual Studio 2010 will include a Silverlight designer.

David Treadwell, Corporate Vice President, Live Platform Services spoke about how the Live Services component within Windows Azure creates a bridge to connect applications, across devices:

PC, phone and web.

  • The core services are focused around identity (e.g. Live ID as an OpenID provider), directory (e.g. the Microsoft services connector and federation gateway), communications and presence (e.g. the ability to enhance websites with IM functionality) and search and geospatial capabilities.
  • These services may be easily integrated using standards-based protocols – not just on a Microsoft .NET platform but invoked from any application stack.
  • Microsoft has 460 million Live Services users who account for 11% of total Internet minutes, and the supporting infrastructure includes hundreds of thousands of servers worldwide.
  • We still have islands of computing resources and Live Mesh bridges these islands with a core synchronisation concept but Mesh is just the tip of the iceberg and is now a key component of Live Services to allow apps and websites to connect users, devices, applications and to provide data synchronisation.
  • The Live Service Framework provides access to Live Services, including a Live operating environment and programming model.
  • Ori Amiga, Group Program Manager, demonstrated using the Live Framework to extend an application to find data on multiple devices, with contact integration for sharing. Changes to the object and its metadata were synchronised and reflected on another machine without any user action, and a mobile device was used to add data to the mesh, which synchronised with other devices and with shared contacts.
  • Anthony Rhodes, Head of Online Media for BBC iPlayer (which, at its peak, accounts for 10% of the UK’s entire Internet bandwidth), spoke of how iPlayer is moving from an Internet catch-up (broadcast 1.0) service to a model where the Internet replaces television (broadcast 2.0), using Live Mesh with a local Silverlight application. Inventing a new word (“meshified”), Rhodes explained how users can share content between one another and across devices (e.g. watch a programme on the way to work, then resume playing from where it left off on the computer).

In the final segment, before Ray Ozzie returned to the stage, Takeshi Numoto, General Manager for the Office Client spoke of how Microsoft Office should be about working the way that users want to:

  • Numoto announced Office web applications for Word, Excel, OneNote and PowerPoint as part of Office 14 and introduced the Office Live Workspace, built on Live Services to allow collaboration on documents.
  • In a demonstration, a document was edited without locks or read only access – each version of the document was synchronised and included presence for collaborators to reach out using e-mail, instant messaging or a phone call. Office web applications work in Internet Explorer, Firefox or Safari and are enhanced with Silverlight. Changes are reflected in each collaborator’s view but data may also be published to websites (e.g. a Windows Live Spaces blog) using REST APIs so that as the data changes, so does the published document, extending office documents onto the web.
  • Office Web apps are just a part of Office 14 and more details will be released as Office 14 is developed.
  • Numoto summarised his segment by highlighting that the future of productivity is diversity in the way that people work – bringing people and data together in a great collaboration experience which spans…

PC, phone and web.

  • In effect, software plus services extends Office into connected productivity. In a direct reference to Google Apps, Microsoft’s aspirations are about more than just docs and spreadsheets in a browser accessed over the web, but combine to create an integrated solution which provides more value – with creation on the PC, sharing and collaboration on the web, and placing information within arm’s reach on the phone. Seamless connected productivity – an Office across platform boundaries – an office without walls.

PC, phone and web.

Windows vs. Walls
Software plus services is about combining the best of Windows and the best of the web. Windows and Windows Live together in a seamless experience – a Windows without walls. All of this is real – but, as Ray Ozzie explained, it’s also nascent – this is really just the beginning of Microsoft’s future computing platform and, based on what Microsoft spoke of in yesterday’s and today’s PDC keynotes, the company is investing heavily in and innovating on the Windows platform. Google may have been the one to watch lately but it would be foolish to write off Windows just yet – Microsoft’s brave new world is enormous.

Cloud computing takes centre stage with Windows Azure

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I wasn’t planning any major PDC coverage on the blog this week (there will be plenty of that elsewhere) but I did catch the PDC keynote today. It was the first time I’ve seen Ray Ozzie present and I was impressed – none of the Ballmer madness, or the Gates geekiness. Instead, over two hours, backed up with key executives from throughout Microsoft, Ozzie gave a calm and inspiring presentation in which we finally found out some of the detail behind where Microsoft is heading – and how software plus services is going to transform Windows.

Today’s keynote was focused on the back-end – the platform which will be needed to run our datacentres in a world of cloud computing – and the key points that I picked up on were:

  • Most enterprise computing architectures have been designed for inward-facing solutions, whilst their reach and scope are expanding as part of “the externalisation of IT”. Regardless of the industry, the web has become a key demand generation mechanism – “every organisation’s front door” – and companies now need to serve external users.
  • Software development and operations have become intertwined – developers and IT professionals need to jointly learn how to design, build and develop systems.
  • Organisations over-engineer infrastructure to ensure that there is sufficient capacity (computing, storage, network, power), with multiple datacentres for continuity – and all the complexity that this introduces.
  • The world of the web needs a different approach to designing a platform. Microsoft has many systems that serve millions of users worldwide – and has used the expertise gained from this experience to shape its cloud computing strategy, packaging its own lessons from managing the externalisation of IT:
    • Tier 1 is experience: the PC on the desk or the phone in your pocket.
    • Tier 2 is enterprise: back end infrastructure hosting systems – with the scale of the enterprise.
    • Tier 3 is externally facing: the web tier of computing – with the scale of the web. This tier is named Windows Azure – a new service-based operating environment for the cloud.
  • Azure is Windows so it will remain familiar and developer-friendly but it also needs to be different. Rather than being rooted in a scale-up model, it embraces new model-based methods for a world of horizontal scale.
  • It is a service – not software. It is being released as a CTP today, with initial features that are just a fraction of where it will be going. Designed for iteration and continuous improvements and as the system scales out, Microsoft will bring more and more of its own services onto Azure.
  • The platform includes Windows Live Services, .NET Services, SQL Services, SharePoint Services and Dynamics CRM Service.

Windows Azure Services Platform

Amitabh Srivastava, Corporate Vice President for cloud infrastructure services, explained that:

  • The original Windows NT architect, Dave Cutler, is the kernel man behind Windows Azure. Kernels don’t demonstrate well but a good kernel allows others to build killer apps.
  • Windows Azure is an operating system for the cloud – it manages an entire global datacentre infrastructure – and provides a layer of abstraction to ease the programming burden.
  • A fabric controller maintains the health of the service. When a service is changed, the developer specifies the desired end state and the fabric manages services, not just servers. Windows Azure is based on a service model, with roles and groups, channels and endpoints, interfaces, and configuration settings – all stored as XML for manipulation with any tool.
  • When deploying to Windows Azure, there are two things for a developer to provide:
    • The code for a service.
    • A service model defining architecture to guide fabric controller to automatically manage the lifecycle of the application.
  • Windows Azure provides 24×7 availability, with all components built to be highly available under varying loads with no user intervention. This allows a highly available service to be provided using the Azure subsystem, orchestrated by the fabric, and developers can concentrate on the business application logic.
  • Existing tools transfer to the cloud and Windows Azure works with managed and native code. Steve Marx demonstrated new cloud templates in Visual Studio using standard ASP.NET development skills to create a “hello cloud” application. The cloud may be simulated in an offline scenario so there is no need to deploy an application to the cloud in order to test its functionality.
  • Publishing involves repackaging the application for deployment and using the Windows Azure Developer Portal to create a hosted service with a friendly DNS name, supplying the package and configuration files.
  • Windows Azure is an open platform with a command line interface, REST protocols and XML file formats, as well as managed code support – making it easy to integrate with other platforms (a short sketch follows this list).
  • In summary, Windows Azure is an operating system for the cloud, providing scalable hosting, automated service management, and a familiar developer experience for enterprise developers and hobbyists alike.
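
To illustrate that openness, here’s a small, hedged sketch of consuming a hypothetical REST endpoint exposed by an Azure-hosted service from Python rather than .NET. The cloudapp.net host name and /api/status path are made up for illustration; the only point taken from the keynote is the general one – that it’s plain HTTP and data formats, callable from any stack.

```python
# Consuming a hypothetical REST endpoint on an Azure-hosted service from a non-.NET
# stack. The host name and path below are illustrative placeholders, not a real service.
import json
import urllib.request

SERVICE_URL = "http://helloworld.cloudapp.net/api/status"  # hypothetical hosted service

with urllib.request.urlopen(SERVICE_URL) as response:
    print(json.load(response))
```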

Bob Muglia, Senior Vice President for server and tools, spoke of a next-generation services platform, looking back at the various models used over the years:

  • Monolithic – 1970s mainframes.
  • Client server – 1980s PC revolution.
  • Web – a new generation of Internet and intranet applications developed in the 1990s.
  • SOA – the web services used today, communicating over standard protocols (web services or REST).
  • Services – going forward, building on web and SOA but with improved scalability.

He went on to discuss: 

  • A new product (codenamed Geneva) which provides a link between Active Directory and cloud services.
  • System Center Atlanta – a portal to provide administrators with access to information about their systems in the cloud – connecting on-premise SCOM to Azure databases using a service bus.
  • Knowledge and skills transfer between on-premise enterprise computing and cloud-based architectures and of how Microsoft is working with partners to take Azure developments and incorporate them into Windows Server, SQL Server etc., so the industry can provide its own Azure services.
  • A next generation modelling platform (codenamed Oslo) which enables consistency between IT and developer processes (built on previous dynamic IT developments) using a new language called M.

Muglia summarised by pointing out that, at a previous PDC in 1992, Windows NT was introduced and it now has a huge presence. As services become more broadly used, Microsoft expects Azure to have the same sort of impact. Dave Thompson, Corporate Vice President for Microsoft Online, spoke of how:

  • Customers with strong IT staff and discipline find it straightforward to deploy software but many others see IT as a frustrating burden – essential but not core to their business.
  • Microsoft Online provides enterprise class software as a subscription service, hosted by Microsoft and sold with partners.
  • In the future all Microsoft enterprise software will optionally be delivered as an online service.
  • Software plus services provides the power of choice. Generally, enterprises want neither all cloud services nor all on-premise computing, but a hybrid – which must be seamless and easy for administrators. Federated identity is one challenge and extensibility is another.
  • With Windows Azure, IT administrators manage Active Directory as they do now and the Microsoft services connector links into the cloud, to the Microsoft federation gateway. Users use the federation gateway but do not know if the service they access is on-premise or in the cloud.
  • Extensibility is facilitated with the integration of online services with on-premise servers, sharing and accessing shared data using a variety of flexible presentation methods. Windows Azure components in business applications allow services to be extended as required.

Ray Ozzie returned to the stage to wrap up Microsoft’s view of the software plus services world. He was very clear in explaining once more that Windows Azure is a community technology preview and that there will be no charges for its use during the preview period. As the service moves closer to commercial release, Microsoft will unlock access to more and more capabilities and the business model at launch will be based on a combination of resource consumption and service level.

I really do hope that Windows Azure does not go the way of previous efforts to provide online services for enterprises (Microsoft Passport was supposed to be the solution for web services authentication) but I have a feeling it will not. Google, Amazon and others have proved the demand for cloud computing but Microsoft has a credible hybrid model, with a mixture of on-premise and services-led software access.

Virtualisation as an enabler for cloud computing

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

In my summary of the key messages from Microsoft’s virtualisation launch last week, I promised a post about the segment delivered by Tom Bittman, Gartner’s VP and Chief of Research for Infrastructure and Operations, who spoke about how virtualisation is a key enabler for cloud computing.

Normally, if I hear someone talking about cloud computing they are either: predicting the death of traditional operating systems (notably Microsoft Windows); they are a vendor (perhaps even Microsoft) with their own view on the way that things will work out; or they are trying to provide an artificial definition of what cloud computing is and is not.

Then, there are people like me – infrastructure architects who see emerging technologies blurring the edges of the traditional desktop and web hosting models – technologies like Microsoft Silverlight (taking the Microsoft.NET Framework to the web), Adobe AIR (bringing rich internet applications to the desktop) and Google Gears (allowing offline access to web applications). We’re excited by all the new possibilities, but need to find a way through the minefield… which is where we end up going full circle and returning to conversations with product vendors about their vision for the future.

What I saw in Bittman’s presentation was an analyst, albeit one who was speaking at a Microsoft conference, talking in broad terms about cloud computing and how it is affected by virtualisation. No vendor allegiance, just telling it as it is. And this is what he had to say:

When people talk about virtualisation, they talk about saving money, power and space – and they talk about “green IT” – but virtualisation is more than that. Virtualisation is an enabling technology for the transformation of IT service delivery, a catalyst for changing architectures, processes, cultures, and the IT marketplace itself. And, through these changes, it enables business transformation.

Virtualisation is a hot topic but it’s also part of something much larger – cloud computing. But rather than moving all of our IT services to the Internet, Gartner see virtualisation as a means to unlock cloud computing so that internal IT departments deliver services to business units in a manner that is more “cloud like”.

Bittman explained that in the past, our component-oriented approach has led to the management of silos for resource management, capacity planning and performance management.

Gartner: Virtualising the data centre – from silos to clouds

Then, as we realise how much these silos are costing, virtualisation is employed to drive down infrastructure costs and increase flexibility – a layer-oriented approach with pools of resource, and what he refers to as “elasticity” – the ability to “do things” much more quickly. Even that is only part of the journey though – by linking the pools of resource to the service level requirements of end users, an automated service-oriented approach can be created – an SOA in the form of cloud computing.

At the moment internal IT is still evolving, but external IT providers are starting to deliver service from the cloud (e.g. Google Apps, salesforce.com, etc.) – and that’s just the start of cloud computing.

Rather than defining cloud computing, Bittman described some of the key attributes:

  1. Service orientation.
  2. Utility pricing (either subsidised, or usage-based).
  3. Elasticity.
  4. Delivered over the Internet.

The first three of these are the same whether the cloud is internal or external.

Gartner: Virtualisation consolidation and deconsolidation

Virtualisation is not really about consolidation. It’s actually the decoupling of components that were previously combined – the application, operating system and hardware – to provide some level of abstraction. A hypervisor is just a service provider for compute resource to a virtual machine. Decoupling is only one part of what’s happening though, as the services may be delivered in different ways – what Gartner describes as alternative IT delivery models.

Technology is only one part of this transformation of IT – one of the biggest changes is the way in which we view IT as we move from buying components (e.g. a new server) to services (including thinking about how to consume those services – internally or from the cloud) and this is a cultural/mindset change.

Pricing and licensing also changes – no longer will serial numbers be tied to servers but new, usage-based, models will emerge.

IT funding will change too – with utility pricing leading to a fluid expansion and contraction of infrastructure as required to meet demands.

Speed of deployment is another change – as virtualisation allows for faster deployment and business IT users see the speed in which they can obtain new services, demand will also increase.

Management will be critical – processes and tools for managing service providers will need to flex with the delivery model across the various service layers.

And all of this leads towards cloud computing – not outsourcing everything to external providers, but enabling strategic change by using technologies such as virtualisation to allow internal IT to function in a manner which is more akin to an external service, whilst also changing the business’ ability to consume cloud services.

Software as a Service – or Software plus Services?

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

There’s a lot of media buzz right now about cloud computing – which encompasses both “web 2.0” and Software as a Service (SaaS). Whilst it’s undeniable that web services are becoming increasingly important, I’ll stand by my comments from a couple of years ago that the “webtop” will not be in mainstream use any time soon and those who are writing about the death of Microsoft Windows and Office are more than a little premature.

Even so, I was interested to hear Microsoft’s Kevin Sangwell explain the differences between SaaS and the Microsoft idea of software plus services (S+S) during the recent MCS Talks session on infrastructure architecture.

I’ve heard Microsoft executives talk about software plus services but Kevin’s explanation cuts through the marketing to look at what S+S really means in the context of traditional (on premise) computing and SaaS:

Kevin made the point that there is actually a continuum between on premise and SaaS solutions:

Software delivery continuum and software services taxonomy

  • We all understand the traditional software element – where software is installed and operated in-house (or possibly using a managed service provider).
  • Building block services are about using web services to provide an API on which to build applications “in the cloud” – Amazon’s Simple Storage Service (S3) is an example. This gives developers something to hook into and onto which to deliver a solution – for example, Jungle Disk uses the Amazon S3 platform to provide online storage and backup services (there’s a short sketch of the idea after this list).
  • Attached services provide self-contained functionality – for example anti-spam filtering of e-mail as it enters (or exits) an organisation.
  • Finished services are those that operate entirely as a web service – with salesforce.com being one, often quoted, example – Google Apps would be another (not that Microsoft are ever likely to promote that one…).
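
To make the “building block” idea concrete, here’s a minimal sketch of storing and retrieving a file with Amazon S3 – the example cited above – using the Python boto3 library. The bucket and file names are hypothetical and AWS credentials are assumed to be configured on the machine; this isn’t how Jungle Disk is implemented, just an illustration of building on someone else’s web service.

```python
# The "building block" idea in practice: a few lines against Amazon S3.
# A sketch using the boto3 library (pip install boto3); the bucket and key names are
# hypothetical and AWS credentials are assumed to be configured already.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-backup-bucket"  # hypothetical bucket name

# Push a local file into the storage service...
s3.upload_file("notes.txt", BUCKET, "backups/notes.txt")

# ...and pull it back down later, from any machine with the same credentials.
s3.download_file(BUCKET, "backups/notes.txt", "notes-restored.txt")

# An online backup tool builds its whole product on calls like these.
```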

S+S is about creating a real-world hybrid – not just traditional or cloud computing but a combination of software and services – for example an organisation may use a hosted Exchange Server service but they probably still use Microsoft Outlook (or equivalent software) on a PC.

So, would moving IT services off to the cloud make all the associated IT challenges disappear? Almost certainly not! All this would lead to is a disjointed service and lots of unhappy business users. SaaS and S+S do not usually remove IT challenges altogether but they replace them with new ones – typically around service delivery (e.g. managing service level agreements, integrating various operational teams, etc.) and service support (e.g. presenting a coherent service desk with appropriate escalation between multiple service providers and the ability to assess whether a problem relates to internal IT or the hosted service) but also in relation to security (e.g. identity lifecycle management and information rights management).

Kevin has written an article for The [MSDN] Architecture Journal on the implications of software plus services consumption for enterprise IT and, for those who are interested in learning more about S+S, it’s worth a read.

The Microsoft view of connected systems

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A few weeks back I was at a breakfast briefing on connected systems (Microsoft’s view of web services and BizTalk Server), delivered by David Gristwood (one of Microsoft UK’s Architect Evangelists). Even though I’m not a developer, I think I understood most of what David had to say (although many of my colleagues’ blogs will undoubtedly have more to offer in this subject area).

David explained how the need to connect applications has led to a shift towards service orientation, as applications have a longer lifetime and no longer consist of just a single executable program. Consequently there are requirements for application customisation and integration (generally loosely coupled), with the four tenets of a service oriented architecture (SOA) being:

  • Explicit boundaries.
  • Autonomous services (i.e. other services do not have to wait for your schedule).
  • Shared schema and contract (not class).
  • Compatibility based on policy (generally written in XML).

The Web Services Interoperability Organization‘s WS-* architecture is about providing a framework for web services with broad industry support (in the same way that the open system interconnection 7 layer network model has become the industry model for networking).

WS-I web services standards stack

Basic web services operate well but are easy to make inoperable. As such WS-I is concerned with identifying the lowest common denominator – the basic profile (BP) or core set of specifications that provide the foundation for web services.

When developing web services, Visual Studio 2005 (codenamed Whidbey) will represent a huge step forward with the Microsoft .NET Framework v2.0 including numerous improvements in the web services protocol stack and ASMX (ASP.NET web services) representing an ongoing evolution towards the Windows communication foundation (formerly codenamed Indigo).

Although coding first and using web methods is still a good way to produce web services, there is a move to interface-based service contracts – first designing the interface using the Web Services Description Language (WSDL) and then adding contracts. The new application connection designer (ACD) (codenamed Whitehorse) is a visual tool to drag and drop connections which represent service contracts, allowing the generation of skeleton projects and the basic code required to implement/consume contracts.

In terms of standards and interoperability, this code is WS-I BP 1.1 compliant by default (and hence fits well into the WS-* architecture), whilst ASMX web services automatically support simple object access protocol (SOAP) 1.1 and 1.2.
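
As an aside, the value of a shared contract isn’t limited to Visual Studio: given only the WSDL, a client on any platform can generate a proxy and call the service. Here’s a hedged sketch using the Python zeep library rather than Microsoft’s tooling – the service URL and the GetQuote operation are hypothetical.

```python
# What a shared schema and contract buys the consumer: given only the WSDL, a typed
# client can be generated without knowing anything about the classes behind the service.
# Sketch using the zeep library (pip install zeep); the URL and operation are hypothetical.
from zeep import Client

# The WSDL is the contract: zeep reads it and builds a proxy on the fly.
client = Client("http://example.org/stockservice.asmx?WSDL")

# Operations defined in the contract become methods on client.service.
result = client.service.GetQuote(symbol="MSFT")
print(result)
```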

Web services enhancements (WSE) is a fully supported download which sits on top of ASMX and extends the existing web services support within the Microsoft .NET Framework. WSE is a set of classes to implement on-the-wire standards and is actually an implementation of several WS-* specifications including WS-Addressing and WS-Security, to provide end-to-end message-level security (in a more sophisticated manner than SOAP over HTTP). The current version is WSE 2.0 SP3, and WSE 3.0 will be released with Visual Studio 2005 (due to a dependency on the Microsoft .NET Framework v2.0), with new features including:

  • Message transmission optimization mechanism (MTOM) for binary data transfer, replacing SOAP with attachments and WS-Attachments/direct Internet message encapsulation (DIME).
  • Enhancements to WS-Security/WS-Policy.

It should be noted that there are no guarantees that WSE 2.0 and 3.0 will be wire-level or object-model compatible, but there will be side-by-side support for the two versions. WSE 3.0 is likely to be wire-compatible with the Windows communication foundation (which will ultimately replace WSE around the end of 2006).

The Windows communication foundation itself is about productivity (writing less code), interoperability (binary, highly-optimised interoperability between computers, dropping to WS-I BP 1.1 if required) and service oriented development. Implemented as a set of classes, the Windows communication foundation takes messages, transforms them, maps them to a structure and pushes them to the receiving code.

To illustrate the productivity gains, using an example cited by Microsoft, an application implemented using Visual Studio .NET 2003 consisting of 56296 lines of code (20379 lines of security, 5988 lines for reliable messaging, 25507 lines for transactions, and 4442 lines for infrastructure) was reduced using WSE to 27321 lines of code (10 lines for security, 1804 lines for reliable messaging, and no change to the 25507 lines for transactions) and reduced further using the Windows communication foundation to just 3 lines of code (1 line for security, 1 line for reliable messaging and 1 line for the transactions)! This sounds extreme to me; but even an infrastructure architect like myself can appreciate that less code means easier management.

Evolution of the Microsoft .NET Framework

In terms of a roadmap, the Windows communication foundation will supersede existing connected systems technologies (e.g. ASMX), but other technologies will continue to exist, supported by the Windows communication foundation (e.g. enterprise services, .NET remoting, COM, COM+ and MSMQ).

Another tool in Microsoft’s integration arsenal is the SQL Server 2005 Service Broker, which will provide a SQL-to-SQL binary data messaging protocol, allowing developers who are familiar with the database programming model to think about queues as databases and to take data from queues as a kind of hanging query/result set. Over time, this will be adapted to use the Windows communication foundation so that this will run on top of the Service Broker protocol before eventually allowing the Windows communication foundation to become the transport for WS-* interoperability.

Web services integration

Of course, Microsoft’s most significant integration product for connected systems is BizTalk Server. At 1.5 million lines of C# code, BizTalk Server 2004 is one of the largest Microsoft .NET products written to date (although SQL Server 2005 will exceed this at around 3 million lines). BizTalk Server allows the mesh of point-to-point web service (and other) connections to be replaced with a BizTalk Server “hub”.

Microsoft BizTalk Server
Another advantage of such a process is the ability to take business processes out of (potentially unstable) code and allow BizTalk’s orchestration model to handle the business processes.

BizTalk Server 2004 is the first Microsoft .NET incarnation of the product (the previous two versions were not .NET applications). Built on the ASP.NET application stack and including WS-I v1.0 support (and a v2.0 adapter), BizTalk Server 2004 is integrated with Visual Studio .NET 2003 and the Office System 2003, with additional features including business activity monitoring, human workflow services and a business rules engine. BizTalk Server 2006 is due to follow the SQL Server 2005 and Visual Studio 2005 launch in November 2005 and (according to Microsoft) will offer simplified setup, migration and deployment, comprehensive management and operations, and business user empowerment. An adapter for the Windows communication foundation is also expected later in 2006. Future versions of BizTalk Server will be built natively on the Windows communication foundation and will offer support for the next version of Windows Server (codenamed Longhorn) as well as the dynamic systems initiative (DSI).

Ultimately, the Windows communication foundation will become a transport layer for connected systems, with BizTalk Server providing orchestration. With continued support for WS-* standards, truly connected systems may well become a reality for many organisations.

Links

WS-I overview presentation