Useful Links: April 2011

This content is 13 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:

Embed your Xbox Live Profile in a website

Yesterday I wrote a post about accessing your Xbox Live avatar for use elsewhere. That got me thinking – could I also access other parts of my Xbox Live profile? (Yes!)

The following URL can be used to access a user’s profile:

  • http://gamercard.xbox.com/gamertag.card

So, by embedding that URL in a page (with my own gamertag substituted for gamertag), I get my gamercard displayed right there on the site.
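
For anyone who wants to do the same, the gamercard can typically be dropped into a page with a simple iframe along these lines – a sketch only, with a placeholder gamertag and approximate dimensions:

  <!-- Hypothetical gamercard embed: replace MyGamertag with the gamertag in question.
       The width and height values are guesses; adjust them to suit your page layout. -->
  <iframe src="http://gamercard.xbox.com/MyGamertag.card"
          width="204" height="140" frameborder="0" scrolling="no"></iframe>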

Want to download your Xbox Live avatar for use elsewhere? Here’s how

I’m a fairly recent convert to the Xbox 360 but I do think the avatars are pretty cool (as do my kids). And, if I can use my avatar on Xbox Live, why not anywhere else?

It turns out that you can.

The clue was in a blog post about the “Next Generation” Xbox interface from 2008 and it seems that you can pick up your avatar in various formats using these URLs:

  • http://avatar.xboxlive.com/avatar/gamertag/avatar-body.png
  • http://avatar.xboxlive.com/avatar/gamertag/avatarpic-l.png
  • http://avatar.xboxlive.com/avatar/gamertag/avatarpic-s.png

By substituting my gamertag into this URL structure, I was able to access the following images:
[My Xbox Live avatar: full body, large gamer picture and small gamer picture]
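
To use them on a web page, the images can simply be referenced directly – a minimal sketch, using a placeholder gamertag (substitute your own):

  <!-- Hypothetical example: replace MyGamertag with your own Xbox Live gamertag -->
  <img src="http://avatar.xboxlive.com/avatar/MyGamertag/avatar-body.png" alt="Xbox Live avatar (full body)" />
  <img src="http://avatar.xboxlive.com/avatar/MyGamertag/avatarpic-l.png" alt="Xbox Live avatar (large picture)" />
  <img src="http://avatar.xboxlive.com/avatar/MyGamertag/avatarpic-s.png" alt="Xbox Live avatar (small picture)" />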

(There used to be a tool called free your avatar, but that seems to have gone AWOL)

Trouble with (mobile) telcos

I wanted to start this post by asking “How can you tell if a mobile phone salesman is lying? Because his lips move!” but I’m slightly concerned that might be slanderous, or libellous, or something – so let me just say that I’m talking about a hypothetical salesman and it was intended as a joke, so don’t sue me. OK?

As for what caused me to say that – well, given the trouble I’m having with not one but three of the UK’s five mobile network operators, I think I’m entitled to a slightly jaundiced view of the “services” that they provide.

O2

When I bought my iPad last year, I signed up for a monthly rolling data contract with O2. One of the reasons for selecting O2 was that the tariff included access to BT Openzone Wi-Fi hotspots. Except it’s not that simple, as the BT Openzone SSID is used for three products: BT Openzone, BT Business Hubs and BT Fon. At first, I couldn’t get my iPad to work on any BT Openzone hotspot and, even after that was fixed by a support call to O2, I’ve found that many “BT Openzone” hotspots still require payment. Without the benefit of bundled Wi-Fi, I considered switching networks, but I had built up around 5GB of unused data that was rolling forward each month.

Until today.

I received an email from O2 advising me that payment for my 30-day rolling contract had been declined. Realising that I had recently been issued a new debit card, I updated the payment details, but I still couldn’t make a payment. I phoned my bank, who assured me there was no problem at their end, so I called O2. As friendly as the call centre staff were, they were unable to help me because, by then, my account had been frozen for 24 hours. In a few hours’ time my account would be cancelled – before that 24-hour window expired – and with it would go my 5GB of unused data.

So long O2 – if this is how you repay me, I’m off to another network.

Three

I don’t know what Orange/T-Mobile’s data coverage is like, but their voice coverage where I live is terrible. Meanwhile, my Twitter followers have been extolling the virtues of Three to me, so I thought I’d save myself £2.50 a month and give them a try. On the way home, I dropped into Three’s Baker Street store in London and bought a 3G micro-SIM for my iPad. The guy who served me could barely string two words together as he grunted at me (presumably he doesn’t make any commission on SIM-only sales) but he was clear about two things:

  • The SIM was already activated – and would work within a few minutes.
  • I only needed to call if I wanted to set up a rolling contract – otherwise I could just start using it.

Wrong. And wrong again.

When it became apparent that the SIM was not active, I called Three, who took me through the credit check process and set up a monthly payment. At the end of the call I was told that it would take between 3 and 24 hours to activate my SIM.  Not quite the instant access I was promised when I bought it. No mobile communications for me on the way home tonight…

Vodafone

As an aside, my employer is currently moving their corporate mobile service from Vodafone to O2. We have a block of numbers allocated exclusively for our use and, for the last few years, network coverage has been good – right up until we started the migration. Nowadays, it’s not uncommon for me to have no signal in the office, where previously there was good reception. At home my wife often has a full signal, while I have none. And calls frequently go straight to voicemail without the phone ringing. Could it be that Vodafone has prioritised traffic on its network so that +44786782/3xxxx numbers are no longer treated as belonging to a “valued business customer” and are instead bumped off the network when required? I can’t prove it, and I’m sure Vodafone will deny it, but I’m not the only one experiencing this sort of issue.

As bad as each other?

Time will tell how my new mobile data contract with Three works out, and my O2 signal at home is okay so hopefully it will be fine when my company phone is migrated too. In the meantime, I have to say that I’m underwhelmed with all the UK’s mobile telcos: there’s plenty of room for improvement and whoever can deliver excellent network coverage for a competitive price, backed up with excellent customer service, has the potential to clean up. I suspect that might be a while coming…

[Update 4 May 2011: O2 responded to my whinges on Twitter – but were too late to stop me from leaving as I’d already bought the SIM from Three by then. They have also promised a credit to my mobile phone account to compensate me for the lost data but could not match the deal I’m getting from Three for mobile broadband. It’s unfortunate that they couldn’t help when I first called but they have at least taken steps to retain me as a customer]

Does Microsoft Kinect herald the start of a sensor revolution?

Last week, Microsoft officially announced a software development kit for the Kinect sensor. Whilst there’s no commercial licensing model yet, that sounds like it’s the start of a journey to official support of gesture-based interaction with Windows PCs.

There’s little doubt that Kinect, Microsoft’s natural user interface for the Xbox game console, has been phenomenally successful. It’s even been recognised as the fastest-selling consumer device on record by Guinness World Records. I even bought one for my family (and I’m not really a gamer) – but before we go predicting the potential business uses for this technology, it’s probably worth stopping and taking stock. Isn’t this really just another technology fad?

Kinect is not the first new user interface for a computer – I’ve written so much about touch-screen interaction recently that even I’m bored of hearing about tablets! We can also interact with our computers using speech if we choose to – and the keyboard and mouse are still hanging on in there too (in a variety of forms). All of these technologies sound great, but they have to be applied at the right time: my iPad’s touch screen is great for flicking through photos, but an external keyboard is better for composing text; Kinect is a fantastic way to interact with games but, frankly, it’s pretty poor as a navigational tool.

What we’re really seeing here is a proliferation of sensors. Keyboard, mouse, trackpad, microphone, camera(s), GPS, compass, heart monitor – the list goes on. Kinect is really just an advanced, and very consumable, sensor.

Interestingly, sensors typically start out as separate peripherals and, over time, they become embedded into devices. The mouse and keyboard morphed into a trackpad and a (smaller) keyboard. Microphones and speakers were once external but are now built into our personal computers. Our smartphones contain a wealth of sensors including GPS, cameras and more. Will we see Kinect built into PCs? Quite possibly – after all, it’s really just a couple of depth sensors and a webcam!

What’s really exciting is not Kinect per se but what it represents: a sensor revolution. Much has been written about the Internet of Things but imagine a dynamic network of sensors where the nodes can automatically handle re-routing of messages based on an awareness of the other nodes. Such networks could be quickly and easily deployed (perhaps even dropped from a plane) and would be highly resilient to accidental or deliberate damage because of their “self-healing” properties. Another example of sensor use could be in an agricultural scenario with sensors automatically monitoring the state of the soil, moisture, etc. and applying nutrients or water. We’re used to hearing about RFID tags in retail and logistics but those really are just the tip of the iceberg.

Exciting times…

[This post originally appeared on the Fujitsu UK and Ireland CTO Blog and was jointly authored with Ian Mitchell.]

Azure Connect – the missing link between on-premise and cloud

Azure Connect offers a way to connect on-premise infrastructure with Windows Azure but it’s lacking functionality that may hinder adoption.

While Microsoft is one of the most dominant players in client-server computing, until recently its position in the cloud seemed uncertain. Now we’ve seen Microsoft lay out its stall, with both Software as a Service (SaaS) products including Office 365 and Platform as a Service (PaaS) offerings such as Windows Azure joining its traditional portfolio of on-premise products for consumers, small businesses and enterprise customers alike.

Whereas Amazon’s Elastic Compute Cloud (EC2) and Simple Storage Service (S3) offer virtualised Infrastructure as a Service (IaaS) and Salesforce.com is about consumption of Software as a Service (SaaS), Windows Azure fits somewhere in between. Azure offers compute and storage services, so that an organisation can take an existing application, wrap a service model around it and specify how many instances to run, how to persist data, etc.

Microsoft also provides middleware to support claims-based authentication and an application fabric that allows simplified connectivity between web services endpoints, negotiating firewalls using outbound connections and standard Internet protocols. In addition, there is a relational database component (SQL Azure), which exposes relational database services for cloud consumption, alongside the standard Azure table storage.

It all sounds great – but so far everything I’ve discussed runs on a public cloud service and not all applications can be moved in their entirety to the cloud.

Sometimes it makes sense to move compute operations to the cloud and keep the data on-premise (more on that in a moment). Sometimes it’s appropriate to build a data hub, with multiple business partners connecting to a data source in the cloud but with application components in a variety of locations.

For European CIOs, information security, in particular data residency, is a real issue. I should highlight that I’m not a legal expert, but CIO Magazine recently reported how the Patriot Act potentially gives the United States authorities access to data hosted with US-based service providers – and selecting a European data centre won’t help.  That might make CIOs nervous about placing certain types of data in the cloud although they might consider a hybrid cloud solution.

Azure already provides federated security, application layer connectivity (via AppFabric) and some options for SQL Azure data synchronisation (currently limited to synchronisation between Microsoft data centres, expanding later this year to include synchronisation with on-premise SQL Server) but the missing component has been the ability to connect Windows Azure with on-premise infrastructure and applications. Windows Azure Connect provides this missing piece of the jigsaw.

Azure Connect is a new component for Windows Azure that provides secure network communications between compute instances in Azure and servers on-premise (ie behind the corporate firewall). Using standard IP protocols (both TCP and UDP), it’s possible to take a web front end to the cloud and leave the SQL Server data on site, communicating over a virtual private network secured with IPSec. In another scenario, a compute instance can be joined to an on-premise Active Directory domain so that a cloud-based application can take advantage of single sign-on functionality. IT departments can also use Azure Connect for remote administration and troubleshooting of cloud-based compute instances.

Azure Connect is currently in pre-release form, and Microsoft plans to make it available during the first half of 2011. Whilst setup is relatively simple and requires no coding, Azure Connect relies on an agent running on the connected infrastructure (ie on each server that connects to Azure resources) in order to establish IPSec connectivity (a future version of Azure Connect will be able to take advantage of other VPN solutions). Once the agent is installed, the server automatically registers itself with the Azure Connect relay in the cloud and network policies are defined to manage connectivity. All that an administrator has to do is: enable Windows Azure roles for external connectivity via the service model; enable local computers to initiate an IPSec connection by installing the Azure Connect agent; define network policies; and, in some circumstances, define appropriate outbound firewall rules on servers.

The emphasis on simplicity is definitely an advantage: many Azure operations seem to require developer knowledge, whereas this is clearly targeted at Windows administrators. Along with automatic IPSec provisioning (so no need for certificate servers), Azure Connect makes use of DNS so that there is no requirement to change application code (the same server names can be used when roles move between the on-premise infrastructure and Azure).

For some organisations though, the presence of the Azure Connect agent may be seen as a security issue – after all, how many database servers are even Internet-connected? That’s not insurmountable but it’s not the only issue with Azure Connect.

For example, connected servers need to run Windows Vista, 7, Server 2008, or Server 2008 R2 [a previous version of this story erroneously suggested that only Windows Server 2008 R2 was supported] and many organisations will be running their applications on older operating system releases. This means that there may be server upgrade costs to consider when integrating with the cloud – and it certainly rules out any heterogeneous environments.

There’s also an issue with storage. Windows Azure’s basic compute and storage services can make use of table-based storage. Whilst SQL Azure is available for applications that require a relational database, not all applications have this requirement – and SQL Azure presents additional licensing costs as well as imposing additional architectural complexity. A significant number of cloud-based applications make use of table storage, or a combination of table storage and SQL Server – for those applications, a hybrid model that keeps data in on-premise storage may not be possible.

For many enterprises, Azure Connect will be a useful tool in moving applications (or parts of applications) to the cloud. If Microsoft can overcome the product’s limitations, it could represent a huge step forward for the company’s cloud services in that it provides a real option for developing hybrid cloud solutions on the Microsoft stack – but there is still some way to go.

[This post was originally written as an article for Cloud Pro.]

Crowdsourcing a digital future at The Fantastic Tavern (#TFTLondon)

Earlier in the year, I blogged about my first visit to The Fantastic Tavern (TFT)  – a meeting of people to discuss ideas, over beer, with the common theme being that we all do something in “digital”.

Last night’s meeting was an opportunity to crowdsource ideas for the enablement of the digital future on London’s Greenwich Peninsula. When people think of Greenwich, they tend to think of a Royal Borough, Victorian architecture, Greenwich Park and the prime meridian, but Greenwich Peninsula is an area of reclaimed industrial land on which stand the O2 Arena and the Ravensbourne Hub Digital Innovation Centre, with plans for 3.5 million square feet of business space, 10,000 new homes, schools, healthcare and other community facilities, the Thames cable-car link to the Siemens Pavilion at ExCeL, a cruise liner terminal, beaches – and the opportunity to make Greenwich an exemplar for future living.

I won’t talk about the ideas that we brainstormed – that’s between the taverners and the Digital Greenwich Advisory Board – but the whole process seemed, to me, to be a model of how crowdsourcing can work to drive innovation.

Around 50 taverners were there, broken out into four groups in an open spaces brainstorm to look at education, governance, business/commerce, and community. People could move between conversations and the ideas were written/drawn on huge boards. At the end of the evening, each group presented their topic and, inevitably, that generated even more discussion.

Hopefully, what we came up with provides inspiration for Greenwich Council. At the very least it was an opportunity to experience a real-world crowdsourcing event – one which seemed to me to be very successful indeed.

If you’re interested in future TFT events, find out more at The Fantastic Tavern site (or @TFTLondonNYC).

Microsoft Hyper-V: A reminder of where we’re at

Earlier this week I saw a tweet from the MIX 2011 conference that highlighted how Microsoft’s Office 365 software as a service platform runs entirely on their Hyper-V hypervisor.

There are those (generally those who have a big investment in VMware technologies) who say Microsoft’s hypervisor lacks the features to make it suitable for use in the enterprise. I don’t know how much bigger you have to get than Office 365, but the choice of hypervisor is becoming less and less relevant as we move away from infrastructure and concentrate more on the platform.

Even so, now that Hyper-V has reached the magical version 3 milestone (at which people generally start to accept Microsoft products), I thought it was worth a post to look back at where Hyper-V has come from, and where it’s at now.

Looking at some of the technical features:

  • Dynamic memory requires Windows Server 2003 SP2 or later in the guest (and is not yet supported for Linux guests). It’s important to understand the differences between over-subscription and over-commitment.
  • Performance is now so close that it’s no longer a differentiator between hypervisors.
  • Hyper-V uses Windows clustering for high availability – the same technology as is used for live migration.
  • In terms of storage scalability, it’s up to the customer to choose how to slice and dice storage, with partner support for multipathing, hardware snapshotting, etc. Hyper-V users can have one LUN for each VM, or one LUN for 1,000 VMs (of course, no-one would actually do the latter).
  • Networking also uses the partner ecosystem – for example, HP provides software to allow NIC teaming on its servers, and Hyper-V can use a virtual switch to point to this.
  • In terms of data protection, the Volume Shadow Copy Service on the host is used and there are a number of choices to make around agent placement. A single agent can be deployed to the host, with all guests protected (allowing whole-machine recovery), or guests can have their own agents to allow backups at the application level (for Exchange, SQL Server, etc.).

I’m sure that competitor products may have a longer list of features but, in terms of capability, Hyper-V is “good enough” for most scenarios I can think of. I’d be interested to hear: what barriers to enterprise adoption do people see for Hyper-V?

Justifying a Windows/Office update – those “little things” add up

It’s often hard to justify a Windows or Office upgrade, but I think I might just have found a way to identify some of the advantages – try going back to an older version.

A few weeks ago, my company-supplied notebook was rebuilt onto the corporate standard build. I realised that it’s been about 4 or 5 years since I was last in that situation, as I’ve always been in a position to be trialling new versions of Windows and Office but these days my role is largely non-technical (so I have no real justification to be different to anyone else) and my team actually sits within the IT department (so I guess I should be setting an example!). I do have local administration rights on the machine, and I did install some software that I need for my role, but which is officially unsupported (examples would be TweetDeck, Nokia PC Suite and the drivers for my company-supplied HP OfficeJet printer). I also tweaked some power settings and turned off the corporate screen saver (thereby keeping my green credentials intact by balancing the lack of automatic shutdown with the lack of increased processor/fan activity to run a screensaver) but I’ve been trying to stick to the company build where possible/practical. That means I’m back to Office 2007 (with Visio 2003) although I am at least on a Windows 7 (x64) build in order that I can use all 4GB of RAM in my notebook.

I have to say that it’s been driving me insane. I had a similar experience when I went back to XP for a couple of days after a hard drive failure a couple of years ago but I’ve really missed some of the newer functionality – particularly in Outlook 2010:

  • I’ve lost my Quick Steps (I use them for marking an e-mail as read and moving to my archive folder in one action, or for sending a team e-mail).
  • Conversation view is different (I can’t tell you how, but I’m missing some new e-mails as a result).
  • When I receive a meeting request, I don’t see my other appointments for that day in the request.
  • [Update 15 April 2011: Access to multiple Exchange accounts from one Outlook instance.]

These are just examples off the top of my head – I should have noted each feature I’ve missed in recent weeks, but I didn’t (maybe I’ll come back and edit the post later). For a knowledge worker like me, though, they are significant: a few extra minutes in Outlook each of the 7-8 times a day that I triage email soon adds up – at four minutes a time, that’s roughly half an hour of lost productivity, every day.

…none of this is likely to convince a company to invest in an upgrade, even if they have the software (software costs are generally quite insignificant in relation to resource costs), but it’s all part of the business case – employee productivity is never easy to measure, but the little things do add up.

I’m now running Internet Explorer 9 (I need to test certain websites on the latest browser version), although I’m ready to revert to 8 if there are issues with any of the business applications I need to use, and my PC is fully patched including the latest service pack. I am resisting the temptation to install my own (licensed) copy of Office 2010 though… at least for now.

Is the Windows Experience Index really of any value?

Those who follow me on Twitter (@markwilsonit) may have seen a few comments about the Windows Vista laptop that I’m currently fixing for a family member, who decided not to “bother” me when they bought a new computer, yet still relies on me for help when it doesn’t work as intended…

The laptop was woefully underpowered, with just 1GB of RAM (but only 768MB available) and an Intel Celeron 540 CPU running at 1.87GHz.  Patching the operating system seemed to improve things slightly (it was running Windows Vista RTM, with no updates successfully applied for over 18 months) but what it really needed was more RAM. The Crucial System Scanner told me it had a single memory module, with room for one more, so I invested the princely sum of £13.67 in making the system usable.

Not surprisingly, the addition of the extra memory to the machine changed the Windows Experience Index values for memory operations per second but it also significantly increased the graphics score:

Fujitsu-Siemens Esprimo V5535 (Celeron 540) – Windows Experience Index scores with 1GB and 2GB of RAM:

  Component           What is rated?                                 1GB RAM   2GB RAM
  Processor           Calculations per second                        4.1       4.1
  Memory (RAM)        Memory operations per second                   3.9       4.4
  Graphics            Desktop performance for Windows Aero           3.5       4.9
  Gaming graphics     3D business and gaming graphics performance    3.2       3.2
  Primary hard disk   Disk data transfer rate                        5.1       5.1

Unfortunately, Windows Vista Home Basic doesn’t include Aero (there are some workarounds on the ‘net but they didn’t seem to work for me), so I left the system running as normal.

What I found bizarre though was that even the crippled system with 1GB of RAM and only a few MB free (which was so slow as to be almost unusable) had similar Windows Experience Index scores to my everyday laptop – a much more powerful machine with an Intel Core 2 Duo P8400 CPU at 2.26GHz, 4GB RAM and Windows 7 x64:

Fujitsu-Siemens Lifebook S7220 (Core 2 Duo P8400, 4GB RAM) – Windows Experience Index scores:

  Component           What is rated?                                 Score
  Processor           Calculations per second                        3.1
  Memory (RAM)        Memory operations per second                   4.3
  Graphics            Desktop performance for Windows Aero           4.1
  Gaming graphics     3D business and gaming graphics performance    3.4
  Primary hard disk   Disk data transfer rate                        4.5

Perhaps Microsoft updated the Windows Experience Index algorithm between Vista and 7, or between 32- and 64-bit systems (I thought they just increased the maximum score from 5.9 to 7.9), but it seems to make a mockery of the “experience index” when a basic consumer system scores more highly than a mid-range business machine.