A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:
Yesterday I wrote a post about accessing your Xbox Live avatar for use elsewhere. That got me thinking – could I also access other parts of my Xbox Live profile? (Yes!)
The following URL can be used to access a user’s profile:
So, if I embed this code:
<iframe src="http://gamercard.xbox.com/markawilson.card" scrolling="no" frameBorder="0" height="140" width="204">markawilson</iframe>
I’m a fairly recent convert to the Xbox 360 but I do think the avatars are pretty cool (as do my kids). And, if I can use the Avatar on Xbox Live, why not anywhere else?
It turns out that you can.
The clue was in a blog post about the “Next Generation” Xbox interface from 2008 and it seems that you can pick up your avatar in various formats using these URLs:
By substituting my gamertag into this URL structure, I was able to access the following images:
(There used to be a tool called free your avatar, but that seems to have gone AWOL)
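As a side note, the gamercard URL is just the gamertag with a `.card` suffix, so generating the embed markup for any player is trivial. Here’s a quick Python sketch that generalises the embed code above (the dimensions are the ones used in that snippet):

```python
def gamercard_url(gamertag: str) -> str:
    """Build the public gamercard URL for a gamertag
    (pattern taken from the embed code above)."""
    return "http://gamercard.xbox.com/{0}.card".format(gamertag)

def gamercard_iframe(gamertag: str, width: int = 204, height: int = 140) -> str:
    """Generate the iframe embed markup for any gamertag."""
    return ('<iframe src="{url}" scrolling="no" frameBorder="0" '
            'height="{h}" width="{w}">{tag}</iframe>').format(
                url=gamercard_url(gamertag), h=height, w=width, tag=gamertag)

print(gamercard_iframe("markawilson"))
```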
I wanted to start this post by asking “How can you tell if a mobile phone salesman is lying? Because his lips move!” but I’m slightly concerned that might be slanderous, or libellous, or something – so let me just say that I’m talking about a hypothetical salesman and it was intended as a joke, so don’t sue me. OK?
As for what caused me to say that – well, given the trouble I’m having with not one but three of the UK’s five mobile network operators, I think I’m entitled to a slightly jaundiced view of the “services” that they provide.
When I bought my iPad last year, I signed up for a monthly rolling data contract with O2. One of the reasons for selecting O2 was that the tariff included access to BT Openzone Wi-Fi hotspots. Except it’s not that simple, as the BT Openzone SSID is used for three products: BT Openzone, BT Business Hubs and BT Fon. At first, I couldn’t get my iPad to work on any BT Openzone hotspot and, even after that was fixed by a support call to O2, I’ve found that many “BT Openzone” hotspots still require payment. Without the benefit of bundled Wi-Fi, I considered switching networks but had built up around 5GB of unused data that was rolling forward each month.
I received an email from O2 advising me that payment for my 30-day rolling contract had been declined. Realising that I had recently been issued a new debit card, I updated the payment details but still couldn’t make a payment. I phoned my bank, who assured me there was no problem at their end, so I called O2. As friendly as the call centre staff were, they were unable to help me because, by then, my account had been frozen for 24 hours. In a few hours’ time, before that 24-hour window expired, the account would be cancelled – and with it would go my 5GB of unused data.
So long O2 – if this is how you repay me, I’m off to another network.
I don’t know what Orange/T-Mobile’s data coverage is like but voice coverage where I live is terrible. Meanwhile, my Twitter followers have been extolling the virtues of Three to me, so I thought I’d save myself £2.50 a month and give them a try. On the way home, I dropped into a Three store and bought a 3G micro-SIM for my iPad. The guy who served me in the London Baker Street store could barely string two words together as he grunted at me (presumably he doesn’t make any commission on SIM-only sales) but he was clear about two things:
- The SIM was already activated – and would work within a few minutes.
- I only needed to call if I wanted to set up a rolling contract – otherwise I could just start using it.
Wrong. And wrong again.
When it became apparent that the SIM was not active, I called Three, who took me through the credit check process and set up a monthly payment. At the end of the call I was told that it would take between 3 and 24 hours to activate my SIM. Not quite the instant access I was promised when I bought it. No mobile communications for me on the way home tonight…
As an aside, my employer is currently moving their corporate mobile service from Vodafone to O2. We have a block of numbers allocated exclusively for our use and, for the last few years, network coverage has been good – right until we started the migration. Nowadays, it’s not uncommon for me to have no signal in the office, where previously there was good reception. At home my wife often has a full signal, while I have none. And calls frequently go straight to voicemail without the phone ringing. Could it be that Vodafone has prioritised traffic on its network so that +44786782/3xxxx numbers are no longer treated as a “valued business customer” and are instead bumped off the network when required? I can’t prove it, and I’m sure Vodafone will deny it, but I’m not the only one experiencing these sorts of issues.
As bad as each other?
Time will tell how my new mobile data contract with Three works out, and my O2 signal at home is okay so hopefully it will be fine when my company phone is migrated too. In the meantime, I have to say that I’m underwhelmed with all the UK’s mobile telcos: there’s plenty of room for improvement and whoever can deliver excellent network coverage for a competitive price, backed up with excellent customer service, has the potential to clean up. I suspect that might be a while coming…
[Update 4 May 2011: O2 responded to my whinges on Twitter – but were too late to stop me from leaving as I’d already bought the SIM from Three by then. They have also promised a credit to my mobile phone account to compensate me for the lost data but could not match the deal I’m getting from Three for mobile broadband. It’s unfortunate that they couldn’t help when I first called but they have at least taken steps to retain me as a customer]
Last week, Microsoft officially announced a software development kit for the Kinect sensor. Whilst there’s no commercial licensing model yet, that sounds like it’s the start of a journey to official support of gesture-based interaction with Windows PCs.
There’s little doubt that Kinect, Microsoft’s natural user interface for the Xbox game console, has been phenomenally successful. It’s even been recognised as the fastest-selling consumer device on record by Guinness World Records. I even bought one for my family (and I’m not really a gamer) – but before we go predicting the potential business uses for this technology, it’s probably worth stopping and taking stock. Isn’t this really just another technology fad?
Kinect is not the first new user interface for a computer – I’ve written so much about touch-screen interaction recently that even I’m bored of hearing about tablets! We can also interact with our computers using speech if we choose to – and the keyboard and mouse are still hanging on in there too (in a variety of forms). All of these technologies sound great, but they have to be applied at the right time: my iPad’s touch screen is great for flicking through photos, but an external keyboard is better for composing text; Kinect is a fantastic way to interact with games but, frankly, it’s pretty poor as a navigational tool.
What we’re really seeing here is a proliferation of sensors. Keyboard, mouse, trackpad, microphone, camera(s), GPS, compass, heart monitor – the list goes on. Kinect is really just an advanced, and very consumable, sensor.
Interestingly, sensors typically start out as separate peripherals and, over time, become embedded into devices. The mouse and keyboard morphed into a trackpad and a (smaller) keyboard. Microphones and speakers were once external but are now built into our personal computers. Our smartphones contain a wealth of sensors including GPS, cameras and more. Will we see Kinect built into PCs? Quite possibly – after all, it’s really a couple of depth sensors and a webcam!
What’s really exciting is not Kinect per se but what it represents: a sensor revolution. Much has been written about the Internet of Things but imagine a dynamic network of sensors where the nodes can automatically handle re-routing of messages based on an awareness of the other nodes. Such networks could be quickly and easily deployed (perhaps even dropped from a plane) and would be highly resilient to accidental or deliberate damage because of their “self-healing” properties. Another example of sensor use could be in an agricultural scenario with sensors automatically monitoring the state of the soil, moisture, etc. and applying nutrients or water. We’re used to hearing about RFID tags in retail and logistics but those really are just the tip of the iceberg.
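As a toy illustration of that “self-healing” idea, here’s a minimal Python sketch: a mesh of sensor nodes finds a route by hop count using breadth-first search and, when a node is knocked out, simply routes around it. The topology and node names are invented for the example – real sensor networks use far more sophisticated routing protocols:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search for a minimum-hop path; returns None if unreachable."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None

# A hypothetical mesh of sensor nodes
mesh = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C", "E"},
    "E": {"D"},
}

print(shortest_path(mesh, "A", "E"))  # a 4-node path, e.g. ['A', 'B', 'D', 'E']

# Simulate node B being damaged: remove it and let the network "heal"
damaged = {n: nbrs - {"B"} for n, nbrs in mesh.items() if n != "B"}
print(shortest_path(damaged, "A", "E"))  # ['A', 'C', 'D', 'E']
```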
[This post originally appeared on the Fujitsu UK and Ireland CTO Blog and was jointly authored with Ian Mitchell.]
Earlier in the year, I blogged about my first visit to The Fantastic Tavern (TFT) – a meeting of people to discuss ideas, over beer, with the common theme being that we all do something in “digital”.
Last night’s meeting was an opportunity to crowdsource ideas for the enablement of the digital future on London’s Greenwich Peninsula. When people think of Greenwich, they tend to think of a Royal Borough, Victorian architecture, Greenwich Park and the prime meridian, but Greenwich Peninsula is an area of reclaimed industrial land on which stand the O2 Arena and the Ravensbourne Hub Digital Innovation Centre, with plans for 3.5 million square feet of business space, 10,000 new homes, schools, healthcare and other community facilities, the Thames cable-car link to the Siemens Pavilion at ExCeL, a cruise liner terminal, beaches – and the opportunity to make Greenwich an exemplar for future living.
I won’t talk about the ideas that we brainstormed – that’s between the taverners and the Digital Greenwich Advisory Board – but the whole process seemed, to me, to be a model of how crowdsourcing can work to drive innovation.
Around 50 taverners were there, broken out into four groups in an open-space brainstorm to look at education, governance, business/commerce, and community. People could move between conversations and the ideas were written/drawn on huge boards. At the end of the evening, each group presented their topic and, inevitably, that generated even more discussion.
Hopefully, what we came up with provides inspiration for Greenwich Council. At the very least it was an opportunity to experience a real-world crowdsourcing event – one which seemed to me to be very successful indeed.
Earlier this week I saw a tweet from the MIX 2011 conference that highlighted how Microsoft’s Office 365 software as a service platform runs entirely on their Hyper-V hypervisor.
There are those (generally those who have a big investment in VMware technologies) who say Microsoft’s hypervisor lacks the features to make it suitable for use in the enterprise. I don’t know how much bigger you have to get than Office 365, but the choice of hypervisor is becoming less and less relevant as we move away from infrastructure and concentrate more on the platform.
Even so, now that Hyper-V has reached the magical version 3 milestone (at which people generally start to accept Microsoft products) I thought it was worth a post to look back at where Hyper-V has come from, and where it’s at now:
- Windows Server 2008 shipped in February 2008 with a beta version of Hyper-V and the final hypervisor was released as a Windows update in July 2008. (System Center Virtual Machine Manager followed in October 2008).
- Windows Server 2008 R2 shipped in July 2009, including a new version of Hyper-V with live migration capabilities (SCVMM 2008 R2 followed in September 2009).
- Windows Server 2008 R2 Service Pack 1 includes the third Hyper-V release, bringing new dynamic memory and RemoteFX capabilities. (At around the same time, SCVMM 2008 R2 was updated to support Hyper-V v3 and the SCVMM 2012 beta was announced).
- Windows 7 and Windows Server 2008 R2 have been awarded Common Criteria Certification at EAL4+.
- There are two ways to get Hyper-V – the free Hyper-V Server product (which shipped with SP1 included earlier this week) and the Hyper-V role in Windows Server 2008 R2. Since Windows Server 2008 R2, both versions have been identical in capabilities – the difference is that Hyper-V Server is based on the server core role in Windows (i.e. limited GUI) and comes with no virtualisation usage rights.
- In terms of supported guest operating systems, “support” means that Microsoft will fix issues in the hypervisor or the guest operating system. Microsoft supports:
- Windows 2003 SP2/XP SP3 or later, but some custom support agreements go back to 2000 SP4.
- SLES 10 SP3–11, with an appropriate support agreement.
- RHEL 5.2–5.5, with an appropriate support agreement.
- Virtual machine additions are provided for these operating systems. Other operating systems will run, but in emulation mode.
- Microsoft is also contributing code into the Linux driver repository.
Looking at some of the technical features:
- Dynamic memory requires Windows 2003 SP2 or later (and is not yet supported for Linux guests). It’s important to understand the difference between oversubscription and overcommitment.
- Performance differences between hypervisors are now so small that performance is no longer a meaningful differentiator.
- Hyper-V uses Windows clustering for high availability – the same technology as is used for live migration.
- In terms of storage scalability, it’s up to the customer to choose how to slice and dice storage, with partner support for multipathing, hardware snapshotting, etc. Hyper-V users can have one LUN for each VM, or one LUN for 1,000 VMs (of course, no-one would actually do the latter).
- Networking also uses the partner ecosystem – for example, HP creates software to allow NIC teaming on its servers, and Hyper-V can use a virtual switch to point to this.
- In terms of data protection, the volume shadow copy service on the host is used and there are a number of choices to make around agent placement. A single agent can be deployed to the host, with all guests protected (allowing whole machine recovery), or guests can have their own agents to allow backups at the application level (for Exchange, SQL Server, etc.).
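To make the oversubscription/overcommitment distinction concrete, here’s a toy Python model – emphatically *not* Hyper-V’s actual balancing algorithm, just an illustration. The VMs’ configured maximums are allowed to exceed physical RAM (oversubscription), but the model never hands out more physical RAM than the host actually has (no overcommitment):

```python
def allocate_dynamic_memory(host_ram_mb, vms):
    """Toy model of dynamic memory: every VM gets its startup minimum,
    then spare physical RAM is shared out in proportion to demand,
    capped at each VM's configured maximum. An illustration only,
    not Hyper-V's real algorithm."""
    alloc = {name: spec["min"] for name, spec in vms.items()}
    spare = host_ram_mb - sum(alloc.values())
    assert spare >= 0, "host cannot satisfy even the startup minimums"
    total_demand = sum(max(spec["demand"] - spec["min"], 0) for spec in vms.values())
    for name, spec in vms.items():
        want = max(spec["demand"] - spec["min"], 0)
        if total_demand:
            extra = int(spare * want / total_demand)
            alloc[name] = min(spec["min"] + extra, spec["max"])
    return alloc

# Maximums total 12GB (oversubscribed) but the host only has 8GB;
# physical RAM is never overcommitted.
vms = {
    "web": {"min": 512, "max": 4096, "demand": 2048},
    "db":  {"min": 1024, "max": 8192, "demand": 6144},
}
print(allocate_dynamic_memory(8192, vms))  # {'web': 2048, 'db': 6144}
```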
I’m sure that competing products have a longer list of features but, in terms of capability, Hyper-V is “good enough” for most scenarios I can think of. I’d be interested to hear what barriers to enterprise adoption people see for Hyper-V.
It’s often hard to justify a Windows or Office upgrade, but I think I might just have found a way to identify some of the advantages – try going back to an older version.
A few weeks ago, my company-supplied notebook was rebuilt onto the corporate standard build. I realised that it’s been about 4 or 5 years since I was last in that situation, as I’ve always been in a position to be trialling new versions of Windows and Office but these days my role is largely non-technical (so I have no real justification to be different to anyone else) and my team actually sits within the IT department (so I guess I should be setting an example!). I do have local administration rights on the machine, and I did install some software that I need for my role, but which is officially unsupported (examples would be TweetDeck, Nokia PC Suite and the drivers for my company-supplied HP OfficeJet printer). I also tweaked some power settings and turned off the corporate screen saver (thereby keeping my green credentials intact by balancing the lack of automatic shutdown with the lack of increased processor/fan activity to run a screensaver) but I’ve been trying to stick to the company build where possible/practical. That means I’m back to Office 2007 (with Visio 2003) although I am at least on a Windows 7 (x64) build in order that I can use all 4GB of RAM in my notebook.
I have to say that it’s been driving me insane. I had a similar experience when I went back to XP for a couple of days after a hard drive failure a couple of years ago but I’ve really missed some of the newer functionality – particularly in Outlook 2010:
- I’ve lost my Quick Steps (I use them for marking an e-mail as read and moving to my archive folder in one action, or for sending a team e-mail).
- Conversation view is different (I can’t tell you how, but I’m missing some new e-mails as a result).
- When I receive a meeting request, I don’t see my other appointments for that day in the request.
- [Update 15 April 2011: Access to multiple Exchange accounts from one Outlook instance.]
These are just examples off the top of my head – I should have noted each feature I missed in recent weeks, but I didn’t (maybe I’ll come back and edit the post later). For a knowledge worker like me, though, they are significant: a few extra minutes in Outlook to triage email 7-8 times a day represents half an hour of lost productivity – every day.
None of this is likely to convince a company to invest in an upgrade, even if it already has the software (software costs are generally quite insignificant in relation to resource costs), but it’s all part of the business case – employee productivity is never easy to measure, but the little things do add up.
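For what it’s worth, that back-of-the-envelope sum scales up quickly. A quick sketch – the working days per year and hourly rate are illustrative assumptions, not real figures:

```python
def annual_cost_of_friction(minutes_per_triage, triages_per_day,
                            working_days=220, hourly_rate=50.0):
    """Annual cost of a small, repeated productivity hit.
    working_days and hourly_rate are illustrative assumptions."""
    minutes_per_day = minutes_per_triage * triages_per_day
    hours_per_year = minutes_per_day * working_days / 60
    return minutes_per_day, hours_per_year, hours_per_year * hourly_rate

# "A few minutes extra... 7-8 times a day" -> roughly half an hour a day
minutes_per_day, hours_per_year, cost = annual_cost_of_friction(4, 8)
print(minutes_per_day)   # 32 minutes a day
print(hours_per_year)    # ~117 hours a year
print(round(cost, 2))    # ~5866.67 at the assumed rate
```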
I’m now running Internet Explorer 9 (I need to test certain websites on the latest browser version), although I’m ready to revert to 8 if there are issues with any of the business applications I need to use, and my PC is fully patched including the latest service pack. I am resisting the temptation to install my own (licensed) copy of Office 2010 though… at least for now.
Those who follow me on Twitter (@markwilsonit) may have seen a few comments about the Windows Vista laptop that I’m currently fixing for a family member, who decided not to “bother” me when they bought a new computer, yet still relies on me for help when it doesn’t work as intended…
The laptop was woefully underpowered, with just 1GB of RAM (but only 768MB available) and an Intel Celeron 540 CPU running at 1.87GHz. Patching the operating system seemed to improve things slightly (it was running Windows Vista RTM, with no updates successfully applied for over 18 months) but what it really needed was more RAM. The Crucial System Scanner told me it had a single memory module, with room for one more, so I invested the princely sum of £13.67 in making the system usable.
Not surprisingly, the addition of the extra memory to the machine changed the Windows Experience Index values for memory operations per second but it also significantly increased the graphics score:
| Component | What is rated? | Fujitsu-Siemens Esprimo V5535 (Celeron 540, 1GB RAM) | Fujitsu-Siemens Esprimo V5535 (Celeron 540, 2GB RAM) |
|---|---|---|---|
| Processor | Calculations per second | 4.1 | 4.1 |
| Memory (RAM) | Memory operations per second | 3.9 | 4.4 |
| Graphics | Desktop performance for Windows Aero | 3.5 | 4.9 |
| Gaming graphics | 3D business and gaming graphics performance | 3.2 | 3.2 |
| Primary hard disk | Disk data transfer rate | 5.1 | 5.1 |
Unfortunately, Windows Vista Home Basic doesn’t include Aero (there are some workarounds on the ‘net but they didn’t seem to work for me), so I left the system running as normal.
What I found bizarre though was that even the crippled system with 1GB of RAM and only a few MB free (which was almost unusable, it was so slow) had similar Windows Experience Index scores to my everyday laptop – a much more powerful machine with an Intel Core 2 Duo P8400 CPU at 2.26GHz, 4GB RAM and Windows 7 x64:
| Component | What is rated? | Fujitsu-Siemens Lifebook S7220 (Core 2 Duo P8400, 4GB RAM) |
|---|---|---|
| Processor | Calculations per second | 3.1 |
| Memory (RAM) | Memory operations per second | 4.3 |
| Graphics | Desktop performance for Windows Aero | 4.1 |
| Gaming graphics | 3D business and gaming graphics performance | 3.4 |
| Primary hard disk | Disk data transfer rate | 4.5 |
Perhaps Microsoft updated the Windows Experience Index algorithm between Vista and 7, or between 32- and 64-bit systems (I thought they just increased the maximum score from 5.9 to 7.9), but it seems to make a mockery of the “experience index” when a basic consumer system scores more highly than a mid-range business machine.
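For anyone unfamiliar with how the index works, the base score is simply the lowest of the subscores, so a single weak component drags the whole rating down. Using the figures from the tables above:

```python
def wei_base_score(subscores):
    """The Windows Experience Index base score is simply the lowest subscore."""
    return min(subscores)

# Subscores from the tables above
vista_laptop_2gb = {"processor": 4.1, "memory": 4.4, "graphics": 4.9,
                    "gaming": 3.2, "disk": 5.1}
win7_laptop = {"processor": 3.1, "memory": 4.3, "graphics": 4.1,
               "gaming": 3.4, "disk": 4.5}

print(wei_base_score(vista_laptop_2gb.values()))  # 3.2
print(wei_base_score(win7_laptop.values()))       # 3.1 - lower than the budget machine!
```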
Thanks to everyone who attended the rescheduled Live Meeting last month on Connecting on-premise applications with the Windows Azure platform (with Allan Naim and Phil Winstanley).
Unfortunately the gremlins didn’t subside: after rescheduling the event I was unable to get a microphone working, which is a bit of an issue for a facilitator (thanks to Phil for stepping up to the mark) – and the Live Meeting recording hasn’t worked completely either.
[A version of this post also appears on the Windows Server User Group blog]
[Updated 18 April 2011: Live Meeting recordings are now available]