Maintaining a common user profile across different Windows versions

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I wish I could take the credit for this, but I can’t: last week one of my colleagues (Brad Mallard) showed me a trick he has for creating a single user profile for multiple Microsoft operating systems. Michael Pietroforte wrote about the different user profile formats for Windows XP and Vista back in 2007 but Brad’s tip takes this a step further…

Using Group Policy Preferences, Brad suggests creating a system variable to record the operating system version for a given client computer (e.g. %osversion%) and assigning it to the computer account. Then, in Active Directory Users and Computers (ADUC/dsa.msc), set the user’s profile path to \\servername\sharename\%username%.%osversion%. ADUC will resolve the %username% portion but not the %osversion% part, so what remains will be something like \\bigfileserver\userprofiles\mark.wilson.%osversion%.
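To illustrate, here’s roughly how the pieces fit together once the Group Policy Preferences item is in place (the variable name %osversion%, its value and the server/share names are all examples, not prescriptions):

rem On a client where GPP has created the system variable (e.g. osversion=Win7):
echo %osversion%

rem Profile path as entered in ADUC:
rem   \\bigfileserver\userprofiles\%username%.%osversion%
rem ...which the client expands at logon to something like:
rem   \\bigfileserver\userprofiles\mark.wilson.Win7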

Using this method, one user can hotdesk between several locations with different desktop operating systems (e.g. Windows XP and Windows 7). Each time they log on to a machine with a different operating system, a new profile will be created in a subfolder of their user name. Technically, that’s two profiles – but at least they are in one location for management purposes. Combine this with folder redirection for documents, IE favorites, etc. and it should be possible to present a consistent view between two operating system releases.

Mark Russinovich explains “the machine SID duplication myth”


One of my colleagues just flagged a blog post from Microsoft (ex-SysInternals) Technical Fellow Mark Russinovich that I’d been meaning to read when I had a little more time, in which he discusses “the machine SID duplication myth”. It seems that all of the effort we put into de-duplicating SIDs on Windows NT-based systems (NT, 2000, XP, 2003, Vista, 2008, 7 and 2008 R2) over the years was not really required…

To be honest, I don’t think anyone ever said it was required – just that having multiple machines with the same security identifier sounded like a problem waiting to happen and that generating unique SIDs was best practice.

The full post is worth a read but, in summary, the new best practice is:

“Microsoft’s official policy on SID duplication will also now change and look for Sysprep to be updated in the future to skip SID generation as an option. Note that Sysprep resets other machine-specific state that, if duplicated, can cause problems for certain applications like Windows Server Update Services (WSUS), so Microsoft’s support policy will still require cloned systems to be made unique with Sysprep.”

As you were then…

Windows native boot from VHD roundup


This is the first of several planned posts based on knowledge gained at Tech·Ed last week – but this one is necessarily brief. Mark Minasi, who presented the session that this content is based on, owns the copyright on the materials he presented (although Microsoft still distributed them to delegates). Consequently, I can’t write his session up as fully as I would like; however, this post captures some of the key points (along with some narrative of my own), as I see nothing here that’s not already in the public domain (and some of it has already been written about on this blog). The value in Mark’s presentation was that it pulled together various items of information into one place and explained them in a way that was simple to follow – consequently, I’m not repeating the full details, just a high-level overview, with some extra links where I feel they add value. (Mark seems like a decent fellow – he’s only trying to protect his income and I suspect the real problem would be if I presented his materials as my own – I’m sure he would understand the fine line I’m attempting to walk here.)

  • The session was titled “How Windows Storage is Changing: Everything is going VHD (CLI302)” and that’s pretty spot on – the virtual hard disk (.VHD) file format allows an entire disk partition (but not a whole drive with multiple partitions) to be packaged in a single file complete with folder structure and NTFS permissions: Microsoft’s Storage Server uses .VHD files for iSCSI targets; Windows Backup has been able to perform complete PC backups to .VHD files since Vista; and with Windows 7 we have the ability to natively boot Windows from a VHD file. Just to be clear – this is not client/server virtualisation (as with a hypervisor) – this is storage virtualisation (presenting the VHD container as a logical volume, stored on a physical disk).
  • To understand native .VHD booting, it’s useful to understand recent changes in the boot process: boot.ini is no more – instead we have a Boot Configuration Database (BCD) and a system reserved partition (incidentally, that’s the same one that is used for BitLocker, and is automatically created in Windows 7, with no drive letter assigned).
  • Running Windows Backup from the command line with wbadmin.exe requires the use of the -allcritical switch to ensure that the System Reserved partition is backed up.
  • As Mike Kolitz described back in May, access to .VHD file contents from Windows 7 and Server 2008 R2 is provided by a completely new mini-port driver in the storage stack for VHD files. This VHD driver enables requests to files in the VHD to be sent to the host NTFS file system on the physical partition where the VHD file is located. VHD operations can also be performed on a remote share.
  • The steps for creating a .VHD file, attaching (mounting) it, assigning a drive letter and formatting the volume can be found in my previous post on running Windows from a USB flash drive (as well as elsewhere on the ‘net).
  • The diskpart.exe command can be used to view the details of the VHD once mounted (detail disk) and it will be identified as a Msft Virtual Disk SCSI Disk Device.
  • The System Reserved boot partition may be populated using the bcdboot.exe command. After this partition has been created, the remainder of the disk can be partitioned and formatted, then a pre-configured .VHD can be copied to the second (non-system) partition. After editing the BCD and rebooting, the physical drive will be something like D: or E: (depending on the presence of optical drives) and the .VHD will be C:.
  • There are various methods for creating a pre-configured .VHD, including booting a reference PC from Windows PE and using imagex.exe (from the Windows Automated Installation Kit) to capture the disk contents to a .WIM file, then mounting the target .VHD and deploying the .WIM image to it. Alternatively, there is a SysInternals tool called Disk2VHD.
  • The changes to the BCD are also documented in a previous post on this site but Mark also highlighted the existence of the [locate] parameter instead of specifying a drive manually (James O’Neill uses it in his post on booting from VHD and the joy of BCDEdit).
  • There are GUI tools for editing the BCD, but bcdedit.exe is worth getting to know:

    “The GUI BCDEdit commands are rather like having a 3 metre ladder for a 5 metre wall” … “Step into the light, come to the command line, in the command line there is freedom my friends.”

    [Mark Minasi at TechEd Europe 2009]

  • Booting from VHD is a great feature but it does have its limitations: for instance I can’t use it on my everyday notebook PC because the current release doesn’t support hibernation or BitLocker.
  • To finish up his presentation, Mark demonstrated an unsupported method for installing Windows directly to .VHD: run Windows setup and press Shift+F10 to break out into a command prompt; wipe and partition the hard drive, then create and attach a new .VHD; ignore Windows setup’s protests that it can’t be installed to the .VHD – click the Next button anyway and it should work (although the loophole may be closed in a future release).
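As a rough sketch of the disk preparation and BCD editing steps described above (the paths, sizes and drive letters here are assumptions – adapt to suit, and note that bcdedit /copy returns a new GUID to use in place of {guid} below):

diskpart
create vdisk file=d:\vhds\win7.vhd maximum=40960 type=expandable
select vdisk file=d:\vhds\win7.vhd
attach vdisk
create partition primary
format fs=ntfs quick
assign letter=v
exit

rem Copy an existing boot entry and point it at the VHD; [locate] tells the
rem boot loader to find the volume containing the file for itself:
bcdedit /copy {current} /d "Windows 7 (VHD)"
bcdedit /set {guid} device vhd=[locate]\vhds\win7.vhd
bcdedit /set {guid} osdevice vhd=[locate]\vhds\win7.vhd
bcdedit /set {guid} detecthal on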

Finally, if the contents of this post are interesting, this blog recently featured two guest posts, from my friend and colleague Garry Martin, which build on the concepts described above: in the first post, Garry described the process for booting Windows 7 from VHD on a Windows XP system; the second went deep into an unsupported, but nevertheless useful, method for booting Windows 7 or Server 2008 R2 from a VHD on removable media… perhaps a USB flash drive? There are also some useful links in Mike Ormond’s post on native VHD booting and Jon Galloway has a whole bunch of tips, even if he is still searching for his virtual machine nirvana.

Tech·Ed Europe 2009


Those who follow me on Twitter may have noticed that I’ve spent the last week at Microsoft’s European technical education conference – Tech·Ed Europe – which was held in Berlin this year.

It was a great week to be in Berlin as it coincided with Germany’s celebrations for the 20th anniversary of the fall of the Berlin Wall (der Mauerfall) and I was lucky enough to be at the Brandenburg Gate, standing in the rain, a couple of hundred metres away from political heavyweights past and present, watching a line of 1000 “dominoes” tumbling to signify the fall of the wall. I don’t want to give the impression that Tech·Ed is just a jolly though – actually it’s far from it: I spent half my weekend travelling to get there, before attending sessions from 9am to around 7pm most days and then networking in the evenings. This was my first Tech·Ed since 2001, for various family and business reasons, and it was both tremendously rewarding and very hard work.

Firstly, I should try and give some indication of the size of the event: more than 7200 people spread over several halls in a convention centre; more than 110 partners in the exhibition hall; hundreds of Microsoft staff and volunteers in the Technical Learning Center; around 600 sessions in something like 20 session rooms – of which any one delegate can only fit around 21 into their personal agenda; a keynote with seating for all 7200 people; catering for everyone too (including the 460 staff); and a lot of walking to/from sessions and around the centre.

So, what sort of content is covered in the sessions? This year Tech·Ed had a mixture of IT Pro and Developer content but over the years it’s been held as separate developer and IT Pro events on consecutive weeks – and, if I go back far enough, there used to be a separate IT Pro conference (the Microsoft Exchange Conference, later renamed IT Forum). This year there didn’t seem to be as much for coders at Tech·Ed, but they have a Professional Developer Conference (PDC) in Los Angeles next week; web developers have their own conference too (MIX); and, if IT management is more your forte, the Microsoft Management Summit (MMS) is intended for you. Microsoft’s description of Tech·Ed is as follows:

Tech·Ed Europe
“Provides developers and IT professionals the most comprehensive technical education across Microsoft’s current and soon-to-release suite of products, solutions and services. This event provides hands-on learning, deep product exploration and opportunities to connect with industry and Microsoft experts one-to-one. If you are developing, deploying, managing, securing and mobilising Microsoft solutions, Tech·Ed Europe is the conference that will help you solve today’s real-world IT challenges and prepare for tomorrow’s innovations.”

This week I attended a wide variety of sessions, covering topics as diverse as using hacker techniques to aid IT administration, troubleshooting Windows using SysInternals tools, managing and monitoring UNIX and Linux systems using System Center Operations Manager, and looking at why the various architectures for desktop delivery don’t matter so much as the way in which they are managed. Meanwhile, colleagues focused on a variety of messaging and collaboration topics, or on directory services. I’m pleased to say that I learned a lot this week – so much, indeed, that by Friday lunchtime I was struggling to take any more in. Thankfully, one of the benefits of attending the event is a year’s subscription to TechNet Online, giving me access to recorded versions of the sessions.

When I first attended Tech·Ed, back in 1998, my career was only just getting going. These days, I have 15 years’ industry experience and I now know many of the event organisers, Microsoft staff, and speakers – and one of the reasons is the tremendous networking opportunity that events like this afford. I didn’t spend much time around the trade stands but I did make sure I introduced myself to key speakers whose subject material crosses my own expertise. I also met up with a whole load of people from the community and was able to associate many faces with names – people like Sander Berkouwer and Tamás Lepenye (who I knew from our online interactions but had not previously had the chance to meet in person) as well as Steven Bink (who I first met a couple of years ago, but it’s always good to see him around). But, by far the most fortuitous interaction for me was meeting Microsoft Technical Fellow Mark Russinovich on Friday morning. I was walking into the conference centre and realised that Active Directory expert John Craddock (whom I had shared a taxi with on the way from the airport earlier in the week) was next to me – and then I saw he was with Mark, who is probably the best known Windows operating system internals expert (with the possible exception of Dave Cutler). Mark won’t have a clue who I am (apart from the hopeless groupie who asked him to pose for a picture later in the day) but, nevertheless, I’m glad I took the opportunity to introduce myself.
[Photo: Mark and Mark Russinovich – yes, he really is that tall!]

Then, there was the Springboard Community Partei – a real opportunity to meet with international speakers and authors like Mark Minasi, as well as key Microsoft staff like Stephen Rose (Microsoft Springboard), Ward Ralston (Windows Server 2008 R2 Group Product Manager) and Mark Russinovich (although I didn’t actually see him at the party, this video shows he was there) – as well as MVPs like Sander Berkouwer, Aidan Finn and Thomas Lee. These are the events that lead to lasting relationships – and those relationships provide real value in the IT world. Name dropping in a blog post is one thing – but the IT world in which we live is a small place – Aidan is writing a book with Mark Minasi and you never know what opportunities may arise in future.

So, back to the point – Tech·Ed is one of my favourite IT events and I would love to attend it more frequently. At the stage my career has reached I no longer need week-long training courses on technical products, but 75 minute sessions to give an overview of a specific topic area are perfect – and, at around £2000 for a week of technical education and networking opportunity, Tech·Ed is something I’d like to persuade my employer to invest in more frequently…

…I’ll have to wait and see on that, but Tech·Ed 2010 will be held in Berlin again next November – fingers crossed I’ll be one of the attendees.

A quick look at Microsoft Surface


A couple of weeks back I managed to get a close look at a Microsoft Surface table. Although Surface has been around for a while now, it was the first time I’d been “hands on” with one and, considering it’s really a bunch of cameras, and a PC running Windows Vista in a cabinet a bit like a 1980s Space Invaders game, it was actually pretty cool.

One thing I hadn’t appreciated previously is that Surface uses a totally different technology to a multitouch monitor: rather than relying on capacitance, the Surface table is sensitive to anything that reflects or absorbs infrared light. It uses an infrared emitter and a series of cameras to detect light reflected by something on the surface, then processes the image and detects shapes. There’s also an API so that software can decide what to do with the resulting image and a DLP projector to project the user interface on the glass (with an infrared filter so as not to confuse the input system). At the moment, the Surface display is only 1024×768 pixels but that didn’t seem to be restrictive in any way – even with such a physically large display.

In some ways Surface behaves like a touch device and, because it has multiple cameras, it can even perform stereoscopic three-dimensional gestures; however, because it lacks direct touch capabilities, there is no concept of a hover/mouse-over. Indeed, the Surface team’s API was taken and extended in the Microsoft .NET Framework version 4 to work with Windows Touch and, at some point in the future, the Surface and Windows Touch APIs will converge.

The Surface technology is unable to accommodate pressure sensitivity directly but the underlying processor is just a PC with USB ports, so peripherals could be used to extend the available applications (e.g. a fingerprint reader, card reader, etc.).

Surface can also recognise the type of object on the glass (e.g. finger, blob, byte tag) and it returns an identifier along with X and Y co-ordinates and orientation. When I placed my hand on the device, it was recognised as five fingers and a blob. Similarly, objects can be given a tag (with a value), allowing for object interaction with the table. Surface is also Bluetooth and Wi-Fi enabled so it’s possible to place a device on the surface and communicate with it, for example copying photos from the table to a phone, or exchanging assets between two phones via the software running on the table. Finally, because Surface understands the concepts of flick and inertia, it’s possible to write applications that make use of this (such as the demonstration application that allows a globe to be spun on the Surface display, creating a rippled water effect that feels like real interaction, simulating gravity, adding sprung connections between items on the display, or making them appear to be magnetic).

One technology that takes this interaction even further (sometimes mistakenly referred to as Surface v2) is Microsoft’s SecondLight, which uses another set of technologies to differentiate between the polarisation properties of light so images may be layered in three dimensions. That has the potential to extend the possibilities of a Surface-like device even further and offer very rich interaction between devices on the Surface.

At present, Surface is only available for commercial use, with a development SKU offering a 5-seat license for the SDK and the commercial unit priced at £8,500. I’m told that, if a developer can write Windows Presentation Foundation (WPF) applications, they can write Surface applications and, because Surface runs WPF or XNA, just as an Xbox or a PC does, it has the potential for games development too.

With touch now a part of the operating system in Windows 7, we should begin to see increasing use of touch technologies, although there is a key difference between Surface and Windows Touch: the vertically-mounted or table form factor affects the user interface and device interaction – for example, Surface also detects the direction from which it is being touched and shows the user interface in the correct orientation. In addition, Surface needs to be able to cope with interaction from multiple users with multiple focus points (imagine having multiple mice on a traditional PC!).

My hour with Surface was inspiring. The key takeaways were that this is a multi-touch, multi-user, multi-directional device with advanced object interaction capabilities. Where it has been used in a commercial context (e.g. AT&T stores) it has mostly been a novelty; however there can be business benefits too. In short, before deploying Surface, it’s important to look further than just the hardware costs and the software development costs, considering broader benefits such as brand awareness, increased footfall, etc. Furthermore, because Surface runs Windows, some of the existing assets from another application (e.g. a kiosk) should be fairly simple to port to a new user interface.

I get the feeling that touch is really starting to go somewhere and is about to break out of its niche, finding mainstream computing uses and opening up new possibilities for device interaction. Surface was a research project that caught Bill Gates’ attention; however there are other touch technologies that will build on this and take it forward. With Windows Touch built into the operating system and exciting new developments such as SecondLight, this could be an interesting space to watch over the next couple of years.

Using a Windows System Image backup to transfer a configuration between computers


One of my colleagues left our organisation a couple of weeks ago and his notebook PC was up for grabs (kind of like vultures looking for prey, my manager and I were trying to grab the best bits of his relinquished IT assets…). To be honest, the PC is only marginally better than the one I had already but it did have a slightly faster processor (Intel Core 2 Duo Mobile P8400 vs. T7500), a larger hard disk, and was in better physical condition (I’ll try not to drop this one!). I did need to transfer my configuration to the “new” machine quickly though (i.e. between the start and the end of our team meeting today!) so that my “old” machine could be reallocated to someone in need of a more modern PC.

I could have messed around with user state migration onto a fresh build; however I’m flying out to TechEd Europe at the weekend and I wanted to be sure that I had all my applications working so I tried a different approach. The two computers are similar, but not identical (both Fujitsu-Siemens Lifebooks – one is an S7210 and the other is an S7220) so I decided to try creating a Windows System Image and restoring it onto a different machine, then letting Plug and Play sort out the hardware. It’s a bit messy (with new network adapters etc.) but the theory was sound.
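For anyone wanting to script the image creation, the same system image backup can also be driven from an elevated command prompt with wbadmin.exe (the target drive letter here is an assumption):

wbadmin start backup -backupTarget:E: -allCritical -quiet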

Not only was the theory sound, but it worked. After booting the “new” machine from the Windows 7 repair disc that I was prompted to create at the end of the backup, I restored my system, complete with all applications and data. Plug and Play did indeed identify all of my hardware, and Microsoft Update supplied a missing display driver (that part would have happened automatically if I had been online at the time). Windows even managed to reactivate itself as the product key was still valid, so my system is reporting itself as genuine (note that Windows licences normally remain with individual computers; however, in this case both machines were licensed for Windows 7 using a volume license product key).

It’s important to note that this effectively cloned the machine (yes, I could have used any number of disk imaging products for this, but I was using the out-of-the-box tools) and so I was careful not to have both machines on the network at the same time. Indeed the last step (before passing the “old” machine on to my manager) was to securely erase my data partition, which I did using the cipher command, before booting into the Windows Recovery Environment one more time to run up diskpart and remove all of the disk partitions.
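For reference, the clean-up commands were along these lines (drive and disk numbers here are examples):

rem Overwrite free space on the data partition so deleted files are unrecoverable:
cipher /w:d:\

rem Then, from the Windows Recovery Environment command prompt:
diskpart
select disk 0
clean
exit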

The only remaining hurdle is moving the (so far empty) BitLocker Drive Encryption Partition from its current location in the middle of my hard disk (which was the end of the smaller disk in my old machine) but that should be possible as I haven’t actually encrypted the drive on this PC.

Not bad for a few hours’ work, especially as there was no downtime involved (I was able to use the “old” machine to deliver my presentation whilst the “new” one was being prepared).

Useful Links: October 2009


A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:

Protecting my netbook with BitLocker


One of the reasons I run Windows 7 Ultimate edition on my netbook is to take advantage of features like BitLocker. For those who are not aware of this technology, BitLocker has offered whole drive encryption for fixed hard disks since Windows Vista but Windows 7 also includes encryption capabilities for removable devices (BitLocker To Go).

Even though I don’t keep much data on my netbook, it’s exactly the sort of device that’s likely to be lost or stolen and it seems like a perfect candidate for data encryption – although my main concern was that I might encrypt the device and then lock myself out (and I’m not the only one who’s had those concerns). Luckily there are options for key recovery – ranging from storing a copy of the key in a file or in Active Directory (not applicable for me as my netbook is not domain-joined) to the most basic option of all: printing the key on a piece of paper and keeping it in a safe place (i.e. not the carry case for my computer!).

So, armed with the knowledge that I had backed up all my critical data, just in case something went wrong, last weekend I “BitLockered” my netbook and I’m pleased to say it was really straightforward (especially as Windows 7 creates the necessary drive partition at install-time). It would have been even easier if my computer had a trusted platform module (TPM) chip but, even so, Windows can be configured to allow encryption without a TPM – I just need to supply a startup key when I turn the computer on. In this case I used a small-capacity USB thumb drive to store the key, removing it from the computer once the drive has been unlocked. In effect, I can only start (or resume) the computer with that USB “key” – or fall back on the recovery key to unlock the drive.

There are two common ways to allow Windows to use BitLocker without a TPM: one involves editing the Local Security Policy and the other uses a few registry tweaks, which is the one I chose:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\FVE]
"UseAdvancedStartup"=dword:00000001
"EnableBDEWithNoTPM"=dword:00000001
"UseTPM"=dword:00000002
"UseTPMPIN"=dword:00000002
"UseTPMKey"=dword:00000002
"UseTPMKeyPIN"=dword:00000002

To revert to the default settings, use a .reg file with the following contents:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\FVE]
"UseAdvancedStartup"=-
"EnableBDEWithNoTPM"=-
"UseTPM"=-
"UseTPMPIN"=-
"UseTPMKey"=-
"UseTPMKeyPIN"=-

Then, using Windows Explorer, right click the drive you want to encrypt and select the option to turn on BitLocker, and follow the wizard. Make sure you store a copy of the recovery key, as this will be required to recover the data on a BitLocker protected drive.
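Alternatively, the same operation can be scripted with manage-bde.exe from an elevated command prompt (a sketch only – the drive letters are assumptions; F: is a USB drive for the startup key):

rem Save a startup key to the USB drive, generate a numerical recovery
rem password, and begin encrypting the system drive:
manage-bde -on C: -StartupKey F: -RecoveryPassword

rem Check encryption progress at any time with:
manage-bde -status C: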

It took a while for my drive to encrypt and, despite almost every reference for this that I’ve seen saying that a dialogue box will be presented showing encryption progress, I didn’t see this – all I saw was that the drive was almost full and lots of hard drive activity, then I got my free space back and the icon for the drive had a padlock and a key on it. Now, if I right click the drive there are options to manage BitLocker, including duplicating the startup key and saving/printing a new copy of the recovery key.

All in all, it was pretty painless and I haven’t noticed any performance degradation but if someone does take a fancy to my netbook, they won’t be able to access the data on it.

For more information, see the Windows BitLocker Drive Encryption Step-by-Step Guide on the Microsoft TechNet website and the BitLocker drive encryption team blog (although that hasn’t been updated in a while). Michael Pietroforte has also compared BitLocker with TrueCrypt, concluding that TrueCrypt lets you choose your own recovery passphrase, whereas BitLocker can be managed with Group Policy and its keys can be stored in Active Directory Domain Services. There’s more information on storing BitLocker keys in Active Directory on the TechNet website (domain controllers must be running Windows Server 2003 SP1 or higher and schema extensions are required).

Apple’s new multitouch mouse misses the point


Last week Apple updated its product line, ahead of Microsoft’s Windows 7 launch, and one of the new announcements was a replacement for the “Mighty Mouse”, which was quietly killed off a few weeks back after years of doing anything but living up to its name (as Adam Pash notes in Lifehacker’s coverage of Apple’s new lineup).

I first heard about Apple’s new “Magic Mouse” on Twitter:

“RT @real_microsoft: RT @Mirweis Once again #Apple seems to have nosed ahead of #Microsoft with the multitouch mouse: ”

[@michaelsfo]

and Apple’s latest mouse is a multitouch device that uses gestures to control the screen. As should be expected, it looks great but, as TechRadar reported, it doesn’t support a key gesture – the pinch zoom that we first saw on the iPhone and that Apple has made synonymous with multitouch through its advertising.

Furthermore, there’s no touch screen on any of Apple’s refreshed line-up. In fact, the iMac changes are mostly evolutionary (and there’s a new unibody entry-level MacBook). Meanwhile, with the launch of Windows 7, Microsoft now has advanced touch capability available within the operating system. A multitouch mouse is cool – seriously cool – but the real advantages of touch come with touch screens and other displays that take concepts like the Microsoft Surface table into mainstream computing uses.

Some people might not think touch is really a big deal, or that it’s just a bit gimmicky right now – but step back and take a look at what’s happened with smartphones: in 2007, Apple launched the iPhone and all we’ve seen since then is an endless stream of competing devices – each with multitouch capabilities. Now that’s crossing over into the PC marketplace and, unlike tablet PCs, or early Windows Mobile devices, there’s no need for a stylus – and that’s why I believe touch will become much more significant than it has been previously. Only yesterday, I watched my young sons (both of whom are under 5) using one of Ikea’s play kiosks and they instantly knew what to do to colour in a picture on screen. As soon as prices drop, I’ll be buying a multitouch monitor for them to use with a PC at home as I expect touch to replace the mouse as the interface that their generation uses to access computing devices.

Far from nosing ahead of Microsoft, I believe Apple has missed the point with its new mouse (please excuse the, entirely accidental, pun). Just as in the years when they insisted that mice only needed a single button (indeed, one of the problems that made the Mighty Mouse so unreliable was that it offered all the functionality of a multi-button mouse with several contact switches under a single button shell in order to maintain the appearance of a single-button mouse), now they are implementing touch on trackpads and mice, rather than on screen. Sure, fingerprints on glass don’t look good but that hasn’t held back the iPhone – and nor would it the iMac or MacBook if they implemented multitouch on screen. For now, at least, Apple is holding off on touchscreen displays, whilst mainstream PC manufacturers such as Dell are embracing the potential for multitouch applications that the latest version of Windows offers. As for the criticism that multitouch monitors are spendy and Apple’s mouse is not, the monitors will come down in price pretty quickly and, based on my experience with Apple’s previous mouse, I won’t be rushing out to spend £55 on the latest model.

As it happens, I bought a mouse to match my white MacBook a couple of weeks ago. Ironically, it’s from Microsoft – the Arc mouse – and it manages to look good, feel good, and fold up for transportation with its (tiny) transceiver neatly attached (with a magnet) to the underside. It seems that Jonathan Ive is not the only person who can design functional and stylish computer hardware (most of the time).

Microsoft and Sky launch Sky Player on Windows 7

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

In my post earlier this evening about the Windows 7 launch, I mentioned new content providers in Windows Media Center and that was one of the other big announcements today – Sky and Microsoft announced the addition of Sky Player, Sky’s online TV service, to Windows Media Center in Windows 7, creating a new and exciting way to watch live and on-demand TV from Sky on a PC.

In a few days’ time (27 October), Sky Player will also be available on the Xbox 360 and, whilst it can also be accessed from other platforms, the integration into Microsoft’s media offerings is part of Microsoft’s multi-screen entertainment strategy, which will bring a wide range of live and on-demand entertainment programmes to the Windows platform.

The service will give Windows 7 users access to the live and on-demand pay TV currently available via Sky Player, including movies, sports, entertainment, children’s programming, music, arts and documentaries.

Sky Player in Windows Media Center on Windows 7

For existing Sky TV customers, Sky Player in Windows 7 provides an alternative to their set-top box for viewing Sky TV. For new customers, Sky Player in Windows 7 will offer a wide range of live channels and on-demand content via a number of monthly subscription packages.

According to the press release:

“Windows 7 enables audiences with a broadband Internet connection to watch TV from Sky on a PC. In addition to offering digital music, photos and personal videos all in one place, Windows 7 makes it easier to discover great TV, sports and movies from Sky straight from the PC desktop, via a new desktop gadget. Users can also browse programme galleries or search for shows using keywords.”

Ashley Highfield, Managing Director and Vice President Consumer and Online at Microsoft UK (formerly of both the BBC and Project Kangaroo), commented:

“The way UK consumers interact with TV is changing. Audiences now want to consume great quality TV anywhere and at any time and are demanding a lot more from their TV experience. With the launch of Windows 7 and through partners such as Sky, we are making new things possible and delivering TV to British viewers the way they want it”

Sky recommends a 2Mbps broadband connection and, when asked whether the content was high definition, Sky’s Director of On Demand, Griff Parry, said that the company is looking to improve the service over time but, for now, the focus is on great quality standard definition programming.

Maybe one day we’ll see the BBC’s iPlayer integrated in a similar manner – I certainly hope so!