Configuring RPC over HTTP for access to Exchange Server

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Microsoft Outlook Web Access (OWA) is great for occasional access to e-mail but, if you’re using a non-Microsoft browser (as I often do), it degrades to a rather sorry state. Consequently, for a couple of years now, I’ve been meaning to get RPC over HTTP (a.k.a. Outlook Anywhere) working so that I can use a full Outlook client to access my Exchange Server mailbox when I’m on the road (iPhone access to Exchange Server via IMAP, or Outlook Mobile Access from my Nokia 6021, is useful for checking for messages throughout the day, but I need to run the full Outlook client to filter out the junk e-mail). After doing most of the preparation work some time ago, I didn’t get around to testing it fully – mostly because a lot of my access is from behind an authenticated proxy (and I’m told that Outlook doesn’t like anything getting in the way).

Tonight, I’m in a hotel, and the iBahn connection has no such restrictions, so I finally got around to testing the connection, using Outlook 2007 to communicate with an Exchange Server 2003 (SP2) server.

Full details may be found in Microsoft knowledge base article 833401 but, as ever, I found Daniel Petri’s articles on the subject more useful:

For me, the process was simplified as I already had OWA working over HTTPS. Firstly, as Daniel highlights, Harry Bates’ RPCNoFrontEnd utility can save a lot of time in checking that the registry keys are correctly set for the RPC proxy server ports, and the Windows Server 2003 resource kit rpccfg /hd command is useful to confirm their operation:

Using rpccfg to confirm the RPC proxy settings
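For anyone checking by hand, something like the following (run at a command prompt on the RPC proxy server) will do the same job – the registry path is the documented location for the RPC proxy port assignments:

    rem Check the RPC proxy port assignments in the registry
    reg query "HKLM\SOFTWARE\Microsoft\Rpc\RpcProxy" /v ValidPorts

    rem Confirm their operation with the Windows Server 2003 resource kit tool
    rpccfg /hd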

Secondly, running outlook /rpcdiag gave some useful diagnostic information for confirming that the connection was indeed using HTTPS:

Using Outlook 2007’s RPC diagnostics to check connection status
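For reference, the diagnostics are invoked by starting Outlook from the Run dialogue (or a command prompt) with the appropriate switch:

    rem Start Outlook with the RPC connection status window displayed
    outlook.exe /rpcdiag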

Ironically, I’ve finally got this working with Exchange Server 2003 just before I’m about to move my mail over to a new server running Exchange Server 2007!

Windows Vista and Office 2007 deployment brain dump

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

This week I’m working on a Desktop Deployment Planning Services (DDPS) engagement with a customer. It’s been a while since I last looked at deployment (basically I haven’t done anything since I passed the Windows Vista and Office 2007 deployment exam) so I’m revising my notes in preparation for a workshop tomorrow.

As a supplement to my previous post on a BDD 2007 overview and Office 2007 customisation and deployment using BDD 2007, this is a rollup of just about everything I could lay my hands on about Vista and Office deployment. It’s not particularly well structured – let’s just call it a “brain dump”. If anyone has anything extra to add, please leave a comment at the end of this post:

Windows imaging technologies

  • ImageX (imagex.exe) is a command line tool for manipulating Windows Imaging Format (.WIM) files. It is built using the Windows imaging API (WIMGAPI) together with a .WIM file system filter.
  • Windows Vista images are HAL-independent and make use of single instance storage. To minimise the amount of space used by Windows Vista installation images, use imagex.exe to apply images to separate folders on a computer and then append these images to the final image.
  • Windows System Image Manager (SIM) is used to create and maintain answer files.
  • To modify an image, use imagex.exe to mount it and then apply an unattended setup answer file (unattend.xml) – see the sketch after this list.
  • Package Manager (pkgmgr.exe) can be used to update both image files and computers that have already had an image applied:
    • When used to update computers that have already had an image applied, pkgmgr.exe can install, configure and update features in Windows Vista (e.g. installed components, device drivers, language packs, updates). It can also be used with an unattended installation answer file for new installations.
    • When adding additional drivers to an existing Windows Vista image, use pkgmgr.exe to add the drivers from a folder.
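As a sketch of how those pieces fit together (the paths, image index and file names here are examples – adjust to suit):

    rem Mount the image read/write to an empty folder (image index 1)
    imagex /mountrw c:\images\install.wim 1 c:\mount

    rem Use Package Manager to apply an unattended answer file offline
    pkgmgr /o:"c:\mount;c:\mount\windows" /n:c:\answer\unattend.xml

    rem Commit the changes and unmount the image
    imagex /unmount /commit c:\mount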

Windows Vista deployment

  • Windows Setup (setup.exe) for Vista is now GUI-only – winnt.exe and winnt32.exe are no more.
  • Windows installation is structured around a number of configuration passes:
    • windowsPE.
    • offlineServicing.
    • generalize.
    • specialize.
    • auditSystem.
    • auditUser.
    • oobeSystem.
  • unattend.xml is a single unattended installation answer file, replacing multiple files in previous versions of Windows – including unattend.txt, cmdlines.txt, winbom.ini, oobeinfo.ini and sysprep.inf.
  • To avoid prompting users for input during the installation of Windows Vista, create an unattended setup installation file and copy this to a USB flash drive, then ensure that the flash drive is present during Windows Vista installation.
  • unattend.xml must be renamed to autounattend.xml when used on removable media during installation and replaces winnt.sif.
  • The Out-of-Box Experience (OOBE) is now known as Windows Welcome and is controlled with oobe.xml, which includes options for Windows Welcome, ISP sign-up and the Windows Vista Welcome Center.
  • Disk repartitioning can be configured in the windowsPE (first) configuration pass of unattend.xml.
  • When using multiple hardware configurations, create a distribution point that includes an Out-of-Box Drivers folder.
  • Windows Deployment Services (WDS) replaces Remote Installation Services (RIS).
  • When using WDS with computers that do not have PXE capabilities, create a WDS discovery image and use this to create a bootable CD for Windows Vista installation.
  • When using WDS on a server that also provides DHCP services, enable DHCP option 60 (PXEClient) and configure WDS not to listen on port 67, as DHCP is already using it (wdsutil /set-server /usedhcpports:no /dhcpoption60:yes).
  • If the WDS Image Capture Wizard is unable to capture a reference computer image, restart the reference computer and run sysprep /generalize /oobe.
  • The Windows Automated Installation Kit (WAIK) replaces deploy.cab and contains updated versions of tools previously provided to OEMs (e.g. Windows PE) for use in corporate deployments.
  • The OEM Preinstallation Kit (OPK) is for system builders, containing the WAIK and additional OEM-specific information (e.g. OEM licensing).
  • bootsect.exe is used to enable deployment alongside earlier versions of Windows with the Windows Vista boot manager (bootmgr.exe) – it replaces fixfat.exe and fixntfs.exe (both included with Windows Vista). Microsoft knowledge base article 919529 has more details.
  • boot.ini has been replaced by the Boot Configuration Data (BCD) store, which is edited with bcdedit.exe.
  • The System Preparation Tool (sysprep.exe) is installed by default on Windows Vista systems in %windir%\system32\sysprep and there are several changes when compared with previous versions (see the example after this list):
    • sysprep /reseal is replaced with sysprep /generalize /oobe.
    • sysprep /factory is replaced by sysprep /audit.
    • sysprep /mini is replaced by sysprep /oobe.
    • sysprep /nosidgen is replaced by sysprep /generalize.
    • sysprep /clean and sysprep /bmsd are deprecated.
    • sysprep /activated is replaced by sysprep /generalize (together with slmgr.vbs for managing the activation status of a computer).
    • OEMs are required to run sysprep /oobe before delivery of new computers.
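To illustrate the new syntax, a typical reseal operation now looks something like this (the /shutdown switch is optional):

    rem Windows XP equivalent: sysprep -reseal -mini
    rem Generalise the installation and boot to Windows Welcome at next startup
    c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown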

Customising Office 2007 installations

  • Windows Installer Patch (.MSP) files can be used to produce customised Office installations (and then called using a script – see the sketch below).
  • Multiple installation shares can be defined within a .MSP file.
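By way of illustration, the customisation workflow runs along these lines (the share and file names are mine – substitute your own):

    rem Launch the Office Customization Tool to create a .MSP file
    \\server\office2007\setup.exe /admin

    rem Install Office using the resulting customisation patch
    \\server\office2007\setup.exe /adminfile \\server\office2007\custom.msp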

Office 2007 deployment

  • To create an Office 2007 installation share (e.g. for scripted deployment), create a shared folder on a server and copy the installation files from the source media to the root of the shared folder.
  • To slipstream Microsoft Office 2007 updates into the deployment, create a folder called updates in the Microsoft Office 2007 distribution folder and copy all updates to this folder.
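A rough sketch of both steps (drive letters, share and folder names are examples):

    rem Create the installation share and copy the source media to its root
    mkdir d:\office2007
    net share office2007=d:\office2007
    xcopy e:\*.* d:\office2007 /e

    rem Slipstream updates by copying .MSP files into the Updates folder
    copy c:\downloads\*.msp d:\office2007\Updates\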

User data migration

  • The User State Migration Tool (USMT) v3.0 can be used with both Windows XP and Windows Vista.
  • miguser.xml can be used to ensure that USMT captures files with a particular extension during migration.
  • The USMT scanstate.exe command can be used with the /p switch to ensure that sufficient free space exists in a target folder.
  • USMT can migrate user state using a network server during an upgrade that involves repartitioning of disks.
  • If the partition table is to be left intact during a migration, use a local partition with sufficient free space for temporary storage.
  • scanstate.exe can scan a source computer, collect files and create a store without modifying the source. The default action is to compress files and store them in an image file (usmt3.mig).
  • loadstate.exe will migrate files and settings from an existing store to the destination computer.
  • The scanstate.exe and loadstate.exe commands have matching command line arguments.
  • Migration XML files include rules to define what should be migrated and are specified with the /i switch:
    • Custom XML files define components to exclude and are created using scanstate /genconfig:config.xml.
    • migsys.xml is used with the /targetxp switch to migrate operating system and browser settings.
    • migapp.xml is used to migrate application settings.
    • miguser.xml is used to migrate user files, folders and filetypes.
    • If the destination computer is running Windows XP, modify miguser.xml, migapp.xml and migsys.xml.
    • If the destination computer is running Windows Vista, modify miguser.xml and migapp.xml but migsys.xml is not supported – use config.xml instead.
    • migxml.xsd can be used to write and validate migration XML files.
  • scanstate /p can be used to create a space estimate file called usmtsize.txt (it will also be necessary to specify /nocompress).
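Pulling those notes together, a typical estimate, capture and restore might look like this (the store path and log file names are examples):

    rem Estimate the space required (creates usmtsize.txt; no store is written)
    scanstate c:\store /p /nocompress

    rem Capture user state from the source computer
    scanstate \\server\migration\store /i:migapp.xml /i:miguser.xml /config:config.xml /l:scan.log

    rem Restore the captured state on the destination computer
    loadstate \\server\migration\store /i:migapp.xml /i:miguser.xml /l:load.log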

Office 2003-2007 interoperability

Localisation

  • To add multiple language support to Office 2007 applications, install the appropriate language pack on the installation share and update config.xml.
  • To add a language pack to an existing computer, use pkgmgr.exe to apply a new unattended setup installation file that references the appropriate language pack (an offline sketch follows this list).
  • If the Windows SIM is unable to access language pack settings in a customised Windows Vista image, generate a new catalog based on the custom image.
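For the offline case, I believe a language pack can also be injected directly into an image – the paths, sandbox folder and lp.cab location below are examples (lp.cab comes from the language pack media):

    rem Mount the custom image and install a language pack offline
    imagex /mountrw c:\images\custom.wim 1 c:\mount
    pkgmgr /o:"c:\mount;c:\mount\windows" /ip /m:c:\langpacks\de-de\lp.cab /s:c:\sandbox
    imagex /unmount /commit c:\mount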

Further reading

My living room PC finally becomes a reality

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Some time back, I wrote about my plans for a living room PC but before this could happen there were several hurdles to overcome:

  1. Earn enough money to replace my Mac Mini so that it could move to the living room.
  2. Negotiate the wife approval factor for IT in a common area of the house (must look good – hence Mac Mini).
  3. Find a software setup that works for me, but is also consumer-friendly for the rest of the family (i.e. no hint of a Windows, OS X or Linux interface).

Fortunately, the first two items came together for me quite easily – after I decided to raid my savings and buy a MacBook a couple of months back, my wife asked me which PC it was replacing and I said “that one” (pointing at the Mac Mini), never expecting the response that she gave – “Oh. I like that one. It’s cute.”! This was a revelation to me – my wife has never before referred to any of my IT as “cute” – so I grabbed the moment to say something like “yeah, I thought it could go in the living room for when we watch films and stuff” (and the lack of any objection was interpreted as implicit approval).

The hardest part was the software setup. I still feel that Windows Vista’s Media Center capabilities are vastly superior to Apple’s Front Row – there’s TV support in there for starters. There are other options too: EyeTV would add TV support to the Mac (the problem is that it only has a one-year TV guide subscription in Europe); Center Stage looks promising, but is still an alpha release product (and has been for a while now); MythTV could work too (but my research suggested that USB TV tuner support could be a bit of a ‘mare). Then I thought about it a bit harder – we have lousy terrestrial TV support in my house (digital or analogue), so the clearest TV signal I have is on satellite (Freesat from Sky). Unless I can find a way to interface the Mac with my digibox, TV on the living room PC is not going to happen (this solution looks interesting but is for Windows/Linux only). Which meant that the criteria for a living room PC were:

  • Access to my iTunes library (which lives on my MacBook) to watch podcasts, listen to music, etc.
  • Ability to play content from CD/DVD.
  • Ability to play content ripped from DVD or obtained by other means (e.g. home movies or legally obtained digital downloads).
  • Simple (no technical skills required) user interface.

So, ruling out the need for TV integration meant that Apple Front Row was suddenly a contender as the OS X 10.5 (Leopard) version of Front Row can access shared libraries from other PCs (so I don’t have to copy/convert the media) and the remote control supplied with the Mac Mini does not rely on line of sight to control the PC. Vista Media Center could do this but I’d need to have that huge IR receiver and ugly remote control. Of course, I could just buy an Apple TV, but the Mac Mini gives me so much more (and it can output to my aging, but still rather good Sony Trinitron 32″ widescreen TV). I will stress though, that if I ever manage to get a decent TV signal from our aerial, Windows Vista Media Center would beat Front Row hands down.

It’s ironic that I’m writing this in a hotel room in Bolton (nowhere near my living room) but so far, the software stack on the Mini is:

I may have to add a few more codecs over time (but Perian seems to include most of what I need) and, furthermore, this headless PC (sorry, Mac – for the purists out there) is suiting my requirements pretty well. I’ve watched films from the hard disk with no issues at all, and streamed video podcast (and audio) content from my MacBook across an 802.11g Wi-Fi network with no apparent playback issues – despite the signal having to travel through several walls to the furthest corner of my living room – although I wish iTunes would mark podcasts as played when viewed remotely. There is one caveat though (and that’s a hardware issue) – even though the Leopard version of Front Row supports DVD playback, I still watch DVD content on my home theatre setup as that gives me true 5.1 surround sound (it’s only stereo on the Mac Mini).

Other applications that will probably find their way onto the mini over time include:

Those are for the future though – at this point in time, I’m ripping my content (and performing iPod conversion as required) on other machines and transferring it to the Mini across the network (see below). I also tried running a Windows XP virtual machine for downloading from BBC iPlayer and Channel 4 On Demand but didn’t manage to set up the tools that are necessary to remove the Windows Media DRM in order to play the content on the Mac (I do at least have a PC running older software releases that I can use for that).

As for getting media onto the living room PC (the stuff that I don’t want to stream across the network), I can always plug in a USB drive and control it remotely using the screen sharing capabilities in OS X.

Screen sharing in OS X to access the living room TV remotely

OS X screen sharing is only VNC under the covers but it works well across my network (and scales the display accordingly – it even gave a decent representation of my 1680×1050 resolution display downscaled to fit on a 1280×800 MacBook display, although the picture here is with the Mac Mini hooked up to a standard definition TV). One point to note – it’s necessary to disconnect the shared screen session before trying to control the living room PC with the remote control, or else it won’t work.
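Incidentally, if clicking through the Finder seems like hard work, a sharing session can also be started from a Terminal prompt – the Bonjour name below is an example (use whatever your Mini announces on the network):

    # Open a screen sharing session to the Mac Mini
    open vnc://macmini.local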

Further reading

Wikipedia article on Apple Front Row.
MacInTouch Intel Mini Home Theater.
Ars Technica review of various Mac Mini media solutions.

And finally…

This guy has some great details of how he set up his Mac Mini in his living room – he uses MythTV but gives some good details about mounting the hardware to only take up 97mm (3.8″) of room space.

At last, a new service pack for Windows XP

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

It’s been a long time coming – almost 4 years – but Microsoft has just announced that Windows XP service pack 3 has been released to manufacturing.

In the announcement, release manager Chris Keroack said:

“Today we are happy to announce that Windows XP Service Pack 3 (SP3) has released to manufacturing (RTM). Windows XP SP3 bits are now working their way through our manufacturing channels to be available to OEM and Enterprise customers.

We are also in the final stages of preparing for release to the web […] on April 29th, via Windows Update and the Microsoft Download Center. Online documentation for Windows XP SP3, such as Microsoft Knowledge Base articles and the Microsoft TechNet Windows XP TechCenter, will be updated then. For customers who use Windows XP at home, Windows XP SP3 Automatic Update distribution for users at home will begin in early summer.”

Microsoft released a white paper a few months back, detailing the improvements that Windows XP service pack 3 will bring – XP SP3 is basically a rollup of updates and some of the networking technologies from Windows Vista (for example, NAP client support).

Accessing unsecured Wi-Fi – is it a crime?

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Whilst I was researching my earlier post about WiMax in Milton Keynes, I came across an article on The Register about a couple of guys who got themselves arrested for accessing someone’s open Wi-Fi connection.

The comments make interesting reading – I recommend a read but will warn you that there are 111 of them, so you’d better be good at skim reading!

There are lots of useful analogies there. The general consensus seems to be that, if a Wi-Fi access point is open, then you are inviting people to come in – especially with most wireless cards configured to connect to the strongest available signal – and that, if it’s secured, then it is clearly a private computer system. I found a few of the comments particularly interesting after reading Section 1 of the Computer Misuse Act 1990 (I’m sure other laws could equally be applied):

Unauthorised access to computer material
(1) A person is guilty of an offence if—
(a) he causes a computer to perform any function with intent to secure access to any program or data held in any computer;
(b) the access he intends to secure is unauthorised; and
(c) he knows at the time when he causes the computer to perform the function that that is the case.
(2) The intent a person has to have to commit an offence under this section need not be directed at—
(a) any particular program or data;
(b) a program or data of any particular kind; or
(c) a program or data held in any particular computer.
(3) A person guilty of an offence under this section shall be liable on summary conviction to imprisonment for a term not exceeding six months or to a fine not exceeding level 5 on the standard scale or to both.

[Computer Misuse Act, 1990]

Based on this it could be argued that, if an access point is broadcasting SSIDs and is unencrypted, then a person cannot know that the access they intend to secure is unauthorised. It could also be argued that, by broadcasting its presence, the access point accessed any computers with wireless cards in the area without their respective owners’ permissions. Or consider, as another commenter highlighted, what happens when pinging a computer’s IP address – is that not requiring the other computer to perform an action (even if that action is to reject ping requests, it still has to read the packet)? What about accessing a web server – did I explicitly give you permission to come here and read this article? No, but by publishing this website I gave implicit permission, which is expanded further in my legal notice. Ergo, by leaving a wireless access point open and broadcasting its SSID, I would be giving implicit permission to access it.

I know there’s at least one Copper who reads this blog and I’m sure he has an opinion. As of course, do I. And that’s why I locked down my Wi-Fi.

Usual caveats apply: I am not a lawyer; don’t interpret anything you read here to be legal advice; etc., etc.

WiMax in Milton Keynes – not at that price, thank you!

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Last year, I wrote a post about free Wi-Fi provision in central Milton Keynes. I wasn’t very impressed (although I’d like to see the service prosper) but have to admit that I haven’t tried it since. In the same post, I also mentioned that there was a WiMax trial planned for Milton Keynes and a few weeks back, after hearing nothing for over a year, I received an e-mail to tell me that it is now available in my area.

This sounded good – I have “up to 8Mbps” ADSL at home and my router tells me that I get about 7.2Mbps downstream with about 448Kbps upstream, but if I could get good upstream bandwidth too then that would be an advantage. Then I noticed two things that put me off.

Firstly, the service is provided by Connect MK – who claim to be:

“A Council company created to provide better broadband services for Milton Keynes”

WTF! Milton Keynes Council appears to me to be incapable of managing anything of any substance (of course, that is purely a personal opinion, based on my experience as a Council Tax payer). In the small town where I live (under the control of the unitary authority that is Milton Keynes Council) we have: a secondary school that opened 8 months late and £3m over budget [source: political propaganda for the upcoming local elections], with design changes that mean it stands out like a blot on our (pleasant) landscape; a backlog of road repairs; short-sighted planning decisions with councillors supporting further expansion without any of the supporting infrastructure (including the grid road system that has worked so well for the last 30 years in urban Milton Keynes); etc., etc. (my list could go on and on, but let’s stop here – you get the idea). Now the same council wants to provide network infrastructure services. It’s not 1 April, is it? Not according to my calendar, anyway.

Secondly, the price: a 1Mbps downstream/512Kbps upstream package with a 10GB download limit is advertised for £20 a month; 2Mbps down and 512Kbps up with a 20GB allowance is £25; but 2Mbps down and 1Mbps up with a 40GB allowance is a staggering £50 a month! Are they joking?

I pay around £30 for my small business ADSL service and I have no issues with bandwidth allowances (my current ISP operates a system of peak and off-peak usage, and the off-peak usage really is unlimited, with peak usage rates depending upon the tariff). If I wanted a residential service I could pay a lot less than that. For that matter, I can get HSPA mobile broadband Internet for £15 a month on an 18-month contract.

As it happens, Connect MK is a reseller for the infrastructure provided by FREEDOM4 (formerly Pipex Communications). Interestingly, despite having supplied my home address and postcode details to Pipex, and Connect MK having e-mailed me to say “Great news – You can now receive a WiMAX Broadband Service”, neither the current FREEDOM4 coverage map nor the coverage checker on their website indicates that I can receive the service – at this time it only seems to cover urban areas of Milton Keynes. It doesn’t say much for Connect MK’s ability to provide a reliable service when they haven’t even worked out that I live 10 miles outside their coverage area.

Regardless of the network coverage, I fail to see who would even consider the Connect MK WiMax service as an alternative to ADSL or cable. At the prices quoted, I can’t imagine much of Milton Keynes’ population getting connected with Connect MK.

Caching Microsoft updates with ISA Server

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I used to use WSUS to update the machines on my home network but, after a botched server upgrade, it all went screwy and I didn’t really want to have to pull all the updates down over my ADSL connection again (that would probably blow away my month’s worth of “fair usage”). In any case, all I was doing was blindly approving updates for installation, so I might as well use the Microsoft Update servers instead.

The only downside of using the Microsoft Update servers to update several computers is that there is a lot of duplication in the network traffic. That’s why ISA Server 2006 includes a cache rule that enables caching of Microsoft updates using the Background Intelligent Transfer Service (BITS). For those who aren’t aware, BITS allows the transfer of large volumes of data without degrading network performance: it transfers the data in small chunks, utilising unused bandwidth as it becomes available, and reassembles the data at the destination. The BITS feature is not available for any other ISA cache rule but the Microsoft Update Cache Rule is installed by default and all I needed to do was enable caching.

After setting up the cache rules and updating one of my servers, I wanted to see that the cache was being used. I’ve previously mentioned the cachedir.exe tool that can be used to examine ISA Server caches and I downloaded the latest version from Microsoft’s ISA Server 2006 tools page. After extracting the tool, I ran it and was presented with an error:

CACHEDIR.exe – Unable to Locate Component
This application has failed to start because msfpc.DLL was not found. Re-installing the application may fix the problem.

Then I remembered that cachedir.exe needs to be copied to the ISA Server installation folder (on my system that is %programfiles%\Microsoft ISA Server) – after moving the file to the correct folder, it fired up as expected. Just remember that this utility can only display the cache contents that have been written to disk. To flush the memory cache to disk you will need to restart the Microsoft Firewall service and re-run cachedir.exe to view the contents.
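In command form, the whole check looks something like this (fwsrv is, as I recall, the service name for the Microsoft Firewall service – and be aware that restarting it will briefly interrupt clients):

    rem Copy the tool to the ISA Server installation folder so it can find msfpc.dll
    copy cachedir.exe "%programfiles%\Microsoft ISA Server"

    rem Flush the in-memory cache to disk by restarting the Microsoft Firewall service
    net stop fwsrv
    net start fwsrv

    rem View the on-disk cache contents
    "%programfiles%\Microsoft ISA Server\cachedir.exe"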

Comparing internal and USB-attached hard disk performance in a notebook PC

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Recently, I was in a meeting with a potential business partner and their software was performing more slowly than they had expected in the virtual environment on my notebook PC. The application was using a SQL Server 2005 Express Edition database and SQL Server is not normally a good candidate for virtualisation but I was prepared to accept the performance hit as I do not want any traces of the product to remain on my PC once the evaluation is over.

Basic inspection using Task Manager showed that neither the virtual nor the physical system was stressed from a memory or CPU perspective but the disk access light was on continuously, suggesting that the application was IO-bound (as might be expected with a database-driven application). As I was also running low on physical disk space, I considered whether moving the VM to an external disk would improve performance.

On the face of it, spreading IO across disk spindles should improve performance but with SATA hard disk interfaces providing a theoretical data transfer rate of 1.5-3.0Gbps and USB 2.0 support at 480Mbps, my external (USB-attached) drive is, on paper at least, likely to result in reduced IO when compared with the internal disk. That’s not the whole story though – once you factor in the consideration that standard notebook hard drives are slow (4200 or 5400RPM), this becomes less of a concern as the theoretical throughput of the disk controller suddenly looks far less attainable (my primary hard drive maxes out at 600Mbps). Then consider that actual hard disk performance under Windows is determined not only by the speed of the drive but also by factors such as the motherboard chipset, UDMA/PIO mode, RAID configuration, CPU speed, RAM size and even the quality of the drivers and it’s far from straightforward.

I decided to take a deeper look into this. I should caveat this with a note that performance testing is not my forte but I armed myself with a couple of utilities that are free for non-commercial use – Disk Thruput Tester (DiskTT.exe) and HD Tune.

Both disks were attached to the same PC, a Fujitsu-Siemens S7210 with a 2.2GHz Intel Mobile Core 2 Duo (Merom) CPU, 4GB RAM and two 2.5″ SATA hard disks but the internal disk was a Western Digital Scorpio WD1200BEVS-22USTO whilst the external was a Fujitsu MHY2120BH in a Freecom ToughDrive enclosure.

My (admittedly basic) testing revealed that although the USB device was a little slower on sequential reads, and quite a bit slower on sequential writes, the random access figure was very similar:

                    Internal (SATA) disk   External (USB) disk
Sequential writes   25.1MBps               22.1MBps
Sequential reads    607.7MBps              570.8MBps
Random access       729.3MBps              721.6MBps

Testing was performed using a 1024MB file, in 1024 chunks and the cache was flushed after writing. No work was performed on the PC during testing (background processes only). Subsequent re-runs produced similar test results.

Disk throughput test results for internal disk
Disk throughput test results for external (USB-attached) disk

Something doesn’t quite stack up here though. My drive is supposed to max out at 600Mbps (not MBps) so I put the strange results down to running a 32-bit application on 64-bit Windows and ran a different test using HD Tune. This gave some interesting results too:

                        Internal (SATA) disk   External (USB) disk
Minimum transfer rate   19.5MBps               18.1MBps
Maximum transfer rate   52.3MBps               30.6MBps
Average transfer rate   40.3MBps               27.6MBps
Access time             17.0ms                 17.7ms
Burst rate              58.9MBps               24.5MBps
CPU utilisation         13.2%                  14.3%

Based on these figures, the USB-attached disk is slower than the internal disk but what I found interesting was the graph that HD Tune produced – the USB-attached disk was producing more-or-less consistent results across the whole drive whereas the internal disk tailed off considerably through the test.

Disk performance test results for internal disk
Disk performance test results for external (USB-attached) disk

There’s a huge difference between benchmark testing and practical use though – I needed to know if the USB disk was still slower than the internal one when it ran with a real workload. I don’t have any sophisticated load testing tools (or experience) so I decided to use the reliability and performance (performance monitor) capabilities in Windows Server 2008 to measure the performance of two identical virtual machines, each running on a different disk.

Brent Ozar has written a good article on using perfmon for SQL performance testing and, whilst my application is running on SQL Server (so the article may help me find bottlenecks if I’m still having issues later), by now I was more interested in the effect of moving the virtual machine between disks. It did suggest some useful counters to use though:

  • Memory – Available MBytes
  • Paging File – % Usage
  • Physical Disk – % Disk Time
  • Physical Disk – Avg. Disk Queue Length
  • Physical Disk – Avg. Disk sec/Read
  • Physical Disk – Avg. Disk sec/Write
  • Physical Disk – Disk Reads/sec
  • Physical Disk – Disk Writes/sec
  • Processor – % Processor Time
  • System – Processor Queue Length

I set this up to monitor both my internal and external disks, and to log to a third external disk so as to minimise the impact of the logging on the test.
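For anyone who prefers the command line, logman.exe can build a similar collector – this is a sketch (the counter list is abbreviated and the output path, on a third disk, is an example):

    rem Create a counter-based data collector set, sampling every 5 seconds
    logman create counter VMDiskTest -si 00:00:05 -o f:\perflogs\vmdisktest ^
      -c "\PhysicalDisk(*)\% Disk Time" "\PhysicalDisk(*)\Avg. Disk Queue Length" ^
         "\Memory\Available MBytes" "\Processor(_Total)\% Processor Time"

    rem Start the collection, run the test, then stop it
    logman start VMDiskTest
    logman stop VMDiskTest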

Starting from the same snapshot, I ran the VM on the external disk and monitored the performance as I started the VM, waited for the Windows Vista Welcome screen and then shut it down again. I then repeated the test with another copy of the same VM, from the same snapshot, but running on the internal disk.

Sadly, when I opened the performance monitor file that the data collector had created, the disk counters had not been recorded (which was disappointing) but I did notice that the test had run for 4 minutes and 44 seconds on the internal disk and only taken 3 minutes and 58 seconds on the external one, suggesting that the external disk was actually faster in practice.

I’ll admit that this testing is hardly scientific – I did say that performance testing is not my forte. Ideally I’d research this further and I’ve already spent more time on this than I intended to but, on the face of it, using the slower USB-attached hard disk still seems to improve VM performance because the disk is dedicated to that VM and not being shared with the operating system.

I’d be interested to hear other people’s comments and experience in this area.

Waiting for Windows 7: is Vista really that bad?

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I was at an event last week where Gareth Hall, UK Product Manager for Windows Server 2008, commented on the product’s fantastic press reviews, with even Jon Honeyball (who, it seems, is well known for his less-than-complimentary response to Microsoft’s output of late) writing that:

“Server 2008 excels in just about every area [… and] is certainly ready for prime time. There’s no need to wait for Service Pack 1”

[Jon Honeyball, PC Pro, February 2008]

It seems that, wherever you look, Windows Server 2008 is almost universally acclaimed. And rightly so – I believe that it is a fantastic operating system release (let’s face it, Windows Server 2003 and R2 were very good too) and is packed full of features that have the potential to add significant value to solutions.

So, tell me, why are the same journalists who think Windows Server 2008 is great still berating Windows Vista – the client version of the same operating system codebase? Sure, Vista is for a different market, Vista has different features, and it’s only fair to say that Vista took some time to bed down but, after more than a year of continuous updates and a major service pack, is it really that bad?

This week, IT Week is running a leader on the “migration muddle” that organisations face. Should IT managers skip Vista and go straight to Windows 7, with Bill Gates allegedly saying that “sometime in the next year we will have a new version [of Windows]”?

The short answer is “No!”. My advice is either to move to Vista now and save the pain of trying to jump two or three releases to Windows 7 later, or accept a more pragmatic approach of managed diversity.

The trouble is that Microsoft has muddied the water by dropping hints about what the future may hold. What was once arguably the world’s biggest and best marketing machine seems to have lost its way recently – either maintain the silence and keep us guessing what Windows 7 means, or open up and let us decide whether it’s worth the wait. With the current situation, IT managers are confused: the press are, by and large, critical of Vista; consumers and early adopters have complained of poor device support (not Microsoft’s fault); and even Microsoft seems ready to forget about pushing its current client operating system and move on to the next big thing.

In all my roles – as a consultant, an infrastructure architect, a Microsoft partner and, of course, as a blogger – I’d love to know more about Windows 7, and Microsoft does need to be more transparent if it expects customers to make a decision. Instead, it seems to be hoping that hints of something new that’s not Vista will help to sell Enterprise Agreements (complete with Software Assurance) to corporates.

Moving virtual machines between disks in Hyper-V

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

The trouble with running Microsoft Hyper-V on a notebook PC is that notebook PCs typically don’t have large hard disks. Add a few snapshots and a virtual machine (VM) can quickly run into tens or even hundreds of gigabytes, which meant that I needed to move my VMs onto an external hard disk.

In theory at least, there should also be a performance increase from moving the VMs off the system disk and onto a separate spindle; however, that’s not straightforward on a notebook PC as second disks will (normally) be external (and therefore using a slower USB 2.0 interface, rather than the internal SATA controller) – anyway, in my case, disk space was more important than any potential performance hit.

Moving VMs around under Hyper-V is not as straightforward as in Virtual Server; however there is an export function in Hyper-V Manager that allowed me to export a VM to my external hard disk, complete with snapshots (Ken Schaefer describes the equivalent manual process for moving a Hyper-V VM on his blog).

The exported VM is still not ready to run though – it needs to be imported again but the import operation is faster as it doesn’t involve copying the .VHD file (and any associated snapshots) to a new location. After checking that the newly imported VM (with disk and snapshot storage on the external drive) would fire up, I deleted the original version. Or, more accurately, I would have done if I hadn’t run out of disk space in the meantime (Windows Server 2008 doesn’t like it when you leave it with only a few MB of free space).

Deleting VMs is normally straightforward, but my machine got stuck half way through the “destroy” process (due to the lack of hard disk space upsetting my system’s stability) and I failed to recover from this, so I manually deleted the files and restarted. At this point, Hyper-V Manager thought that the original VM was still present but any attempt to modify VM settings resulted in an error (not surprising, as I’d deleted the virtual machine’s configuration file and the virtual hard disks). What I hadn’t removed, though, was the shortcut (symbolic link) pointing to the virtual machine configuration on my external hard disk. Deleting this file from %systemdrive%\ProgramData\Microsoft\Windows\Hyper-V\Virtual Machines and refreshing Hyper-V Manager left me with a clean management console again.
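For anyone in the same position, the cleanup amounts to something like this (the GUID-named file is a placeholder – check the folder listing for the one matching the dead VM):

    rem List the symbolic links that register each VM with the management service
    dir "%systemdrive%\ProgramData\Microsoft\Windows\Hyper-V\Virtual Machines"

    rem Delete the stale link for the removed VM, then refresh Hyper-V Manager
    del "%systemdrive%\ProgramData\Microsoft\Windows\Hyper-V\Virtual Machines\{GUID}.xml"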