Windows Server 2008 is a great workstation operating system too

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

It took me months to convince my manager that I needed a new laptop. Then it took me a few more to convince the IT department of the specification I needed (and to prise it out of their hands) but today I finally got my hands on it. It’s nothing special – I’d like a ThinkPad but, as my employer owns one half of Fujitsu-Siemens Computers, it is a Lifebook S7210 – and it’s not a bad machine either (especially as this one has 4GB of RAM in it). Why do I need that? Because I’m the technology lead for Windows Server 2008 and Hyper-V in our Microsoft Practice – and I want to “dogfood” the technology.

The thing is, Windows Server 2008 is not really a client operating system. Except it can be… Windows Server 2008 has a lot in common with Windows Vista and, with a few tweaks, I had it working just as I want it. A Windows desktop on steroids really:

  1. Start with a notebook PC with hardware-assisted virtualisation capabilities, No eXecute (NX)/eXecute Disable (XD) protection, and a 64-bit capable CPU. My Lifebook S7210 has all of those things, so on to step 2…
  2. Next, I needed an operating system – Windows Server 2008 Standard Edition would do the trick (after all I only have a single CPU and won’t be clustering laptops!), but the licensing model for Windows Server and virtualisation lends itself to using Windows Server 2008 Enterprise Edition (64-bit).
  3. Windows Server 2008 is not a supported operating system for this hardware but Windows Vista is. Installing x64 drivers for Windows Vista got my graphics and WiFi up and running but I still need to find drivers for some of the other components (like the built-in card reader).
  4. Next, installing the server roles that I want to use – Hyper-V for starters. Just make sure that the BIOS support for Intel-VT or AMD-V and NX/XD is enabled first.
  5. With the operating system installed, it’s time to get to work turning on some of the client features that are missing from a server operating system (thanks to Vijayshinva Karnure for his original post and subsequent follow-up, as well as this post from Stuart Maxwell):
    • Turn off the Internet Explorer enhanced security configuration (ESC) – it’s fine for servers that shouldn’t be browsing the Internet anyway, but for a workstation it just gets in the way (and encourages bad practice by putting lots of sites into the trusted zone).
    • Install the Desktop Experience feature – providing many of the Windows Vista capabilities that are not there by default in Windows Server 2008.
    • Set the Themes service to start automatically – and start it.
    • Ditto for the Windows Audio service.
    • Install the Windows Search service (part of the File Services role) – Outlook will use this for indexing e-mail.
    • Edit the local security policy to set Display Shutdown Event Tracker to Disabled.
    • Enable Windows Aero in the appearance settings (may require a reboot, and possibly re-installation of video drivers).
    • In Control Panel, System, Advanced System Settings, Performance Options, set the required visual effects – I found that if I let Windows adjust for best appearance, it reverted to the Windows Vista Basic colour scheme but if I selected a custom configuration with all effects selected except Animate Controls and Elements inside Windows, I could keep Aero, complete with Flip 3D.
      Windows Aero Flip-3D
    • Also in the advanced system settings, set the processor scheduling to favour programs.
    • Enable Superfetch. Starting the Superfetch service will fail until some registry changes are made:

      Windows could not start the Superfetch service on computername.
      Error 197: The operating system is not presently configured to run this application.

      The solution is to create two new registry values under the PrefetchParameters key, after which the service should start successfully:

      Windows Registry Editor Version 5.00

      [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management\PrefetchParameters]
      "EnablePrefetcher"=dword:00000003
      "EnableSuperfetch"=dword:00000003

    • Edit the power settings to allow hard disks to spin down after 20 minutes when running on mains power (and 5 when on battery power).
  6. Finally, install browser plug-ins (Flash, Silverlight, etc.) and application software (e.g. Microsoft Office).
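
The service tweaks in step 5 can be scripted rather than clicked through. This is a minimal sketch (not from the original post), assuming the Service Control Manager short names Themes and AudioSrv for the Themes and Windows Audio services; the commands are built as data and only executed when the script runs on Windows:

```python
import subprocess
import sys

# Services that Windows Server 2008 leaves set to manual, but a
# workstation needs running. Short names as used by sc.exe:
# Themes = Themes service, AudioSrv = Windows Audio.
CLIENT_SERVICES = ["Themes", "AudioSrv"]

def sc_commands(service):
    """Build the sc.exe invocations to set a service to auto-start, then start it."""
    return [
        ["sc", "config", service, "start=", "auto"],  # note: sc requires "start= auto"
        ["sc", "start", service],
    ]

if __name__ == "__main__" and sys.platform == "win32":
    for svc in CLIENT_SERVICES:
        for cmd in sc_commands(svc):
            subprocess.run(cmd, check=False)
```

The same approach extends to the Windows Search service once the File Services role is installed.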

Windows Server 2008 running as a workstation

I still need to configure our corporate anti-virus solution and VPN software (I may have some problems there as it has a dependency on a firewall product that does not work with Vista SP1 or, I imagine, Windows Server 2008). Why we insist on a third-party firewall when Windows has one built in, I still don’t know – but my VPN connection won’t work without it. I also need to work out if I can get hibernation to work on Windows Server 2008. Once that’s done, I should have a fully functional Windows Workstation 2008, with built-in hypervisor-based virtualisation. Sweet.

Internet Explorer search provider for

Earlier today I had a go at creating a new search provider for Internet Explorer (IE) 7.0 so that I can search the website for information. It’s not of much practical use to anyone except to me but it is incredibly easy to achieve and works well. This is the resulting OpenSearch XML that IE generated for me:

  <?xml version="1.0" encoding="UTF-8" ?>
  <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <Description> provider</Description>
  <Url type="text/html" template="{searchTerms}" />
  </OpenSearchDescription>

There’s more information on adding search providers to IE 7 using OpenSearch 1.1 at the IEBlog.
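
For comparison, the same document format can be generated programmatically. A quick sketch (the site name, description and URL here are placeholders, not the real provider details):

```python
import xml.etree.ElementTree as ET

# The OpenSearch 1.1 namespace; registering it as the default namespace
# makes ElementTree emit a plain xmlns="..." attribute on the root.
NS = "http://a9.com/-/spec/opensearch/1.1/"
ET.register_namespace("", NS)

root = ET.Element(f"{{{NS}}}OpenSearchDescription")
ET.SubElement(root, f"{{{NS}}}ShortName").text = "Example site search"
ET.SubElement(root, f"{{{NS}}}Description").text = "Example site search provider"
ET.SubElement(root, f"{{{NS}}}Url", {
    "type": "text/html",
    # {searchTerms} is substituted by the browser with the user's query
    "template": "https://www.example.com/?s={searchTerms}",
})

xml = ET.tostring(root, encoding="unicode")
print(xml)
```

IE 7 picks the description up via a link on the page or the search provider dialogue; the template just needs to point at the site’s existing search URL.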

Fujitsu opinion on virtualisation

I normally keep this blog separate from my work at Fujitsu; however I’ve recently been involved in the production of an article about virtualisation and now that it’s been published, I’d like to highlight its existence on the Fujitsu UK website:

Virtual Reality

Make sure that virtualisation pays – and keeps on paying
Once a niche technology for test and development environments, virtualisation has moved into the mainstream as organisations embrace the benefits in efficiency and flexibility that it offers. It’s our opinion that virtualisation in itself is neither a quick fix for complex IT environments nor a guaranteed source of bottom-line benefits. It all depends on how you manage the technology – and how you adapt the processes and culture of your organisation to new ways of working.

Of course, although I’m attributed as the author, there was a team of people involved in the production of this document and I’d be interested to hear other people’s views – either by leaving a comment here, or (preferably) by leaving a comment on the Fujitsu website.

No more heroes {please}

That’s it.  A single reference to [IT] heroes.  No more – because I didn’t count how many times that word was used at the 2008 Global Launch event today but I certainly didn’t have enough fingers and toes to keep a tally – and now I’m tired of hearing it.

Although those of us at the UK launch had already heard from a variety of Microsoft executives (including Microsoft UK Managing Director, Gordon Frazer, and Microsoft’s General Manager for the Server and Tools Division, Larry Orecklin) and customers, the highlight was the satellite link-up to the US launch event with Microsoft CEO, Steve Ballmer.  Unfortunately, before we got to hear the big man speak, we had to listen to the warm-up act – Tom Brokaw, who it would seem is a well-known television presenter in the States, but totally unknown over here.  He waffled on for a few minutes with the basic premise being that we are in a transformational age in the history of our world and that the definition of our time and generation comes from unsung heroes (damn, that’s the second time I’ve used the word) – not celebrities.

So.  Windows Server 2008, Visual Studio 2008, SQL Server 2008.  Three new products – one released last year, one earlier this month, and another due later in 2008 – in Microsoft’s largest ever launch event, with 275,000 people expected to attend events across the globe and another million online at the virtual launch experience website.  Ballmer described them as "The most significant [products] in Microsoft’s history" and "enablers to facilitate the maximum impact that our industry can have".  But what does that mean for you and me – the people that Microsoft likes to refer to with the H word who implement their technology in order to execute this change on an unsuspecting world?

I’ve written plenty here before about Windows Server 2008, but the 2008 global launch wave is about more than just Windows.  For years now, Microsoft has been telling us about dynamic IT and over the last few years we have seen many products that can help to deliver that vision.  The 2008 global launch wave is built around four areas:

  1. A secure and trusted foundation.
  2. Virtualisation.
  3. Web and developer productivity.
  4. Business intelligence (and user experience).

So, taking each of these one at a time, what do the 2008 products offer?

A secure and trusted foundation

Security and reliability are always touted as benefits for the latest version of any product, but in the case of Windows Server there are some real benefits.  The Server Core installation option results in a smaller codebase, meaning a reduced attack surface.  The modular design of IIS (and indeed the role-based architecture for Windows Server) means that only those components that are required are installed. Read-only domain controllers allow for secure deployment of directory servers in branch office situations that previously would have been a major security risk.

Availability is increased with enhancements to failover clustering (including new cluster validation tools), SQL data mirroring and the new resource governor functionality in SQL Server 2008 which allows resources to be allocated to specific workloads.

On the compliance and governance front, there is network access protection, federated rights management, and transparent SQL data encryption.

Microsoft is also keen to point out that their database platform has seen significantly fewer critical vulnerabilities in recent history than Oracle’s.

Finally, although not strictly security-related, Microsoft cites figures suggesting that 40% of data centre costs relate to power, and that Windows Server 2008 consumes 10% less power than previous versions of Windows Server when running the same workload.

Virtualisation


Microsoft’s view on virtualisation is broader than just server virtualisation, encompassing not just the new Hyper-V role that will ship within 180 days of Windows Server 2008 release but also profile virtualisation (document redirection and offline files), client virtualisation (Vista Enterprise Centralised Desktop), application virtualisation (formerly SoftGrid) and presentation virtualisation (Terminal Services RemoteApp), all managed in one integrated, unified manner with System Center.

As for VMware’s dominance of the server virtualisation space – I asked Larry Orecklin how Microsoft would combat customer perceptions around Microsoft’s lack of maturity in this space. His response was that "the proof is in the pudding" and that many customers are running Hyper-V in beta with positive feedback on performance, scalability and ease of use.  Microsoft UK Server Director, Bruce Lynn added that Hyper-V is actually the tenth virtualisation product that Microsoft has brought to market.

In Steve Ballmer’s keynote, he commented that [customers] have told Microsoft that virtualisation is too hard and too expensive – so Microsoft wants to "democratise virtualisation" – to switch from the current situation where less than 10% of servers are virtualised to a world where 90% are.  Their vision is for a scalable and performant hypervisor-based virtualisation platform, with minimal footprint, interoperability with competitive platforms, and integrated management tools.

Web and developer productivity

At the core of Windows Server 2008 is IIS 7.0 but Visual Studio extends the vision for developer productivity when creating rich web applications including support for AJAX, JavaScript IntelliSense, XAML, LINQ, entity-level data access and multi-targeting.

From a platform perspective, there are improvements around shared configuration, administrative delegation and scalability.

Combined with Silverlight for a rich user experience and Expression Blend (for designers to interact with developers on the same code), Microsoft believes that their platform is enabling customers to provide better performance, improved usability and a better experience for web-based applications.  It all looks good to me, but I’m yet to be convinced by Silverlight, or for that matter Adobe AIR – this all seems to me like a return to the days when every site had a Shockwave/Flash intro page and I’d like to see a greater emphasis on web standards.  Still, at least IIS has new support for running PHP without impacting on performance now – and Visual Studio includes improved CSS styling support.

Business intelligence

Ballmer highlighted that business intelligence (BI) is about letting users engage with applications – providing not just presentation but insight – getting at the data to provide business value.  Excel is still the most popular business intelligence tool, but combined with other products (e.g. SharePoint and PerformancePoint), the Microsoft BI story is strengthened.

SQL Server 2008 is at the core of the BI platform providing highly performant and scalable support for data warehousing with intelligence for both structured and unstructured data.  SQL Server reporting services integrates with Office applications and the ability to store spatial data opens new possibilities for data-driven applications (e.g. the combination of non-relational data and BI data to provide location awareness).

Putting it all together

So, that’s the marketing message – but what does this mean in practice?  Microsoft used a fictitious coffee company to illustrate what could be done with their technology but I was interested to hear what some of their TAP customers had been up to.  Here in the UK there were a number of presentations from well-known organisations that have used 2008 launch wave products to solve specific business issues.

easyJet have carried out a proof of concept that they hope to develop into an improved travel portal for their customers.  As a low-fares airline, you might expect anything more than the most basic website to be an expensive extravagance but far from it – 98% of easyJet’s customers book via the web, and if the conversion rate could be increased by 1% then that translates into £17m of revenue each year.

The easyJet proof of concept uses a Silverlight and AJAX front end to access Microsoft .NET 3.5 web services and SQL Server 2008.  Taking a starting point of, for example, London Luton, a user can select a date and see the lowest prices to all available destinations on a map.  Clicking through to a destination reveals a Microsoft Virtual Earth map with points of interest within a particular radius.  Streaming video is added to the mix, along with the ability to view hotel details using TripAdvisor and book online.

The proof of concept went from design to completion in just 6 weeks.  Windows Server 2008 provided IIS 7.0 with its modular design and simplified configuration.  SQL Server 2008 allowed the use of geospatial data.  And Visual Studio 2008 enhanced developer productivity, team collaboration and the overall user experience.

Next up was McLaren Electronic Systems, using SQL Server 2008 to store telemetry data transmitted in real time from Formula 1 racing cars.  With microwave signals bouncing off objects and data arriving out of sequence, the filestream feature allows data to be streamed into a relational database for fast access.  Tests have shown that for files above 2MB this technology will out-perform a traditional file system.  Formula 1 may sound a little specialised to relate to everyday business but as McLaren explained, a Formula 1 team will typically generate 3TB of data in a season.  That’s a similar volume to a financial services company, or a warehousing and logistics operation – so the technology is equally applicable to many market sectors.

The John Lewis Partnership is using Windows Server 2008 for its branch office infrastructure.  Having rolled out Windows Server 2003, they would like to reduce the number of servers (and the carbon footprint of their IT operations) at the same time as doubling the number of stores.  Security is another major consideration, with the possibility of data corruption if power is removed from a server and a security breach if a directory server is compromised.

By switching branch servers to Windows Server 2008 read-only domain controllers (DCs), John Lewis can combine the DCs with other branch office functions (print, DHCP, System Center Configuration Manager and Operations Manager) to remove one server from every store.  The reduction in replication traffic (AD replication is all one-way from the centre to the RODCs) allows for a reduction in data centre DCs too.  Windows Server 2008 also facilitates improved failover between data centres in a disaster recovery scenario.  Other Windows Server technologies of interest to John Lewis include Server Core, 64-bit scalability and clustering.

The University of Cambridge is making use of the ability to store spatial data in SQL Server 2008 to apply modern computing to the investigation of 200-year-old theories on evolution.  And Visual Studio 2008 allowed the construction of the associated application in just 5 days.  As Professor John Parker and his self-confessed "database geek" sidekick, Dr Mark Whitehorn explained, technologies such as this are "allowing the scientific community to wake up to business intelligence".

Finally, the Rural Payments Agency (the UK government agency responsible for paying agricultural subsidies) is using Microsoft Application Virtualization and Terminal Services to provide an ultra-thin client desktop to resolve application conflicts and allow users to work from any desk.


Microsoft never tells us a great deal about the roadmap (at least not past the next year or so) but the 2008 launch wave includes a few more products yet.  Visual Studio 2008 and Windows Server 2008 have already shipped.  SQL Server 2008 will be available in the third quarter of 2008 (with a community technology preview today) and the Hyper-V role for Windows Server will ship within 180 days of Windows Server (although I have heard rumours it may be a lot closer than that).  In the summer we will see a new release of Windows Small Business Server as well as a new product for SMEs – Windows Essential Business Server – and, at the other end of the computing spectrum, Windows High Performance Computing Server.  Finally, a new version of Silverlight will ship at some point this year.


I may not be a fan of the HEROES happen {here} theme but that’s just marketing – I’ve made no secret of the fact that I think Windows Server 2008 is a great product.  I don’t have the same depth of experience to comment on Visual Studio or SQL Server but the customer presentations that I heard today add credence to Microsoft’s own vision of a dynamic, agile IT infrastructure – one that reduces the demands of infrastructure maintenance and drives innovation to support the needs of modern business.

Mark Wilson {United Kingdom}

Windows Server 2008 {launches today}

In a few hours’ time, I will be at the UK press launch for Microsoft’s 2008 Global Launch Wave – the launch of Windows Server 2008, Visual Studio 2008 and SQL Server 2008.

This is the biggest launch I’ve been involved in for a while (the last one I took part in was Exchange Server 4.0 in 1996) – although I did attend the Windows Server 2000 launch in London and the Windows XP launch in Sydney.

Although today is the global launch, as far as UK events go, today is a press event – the customer launch is on 19 March at the ICC in Birmingham. I’ll be there, on the Fujitsu stand, so come along and say hello if you get the chance. There will also be a sequence of regional launch events throughout April and May as well as a virtual launch website from 19 March until June.

The various UK user groups are also in the process of putting together two Community Days, hosted by Microsoft at their UK headquarters in Reading (Thames Valley Park). Details are yet to be finalised, so watch this space (or better still, check out the UK User Groups website, where pre-registration is available). I’m hoping to get the chance to present at least one of the sessions, so if I’ve ever been the awkward one who asked too many questions in one of your presentations, now is the chance to get your own back…

[Coincidentally, this post marks another momentous occasion – it’s post number 1000 on this blog (although the count is not strictly accurate as there are still a few that are in draft state and haven’t actually made it to being published)]

Regaining control of e-mail with Inbox Zero

In my day job (the one that pays the bills… not this website), my boss is a guy called Garry Martin. At the risk of sounding sycophantic, I can learn a lot from Garry – not only because he somehow manages to walk the fine line between technical knowledge and effective management, but because he seems to do it with effortless efficiency. Modestly, he tells me that it’s all a façade and that I should see his office at home but there is a saying that perception is reality – and my perception is that he is highly productive – so I’m trying to learn some of the life hacking techniques that he uses.

The first on his list of techniques is Getting Things Done (GTD). I haven’t read David Allen’s book on GTD and my attempt to listen to the audio version on a transatlantic flight last November resulted in my falling asleep – so that didn’t get much done. Even so, I do listen to a lot of podcasts featuring Merlin Mann, who is something of a GTD evangelist, and have been meaning for some time now to watch the Google Tech Talk video on the method of e-mail management that Merlin refers to as Inbox Zero. It’s no coincidence that Garry uses this method, and after he helped me to convert to the system last week, I’m hooked.

You see, I have long been a slave to my e-mail – and as for RSS feeds, I rarely find the time to look at them (ironic for a prolific blogger). I also find instant messaging and SMS very inefficient methods of communication. So, I decided to start 2008 with a new system for managing mail – less frequent e-mail checks, fewer hierarchical folders, and more efficiency. It didn’t work. I still found that every month I needed a day to sort out e-mail. That is not the idea of productivity that I had in mind. I had the right ideas but was failing to process to zero.

Enter Inbox Zero.

“One of the most important soft skills you can have is figuring out how to deal with a high volume of e-mail.”

[Merlin Mann]

Inbox Zero is based on GTD – also known as advanced common sense. That is to say that the principles are obvious, but that we don’t always do obvious things. In his Google Tech Talk, Mann outlined four principles for e-mail management:

  1. E-mail is just a medium.
  2. There is one place for anything (no need for hierarchy).
  3. Process to zero (every time that you check e-mail – just checking is not enough).
  4. Convert to actions (even if that action is just to delete the message).

Mann suggests that just 5 verbs are enough to process all e-mail (delete, delegate, respond, defer and do) – I’m using a slightly different set of folders but the principle is the same – sorting messages (both Inbox and Sent Items) into a limited number of categories:

  • Action – I need to do something with this.
  • Archive – one folder, no hierarchy, searchable.
  • Review – have a look at this later.
  • Someday – this might be interesting, but not now.
  • Waiting – waiting for a response from someone else.

So that’s the structure, but how does it actually help to Get Things Done? Firstly, stop leaving e-mail open all day – get into the habit of regular processing – Merlin Mann suggests these tips:

  1. Do e-mail less (go and work!). Do you really need to look at your messages more than hourly? I try to look at my mail only three to five times a day but that is practically impossible with the mail client open and notifications appearing every few minutes. So close Outlook/Entourage/Evolution/Thunderbird/Mail, or whatever you use – and only open it for as long as it takes to process the Inbox to zero, a few times a day. It may feel a bit like going cold turkey but believe me, it’s worth it.
  2. Cheat. Filter e-mail and check daily, weekly, or whatever. So, for example, I receive a bunch of corporate communications that rarely require action. I can filter them straight to my review folder and check them daily.
  3. No fiddling. Forget the “where did I put stuff” mentality that comes with hierarchical storage systems. Inbox Zero is about creating and managing actions – anything more than that is just playing with e-mail.
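
The “cheat” tip boils down to a simple rule. As a sketch (the sender addresses and the extra Review rule are invented for illustration): mail from known bulk senders is filed straight into Review, and everything else stays in the Inbox to be processed to zero by hand:

```python
# Hypothetical bulk senders whose messages rarely need action.
REVIEW_SENDERS = {"corporate-comms@example.com", "newsletter@example.com"}

# The five folders described above.
FOLDERS = ("Action", "Archive", "Review", "Someday", "Waiting")

def triage(message):
    """Return the folder to file a message in, or None to leave it in the
    Inbox for manual processing."""
    if message["from"].lower() in REVIEW_SENDERS:
        return "Review"
    return None

inbox = [
    {"from": "Corporate-Comms@example.com", "subject": "Weekly briefing"},
    {"from": "boss@example.com", "subject": "Need this today"},
]
print([triage(m) for m in inbox])  # ['Review', None]
```

In practice the same rules live in the mail client’s filters (Outlook rules, in my case) rather than a script – the point is that filtered mail is checked daily, not as it arrives.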

Finally, when you set up your Inbox Zero structure, what about the pile of e-mail that is already waiting to be processed? Mann suggests two possibilities for this:

  1. E-mail bankruptcy – BCC everyone in your address book and tell them you are starting afresh and to resend anything that is important.
  2. That may not go down too well in a work environment so try an e-mail DMZ – copy everything to a folder, set up the Inbox Zero folders and process the DMZ in batches, as and when time permits. It still needs to be processed to zero but starting off with a mountain of e-mail will not help to get you organised.

The Inbox Zero video (or the audio version) should be mandatory viewing for everyone in my organisation. The first 30 minutes are Merlin Mann’s talk and the second half consists of audience questions. Having managed to regain some control over my work communications (and it’s early days but I have a good feeling about this), I’m going to attack my home e-mail – all 3612 Inbox items (2504 unread), 1634 sent items, and hundreds of folders worth of messages.

Why Windows Vista should not be viewed as a failure

Many people expect Windows Vista SP1 to be a turning point for deployment of Microsoft’s latest desktop operating system. Critics have derided Vista, citing it as a disaster for Microsoft and suggesting that it has suffered from poor adoption. Others say that the desktop operating system model is a thing of the past – to be replaced by a “webtop” of cloud-based services. I don’t think that day is here yet – and anyway, we’ve been here before – weren’t thin clients supposed to have taken over by now?

But has Vista really done that badly?

Journalist Paul Thurrott used an interesting comparison in a recent podcast (Windows Weekly episode 48). In a market where 260 million PCs were sold worldwide, Microsoft sold 100 million copies of Vista. Some of those would have been downgraded to Windows XP but what about the 160 million PCs sold without Vista? What were these running? A few Linux machines, some Mac OS X, but mostly Windows XP – and this is what some people are highlighting as a failure to sell Vista.

Thurrott was unable to locate global PC shipment figures for the same period after the Windows XP launch, but he used another method to compare the adoption rates of the two operating systems:

  • At the time of the Windows XP launch, the global installed base of PCs was around 250 million and Microsoft sold 23 million copies of XP in the first year. That equates to an adoption rate of around 9.2%.
  • Looking at the figures for Vista, on an installed base of around a billion PCs, Microsoft shipped 100 million copies – an adoption rate of around 10%.
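
Thurrott’s comparison in the bullets above is simple arithmetic; this just reproduces the two percentages:

```python
# First-year sales against the global installed base of PCs at launch.
xp_sold, xp_base = 23_000_000, 250_000_000            # Windows XP
vista_sold, vista_base = 100_000_000, 1_000_000_000   # Windows Vista

xp_rate = xp_sold / xp_base * 100
vista_rate = vista_sold / vista_base * 100

print(f"XP: {xp_rate:.1f}%, Vista: {vista_rate:.1f}%")  # XP: 9.2%, Vista: 10.0%
```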

So, Vista has had a greater adoption rate than XP in its first year and, however you look at it, Microsoft sold 100 million copies. To my mind, that’s not stunning – but not bad either – and it seems that Windows Vista adoption is on the increase. CDW’s Windows Vista Tracking Poll (January 2008) suggests that:

  • The number of organizations evaluating and testing Windows Vista has increased to 48% in January 2008 (up from 29% in February 2007 and 21% in October 2006).
  • 30% of organizations are currently implementing or have implemented Windows Vista.
  • Windows Vista is delivering on expected benefits, with nearly 50% of evaluators/implementers reporting performance above expectation on key features.
  • And, although not part of Windows Vista, but of equal significance whilst examining adoption of Microsoft’s core technologies, 24% of organizations have implemented the Office 2007 System, up from 6% in February 2007.

All of this suggests that Vista (and Office 2007) are already pretty successful. So why the perception that Vista is not ready for the enterprise?

The first barrier is the “wait for the first service pack” mentality. Regardless of its validity, this view certainly exists and the release of SP1 may allow some organisations to start their preparations.

Other perceived barriers are the hardware and software requirements for the operating system but the reality is that any system purchased in the last few years should be capable of running Vista. And, when it comes to device drivers and application support, Microsoft is caught up in a vicious circle where vendors are reluctant to invest in updating their product to work with Windows Vista and customers will not deploy the new operating system unless their hardware and application software requirements can be met. This is the reason that, according to Paul Thurrott, Microsoft worked to ensure that SP1 will resolve issues for 150 enterprise applications that were blocking large-scale customer deployments.

The third issue that I see is that of cost. In the late-1990s, we saw organisations perform technical refreshes every three years or so, fuelled by a combination of technology advances and preparations for avoidance of the “millennium bug”. In recent years, however, the need to roll out the latest and greatest has been tempered somewhat. Rolling out separate hardware and operating system upgrades is often seen as double the trouble, and, unless there is a business benefit that exceeds the disruption and cost of a new desktop environment, organisations are slow to make changes.

Instead, many organisations are considering a system of managed diversity – running Windows Vista on new (and recently purchased) systems but sticking with XP on older machines that do not yet warrant replacement, or where applications do not yet support Vista. This was what Gartner recommended back at the time of the Vista launch and it’s for exactly this reason that I have been critical of Microsoft for taking so long to develop a third service pack for XP – by the time SP3 arrives it will have been almost four years since the last one.

Finally, there is the issue of new features. Windows, like Mac OS X and any other mature operating system, has reached a point where some think it has too many features and others say it needs more. Microsoft has a particularly difficult battle, whereby if it bundles software with the operating system it falls foul of competition laws. It seems to me that many of the Windows Vista improvements are incremental – and that makes a wholesale migration difficult to justify. Perhaps the strongest argument to date has been productivity improvements but these may be offset by people needing to learn new methods of working. With the release of Windows Server 2008 this will start to change – new technologies like network access protection and some of the networking enhancements require a new server infrastructure and that’s when we will start to see a stronger case for adoption of new technology.

Microsoft’s problem is persuading customers to move on from its own legacy: even when Windows XP is withdrawn from sale in June 2008 (although system builders will still be able to provide XP pre-installed until January 2009), extended support will continue until 2014. Interestingly, Gartner, the same organisation that advised customers to wait before moving to Vista, is reported in IT Week as warning firms to start introducing Vista no later than 2009, because software vendors are likely to start phasing out Windows XP support after this.

Service pack 1 for Windows Vista is now available for customer download (with some restrictions). It won’t be released on Windows Update for a few months, due to issues with certain hardware devices for which new device drivers will need to be released first, but for those 48% of organisations that are evaluating Vista, SP1 will play a major part. Further details of Vista SP1 and its release schedule may be found in Paul Thurrott’s Vista SP1 FAQ.

Windows Vista may not be perfect (no desktop operating system that I am aware of is) but it does offer improvements over its predecessor and is reaching the mainstream business market. SP1 will accelerate the adoption rate but the main change is that, for many organisations, the move to Vista may be a gradual one and strategies for managing co-existence with legacy operating systems will be crucial.

Using Active Directory to authenticate users on a Mac OS X computer

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

One of the projects that I’ve been meaning to complete for a while now has been getting my Mac OS X computers to participate in my Active Directory (AD) domain. I got Active Directory working with Linux – so surely it should be possible to repeat the process on a system with BSD Unix at the core? Yes, as it happens, it is.

Before I explain what was necessary, it’s probably worth mentioning that the process is not the same for every version of OS X. As explained in a Microsoft TechNet magazine article from 2005, early implementations of OS X required schema changes in Active Directory in order to make things work. Thankfully, with OS X 10.4/10.5 (and possibly with later versions of 10.3 – although I haven’t tried), schema changes are no longer necessary.

By far and away the best resource on this subject is Nate Osborne’s Mac OS/Linux/Windows single sign-on article at the Big Nerd Ranch weblog. This told me just about everything I needed to know (with screenshots) but, crucially, when I tried this over a year ago on my OS X 10.4 system I could not get the Mac to bind with Active Directory. This was despite having disabled digital signing of communications (not required for OS X 10.5) and it turned out that the problem was the internal DNS domain name that I use, which has a .local suffix. As described in Microsoft knowledge base article 836413, OS X treats .local domains as being Rendezvous/Bonjour hosts and needs to be told what to do.

There is an Apple article that describes how to look up .local hostnames using both Bonjour and DNS; however, I’m not sure that’s what fixed it on my OS X 10.5.2 system. My TCP/IP, DNS and WINS settings were all being provided by DHCP and, even though I added local to the list of search domains, it was the second listed domain (after the DHCP-supplied entry). Successful binding seemed to occur after I had pinged both the domain name and the domain controller (by name and by IP address) and performed an nslookup on the domain name. Another thing that I considered (but did not actually need to do) was to create a reverse lookup (PTR) record in DNS for the domain name. Retrying the process and binding to a domain with a different (non-.local) suffix presented no issues at all.
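The DNS checks described above can all be run from the OS X Terminal. Here is a minimal sketch – the domain controller name and IP address are hypothetical, so substitute your own:

```shell
# Verify that the AD domain name resolves via DNS
nslookup home.local

# Check name resolution and connectivity for the domain controller
# (dc01.home.local and 192.168.0.10 are hypothetical examples)
nslookup dc01.home.local
ping -c 3 dc01.home.local
ping -c 3 192.168.0.10

# Check the SRV records that advertise the domain's LDAP service
nslookup -type=SRV _ldap._tcp.home.local
```

If the SRV lookup fails while the plain lookups succeed, that points to the .local/Bonjour issue described above rather than a general DNS problem.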

Nate’s article is for OS X 10.4 (Tiger) and, having got this working in OS X 10.5.2 (Leopard), I thought I’d post a few more screenshots to illustrate the process:

  1. First of all, open the OS X Directory Utility and Show Advanced Settings. Switch to the Services view and ensure that Active Directory is selected, then click the button with the pencil icon to edit the settings:
    Mac OS X 10.5 Directory Utility - Services
  2. Enter the domain name (home.local in my case) and the computer name. In the Advanced Options, I left the user experience items at their defaults (more on that later):
    Mac OS X 10.5 Directory Utility - Active Directory User Experience options
  3. Switching to the administrative options reveals some more settings that are required – I checked the box to enable administration by the Domain Admins and Enterprise Admins groups, but other group or user accounts can be added as potential computer administrators:
    Mac OS X 10.5 Directory Utility - Active Directory Administrative options
  4. Click the bind button and, when prompted, supply appropriate credentials to join the Macintosh computer to the domain (i.e. AD credentials). This is the point where the location of the computer account is defined.
    Mac OS X 10.5 Directory Utility - Active Directory authentication
  5. If you receive an error relating to an invalid domain and forest combination being supplied, this is likely to be a DNS issue. Check that DNS name resolution is working (using the OS X Terminal utility and the ping or nslookup commands) and note my earlier comments about support for .local domain name suffixes – you may need to follow Apple’s advice to add local to the list of search domains:
    Mac OS X 10.5 Directory Utility - Invalid Domain error message
    Mac OS X 10.5 Network Preferences - DNS settings
  6. Once successfully bound to Active Directory, the group names for administration of the local computer will be expressed in the format domainname\groupname. The system event log on the domain controller that processed the directory request will also show a number of account management events, as the computer account is created and enabled, then the password is set and the associated attributes changed (password last set and service principal names):
    Mac OS X 10.5 Directory Utility - Active Directory Administrative options
  7. In the OS X Directory Utility, click OK and move to the Directory Servers view – if all is well, the domain name will be listed along with a comment that the server is responding normally:
    Mac OS X 10.5 Directory Utility - Directory Servers
  8. Active Directory/All Domains should also have been added to the Authentication and Contacts views in the Search Policy:
    Mac OS X 10.5 Directory Utility - Search Policy Authentication
    Mac OS X 10.5 Directory Utility - Search Policy Contacts
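As an aside, the same binding can also be performed from the Terminal using Leopard’s dsconfigad utility. This is only a sketch – the computer name, domain, account and group names below are hypothetical, so substitute your own:

```shell
# Bind this Mac to the AD domain (prompts for the AD account's password)
# "mymacbook" and "domainadmin" are hypothetical names
sudo dsconfigad -a mymacbook -domain home.local -u domainadmin

# Enable mobile accounts so that AD users can log on while off the network
sudo dsconfigad -mobile enable -mobileconfirm enable

# Allow members of these AD groups to administer the local machine
sudo dsconfigad -groups "HOME\Domain Admins,HOME\Enterprise Admins"

# Review the current Active Directory binding settings
dsconfigad -show
```

The `-show` output is also a quick way to confirm the binding succeeded without opening Directory Utility.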

Following this, it should be possible to view AD contacts in the Directory and also to log on using an AD account (in domainname\accountname format). Although this worked for me, I was having some issues (which I suspect were down to a problematic AirPort connection). Once I had switched to wired Ethernet, I was able to reliably authenticate using Active Directory, although I did not re-map my home drive to the network (Leopard’s SMB/CIFS support is reported to be problematic and I felt that can of worms could stay closed for a little longer, until I was comfortable that AD authentication was working well). Instead, because my computer is a MacBook and so will often be disconnected from my network, I changed the User Experience options for Active Directory to use a mobile account – effectively creating a local account on the MacBook that is mapped to my domain user:

Mac OS X 10.5 Directory Utility - Active Directory User Experience options

At the next logon, I was prompted to create a mobile account and, once this was done, I could access the computer whilst disconnected from the LAN, using the AD credentials for the last-logged-on user.

One more point that’s worth noting – if you have existing local accounts with the same name as an AD account, the permissions around user account settings get messy: the AD logon results in a message that there was a problem creating your mobile account record, and the local logon reports that there was a problem while creating or accessing “Users/username”.

That’s all I needed; however I did compile a list of links that might be useful to others who come across issues whilst trying to get this working (perhaps on another version of OS X):

Changes to the Microsoft Remote Desktop Connection command line switches

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Garry Martin alerted me to a useful piece of information this morning…

It seems that the /console switch introduced to the Remote Desktop Connection client in Windows Server 2003 has been deprecated in Windows Server 2008 (and Windows Vista SP1). It fails silently (so you still think that you are connected to the console session) but closer inspection reveals that a different session number is in use.

The replacement command is mstsc /v:servername /admin. John Howard has more information on his blog.
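Because the old switch fails silently, it’s worth verifying which session you have actually landed in. A quick check (the server name here is hypothetical):

```shell
:: Connect to the administrative session (replaces mstsc /console)
mstsc /v:server01.example.com /admin

:: Then, from a command prompt inside the remote session, list the
:: sessions on the server - the > marker shows your current session
:: and its ID (the console session is listed separately)
query session
```

If the session marked with > is an rdp-tcp session with a non-zero ID, you are in a regular remote session rather than the console.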

When Windows Updates turn bad

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Last night, as I got ready to shut down the notebook PC that I use for work, I noticed that it had some Windows updates to apply. I left Windows doing its thing and went to bed, stopping this morning only for long enough to put the PC into my bag as I headed off for the station. Only when I was on the train did I fire it up to find that the PC would not boot, greeting me instead with the following message:

Windows Boot Manager

Windows failed to start. A recent hardware or software change might be the cause. To fix the problem:

1. Insert your Windows installation disc and restart your computer.
2. Choose your language settings and then click “Next.”
3. Click “Repair your computer.”

If you do not have this disc, contact your system administrator or computer manufacturer for assistance.

File: \Windows\system32\winload.exe

Status: 0xc000000f

Info: The selected entry could not be loaded because the application is missing or corrupt.

I spent the rest of the journey to London contacting colleagues to see if anyone could bring a Vista DVD in with them (with no success). After that failed, I asked the local IT support guys (no chance – they view anyone who doesn’t run the corporately-sanctioned Windows XP build as a renegade who can make their own support arrangements). A colleague used his MSDN subscription to start downloading a DVD image for me onto another colleague’s computer, but after almost 3 hours it was still only 60% downloaded (and he needed to leave the office). So I gave up and headed home.

Once home, the recovery process was straightforward: I booted from DVD, followed the directions for a startup repair and, after a reboot or two, I could log on as normal. But it does leave me wondering, as I finally get stuck into today’s work at 4pm (after leaving home for the office at 6.30am), whether blindly applying updates is such a good idea.
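For anyone who prefers (or needs) to repair this by hand, the Vista DVD’s System Recovery Options also include a command prompt, from which the boot files can be rebuilt with bootrec.exe. This is a sketch of the manual equivalent – Startup Repair did all of this for me automatically:

```shell
:: Run from the System Recovery Options command prompt on the Vista DVD

:: Rewrite the master boot record on the system disk
bootrec /fixmbr

:: Write a fresh boot sector to the system partition
bootrec /fixboot

:: Scan all disks for Windows installations and rebuild the
:: boot configuration data (BCD) store
bootrec /rebuildbcd
```

Given that the error above pointed at winload.exe, Startup Repair’s automated route is the safer first choice; the manual commands are a fallback if it fails.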

I don’t think there is a single “correct” answer to this. On one hand, I run the risk that an update turns bad on me – and losing a day’s productivity is fairly minor in the scheme of things (next time it could be far worse). On the other hand, what is the risk of waiting to apply updates until after they have been tested (even critical ones)? After all, at home I’m on a NATted network segment, protected by a firewall, and at work the protection from the outside world is even stronger. But what about protection from the inside – from colleagues and internal servers? What about when I work on a public 3G or WiFi network? I guess, like any security decision, it’s a balance between the risk of a security breach and the convenience of continued system stability.

In the meantime, I’ll carry on applying updates when Microsoft pushes them at me. It’s the first time an update has turned bad on me (and that system is operating with around 1.5% free disk space, which may be a factor in the issues that I experience with it). Hopefully next week I’ll finally get my new notebook and start the switch to using Windows Server 2008 as my daily computing platform for work.