Securely wiping hard disks using Windows

This content is 15 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

My blog posts might be a bit sporadic over the next couple of weeks – I’m trying to squeeze the proverbial quart into a pint pot (in terms of my available time) and am cramming like crazy to get ready for my MCSE to MCITP upgrade exams.

I’m combining this Windows Server 2008 exam cramming with a review of John Savill’s Complete Guide to Windows Server 2008 and I hope to publish my review of that book soon afterwards.

One of the tips I picked up from the book this morning, as I tried to learn as much as I could about BitLocker drive encryption in an hour, was John’s technique for securely wiping hard drives using a couple of Windows commands:

format driveletter: /fs:ntfs /x

will force a dismount if required and reformat the drive, using NTFS.

cipher /w:driveletter:

will overwrite all of the unused disk space on the chosen drive (cipher /w makes three passes over the free space: zeroes, ones, then random data), preventing recovery of previously deleted files.
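Putting the two commands together, a complete wipe of a data volume might look something like this – a sketch rather than a definitive procedure, using E: as a hypothetical drive letter:

```batch
rem Force a dismount (if required) and reformat the volume as NTFS
format E: /fs:ntfs /x

rem Overwrite the now-unused space (three passes: zeroes, ones, random data)
cipher /w:E:
```

Because the volume has just been reformatted, almost all of its space counts as “unused”, so cipher /w effectively scrubs the whole drive – expect it to take a long time on large disks.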

I don’t know how this compares with third party products that might be used for this function but I certainly thought it was a useful thing to know. This is not new to Windows Server 2008 either – it’s certainly available as far back as Windows XP and possibly further.

For more tips like this, check out the NTFAQ or John’s site at Savilltech.com.

Windows Vista and Server 2008 SP2 beta opened up to the public, target release date announced

After the storm of announcements from Microsoft at PDC, WinHEC and TechEd EMEA it’s been a quiet few weeks but, for those who haven’t seen, Microsoft announced that the Windows Vista and Server 2008 Service Pack 2 beta will be opened up to a wider audience, starting with TechNet and MSDN subscribers at 14:00 tomorrow (I guess that’s Redmond time, so 22:00 here in the UK) and then via a broader customer preview programme (CPP) on Thursday (4 December).

This release is intended for technology enthusiasts, developers, and administrators who would like to test SP2 in their environments and with their applications prior to final release and, for most customers, Microsoft’s advice is to wait until the final release prior to installing this update.

Full details of the changes in the SP2 beta may be found in Microsoft’s Windows Server TechCenter.

Microsoft also announced the date that they are aiming for (not a firm commitment) – SP2 should be expected in the first half of 2009.

More Xtremely Technical seminars scheduled for spring 2009

A couple of weeks back, I was lucky enough to attend one of John Craddock and Sally Storey’s XTSeminars on Windows Server 2008 (those who were at the inaugural Active Directory User Group meeting would have got a taster). I’d blogged about the event beforehand and it really was an excellent use of my time – I can’t overstate how good these seminars are (think 2 whole days of detailed presentations and demonstrations, diving pretty deep in places, with none of the marketing overhead you would have in a Microsoft presentation).

If the credit crunch hasn’t hit your training budget yet, then you might want to consider one of the workshops that are scheduled for the spring and the current dates are:

  • 25-26 February 2009, Microsoft Active Directory Internals.
  • 11-12 March 2009, Active Directory Disaster Prevention and Recovery.
  • 18-19 March 2009, Windows Server 2008.

If you do decide that you’re interested in one of these sessions and you book onto it – please mention my name (or even get in touch with me to let me know) – it won’t make any difference to your booking process but it will help me if they know you heard about the seminars on this blog!

Allowing Remote Desktop connections to a server core computer in a workgroup

Over the weekend, I was trying to access a Windows Server 2008 server core installation using the Remote Desktop Connection client. I’d enabled remote desktop connections (and legacy connections) with:

cscript %windir%\system32\scregedit.wsf /ar 0
cscript %windir%\system32\scregedit.wsf /cs 0

and both times the system reported that the:

Registry has been updated.

Even so, I still couldn’t successfully connect. It seemed logical that this was a firewall issue. Reading Daniel Petri’s article on configuring the firewall on server core for remote management confirmed that installing roles does indeed open the associated ports and that for domain-joined machines the firewall profile allows remote management; however for workgroup machines it may be necessary to run:

netsh advfirewall firewall set rule group="remote administration" new enable=yes

Even though this returned:

Updated 3 rule(s).
Ok.

It still didn’t let me connect but then I noticed that remote desktop has its own firewall group (i.e. it’s not included in remote administration) so I tried something more specific:

netsh advfirewall firewall set rule group="remote desktop" new enable=yes

The rule was updated:

Updated 1 rule(s).
Ok.

and I was able to connect to the server. I later found that Julie Smith also suggests this approach over at The Back Room Tech but most posts on the subject seem to be focused on opening ports for Microsoft Management Console (MMC)-based remote administration.
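For reference, then, the full sequence that got Remote Desktop working on my workgroup server core machine was as follows (the /cs step is only needed if older clients must connect):

```batch
rem Allow Remote Desktop connections to the server core installation
cscript %windir%\system32\scregedit.wsf /ar 0

rem Allow legacy (pre-RDP 6.0) client connections, if required
cscript %windir%\system32\scregedit.wsf /cs 0

rem Enable the Remote Desktop firewall rule group
rem (it is not covered by the "remote administration" group)
netsh advfirewall firewall set rule group="remote desktop" new enable=yes
```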

Building a branch office in a box?

For many organisations, branch offices are critical to business and often, rather than being a remote backwater, they represent the point of delivery for business. Meanwhile, organisations want to spend less on IT – and, as IT hardware and software prices fall, providing local resources improves performance for end-users. That sounds great until you consider that local IT provision escalates support and administration costs, so it makes more financial sense to deliver centralised services (with a consequential effect on performance and availability). These conflicting business drivers create a real problem for organisations with a large number of branch offices.

For the last few weeks, I’ve been looking at a branch office consolidation exercise at a global organisation who seem to be suffering from server proliferation. One of the potential solutions for consolidation is using Windows Server 2008 and Hyper-V to provide a virtualised infrastructure – a “branch office in a box”, as Gartner described it in a research note from a few years ago [Gartner RAS Core Research Note G00131307, Joe Skorupa, 14 December 2005]. Windows Server 2008 licensing arrangements for virtualisation allow a server to run up to 4 virtualised operating system environments (with enterprise edition) or a single virtual and a single physical instance (with standard edition). It’s also possible to separate domain-level administration (local domain controllers, etc.) from local applications and infrastructure services (file, print, etc.) but such a solution doesn’t completely resolve the issue of maintaining a branch infrastructure.

Any consolidation at the branch level is a good thing but there’s still the issue of wide area network connectivity which means that, for each branch office, not only are there one or more Windows servers (with a number of virtualised workloads) to consider but also potentially some WAN optimisation hardware (e.g. a Cisco WAAS or a Riverbed Steelhead product).

Whilst I was researching the feasibility of such a solution, I came across a couple of alternative products from Cisco and Citrix which include Microsoft’s technology – and this post attempts to provide a high level overview of each of them (bear in mind I’m a Windows guy and I’m coming at this from the Windows perspective rather than from a deep networking point of view).

Cisco and Microsoft Windows Server on WAAS

When I found the Windows Server on WAAS website I thought this sounded like the answer to my problem – Windows Server running on a WAN optimisation appliance – the best of both worlds from two of the industry’s largest names, who may compete in some areas but still have an alliance partnership. In a video produced as part of the joint Cisco and Microsoft announcement of the Windows on WAAS solution, Cisco’s Vice President Marketing for Enterprise Solutions, Paul McNab, claims that this solution allows key Windows services to be placed locally at a reduced cost whilst providing increased flexibility for IT service provision. Meanwhile, Microsoft’s Bill Hilf, General Manager for Windows Server marketing and platform strategy, outlines how the branch office market is growing as workforces become more distributed and how the Windows on WAAS solution combines Windows Server IT services with Cisco WAAS’ WAN optimisation, reducing costs relating to infrastructure management and power usage whilst improving the user experience as services are brought closer to the user.

It all sounds good – so how does this solution work?

  • Windows on WAAS is an appliance-based solution which uses virtualisation technologies for Cisco WAAS and Microsoft Windows Server 2008 to run on a shared platform, combined with the advantages of rapid device provisioning. Whilst virtualisation in the datacentre has allowed consolidation, at the branch level the benefit is potentially the ability to reconfigure hardware without a refresh or even a visit from a technician.
  • Windows Server 2008 is used in server core installation mode to provide a reduced Windows Server footprint, with increased security and fewer patches to apply, whilst taking advantage of other Windows Server 2008 enhancements, such as improved SMB performance, a new TCP/IP stack, and read-only domain controllers for increased directory security at the branch.
  • On the WAAS side, Cisco cite improved application performance for TCP-based applications – typically 3-10 times better (and sometimes considerably more) as well as WAN bandwidth usage reduction and the ability to prioritise traffic.
  • Meanwhile, running services such as logon and printing locally means that end user productivity is increased.

Unfortunately, as I began to dig a little deeper (including a really interesting call with one of Cisco’s datacentre product specialists), it seems that this solution is constrained in a number of ways and so might not allow the complete eradication of Windows Server at the branch office.

Firstly, this is not a full Windows Server 2008 server core solution – only four roles are supported: Active Directory Domain Services; DHCP server; DNS server and Print services. Other services are neither supported, nor recommended – and the hardware specifications for the appliances are more akin to PCs (single PSU, etc.) than to servers.

It’s also two distinct solutions – Windows runs in a (KVM) virtual machine to provide local services to the branch and WAAS handles the network acceleration side of things – greatly improved with the v4.1 software release.

On the face of it (and remember I’m a Windows guy) the network acceleration sounds good – with three main methods employed:

  1. Improve native TCP performance (which Microsoft claim Windows Server 2008 does already) by quickly moving to a larger TCP window size and then lessening the flow once it reaches the point of data loss.
  2. Generic caching and compression.
  3. Application-specific acceleration for HTTP, MAPI, CIFS and NFS (but no native packet shaping capability).

All of this comes without the need to make any modifications to the existing network – no tunnelling and no TCP header changes – so the existing quality of service (QoS) and network security policies in place are unaffected by the intervening network acceleration (as long as there’s not another network provider between the branch and the hub with conflicting priorities).

From a support perspective Windows on WAAS is included in the SVVP (so is supported by Microsoft) but KVM will be a new technology for many organisations and there’s also a potential management issue as it’s my understanding that Cisco’s virtual blade technology (e.g. Windows on WAAS) does not yet support centralised management or third party management solutions.

Windows on WAAS is not inexpensive either (around $6,500 list price for a basic WAAS solution, plus another $2,000 for Windows on WAAS, and a further $1,500 if you buy the Windows licenses from Cisco). Add in the cost of the hardware – and the Cisco support from year 2 onwards – and you could buy (and maintain) quite a few Windows Servers in the branch. Of course this is not about cheap access to Windows services – the potential benefits of this solution are much broader – but it’s worth noting that if the network is controlled by a third party then WAN optimisation may not be practical either (for the reasons I alluded to above – if their WAN optimisation/prioritisation conflicts with yours, the net result is unlikely to result in improved performance).

As for competitive solutions, Cisco don’t even regard Citrix (more on them in a moment) as a serious player – from the Cisco perspective the main competition is Riverbed. I didn’t examine Riverbed’s appliances in this study because I was looking for solutions which supported native Windows services (Riverbed’s main focus is wide area application services and their wide area file services are not developed, supported or licensed by Microsoft, so will make uncomfortable bedfellows for many Windows administrators).

When I pressed Cisco for comment on Citrix’s solution, they made the point that WAN optimisation is not yet a mature market and it currently has half a dozen or more vendors competing, whilst history in other markets (e.g. SAN fabrics) would suggest that there will be a lot of consolidation before these solutions reach maturity (i.e. expect some vendors to fall by the wayside).

Citrix Branch Repeater/WANScaler

The Citrix Branch Repeater looks at the branch office problem from a different perspective – and, not surprisingly, that perspective is server-based computing, pairing with Citrix WANScaler in the datacentre. Originally based around Linux, Citrix now offer Branch Repeaters based on Windows Server.

When I spoke to one of Citrix’s product specialists in the UK, he explained to me that the WANScaler technologies used by the Branch Repeater include:

  1. Transparency – the header is left in place so there are no third-party network changes and there is no need to change QoS policies, firewall rules, etc.
  2. Flow control – similar to the Cisco WAAS algorithm (although, somewhat predictably, Citrix claim that their solution is slightly better than Cisco’s).
  3. Application support for CIFS, MAPI, TCP and, uniquely, ICA.

Whereas Cisco advocate turning off the ICA compression in order to compress at the TCP level, ICA is Citrix’s own protocol and they are able to use channel optimisation techniques to provide QoS on particular channels (ICA supports 32 channels in its client-server communications – e.g. mouse, keyboard, screen refresh, etc.) so that, for example, printing can be allowed to take a few seconds to cross the network but mouse, keyboard and screen updates must be maintained in near-real time. In the future, Citrix intend to extend this with cross-session ICA compression in order to use the binary history to reduce the volume of data transferred.

The Linux and Windows-based WANScalers are interoperable and, at the branch end, Citrix offers client software that mimics an appliance (e.g. for home-based workers) or various sizes of Branch Repeater with differing throughput capabilities running a complete Windows Server 2003 installation (not 2008) with the option of a built-in Microsoft ISA Server 2006 firewall and web caching server.

When I asked Citrix who they see as competition, they highlighted that only two companies have licensed Windows for use in an appliance (Citrix and Cisco) – so it seems that Citrix see Cisco as the competition in the branch office server/WAN optimisation appliance market – even if Cisco are not bothered about Citrix!

Summary

There is no clear “one size fits all” solution here and the Cisco Windows on WAAS and Citrix WANScaler solutions each provide significant benefits, albeit with a cost attached. When choosing a solution, it’s also important to consider the network traffic profile – including the protocols in use. The two vendors each come from a slightly different direction: in the case of Cisco this is clearly a piece of networking hardware and software which happens to run a version of Windows; and, for Citrix, the ability to manipulate ICA traffic for server-based computing scenarios is their strength.

In some cases neither the Cisco nor the Citrix solution will be cost effective and, if a third party manages the network, they may not even be able to provide any WAN optimisation benefits. This is why, in my customer scenario, the recommendation was to investigate the use of virtualisation to consolidate various physical servers onto a single Windows Server 2008 “branch office in a box”.

Finally, if such a project is still a little way off, then it may be worth taking a look at the branch cache technology which is expected to be included within Windows Server 2008 R2. I’ll follow up with more information on this technology later.

Trusting a self-signed certificate in Windows

All good SSL certificates should come from a well-known certification authority – right? Not necessarily (as Alun Jones explains in defence of the self-signed certificate).

I have a number of devices at home that I access over HTTPS and for which the certificates are not signed by Verisign, Thawte, or any of the other common providers. And, whilst I could get a free or inexpensive certificate for these devices, why bother when only I need to access them – and I do trust the self-signed cert!

A case in point is the administration page for my NetGear ReadyNAS – this post describes how I got around it with Internet Explorer (IE) but the principle is the same for any self-signed certificate.

First of all, I added the address to my trusted sites list. As the ReadyNAS FAQ describes, this is necessary on Windows Vista in order to present the option to install the certificate and the same applies on my Windows Server 2008 system. Adding the site to the trusted sites list won’t stop IE from blocking navigation though, telling me that:

There is a problem with this website’s security certificate.

The security certificate presented by this website was not issued by a trusted certificate authority.

Security certificates problems may indicate an attempt to fool you or intercept any data you send to the server.

We recommend that you close this webpage and do not continue to this website.

Fair enough – but I do trust this site, so I clicked the link to continue to the website regardless of Microsoft’s warning. So, IE gave me another security warning:

Security Warning

The current webpage is trying to open a site in your Trusted sites list. Do you want to allow this?

Current site: res://ieframe.dll
Trusted site: https://mydeviceurl

Thank you IE… but yes, that’s why I clicked the link (I know, we have to protect users from themselves sometimes… but the chances are that they won’t understand this second warning and will just click the yes button anyway). After clicking yes to acknowledge the warning (which was a conscious choice!) I could authenticate and access the website.

Two warnings every time I access a site is an inconvenience, so I viewed the certificate details and clicked the button to install the certificate (if the button is not visible, check the status bar to see that IE has recognised the site as from the Trusted Sites security zone). This will launch the Certificate Import Wizard but it’s not sufficient to select the defaults – the certificate must be placed in the Trusted Root Certification Authorities store, which will present another warning:

Security Warning

You are about to install a certificate from a certification authority (CA) claiming to represent:

mydeviceurl

Windows cannot validate that the certificate is actually from “certificateissuer“. You should confirm its origin by contacting “certificateissuer“. The following number will assist you in this process:

Thumbprint (sha1): thumbprint

Warning:

If you install this root certificate, Windows will automatically trust any certificate issued by this CA. Installing a certificate with an unconfirmed thumbprint is a security risk. If you click “Yes” you acknowledge this risk.

Do you want to install this certificate?

Yes please! After successfully importing the certificate and restarting my browser, I could go straight to the page I wanted with no warnings – just the expected authentication prompt.
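Incidentally, for devices like this it’s also possible to skip the GUI wizard and import the certificate from an elevated command prompt using certutil – a sketch, assuming the certificate has first been exported from the browser to a file (mydevice.cer is a hypothetical filename):

```batch
rem Place the exported certificate in the local machine's
rem Trusted Root Certification Authorities store (-f forces overwrite)
certutil -addstore -f Root mydevice.cer
```

The same caveat applies as with the wizard: anything placed in the Trusted Root store will be trusted for any certificate it issues, so only do this for certificates whose thumbprints you have verified.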

Incidentally, although I used Internet Explorer (version 8 beta) to work through this, once the certificate is in the store, any browser that uses the Windows certificate store should act in the same manner (some browsers, e.g. Firefox, implement their own certificate store). To test this, I fired up Google Chrome and it was able to access the site I had just trusted with no issue but if I went to another, untrusted, address with a self-signed certificate (e.g. my wireless access point), Chrome told me that:

The site’s security certificate is not trusted!

You attempted to reach mydeviceurl but the server presented a certificate issued by an entity that is not trusted by your computer’s operating system. This may mean that the server has generated its own security credentials, which Google Chrome cannot rely on for identity information, or an attacker may be trying to intercept your communications. You should not proceed, especially if you have never seen this warning before for this site.

Chrome also has some excellent text at a link labelled “help me understand” which clearly explains the problem. Unfortunately, although Chrome exposes Windows certificate management (in the options, on the under the hood page, under security), it doesn’t allow adding a site to the trusted sites zone (which is an IE concept) – and that means the option to install the certificate is not available in Chrome. I imagine it’s similar in Firefox or Opera (or Safari – although I’m not sure who would actually want to run Safari on Windows).

Before signing off, I’ll mention that problems may also occur if the certificate is signed with invalid details – for example the certificate on my wireless access point applies to another URL (www.netgear.com) and, as that’s not the address I use to access the device, that certificate will still be invalid. The only way around a problem like this is to install another, valid, certificate (self-signed or otherwise).

Ready for an Xtremely Technical seminar on Windows Server 2008?

I’ve always been impressed with John Craddock and Sally Storey’s presentations on Active Directory and related topics so, a couple of weeks back, I was pleased to catch up with them as they presented at the inaugural meeting of the Active Directory User Group.

In that session, John and Sally gave a quick overview of the new features in Windows Server 2008 Active Directory as well as the new read only domain controller (RODC) functionality and, if that whetted your appetite (or if you missed it and think you’d like to know more), it may be of interest to know that John and Sally are running one of their XTSeminars later this month, looking at Windows Server 2008 infrastructure design, configuration and deployment. Topics include:

  • Building virtual environments with Hyper-V.
  • Creating high-availability with application and virtual machine clustering.
  • Windows imaging and the Windows Deployment Services (WDS).
  • What’s new in the Active Directory.
  • The benefits and caveats of Read Only Domain Controllers (RODC).
  • Windows networking with IPv6 and Network Access Protection (NAP).
  • Managing Server Core.

This is a chargeable event but I’ve never been disappointed by one of John and Sally’s presentations, which are dedicated to delivering good technical content in a highly consumable format. For more information, and to book a place, visit the XTSeminars website.

(For a limited time only, using the code CC349, you can attend this two day event for just £349. For other seminars, try TN1384 for a 35% discount.)

Windows Vista (and Server 2008) SP2 beta announced

Next week’s Professional Developers Conference should see lots of news from Microsoft around Windows 7, Windows Server 2008 R2 and Microsoft’s cloud computing strategy but, for those who are looking for something a little sooner, the Windows Vista team’s announcement that a beta of service pack 2 is just around the corner will probably be of interest.

As seems to be the norm these days, the service pack will include new functionality (including Windows Search 4.0, native Blu-Ray support and updated Bluetooth and Wi-Fi connectivity options) but, even though some of these features are client-focused, it is intended that a single service pack will apply to both client and server versions of Windows (quite how that works, only time will tell – the Windows Server team is focusing on including the RTM version of Hyper-V and power improvements in SP2 – perhaps it will be a single service pack, but two different versions?).

No news yet as to an intended release date for the final service pack – Microsoft’s Mike Nash wrote:

“The final release date for Windows Vista SP2 will be based on quality. So we’ll track customer and partner feedback from the beta program before setting a final date for the release.”

Windows Vista SP2 beta will be available to a limited group of testers from 29 October.

Microsoft Virtualization: part 7 (wrap up and additional resources)

Over the last few weeks (it was originally supposed to be a few days… but work got in the way), I’ve written several posts on Microsoft Virtualization:

  1. Introduction.
  2. Host virtualisation.
  3. Desktop virtualisation.
  4. Application virtualisation.
  5. Presentation virtualisation.
  6. Management.

I thought I’d wrap up the series by mentioning the Microsoft Assessment and Planning Toolkit (MAP) solution accelerator – a free inventory, assessment and reporting tool which can help with planning the implementation of various Microsoft technologies, including Windows Server 2008 Hyper-V (v3.2 is in a public beta at the time of writing). To find out more about MAP, try to catch (in person or virtually) Baldwin Ng’s session at the November meeting of the Microsoft Virtualization User Group.

Also worth noting is the 7 hours of free e-learning courses that Microsoft has made available:

  • Clinic 5935: Introducing Hyper-V in Windows Server 2008
  • Clinic 6334: Exploring Microsoft System Center Virtual Machine Manager 2008
  • Clinic 6335: Exploring Microsoft Application Virtualization
  • Clinic 6336: Exploring Terminal Services in Windows Server 2008

Microsoft’s virtualisation portfolio is not complete (storage and network virtualisation are not included but these are not exactly Microsoft’s core competencies either); however it is strong, growing fast, and not to be dismissed.

Microsoft Virtualization: part 5 (presentation virtualisation)

Continuing the series of posts on Microsoft Virtualization technologies, I’ll move onto what Microsoft refers to as presentation virtualisation (and everyone else calls terminal services, or server based computing).

Like host virtualisation, Terminal Services is not a new technology and Microsoft has provided basic Terminal Server capabilities within Windows Server for many years, with Citrix providing the enterprise functionality for those who need it. With Windows Server 2008, Microsoft has taken a step forward, introducing new Terminal Services functionality – with new features including:

  • Terminal Services Web Access – providing a web portal for access to RemoteApps – applications which run on the terminal server but have the look and feel of a local application (albeit subject to the limitations of the RDP connection – this is probably not the best way to deploy graphics-intensive applications). Whilst this is a great feature, it is somewhat let down by the fact that the Web Access portal is not customisable and that all users see all RemoteApps (although permissions are applied to control the execution of RemoteApps). For web access to RemoteApps, v6.1 of the Remote Desktop Connection (RDP) client is required but for v6.0 clients an MSI may be created using RemoteApp Manager (which may be deployed using Active Directory group policy).
  • Terminal Services Gateway – provides a seamless connection to Terminal Services (over HTTPS) without need for a VPN. It’s not intended to replace the need for a firewall (e.g. ISA Server) but it does mean that only one port needs to be opened (443) and may be an appropriate solution when a local copy of the data is not required or when bandwidth/application characteristics make the VPN experience poor.
  • Terminal Services Session Broker – a new role to provide load balancing and which enables a user to reconnect to an existing session in a load-balanced terminal server farm.

There are improvements on the client end too – for details of the client enhancements in Remote Desktop Connection (v6.1), provided with Windows XP SP3, Vista SP1 and Server 2008 see Microsoft knowledge base article 951616.

One of the more significant improvements in RDP 6.1 (but which requires Windows Server 2008 Terminal Services Printing) is Terminal Services EasyPrint. Whereas printing is traditionally problematic in a server-based computing environment (matching drivers, etc.) – Terminal Services EasyPrint presents a local print dialog and prints to the local printer – no print drivers are required on the server and there is complete transparency if a 32-bit client is used with a 64-bit server. If the application understands XPS (i.e. it uses the Windows Presentation Framework) then it prints XPS using the EasyPrint XPS Driver (which creates an XPS spool file). Otherwise there is a GDI to XPS conversion module (e.g. for Win32 applications). On the client side, the spool file is received over RDP using the Remote Desktop Connection with an EasyPrint plugin to spool the XPS through an XPS printer driver (converted by print processor if required). If the print device does not support XPS, the print job is converted to EMF by the Microsoft .NET Framework and printed using a GDI printer driver.

Terminal Services EasyPrint

Whilst Microsoft’s presentation virtualisation offerings may not be as fully-featured as those from other vendors, most notably Citrix, they are included within the Windows Server 2008 operating system and offer a lot of additional functionality when compared with previous Windows Server releases.

In the next post in this series, I’ll look at how the four strands of Microsoft Virtualization (host/server, desktop, application and presentation) are encapsulated within an overall management framework using System Center products.