Windows Server UK user group

Scotty McLeod has been working hard to get the Windows Server User Group UK website off the existing SharePoint platform and onto something more appropriate (SharePoint is great for some things but it was not great for the user group website – hey, even the SharePoint blogs use Community Server! Come to think of it, so do most of the Microsoft blog sites!).

Anyway, head over to http://www.winserverteam.org.uk/. It’s still a work in progress but, over the coming weeks and months, I’m hoping it will grow to become a lively discussion area (backed up with regular meetings) for UK-based IT professionals who are interested in the development of the Windows Server platform.

Some more about Terminal Services Gateway Servers

In an earlier post, I mentioned Austin Osuide’s recent Windows Server User Group presentation on Terminal Services Gateway Server and what follows is some of the detail from that session.

Terminal Services Gateway Server is a server role in Windows Server 2008 – effectively a protocol translator that allows authorised users to remotely access resources on a corporate LAN using RDP over HTTPS.

Up until now, it’s been necessary to open TCP port 3389 to allow RDP traffic through the corporate firewall but, by encapsulating the RDP traffic within an SSL-secured tunnel, control may be exercised over which computers (and hence which applications) users can connect to and from. Other advantages include the fact that there is no need for a VPN infrastructure, so connectivity can be gained from any PC, anywhere (home, hotel, business partner or client premises, mobile or wireless hotspot). Then there are other advantages for IT organisations looking to reduce costs…

Consider a large outsourcer, with many support teams, each supporting a single customer’s infrastructure. What if one team (with appropriately scaled resources) could manage multiple networks? Maybe even in an offshore scenario? As an IT professional, I’m not keen on this and, as a customer, I would be concerned about the potential impact on security – but what if my managers could convince the customer that they can maintain security in a global infrastructure such as this? Using technologies such as network access protection (NAP), the health of any connected device can be verified, and Terminal Services Gateway Servers can be deployed to control who can connect to which computers – perhaps only to defined administrative servers with a controlled application set.

The process for connection is as follows:

  1. The client tunnels the RDP connection through HTTPS.
  2. The Terminal Services Gateway strips off the HTTPS encapsulation and forwards the request to the terminal server/remote desktop (if the request passes the appropriate policy checks – connection authorisation policies control who can connect and resource authorisation policies control what they can connect to, using user-defined or built-in groups for servers).
  3. The remote machine believes that the request has come directly from the client and responds appropriately.
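
On the client side, connecting through the gateway needs little more than a couple of extra entries in the .rdp connection file used by Remote Desktop Connection 6.0 or later. Here’s a minimal sketch – the server names are hypothetical, but the settings are the standard RDP file gateway entries:

    full address:s:adminserver.internal.example.com
    gatewayhostname:s:tsgateway.example.com
    gatewayusagemethod:i:1
    gatewaycredentialssource:i:0

Setting gatewayusagemethod to 1 tells the client always to route the connection via the gateway; gatewayhostname is the externally-resolvable HTTPS endpoint, while full address is the internal terminal server or remote desktop.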

It all sounds straightforward enough but, as Austin explained, there are some gotchas too:

  • As when tunnelling RPC over HTTPS with Outlook and Exchange, the certificate must be recognised as valid – there is no manual option to trust a site with certificate problems (as there would be when browsing the Internet). There are two possible options:
    1. Establish a corporate public key infrastructure and install the appropriate certificates on the client. The downside to this is that some clients don’t allow users to install certificates (e.g. in a kiosk scenario).
    2. Alternatively, purchase a certificate from a trusted certification authority.
  • At least initially, few organisations will have a PKI based on Windows Server 2008 and, due to the removal of the XEnroll ActiveX control from Windows Vista (see Microsoft knowledge base article 922706), Windows Vista and Windows Server 2008 computers cannot use the Windows 2000/2003 CA web interface (or indeed the equivalent interfaces on the Thawte or VeriSign websites). It should be possible to craft an appropriate web server certificate using the MMC Certificates snap-in, but the common name for the server needs to be fully qualified and the MMC tools insert an unqualified name in a computer certificate. Thankfully, there is another method – using the certreq.exe command line tool and a .inf file with the certificate template information (the syntax is described in Microsoft knowledge base article 321051 and certutil -csplist will list the trusted cryptographic service providers), for example:

    [Version]
    Signature="$Windows NT$

    [NewRequest]
    Subject = "CN=servername.domainname.tld"
    KeySpec = 1
    KeyLength = 2048
    Exportable = TRUE
    MachineKeySet = TRUE
    SMIME = False
    PrivateKeyArchive = FALSE
    UserProtected = FALSE
    UseExistingKeySet = FALSE
    ProviderName = "Microsoft RSA SChannel Cryptographic Provider"
    ProviderType = 12
    RequestType = PKCS10
    KeyUsage = 0xa0

    [EnhancedKeyUsageExtension]
    OID=1.3.6.1.5.5.7.3.1
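
Assuming the template above is saved as request.inf, the certificate request can then be created, submitted to the CA and the issued certificate installed, all with certreq.exe (a sketch – the file names are illustrative):

    rem Build a PKCS#10 request from the .inf template
    certreq -new request.inf request.req

    rem Submit the request to a certification authority and save the issued certificate
    certreq -submit request.req servername.cer

    rem Install the issued certificate into the local machine store
    certreq -accept servername.cer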

In terms of best practice, Austin had some more advice to give:

  • Use a dedicated Terminal Services Gateway Server.
  • Consider placing the gateway server behind an ISA server.
  • Terminate the SSL connection in the DMZ and put the Terminal Services Gateway Server on the corporate network.
  • VPNs may still have a place in the infrastructure – Terminal Services Gateway Servers are best used where no local copy of the data is required or where bandwidth constraints mean that the user experience over a VPN is poor.

Further information

Microsoft Terminal Services.
Microsoft Terminal Services team blog.
Terminal Services in the Windows Server 2008 Technical Library.

James May’s 20th Century

I’ve just spent an hour in front of the gogglebox watching James May’s 20th Century. “What’s that?”, you may ask. It’s a new BBC/Open University television programme looking at technology and how it’s changed the way in which we live over the last hundred-or-so years.

Tonight featured two episodes: the first looking at how the world became smaller with the development of air and motorised road travel and how, ironically, it was not supersonic air travel that became the accepted means to “shrink” our planet but computers, fibre optics and the Internet; and the second looking at how the space race grew from one man’s dreams to a desire for military supremacy and eventually to a means to communicate (bit of a theme running here…) – I never realised just how many satellites are in orbit around the world.

Anyway, for UK readers with even the remotest interest in technology (and if you don’t have that, I’m surprised that you’re reading this blog), it’s fascinating viewing. Even my wife was interested, although as our toddler son is asking more and more “who?”, “what?”, “when?” and “why?” questions this could be the science lesson that she needs in order to be able to keep up!

Why the banks just don’t get IT

Identity theft worries me. It doesn’t stop me sleeping at night but nevertheless it does worry me.

It seems that each time I log in to a banking website the security has been “enhanced” with yet another item that I fail to enter correctly and then have to call the helpdesk to get my account unlocked – and I’m an IT guy… what about “normal” users? (They probably write the details down somewhere!)

Mark James has written an interesting article about this issue – and how the answer is really quite simple – if only the banks would apply the same security approach to consumer banking as corporates do for remote access.

Security – Why the banks just don’t get IT

A few weeks back, I read a column in the IT trade press about my bank’s botched attempt to upgrade their website security and I realised that it’s not just me who thinks banks have got it all wrong…

You see, the banks are caught in a dilemma between providing convenient access for their customers and keeping it secure. That sounds reasonable enough until you consider that most casual Internet users are not too hot on security and so the banks have to dumb it down a bit.

Frankly, it amazes me that information like my mother’s maiden name, my date of birth, and the town where I was born are used for “security” – they are all publicly available details and if someone wanted to spoof my identity it would be pretty easy to get hold of them all!

But my bank is not alone in overdressing their (rather basic) security – one of their competitors recently “made some enhancements to [their] login process, ensuring [my] money is even safer”, resulting in what I can only describe as an unmitigated user experience nightmare.

First I have to remember a customer number (which can at least be stored in a cookie – not advisable on a shared-user PC) and, bizarrely, my last name (in case the customer number doesn’t uniquely identify me?). After supplying those details correctly, I’m presented with a screen similar to the one shown below:

[Screenshot of ING Direct login screen]

So what’s wrong with that? Well, for starters, I haven’t a clue what the last three digits of my oldest open account are so that anti-phishing question doesn’t work. Then, to avoid keystroke loggers, I have to click on the key pad buttons to enter the PIN and memorable date. That would be fair enough except that they are not in a logical order and they move around at every attempt to log in. This is more like an IQ test than a security screen (although the bank describes it as “simple”)!

I could continue with the anecdotal user experience disasters but I think I’ve probably got my point across by now. Paradoxically, the answer is quite simple and in daily use by many commercial organisations. Whilst banks are sticking with single-factor (something you know) login credentials for their customers, companies often use multi-factor authentication for secure remote access by employees. I have a login ID and a token which generates a seemingly random (actually algorithmically generated) six-digit number that I combine with a PIN to access my company network. It’s easy – all it needs is knowledge of the website URL, my login ID and PIN (things that I know), together with physical access to my security token (something I have). For me, those things are easy to remember but for someone else to guess – practically impossible.

I suspect the reason that the banks have stuck with their security theatre is down to cost. So, would someone please remind me, how many billions did the UK high-street banks make in profit last year? And how much money is lost in identity theft every day? A few pounds for a token doesn’t seem too expensive to me. Failing that, why not make card readers a condition of access to online banking and use the Chip and PIN system with our bank cards?

[This post originally appeared on the Seriosoft blog, under the pseudonym Mark James.]

Windows Vista volume activation failure

When I upgraded my Vista installation from a (not-yet-activated) copy of Windows Vista Business Edition to Windows Vista Enterprise Edition, the activation counter was reset to 30 days; however, since then it’s been bugging me with the following message:

Volume activation has failed.

Your computer could not be activated.

Error:
0x8007232B
Description:
DNS name does not exist

Cryptic though the message is, it’s really quite simple – this is a volume-licensed (Enterprise) copy of Windows Vista, so it is looking for a key management server (KMS) from which to activate itself. I’m at home today, so it can’t find one; in any case, as I had not provided a product key during installation, Vista could not activate. Once I provided the appropriate multiple activation key (MAK), Vista was able to activate via the Microsoft servers.
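
For anyone else hitting the same error, the equivalent steps can be performed from an elevated command prompt using the slmgr.vbs script supplied with Windows Vista (the product key below is just a placeholder):

    rem Install the multiple activation key (placeholder shown)
    cscript %windir%\system32\slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX

    rem Activate online against the Microsoft activation servers
    cscript %windir%\system32\slmgr.vbs /ato

    rem Display detailed licensing and activation status
    cscript %windir%\system32\slmgr.vbs /dlv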

It was interesting to see the changes in the system properties as activation took place. First the remaining time to activate dropped from 24 days (30 days minus the 6 since I upgraded the PC) to 5 days when the MAK was accepted. Then, once activation had completed successfully, Windows acknowledged that it was activated and genuine.

There’s more information about this error in Microsoft knowledge base article 938107 and Christian Mohn has blogged about a similar experience he had with Windows Vista Business Edition requiring the product key to be re-entered.

Open XML documents driving me insane on the Mac

A few weeks back, I wrote about how smart Office 2003 had been in detecting my need for an Office 2007 document converter and opening it for me. If only I could say the same for Office 2004 on the Mac. I’m all too familiar with Microsoft product groups working independently but the MacBU has excelled (excuse the pun) in its inability to ship a working document converter for the Open XML document formats more than seven months after the release of Office 2007 on Windows.

To make matters worse, Office 2008 for Mac (which uses the new file formats) is a closed beta so I can’t use that to convert/open the files.

Ironically, there are various reports that alternative office suites such as OpenOffice or NeoOffice can open the files! Hmm… not such a smart business move for Microsoft then…

My Digital Life has information on the various options for working with Open XML in Office 2004 for Mac. Mac Mojo (the Mac Office team blog) has information about a beta converter for Word documents (only).

Microsoft Office: save as PDF or XPS

Microsoft doesn’t provide portable document format (PDF) compatibility within Office 2007 but there is a free add-in to allow Office applications to save documents as a PDF or XPS (formerly codenamed Metro) document.

XKCD

[XKCD cartoon: “Make me a sandwich.” “What? Make it yourself.” “sudo make me a sandwich.” “Okay.”]

Thanks to a Red Hat geek gift guide that I stumbled on whilst writing another post, I just found a very amusing (I’ll resist the urge to say cool) T-shirt for sale featuring this cartoon – there’s more like this at the XKCD store.

If you like it, check out the XKCD comic too. Very funny. If you are a geek. I believe that I am.

The Microsoft-Novell alliance – good, bad or ugly?

A few weeks back, I attended a Novell webcast about last year’s Novell-Microsoft collaboration agreement. Although that particular event was for partners, I’ve since found that the same presentation is available to a wider audience so I’m not breaching any NDAs by writing a bit more here about what this is all about.

We live in a heterogeneous world; most of the world’s data centres run a combination of mainframe operating systems, Unix, Windows and Linux. As commodity server hardware takes hold, many organisations previously running Unix-derived operating systems are starting to look at Linux (what Novell don’t say is that many won’t consider running Linux because of concerns about the supportability of open source software). Clearly a move from Unix to Linux is easier than a move to Windows, so (according to Novell) Microsoft has taken the pragmatic approach and partnered with Novell, who claim that SUSE Enterprise Linux is more prevalent in data centres than Red Hat – the number one Linux distribution (I’m sure that Microsoft would argue that Windows Server 2003 and 2008 include better integration with, and application support for, Unix-like operating systems).

The Novell-Microsoft collaboration agreement focuses on three technology areas:

  • Virtualisation – virtualisation is a hot topic and the various competing technologies each take a different approach. Novell and Microsoft consider their solutions (with interoperability options for Xen and Windows Server Virtualization) to give the best in performance, support, interoperability, cost and management (I’d say that’s not yet true, but it may come closer to the truth when Windows Server Virtualization ships). Novell are quick to point out that Red Hat now include Xen (since Red Hat Enterprise Linux 5) but only support their own operating system in a virtual environment, whereas Novell will support Red Hat, SUSE and Windows (NT/2000/2003) guests.
  • Heterogeneous systems management – today’s server management products are a minefield of standards-based and proprietary software. Under the Novell-Microsoft collaboration deal, the two companies will co-sponsor and contribute to a number of open source WS-Management products. They will also improve federation between Microsoft Active Directory and Novell eDirectory using WS-Federation and WS-Security.
  • Document format compatibility – Novell describes Microsoft as having a “healthy market share” (I’d call that an understatement – others might consider Microsoft’s dominance of the office productivity application market to be unhealthy). Novell considers the open document format (ODF) to be growing in support (if not from Microsoft) and projects that it will soon become the standard for governments. Under the agreement, Microsoft and Novell will co-operate to make it easier for customers to use either or both of the Open XML and ODF formats.

Under the terms of the arrangement, Microsoft has purchased vouchers that may be exchanged for copies of SUSE Enterprise Linux and will issue them to customers who are looking at Linux, in a cross-licensing arrangement that indemnifies SUSE Enterprise Linux users against patent infringement claims – as discussed in episode 93 of the Security Now podcast (transcript). In return, Novell hopes to become the enterprise Linux of choice and has issued a similar covenant to indemnify Microsoft’s customers against claims on its patents.

Remember that this information has come from Novell – not Microsoft – and there is a lot of fear, uncertainty and doubt (FUD) circulating at present about Microsoft’s true motives for a Microsoft-Linux alliance (including rumours of open source software’s widespread infringement of Microsoft’s software patents).

As an infrastructure architect working for a systems integrator, my personal view is that anything that leads to interoperability improvements is a bonus. I’m not sure that’s what we have here – the Microsoft-Novell relationship seems (to me) to be more about marketing than anything substantive (although they have announced a joint technical roadmap) – but we’ll see how this works out. It has certainly got the Linux movement up in arms as Microsoft has announced further partnerships with some less significant distributions (including Xandros and Linspire) and with consumer electronics giants who use Linux in their products (notably Samsung and LG).

It will be interesting to see how Ubuntu reacts over time. Ubuntu founder Mark Shuttleworth’s latest reaction is neither hostile nor approving (although he did earlier incite OpenSUSE developers to defect to Ubuntu) and he can now be quoted as saying:

“We have declined to discuss any agreement with Microsoft under the threat of unspecified patent infringements.”

[Mark Shuttleworth, founder of the Ubuntu project]

I’m certainly not expecting a Microsoft deal from the number one Linux distribution:

“We believe…

It was inevitable. The best technology has been acknowledged.

The relentless march of open source is shaking up the industry by freeing customers from proprietary lock-in and lack of choice.

[…]

We will not compromise.”

[Red Hat statement on the Microsoft Novell announcement]

There’s more from Red Hat’s Mark Webbink, and ars technica has a good review of why he is ever-so-slightly misguided in his assertion that:

“These guys made noise. Larry Ellison had the effect he wanted to have, and our stock price went down. But let’s see where we all are a year from now. We will still be standing. We still believe that we will be the dominant player in the Linux market because, by that time, there won’t be any other Linux players. We will have succeeded once again.”

[Enterprise Linux News – Red Hat: We will be here in one year, Novell will not.]

Whilst I’ve not spoken to anybody at Microsoft on this particular topic, it does strike me that Microsoft employees are, by and large, either extremely defensive or a touch arrogant when open source software is mentioned (to be fair, so are representatives of many companies if you ask them to talk about the competition). Maybe Microsoft can help make a better Linux (as the Linspire agreement suggests) but will they? For one example, they rejected my feature request for Linux client support in Windows Home Server; and one Microsoft employee had a good point when we were discussing my desire to see (ideally no DRM at all but, more realistically) a single cross-platform, standards-based DRM solution – “would [Linux users] accept a solution from Microsoft?” (to which I would append “, Apple or any other closed source vendor?”) – probably not.

Further information

Microsoft Interoperability.
Novell/Microsoft more interop.
Novell and Microsoft collaborate – customers win.

Is a picture worth a thousand words?

[Image: Novell’s new business strategy (from ars technica)]

ars technica has a visual timeline of the Novell-Microsoft controversy, including this gem of an illustration of Novell’s apparent business strategy.