Windows 2000 mainstream support is due to end today and, as expected, Microsoft released Update Rollup 1 for Windows 2000 Service Pack 4 a couple of days back. Full details of this update (including why it is not called Windows 2000 Service Pack 5) are in Microsoft knowledge base article 891861, and the problems which it resolves are listed in Microsoft knowledge base article 900345.
I’ve been working with Windows 2000 since the late 90s, when it was NT 5.0 beta 2, and I guess I’ll still be using it for a while yet (as will many of my clients), but for a view on why 48% of corporates are still using Windows 2000, see my decision time for Windows 2000 users post from a few days back.
I spent some time yesterday chaining two ISA Server 2000 proxy arrays. As there doesn’t seem to be much information available on the subject, I thought I’d provide some here (most of what can be found easily is about using proxy chaining for anonymity, and mostly reads as if it’s intended for nefarious purposes).
Both Microsoft Proxy Server and Microsoft Internet Security and Acceleration (ISA) Server can be configured in a chain to distribute the web cache, forming hierarchical configurations with other proxy servers. One example of where this might be useful is for networks with geographically separated segments, such as branch offices. The proxy chain allows servers to query an upstream server’s cache before accessing resources on the Internet. Using this type of configuration, the clients in the branch office benefit from a local cache as well as the cache at the main office.
Proxy chaining is also known as proxy cascading or hierarchical caching and it is primarily used to improve cache performance and balance the caching load by placing information closer to proxy clients (note that only client requests for the web proxy service can be routed upstream because only the web proxy service uses caching).
My client’s scenario was slightly different. They have two proxy arrays – one for the Americas and another for Europe, the Middle East and Africa (EMEA) – but they also need to access resources in their parent company using that company’s proxy server (accessed across the private network). Using this configuration, any requests that are not cached in the local proxy will require Internet access, unless the domain name matches the parent company’s internal domain name, in which case they need to be forwarded to the parent company’s proxy servers.
This is fairly straightforward to configure but it is important to check first that the proxy server(s) can perform a DNS lookup for the upstream proxy server(s) and access the appropriate network. When I originally configured the proxy array, I had secured the network interfaces and added a static route to the organisation’s internal network using the route -p add networkaddress mask subnetmask gatewayaddress command but I also had to add a route to the parent company’s network, otherwise all requests that didn’t match the internal network were routed via the external (Internet-facing) default gateway. Once I could both resolve the upstream proxy server names in DNS and ping them I was ready to configure the routes in ISA Server.
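The routing problem described above can be sketched in a few lines: without an explicit static route for the parent company’s network, its traffic falls through to the Internet-facing default gateway. All the addresses below are hypothetical placeholders, not my client’s real configuration.

```python
# Longest-prefix route selection, as a Windows routing table would do it.
# Networks and gateways here are invented for illustration.
import ipaddress

routes = [
    (ipaddress.ip_network("10.0.0.0/8"), "192.168.0.1"),     # internal network (existing route)
    (ipaddress.ip_network("172.16.0.0/12"), "192.168.0.1"),  # parent company's network (the route I had to add)
    (ipaddress.ip_network("0.0.0.0/0"), "192.168.0.254"),    # default: Internet-facing gateway
]

def next_hop(destination: str) -> str:
    """Return the gateway for the most specific matching route."""
    addr = ipaddress.ip_address(destination)
    matches = [(net, gw) for net, gw in routes if addr in net]
    # the longest prefix wins, exactly as in the OS routing table
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(next_hop("172.16.5.9"))    # goes via the private-network gateway
print(next_hop("198.51.100.7"))  # falls through to the default (Internet) gateway
```

Remove the second route from the list and the parent company’s addresses fall through to the default gateway – which is exactly the symptom I saw before adding it.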
I already had a destination set for the internal domain, based on the IP address range for the internal network (10.0.0.0-10.255.255.255), so I added another one for the parent company’s internal DNS domain name (*.companyname.countrycode). Once this was complete, I could establish a new routing rule in the ISA Server network configuration. Leaving the default rule in place as the last to be applied (routing all destinations directly to the specified destination), I added another rule with a higher rank order which applied to my new destination set and routed them to a specified upstream server (i.e. the parent company’s proxy server), with no caching in this case. Verifying the configuration was as simple as accessing the sites from a browser and reviewing the access logs, where access requests for the parent company’s internal sites were shown with an s-object-source of “upstream”.
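The rule evaluation described above can be reduced to a simple sketch: the higher-ranked rule matches the parent company’s destination set and forwards upstream; everything else hits the default rule and goes direct. The domain suffix is the same placeholder used in the text, not a real name.

```python
# A minimal sketch of the ISA Server routing-rule order described above.
import fnmatch

PARENT_DOMAIN_PATTERN = "*.companyname.countrycode"  # the new destination set

def route_request(host: str) -> str:
    """Higher-ranked rule: parent company's destination set -> upstream proxy.
    Default rule (last to apply): route directly to the destination."""
    if fnmatch.fnmatch(host.lower(), PARENT_DOMAIN_PATTERN):
        return "upstream"  # forwarded to the parent company's proxy, not cached
    return "direct"

print(route_request("intranet.companyname.countrycode"))  # upstream
print(route_request("www.example.com"))                   # direct
```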
Last year I blogged about Microsoft’s acquisition of Giant Software and I’ve been using their AntiSpyware Beta since it was made available in January; but last week I was looking at the inordinate amount of spam my Dad receives and that got me thinking about the overall security on his PC (which has my e-mail addresses in the address book!). After installing Lavasoft Ad-Aware SE Personal, I found that the Microsoft AntiSpyware Beta product he had been using was doing a pretty good job, but there were a load of tracking cookies which it had not identified. Today, I ran the same tests on two of my PCs and found the same.
As the Microsoft product is based on Giant’s well-regarded software I decided to look a bit deeper…
It turns out that although the Giant version of the product scans for cookies, the Microsoft version does not, as cookies are not regarded as a threat (despite Ad-Aware classifying them as critical objects). Microsoft confirms as much in their information for Giant AntiSpyware users who have active subscriptions.
So are cookies a threat? The answer is both “Yes” and “No”. Quoting from an HP article on where spyware hides:
“Cookies can help users streamline online transactions, remember browsing preferences and user profiles, and personalize pages. Many users don’t realize that cookies can be used to compile data so companies can construct a profile about the websites they visit and the web banner advertisements they click through. This information is mined so companies can deliver targeted ads.
Some websites respectfully use temporary cookies (session cookies) that disappear when you close the browser. Many more websites use persistent cookies that remain on your hard drive indefinitely. Microsoft Internet Explorer and Netscape Navigator, the two most popular browsers, still send out existing cookies even if you’ve disabled cookies in your browser settings. This means you must delete cookie files manually to keep from being tracked by third-party ad networks and spyware providers.”
And from the privacy.net cookie demo:
“Some common uses for Internet cookies are:
- An anonymous code given to you so the web site operator can see how many users return at a later time. These cookies are configured to stay on your system for months or years and are called “persistent” cookies.
- A code identifying you. This usually occurs after a registration. The site could keep a detailed account of pages visited, items purchased, etc. and even combine the information with information from other sources once they know who you are.
- A list of items you purchased. This is often used in “shopping cart” web sites to keep track of your order. Often cookies of this type ‘expire’ as soon as you log out or after a short time. These are called “session” cookies.
- Personal preferences. This can be anonymous or linked to personal information provided during a registration.
Cookies are supposed to be only accessible from the site that placed them there. However, in some cases cookies from other sites show up in the log files so it is not a secure way to authenticate a user.”
So you can see that session cookies are fine. So are some persistent cookies (e.g. the one which tells the BBC website where I live so it can give me localised information); but most of the ones I found were tracking cookies for advertising sites. These are not good and I urge Microsoft to include cookie detection in the release version of Microsoft AntiSpyware (perhaps using the SpyNet AntiSpyware community to distinguish between good and bad cookies?).
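The distinction drawn above can be illustrated with a rough classifier: session cookies expire with the browser, persistent ones carry an expiry date, and a persistent cookie set by a known ad network is the kind a scanner should flag. The tracker list here is a made-up placeholder, not any vendor’s actual blocklist.

```python
# Rough cookie classification: session vs persistent vs tracking.
from http.cookies import SimpleCookie

TRACKER_DOMAINS = {"ads.example-network.com"}  # hypothetical blocklist

def classify(set_cookie_header: str, source_domain: str) -> str:
    cookie = SimpleCookie()
    cookie.load(set_cookie_header)
    morsel = next(iter(cookie.values()))
    # no Expires and no Max-Age => dies with the browser session
    if not morsel["expires"] and not morsel["max-age"]:
        return "session"
    if source_domain in TRACKER_DOMAINS:
        return "tracking"
    return "persistent"

print(classify("sessionid=abc123", "www.bbc.co.uk"))            # session
print(classify("uid=42; Max-Age=31536000", "ads.example-network.com"))  # tracking
```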
Finally, for anyone worrying about what happens when their version of the Microsoft AntiSpyware Beta expires at the end of July, Microsoft has started to push updates and one of my PCs upgraded itself to version 1.0.614 today, which expires at the end of December. The others are still on 1.0.501 but I expect to see them do the same over the next few weeks.
Adware/Spyware thread (pcreview.co.uk)
Cookie demonstration (privacy.net)
Microsoft AntiSpyware: Torn Apart
I just read that Jack St.Clair Kilby died last week. The sad thing is that I’d never heard of Jack until I read his obituary even though his invention – the integrated circuit (IC) – undoubtedly paved the way for the computerised world in which we live today. His former employer, Texas Instruments, have a tribute site. What I find interesting is that had he not been a new employee (hence with no accrued annual leave), he wouldn’t have had the opportunity to carry out his early experiments whilst the rest of the company were on vacation!
Last week, I was fortunate enough to be quoted on the front page of IT Week by Martin Veitch, in his article “Decision Time for Win2000 Users”. Of course, as my wife is a Public Relations Consultant, I understand (and even expect) only a partial quote when a journalist asks for comment, so I’m using my blog to put this into context, as the soundbite which Martin used seems to have surprised some people, including one of Microsoft UK’s Enterprise Strategy Consultants.
Yes, Windows 2000 is still popular. The AssetMetrix Research Labs report, on which Martin’s article is based, notes that between the last quarter of 2003 and the first quarter of 2005 the popularity of Windows 2000 fell by only 4%. However, the real news here is not that clients are sticking with Windows 2000 but that people are finally junking Windows 9x/ME/NT and moving to XP. Windows 9x system usage fell over the same period from 28% to 5%, Windows NT was down slightly (from 13.5% to 10%), whilst Windows XP usage increased from less than 7% to 38%.
My colleagues and I have worked with many organisations to migrate from Windows 9x and NT to XP, but the reason that Windows 2000 is still in use in 48% of corporate IT environments is that, when patched, it is a stable and reliable platform. Microsoft may have ended support for NT last year, and 2000 is about to go onto extended support, but firms are willing to tolerate the risk of not moving whilst they recoup the investment that they have made. The heady days of the late-1990s “millennium date bug” upgrades are gone and business users are demanding value for money from their IT assets. Many of my clients have moved from a 3-year write-down of desktop PCs to 5 years, or even 7 years in one case (mind you, I’m helping them to move their retail estate from NT 3.51 to XP!). That means that those who took a big leap to implement Active Directory and adopt Windows 2000 are contemplating skipping a release and moving directly to the next version of Windows (codenamed Longhorn). What I do expect to see over the next year or so is a lack of Windows 2000 device drivers, and the consequential hardware issues driving a move to Windows XP and Server 2003, with corporates perhaps downgrading Windows XP licences on new PCs to match their Windows 2000 standard operating environments (SOEs).
As for Linux or the new low-cost Mac, well, at the risk of being flamed (or even accused of being sponsored by Microsoft – which, incidentally, I’m not!), I don’t see any of my customers moving from Windows on the desktop (yet). The education sector may be being forced down an open source route to save money (a false economy, I say – we should be teaching our children using the software that they will later encounter in the workplace), and consumers/hobbyists will be looking at the best technology. In the commercial world, however, the reality is that organisations are not usually interested in the best technology, or even the lowest total cost of ownership (TCO) – a much-overused term which is always open to dispute. They are more concerned with deploying software that the majority of users can use with the least retraining, whilst minimising licensing costs through volume licensing agreements – and by and large that means the platform will be Windows and the software vendor will be Microsoft!
So what is my real advice for Windows 2000 users?
- Build on your Windows 2000 investment and take advantage of new security features by moving to Windows XP (with SP2) and Windows Server 2003 (with SP1) now.
- Windows 9x/NT to 2000 was a step change but the move to XP/2003 is less so (and the licensing costs should be minimal for those organisations that already have software assurance).
- Don’t wait until Windows Longhorn. This will be another major release, which is not expected until 2006 (client, with the server version following in 2007), after which many organisations will still wait for the first service pack before deploying.
- Finally, new technologies (such as Internet Explorer 7.0), with new features and security enhancements, will only be available for recent platforms (i.e. those that are still supported by Microsoft).
Alex e-mailed me earlier and told me that the RSS feed on my family blog was broken. Actually, I’d password protected the site, and forgotten to update the details in Feedburner (which translates Blogger’s Atom output to RSS for me). I couldn’t find any fields in the feed service settings to supply username and password credentials until an unusually helpful error message suggested that I should enter the URL in the form http://username:password@sitename/
I knew that particular syntax worked for FTP, but not for HTTP too! Of course, if I was really that bothered about security I should secure the site using HTTPS, but in this case, the username and password is only a deterrent and there’s not really anything there that needs SSL security.
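The same user:password@host convention can be seen with Python’s URL parser, which splits the userinfo out of the authority section; the credentials and feed path below are invented for illustration.

```python
# Embedding credentials in a URL's authority section, parsed out again.
from urllib.parse import urlsplit

url = "http://family:secret@www.example.com/blog/atom.xml"
parts = urlsplit(url)

print(parts.username)  # family
print(parts.password)  # secret
print(parts.hostname)  # www.example.com
```

As the post notes, the credentials travel in the clear over plain HTTP, so this is only a deterrent, not real security.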
A couple of weeks back, I was testing a DHCP configuration scenario using a number of virtual machines and needed them to obtain their IP addresses from a Windows 2000 DHCP server within my virtual network. That should work, but for some reason, my virtual clients were picking up strange private IP addresses in the range 10.237.0.16-10.237.255.254 (not even the familiar automatic private IP addresses in the range 169.254.0.1-169.254.255.254). After a while, I discovered that Virtual Server has the capability to provide its own DHCP service and that this was enabled. By editing the configuration for the internal network I was using, I could disable Virtual Server’s DHCP server, allowing my clients to locate the correct DHCP server.
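The quickest way I found to spot what was going on was to check the assigned address against the expected pools. The ranges below are those from the text (the Virtual Server pool is approximated as a /16); the sample lease address is hypothetical.

```python
# Diagnose which DHCP service answered, based on the leased address.
import ipaddress

VIRTUAL_SERVER_POOL = ipaddress.ip_network("10.237.0.0/16")  # Virtual Server's built-in DHCP
APIPA = ipaddress.ip_network("169.254.0.0/16")               # automatic private IP addressing

def diagnose(lease: str) -> str:
    addr = ipaddress.ip_address(lease)
    if addr in VIRTUAL_SERVER_POOL:
        return "Virtual Server DHCP answered first - disable it on the virtual network"
    if addr in APIPA:
        return "no DHCP server reached at all"
    return "lease from the real DHCP server"

print(diagnose("10.237.12.34"))
```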
Last year, I blogged about building a Windows cluster using VMware. Since then, new versions of VMware have made this more difficult/expensive (as it no longer works with VMware Workstation) and Rob Bastiaansen has removed the virtual SCSI disks from his website. I haven’t tried building a cluster on Microsoft Virtual Server, but it seems feasible, and a few days back, I found a Windows IT Pro magazine article on building a Windows Server 2003 cluster using Virtual Server 2005.
For anyone who says “why do this – the point about clustering is high availability and that needs the supporting hardware”, I would agree with you, but a virtual cluster is great for testing/proof of concept.
My Windows-based PC just crashed (hardly surprising given all the rubbish I have installed on it recently – despite all of the bad press that Windows attracts, I maintain that a well-patched and well-managed Windows NT, 2000, XP or Server 2003 system will generally be reliable).
In the past, if this has happened, I have ignored the message about error reporting (“yeah, yeah, yada, yada – I need to get back to work and did I manage to save that document I was working on before it crashed?”… etc.) but this time I let it report the problem – and I read the results. It was even useful.
You see, what happens is that if I experience a blue screen crash event, or stop error, while using Microsoft Windows XP (or later), I can upload the error report to the Microsoft Online Crash Analysis (MOCA) site for analysis (Microsoft say this is “to further improve the quality and reliability of Windows”). They analyse the error report and prioritise it based on the number of customers affected, then try to determine the cause of the error submitted, categorise it according to the type of issue encountered, and send relevant information when it is identified.
What I like is that, using MOCA, I can check the status of the error report for 180 days and this time it told me that my system “crashed because the random access memory containing Windows program code was corrupted. Microsoft is unable to determine if this corruption was caused by a hardware or software issue. The nature of the corruption suggests that a hardware issue is more likely. To determine if this is the case, Microsoft developed a Windows memory diagnostic that tests your PC memory. We recommend you download and run this tool on your computer system”.
Sure, the Windows memory diagnostic tool didn’t find any memory errors, so I still don’t know why the PC crashed, but at least it feels like someone actually cares and is trying to fix things… much better than just getting a blue screen of death (or even a red screen of death!).
Keni Barwick commented recently that he was worried about MSN Messenger having gone AWOL from Windows Mobile 5.0. The answer, it seems, is Pocket MSN. Microsoft wants to charge a one-off fee of £10.99 to use MSN Messenger on a mobile device.
As I commented on Keni’s blog, I tend to agree that if MSN Messenger were to be removed from smartphones then that would be a pretty dumb move (from one of the smartest marketing companies in the world), and without it the whole presence element of Microsoft’s mobility strategy starts to fall apart. Microsoft are claiming that 20% of all enterprise users make use of instant messaging (IM) services (either for business, or because their company allows it) and that this is expected to rise to 80% by the end of 2008 – not surprisingly, they want a piece of this market.
I’m reliably informed that the reason why public IM connectivity in Live Communications Server (LCS) 2005 is chargeable is that AOL, MSN, and Yahoo! require Microsoft Corporation (remember, MSN is a separate company) to subsidise them for lost advertising revenues where companies use the (ad-free) Windows Messenger and Office Communicator clients with LCS. Of course, as there are no ads in the mobile version of MSN Messenger, perhaps that is the justification for charging for that too?
Of course, charging for IM could be about opening up the mobile device market to other IM clients in an attempt to avoid landing themselves in court for allegedly behaving in an anti-competitive manner. After all, it seems that the European Union (EU) is taking Microsoft’s dominant market position more seriously than the US Department of Justice (DoJ).