Microsoft’s common engineering criteria and Windows Server product roadmap

This content is 20 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I’ve often heard people at Microsoft talk about the common engineering criteria – usually when stating that one of the criteria is that a management pack for Microsoft Operations Manager (MOM) 2005 must be produced with each new server product (at release). A few minutes back, I stumbled across Microsoft’s pages which describe the full common engineering criteria and the associated report on common engineering criteria progress.

Also worth a read is the Windows Server product roadmap and the Windows service pack roadmap.

Finally, for an opportunity to provide feedback on Windows Server and to suggest new features, there is the Windows Server feedback page (although there’s no guarantee that a suggestion will be taken forward).

Windows Server 2003 Service Pack 1 overview


I’ve been meaning to write about the new functionality in Windows Server 2003 service pack 1 (SP1) for a while now, but various distractions led to this post sitting on ice for several months. I thought about dropping it altogether, but then I changed my mind because even though Windows Server 2003 release 2 (R2) is now generally available, SP1 information is still pertinent for two reasons:

  • R2 is installed on top of an SP1 baseline.
  • Many organisations will wait before implementing R2 – so SP1 is still highly relevant to a large chunk of the market (especially those still using Windows 2000, many of whom were waiting for the first Windows Server 2003 service pack before upgrading).

At last year’s Microsoft Technical Roadshow, John Howard presented a Windows Server 2003 SP1 technical overview session, at which he explained that, like Windows XP SP2, Windows Server 2003 SP1 is basically a security update. In producing SP1, Microsoft’s goal and vision was to respond to customer challenges around security, reliability and performance, making it simple both to cope with current threats and to secure a system against future threats. Robustness is addressed through changes that increase performance (e.g. http.sys, the kernel-mode HTTP listener used by IIS), while reliability is about keeping systems operational with the minimum of downtime. Most importantly, tools like the security configuration wizard can be used to decrease the attack surface, exposing fewer ports and services, so that organisations that have disabled a potentially vulnerable service can patch at their leisure, rather than having to schedule emergency downtime to cope with a major threat.

SP1 addresses security concerns with a number of new features, which I’ll describe in the rest of this post.

Data execution prevention (DEP) is implemented both in hardware – where no execute (NX) support is provided – and in software (functional on any processor that supports Windows Server 2003). Controlled using a boot.ini /noexecute=policylevel switch, one of four policy levels can be selected:

  • OptIn – hardware DEP on, applications can select whether or not to use it.
  • OptOut – DEP is on, unless an application opts out.
  • AlwaysOn – DEP is on (for all applications).
  • AlwaysOff – DEP is off (for all applications).

As with many boot.ini file settings, DEP can also be controlled through the GUI (system properties).
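As a sketch, a boot.ini entry with DEP set to OptOut might look like the following (the disk path and description are illustrative and will vary from system to system):

```
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /noexecute=OptOut
```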

Post-setup security updates (PSSU) is a feature designed to protect servers between first boot and the application of the most recent security updates. It opens on the first administrative logon (if Windows Firewall was not explicitly enabled using an unattended installation or group policy) and blocks all inbound connections until the PSSU dialog box is completed (by which time all updates will have been applied).

PSSU offers links to install critical security updates (from Windows Update), as well as the opportunity to configure automatic updates and will re-open on the next login if not fully completed before the computer is restarted (or if forced to close using Alt and F4, which will leave the Firewall enabled). PSSU is invoked during a slipstreamed installation, but is not applied when existing servers are upgraded or when the Windows Firewall is enabled or disabled through group policy.

Unlike Windows XP SP2, the Windows Firewall is not enabled by default on Windows Server 2003 SP1 (unless PSSU is in effect). Microsoft say that this is because the primary purpose of a server is to accept inbound connections, although I would counter this by saying that the software should be secure by default and an administrator should have to take action to open ports and allow services. The boot-time security provided by the firewall is non-configurable, offering basic networking only (domain controller lookup, DHCP client, etc.) until the server is fully online. As with the XP SP2 firewall, multiple network profiles are supported (e.g. more aggressive control when away from the corporate network) and the Windows Server 2003 SP1 firewall is integrated with the netsh command line utility.

Role-based configuration and lockdown is facilitated with the security configuration wizard (SCW). Although it is best practice, many administrators view reducing the attack surface of a server as difficult, time-consuming and risky (services might be broken), with a whole load of documentation to review. Using the SCW, the process is simplified, using a role-based metaphor to disable unnecessary services and IIS web extensions, block unused ports, secure open ports using IPSec, reduce protocol exposure and configure audit settings.

SCW can be installed from the Add or Remove Programs Control Panel applet (appwiz.cpl), or by setting scw=on in unattend.txt. Command line support is included (scwcmd), as are rollback (scwcmd rollback), view (scwcmd view) and analysis (scwcmd analyze) capabilities. Although the security policy is not set through group policy, it can be applied to multiple servers as the configuration can be saved to an XML file for re-use (or converted to a group policy using scwcmd transform /p:filename.xml /g:policyname).

Incidentally, best practice would be to avoid saving the configuration file by server name as this would be useful information for a would-be hacker (and can be overwritten by later updates). The SCW viewer is also a good reference for port numbers, etc. used by various Windows services.
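Putting the scwcmd operations mentioned above together, a typical workflow might look something like this (the policy file and server names are illustrative):

```
rem Apply a saved SCW policy to another server
scwcmd configure /m:server2 /p:webserverpolicy.xml

rem Check a server's configuration against the policy
scwcmd analyze /m:server2 /p:webserverpolicy.xml

rem Roll back the last applied policy if something breaks
scwcmd rollback /m:server2

rem Convert the policy to a group policy object
scwcmd transform /p:webserverpolicy.xml /g:"Web Server Policy"
```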

Other new security features include IIS 6 metabase auditing, VPN quarantine functionality and Internet Explorer security enhancements (as per Windows XP SP2 – described in the application compatibility testing and mitigation guide). RPC and DCOM are also enhanced (as for XP SP2) to reduce the attack surface with no more anonymous inbound RPC, restrictions on outbound RPC (both of these may be overridden with a registry key) and only administrators can invoke DCOM components remotely.

Another new Windows feature (which I believe NetWare administrators have had for years) is access-based (directory) enumeration (ABE). ABE hides directories on a share based on a user’s access rights. The service pack version of ABE needs to be programmatically enabled (John Howard’s blog carries a link to an unsupported Microsoft utility which will enable this) but since SP1 was released, ABE has been made available for download from the Microsoft website with GUI and command line support (abecmd). It is fully described in the accompanying white paper and, for those who would like to see a demonstration, John Howard has recorded an ABE blogcast.
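For reference, enabling ABE from the command line is a one-liner per share (the share name here is just an example):

```
rem Enable access-based enumeration on a single share
abecmd /enable data

rem Or enable it for all shares on the server
abecmd /enable /all
```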

I’m sure there are some other enhancements within SP1 that I’ve missed, but these are the major security improvements. Windows Server 2003 was already pretty good and with SP1 it got better. Add Windows Server 2003 R2 to the mix and there are also some great new features.

Scanning a multiple-page document into a single file using Microsoft Office


Last night, I discovered a Microsoft Office program that I’ve never used before – and it’s actually quite a useful feature to know about.

I’d received a contract in .PDF format which needed to be signed and returned by fax or e-mail. I no longer have a fax machine (my ISP provides me with a fax-to-email service for receiving faxes and I very rarely send them). So, my problem was that once I’d printed and signed the (multiple page) contract, how could I digitise it again (as a single document, rather than several individual pages)? The answer was Microsoft Office Document Imaging – provided as part of Office XP and 2003 (and possibly in other versions too – I haven’t checked). This let me scan multiple pages into a single .TIF file, also offering optical character recognition (OCR) and annotation functionality (pens, highlighting, text and picture insert, etc.).

I’ve been using Microsoft Office for many years, and I’ve never used this feature before – it strikes me that it might be a useful piece of information for someone else too.

The price of free speech (does anyone in the UK Government have a sense of humour?)


I don’t normally engage in political comment on this site, but this is tech-related political comment…

Recently, there has been a lot of media coverage about how Google is allowing search results to be censored on its Chinese search portal and whether or not this co-operation with the Chinese authorities is the right thing to do – but did you know that the UK Government is engaging in a form of Internet censorship of its own (albeit on a much smaller scale)?

Late this afternoon, I needed a laugh, so I visited Ian Everleigh’s deeply satirical and very funny New Highway Code, only to find that he had been asked to take it down by the UK Government Cabinet Office (who were upset because it scored higher on search results than the real Highway Code). How pathetic, that the authorities feel so threatened by something that was obviously sarcastic (and extremely popular with 45,000 hits on the site in 12 months). In the author’s own words:

“Mimicking the familiar style of The Highway Code, its aim was to draw attention to the many appalling habits which cause inconvenience and even danger every day on the UK’s roads.”

I’d say that was a good thing. Clearly the people at the Cabinet Office don’t understand sarcasm. Thankfully, there is a copy of the site in the Internet wayback machine (sadly without the graphics), and Bruno Bozzetto’s yes and no dyseducational [sic] road movie is a very funny flash animation which examines driving habits in the same vein.

Back in 2004, Thomas Scott, the author of the HM Department of Vague Paranoia Preparing for Emergencies site (as well as lots of parodies that can be found via his site, some of which have even appeared on television), was asked to take the Preparing for Emergencies site down as people might confuse it with the real Cabinet Office Preparing for Emergencies site. Thankfully he refused and the BBC reported that the Government is unlikely to take further action (presumably because it could cost them a lot of money and make them look stupid in court) but I’m not sure that I would have the courage or conviction to stand up to them if they started pressuring me to remove a web site.

It only costs a few pounds to register similar domain names to official sites (and let’s face it, the government wastes enough of taxpayers’ money, a few quid on domain names won’t hurt much). It’s their own fault if the .co.uk version of a domain is available when the .gov.uk version is taken.

Thankfully, no one has yet taken down Ian Vince’s Department of Social Scrutiny site, although he does unfortunately have to provide a legal disclaimer to say that it’s a joke.

If you live in the UK (or even if you don’t and you check out the real UK Government sites), you’ll realise that these sites may be funny, but they are obviously uncomfortably close to the truth for the powers that be.

To the civil servants of the UK Government – especially the Cabinet Office, who claim to be “at the centre of Government, coordinating policy and strategy across government departments” – is your job really so dull that you’ve lost your sense of humour? It might be interesting to note that satire is defined in a glossary of literary terms as “a manner of writing that mixes a critical attitude with wit and humor [sic] in an effort to improve mankind…”.

RAID and units of storage


A couple of weeks back, I commented that photography and IT are becoming ever-closer but last night I was amazed to open a copy of Digital Photographer magazine and find an article about redundant array of inexpensive disk (RAID) storage!

It was an interesting read and, because it assumed that the reader wasn’t an IT professional, it gave a concise explanation of the various RAID levels (and storage capacities) which was actually a really good reference. It also referred to a number of websites with additional information.

From the same article, for those of us who have forgotten what the various (binary) units of storage are:

  • A single byte is the most basic unit of computer storage.
  • 1 kilobyte (KB) = 1024 bytes.
  • 1 megabyte (MB) = 1024KB (1,048,576 bytes).
  • 1 gigabyte (GB) = 1024MB (1,073,741,824 bytes).
  • 1 terabyte (TB) = 1024GB (1,099,511,627,776 bytes).
  • 1 petabyte (PB) = 1024TB (1,125,899,906,842,624 bytes).
  • 1 exabyte (EB) = 1024PB (1,152,921,504,606,846,976 bytes).
  • 1 zettabyte (ZB) = 1024EB (1,180,591,620,717,411,303,424 bytes).
  • 1 yottabyte (YB) = 1024 ZB (1,208,925,819,614,629,174,706,176 bytes).
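The progression above is easy to check programmatically – each unit is 1024 (2^10) times the previous one. A quick Python sketch:

```python
# Each binary storage unit is 1024 times the previous one,
# so the unit at position n is 1024**n (i.e. 2**(10*n)) bytes.
units = ["KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]
for n, unit in enumerate(units, start=1):
    print(f"1 {unit} = {1024 ** n:,} bytes")
```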

IT Forum ’05 highlights: part 1


Microsoft UK IT Forum Highlights
A few years back, I used to try and persuade my employer to send me to Microsoft TechEd Europe each year, on the basis that lots of 75 minute presentations on a variety of topics provided a better background for me than a few days of in-depth product training (I can build experience later as I actually use the technology). The last time I attended TechEd was back in 2001, by which time it had become more developer-focused and the IT Forum was being positioned as the infrastructure conference (replacing the Microsoft Exchange Conference). For the last couple of years, I haven’t been able to attend the IT Forum due to family commitments (first it clashed with the birth of my son and subsequently it’s been in conflict with his birthday, as it is again this year) but luckily, Microsoft UK has been re-presenting the highlights from IT Forum as free-of-charge TechNet events (spread over two days) and I’ve managed to take some time out to attend them.

Yesterday’s event covered a variety of topics. Unfortunately there was no concept of different tracks from which I could attend the most relevant/interesting sessions, so some of it went completely over my head. One of those topics was upgrading to SQL Server 2005, so apologies to the presenter – I was the guy nodding off on the front row.

In the next few paragraphs, I’ll highlight some of the key points from the day.

Upgrading to SQL Server 2005
Presented by Tony Rogerson, SQL Server MVP and UK SQL Server Community leader, this session gave useful information for those looking at upgrading from SQL Server 2000 (or earlier) to SQL Server 2005. I’ve blogged previously with a SQL Server 2005 overview, why SQL Server 2005 is such a significant new product and on the new management tools but the key points from Tony’s presentation were:

  • Upgrades (in-place upgrades) are supported, preserving user data and maintaining instance names in a largely automated fashion, as are side-by-side migrations (mostly manual, copying data from an old installation to a new and then decommissioning the old servers).
  • SQL Server versions prior to 7.0 cannot be migrated directly and SQL Server 7.0/2000 need to be updated to the latest service pack levels before they can be migrated. For SQL Server 2000 that is SP4, which might break some functionality for SP3A users, so the upgrade needs to be carefully planned.
  • The database engine (including subcomponents like the SQL Agent, tools, etc.), analysis services, reporting services and notification services can all be upgraded, and data transformation services can be migrated to integration services.
  • All product editions can be upgraded/migrated (32/64-bit, desktop, workgroup, personal, standard, developer or enterprise editions), as can all SQL Server 7.0/2000 released languages.
  • A smooth upgrade requires a good plan, breaking tasks into:
    • Pre-upgrade tasks.
    • Upgrade execution tasks.
    • Post-upgrade tasks (day 0, day 30, day 90).
    • Backout plan.
  • Microsoft provides the SQL Server 2005 Upgrade Advisor as a free download to analyse instances of SQL Server 7.0 and SQL Server 2000 in preparation for upgrading to SQL Server 2005. This can be used repeatedly until all likely issues have been resolved and the upgrade can go ahead.
  • Migration provides for more granular control over the process than an upgrade would, and the presence of old and new installations side-by-side can aid with testing and verification; however, it does require new hardware (although a major investment in a SQL Server upgrade would probably benefit from new hardware anyway) and applications will need to be redirected to the new instance. Because the legacy installation remains online, there is complete flexibility to fail back should things not go to plan.
  • Upgrades will be easier and faster for small systems and require no new hardware or application reconfiguration; however the database instances will remain offline during the upgrade and it’s not best practice to upgrade all components (e.g. analysis services cubes).
  • Upgrade tips and best practices include:
    • Reduce downtime by pre-installing setup pre-requisites (Microsoft .NET Framework 2.0, SQL Native Client and setup support files) – some of these are needed for the Upgrade Advisor anyway.
    • If planning a migration using the copy database wizard, place the database in single-user mode (to stop users from modifying the data during the upgrade) and make sure that no applications or services are trying to access the database. Also, do not use read-only mode (this will result in an error) and note that the database cannot be renamed during the operation.
    • Be aware of the reduced attack surface of SQL Server 2005 – some services and features are disabled for new installations (secure by default) – the surface area configuration tools can be used to enable or disable features and services.

Leveraging your Active Directory for perimeter defence
Presented by Richard Warren, an Internet and security training specialist, I was slightly disappointed with this session, which failed to live up to the promises that its title suggested. After spending way too much time labouring Microsoft’s usual points about a) how packet filtering alone is not enough and ISA Server adds application layer filtering and b) ISA Server 2004 is much better and much easier to use than ISA Server 2000, Richard finally got down to some detail about how to use existing investments in AD and ISA Server to improve security (but I would have liked to have seen more real-world examples of exactly how to implement best practice). Having been quite harsh about the content, I should add that there were some interesting points in his presentation:

  • According to CERT, 95% of [computer security] breaches [were] avoidable with an alternative configuration.
  • According to Gartner Group, approximately 70% of all web attacks occur at the application layer.
  • Very few organisations are likely to deploy ISA Server as a first line of defence. Even though ISA Server 2004 is an extremely secure firewall, it is more common to position a normal layer 3 (packet filtering) firewall at the network edge and then use ISA Server behind this to provide application layer filtering on the remaining traffic.
  • Users who are frightened of IT don’t cause many problems. Users who think they understand computers cause most of the problems. Users who do know what they are doing are few and far between. (Users are a necessary evil for administrators).
  • Not all attacks are malicious and internal users must not be assumed to be “safe”.
  • ISA Server can be configured to write its logs to SQL Server for analysis.
  • Active Directory was designed for distributed security (domain logon/authentication and granting access to resources/authorisation) but it can also store and protect identities and plays a key role in Windows manageability (facilitating the management of network resources, the delegation of network security and enabling centralised policy control).
  • Using ISA Server to control access to sites (both internal and external), allows monitoring and logging of access by username. If you give users a choice of authenticated access or none at all, they’ll choose authenticated access. If transparent authentication is used with Active Directory credentials, users will never know that they needed a username and password to access a site (this requires the ISA Server to be a member of the domain or a trusted domain, such as a domain which only exists within the DMZ).
  • ISA Server’s firewall engine performs packet filtering and operates in kernel mode. The firewall service performs application layer filtering (extensible via published APIs) and operates in user mode.
  • SSL tunnelling provides a secure tunnel from a client to a server. SSL bridging involves installing the web server’s certificate on the ISA Server, terminating the client connection there and letting ISA server inspect the traffic and handle the ongoing request (e.g. with another SSL connection, or possibly using IPSec). Protocol bridging is similar, but involves ISA server accepting a connection using one protocol (e.g. HTTP) before connecting to the target server with another protocol (e.g. FTP).

Microsoft Windows Server 2003 Release 2 (R2) technical overview
Presented by Quality Training (Scotland)‘s Andy Malone, this session was another disappointment. Admittedly, a few months back, I was lucky to be present at an all day R2 event, again hosted by Microsoft, but presented by John Craddock and Sally Storey of Kimberry Associates, who went into this in far more detail. Whilst Andy only had around an hour (and was at pains to point out that there was lots more to tell than he had time for), the presentation looked like Microsoft’s standard R2 marketing deck, with some simple demonstrations, poorly executed, and it seemed to me that (like many of the Microsoft Certified Trainers that I’ve met) the presenter had only a passing knowledge of the subject – enough to present, but lacking real world experience.

Key points were:

  • Windows Server 2003 R2 is a release update – approximately half way between Windows Server 2003 and the next Windows Server product (codenamed Longhorn).
  • In common with other recent Windows Server System releases, R2 is optimised for 64-bit platforms.
  • R2 is available in standard, enterprise and datacenter editions (no web edition) consisting of two CDs – the first containing Windows Server 2003 slipstreamed with SP1 and the second holding the additional R2 components. These components are focused around improvements in branch office scenarios, identity management and storage.
  • The new DFSR functionality can provide up to 50% WAN traffic reduction through improved DFS replication (using bandwidth throttling and remote differential compression, whereby only file changes are replicated), allowing centralised data copies to be maintained (avoiding the need for local backups, although one has to wonder how restoration might work over low-speed, high latency WAN links). Management is improved with a new MMC 3.0 DFS Management console.
  • There is a 5MB limit on the size of the DFS namespace file, which equates to approximately 5000 folders for a domain namespace and 50,000 folders for a standalone namespace. Further details can be found in Microsoft’s DFS FAQ.
  • Print management is also improved with a new MMC 3.0 Print Management console, which will auto-discover printers on a subnet and also allows deployment of printer connections using group policy (this requires the use of a utility called pushprinterconnections.exe within a login script, as well as a schema update).
  • Identity and access management is improved with Active Directory federation services (ADFS), Active Directory application mode (ADAM – previously a separate download), WS-Management and Linux/Unix identity management (incorporating Services for Unix, which was previously a separate download).
  • For many organisations, storage management is a major problem with typical storage requirements estimated to be increasing by between 60% and 100% each year. The cost of managing this storage can be 10 times the cost of the disk hardware and Microsoft has improved the storage management functionality within Windows to try and ease the burden.
  • The file server resource manager (FSRM) is a new component to integrate capacity management, policy management and quota management, with quotas now set at folder level (rather than volume) and file screening to avoid storage of certain file types on the server (although the error message if a user tries to do this just warns of a permissions issue and is more likely to confuse users and increase the burden on administrators trying to resolve any resulting issues).
  • Storage manager for SANs allows Windows administrators to manage disk resources on a SAN (although not with the granularity that the SAN administrator would expect to have – I’ve not seen this demonstrated but believe it’s only down to a logical disk level).
  • In conclusion, Windows Server 2003 R2 builds on Windows Server 2003 with new functionality, but with no major changes so as to ensure a non-disruptive upgrade with complete application compatibility, and requiring no new client access licenses (CALs).

Management pack melee: understanding MOM 2005 management packs
Finally, a fired up, knowledgeable presenter! Gordon McKenna, MOM MVP, is clearly passionate about his subject and blasted through a whole load of detail on how Microsoft Operations Manager (MOM) uses management packs to monitor pretty much anything in a Windows environment (and even on other platforms, using third-party management packs). There was way too much information in his presentation to represent here, but Microsoft’s MOM 2005 for beginners website has loads of information including technical walkthroughs. Gordon did provide some additional information though which is unlikely to appear on a Microsoft website (as well as some that does):

  • MOM v3 is due for release towards the end of this year (I’ve blogged previously about some of the new functionality we might see in the next version of MOM). It will include a lightweight agent, making MOM more suitable for monitoring client computers as well as a Microsoft Office management pack. MOM v3 will also move from a server-centric paradigm to a service-centric health model in support of the dynamic systems initiative and will involve a complete re-write (if you’re going to buy MOM this year, make sure you also purchase software assurance).
  • There are a number of third-party management packs available for managing heterogeneous environments. The MOM management pack catalogue includes details.
  • The operations console notifier is a MOM 2005 resource kit utility which provides pop-up notification of new alerts (in a similar manner to Outlook 2003’s new mail notification).

A technical overview of Microsoft Virtual Server 2005
In the last session of the day, Microsoft UK’s James O’Neill presented a technical overview of Microsoft Virtual Server 2005. James is another knowledgeable presenter, but the presentation was an updated version of a session that John Howard ran a few months back. That didn’t stop it from being worthwhile – I’m glad I stayed to watch it as it included some useful new information:

  • Windows Server 2003 R2 Enterprise Edition changes the licensing model for virtual servers in three ways: firstly, by including 4 guest licenses with every server host licence (a total of 5 copies of R2); secondly, by only requiring organisations to be licensed for the number of running virtual machines (currently even stored virtual machine images which are not in regular use each require a Windows licence); finally, in a move which is more of a clarification, server products which are normally licensed per-processor (e.g. SQL Server, BizTalk Server, ISA Server) are only required to be licensed per virtual processor (as Virtual Server does not yet support SMP within the virtual environment).
  • The Datacenter edition of the next Windows Server version (codenamed Longhorn) will allow unlimited virtual guests to be run as part of its licence – effectively mainframe Windows.
  • Microsoft is licensing (or plans to licence) the virtual hard disk format, potentially allowing third parties to develop tools that allow .VHD files to be mounted as drives within Windows. There is a utility to do this currently, but it’s a Microsoft-internal tool (I’m hoping that it will be released soon in a resource kit).
  • As I reported previously, Microsoft is still planning a service pack for Virtual Server 2005 R2 which will go into beta this quarter and to ship in the Autumn of 2006, offering support for Intel virtualization technology (formerly codenamed Vanderpool) and equivalent technology from AMD (codenamed Pacifica) as well as performance improvements for non-Windows guest operating systems.

Overall, I was a little disappointed with yesterday’s event, although part 2 (scheduled for next week) looks to be more relevant to me with sessions on Exchange 12, the Windows Server 2003 security configuration wizard, Monad, Exchange Server 2003 mobility and a Windows Vista overview. Microsoft’s TechNet UK events are normally pretty good – maybe they are just a bit stretched for presenters right now. Let’s just hope that part 2 is better than part 1.

Installing CA eTrust EZAntivirus on Windows Vista


CA eTrust EZAntivirus

My usual anti-virus software (Symantec AntiVirus 8 Corporate Edition) does not seem to install on Windows Vista – which is not really a problem as Vista is still in beta and so the PC will be rebuilt every few months anyway, leaving me free to use a trial version of something else. I found that CA is offering Microsoft customers a 1-year trial of the eTrust EZAntivirus product, free of charge, so I downloaded and installed that on Windows Vista (December CTP: build 5270). Installing this was not as easy as I expected – initial attempts to install failed part way through with the following message (even though I was logged in as Administrator):

Setup Error

Setup failed to copy necessary system files. Please make sure you have administrator permissions.

I eventually kicked the installation into life by running the installer in compatibility mode for Windows XP Service Pack 2 (for reference, my EZAntivirus product version is 7.0.8.1 with engine 11.9.1 and virus signature 9633).
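For reference, compatibility mode doesn’t have to be set via the file’s Properties dialog – it can also be applied per-session from a command prompt by setting the __COMPAT_LAYER environment variable before launching the installer. A rough sketch (Windows cmd syntax; WinXPSp2 is the layer name I believe corresponds to the XP SP2 setting, but check it against your build before relying on it):

```shell
rem Hedged sketch: apply the Windows XP SP2 compatibility layer to
rem everything launched from this cmd.exe session, then run setup.
set __COMPAT_LAYER=WinXPSp2
setup.exe
```

The variable only affects processes started from that command prompt, so there’s no need to undo anything afterwards – just close the window.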

Previously I’ve had problems getting the Microsoft Windows AntiSpyware beta to load on Vista but I’m pleased to see that the December CTP includes Windows Defender so I’m already covered.

Now that I’ve got all the requisite IT prophylactics in place, it should be safe to go online…

Installing the Windows Vista December CTP (build 5270)

This content is 20 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

For a few days now, I’ve been struggling to get the December community technical preview (CTP) of Windows Vista installed on my notebook PC. I downloaded the DVD .ISO from Microsoft (twice, just to be sure my copy wasn’t in some way corrupt) and it booted fine, but setup.exe kept crashing – sometimes just before product key entry, sometimes just after (in any case it wouldn’t accept even a valid product key), with a variety of memory errors which reminded me of the old Windows 3.x unexpected application errors (UAEs).

I tried the raw disk workaround in the release notes (it’s great being able to access tools like diskpart during an installation) but it made no difference to my setup crashes so I submitted a bug report, but so far have heard nothing back. I couldn’t believe that I was alone with this problem and googling wasn’t doing much for me until I found HazardHawk’s reply to a post on Planet AMD 64:

“Build 5270 will not install if you download it and burn the ISO to disk no matter what program you use and the only way I have managed to get it running was to reinstall XP … from scratch, then install daemon tools and load the ISO to a virtual DVD”.

I’m using the 32-bit release (no 64-bit hardware here yet) and I didn’t use Daemon Tools, but I did use the Microsoft Virtual CD Control Panel to mount the ISO (it doesn’t matter that it’s a DVD ISO and the application is a virtual CD driver), after which I was able to run the installer from within Windows XP by just launching setup.exe.

The option to upgrade from Windows XP Professional was disabled, but that didn’t stop me from installing on the same disk partition – the Windows Vista installer moved my existing \Windows folder to \Windows.old. Like the previous builds I’ve tried (5219 and 5231), installation took a long time (just under 2 hours on my PC, which admittedly only has a 1.4GHz Pentium 4 M processor and 256MB of RAM), in this case even producing an interesting “installation is taking longer than expected, but should be finishing soon” message (after about an hour).

I have to agree with HazardHawk that not needing the DVD once the initial reboot has taken place is useful (this setup approach wouldn’t have worked otherwise) but Stretchboy’s follow-up comment about using Nero to burn the ISO to DVD didn’t work for me (that’s what I’d been doing originally).

Now that I have the December CTP installed (which appears to be a huge improvement over earlier builds), I can go back to testing Windows Vista in earnest – I never felt comfortable with using earlier builds for anything other than transient data and it’s difficult to be an effective beta tester if you’re not using a system on a daily basis.

Nerd TV (how to play back MPEG-4 video without using Apple QuickTime Pro)

This content is 20 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

My wife is out tonight, so I’m home alone. I’ve been working pretty hard recently and am very tired so I’m under strict instructions to relax and go to bed early (especially as it’s my turn to get up with our son tomorrow morning… probably at about 5.30am).

The trouble is that I’m also a nerd (as indicated by blogging late at night!) with a geek rating of 40% (this has gone up since I started using Unix) and I have a load of episodes of Nerd TV that I’ve been meaning to watch since it launched last September.

Although the MPEG-4 Nerd TV download is only available at 320×240 resolution, I wanted to watch it scaled to full screen. This was a problem as Apple QuickTime 7 Player only lets me watch it at double size (unless I upgrade to the Pro version) and Microsoft Windows Media Player 10 can’t handle MP4s (Microsoft knowledge base article 316992 has more details).

I tried installing the 3ivX D4 4.51 CODECs to allow MP4 playback in Windows Media Player but playback was too fast (sounded like the Smurfs). The DivX 5.2 CODECs that I had lying around on my external hard disk didn’t work either (and I have a feeling that you have to pay for the latest ones) so I switched to MPlayer on my Solaris box (after first trying the Totem Movie Player, which also failed to play back files with a MIME type of Video/QuickTime).

MPlayer is a really good command line media player for Linux (there are also Solaris and Windows ports available) but I experienced some quality issues when running full screen. Running /opt/asf/bin/mplayer filename -vo x11 -zoom -fs produced the message “Your system is too slow to play this!”, although it did also help out by suggesting various switches to try in order to increase performance.
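For anyone wrestling with the same problem, these are the sort of switches MPlayer suggests for underpowered machines – a sketch only (the filename is made up, and flag support varies between MPlayer builds, so check the man page for your port):

```shell
# Hedged sketch: typical MPlayer options for slow hardware.
# -framedrop  drops video frames rather than letting audio and video drift
# -zoom -fs   software-scale the 320x240 download to full screen
/opt/asf/bin/mplayer nerdtv-episode.mp4 -vo x11 -framedrop -zoom -fs
```

Frame dropping trades smoothness for synchronisation, which is usually the right trade-off for talking-heads content like Nerd TV.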

I didn’t have time to figure out the optimum MPlayer settings so I went back to Windows Media Player with the 3ivX CODECs, thinking I must be able to do something to fix the playback speed. Purely by chance I found out that simply stopping (not pausing) the playback and starting again corrected the speed and gave a perfect playback.

Finally, I remembered that Apple iTunes is built on QuickTime… I wish I’d tried this an hour or so earlier as I found that my MP4s will play in full screen mode within iTunes. Having said that, Windows Media Player 10 with the 3ivX CODECs looks to provide a smoother image when scaled to full screen; however that could just be my eyesight (or my Microsoft-tinted glasses).

So there you go – three methods to play back MP4s at full screen without using QuickTime Pro: Windows Media Player with 3ivX CODECs, MPlayer, or iTunes.

Patching systems shouldn’t be this difficult

This content is 20 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

With tools like the automatic updates client and Microsoft Update, keeping a modern Windows system up-to-date is pretty straightforward.

For those who have a network of computers to manage there are additional tools, like the Microsoft baseline security analyzer (which helps to identify if any patches are missing) and Windows software update services (which keeps a local copy of Microsoft update on one or more servers on a network).

It’s just taken me over two hours to patch a single computer running Sun Solaris 10 x86. Like Microsoft, Sun provides tools that assist enormously in the process, but honestly – two hours! First I had to install the Sun update connection software, then once I’d launched Update Manager, there were 53 updates to download and install (and that was just security patches and driver updates – Sun restricts access to certain patches to organisations with a service plan). After a very long reboot (whilst some of these patches were applied), there were still 15 more updates (probably a subset of the original 53). Then a further reboot (shorter this time), and I was up and running again.
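For completeness, Solaris 10 also exposes much the same workflow from the command line via smpatch – sketched below from memory, so treat it as an outline rather than a recipe (exact subcommand behaviour depends on how Sun Update Connection is registered on the system, and service-plan-restricted patches still won’t be offered):

```shell
# Hedged sketch of the Solaris 10 command-line patch workflow.
smpatch analyze     # work out which patches the system needs
smpatch download    # fetch them from Sun's update servers
smpatch update      # apply the downloaded patches
init 6              # reboot so kernel and driver patches take effect
```

It’s no faster than Update Manager, but it is at least scriptable, which matters once you have more than one Solaris box to keep current.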

In fairness, Windows updates often require restarts and it can take several visits to Microsoft Update before a system is fully patched but this was ridiculous.

Next time someone tells me that patching Windows is too difficult, my response is unlikely to be empathetic.