Two stories of great customer service

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

It’s not often that I receive excellent customer service (a subject on which Guy Kawasaki has written a very interesting post) and when I do, I’ll shout about it. Today I got great service from not just one but two technology companies.

I’ve been thinking about buying an iPod with Video for a while now and, a few months back, I had the opportunity to win one as an incentive for passing the Microsoft Certified Technology Specialist: Live Communications Server 2005 exam. Although I was offered a 30GB iPod, I’d really like to fit my entire iTunes library on the device, so I asked for the 80GB model instead (offering to pay the difference). For various practical reasons that wasn’t going to work out, so I waited until the iPod was given to me and tried to exchange it at an Apple Store. They couldn’t exchange it for me, but they did check the serial number and told me that it had been sold by Amazon. Meanwhile, I bought a protective case from Apple and was very impressed that there was no queuing up to pay – the store assistants could complete the sales process on the shop floor and e-mail me a receipt.

Next, I contacted Amazon to see what they could do to help. In addition to e-mail support, Amazon (UK) has a facility on their website whereby they will call you back so you can talk to a real person, and their customer service staff (based in Ireland – note that they have not outsourced customer service to another continent where English is not a primary language) were really helpful. It seems that I can return a gift to Amazon within 30 days, and they will pay the postage and issue a credit on my account. The theory is that I can return the 30GB iPod and buy an 80GB model using the credit and some more money of my own. All I need to do now is get hold of the original order number and complete the Returns Support Centre wizard on the Amazon website.

Of course, now that I’ve finally got my hands on an iPod with Video, Apple is bound to announce a 6G touchscreen iPod with a large amount of flash storage… oh well, c’est la vie.

The Exchange Server Troubleshooting Assistant (ExTRA)

Microsoft’s Exchange Best Practices Analyzer (ExBPA) has been around for a few years now and it’s an excellent preventative maintenance and troubleshooting resource. ExBPA was recently joined by the Exchange Server Troubleshooting Assistant (ExTRA) which, according to the Microsoft website:

“[…] Programmatically executes a set of troubleshooting steps to identify the root cause of performance, mail flow, and database mounting issues. The tool automatically determines what set of data is required to troubleshoot the identified symptoms and collects configuration data, performance counters, event logs and live tracing information from an Exchange server and other appropriate sources. The tool analyzes each subsystem to determine individual bottlenecks and component failures, then aggregates the information to provide root cause analysis.”

ExTRA v1.1 brings together a number of troubleshooting tools: the Exchange Server Disaster Recovery Analyzer (ExDRA); the Exchange Server Performance Troubleshooting Analyzer (ExPTA); and the Exchange Server Mail Flow Analyzer (ExMFA). Furthermore, ExTRA is integrated into the Exchange Management Console Toolbox in Exchange Server 2007.

Beware of automatic updates and hosted virtual machines

Whilst many organisations will have strict policies regarding patching, others will not, and I’ve lost count of the number of times I’ve found myself troubleshooting strange errors in a virtual machine, only to find that the underlying host operating system has automatically updated itself and is waiting for a restart. Consequently, it’s worth mentioning that automatic updates and hosted virtualisation products (e.g. Microsoft Virtual Server or VMware Server) do not mix well. Of course, those running a non-hosted virtualisation solution (like VMware ESX Server) won’t have this issue, although even ESX needs patching from time to time.
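
For what it’s worth, a quick way to spot a host in this state is to check for the registry key that Windows Update creates when a restart is pending, and to confirm how automatic updates are configured. The following is a minimal sketch using the standard Windows Update registry locations (the policy key will only exist if automatic updates are configured via Group Policy):

  rem The RebootRequired key only exists when Windows Update is waiting for a restart
  reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\RebootRequired"

  rem Check the automatic update policy (AUOptions: 2 = notify, 3 = download and notify, 4 = download and schedule the install)
  reg query "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v AUOptions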

Microsoft’s support policy for software running in a non-Microsoft VM

I’m troubleshooting some problems with my Exchange server at the moment and the ExBPA led me to a knowledge base article about running Exchange Server in a virtualised environment. Whilst reading that, I came across Microsoft knowledge base article 897615, which discusses the support policy for Microsoft software running in non-Microsoft hardware virtualisation software.

I’ll paraphrase it as: “If you have Premier support and you use our virtualisation software, we’ll try to work out what the issue is (we use Virtual Server 2005 R2 for that anyway). If you don’t have Premier support, then you should, and you’ll need to prove that the issue is nothing to do with virtualisation (i.e. by replicating it on physical hardware). If you have a Premier agreement but use another vendor’s virtualisation software, then we’ll try our best, but you’ll probably have to prove that the problem is not caused by the virtualisation software”. The crux of this is the statement that:

“Microsoft does not test or support Microsoft software running in conjunction with non-Microsoft hardware virtualization software.”

This might be worth considering whilst selecting which (if any) virtualisation platform is right for an organisation.

WSUS 3.0 delivers huge improvements for the deployment of Microsoft updates

I’ve been an advocate of Microsoft SUS/WSUS since the v1.0 release. Sure, there are better enterprise software deployment products out there (Microsoft even has one – Systems Management Server) but as a low-cost (free) patch management solution for Windows it’s hard to beat. Since version 2.0 (when Software Update Services became Windows Server Update Services), it will update more than just Windows – WSUS can act as a local cache for all updates that are available through the Microsoft Update servers. Except that now it has been beaten – by WSUS 3.0.

WSUS 3.0 was launched a couple of months ago and I finally installed it this afternoon. Not only does it include some great new features (like e-mail notification, improved reporting and computer management) but it finally gets an MMC administration interface (a huge improvement on the previous web administration interface). There are database changes too – WSUS no longer supports SQL Server 2000/MSDE (after all, those products are shortly to be retired), although it will upgrade an existing database.

The only downside that I can see is that the product still relies on clients connecting to the server and pulling updates (there is no option to push updates to clients – at least not as far as I can see). That’s fine, but it does introduce some latency into the process (i.e. if there is an urgent patch to deploy, then WSUS is probably not the right tool to use); however, for the basic operational task of keeping a Windows infrastructure patched (for Microsoft products) and reporting on the current state, WSUS is definitely worth considering.
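
Having said that, the check-in can at least be nudged along from the client side. As a rough sketch, using the standard Windows Update Agent commands of this era, something like the following run on a client will trigger detection and reporting against whichever WSUS server it has been pointed at via Group Policy:

  rem Ask the Windows Update Agent to check in with its update server now
  wuauclt /detectnow

  rem Send the client's status back to the WSUS server so that the reports are updated
  wuauclt /reportnow

  rem Confirm which WSUS server the client is pointed at
  reg query "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUServer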

Further Information

WSUS 3.0 distributed network improvements (white paper).
WSUS 3.0 usability improvements (white paper).

Crowdsourcing for advice on PC security software

What would you do if you received a message that started like this?

Hi chaps,

In a somewhat strange experiment, you have found yourself BCC’d on this e-mail as the people whose technical and professional opinion I value the most. If that doesn’t feel right to you, perhaps Outlook auto-complete ended up selecting the wrong person from the GAL or my Personal Address Book! ;-)

If your spam filters hadn’t already picked it out, you might stop reading right there – except that this was the start of a message from one of my colleagues, who was experimenting with an alternative method of gathering information: crowdsourcing. The theory is good – after all, why spend hours reading lots of highly subjective reviews of software, probably biased by the vendors’ public relations efforts, when you can ask some trusted colleagues to spend ten minutes telling you what they think (in this case, which anti-virus/anti-spam/personal firewall products they use and why)? For those who are unconvinced by this method of research and say that those ten minutes are valuable and could be spent doing something worthwhile instead, think about this: we’re talking about people who trust one another’s advice – one day that favour will be returned.

In this case, my colleague returned the favour by sharing the information – and allowing me to post it here! What follows is the Garry Martin guide to selecting PC security software:

Anti-virus
Most of you swear by AVG Free, while the rest use “commercial” products (such as those from Symantec, McAfee or Microsoft) that were either free or obtained very cheaply under various special offer programmes. Only two of you appear to have paid retail prices for a product. Whilst there was some anecdotal evidence of issues with different programs, no one strongly warned me away from a particular product or manufacturer.

Anti-spyware
Again, most of you use the free Windows Defender (http://www.microsoft.com/athome/security/spyware/software/default.mspx), while the rest rely on the anti-spyware capability of their “commercial” suite products (Symantec, McAfee etc.). Some of you supplement this real-time scanning with the occasional run of Ad-Aware 2007 Free or the freeware Spybot – Search and Destroy, just to be sure. Many of you have found things that Windows Defender has let through using this method.

Firewall
Most of you are happy with the Windows Firewall built in to Windows XP and Windows Vista. Those of you that use something different do so generally because it is part of your “commercial” suite. Many of you mentioned that you were happy anyway as you were also behind the hardware firewall of your ADSL router.

Content Filtering
Only one of you uses web content filtering. This use is primarily to protect the prying eyes of little ones, and the product used is CyberPatrol.

Others
One notable mention from me is that I also use the freeware CCleaner to clear my tracking cookies on every boot, and through a batch file when required. CCleaner allows you to tag cookies you want to keep, so it is very effective in protecting your privacy. I’m sure it has hundreds of other features, but this is the only one I use it for and it works very well.

So in summary, my personal “crowdsourcing” experiment worked, and worked very well. I didn’t need to research this myself, and hopefully in the process have put together some useful information for all of you. Result. Oh, and hopefully my PC is now at least as secure as your PC is!
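
For anyone who wants to replicate Garry’s batch file approach, here’s a minimal sketch. It assumes CCleaner’s documented /AUTO switch (run silently using the saved settings, then exit) and the default installation path, so adjust to suit your own installation – run it from the Startup folder or as a scheduled task:

  rem ccleaner-on-boot.cmd – clears junk and tracking cookies using the saved CCleaner settings
  rem (path assumes a default installation; /AUTO runs silently and exits)
  "C:\Program Files\CCleaner\CCleaner.exe" /AUTO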

[I was one of the mugs who paid retail prices for a product… although in fairness it was for my wife’s business…]

Garry’s experiment doesn’t have to stop there though – if you have any views on either the crowdsourcing concept or on PC security software, please leave a comment on this post.

Totally protected

Don’t just take photographs – make them!

For a long time now, I’ve been intending to start a photography blog but, as I’ve made the move to a digital workflow, my photography is inevitably becoming more technology-focused and I’ve decided to post the occasional photographic item here (those who are interested in just the photographic items can point their browser/feed reader at the Digital Photography tag).

Mark Wilson and Charlie Waite in 2003

Four years ago, I got to meet one of my photography heroes – Charlie Waite – who gave a very interesting presentation at the Talking Pictures ’03 event in London. Last night, I found my notes from that talk and, whilst they are far from clear now (so I’ve missed out whole chunks that I no longer understand completely), I thought it might be worthwhile posting them here.

Charlie Waite makes the distinction between taking photographs and making photographs – to make a photograph, it is necessary to “place oneself in the midst of the photographic experience”.

When I used to take photos on film, I used to think myself lucky if I got 3-5 images that were good enough to keep from a roll of film. Of course, a professional’s idea of “good enough to keep” would be different to mine (my photos consist of family snapshots, holiday memories and the odd landscape – if I made photographic images for a living then my standards would need to be much higher). It is said that the renowned American photographer Ansel Adams used to reckon on 12 good photographs a year. Indeed, Charlie Waite compared a professional photographer to a top chef who thinks that nothing is ever perfect. The chef’s guests love the meal but he thinks the beans are not quite al dente!

So what makes a good image?

Firstly, being photogenic is nothing to do with good looks – it’s about “letting you in” to the subject. Think about a travel photographer’s image of a wizened old man – he is rarely attractive in terms of beauty and yet there is something interesting about his face, his expression, or the situation. Similarly for landscapes: industrial scenes can make great images, although they would rarely be described as attractive.

Most people have the artistic view that is required to take good pictures. If something is less than ideal, think about compromises: what if it was composed differently? Perhaps change the point of view (Charlie Waite recommends using a ladder to look over the foreground and reveal more focal planes), or try cropping the image (preferably in-camera, not in Photoshop afterwards).

Lighting can be used to create an atmosphere – for example, using side lighting instead of one main light. Personally, I love the warm glow on the landscape from low sunlight at the end of the day – particularly combined with dark clouds after a rainstorm!

To some extent, the camera used is not what makes a great photo – a good photographer will think about the same compositional elements whether they use an expensive medium-format camera or a mobile phone – but the choice of film, filter and camera can still make a big difference. Charlie Waite described the process as “making a sacred image that you are proud of”. As a photographer, the subjective and creative endeavour is all yours – you are the lighting director, the producer, the one responsible for props, etc. – and it’s your role to make it all work.

Charlie Waite summed up his talk by commenting that “landscapes are about engaging with the natural world through photography” and his talk certainly opened my eyes to a new perspective on making photographs.

Apache HTTP server on Windows Server 2008 Server Core

Microsoft’s James O’Neill wrote about how:

“Some bright spark tried running Apache on [Windows Server 2008 Server] Core and having no special Windows dependencies it works.”

I couldn’t find any references to this elsewhere on the ‘net so I had to give it a go – it’s actually really easy:

  1. Install Windows Server 2008 Server Core
  2. Map a network drive, insert a CD or some other media and copy over the Apache HTTP server installer MSI.
  3. Issue the command msiexec /i apache_2.2.4-win32-x86-no_ssl.msi.

    Not surprisingly, the installer is unable to create application shortcuts:

    Apache HTTP Server 2.2 Installer Information

    Warning 1909. Could not create shortcut Apache Online Documentation.lnk. Verify that the destination folder exists and that you can access it.

    Apache HTTP Server 2.2 Installer Information

    Warning 1909. Could not create shortcut Help, I’m stuck!.lnk. Verify that the destination folder exists and that you can access it.

    Presumably, that’s what causes an error dialog with no message and an OK button at the end of the install.

  4. Open up the firewall with netsh firewall set portopening TCP 80 "Apache Web Server".
  5. Point a browser at the server’s IP address and the words “It works!” should be displayed.
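
Incidentally, the MSI registers Apache as a Windows service, so it can be controlled from the same command prompt. As a rough sketch (the service name for a 2.2 installation is assumed to be Apache2.2 – check the output of sc query if in doubt):

  rem Check the state of the Apache service
  sc query Apache2.2

  rem Stop and start the service after making configuration changes (e.g. edits to httpd.conf)
  net stop Apache2.2
  net start Apache2.2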

OK, so Apache running on Windows is no big deal, but if this one cross-platform application runs on Server Core with no modifications, think what else this stripped-down version of Windows could be used for.

Fixing RIS after installing Windows Server 2003 SP2

This may be an isolated incident – I’ve already written about how my Windows Server 2003 SP2 installation appeared to be broken (but was ultimately successful) – but ever since SP2 was installed, I’ve been warned about service startup failures and have been unable to PXE boot to RIS.

It hasn’t bothered me too much – my RIS server is used for XP builds and I rarely need to build XP machines these days – but as there are no fully-featured Windows Vista display drivers for my IBM ThinkPad T40, I wanted to rebuild it on XP today.

It turns out that the problem was trivial: RIS has been replaced in Windows Server 2003 SP2 by Windows Deployment Services (WDS). WDS includes something called Windows Deployment Services Legacy – which looks remarkably like RIS to me (it uses WDS binaries to provide RIS functionality). I fired up the Windows Deployment Services Legacy administrative tool and performed a diagnostic check, after which PXE boots resulted in a successful connection to the OSChooser.
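
For reference, the server’s operational mode can also be checked from the command line. This is a hedged sketch – wdsutil.exe is installed with the WDS update and the following should report whether the server is running in Legacy, Mixed or Native mode, along with the rest of its configuration:

  rem Display the WDS server configuration, including the operational mode (Legacy/Mixed/Native)
  wdsutil /get-server /show:config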

Windows Server 2008 Server Core

Scotty McLeod recently gave a presentation to the Windows Server UK User Group on Windows Server 2008 Server Core. I mentioned Server Core in a previous post but here’s some more on the subject, based on Scotty’s presentation (it’s also worth checking out Michael Pietroforte’s post on Server Core essentials).

  • Contrary to popular belief, Server Core still has a GUI. There is no Start Menu, no Explorer and no Internet Explorer (it is almost entirely command-line driven), but the logon screen is graphical and some GUI applications can be used (the latest beta includes an old version of notepad.exe that has very few dependencies, and rundll32.exe can be used to launch some GUI hooks). It is rumoured that, because some of the product teams didn’t follow Microsoft’s own application development rules, it’s too difficult to remove the GUI from Windows without breaking it completely.
  • At present, the Server Core image is about 600MB – small enough to facilitate some interesting potential deployment scenarios – and, because of that, Server Core installs quickly.
  • The number of supported roles for Server Core is growing quickly – that could be seen as a potential weakness but, even so, the basic principle of providing a reduced attack surface for common server scenarios still holds true. Interestingly, one of the roles (and potentially the most problematic of them all) is as an (IIS) web server – only for ISAPI/ASP applications (i.e. no .NET Framework – yet) but rumour has it that Apache will also run on Server Core, and a cut-down IIS allows the installation of PHP as a Windows alternative to a LAMP web server (this lends itself to an unfortunate acronym though – WIMP: Windows, IIS, MySQL, PHP).
  • Because there is no .NET Framework for Server Core at this time, there is no ability to run PowerShell scripts.
  • After installation, Server Core has a blank administrator password. This must be changed at logon but can be changed to another blank password; however keeping it blank will prevent remote access to the server.
  • Server Core has huge potential, but still seems to be a little disjointed on the administration front (ironic, given what a huge improvement has been made in the full installation through the introduction of the Server Manager tool) – it seems that the recommended approach is to use a full Windows Server 2008 installation as a management server for the various Server Core installations around the enterprise.
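
Until that management story matures, most of the initial configuration is done at the command prompt. The commands below are a rough sketch based on the beta builds available at the time of writing (role names and script locations may change before release, and CORE01 is just a hypothetical computer name):

  rem Set a new administrator password (prompts for the new password)
  net user administrator *

  rem Rename the server and reboot (CORE01 is a hypothetical name)
  netdom renamecomputer %computername% /newname:CORE01 /reboot

  rem Enable Remote Desktop connections for remote administration
  cscript %windir%\system32\scregedit.wsf /ar 0

  rem List the available roles/optional components and install one (e.g. the DNS server role)
  oclist
  start /w ocsetup DNS-Server-Core-Role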