If only all warranty calls were like this…

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A couple of years ago, I had the misfortune to require warranty support from Dell (a frustrating experience). Then, problems with my IBM ThinkPad left me stuck between a 3-year hardware warranty and a 90-day software warranty. Well, thankfully my recent experiences with HP have been considerably better.

Last year I had some warranty repairs carried out on a couple of my notebook PCs – the warranty cover was for a back-to-base repair: a courier arrived from DHL and packaged the computers, then a few days later they were returned with the faulty components replaced.

Then, yesterday, one of my hard disks failed. I checked the warranty status on the Seagate website (one of the reasons that I use Seagate drives is the 5-year warranty) but it wasn’t valid as the component was originally supplied by HP. So, I called HP, who were happy to take my word that a few whirrs and clunks from the disk, followed by nothing (except a system stuck attempting to boot from drive C: ), meant that this device was broken and needed to be replaced. (I did have to explain to an overseas call centre operator that I work for a company with 20,000 employees and couldn’t check every address they had on their system for that company name – but that my home address certainly wouldn’t be among them.) Half an hour later, HP (or one of their agents) called me to check the part number and promised me a replacement within 24 hours.

By 9:00 this morning, I had a package containing a new drive in my hand (even if the courier didn’t know anything about collecting the faulty component) and a few minutes later I had installed it in my system. By lunchtime, everything was up and running again. Then, I found the instructions that told me to package the failed drive in the box used to ship the new replacement and peel off the label, underneath which was a pre-paid returns label. All that was needed then was a call to UPS to arrange collection and a few minutes ago, the same UPS driver returned to collect the package.

Overall, it was a positive experience (as positive as a wrecked hard drive can be) – less than a day of downtime on a standard parts-only warranty. Thank you HP.

Windows Home Server – first impressions, mass storage drivers and clients that won’t connect

In my post about Microsoft’s Vista after hours event, I mentioned Windows Home Server (WHS). Over the weekend, I installed the April CTP of Windows Home Server (build 1371) on a PC at home and I’m pretty impressed.

WHS is based on Windows 2003 Small Business Server and consequently has a pretty solid codebase. In the April CTP, the product’s lineage is very visible, with the title Windows Server 2003 for Small Business Server Setup during text-mode setup, a Windows Server 2003 splash screen and the desktop displaying the version information as:

Windows Server 2003 for Small Business Server Evaluation copy. Build 3790 (Service Pack 2)

I installed the product on an aging Compaq DeskPro D500SFF (Pentium 4 1.5GHz CPU) upgraded to 768MB of RAM (I’m sure 512MB would have been fine but I’d already upgraded it) with a Sony DWG120A DVD±RW dual layer recorder, white-box Serial ATA (SATA) controller (Silicon Image SiI3112A SATALink BIOS v4.2.83) and a Seagate ST3500641AS (500GB SATA) disk.

Rather than reviewing WHS (as other people are better at that than I am – Paul Thurrott has a review of the WHS April CTP and APC has a review of WHS beta 2), I’ll just highlight a couple of issues that it took me a while to resolve:

  • The WinPE 2.0-based installer didn’t recognise my SATA controller (although it did give me a straightforward interface for loading the correct drivers). I know that SATA support in Windows is still patchy, but I would expect a new product to have been updated with current mass storage drivers for a common chipset (ironically, Windows Update pushed some updated drivers after installation)! I downloaded the latest SiI3x12 32-bit Windows base driver (v1.3.67.0, dated 30 March 2007) and, when prompted by the installer, supplied it on a USB key; however, setup failed once it entered text mode (it couldn’t see the USB key), so I tried again using a CD. Again, text-mode setup failed, as it will only accept updated drivers (after pressing F6) from drive A:, so I ran the whole process again, this time using a floppy disk (which felt like a return to the 1990s). Even though GUI-mode and text-mode setup both require their own drivers to be loaded, it seems that they have to come from the same media.
  • I had a few issues with my media (file copy errors), despite downloading the ISO twice (on two different machines) and writing the DVD (using two different drives) at the slowest possible speed. I decided to skip the files that couldn’t be read (mostly non-English language files, but also one hotfix for Microsoft knowledge base article 929644, which is not available publicly). This may have been the cause of a later error – Windows Home Server setup error. Updating Windows Update Redirector failed: cannot complete this function. (error code 0x800703eb) – but after setup subsequently failed, I restarted the computer, after which it resumed installation, updated the Windows Update Redirector and ran the rest of the setup routine with no further issues.
  • When installing the client connector (on a Windows XP SP2 PC), I was unable to connect to my home server. As a product intended for home users, WHS expects all devices to be on the same subnet; however, my home network is split across multiple subnets (I also elected not to use the default server name). The WHS help text refers to this as an advanced network configuration, for which WHS requires a manual connection to be made. Unfortunately, connecting directly via IP address (or name) also failed, informing me that A network error has occurred. Please verify that your network connection is active and that Windows Home Server is powered on. Then I found a very useful troubleshooter for WHS client joins, which let me ascertain that all was well with my server, so I started looking at firewalls. After enabling firewall logging on the WHS network connection, I could see connections being dropped from one of my own subnets. I then edited the firewall exceptions list, changing the scope from my network (subnet) only to a custom list of subnets for the following services (any externally-accessible services were left at their defaults – i.e. HTTP on TCP 80, HTTPS on TCP 443 and Windows Home Server Remote Access on TCP 4125) and successfully joined the client to my WHS:
    • File and printer sharing (TCP 139 and 445, UDP 137-138).
    • HTTP (TCP 88).
    • HTTPS (TCP 444).
    • Remote Desktop (TCP 3389).
    • Windows Home Server Computer Backup (program exception).
    • Windows Home Server Diagnostics (TCP 5247).
    • Windows Home Server Transport Service (TCP 1138).
    • Windows Media Connect (TCP 10243, UDP 10280-10284).
    • UPnP Framework (TCP 2869 and UDP 1900).
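For anyone who prefers to script the scope change rather than work through the Windows Firewall GUI, Windows Server 2003’s netsh firewall context can do it from the WHS console. A hypothetical sketch only – the subnet addresses are examples (substitute your own), and the command needs repeating for each port in the list above:

```shell
# Hypothetical example: open Remote Desktop (TCP 3389) to two specific
# subnets instead of the default "my network (subnet) only" scope
netsh firewall add portopening protocol=TCP port=3389 name="Remote Desktop" mode=ENABLE scope=CUSTOM addresses=192.168.1.0/255.255.255.0,192.168.2.0/255.255.255.0

# Repeat for the other TCP ports (139, 445, 88, 444, 5247, 1138, 10243,
# 2869) and each UDP port (137, 138, 1900, 10280-10284), e.g.:
netsh firewall add portopening protocol=UDP port=1900 name="UPnP Framework" mode=ENABLE scope=CUSTOM addresses=192.168.1.0/255.255.255.0,192.168.2.0/255.255.255.0
```

Program exceptions (such as Windows Home Server Computer Backup) can be scoped similarly with netsh firewall add allowedprogram.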

Despite these problems, I want to stress that WHS is shaping up to be a great product. It is beta software and that means that problems are to be expected (I have filed a few bug reports already, as well as a couple of feature requests – namely that I would like to be able to join WHS servers to a domain and apply group policy and that I would like to be able to access WHS on my own domain name, rather than via a Microsoft-supplied address).

There’s more information about WHS at the Windows Home Server blog.

Corrupt Firefox profile preventing access to the WordPress visual editor

For the last few weeks (ever since one of the all-too-frequent Firefox crashes that I experience) I’ve been unable to use the WordPress visual editor to write my posts. If I switched to another machine then everything was fine – the problem only existed in Firefox on one machine. After seeking help on the WordPress support forums, someone tactfully suggested that I ask for help on the Mozilla forums… I wasn’t hopeful (as this problem seemed to be specific to WordPress); however, the advice I was given was spot-on – it turns out that my issue was a corrupted Firefox profile.

After creating a new profile and copying key settings from my old profile (I copied bookmarks.html, cert8.db, cookies.txt, formhistory.dat, history.dat, hostperm.1, key3.db, mimeTypes.rdf and signons2.txt), I was able to relaunch Firefox and everything was back to the way it should be, complete with browser history, bookmarks, cookies, stored passwords, etc. It should also be possible to copy items such as user preferences, search plugins and extensions, but that’s not recommended if there were problems with the previous profile, so I reinstalled the couple of Firefox add-ons that I do use (the British English Dictionary and Web Developer extensions).
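The salvage step can be scripted. A minimal sketch, assuming hypothetical profile paths (the real ones are listed in profiles.ini, or can be found via Firefox’s -ProfileManager switch – under ~/.mozilla/firefox on Linux, ~/Library/Application Support/Firefox/Profiles on a Mac, or %APPDATA%\Mozilla\Firefox\Profiles on Windows):

```shell
#!/bin/sh
# Hypothetical profile paths -- substitute your own
OLD_PROFILE="$HOME/firefox-profiles/old.default"
NEW_PROFILE="$HOME/firefox-profiles/new.default"

# Copy only the data files; deliberately skip prefs.js, extensions and
# search plugins, which may carry the corruption over to the new profile
for f in bookmarks.html cert8.db cookies.txt formhistory.dat \
         history.dat hostperm.1 key3.db mimeTypes.rdf signons2.txt; do
    if [ -f "$OLD_PROFILE/$f" ]; then
        cp "$OLD_PROFILE/$f" "$NEW_PROFILE/"
    fi
done
```

The if guard means files that don’t exist in the old profile (the list varies slightly between Firefox versions) are simply skipped.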

Mac vs. PC (vs. Linux)

A few months back, I wrote a post about the Mac vs. PC ads (which, funny as they are, as a user of Macintosh, Windows and Linux PCs, I find to be a little misleading at times and downright untruthful at others), before following it up when I heard an amusing Mac vs. PC parody on BBC Radio 4’s The Now Show. It was interesting to hear that Mac Format magazine judged the ads to be ineffective because the largest group of consumers to whom they appeal are already Mac users (although Apple’s continuation of the Get a Mac campaign would suggest that it is working for them). Then, in the comments on my recent post about some of the consumer-targeted features in Windows Vista being just as good as the functionality offered by Mac OS X, I was criticised for saying:

“Apple’s Get a Mac campaign draws on far too many half truths that will only become apparent to users after they have made the decision to switch, splashed out on the (admittedly rather nice) Apple hardware and then found out that the grass is not all green on the other side.”

Regardless of the effectiveness (or honesty) of the original ads, late last night, whilst researching for my rebuttal of those comments, I came across some more Mac vs. PC ads:

I’ve said before that the whole “my operating system is better than your operating system” nonsense is quite ridiculous really but the TrueNuff guys have it all just about summed up:

“Why would you love a Mac? Computers are computers. Macs are great. So are PCs. So are toasters – what’s your point? It’s just a computer – get over it.”

I’m enjoying the spoof ads though!

Recovering data after destroying the Mac OS X partition table

I’m not a religious man but every once in a while I do something stupid and find myself hoping for some divine intervention. Yesterday, I excelled in my stupidity with what was probably the single most careless thing that I have ever done in my entire computing life, accidentally re-initialising my external hard disk (containing, amongst other things, my iTunes library and irreplaceable digital photos of my children) and the backup disk.

In a mild state of panic, I called my friend Alex who gave me two excellent pieces of advice:

  • Do nothing with the corrupted disks. Sit tight. Calm down. Wait and see what turns up from researching similar scenarios on the ‘net.
  • Submit a post to some of the Mac forums (Mac OS X Hints, Apple Discussions, Mac Geekery) and see if anyone can recommend a suitable course of action.

Thank you Alex.

And thank you Stanley Horwitz, debaser626, Tom Larkin and Joe VanZandt for coming back to me with recommendations almost straightaway. Almost everyone suggested running a tool from Prosoft Engineering called Data Rescue II.

In addition to its primary role of recovering lost data on hard disks, this $99 utility (a small price in comparison to professional data recovery fees) has two especially important features: it is non-destructive, as all restoration has to be to another volume; and it can be run in demo mode first to check that data is recoverable before it has to be registered.

A quick scan turned up no recoverable files but a thorough scan was more useful. After a few hours reading 6 billion disk blocks and another couple analysing the data, it found my files. The progress bar gives no indication as to how many files might be recoverable whilst the scan is taking place (presumably because files may be spread across many disk blocks) – but it found my files!

The recovered files were marked as orphans or CBR (whatever that is), and then a whole load of them actually had their original file names and other metadata. After successfully recovering a single file, I bought a licence and set about recovering the entire contents of the disk to another volume. Unfortunately, it hung after failing to read one of the files, but I repeated the operation (this time saving my scan results so that I could exit and relaunch the application if necessary) and successfully restored my digital photos. The relief is immense and I’m presently running a full restoration of the entire disk contents (I imagine that a large part of tomorrow will be spent working out which files I need, and which were actually deliberately deleted files recovered along with the lost data).

Other potentially useful tools, which I didn’t try but which might be useful to others, include:

  • GRC SpinRite – for proactive hard disk maintenance and recovery of data from failing hard disks (disk recovery – not partition recovery).
  • Alsoft DiskWarrior – for damaged directory structures (file system recovery – not partition recovery).
  • SubRosaSoft File Salvage – another partition recovery tool.

Note that I haven’t tried any of these tools myself – I’m simply alerting any poor soul who stumbles across this page to their existence.

I was lucky. Very lucky.

The moral of this story – don’t rely on a single backup that is overwritten nightly and permanently connected to your computer. I really must take more frequent DVD backups of crucial files and store a disk backup offsite.

I should know better.

Get a Mac? Maybe, but Windows Vista offers a more complete package than you might think

I’ll freely admit that I have been critical of Windows Vista at times and I’ll stand by my comments published in Computer Weekly last November – Windows XP will remain in mainstream use for quite some time. Having said that, I can’t see Mac OS X or Linux taking the corporate desktop by storm and the move to Vista is inevitable, just not really a priority for many organisations right now.

Taking off my corporate hat one evening last week, I made the trip to Microsoft’s UK headquarters in Reading for an event entitled “Vista after hours”. Hosted by James Senior and Matt McSpirit, it was a demo-heavy and PowerPoint-light tour of some of the features in Windows Vista that we can make use of when we’re not working. Not being a gamer and having bought a Mac last year, I’ve never really paid attention to Microsoft’s digital home experience but I was, quite frankly, blown away by what I saw.

The first portion of the evening looked at some of the out-of-the-box functionality in Windows Vista, covering topics like search, drilling down by searching within results, using metadata to tag objects, live previews and saving search queries for later recall as well as network diagnosis and repair. Nothing mind-blowing there but well-executed all the same. Other topics covered included the use of:

  • Windows Photo Gallery (which includes support for the major, unprocessed, raw mode formats as well as more common, compressed, JPEG images) to perform simple photo edits and even to restore to the original image (cf. a photographic negative).
  • Windows Movie Maker to produce movies up to 1080p.
  • Windows DVD Maker to produce DVD menus with support for both NTSC and PAL as well as 4:3 and 16:9 aspect ratios.
  • Windows Media Player to organise media in many ways (stack/sort by genre, year, songs, album, artist, rating, recently added, etc.) and share that media.

Apple Macintosh users will think “yeah, I have iPhoto, iMovie, iDVD and iTunes to do all that” and they would be correct, but Apple says (or at least implies in its advertising) that it’s hard to do these things on a PC – with Vista it’s just not… which moves me on to backup: not provided (at least in GUI form) by the current Mac OS X release (only with a .Mac subscription) and much improved in Windows Vista. “Ah yes, but Leopard will include Time Machine!”, say the Mac users – Windows has included the volume shadow copy service (VSS/VSC) since Windows XP, and Windows Backup supports multiple file versions right now, as well as both standard disk-based backups and snapshots to virtual hard disk (.VHD) images, which can then be used as a restore point or mounted in Virtual PC/Virtual Server as a non-bootable disk. Now that does sound good to me and I’m sure there must be a way to make the .VHD bootable for physical to virtual (P2V) and virtual to physical (V2P) migrations… maybe that’s something to have a play with another day.

Regardless of all the new Vista functionality, for me, the most interesting part of the first session was Windows Home Server. I’m a registered beta user for this product but must confess I haven’t got around to installing it yet. Well, I will – in fact I’m downloading the April CTP as I write this. Based on Windows 2003 Small Business Server, it provides a centralised console for management of, and access to, information stored at home. Microsoft claims that it has low hardware requirements – just a large hard disk – although “low hardware requirements” is a subjective term (and I figure that my idea of low and Microsoft’s may differ somewhat). Nevertheless, it offers the opportunity to secure data (home computer backup and restore, including scheduling), provide centralised storage (a single storage pool, broken out as shared storage, PC backups, operating system and free space), monitor network health (i.e. identify unsafe machines on the network), provide remote access (via an HTTPS connection to a defined web address) and stream media, all controlled through a central console. Because the product is aimed at consumers, ease of use will be key to its success and it includes some nice touches like scheduled backups and automatic router configuration for remote access. Each client computer requires a connection pack to allow Home Server to manage it (including associating account information for security purposes) and, in response to one of my questions, Microsoft confirmed that there will be support for non-Windows clients (e.g. Mac OS X 10.5 and even Linux). Unfortunately, product pricing has not yet been released and early indications are that this will be an OEM-only product; that would be a great shame for the many users who would like to put an old PC to use as a home server.

Another area covered in the first session was parental controls – not really something that I worry about right now but maybe I will over the next few years as my children start to use computers. Windows Vista includes the ability for parents to monitor their child’s activities including websites, applications, e-mail, instant messages and media. Web filters can be used to prevent access to certain content with an HTTP 450 response, including a link for a parent to approve and unblock access to the content, as well as time limits on access (providing a warning before forcing a logout). Similarly, certain games can be blocked for younger users of the family PC. The volume and diversity of the questions at the event would indicate that Vista’s parental controls are fairly simplistic and will not be suitable for all (for example, time limits apply to computer access as a whole and not to a particular application, so it’s not possible to allow a child access to the computer to complete their homework whilst limiting games to a certain period in the evening and at weekends).

If session one had whetted my appetite for Vista, session two (Vista: Extended) blew my mind and by the time I went home, I was buzzing…

I first heard of Windows SideShow as a way to access certain content on a secondary display, e.g. to provide information about urgent e-mails and upcoming appointments on the lid of a laptop computer, but it actually offers far more than this – in fact, the potential for SideShow devices is huge. Connectivity can be provided by USB, Wi-Fi or Bluetooth – Windows doesn’t care – and the home automation possibilities are endless. I can really see the day when my fridge includes capabilities for ordering groceries via a SideShow display in the door. There is at least one website devoted to SideShow devices but James Senior demonstrated a laptop bag with a built-in SideShow controller including a cache for media playback. Typically used to expose information from a Windows Sidebar gadget, SideShow devices will wake a sleeping computer to synchronise content and then put it back to sleep, and can be secured with a PIN or even erased at logoff. Access is controlled within the Windows Control Panel and there is an emulator available to simulate SideShow devices.

As elegant as Apple Front Row is, for once Microsoft outshines the competition with Windows Media Center

Next up was Windows Media Center. Unlike with the Windows XP Media Center and Tablet PC editions, Microsoft no longer provides a separate SKU for this functionality, although it is not enabled in all Vista product editions. Media Center is a full-screen application that offers a complete home media hub – sort of like Apple Front Row but with support for TV tuners to provide personal video recorder (PVR) functionality. As elegant as Apple Front Row is, for once Microsoft outshines the competition with Windows Media Center – multiple TV tuners can be installed (e.g. to pause live TV, or to record two items at once), with the electronic programme guide (EPG), controls, etc. displayed as an overlay on the currently playing content. As with Windows Media Player, visualisations are provided and, in theory, it ought to be possible to remote-control a Media Center PC via Windows Home Server and set up a recording remotely. Individual programmes, or whole series, can be recorded and many TV tuners include DVB-T (digital terrestrial, i.e. Freeview) support, with other devices such as satellite and cable TV decoders needing a kludge with a remote infra-red controller (a limitation of Sky/Virgin Media network access rather than of Windows). Other functionality includes RSS support as well as integration with Windows Live Messenger and some basic parental controls (not as extensive as elsewhere in Windows Vista but nevertheless allowing a PIN to be set on certain recordings).

The event was also my first opportunity to look at a Zune. It may be a rather half-hearted attempt at producing a media player (no podcast support and, crucially, no support for Microsoft’s own PlaysForSure initiative) but in terms of form-factor it actually looks pretty good – and it includes functionality that’s missing from current iPods like a radio. If only Apple could produce an iPod with a similarly-sized widescreen display (not the iPhone) then I’d be more than happy. It also seems logical to me that as soon as iTunes is DRM-free then the iTunes/iPod monopoly will be broken as we should be able to use music purchased from the largest online music store (iTunes) on the world’s favourite portable media player (iPod) together with Windows Media Center… anyway, I digress…

I mentioned earlier that I’m not a gamer. Even so, the Xbox 360‘s ability to integrate with Windows PCs is an impressive component of Microsoft’s digital home experience arsenal. With its dashboard interface based around a system of “blades”, the Xbox 360 is more than just a games machine:

As well as the Xbox 360 Core and Xbox 360 Pro (chrome) systems Microsoft has launched the Xbox 360 Elite in the United States – a black version with a 120GB hard disk and HDMI connectivity, although it’s not yet available here in the UK (and there are also some limited edition Yellow Xbox 360s to commemorate the Simpsons movie).

Finally, Microsoft demonstrated Games for Windows – Live, bringing the Xbox 360 Live experience to Windows Vista-based PC gaming. With an Xbox 360 wireless gaming receiver for Windows, Vista PC gamers can even use an Xbox 360 wireless controller (and not just for gaming – James Senior demonstrated using it to navigate Windows Live maps, including the 3D and bird’s eye views). Not all games that are available for both PCs and the Xbox will offer the cross-platform Live experience; however, the first one that will is called Shadowrun (due for release on 1 June 2007), bringing two of the largest gaming platforms together and providing a seamless user experience (marred only by the marketing decision to have two types of account – silver for PC-PC interaction and gold for PC-Xbox).

Apple’s Get a Mac campaign draws on far too many half truths that will only become apparent to users after they have made the decision to switch… and then found out that the grass is not all green on the other side

So, after all this, would I choose a Mac or a Windows PC (or a Linux PC)? Well, like so many comparisons, it’s just not that simple. I love my Mac, but Apple’s Get a Mac campaign draws on far too many half truths that will only become apparent to users after they have made the decision to switch, splashed out on the (admittedly rather nice) Apple hardware and then found out that the grass is not all green on the other side. In addition, Apple’s decision to delay the next release of OS X whilst they try to enter the mobile phone market makes me question how committed to the Macintosh platform they really are. Linux is good for techies and, if you can support yourself, it has the potential to be free of charge. If you do need support though, some Linux distros can be more expensive than Windows. So what about Windows, still dominant and almost universally despised by anyone who realises that there is a choice? Actually, Windows Vista is rather good. It may still have far too much legacy code for my liking (which is bound to affect security and stability) but it’s nowhere near as bad as the competition would have us believe… in fact it hasn’t been bad since everything moved over to the NT codebase and, complicated though the product versions may be, Windows Vista includes alternatives to the iLife suite shipped with new Macs as well as a superior media hub. Add the Xbox integration and Windows SideShow into the mix and the Microsoft digital home experience is excellent. Consumers really shouldn’t write off Windows Vista just yet.

Adding a meaningful description to web pages

One of the things that I noticed whilst reviewing the Google results for this site, was how the description for every page was shown using the first text available on the page – mostly the alternative text for the masthead photo (“Winter market scene from the small town of Porjus in northern Sweden – photograph by Andreas Viklund, edited by Alex Coles.”):

Screenshot showing duplicate descriptions

Clearly, that’s not very descriptive and so it won’t help much with people finding my site, linking to me, and ultimately improving the search engine placement for my pages, so I need to get a decent description listed for each page.

The WordPress documentation includes a page on meta tags in WordPress, including an explanation as to why they aren’t implemented by default (my template did include a meta description for each page, although it was just the weblog title and tagline). Even though meta tags are not a magic solution to search engine placement, I wanted to find a way to add a meaningful description for each page using <meta name="description" content="descriptionofcontent"> and also <meta name="keywords" content="pagecontext"> (although it should be noted that much of the available advice indicates that major search engines ignore the keywords tag due to abuse). Fortunately there is a WordPress plugin which is designed to make those changes – George Notaras’ Add-Meta-Tags. There’s plenty of speculation as to whether or not Google actually uses the description meta tag, but recent advice seems to indicate that it is one of many factors involved in the description shown in search results (although it will not actually affect positioning).

I already had meta tags in place for content-type, robots, and geolocation but I added some more that I was previously using HTML comments for:

<meta http-equiv="content-language" content="en-gb" />
<meta name="author" content="Mark Wilson" />
<meta name="generator" content="WordPress" />
<meta name="publisher" content="markwilson.it" />
<meta name="contact" content="webmaster@markwilson.co.uk" />
<meta name="copyright" content="This work is licenced under the Creative Commons Attribution-Non-Commercial-Share Alike 2.0 UK: England & Wales License. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-sa/2.0/uk/ or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA" />

Incidentally, a comprehensive list of meta tags and an associated FAQ is available at Andrew Daviel’s Vancouver webpages.

After checking back a couple of weeks later, the same search returns something far more useful:

Screenshot showing the improved description

Unfortunately my PageRank has dropped too, and it’s possible that the duplicate entries for http://www.markwilson.co.uk/ and https://www.markwilson.co.uk/blog/ are causing the site to be penalised – Google’s Webmaster guidelines say “don’t create multiple pages, subdomains, or domains with substantially duplicate content”. The presence of those duplicate entries is actually a little odd, as checking the server headers for http://www.markwilson.co.uk/ reveals an HTTP 301 response (moved permanently), redirecting to https://www.markwilson.co.uk/blog/. Of course, it could be down to something entirely different, as PageRank is updated infrequently (there’s more information and links to some PageRank analysis tools at RSS Pieces but I use Page Rank Checker) and there have been a lot of changes to this site of late… only time (and building the volume of backlinks to https://www.markwilson.co.uk/blog/) will tell.
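Checking server headers like this is straightforward with curl. A quick sketch – the URL is this site’s, so substitute your own:

```shell
URL="http://www.markwilson.co.uk/"   # substitute the site you want to check

# Fetch only the response headers (-I) and keep the status line and any
# Location header; a clean permanent redirect shows "301 Moved Permanently"
# followed by the address it points at
curl -sI "$URL" | grep -i -E '^(HTTP/|Location:)' || echo "no response from $URL"
```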

Defragmenting a Mac OS X hard disk

Apple claims that OS X is the world’s most advanced operating system. If that’s the case, then why does it lack basic system utilities? That’s a rhetorical question, but I’ve written before about OS X’s lack of a decent backup utility and today (including most of tonight – hence the time of this post) I fell foul of its inability to defragment hard disks.

“ah… but you don’t need a defragmentation utility with OS X because it automatically defragments as it goes.”

[insert name of just about any Macintosh support forum here]

Wrong.

OS X defragments files, but not the disk itself (for an explanation as to what that really means and as to whether it’s really necessary, refer to Randy B Singer’s Mac OS X maintenance and troubleshooting guide). This inability to perform what should be a basic operating system function (even Windows has the capability) has cost me a lot of time today. In fairness, there is a third party utility available (if I were prepared to pay for it), called iDefrag (Paul Stamatiou has a review of iDefrag on his site) but in the end, I used Mike Bombich’s Carbon Copy Cloner to clone my hard disk to my backup drive, make that bootable, repartition my system disk, and then clone the drive back again – a pretty long winded approach to defragmentation.

Still, every cloud has a silver lining… at least this process led me to discover the Mac OS X maintenance and troubleshooting guide that I referred to earlier… well worth a read.

Passed Microsoft Certified Technology Specialist exam 70-262

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I missed the announcement, but at some stage in recent years, Microsoft revamped its IT Professional certification scheme. It seems as though I still qualify as a Microsoft Certified Systems Engineer (MCSE) for both the NT 4.0 and Windows 2000 tracks; although I never did get around to upgrading my MCSE to Windows XP and Server 2003… maybe I’ll follow the Vista and Longhorn Server track when it’s released.

Anyway, earlier today I passed the Microsoft Office Live Communications Server 2005 – Implementing, Managing, and Troubleshooting exam (70-262), making me a Microsoft Certified Technology Specialist (MCTS): Microsoft Office Live Communications Server 2005.

I guess that’s just like an MCP in the old days but it’s another logo to display on the IT Services page. Actually, the real reason I did it was that I was incentivised by the prospect of a free iPod from my employer if I was one of the first three people to take (and pass) the test by a particular date!

This was the first Microsoft exam that I’ve taken for a while and Microsoft’s non-disclosure agreement prevents me from saying too much about it but as I took Monday off work, spent all day Tuesday (and Thursday evening) at Microsoft events and had to do some real work too, it’s been a challenge to cram in all of my revision… hence the lack of blog posts this week. I plan to make up for that after the long weekend (when I finally get around to writing up my notes from the Microsoft Management Summit and Vista After Hours events)… watch this space.

Planning and deploying Microsoft Office SharePoint Server 2007

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

It’s been a few months since I attended a Microsoft event but last night I made the trip to Reading for a session on planning and deploying Microsoft Office SharePoint Server. Hosted by a vendor (rather than one of the IT Professional technical evangelist team), I was initially unsure of how useful the event would be but Steve Smith (an MVP who part-owns consultancy Combined Knowledge and is very active in the UK SharePoint User Group) was an extremely knowledgeable and engaging speaker. Furthermore, he took the time during the mid-session break to answer each of my (many) questions, for which I’m extremely grateful! What follows, is a summary of the content from last night’s event combined with links and additional information from my own research.

Firstly, it’s worth clarifying that SharePoint is a branding for a group of products and technologies and the two major product versions are:

  • Windows SharePoint Services (WSS) 3.0.
  • Microsoft Office SharePoint Server (MOSS) 2007.

It is important to note that WSS is a free of charge download for licensed Windows Server users whereas MOSS requires separate licenses (server and client access). It should also be noted that MOSS replaces the previous SharePoint Portal Server (SPS) 2003 product and the change of name reflects that SharePoint is more than a portal – it’s a collaboration platform.

WSS integrates with Windows Server (2003, at the time of writing there are still some issues with the beta for Windows Server codenamed Longhorn), Internet Information Services (IIS) and the Microsoft .NET Framework 3.0 (including the Windows Workflow Foundation) to provide a collaboration platform, using SQL Server 2000 or 2005 as its database. That’s a lot of dependencies and a lot of variables in the choice of server configuration; however it’s worth noting that a separate database server is recommended (more on that in a moment) and using SQL Server 2005 will provide performance gains over SQL Server 2000. WSS provides the ability to open, add, create and check in/out documents for collaboration at a departmental level; however it is not a document management solution. It provides some foundation services (storage, security, management, deployment, a site model and extensibility) for a collaborative solution.

MOSS builds on WSS (indeed, the MOSS installation includes WSS and will overwrite an existing WSS installation) to provide shared services for portal services, enterprise content management (formerly provided by Content Management Server 2002), enterprise search and indexing and business intelligence and forms (described as “a window on business systems”). What Microsoft’s marketing materials do not highlight, is that MOSS can also provide a front end to enterprise document and records management (EDRM) solutions such as those provided by Meridio, (EMC) Documentum and Open Text.

In designing MOSS, Microsoft attempted to address a number of customer pain points that existed in SPS:

  • Poor resource utilisation and isolation.
  • Inconsistent setup.
  • Network support.
  • Difficult central administration.
  • Topology restrictions.
  • Upgrades.

Many of these have been addressed (for example, unlike with SPS, it’s simple to add another server to an existing infrastructure); however upgrades are still not as simple as they could be and were referred to anecdotally as being the most common reason for an incident to be logged with Microsoft Product Support Services (PSS) at the moment.

The WSS/MOSS administration design goals were:

  • Simplicity – easy setup using an existing SQL Server or installing a copy of SQL Server 2005 Express Edition.
  • Extensibility – a single object model so that moving from WSS to MOSS does not break SharePoint applications.
  • Consistency – no more “jumps” from WSS team sites to portal pages.
  • Resource optimisation – the ability to scale out by dedicating servers to specific tasks, e.g. indexing.
  • Delegation – the ability to delegate control over parts of the infrastructure to particular groups of users.

Steve Smith compared the changes between SPS 2003 and MOSS 2007 with the period when another Microsoft product – Exchange Server – reached maturity in the late 1990s; it was not until the release of Exchange Server 5 (which was actually the second product version) that it began to build market presence and by version 5.5 it was arguably the de facto product for building a corporate messaging platform. Microsoft is hoping (and business interest is indicating) that MOSS 2007 could mark a similar turning point for SharePoint; however it seems likely that many organisations will experience some difficulties as a consequence of poor design decisions made when they originally created their SharePoint superstructure – it’s worth getting specialist advice from the outset.

Notice the term superstructure – that’s not one that was used at the event I attended but I was introduced to the term a few weeks back by my colleague Andy May and it seems appropriate for enterprise-wide applications that sit above the basic server and network infrastructure and provide services for true business applications – examples would be Exchange Server (messaging) and SharePoint (collaboration). Carlo Pescio described the semantic differences between infra- and super-structures in a recent blog post.

Many organisations will experience some difficulties as a consequence of poor design decisions… it’s worth getting specialist advice from the outset.

The need to plan ahead begins with the initial setup where there is a choice between a basic or an advanced installation. Most administrators who intend to try out SharePoint with a view to adapting the topology later as the organisation builds its knowledge and use of the product could be expected to elect a basic installation but unfortunately, a basic installation uses SQL Server 2005 Express Edition as a local database server and cannot be scaled out. The alternative is to select an advanced installation, where there is a choice of complete (which actually allows the selection of services as required), web front end (assumes that complete installations exist elsewhere in a web farm) or standalone (as for the basic installation). In most cases a complete install will be the most appropriate selection; however that does require an existing SQL Server (either locally, or on another server). After determining the file location and electing whether or not to join the customer improvement programme, setup copies the binaries to the chosen location before launching a separate wizard to configure services.

Another design item is the concept of a server farm. A SharePoint server farm shares a single configuration database – that means that a fast network (<80ms latency: i.e. not geo-replicated) is required between the SharePoint servers and the database server(s). Microsoft also recommends that one domain controller should be provided for every three front-end SharePoint servers (and that doesn’t include any load on the DCs from other applications).
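As a back-of-the-envelope illustration of that sizing rule, here’s a sketch (the function name is my own, not anything from the event) that applies the one-domain-controller-per-three-front-ends recommendation:

```python
import math

def recommended_domain_controllers(front_end_servers):
    """Apply the rule of thumb quoted above: one domain controller per
    three front-end SharePoint servers, assuming the DCs carry no load
    from other applications. Always provision at least one DC."""
    return max(1, math.ceil(front_end_servers / 3))

# e.g. a medium farm with five front-end web servers:
print(recommended_domain_controllers(5))  # 2
```

In practice the DC count would also be driven by authentication load from other applications, which is exactly why the recommendation excludes them.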

SharePoint configuration needs to know whether a new farm is to be created or if the server is to join an existing farm. Advanced settings include options as to whether or not the server should host the administration website. Errors at this stage of setup generally relate to permissions with the SQL Server service account, which needs to be a local Administrator. I have to ask if software developers will ever learn to provide a list of rights for delegation in place of saying “make it an administrator” but if Microsoft don’t even follow that approach on their own server operating system then what chance is there for third party application providers?

SharePoint administration is provided through a web interface (over a dedicated port), or from the command line on the server (using the stsadm command). In the case of web administration, there is a three-tier model employed with tasks delineated based on roles, allowing for controlled delegation and secure isolation:

  • Central administration – this is where the IT department is most likely to retain control, for farm-level resource management and status. Aiming to reduce administration time through provision of a single point of administration with a consistent (and extensible) user interface for all SharePoint products, the central administration console provides:
    • Administrative task list – informing operators of tasks for action, including links to the appropriate user interface.
    • Home page topology view – a quick view of the servers in a farm and what is running on each one.
    • Services on a server page – for management of components running on a single server.
    • Flat menu structure – operations and application management with only those options available to the current user displayed.
    • Remote administration – web based administration interface and scheduled system updates.
  • Shared services – this (MOSS-only) level may be managed by whoever is responsible for IT within a business unit; determining the services that team sites can consume. The shared service goal is to separate services from portals and remove scaling limitations around the number of portals. Shared services act as a group, providing a logical and secure partition of the server farm and are required for site and cross-site level Office Server features. Shared services components are the shared service administration website and associated databases, providing:
    • Search.
    • Directory import.
    • User profiles.
    • Audiences.
    • Targeting.
    • Business data cataloguing.
    • Excel calculation services.
    • Usage reporting.
  • Site settings – management of a site or site collection within a hierarchy, e.g. a portal or a team site. Rights can be delegated on individual sites so a business user could have total (or partial) control over a tiny part of the overall SharePoint superstructure, without impacting on any other sites. It may sound counter-intuitive for an IT administrator to delegate control to business users but that’s often the best approach for administration at the site level.

One major change between SPS and WSS/MOSS is that there is no longer any requirement to create a site in IIS and then tell SharePoint to use the site. With the current SharePoint products, all management is performed though the SharePoint administration tools (with one exception – assigning certificates to SSL-secured sites, which is still done by IIS). SharePoint-aware IIS websites are no longer called virtual servers (server virtualisation has brought an entirely different meaning to that term) but are instead known as web applications.

Shared services are one of the key design elements for MOSS implementation. It is possible to define multiple shared service providers; however each is completely isolated from the other. This may be viewed as a limitation; however it is potentially useful (e.g. in an application service provider scenario, or for providing complete separation of one department’s collaborative web application from the rest of the enterprise for political or organisational reasons). Web applications can be re-associated with another shared service provider (e.g. to consume a new set of services) but they cannot consume services from more than one provider (with the exception of My Sites – through the concept of a trusted My Site location). Content that is “marooned” in another shared service provider needs to be recreated, or migrated using stsadm at the command line. The majority of SharePoint superstructures will use a single shared service provider.

Another key design element is the definition of the hierarchy for the site structure. It is not normally appropriate for an IT department to define a structure by simply following an organisation chart and some business analysis is required to determine how the business actually functions (cross-group collaboration, etc.).

Despite expecting SQL service accounts to be administrators (!), Microsoft also suggests some best practices from a security perspective:

  • Use unique accounts for centralised administration, managing servers in the farm and service accounts – i.e. do not use generic administration accounts!
  • Enable Kerberos – not only is it viewed as more secure but it is faster than NTLM.
  • Enable SSL on sites (set within SharePoint but certificates are assigned within IIS).
  • Consider the management of the SPAdmin service – it requires access to various items within SharePoint but is a service account; therefore consider password resets and the level of access required on individual WSS/MOSS servers (stsadm can be used to globally reset passwords across all application pools as detailed in Microsoft knowledge base article 934838).

In terms of physical architecture, there is a balance to be struck between availability and resilience – the main options (in order of increasing availability and performance) are:

  • Single server – potentially supporting many users but also a single point of failure. Serves content (sites), shared services, administration and all databases.
  • Small server farm (e.g. a single database server and one or two load-balanced SharePoint servers) – better resilience; however still reliant on a single database server.
  • Medium server farm (e.g. clustered SQL servers and SharePoint roles broken out onto multiple servers for front end web access and a middle tier for shared service provision, e.g. indexing). This solution potentially provides the best balance between performance, resilience and cost.
  • Large server farm – many dedicated servers for individual SharePoint roles providing a scalable solution for a global enterprise (but probably overengineering the solution for many organisations).

Due to the network requirements discussed previously, server farms need to be centralised (the user experience for remote users may be improved using hardware accelerators to cache content across the WAN). Other considerations for improving the user experience include keeping the front page uncluttered to improve the time it takes to render, and provision of additional front-end web servers to render pages quickly and increase throughput to the back-end shared service and SQL servers. If SharePoint is to become the point of access for all information within a business then it will quickly be viewed as critical and some thought should be given to the location of various shared services. Load balancing across front end servers can be achieved using Windows Server network load balancing (NLB) or a hardware-based load-balancing solution – Steve Smith demonstrated using NLB at last night’s event; however it’s also worth checking out Joel Oleson’s NLB and SharePoint configuration and troubleshooting tips. It’s also worth noting that SharePoint automatically handles load balancing of application roles (where configured – clearly it won’t load balance a role if it only exists on a single server – something to think about when considering placement of the centralised administration role in a small or medium server farm) – a separate load balancing solution is only required for client access to front-end servers.

If it’s proving difficult to justify the cost of additional web servers, then some basic performance analysis can be undertaken using Microsoft’s web application stress tool (linked from Microsoft knowledge base article 231282) which can then be used to demonstrate the point at which user performance is likely to be impacted. Performance can also be improved by caching data (pages, graphics, etc.) on a per-site basis.

One potential method of scaling up rather than out, is to use 64-bit versions of Windows Server 2003, SQL Server 2005 and SharePoint; however it’s worth considering that IFilters (which are used to index non-native file formats) may only be available as 32-bit versions and that may limit the options for 64-bit deployments.

When thinking about other SharePoint roles, it’s worth considering that although individual roles can be started/stopped on SharePoint servers as required, certain roles have additional configuration items to be provided at startup and it’s better to plan the workload accordingly.

With regards to indexing, indexes can become large (10-15% of the size of the content that is being indexed); therefore the default location on the system drive is probably not ideal. Also, only one index server is allowed within the farm; however if a separate query server is created, this will hold a copy of the index (albeit not necessarily the latest version) avoiding the creation of a single point of failure.
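The 10-15% figure above lends itself to a quick capacity estimate. A rough sketch (the helper name is my own, purely for illustration) when sizing a non-system drive for the index:

```python
def estimated_index_size_gb(content_gb, low=0.10, high=0.15):
    """Estimate the index footprint as 10-15% of the content being
    indexed (the range quoted above), returning a (low, high) pair
    in GB, rounded to one decimal place."""
    return round(content_gb * low, 1), round(content_gb * high, 1)

# e.g. 200 GB of indexed content:
low_gb, high_gb = estimated_index_size_gb(200)
print(low_gb, high_gb)  # 20.0 30.0
```

So a farm indexing 200 GB of content should budget roughly 20-30 GB for the index, plus headroom for growth.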

To help with management of a SharePoint superstructure, Microsoft Operations Manager (MOM) 2005 management packs exist for both WSS and MOSS; however it’s also worth considering other systems management elements as SharePoint has its own security threats against which to mitigate:

  • A SharePoint-aware anti-virus product is required to interface with the SharePoint object model (e.g. Microsoft Forefront Security for SharePoint).
  • Some additional content filtering (e.g. using ISA Server) may be required to prevent content from circumventing SharePoint’s simple protection which is based upon file-extension and size limits.

ISA Server can potentially be used to bring other benefits to a SharePoint infrastructure, for example, whilst SharePoint does provide for extranet access, it may be appropriate to let ISA Server handle the security and caching elements of the connection and then pass simple (and fast) HTTP requests back to SharePoint. This is particularly convenient in a complex combined intranet and extranet scenario, where the need to access Active Directory for MySites and personalisation can cause issues around forms based authentication.

One point I’ve not mentioned yet is the change of name from Microsoft SharePoint Portal Server to Microsoft Office SharePoint Server 2007. Leaving aside the decision to drop portal from the name, the Microsoft Office part is significant because of the high level of integration between Microsoft Office and the SharePoint products and technologies; however it is worth noting that MOSS 2007 is not reliant on the 2007 Microsoft Office system although use of the latest products will allow the most complete user experience (Microsoft has published a fair, good, better, best white paper for Microsoft Office programs and SharePoint products and technologies).

The key message for me at last night’s presentation was that SharePoint needs to be planned in detail and that some outside help will probably be required. As yet, there is no prescriptive guidance from Microsoft (although this is rumoured to be in production – for details watch the Microsoft SharePoint products and technologies team blog which, somewhat curiously but in common with all the other Microsoft blogs is hosted using Community Server and not SharePoint!) so it’s worth consulting with those who have done it before – either via various Internet resources linked throughout this post or by engaging with one of Microsoft’s authorised solution provider partners (and yes, I do work for one of them so there is a potential conflict of interest there but the views, thoughts and opinions expressed in this blog are purely personal).

One final area of interest for me, which I have not seen covered anywhere, is the SharePoint product roadmap. I can’t get anyone at Microsoft to comment on this (not even under NDA) but I understand that WSS3 will ship within Windows Server codenamed Longhorn and there are no new versions planned for the foreseeable future.

Further information