Recovering data after destroying the Mac OS X partition table

I’m not a religious man but every once in a while I do something stupid and find myself hoping for some divine intervention. Yesterday, I excelled in my stupidity with what was probably the single most careless thing I have ever done in my entire computing life: I accidentally re-initialised my external hard disk (containing, amongst other things, my iTunes library and irreplaceable digital photos of my children) and the backup disk as well.

In a mild state of panic, I called my friend Alex who gave me two excellent pieces of advice:

  • Do nothing with the corrupted disks. Sit tight. Calm down. Wait and see what turns up from researching similar scenarios on the ‘net.
  • Submit a post to some of the Mac forums (Mac OS X Hints, Apple Discussions, Mac Geekery) and see if anyone can recommend a suitable course of action.

Thank you Alex.

And thank you Stanley Horwitz, debaser626, Tom Larkin and Joe VanZandt for coming back to me with recommendations almost straightaway. Almost everyone suggested running a tool from Prosoft Engineering called Data Rescue II.

In addition to its primary role of recovering lost data from hard disks, this $99 utility (a small price in comparison to professional data recovery fees) has two especially important features: it is non-destructive, as all restoration has to be to another volume; and it can be run in demo mode first to check that data is recoverable before it has to be registered.

A quick scan turned up no recoverable files but a thorough scan was more useful. Unfortunately, the progress bar gives no indication of how many files might be recoverable whilst the scan is taking place (presumably because files may be spread across many disk blocks), so it was a nervous wait; but after a few hours reading 6 billion disk blocks, and another couple analysing the data, it found my files!

The recovered files were marked as orphans, CBR (whatever that is) and then a whole load of them actually had their original file names and other metadata. After successfully recovering a single file, I bought a license and set about recovering the entire contents of the disk to another volume. Unfortunately it hung after failing to read one of the files, but I repeated the operation (this time saving my scan results so that I could exit and relaunch the application if necessary) and successfully restored my digital photos. The relief is immense and I’m presently running a full restoration of the entire disk contents (I imagine that a large part of tomorrow will be spent working out which files I need, and which are deliberately deleted files recovered along with the lost data).

Other tools, which I didn’t try but which might be useful to others, include:

  • GRC Spinrite – for proactive hard disk maintenance and recovery of data from failing hard disks (disk recovery – not partition recovery)
  • Alsoft DiskWarrior – for damaged directory structures (file system recovery – not partition recovery).
  • SubRosaSoft File Salvage – another partition recovery tool.

Note that I haven’t tried any of these tools myself – I’m simply alerting any poor soul who stumbles across this page to their existence.

I was lucky. Very lucky.

The moral of this story – don’t rely on a single backup that is overwritten nightly and permanently connected to your computer. I really must take more frequent DVD backups of crucial files and store a disk backup offsite.

I should know better.

Get a Mac? Maybe, but Windows Vista offers a more complete package than you might think

I’ll freely admit that I have been critical of Windows Vista at times and I’ll stand by my comments published in Computer Weekly last November – Windows XP will remain in mainstream use for quite some time. Having said that, I can’t see Mac OS X or Linux taking the corporate desktop by storm and the move to Vista is inevitable, just not really a priority for many organisations right now.

Taking off my corporate hat one evening last week, I made the trip to Microsoft’s UK headquarters in Reading for an event entitled “Vista after hours”. Hosted by James Senior and Matt McSpirit, it was a demo-heavy and PowerPoint-light tour of some of the features in Windows Vista that we can make use of when we’re not working. Not being a gamer and having bought a Mac last year, I’ve never really paid attention to Microsoft’s digital home experience but I was, quite frankly, blown away by what I saw.

The first portion of the evening looked at some of the out-of-the-box functionality in Windows Vista, covering topics like search, drilling down by searching within results, using metadata to tag objects, live previews and saving search queries for later recall as well as network diagnosis and repair. Nothing mind-blowing there but well-executed all the same. Other topics covered included the use of:

  • Windows Photo Gallery (which includes support for the major unprocessed raw formats as well as the more common compressed JPEG images) to perform simple photo edits and even to revert to the original image (cf. a photographic negative).
  • Windows Movie Maker to produce movies up to 1080p.
  • Windows DVD Maker to produce DVD menus with support for both NTSC and PAL as well as 4:3 and 16:9 aspect ratios.
  • Windows Media Player to organise media in many ways (stack/sort by genre, year, songs, album, artist, rating, recently added, etc.) and share that media.

Apple Macintosh users will think “yeah, I have iPhoto, iMovie, iDVD and iTunes to do all that” and they would be correct, but Apple says (or at least implies in its advertising) that it’s hard to do these things on a PC – with Vista it’s just not… which moves me on to backup – not provided (at least in GUI form) by the current Mac OS X release (except with a .Mac subscription) and much improved in Windows Vista. “Ah yes, but Leopard will include Time Machine!”, say the Mac users – but Windows has included the volume shadow copy service (VSS/VSC) since Windows XP, and Windows Backup now supports multiple file versions as well as both standard disk-based backups and snapshots to virtual hard disk (.VHD) images, which can then be used as a restore point or mounted in Virtual PC/Virtual Server as a non-bootable disk. Now that does sound good to me and I’m sure there must be a way to make the .VHD bootable for physical to virtual (P2V) and virtual to physical (V2P) migrations… maybe that’s something to have a play with another day.

Regardless of all the new Vista functionality, for me, the most interesting part of the first session was Windows Home Server. I’m a registered beta user for this product but must confess I haven’t got around to installing it yet. Well, I will – in fact I’m downloading the April CTP as I write this. Based on Windows Small Business Server 2003, it provides a centralised console for management of and access to information stored at home. Microsoft claim that it has low hardware requirements – just a large hard disk – although low hardware requirements is a subjective term (and I figure that my idea of low hardware requirements and Microsoft’s may differ somewhat). Nevertheless, it offers the opportunity to secure data (home computer backup and restore, including scheduling), provide centralised storage (a single storage pool, broken out as shared storage, PC backups, operating system and free space), monitor network health (i.e. identify unsafe machines on the network), provide remote access (via an HTTPS connection to a defined web address) and stream media, all controlled through a central console. Because the product is aimed at consumers, ease of use will be key to its success and it includes some nice touches like scheduled backups and automatic router configuration for remote access. Each client computer requires a connection pack in order to allow Home Server to manage it (including associating account information for security purposes) and, in response to one of my questions, Microsoft confirmed that there will be support for non-Windows clients (e.g. Mac OS X 10.5 and even Linux). Unfortunately, product pricing has not yet been released and early indications are that this will be an OEM-only product; that would be a great shame for many users who would like to put an old PC to use as a home server.

Another area covered in the first session was parental controls – not really something that I worry about right now but maybe I will over the next few years as my children start to use computers. Windows Vista includes the ability for parents to monitor their child’s activities including websites, applications, e-mail, instant messages and media. Web filters can be used to prevent access to certain content (returning an HTTP 450 response, including a link for a parent to approve and unblock access to the content) and time limits can be applied to computer access (providing a warning before forcing a logout). Similarly, certain games can be blocked for younger users of the family PC. The volume and diversity of the questions at the event would indicate that Vista’s parental controls are fairly simplistic and will not be suitable for all (for example, time limits apply to computer access as a whole and not to a particular application, so it’s not possible to allow a child access to the computer to complete their homework but to limit games to a certain period in the evening and at weekends).

If session one had whetted my appetite for Vista, session two (Vista: Extended) blew my mind and by the time I went home, I was buzzing…

I first heard of Windows SideShow as a way to access certain content with a secondary display, e.g. to provide information about urgent e-mails and upcoming appointments on the lid of a laptop computer, but it actually offers far more than this – in fact, the potential for SideShow devices is huge. Connectivity can be provided by USB, Wi-Fi, Bluetooth – Windows doesn’t care – and the home automation possibilities are endless. I can really see the day when my fridge includes capabilities for ordering groceries via a SideShow display in the door. There is at least one website devoted to SideShow devices and James Senior demonstrated a laptop bag with a built-in SideShow controller including a cache for media playback. Typically used to expose information from a Windows Sidebar gadget, SideShow devices will wake up a sleeping computer to synchronise content then put it back to sleep and can be secured with a PIN or even erased when logged off. Access is controlled within the Windows Control Panel and there is an emulator available to simulate SideShow devices.

As elegant as Apple Front Row is, for once Microsoft outshines the competition with Windows Media Center

Next up was Windows Media Center. Unlike the Windows XP Media Center and Tablet PC editions, Microsoft no longer provides a separate SKU for this functionality, although it is not enabled in all Vista product editions. Media Center is a full-screen application that offers a complete home media hub – sort of like Apple Front Row but with support for TV tuners to include personal video recorder (PVR) functionality. As elegant as Apple Front Row is, for once Microsoft outshines the competition with Windows Media Center – multiple TV tuners can be installed (e.g. to pause live TV, or to record two items at once), with the electronic programme guide (EPG), controls, etc. displayed as an overlay on the currently playing content. As with Windows Media Player, visualisations are provided and in theory it ought to be possible to remote control a Media Center PC via Windows Home Server and set up a recording remotely. Individual programmes, or whole series, can be recorded and many TV tuners include DVB-T (digital terrestrial) support (i.e. Freeview), with other devices such as satellite and cable TV decoders needing a kludge with a remote infra-red controller (a limitation of Sky/Virgin Media network access rather than of Windows). Other functionality includes RSS support as well as integration with Windows Live Messenger and some basic parental controls (not as extensive as elsewhere in Windows Vista but nevertheless allowing a PIN to be set on certain recordings).

The event was also my first opportunity to look at a Zune. It may be a rather half-hearted attempt at producing a media player (no podcast support and, crucially, no support for Microsoft’s own PlaysForSure initiative) but in terms of form-factor it actually looks pretty good – and it includes functionality that’s missing from current iPods like a radio. If only Apple could produce an iPod with a similarly-sized widescreen display (not the iPhone) then I’d be more than happy. It also seems logical to me that as soon as iTunes is DRM-free then the iTunes/iPod monopoly will be broken as we should be able to use music purchased from the largest online music store (iTunes) on the world’s favourite portable media player (iPod) together with Windows Media Center… anyway, I digress…

I mentioned earlier that I’m not a gamer. Even so, the Xbox 360’s ability to integrate with Windows PCs is an impressive component of Microsoft’s digital home experience arsenal. With its dashboard interface based around a system of “blades”, the Xbox 360 is more than just a games machine.

As well as the Xbox 360 Core and Xbox 360 Pro (chrome) systems, Microsoft has launched the Xbox 360 Elite in the United States – a black version with a 120GB hard disk and HDMI connectivity – although it’s not yet available here in the UK (and there are also some limited edition yellow Xbox 360s to commemorate the Simpsons movie).

Finally, Microsoft demonstrated Games for Windows Live, bringing the Xbox 360 Live experience to Windows Vista-based PC gaming. With an Xbox 360 wireless gaming receiver for Windows, Vista PC gamers can even use an Xbox 360 wireless controller (and not just for gaming – James Senior demonstrated using it to navigate Windows Live Maps, including the 3D and bird’s eye views). Not all games that are available for both PCs and the Xbox will offer the cross-platform Live experience; however, the first one that will is called Shadowrun (due for release on 1 June 2007), bringing two of the largest gaming platforms together and providing a seamless user experience (marred only by the marketing decision to have two types of account – silver for PC-PC interaction and gold for PC-Xbox).

Apple’s Get a Mac campaign draws on far too many half truths that will only become apparent to users after they have made the decision to switch… and then found out that the grass is not all green on the other side

So, after all this, would I choose a Mac or a Windows PC (or a Linux PC)? Well, like so many comparisons, it’s just not that simple. I love my Mac, but Apple’s Get a Mac campaign draws on far too many half-truths that will only become apparent to users after they have made the decision to switch, splashed out on the (admittedly rather nice) Apple hardware and then found out that the grass is not all green on the other side. In addition, Apple’s decision to delay the next release of OS X whilst it tries to enter the mobile phone market makes me question how committed to the Macintosh platform the company really is. Linux is good for techies and, if you can support yourself, it has the potential to be free of charge; if you do need support though, some Linux distros can be more expensive than Windows. So what about Windows, still dominant and almost universally despised by anyone who realises that there is a choice? Actually, Windows Vista is rather good. It may still have far too much legacy code for my liking (which is bound to affect security and stability) but it’s nowhere near as bad as the competition would have us believe… in fact it hasn’t been bad since everything moved over to the NT codebase and, complicated though the product versions may be, Windows Vista includes alternatives to the iLife suite shipped with new Macs as well as a superior media hub. Add the Xbox integration and Windows SideShow into the mix and the Microsoft digital home experience is excellent. Consumers really shouldn’t write off Windows Vista just yet.

Adding a meaningful description to web pages

One of the things that I noticed whilst reviewing the Google results for this site was how the description for every page was shown using the first text available on the page – mostly the alternative text for the masthead photo (“Winter market scene from the small town of Porjus in northern Sweden – photograph by Andreas Viklund, edited by Alex Coles.”):

Screenshot showing duplicate descriptions

Clearly, that’s not very descriptive, so it won’t help much with people finding my site, linking to me, and ultimately improving the search engine placement for my pages – I need to get a decent description listed for each page.

The WordPress documentation includes a page on meta tags in WordPress, including an explanation as to why they aren’t implemented by default (my template did include a meta description for each page, though it was based on the weblog title and tagline). Even though meta tags are not a magic solution to search engine placement, I wanted to find a way to add a meaningful description for each page using <meta name="description" content="descriptionofcontent" /> and also <meta name="keywords" content="pagecontext" /> (although it should be noted that much of the available advice indicates that major search engines ignore the keywords tag due to abuse). Fortunately there is a WordPress plugin which is designed to make those changes – George Notaras’ Add-Meta-Tags. There’s plenty of speculation as to whether or not Google actually uses the description meta tag but recent advice seems to indicate that it is one of many factors involved in the description shown in search results (although it will not actually affect positioning).
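
For illustration, this is the standard form of those two tags (the content values below are hypothetical examples rather than the exact markup generated by the plugin):

<meta name="description" content="A short, unique summary of this page's content" />
<meta name="keywords" content="wordpress, meta tags, search engines" />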

I already had meta tags in place for content-type, robots, and geolocation but I added some more that I was previously using HTML comments for:

<meta http-equiv="content-language" content="en-gb" />
<meta name="author" content="Mark Wilson" />
<meta name="generator" content="WordPress" />
<meta name="publisher" content="markwilson.it" />
<meta name="contact" content="webmaster@markwilson.co.uk" />
<meta name="copyright" content="This work is licenced under the Creative Commons Attribution-Non-Commercial-Share Alike 2.0 UK: England & Wales License. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-sa/2.0/uk/ or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA" />

Incidentally, a comprehensive list of meta tags and an associated FAQ is available at Andrew Daviel’s Vancouver webpages.

After checking back a couple of weeks later, the same search returns something far more useful:

Screenshot showing the updated search result descriptions

Unfortunately my PageRank has dropped too, and it’s possible that the duplicate entries for http://www.markwilson.co.uk/ and https://www.markwilson.co.uk/blog/ are causing the site to be penalised – Google’s Webmaster guidelines say “don’t create multiple pages, subdomains, or domains with substantially duplicate content”. The presence of those duplicate entries is actually a little odd, as checking the server headers for http://www.markwilson.co.uk/ reveals an HTTP 301 response (moved permanently), redirecting to https://www.markwilson.co.uk/blog/. Of course, it could be down to something entirely different, as PageRank is updated infrequently (there’s more information and links to some PageRank analysis tools at RSS Pieces but I use Page Rank Checker) and there have been a lot of changes to this site of late… only time (and building the volume of backlinks to https://www.markwilson.co.uk/blog/) will tell.
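
Incidentally, that server-header check is easy to repeat with a few lines of PHP. This is just a minimal sketch using PHP’s built-in get_headers() function (which, by default, follows redirects, so both the 301 and the final response appear in the output):

<?php
// Fetch the raw response headers for the non-canonical URL
$headers = get_headers('http://www.markwilson.co.uk/');
foreach ($headers as $header) {
    echo $header . "\n"; // expect an "HTTP/1.1 301 Moved Permanently" line followed by a Location: header
}
?>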

Defragmenting a Mac OS X hard disk

Apple claims that OS X is the world’s most advanced operating system. If that’s the case, then why does it lack basic system utilities? That’s a rhetorical question, but I’ve written before about OS X’s lack of a decent backup utility and today (including most of tonight – hence the time of this post) I fell foul of its inability to defragment hard disks.

“ah… but you don’t need a defragmentation utility with OS X because it automatically defragments as it goes.”

[insert name of just about any Macintosh support forum here]

Wrong.

OS X defragments files, but not the disk itself (for an explanation as to what that really means and as to whether it’s really necessary, refer to Randy B Singer’s Mac OS X maintenance and troubleshooting guide). This inability to perform what should be a basic operating system function (even Windows has the capability) has cost me a lot of time today. In fairness, there is a third-party utility available (if I were prepared to pay for it), called iDefrag (Paul Stamatiou has a review of iDefrag on his site) but in the end I used Mike Bombich’s Carbon Copy Cloner to clone my hard disk to my backup drive, make that bootable, repartition my system disk, and then clone the drive back again – a pretty long-winded approach to defragmentation.

Still, every cloud has a silver lining… at least this process led me to discover the Mac OS X maintenance and troubleshooting guide that I referred to earlier… well worth a read.

Passed Microsoft Certified Technology Specialist exam 70-262

I missed the announcement, but at some stage in recent years, Microsoft revamped its IT Professional certification scheme. It seems as though I still qualify as a Microsoft Certified Systems Engineer (MCSE) for both the NT 4.0 and Windows 2000 tracks; although I never did get around to upgrading my MCSE to Windows XP and Server 2003… maybe I’ll follow the Vista and Longhorn Server track when it’s released.

Anyway, earlier today I passed the Microsoft Office Live Communications Server 2005 – Implementing, Managing, and Troubleshooting exam (70-262), making me a Microsoft Certified Technology Specialist (MCTS): Microsoft Office Live Communications Server 2005.

I guess that’s just like an MCP in the old days but it’s another logo to display on the IT Services page. Actually, the real reason I did it was that I was incentivised by the prospect of a free iPod from my employer if I was one of the first three people to take (and pass) the test by a particular date!

This was the first Microsoft exam that I’ve taken for a while and Microsoft’s non-disclosure agreement prevents me from saying too much about it but as I took Monday off work, spent all day Tuesday (and Thursday evening) at Microsoft events and had to do some real work too, it’s been a challenge to cram in all of my revision… hence the lack of blog posts this week. I plan to make up for that after the long weekend (when I finally get around to writing up my notes from the Microsoft Management Summit and Vista After Hours events)… watch this space.

Planning and deploying Microsoft Office SharePoint Server 2007

It’s been a few months since I attended a Microsoft event but last night I made the trip to Reading for a session on planning and deploying Microsoft Office SharePoint Server. As it was hosted by a vendor (rather than one of the IT Professional technical evangelist team), I was initially unsure of how useful the event would be, but Steve Smith (an MVP who part-owns the consultancy Combined Knowledge and is very active in the UK SharePoint User Group) was an extremely knowledgeable and engaging speaker. Furthermore, he took the time during the mid-session break to answer each of my (many) questions, for which I’m extremely grateful! What follows is a summary of the content from last night’s event combined with links and additional information from my own research.

Firstly, it’s worth clarifying that SharePoint is a branding for a group of products and technologies and the two major product versions are:

  • Windows SharePoint Services (WSS) 3.0
  • Microsoft Office SharePoint Server (MOSS) 2007

It is important to note that WSS is a free of charge download for licensed Windows Server users whereas MOSS requires separate licenses (server and client access). It should also be noted that MOSS replaces the previous SharePoint Portal Server (SPS) 2003 product and the change of name reflects that SharePoint is more than a portal – it’s a collaboration platform.

WSS integrates with Windows Server (2003, at the time of writing there are still some issues with the beta for Windows Server codenamed Longhorn), Internet Information Services (IIS) and the Microsoft .NET Framework 3.0 (including the Windows Workflow Foundation) to provide a collaboration platform, using SQL Server 2000 or 2005 as its database. That’s a lot of dependencies and a lot of variables in the choice of server configuration; however it’s worth noting that a separate database server is recommended (more on that in a moment) and using SQL Server 2005 will provide performance gains over SQL Server 2000. WSS provides the ability to open, add, create and check in/out documents for collaboration at a departmental level; however it is not a document management solution. It provides some foundation services (storage, security, management, deployment, a site model and extensibility) for a collaborative solution.

MOSS builds on WSS (indeed, the MOSS installation includes WSS and will overwrite an existing WSS installation) to provide shared services for portal services, enterprise content management (formerly provided by Content Management Server 2002), enterprise search and indexing, and business intelligence and forms (described as “a window on business systems”). What Microsoft’s marketing materials do not highlight is that MOSS can also provide a front end to enterprise document and records management (EDRM) solutions such as those provided by Meridio, (EMC) Documentum and Open Text.

In designing MOSS, Microsoft attempted to address a number of customer pain points that existed in SPS:

  • Poor resource utilisation and isolation.
  • Inconsistent setup.
  • Network support.
  • Difficult central administration.
  • Topology restrictions.
  • Upgrades.

Many of these have been addressed (for example, unlike with SPS, it’s simple to add another server to an existing infrastructure); however upgrades are still not as simple as they could be and were referred to anecdotally as being the most common reason for an incident to be logged with Microsoft Product Support Services (PSS) at the moment.

The WSS/MOSS administration design goals were:

  • Simplicity – easy setup using an existing SQL Server or installing a copy of SQL Server 2005 Express Edition.
  • Extensibility – a single object model so that moving from WSS to MOSS does not break SharePoint applications.
  • Consistency – no more “jumps” from WSS team sites to portal pages.
  • Resource optimisation – the ability to scale out by dedicating servers to specific tasks, e.g. indexing.
  • Delegation – the ability to delegate control over parts of the infrastructure to particular groups of users.

Steve Smith compared the changes between SPS 2003 and MOSS 2007 with the period when another Microsoft product – Exchange Server – reached maturity in the late 1990s; it was not until the release of Exchange Server 5 (which was actually the second product version) that it began to build market presence and by version 5.5 it was arguably the de facto product for building a corporate messaging platform. Microsoft is hoping (and business interest is indicating) that MOSS 2007 could mark a similar turning point for SharePoint; however it seems likely that many organisations will experience some difficulties as a consequence of poor design decisions made when they originally created their SharePoint superstructure – it’s worth getting specialist advice from the outset.

Notice the term superstructure – that’s not one that was used at the event I attended but I was introduced to the term a few weeks back by my colleague Andy May and it seems appropriate for enterprise-wide applications that sit above the basic server and network infrastructure and provide services for true business applications – examples would be Exchange Server (messaging) and SharePoint (collaboration). Carlo Pescio described the semantic differences between infra- and super-structures in a recent blog post.

Many organisations will experience some difficulties as a consequence of poor design decisions… it’s worth getting specialist advice from the outset.

The need to plan ahead begins with the initial setup, where there is a choice between a basic or an advanced installation. Most administrators who intend to try out SharePoint with a view to adapting the topology later (as the organisation builds its knowledge and use of the product) could be expected to elect a basic installation but, unfortunately, a basic installation uses SQL Server 2005 Express Edition as a local database server and cannot be scaled out. The alternative is to select an advanced installation, where there is a choice of complete (which actually allows the selection of services as required), web front end (assumes that complete installations exist elsewhere in a web farm) or standalone (as for the basic installation). In most cases a complete install will be the most appropriate selection; however that does require an existing SQL Server (either locally, or on another server). After determining the file location and electing whether or not to join the customer improvement programme, setup copies the binaries to the chosen location before launching a separate wizard to configure services.

Another design item is the concept of a server farm. A SharePoint server farm shares a single configuration database – that means that a fast network (<80ms latency: i.e. not geo-replicated) is required between the SharePoint servers and the database server(s). Microsoft also recommends that one domain controller should be provided for every three front-end SharePoint servers (and that doesn’t include any load on the DCs from other applications).

SharePoint configuration needs to know whether a new farm is to be created or if the server is to join an existing farm. Advanced settings include options as to whether or not the server should host the administration website. Errors at this stage of setup generally relate to permissions with the SQL Server service account, which needs to be a local Administrator. I have to ask if software developers will ever learn to provide a list of rights for delegation in place of saying “make it an administrator” but if Microsoft don’t even follow that approach on their own server operating system then what chance is there for third party application providers?

SharePoint administration is provided through a web interface (over a dedicated port), or from the command line on the server (using the stsadm command). In the case of web administration, there is a three-tier model employed with tasks delineated based on roles, allowing for controlled delegation and secure isolation:

  • Central administration – this is where the IT department is most likely to retain control, for farm-level resource management and status. Aiming to reduce administration time through provision of a single point of administration with a consistent (and extensible) user interface for all SharePoint products, the central administration console provides:
    • Administrative task list – informing operators of tasks for action, including links to the appropriate user interface.
    • Home page topology view – a quick view of the servers in a farm and what is running on each one.
    • Services on a server page – for management of components running on a single server.
    • Flat menu structure – operations and application management with only those options available to the current user displayed.
    • Remote administration – web based administration interface and scheduled system updates.
  • Shared services – this (MOSS-only) level may be managed by whoever is responsible for IT within a business unit; determining the services that team sites can consume. The shared service goal is to separate services from portals and remove scaling limitations around the number of portals. Shared services act as a group, providing a logical and secure partition of the server farm and are required for site and cross-site level Office Server features. Shared services components are the shared service administration website and associated databases, providing:
    • Search.
    • Directory import.
    • User profiles.
    • Audiences.
    • Targeting.
    • Business data cataloguing.
    • Excel calculation services.
    • Usage reporting.
  • Site settings – management of a site or site collection within a hierarchy, e.g. a portal or a team site. Rights can be delegated on individual sites so a business user could have total (or partial) control over a tiny part of the overall SharePoint superstructure, without impacting on any other sites. It may sound counter-intuitive for an IT administrator to delegate control to business users but that’s often the best approach for administration at the site level.

One major change between SPS and WSS/MOSS is that there is no longer any requirement to create a site in IIS and then tell SharePoint to use the site. With the current SharePoint products, all management is performed though the SharePoint administration tools (with one exception – assigning certificates to SSL-secured sites, which is still done by IIS). SharePoint-aware IIS websites are no longer called virtual servers (server virtualisation has brought an entirely different meaning to that term) but are instead known as web applications.

Shared services are one of the key design elements for a MOSS implementation. It is possible to define multiple shared service providers; however each is completely isolated from the others. This may be viewed as a limitation; however it is potentially useful (e.g. in an application service provider scenario, or for providing complete separation of one department’s collaborative web application from the rest of the enterprise for political or organisational reasons). Web applications can be re-associated with another shared service provider (e.g. to consume a new set of services) but they cannot consume services from more than one provider (with the exception of My Sites – through the concept of a trusted My Site location). Content that is “marooned” in another shared service provider needs to be recreated, or migrated using stsadm at the command line. The majority of SharePoint superstructures will use a single shared service provider.

Another key design element is the definition of the hierarchy for the site structure. It is not normally appropriate for an IT department to define a structure by simply following an organisation chart and some business analysis is required to determine how the business actually functions (cross-group collaboration, etc.).

Despite expecting SQL service accounts to be administrators (!), Microsoft also suggests some best practices from a security perspective:

  • Use unique accounts for centralised administration, managing servers in the farm and service accounts – i.e. do not use generic administration accounts!
  • Enable Kerberos – not only is it viewed as more secure but it is faster than NTLM.
  • Enable SSL on sites (set within SharePoint but certificates are assigned within IIS).
  • Consider the management of the SPAdmin service – it requires access to various items within SharePoint but is a service account; therefore consider password resets and the level of access required on individual WSS/MOSS servers (stsadm can be used to globally reset passwords across all application pools as detailed in Microsoft knowledge base article 934838).

In terms of physical architecture, there is a balance to be struck between availability and resilience – the main options (in order of increasing availability and performance) are:

  • Single server – potentially supporting many users but also a single point of failure. Serves content (sites), shared services, administration and all databases.
  • Small server farm (e.g. a single database server and one or two load-balanced SharePoint servers) – better resilience; however still reliant on a single database server.
  • Medium server farm (e.g. clustered SQL servers and SharePoint roles broken out onto multiple servers for front end web access and a middle tier for shared service provision, e.g. indexing). This solution potentially provides the best balance between performance, resilience and cost.
  • Large server farm – many dedicated servers for individual SharePoint roles providing a scalable solution for a global enterprise (but probably overengineering the solution for many organisations).

Due to the network requirements discussed previously, server farms need to be centralised (the user experience for remote users may be improved using hardware accelerators to cache content across the WAN). Other considerations for improving the user experience include not making the front page too “busy” (to improve the time it takes to render) and providing additional front-end web servers to render pages quickly and increase throughput to the back-end shared service and SQL servers. If SharePoint is to become the point of access for all information within a business then it will quickly be viewed as critical and some thought should be given to the location of various shared services. Load balancing across front-end servers can be achieved using Windows Server network load balancing (NLB) or a hardware-based load-balancing solution – Steve Smith demonstrated using NLB at last night’s event; however it’s also worth checking out Joel Oleson’s NLB and SharePoint configuration and troubleshooting tips. It’s also worth noting that SharePoint automatically handles load balancing of application roles where configured (clearly it won’t load balance a role if it only exists on a single server – something to think about when considering placement of the centralised administration role in a small or medium server farm) – a separate load balancing solution is only required for client access to front-end servers.

If it’s proving difficult to justify the cost of additional web servers, then some basic performance analysis can be undertaken using Microsoft’s web application stress tool (linked from Microsoft knowledge base article 231282) which can then be used to demonstrate the point at which user performance is likely to be impacted. Performance can also be improved by caching data (pages, graphics, etc.) on a per-site basis.

One potential method of scaling up rather than out, is to use 64-bit versions of Windows Server 2003, SQL Server 2005 and SharePoint; however it’s worth considering that IFilters (which are used to index non-native file formats) may only be available as 32-bit versions and that may limit the options for 64-bit deployments.

When thinking about other SharePoint roles, it’s worth considering that although individual roles can be started/stopped on SharePoint servers as required, certain roles have additional configuration items to be provided at startup and it’s better to plan the workload accordingly.

With regards to indexing, indexes can become large (10-15% of the size of the content that is being indexed); therefore the default location on the system drive is probably not ideal. Also, only one index server is allowed within the farm; however if a separate query server is created, this will hold a copy of the index (albeit not necessarily the latest version) avoiding the creation of a single point of failure.

To help with management of a SharePoint superstructure, Microsoft Operations Manager (MOM) 2005 management packs exist for both WSS and MOSS; however it’s also worth considering other systems management elements as SharePoint has its own security threats against which to mitigate:

  • A SharePoint-aware anti-virus product is required to interface with the SharePoint object model (e.g. Microsoft Forefront Security for SharePoint).
  • Some additional content filtering (e.g. using ISA Server) may be required to prevent content from circumventing SharePoint’s simple protection which is based upon file-extension and size limits.

ISA Server can potentially be used to bring other benefits to a SharePoint infrastructure; for example, whilst SharePoint does provide for extranet access, it may be appropriate to let ISA Server handle the security and caching elements of the connection and then pass simple (and fast) HTTP requests back to SharePoint. This is particularly convenient in a complex combined intranet and extranet scenario, where the need to access Active Directory for My Sites and personalisation can cause issues around forms-based authentication.

One point I’ve not mentioned yet is the change of name from Microsoft SharePoint Portal Server to Microsoft Office SharePoint Server 2007. Leaving aside the decision to drop portal from the name, the Microsoft Office part is significant because of the high level of integration between Microsoft Office and the SharePoint products and technologies; however it is worth noting that MOSS 2007 is not reliant on the 2007 Microsoft Office system although use of the latest products will allow the most complete user experience (Microsoft has published a fair, good, better, best white paper for Microsoft Office programs and SharePoint products and technologies).

The key message for me at last night’s presentation was that SharePoint needs to be planned in detail and that some outside help will probably be required. As yet, there is no prescriptive guidance from Microsoft (although this is rumoured to be in production – for details watch the Microsoft SharePoint products and technologies team blog which, somewhat curiously but in common with all the other Microsoft blogs is hosted using Community Server and not SharePoint!) so it’s worth consulting with those who have done it before – either via various Internet resources linked throughout this post or by engaging with one of Microsoft’s authorised solution provider partners (and yes, I do work for one of them so there is a potential conflict of interest there but the views, thoughts and opinions expressed in this blog are purely personal).

One final area of interest for me, which I have not seen covered anywhere, is the SharePoint product roadmap. I can’t get anyone at Microsoft to comment on this (not even under NDA) but I understand that WSS 3.0 will ship within Windows Server codenamed Longhorn and there are no new versions planned for the foreseeable future.

The elements of meaningful XHTML

I’m really trying to use good, semantic XHTML and CSS on this website but sometimes it’s hard work. Even so, the validation tools that I’ve used have helped me to increase my XHTML knowledge and most things can be tweaked – I’m really pleased that this page currently validates as both valid XHTML 1.1 and CSS2.

Last night I came across an interesting presentation by Tantek Çelik (of box model hack fame) that dates back to the 2005 South by Southwest (SXSW) interactive festival and discusses the elements of meaningful XHTML. Even though the slide deck is no substitute for hearing the original presentation, I think it’s worth a look for a few reasons:

  • It taught me about some XHTML elements that I wasn’t familiar with (e.g. <address>) and others I’m just getting to grips with (e.g. <cite>) – see the small example after this list.
  • It highlighted some techniques which abuse the intended meaning for XHTML elements and how the same result should be achieved using semantically correct XHTML.
  • It introduced me to extending XHTML with microformats for linked licenses, social relationships, people, events, outlines and even presentations (thanks to the links provided by Creative Commons and the XHTML Friends Network, I already use linked licenses and social relationships on this site but now I understand the code a little better).
  • It reinforced that I’m doing the right thing!
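
As a quick illustration of those two elements (my own example, not one taken from the presentation – this is just the standard semantic usage):

<address>Mark Wilson, <a href="https://www.markwilson.co.uk/">markwilson.it</a></address>
<p>These notes were prompted by <cite>The Elements of Meaningful XHTML</cite>, Tantek Çelik's SXSW presentation.</p>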

Modifying wp-mobile to create content that validates as XHTML-MP

Yesterday, I wrote a post about using Alex King’s WordPress Mobile Edition plugin (wp-mobile) to generate WordPress content formatted for the mobile web. wp-mobile makes the code generation seamless; however I did have a few issues when I came to validating the output at the ready.mobi site. After a few hours (remember, I’m an infrastructure bod and my coding abilities are best described as weak) I managed to tweak the wp-mobile theme to produce code that validates perfectly.

Screen grab from the ready.mobi report for this website

The changes that I made to the wp-mobile index.php file can be seen at Paul Dixon’s PHP pastebin but are also detailed below, followed by a rough sketch of how the modified file ends up looking:

  1. Add an XHTML Mobile Profile (XHTML-MP) document type declaration: <!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN" "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">. Incidentally, I didn’t include an XML declaration (which looks like: <?xml version="1.0" encoding="UTF-8" ?>) as it kept on generating unexpected T_STRING PHP errors (presumably because the <? sequence is being interpreted as a PHP short open tag) and it seems that it is not strictly necessary if the UTF-8 character set is in use:

    “An XML declaration is not required in all XML documents; however XHTML document authors are strongly encouraged to use XML declarations in all their documents. Such a declaration is required when the character encoding of the document is other than the default UTF-8 or UTF-16 and no encoding was determined by a higher-level protocol.”

    W3C recommendation for XHTML 1.0

  2. Add some caching controls: <?php header ("Cache-Control: max-age=10 "); ?>. 10 seconds is a little on the low side but it can be changed later and it means that the caching is unlikely to affect testing of subsequent changes.
  3. Remove <meta name="HandheldFriendly" value="true" />: this code doesn’t appear to do anything and is not valid XHTML-MP – media="handheld" can be used instead when linking the stylesheet (see below).
  4. Change the stylesheet link method: although <style type="text/css">@import url("<?php print(get_stylesheet_uri()); ?>"); </style> should work, I found that the validator was only completely satisfied with the form <link href="<?php print(get_stylesheet_uri()); ?>" rel="stylesheet" type="text/css" media="handheld" />.
  5. Provide access keys using accesskey="key" inside the <a> tag for each of the main menu items.
  6. Surround <?php ak_recent_posts(10); ?> with <ul> and </ul> tags – this bug took the most time to track down and was the final change necessary to make the markup validate as XHTML-MP.
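
Putting those changes together, this is a rough sketch of how the modified index.php is structured (the <title> and menu markup here are illustrative only – the full diff is on the pastebin linked above):

<?php header("Cache-Control: max-age=10"); ?>
<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN" "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title><?php bloginfo('name'); ?></title>
<link href="<?php print(get_stylesheet_uri()); ?>" rel="stylesheet" type="text/css" media="handheld" />
</head>
<body>
<!-- main menu links, each with an accesskey attribute, e.g. <a href="/" accesskey="1">Home</a> -->
<ul>
<?php ak_recent_posts(10); ?>
</ul>
</body>
</html>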

I also made some minor changes in order to fit my own page design (adding a legal notice, etc.) but in order to get the elusive 100% in the report for this site, there was one minor tweak required to style.css: removal of the height: 1px; rule for <hr>. I understand why it was there but the validator didn’t like it, suggesting that relative units should be used instead (I would argue that 1px is far more logical for a horizontal rule than the use of relative units but this change resulted in another pass on the report).

Right, enough of these mobile diversions – I’d better focus my development efforts on getting the rest of this site to be fully XHTML compliant…

Publishing WordPress content on the mobile web

A few nights back, I was reading a .net magazine article about developing websites enabled for mobile content.

As my blog is written primarily for technical people, it seems logical to assume that a reasonable proportion of its readers could make use of access from a mobile device, especially as the magazine article’s author, Brian Fling, believes that:

“[the mobile web] will revolutionize the way we gather and interact with information in the next three years”

Web 2.0 Expo: From Desktop to Device: Designing the Ubiquitous Mobile Experience

Basically, the catalyst for this comes down to a combination of increasing network speeds and mobile services, together with a falling cost in the provision of data services.

It seems that there are basically two schools of thought when it comes to designing mobile content for the web: some (most notably the W3C) believe that content should be device agnostic; whilst that approach is perfectly laudable (a mobile browser is, after all, just another form of browser) others believe that the whole point of the mobile web is that device-specific functionality can be used to provide services that wouldn’t otherwise be available (e.g. location-based services).

Brian’s .net magazine article explains that there are four major methods of mobile web publishing:

  1. Small screen rendering
  2. Programmatically reformatting content
  3. Handheld stylesheets
  4. Mobile-specific site.

As we work down the list, each of these methods is (potentially) more complex to implement, but also faster for the end user. Luckily, for WordPress users like myself, Alex King has written a WordPress Mobile Edition plugin, which applies a different stylesheet for mobile browsers, publishing a mobile-friendly site. Using the Opera Mini live demo to simulate a mobile browser, this is what it did for my site:

Screenshots: this website viewed in a simulated mobile phone browser, and the mobile-optimised version of the same page viewed in the same browser

The first image shows the content as it would be rendered using the default small screen rendering – not bad, but not exactly ideal on a small screen – whilst the second image uses the WordPress Mobile Edition plugin to display something more suitable for the mobile web. Not only is the display much simpler and easier to navigate on a handset, but the page size has dropped from 28KB to 1KB. Given those results, I was a bit alarmed when I used the ready.mobi site to generate a report for this site, as the site only scored 3 out of 5 and was labelled as “will possibly display poorly on a mobile phone”. Even so, the user experience on my relatively basic (by modern standards) Nokia 6021 was actually quite good (especially considering that the device is not a smartphone and it failed the handheld media type test), whereas viewing the normal (non-mobile) version generated a “memory full” error.
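
For anyone curious about the mechanics, the general technique is to sniff the User-Agent string and switch WordPress over to a lightweight theme when a mobile browser is detected. The sketch below is my own illustration of that approach, not Alex King’s actual plugin code – the function names and the 'wp-mobile' theme directory are hypothetical:

<?php
// Illustrative sketch only - not the WordPress Mobile Edition plugin's real code.
// Detect a (small) selection of mobile browsers from the User-Agent header.
function mw_is_mobile_browser() {
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    $needles = array('Opera Mini', 'Symbian', 'Nokia', 'Windows CE', 'BlackBerry');
    foreach ($needles as $needle) {
        if (stripos($agent, $needle) !== false) {
            return true;
        }
    }
    return false;
}

// Hypothetical callback returning the name of a mobile-friendly theme directory.
function mw_mobile_theme() {
    return 'wp-mobile';
}

// Use WordPress' template/stylesheet filters to swap the active theme for mobile visitors.
if (mw_is_mobile_browser()) {
    add_filter('template', 'mw_mobile_theme');
    add_filter('stylesheet', 'mw_mobile_theme');
}
?>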

So, it seems that preparing a WordPress site for the mobile web is actually pretty simple. I have a couple of tweaks to make in order to improve the ready.mobi test results (quick fixes ought to include support for access keys and working out why the page heading is being tagged as <h3> when the standard site uses an <h1> tag) but there is certainly no need for me to develop a separate site for mobile devices, which is just as well as it’s taking me ages to finish the redevelopment of the site (and I can save myself a few quid by not registering the markwilson.mobi domain)!

Links
The following links may be useful to anyone who is looking at developing content for the mobile web:

It may also be worth stopping by at Keni Barwick’s blog on all things mobile.

Coding horror

I just stumbled upon Jeff Atwood’s Coding Horror blog and it’s very interesting reading (even for those of us who write very little code). The article that I found was commenting on Jakob Nielsen’s latest tome on web usability. Although Nielsen makes some valid points, the comments are worth a read as they highlight some of the real compromises that website designers and website developers have to make.

I’m sure I could lose many hours reading Jeff’s posts – they all seem well-informed, to the point and interesting… these were just a few of the ones that grabbed my attention this afternoon:

  • When in doubt, make it public looks at how Web 2.0 is really just creating websites out of old Unix commands and that the new business models are really about taking what was once private and making it public!
  • SEOs: the new pornographers of the web looks at how much of the real search engine optimisation is just good web development and that many of the organisations focusing on SEO are all about money and connections – whether or not the assertions that Jeff makes in his post are correct, it’s an interesting view and certainly seems to have a lot of SEOs fighting their corner.
  • Why does Vista use all my memory? looks at Windows Vista’s approach to memory management (a feature called SuperFetch) and how grabbing all the available memory to use as a big cache is not necessarily a bad thing.