When Apple’s connectors don’t connect

This content is 18 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A couple of weeks back I wrote about Apple’s lack of clarity over delivery times when ordering a new computer. Well, my MacBook finally arrived yesterday (and I like it very much) but tonight I got ready to hook it up to the TV using a combination of my Apple Mini-DVI to DVI and DVI to Video adapters, only to find that the “spade” on the male DVI connector on one adapter is too large to fit the female DVI connector on the other! Arghhh! I also have the same problem if I try to connect it to a DVI to VGA connector.

These are all Apple products (i.e. it’s not as if I’m trying to cut corners with a combination of cheap components) but it seems that I need to buy a third connector – a Mini-DVI to Video adapter – for the rare occasions when I want to watch digital video content on my aging 32″ TV.

Thank you Apple – for yet another example of the fabled Apple design taking precedence over practicality. As a friend pointed out to me, Apple probably doesn’t want me using two connectors together as it will spoil the aesthetic effect. Shouldn’t that be my choice?

Freeing digital downloads from the shackles of the BBC iPlayer


I’ve written before about my concerns with the BBC iPlayer but nevertheless, it is the only legal way to download BBC programming to my computer that I am aware of. Since I wrote that post, iPlayer has been improved to include streaming content for unsupported platforms but that doesn’t allow for offline viewing (catching up on TV episodes on the train, for example).

Well, there is a workaround and, as I am a BBC licence fee-payer and the content has been downloaded legally, I figure that converting it to watch on another device is at least morally acceptable – even if the BBC may not agree. After all, it’s not as if I’m sharing the resulting files with other people. Based on my initial tests, it seems to work well – at least with the version of Windows Media Player that my iPlayer machine is using (v11.0.5721.5230).

All it involves is taking one copy of Windows XP, with a working BBC iPlayer installation, and running a couple of utilities to identify the Windows Media Player DRM keys and remove the DRM from the .WMV files that make up the iPlayer content (by default, these are stored in %allusersprofile%\Documents\My Deliveries\iplayer_live). The resulting file(s) should play in Windows Media Player without DRM restrictions – and, critically, will also play back on Windows Vista or MacOS X (using the Windows Media Components for QuickTime).

Performing an Active Directory Health Check


A few months ago, I was in a situation where I needed to perform a health check on a customer’s Active Directory (AD) infrastructure in preparation for guiding them through the process of migrating directory objects between forests. I’ve worked with AD for years – and am reasonably familiar with the various utilities – but didn’t really have a formalised method for reviewing its health and the political climate was such that I didn’t want to be the one who had missed an obvious diagnostic (no pressure there then!).

Then I found an eBook which turned out to be a fantastic investment – Andrew Abbate’s Digital Shortcut to Performing an Active Directory Health Check. Published by Sams and supplied in Adobe PDF format (protected with digital rights management), this book gave me a refresher course on the tools and their use, then described how to carry out the health check, interpret the data and fix the problems. Sure, it won’t tell you everything you need to know – but it certainly gave me enough to apply the rest of my skills and knowledge to get to the bottom of the issues we were experiencing.

This eBook is available via the Safari online library; however googling also turned up copies available for purchase and download from a variety of online stores – I bought a copy for $9.99 at eBookMall.
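
As an aside, for anyone wondering what sort of utilities are involved, a first pass at checking AD health usually starts with the built-in diagnostics – something along these lines (run from a domain controller with the Windows support tools installed; these are my examples rather than a list from the book, and the exact switches vary between versions):

dcdiag /v /f:dcdiag.log
repadmin /replsummary
repadmin /showrepl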

Passed Microsoft Certified Systems Engineer exam 70-296


Last week I wrote about having scraped through the first of two exams needed to update my MCSE from 2000 to 2003 and this morning I passed the second by an equally narrow margin.  Whilst I’m pleased to have passed the Planning, Implementing and Maintaining a Microsoft Windows Server 2003 Environment for an MCSE Certified on Windows 2000 exam (exam 70-296), and am similarly glad that I found it challenging (i.e. worthwhile), I did sail a little close to the wind – and that wasn’t for lack of preparation either. So what happened?

I’ve worked with Windows NT since 1995, been an MCP since 1998 (and an MCSE since 1999), worked with Active Directory since NT 5.0 beta 2, and generally have a fair amount of Microsoft Windows Server design and implementation experience in a variety of organisations.  Even though I’ve remained technical, it’s inevitable that as I progress in my career I spend more time managing and less time doing – meaning that I do not have a huge amount of recent operational or administrative experience.  So, in order to upgrade my MCSE, I needed to refresh my knowledge of the key concepts without re-learning everything from scratch.

With that in mind, and with the impending withdrawal of the MCSE 2000-2003 upgrade exams, last summer I bought a Microsoft Press Training Kit entitled Upgrading your Certification to Microsoft Windows Server 2003.  It’s a weighty tome and includes evaluation software, eBooks and a readiness review suite from MeasureUp.  It’s actually a really good purchase but, at 1100 pages and almost 2.5kg, I found it too large (physically) to keep lugging around with me and, despite the title, it seems to be targeted at people who are setting out on the MCSE path for the first time.

Then, a few months back, I used a practice test from pass4sure to help prepare for MCTS exam 70-624.  I passed the exam, but the software was Java-based (and the installer failed to recognise that my system already had Java installed and tried to install it again), was full of bugs and, at $79.99 for just 53 questions, I felt that it was very poor value for money.  So, when uCertify asked me to review their PrepKits I was interested to compare them with my previous experiences.

uCertify kindly provided review copies of the PrepKits for exams 70-292 and 70-296 and, from the moment I installed them, I could see that the quality was way above my previous experience.  No buggy installer – these went straight onto my Vista system with no issues – and I was greeted with a professional interface.  Unlike the pass4sure practice tests, there were a few hundred questions (albeit with a fair amount of repetition – I calculated that about 15% appeared in multiple practice tests) and tests were available as pre-defined practice tests, adaptive tests, custom tests (for example, just the questions that have previously been answered incorrectly), or an interactive quiz.  There was also a complete run-down of the exam objectives and other study aids including flash cards, study notes and articles.  Finally, the software provides the ability to view test history and to evaluate readiness using the built-in reporting tools.

uCertify PrepKit

I set to work on the practice tests and found that there were two possible modes – test mode (with feedback at the end) and learn mode, whereby a fairly detailed explanation was available on request after answering each question.  For some of the questions, I did not (and still do not) agree with the answers provided, but the tool includes the ability to send feedback to uCertify and, on at least one question, I could view the feedback that others had provided.  I also spotted quite a few grammatical and spelling errors – one was even in the interface itself and so appeared on multiple questions.

Even though the general quality of the PrepKit software is high, there are some very obvious bugs.  On my Windows Vista system I found that if I paused a test and then cancelled the pause, the clock did not start counting again – but that was actually useful because in learn mode there is not a lot of time by default (58 questions in 60 minutes) to take in the information.  I also had a problem whereby the software lost my exam history – a minor annoyance, but it did effectively prevent me from retesting using just the questions I had answered incorrectly.

So, the software generally is not bad – it has a few issues but no show-stoppers.  But what about its effectiveness?  Taking exam 70-292 as an example, I saw my scores increase, but I do wonder if, due to the repetition of the questions, I was actually learning the answers to the PrepKit tests rather than applying the knowledge gained to answer the questions correctly (the difference may be subtle – but it is significant).  This was particularly evident when I moved on to the PrepKit for exam 70-296, where there was some repetition of questions from the PrepKit for exam 70-292 (unsurprising, as the exam objectives also overlap) and I consistently scored above 80% (with most tests above 90%).

My theory about learning the answers, rather than learning the key concepts required to answer the questions correctly, appears to be borne out by my results in the real exams.  The Microsoft NDA prevents me from discussing their content, but if I can consistently score above 90% in a practice test – even the final test, which is intended to be more difficult than the vendor exam – how come I barely scraped a pass in the real thing?

So, to summarise – do I think the uCertify PrepKits are worth the money?  Probably. Will they prepare you to pass the exam? Possibly.  Microsoft/Prometric are currently offering free exam insurance (Second Shot) and, in any case, uCertify offer their own money-back guarantee but, based on my experience, the PrepKits form just one part of an overall preparation strategy – and my usual method of re-reading course materials and writing my own notes seems to work better for me.

You can try the uCertify PrepKits for yourself – and I’d be interested to hear how people get on.  Demonstration versions can be downloaded for free and access to the full PrepKit is unlocked with a license key costing around $59.99 with discounts for multiple purchases.  It’s worth noting that the uCertify PrepKits are not just for Microsoft certifications either – there are PrepKits available for a variety of vendors with further details available at www.ucertify.com.

[Update 20 February 2008: You can get 10% off the uCertify PrepKit of your choice using the discount code MARWIL]

IDE/SATA to USB cable for temporary disk access


After last week’s near-catastrophe when one of my external hard disks failed, I found that the disk itself was still serviceable and it was just the enclosure that had inexplicably stopped working. So, I put the disk into an identical enclosure that had been sitting on the shelf since my previous data storage nightmare, only to find that this enclosure had also failed (whilst not even being used).

Although I have recovered the data to another drive, my new MacBook has yet to arrive and I wanted to hook the disk back up to the original machine and sync my iPhone with iTunes (I was running out of podcasts to listen to in the car), so tonight I bought an IDE/SATA to USB 2.0 cable from Maplin, allowing me to connect 2.5″ or 3.5″ IDE (PATA) or SATA disks to the USB port on any computer without a caddy.

It doesn’t look pretty and I wouldn’t recommend using it for too long as the drive gets very hot but it will certainly suffice as a temporary measure and the ability to support either PATA or SATA drives means that the cable should continue to be useful for a while yet.

Website development tips and tricks


About a year ago, I started the redevelopment of this site to use WordPress as my CMS, in the process aiming to make the site XHTML and CSS standards-compliant. It was a big job and, as this blog is really just a hobby that I put most of my spare time into, it took some time. For the last year I’ve had a draft post part-written about some of the things I found along the way and now, as I’m about to embark on another facelift (let’s call it markwilson.it v2.5), I thought it was about time I finished that post – or at least published the collection of notes I made during the development of markwilson.it v2.0. I hope some of the information here is useful to others setting off on the same route and, of course, if anyone knows better, feel free to leave a comment.

First of all, sizing my text. No pixel sizing here – I use font-size:medium; for the body and then percentages to resize text elsewhere (ems would achieve the same result as percentages but this method was recommended and avoids potential issues with Internet Explorer). Then, in order to keep the text size reasonably consistent across browsers, I set font-size:76%; on the #wrap element. Using a font-size of medium is probably not essential – that’s the default for most browsers anyway – but it does at least give me a known starting point. I could go further and implement the A List Apart hack for Internet Explorer (IE) 5 on Windows but, as IE5 users accounted for just 0.13% of my visitors in the last month, it really doesn’t justify the effort (I’m waiting for the day I can say the same about IE6 but that’s some time off yet).
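
To make that a little more concrete, the relevant rules end up looking something like this (the h2 and p values are purely illustrative – the real stylesheet tunes each element to suit the design):

body { font-size: medium; }   /* known starting point – the default in most browsers */
#wrap { font-size: 76%; }     /* evens out the rendered text size across browsers */
h2 { font-size: 150%; }       /* illustrative – individual elements are resized with percentages */
p { font-size: 100%; }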

I do most of my development using Mozilla Firefox (on a Mac), then test on Safari (MacOS), IE6/7 (Windows) and Mozilla (Linux). I’m afraid that I don’t bother about other browsers (except some compatibility testing for mobile browsers) but that covers the vast majority of visitors. I also test the site with the page style disabled (to check that document flow is correct and that the site is still usable) and repeat (with style applied) to see the effect of the site without images, and without scripting. Basically, if it all degrades well, then I’m happy. The only downside (for me) of browsing without JavaScript support is the absence of Google AdSense or Analytics.

When trying to work out what is working, and what is not, the accessibility-checking favelets can be useful but I actually prefer Chris Pederick’s Web Developer extension for Firefox. (When I first came across this handy extension, I thought Chris’ name was familiar and, from his resume, I see that we were both working at ICL at the same time in the late 1990s… maybe that’s it.) Incidentally, the Web Accessibility Tools Consortium (WAT-C) has produced toolbars for IE and Opera.

Validating XHTML and CSS using the W3C tools is good for checking out the quality of your code (but you do need to keep checking back – I’ve just noticed that some of the third-party code I’ve added has broken my XHTML). For blogrolls, an OPML validator is available.

Having moved to a hosting provider where I have access to the server logs, the first thing I noticed was the volume of errors like this one:

[timestamp] [error] [client ipaddress] File does not exist: /usr/home/username/public_html/favicon.ico

I added a favourites icon file to my web server’s root folder and instantly saw a drop in the number of errors (there are plenty of online generators available but I used Favicon Maker – largely because the site looked good… I find it remarkable how many people offering web design tips don’t appear to have looked at their own site recently… although I do realise that there is a difference between design and code and I also realise I’m leaving myself open to criticism here too). I also added this line of code because not all browsers will look for the presence of the favicon.ico file:

<link rel="shortcut icon" href="http://example.com/favicon.ico" type="image/vnd.microsoft.icon" />

Incidentally, Information Gift has a useful summary of how various browsers treat the favourites icon.

On a similar note, I recently added a 57×57 apple-touch-icon.png file to the site to support webclips on the iPhone. It may be a minority platform but it’s one I use!
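
As I understand it, simply placing the file in the site root is enough for the iPhone to find it, but it can also be referenced explicitly, along the same lines as the favicon:

<link rel="apple-touch-icon" href="http://example.com/apple-touch-icon.png" />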

A few more resources that I’ve found useful whilst developing the site include:

Last, but by no means least, I’d like to mention my buddy Alex, who provides the hosting service for the site and is also my first port of call for any WordPress/web development advice.

Passed Microsoft Certified Systems Administrator exam 70-292


Phew! That was close. I passed the Managing and Maintaining a Windows Server 2003 Environment for an MCSA Certified on Windows 2000 exam (exam 70-292) yesterday afternoon, but only by the narrowest of margins. Microsoft’s NDA prevents me from commenting on the contents of the exam but after my last Microsoft certification (which was unbelievably simple) this one was much more difficult.

I’m also aiming to take exam 70-296 over the next couple of weeks – to complete the update of my MCSE from 2000 to 2003 before the upgrade exams are retired (and therefore make transitioning to the Windows Server 2008 certifications a little more straightforward).

I guess administration is not something I do a huge amount of (I’m a consultant and I know the technology but more from an implementation perspective) but I did invest a fair amount of time in the preparation and so I think it may say something about the quality of the revision materials that I used… I’ll reserve judgement on that until after I’ve taken the next exam but watch this space.

Delays when purchasing Apple hardware


After the disappointment that was Macworld 2008, last weekend I decided to bite the bullet and buy an Apple notebook. A MacBook Pro would be great, but it is also an old design and very expensive, so I decided to buy a MacBook.

My employer is a member of Apple’s employee purchase programme (EPP) so I bought the computer from the UK online Apple Store (EPP discounts are not valid in brick and mortar stores). Somewhere in the purchase process I’m sure that I was quoted 3-5 days for delivery (24 hour shipment for non-customised orders) – right up until the order was finalised, at which point it got a lot longer.

I also purchased some accessories (a mini-DVI to DVI adapter and an iPhone headphone adapter) but that’s not really customisation. Is it? Apparently it is, at least according to Apple. Checking the site now, it seems that adding accessories like this increases the ship time from 24 hours to 3 days.

My order was shipped within 48 hours but is currently estimated to take 11 days for delivery. I could walk from my house to Apple’s UK distribution centre in Leicestershire and back in that time (most couriers that come to my door have a nice big diesel van to make it faster for them…). I was confused, especially as the shipment status page has displayed “In transit to final destination – carrier details to be updated shortly” for a couple of days now and I’d expected it to be with UPS/DHL/insert-name-of-courier-here, allowing me to track its progress (and to make sure I’m here to sign for it).

So I called Apple and it turns out that my MacBook is not from stock in the UK. It’s being made in China. Shipping within 24 hours is all very well but, when it’s shipped from the other side of the world, it’s kind of irrelevant. Meanwhile, this was not clearly communicated to me at the time of order (I might have foregone my EPP discount for the sake of picking one up from a store) – making me a very dissatisfied customer, particularly as, if I decide to cancel or return the order, it will cost me another £60.

Apple has traditionally enjoyed a loyal fanbase and, more recently, has increased market share by encouraging many consumers to switch to their product range. I know that, in my case, the original decision to buy an Apple computer was based on style and the certainty that, if I didn’t get along with MacOS X, I could always install Windows on the Intel hardware. I’m now buying my second Mac and am extremely disappointed by the service I’ve received. It’s not as if the MacBook is inexpensive either – from a cursory glance around, it seems that a comparably specified PC notebook from another vendor can be purchased for significantly less money, and other major OEMs are happy to talk to me about product roadmaps so I know I’m not buying a white elephant (no chance of that with Apple).

It seems to me (and to friends who have experienced Apple’s customer service of late) that, as Apple grows its market share, attention to detail on the things that matter most to customers declines. Maybe that is just a reality of capitalism. Maybe it’s a reflection on corporate American business practices. Or maybe Apple have just taken their eye off the ball (again).

When, oh when, will I learn to take proper backups?


Apparently (according to my wife), I’ve been a bit stressy today. Justifiably, I’d say: I have an exam tomorrow that I haven’t finished preparing for; one or both of my kids has woken me up at least once a night (or early in the morning) since I-can’t-remember-when; and this morning I walked into the office to see that the nice blue light on my external hard disk (the one with my digital photos and my iTunes library) was red.

The disk was still spinning and the Mac that it was attached to still appeared to have a volume called “External HD” but any application that attempted to access the volume locked up. So I shut down the computer, switched off the external hard disk, turned it all back on again and… saw the familiar blue light appear but without any sign of the disk spinning up. Arghhh!

This was the second time this had happened to me with this model of external hard disk. “Bloody Western Digital disks”, I thought… but I didn’t have time to investigate further – I had three practice exams to do today, a conference call about the Windows Server 2008 launch and the usual deluge of e-mail to process – so I turned the disk enclosure off again and left it until, once the kids were in bed, I took the disk out of the enclosure and put it into another PC.

Imagine my relief when it spun up – and, after installing MacDrive on the PC (the disk is formatted with HFS+) and rebooting, I could see my data. Woohoo!

I’m currently in the process of copying all of the data to the only volume I have left with enough free space. Unfortunately, the machine I’m using for recovery only has a 100Mbps NIC and, seeing as Windows Server 2008 says I’m getting 10.5MB/sec (i.e. around 84Mbps), I think the network’s doing pretty well – but the process of copying almost 300GB of data will take most of the night. Then, once everything is safely recovered, I’ll run some diagnostics on the disk and work out whether it’s the disk or the enclosure that has gone belly-up. In the meantime, I’m withdrawing my earlier recommendations for the Toshiba PX1223E-1G32 (320GB 7200 RPM external USB 2.0 hard drive with 8MB data buffer).
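
As a rough sanity check on that estimate: 10.5MB/sec is a little under 38GB per hour, so almost 300GB works out at around 8 hours of copying – an overnight job indeed.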

Does anybody know where I can get an iSCSI storage device with decent RAID capabilities for not too much cash?

Windows Server 2008 – pulling it all together


However you look at it, there’s little doubt that creating an operating system release the size of Windows Server 2008 is a huge undertaking.  A few months back, I was privileged to hear Alex Hinrichs, Group Program Manager for Windows Server, speak about the process of building Windows Server.

First of all, any major project needs strong leadership and the Windows Server Division management team includes a wealth of collective experience from guys like Bob Muglia, Bill Laing and Iain McDonald.

Then, there’s the consistent vision – at the heart of the Windows Server 2008 product there have been a few major themes:

  • Roles (customers don’t think "my Windows server"; they think "my domain controllers", "my web servers", etc.).
  • Only install what the system needs.
  • Make it secure, reliable, manageable… and fast.
  • Quality should be determined by real world deployments (i.e. only ship when the product is known to be ready).

It’s also worth remembering that Windows Server 2003 has been an excellent operating system release.  So, as they began to plan for the next release, Microsoft took a look at what worked well when developing Windows Server 2003:

  • Deployments are key to quality (internal deployments – "dogfooding", early adoption customers).
  • Include a long customer feedback cycle.
  • Install fewer components and services by default.
  • Lock down the feature set early (good ideas are all very well, but it requires discipline to say "no, we’ll add that later") and focus on quality during the last year of development rather than adding new things.
  • Protect the daily build (the daily version of Windows Server needs to be solid).
  • Focus on the basics (reliability, security, performance).

Even though the feature set was locked down some time ago, Microsoft did respond to customer feedback on the early builds of Windows Server codenamed Longhorn – that’s how we got features like IIS as a role in server core, Windows PowerShell as a feature, more granular group policy objects and read-only domain controllers.

The important point is that in the final year, major changes were limited.  Customer requests were still the primary driver but there were no more changes for people who thought "it would be really cool if…" and changes were only implemented to unblock customer deployments.

For me (as I’m not a software engineer), the fascinating part of Alex’s presentation was the daily process.

Program Managers are responsible for delivering on a single component of the operating system, with each virtual build lab (VBL) being made up of a number of feature build labs (FBL) – for example:

  • Core (platform, kernel, setup, etc.).
  • Networking (TCP/IP stack, DHCP, RRAS, etc.).
  • Server roles (IIS, AD, etc.).
  • Security (logon, licensing, etc.).
  • Client (shell, Internet Explorer, etc.).
  • File (including backup, storage server, etc.).

The daily ship-room (engineering) meeting examines test results to see which product groups are ready to bring in code to the build based on:

  • Distributed responsibility:
    • FBLs need to get code ready in order to move up through the build process via the VBLs.
    • Around 10,000 people have contributed code to Windows Server at some point in the process (about 1,000 developers are working on it every day).
  • Daily tests (checks and balances) including:
    • Build verification tests (BVTs) – examining whether the operating system will complete setup, upgrade, share files, etc. – a few thousand simple tests on all versions (32/64-bit, Itanium, etc.).
    • Feature verification tests (FVTs) – tests at feature/role level, e.g. AD, IIS, etc.
    • Stress – what does it take to break the system, using over 1,000 machines running stress tests every day until something breaks, then attaching a debugger to see what broke it.
    • Reliability – how long can the system run – for every server role.
  • Bugs and bug bar (what bugs are there to fix… and when by).

Hinrichs explained that the componentisation effort used for Windows Server 2008 has certainly helped to make the process easier.  It has taken many years, but dependencies have been broken as senior architects have run the Windows code through architectural layer tools to ensure that architectural policy is adhered to.

Using a component-based model helps to identify conflicts before they hit the main build.  For example, if the Shell and Internet Explorer teams are working on the same binary, the code is checked in at the client layer and the developers can work together to resolve any conflict before the code is incorporated into the main tree.

Before code is accepted, there are a number of quality gates to be negotiated, consisting of a battery of tests to run at check-in stages, for example:

  • Static code analysis (buffer overflows, other security problems, managed code problems, etc.).
  • Architectural layer (check for implied dependencies and relate back to roles – a developer may think that they are working in their own universe but this is not necessarily so).
  • Code coverage (automated tests all over the world constantly exercising the system to hit as many code paths as possible – realistically 70-75% of paths are covered by automation, with the rest tested manually).
  • Policy check tools (looking for globalisation or localisation issues, political issues, etc.).

On a normal day in the Windows Server Division:

  • FBL developers run checks and once the code is ready it is pushed to a VBL team for building.
  • The VBL team pulls down the main build and merges it with new code from the FBLs, looking for conflicts/problems.  Once everything is ready, code from all teams is brought together into the main build.
  • Because there are checks and balances at all levels (and reverse integration), things tend to be pretty clean at the main build level.
  • It’s not just about pushing code up – it is pulled down as well so that all teams pick up changes from each other.

I’m pleased to see Windows Server 2008 ship before its official release date.  For some time, Microsoft would only commit to "shipping during the first quarter of 2008" but were adamant that quality was the primary goal and that the product would only ship when it was ready.  Based on my experiences, it seems remarkably solid and I have no reservations about pushing my organisation as hard as I can to deploy Windows Server 2008 for our customers.  And I’ll wrap this post up with one final comment – Kevin Lane, the lead for the technology adoption program (TAP) customers, has been on call 24×7 to ensure that major issues affecting customers are resolved quickly.  In the last 6 months he has only had one call that’s been important enough to disturb his sleep…