Did SpinRite actually save my data?

This content is 15 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

This morning, I shut down the notebook PC that I use for work and set off to meet a colleague. Upon returning, I tried to boot the system but nothing happened. Technically, something happened – just not what I expected: Windows would not boot, and sometimes it reached the startup screen, sometimes it didn’t. Once or twice I’m sure I saw the once-familiar blue screen of death flash up for a fraction of a second before the PC reset itself. I tried a normal startup as well as the last known good configuration before finally giving up and attempting to recover the system using the Windows Server 2008 DVD, but this couldn’t locate an installed copy of Windows to recover. What it would let me do, though, was get to a command prompt, where attempting to access drive C: returned:

The request could not be performed because of an I/O device error.

That didn’t sound good, but I managed to run diskpart.exe and list disk told me that disk 0 was online. Moving on, list partition told me that the two partitions I expected to see were there, but it was list volume that really shone a light on the problem – the DVD drive and WinRE volumes showed as healthy but drive C: was reported as a 110GB healthy partition of type RAW (i.e. not NTFS). At this point, I began to panic. Something had happened to the NTFS file system and that could mean lost data. I have a reasonably recent backup but the last couple of weeks at work have been mayhem and there was some stuff that I know I don’t have a second copy of.
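For anyone facing something similar, the diskpart session looked roughly like this (reconstructed from memory and abridged – the exact disk/volume numbers, labels and sizes will differ on your system):

```
C:\> diskpart

DISKPART> list disk

  Disk ###  Status    Size     Free
  --------  --------  -------  -------
  Disk 0    Online     112 GB      0 B

DISKPART> list volume

  Volume ###  Ltr  Label    Fs    Type       Size     Status
  ----------  ---  -------  ----  ---------  -------  -------
  Volume 0    D             UDF   DVD-ROM             Healthy
  Volume 1         WinRE    NTFS  Partition  1500 MB  Healthy
  Volume 2    C             RAW   Partition   110 GB  Healthy
```

It was that RAW in the Fs column for drive C: that set the alarm bells ringing.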

I could call my company’s IT support number but it normally takes at least a day for a callback; I’d have to take the laptop to a “local” office (a 100 mile round trip); if a system won’t boot, the standard approach is to spend a very limited amount of time trying to fix it (probably none at all for people like me who run a non-standard operating system) before simply wiping the system and installing a new corporate build. That means going back to Windows XP and Office 2003 (which is painful when you are used to Windows Vista/Server 2008/7 and Office 2007), the loss of an activated copy of Windows Server 2008 Enterprise Edition (which is not exactly inexpensive) and also losing my data (the standard build has separate system and data partitions and my build does not… although now I’m starting to reconsider that choice).

I’m pretty sure that the root of this problem is a failing hard disk (after all, it is a “Western Dodgital“) but, without the tools to prove it, I’ve got a snowball’s chance in hell of getting a new one and, to cut a long story short, when it comes to supporting my non-standard build, I’m on my own (at least unless I can prove that the hardware is faulty).

One of the podcasts that I listen to is “Security Now” and the hosts (Steve Gibson and Leo Laporte) spend far too much time plugging Steve’s SpinRite product. I’ve often wondered if it was any good but was not prepared to spend $89 for speculative purposes – this afternoon I decided that it was time to give it a try.

After paying up, downloading the software, extracting the ISO and creating a bootable CD, I ran SpinRite and performed what is referred to as a “level 2” scan. For the first 20 minutes, SpinRite ran through my disk finding nothing untoward but at the 50% mark it switched into “DynaStat” mode and started trying to identify lost data on one particular sector, slowly narrowing down the unrecoverable bits of the sector. Just this one sector took almost 5 hours and around 2000 samples but all of a sudden SpinRite took off again and finished up the rest of the drive in another 20 minutes. Even though the sector was marked as unrecoverable, a technical support conversation by e-mail confirmed that this relates to the data, not the sector. With some trepidation, I restarted the computer, waited with bated breath and have never been so glad to see Windows start checking its disk(s). After a short while, chkdsk was complete and I was presented with a logon screen.

There’s nothing in the Windows event logs to indicate why my system failed to boot so many times this afternoon so it’s difficult to say what the problem was and whether it really was SpinRite that fixed it (although SpinRite did report the SMART data for the drive and there were a number of seek errors, backing up my theory that the hard disk is on its way out). What’s important though is that, as I write this post, Windows Server 2008 is 63% of its way through a backup and all seems to be well. I’m not quite ready to wholeheartedly endorse SpinRite – it does almost sound too good to be true – but, on the face of it, it seems to have recovered enough data on my disk to let Windows boot and for me to gain access to my system. That’s worth my $89 – although somehow I don’t see me getting that particular item through on my expenses…

Mounting ISO images in Windows 7

The Windows 7 beta includes the ability to burn CDs/DVDs from ISO images but it doesn’t seem to be able to mount them as volumes. As this beta is supposed to be feature complete, I don’t think it’s very likely that we’ll see this functionality added in future builds (even though rival operating systems can already do it…) but there are some third party alternatives available.

Last week, I (finally) got around to upgrading my netbook from Windows 7 milestone 3 (build 6801) to the beta (build 7000) and, as I didn’t have access to a DVD drive, I used Slysoft Virtual CloneDrive to mount the ISO as a drive, after which I could select the Windows 7 setup.exe from the autorun menu. It did exactly what I needed it to and that, rather lengthy, upgrade process didn’t seem to hiccup at all. From a quick trawl of the ‘net, there is at least one alternative out there (which I haven’t tried) – PowerISO – although this is a chargeable product and Virtual CloneDrive is freeware.

I got burned by [Google] FeedBurner

On Friday night I wrote a post which optimistically suggested that I’d successfully migrated this site’s RSS feeds from FeedBurner to the new Google FeedBurner platform. Unfortunately many people won’t have seen that post (at least not until after I spent a good chunk of my weekend enlisting the support of known subscribers to try and work out why the primary URL given out for this site’s RSS feed wasn’t working: Thanks go to Bill Minton, Alistair Doran, Garry Martin and Aaron Parker) – it all turned out to be because FeedBurner’s instructions for users of their MyBrand service have a vital step missing…

I’ve made the point before that free online services are not necessarily an ideal solution but nevertheless, many of us rely on them and hopefully my experiences will help out others who are going through the (soon to be forced) migration from the (old) FeedBurner platform to (new) Google FeedBurner.

If it ain’t broke, why fix it?

Some time ago, Google bought FeedBurner. That’s not all bad – I’m sure the guys who started FeedBurner were pretty stoked, and for us customers (who, after all, were largely using free services), things got better as premium services like MyBrand (more on that in a moment) were made free.

It was inevitable that at some point the service would be absorbed into Google’s infrastructure and if I hadn’t moved voluntarily in my own timescales (i.e. over a period when I was off work and potentially had some time to deal with any resulting issues), account migration would have been forced upon me at the end of this month.

What’s all the fuss about?

I may not have the largest blog in the world but I’ve worked hard for several years to get this blog to where it is now. With that in mind, I approached this migration with some trepidation but the Transferring FeedBurner Accounts to Google Accounts FAQ made it all sound pretty straightforward, including this text:

“Will I lose all my subscribers in this process?
You should not lose any readers of your feed during this transition process. All feeds.feedburner.com URLs will redirect your readers to feeds hosted by Google.

[…]

I use MyBrand, the service that allows me to map a domain I own to my feed. Do I need to change anything?
Yes. After transferring your account, you will be sent an email with instructions on how to change MyBrand. You can also get these instructions on the MyAccount page after the transfer.

You will be required to change your DNS CNAME to the Google hosted domain service, the same service that handles custom domains for Google applications like Blogger and Google Apps for Your Domain.

Please note that the CNAME will no longer be the same domain as the domain that serves feeds, but the service level and functionality will be identical.”

That all sounded straightforward enough, so I followed the migration steps on FeedBurner’s website until I saw a message that indicated successful completion. The follow-up e-mail included this text:

“Important! If you use MyBrand, FeedBurner’s custom-domain service, you need to make a very important change to ensure your feeds remain available to subscribers using your custom domain(s).

To make this change in your Google account, follow the instructions listed in your account’s new MyBrand page: http://feedburner.google.com/fb/a/mybrand”

I followed the instructions (i.e. made the changes to my DNS zone) but, 48 hours later (and after confirming that the name was resolving correctly) I was still receiving HTTP 404 (Not Found) errors when I used the http://feeds.markwilson.co.uk/marksweblog/ URL (i.e. the one that uses the FeedBurner MyBrand service to redirect subscribers to the real location).
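For reference, the MyBrand change itself is just a CNAME record in your DNS zone pointing the feeds hostname at Google’s hosted domain service – something like this in BIND zone file syntax (the target shown here is the one FeedBurner’s instructions gave at the time; check your own migration e-mail for the exact value):

```
; BIND zone file fragment (illustrative)
feeds.markwilson.co.uk.   IN   CNAME   ghs.google.com.
```

As it turned out, the DNS side of things was never the problem.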

The missing step

Double-checking my account settings and the DNS record I had edited, I decided to deactivate the MyBrand service and reactivate it. Some time later (I’m not sure exactly how long afterwards, but not long) the 404s were gone and I was able to check with some of my subscribers that their feeds were updated with posts from after the migration. Whilst I waited for this confirmation, another FeedBurner user confirmed that this had worked for him too but it would be good if the instructions included this step…

Why is this Google’s problem?

To be fair to FeedBurner/Google, buried in a post on the FeedBurner Status Blog is this text:

“4-DEC 2008: Have you recently moved your feeds to a Google account? Seeing a ‘404’ error when trying to view your feeds, and you use our MyBrand service? Try the following workaround to fix the problem:

  1. Sign in to feedburner.google.com.
  2. Visit My Account > MyBrand.
  3. Click the remove link next to your current MyBrand domain(s), and then click Save.
  4. Re-enter the domain(s) you removed in the previous step and then click Save.
  5. Try to view the feed(s) that were showing 404 errors before. They should now display your content.”

It should be noted though that these instructions don’t work (there is no remove option in the new interface)… and there is nothing more recent in the blog about this.

Meanwhile, the MyBrand page in the FeedBurner account settings is typically Googlite in its tone, basically telling us that we’re welcome to use the service but if it breaks we’re on our own.

“FeedBurner provides no technical support for MyBrand; you must configure DNS settings correctly with your domain/web hosting provider’s help. More technical detail and discussion of the requirements for using this service are available in this Help topic.

It can take up to 24 hours for a new CNAME entry to take effect. If your new feed address isn’t working, have a nice day and check back tomorrow.”

Thanks a bunch Google. My DNS is fine… but your migration process is broken and it seems that you don’t provide any method to report the problem.

Several other people have written about this problem (including one particularly useful post on the whole migration process) so it’s certainly not an isolated case but Google responses are nowhere to be seen on this two-week old post on the FeedBurner Help Group (providing a service without formal support is one thing but monitoring your own help forums is a basic courtesy).

Conclusion

Ironically, it was the FeedBurner MyBrand service (which lets me host a FeedBurner feed under my own domain name) that caused this problem but, because I use this service, many of the subscribers to this blog are using a URL that is under one of my domains (so, ultimately, under my control). This problem may have cost me a good chunk of my weekend but at least I got it fixed and, if I hadn’t been able to work out what was happening, I would have reverted to serving the feed directly from WordPress (with the consequential hit on bandwidth and loss of analytics). Imagine if I had a popular blog with a large number of subscribers that wasn’t hosted on my own domain name and the service went belly-up…

Useful Links: January 2009

A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:

Why adjustment layers are preferable to directly editing an image in Photoshop

I’ve been trying to improve my Photoshop skills recently including signing up for Digital Photography evening classes at a local college (which, 4 weeks in, are very disappointing) but I’ve also been picking up some tips at my local camera club.

At last night’s club meeting, John Winchcomb gave a very technical talk on tone and colour correction which I’m still trying to get my head around but that talk included a very useful tip: instead of reaching for the various adjustment options on Photoshop’s Image menu (including common options like levels and curves), consider creating a new adjustment layer instead. That way it is possible to go back and edit the adjustments as they are applied as a non-destructive edit rather than being directly applied to the image. Normally the adjustment layer will apply to all layers below but it can be created as a clipping layer to only affect the layer immediately below.

Other advantages to adjustment layers include the ability to selectively edit using an image mask and also to copy and paste adjustment layers in order to apply the same changes to multiple images.

Whilst on the subject of layers, it’s probably worth highlighting another tip I picked up recently: before doing anything in Photoshop, create a new layer by copy (Ctrl+J on a PC or command+J on a Mac) and work on that. Using this method, the original image will remain on the background unaffected, should you ever want to revert, or to compare the manipulated image with the original.

RSS feeds migrated to a new host – hopefully everything is still working

For almost as long as this blog has been up and running I’ve been using FeedBurner to manage the RSS feeds. It’s been working well for years but Google bought FeedBurner a while back and tonight my feeds were migrated to the big G.

Those who are subscribed to http://feeds.markwilson.co.uk/marksweblog/ shouldn’t see any changes (at least not if I made the DNS changes correctly) but there may still be a few people subscribed using old feed addresses (e.g. http://feeds.feedburner.com/marksweblog/) and these might not always work (sadly this is outside my control). If you do find that my posts stop appearing in your RSS reader, please try resubscribing to the site feed.

Thanks for sticking with me.

[Update 31 January 2009: There seems to be a problem with the main feed as it’s returning HTTP 404 (Not Found) error pages (even where DNS propagation is complete)]

[Update 1 February 2009: The 404s are fixed; a full description of what went wrong has been posted; please let me know (assuming you can read this) if you find any other issues with the feed].

Running Windows 7 on a netbook

Now that the Windows 7 beta is out and my NDA has lifted, I can finally write about my experiences of installing Windows 7 on a netbook. In a word:

Sweet.

You see, Windows XP works well on one of these little machines but who wants XP? It’s eight years old (an eternity in IT) and, for anyone who’s used to working with anything remotely modern, it’s a bit difficult to step back to (I was recently forced to revert to Windows XP and Office 2003 for a month whilst my main machine was being repaired and it was painful). I could install Vista but even Microsoft doesn’t think it’s the right OS for a netbook (that’s why they allowed vendors to continue shipping XP). Meanwhile, much has been said about how Windows 7 requires fewer resources and I wanted to find out how it would run on a typical netbook.

My Lenovo S10e arrived in early December and before installing anything I took an image of the hard disk so that I could return it to factory state if required (Lenovo provides a Windows PE-based recovery image but that’s not much good if you’ve accidentally wiped the hard disk using a pre-release operating system). I could have used Windows Deployment Services for this, but it was just as easy to fire up an old copy of Ghost and boot from a floppy drive and universal network boot disk.

With the disk backed up, I set about installing Windows 7 in a dual-boot scenario (I still needed to drop back to XP occasionally for BBC iPlayer downloads – although since then the BBC has made a version of iPlayer available that runs on other platforms). This was where I found James Bannan’s step-by-step guide on installing Vista to dual boot with XP so useful – the process that James describes is not exactly rocket science but it is good to know you’re following a process that has worked for someone else – and it also works for Windows 7.

Following James’ notes I used diskpart to shrink the existing disk partition, create a second partition and install Windows 7 (with no optical drive available, I used a USB hard disk as a boot volume and Windows 7 installed quickly and easily). There were a couple of unrecognised devices (the Broadcom wireless card and the Lenovo power management) but I installed the XP drivers (I would have expected the Vista drivers to work, rather than the XP ones) and they seemed to do the trick. Everything else worked as intended.
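For anyone repeating this, the repartitioning can be scripted rather than typed interactively – something along these lines, saved to a text file and run with diskpart /s shrink.txt from an elevated prompt (the disk/volume numbers and the 40000MB shrink figure are examples only; pick values to suit your own system):

```
rem Shrink the existing XP volume and create a partition for Windows 7
select disk 0
select volume 1
shrink desired=40000
create partition primary
format fs=ntfs quick
assign
```

Windows 7 setup can also format the new partition for you, so the format step is optional.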

Next, I downloaded and ran EasyBCD to edit the boot options. The Windows 7 installation wiped the boot loader that Lenovo had supplied, so I have no access to the recovery volume but it was simple enough to put Windows XP back in as a boot time option (actually, it should be simple enough to put in the recovery volume when I get around to it).

With Windows 7 installed, the next items to install were Vodafone Mobile Connect (which installed using the same options as for Vista) and some antivirus software (I used the free version of AVG, although I’ve been having problems whereby the resident shield won’t start automatically and I have to deactivate it, save the changes and then reactivate it).

After a few more apps (Microsoft Live Meeting, Windows Live Writer, and Google Chrome – not to use as a browser but to set up application shortcuts for Google Mail and Calendar), I had the machine configured as I needed for roaming around, checking e-mail, writing the odd blog post, etc.

So, how did it perform? Absolutely fine. This machine has a 160GB hard disk, a 1.6GHz Intel Atom CPU, integrated graphics and just 1GB of RAM. 3D graphics support was great (with really smooth transitions – e.g. Flip 3D) and the Windows Experience Index showed 2.2, which may not sound high but makes sense when you look at the subscores:

Component           What is rated                                  Subscore
Processor           Calculations per second                        2.9
Memory              Memory operations per second                   4.4
Graphics            Desktop performance for Windows Aero           2.2
Gaming graphics     3D business and gaming graphics performance    3.0
Primary hard disk   Disk data transfer rate                        5.3

So, fast disk, fast memory, let down by the CPU and the graphics. Not surprising given the class of machine that we’re looking at here.

Task Manager shows that Windows 7 is using 650MB of RAM, which doesn’t leave a huge amount for Office applications but it’s fine for a bit of browsing, e-mail, blogging, and even watching videos. Regardless of the fact that the machine seemed to run well with only a gig of RAM, I decided to see if adding more would make a difference.

First, I tried ReadyBoost to see if it would increase system responsiveness, using an old 1GB SD card, but I have to say that I’m not sure it really made any difference. Then I bought a 2GB SODIMM from , taking the total installed to 2.5GB (for some reason, XP only sees 1.99GB but Windows 7 recognises the whole amount) and measured the stats again. Surprisingly, the score dropped, but only by a fraction as the graphics subscore fell to 2.1 with memory IO slightly up to 4.5 and all other scores unchanged (as might be expected – after all, none of those components had been upgraded).

Windows 7 System Properties on Lenovo S10e after 2GB memory upgrade

So, eight weeks after installation, what’s my verdict? Well that is probably pretty obvious by now – Windows 7 runs nicely on a little netbook. How it will perform on older hardware is anyone’s guess but it also seemed fine on my Compaq Evo D510SFF with a 2.4GHz Pentium 4 CPU and 2GB of RAM (albeit with basic graphics). On that basis, it should be fine for most corporates (although even Vista should also be, with tactical RAM upgrades) and the only barriers to adoption will be cost (of a desktop refresh at a time of economic uncertainty) and application compatibility (as with Vista). It’s also remarkably stable – and I’m still running the pre-beta code (build 6801 with the Blue Badge “tweak”).

There’s plenty written elsewhere about Windows 7 features but those were not the purpose of this post. The one thing I cannot ignore is that Microsoft has yet to make a statement on netbook support for Windows 7, although TechRadar includes the major points in its article explaining Windows 7 netbook system specifications. Microsoft’s problem is that revenues are lower on netbooks (if the hardware is sub-£250, then it’s difficult to sell an operating system at full price without making the Linux alternatives look more attractive) but it also wants to stop shipping XP.

It seems to me that this is purely a marketing issue – from a technology standpoint, Windows 7 (plus Windows Live Essentials) seems to be an ideal netbook operating system.

A quick look at Windows ReadyBoost

My netbook only came with 1GB of RAM, so I decided to see what effect the option to “Speed up my system using Windows ReadyBoost” would make (presented by Windows Vista and later when inserting removable media – more details can be found over on the Kodyaz Development Resources site).

First of all I tried a 1GB USB key that I’d been given with some presentation materials on it but Windows told me the device was not fast enough to use for ReadyBoost.

That was something of a surprise to me – I knew that not all devices were suitable for ReadyBoost but how could I tell why my device was failing? In his article, Is your flash drive fast enough for ReadyBoost?, Ed Bott explains that:

“If you get a failure message when you first insert a flash device and try to use it as a ReadyBoost drive, you can click Test Again to get a second hearing. If the drive fails several tests, you can look up the specific performance results for yourself. Open Event Viewer (Eventvwr.msc) and click the Applications And Services Logs category in the console tree on the left. Under this heading, click Microsoft, Windows, and ReadyBoost. Under this latter heading, select Operational. The log entries in the center pane include performance test results for both successful and unsuccessful attempts.”

Sure enough, checking the logs on my Windows 7 system showed messages like:

Source: ReadyBoost
EventID: 1008
Description: The device (UT163 USB Flash Disk) will not be used for a ReadyBoost cache because it does not exhibit uniform performance across the device.  Size of fast region: 0 MB.

and:

Source: ReadyBoost
EventID: 1004
Description: The device (UT163 USB Flash Disk) will not be used for a ReadyBoost cache because it has insufficient write performance: 173 KB/sec.

173KB per second is about 10% of the required speed for ReadyBoost so I tried again, this time using a 1GB SD card.
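As an aside, Microsoft’s documented minimums for ReadyBoost devices are 2.5MB/s for 4KB random reads and 1.75MB/s for 512KB random writes, which is where that “about 10%” figure comes from. A quick sanity check (the thresholds are hard-coded from the documentation and the function is purely illustrative):

```python
# Documented ReadyBoost minimums: 2.5 MB/s random reads, 1.75 MB/s random writes
READ_MIN_KBPS = 2.5 * 1024    # 2560 KB/sec
WRITE_MIN_KBPS = 1.75 * 1024  # 1792 KB/sec

def readyboost_capable(read_kbps, write_kbps):
    """Return True if a device meets both ReadyBoost speed minimums."""
    return read_kbps >= READ_MIN_KBPS and write_kbps >= WRITE_MIN_KBPS

# The failing USB key from the event log: 173 KB/sec writes
print(readyboost_capable(read_kbps=3000, write_kbps=173))   # False
print(round(173 / WRITE_MIN_KBPS * 100))                    # 10 (percent)

# The SD card that passed: 3311 KB/sec reads, 3500 KB/sec writes
print(readyboost_capable(read_kbps=3311, write_kbps=3500))  # True
```

The event log figures for my two devices line up neatly with these thresholds: the USB key failed on write speed alone, while the SD card cleared both bars comfortably.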

First I saw an event to indicate that the card exhibited the necessary performance characteristics:

Source: ReadyBoost
EventID: 1000
Description: The device (Generic- Multi-Card) is suitable for a ReadyBoost cache.  The recommended cache size is 991232 KB.  The random read speed is 3311 KB/sec.  The random write speed is 3500 KB/sec.

and then a second event recording the creation of the cache:

Source: ReadyBoost
EventID: 1010
Description: A ReadyBoost cache was successfully created on the device (Generic- Multi-Card) of size 966 MB.

So, after creating the cache, did ReadyBoost actually make a difference?  It’s difficult to say – on a relatively low-powered PC (the one I used only has an Intel Atom 1.6GHz) performance is not blindingly fast and, as the USB ports (including internal ones used for devices like media card readers) rely on the main CPU for IO processing, it could be argued that use of USB attached memory would even compound the issue when the PC is running out of steam.  Those with faster PCs, or faster memory devices may see a difference.

Long Zheng has a good summary in his article which puts forward the notion that ReadyBoost works but that it’s not a miracle:

“I don’t agree with […] how ReadyBoost has been marketed and perceived by the public. ReadyBoost does not improve performance, it only improves responsiveness. It won’t make your system or [applications] run any faster, but it will make things faster to load and initialize to a working-state.

If you’re on a budget, then ReadyBoost is premium accessory that is definitely not value-for-money. You’re literally paying a price to slice milliseconds off loading times. But if you’re a professional or heavy business user, then ReadyBoost might be a cheaper, easier or the only alternative to upgrading memory.”

Long suggests that ReadyBoost is not value for money. I’d add that it may be if, like me, you have a lot of small USB keys that are doing nothing more than gathering dust on a shelf.  It’s probably not worth investing money in new hardware especially to use ReadyBoost though.  Indeed, one of Long’s readers (Tomer Chachamu) makes a distinction which is extremely important to consider:

“I am using [ReadyBoost] for several weeks now and I can confirm your experiences, that it helps a lot to improve the responsivness [sic.] of the system.

So it helps to make the whole system perform faster. So isn’t it the same?

High responsiveness: the system ‘feels fast’ and you don’t have to wait for something to load when you’re about to go to a command. (Example of high responsiveness: when you logon, you immediately want to go to the start menu and launch something. The time from logon to launch is a busy wait for you.) – this is affected by readyboost [sic].

High speed: the system performs computational (or I/O) tasks fast. (Example: you are ripping a massive library of CDs. It takes about 10 minutes. If it took less time, say by offloading floating point calculations to the GPU, then that would be high speed. It’s still longer than half a minute so the system is fast, but not responsive. When you’re encoding the CDs, you can do other useful activities, so it’s a non-busy wait.) – this is not affected by readyboost [sic].”

ReadyBoost is not about high speed – it’s about responsiveness (which explains why PC World were unimpressed when they tested some ReadyBoost-capable USB flash drives on Windows Vista).

In the end, I decided to buy some more RAM but, for those considering using ReadyBoost, it’s worth checking out Tom Archer’s ReadyBoost FAQ.


Sound only coming from one speaker on your iPhone?

I’m now on my third iPhone 3G (in just 5 months):

  • #1 developed a crack in the case (and was replaced under warranty by Apple).
  • #2 showed light at the edge of the screen where it should have been covered with the black bezel (visible in a dark room), so yesterday Apple changed that for me too (on yet another visit to the Apple Store to try and get my MacBook fixed…).
  • #3 so far so good, except…

…I was playing some music tonight and accidentally covered the speaker with my finger. Then I noticed that sound only came out of one speaker. Arghhh!!!

It turns out that is normal behaviour. The left grill next to the dock connector covers the speaker, the right grill covers the microphone.

Checking out some of the forum sites reveals this is a common concern (there’s even a website dedicated to the topic) but nevertheless it had me worried until I googled it…

Of course, I could just RTFM – but where’s the fun in that?

Windows 7 beta deployment tools

Those who are checking out the Windows 7 beta with a view towards automated deployments may be interested to note that a beta of the Windows Automated Installation Kit has been released for Windows 7 along with an open beta of the Microsoft Deployment Toolkit 2010.