Monthly Archives: December 2008

Uncategorized

Useful Links: December 2008

A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:


A quick look at Lab Management in Visual Studio Team System 2010

A few weeks ago I referred to Microsoft’s announcement of Visual Studio 2010 Lab Management, asking if this was Microsoft’s answer to VMware Stage Manager and the answer is… sort of.

I don’t know a huge amount about Stage Manager but the basic premise is that it targets release management by placing virtual machine images into a configuration (a service) and then promoting or demoting configurations between environments based on the rights assigned to a user. Images can also be archived or cloned to create a copy for further testing.

Microsoft’s approach is subtly different. As you’d expect from a product that’s part of Visual Studio, it’s focused on helping developers avoid configuration drift and perform repetitive system tests during the application development lifecycle, leaving the System Center management products to handle the movement of virtual machines between environments in the virtual infrastructure.

The VSTS approach attempts to address a number of fundamental issues:

  • Reproduction of bugs. It’s a common scenario – a tester files a bug but the developer is unable to reproduce it so, after a few rounds of bugfix ping-pong, the incident is closed with a no-repro status, resulting in poor morale on both sides.  Lab Management allows the definition of test cases for manual testing (marking steps as passed/failed) and includes an action log of all steps performed by the tester.  When an error occurs, the environment state can be checkpointed (including the memory, registry, operating system and software state), allowing the issue to be reproduced.  A system of collectors is used to gather diagnostic data and, with various methods provided for recording tests (as a video, a checkpoint, an action log or an event log), it’s possible to automate the bug management and tracking process including details, system information, test cases and links to logs/checkpoints – all of this information is presented to the developer within the Visual Studio interface and the developer has access to the tester’s environment. In addition, because each environment is made up of a number of virtual machines – rather than running all application tiers on a single box – so-called “double-hop” issues are avoided, whereby the application works on one box but issues appear when it’s scaled out. In short: Lab Management improves quality.
  • Environment setup. Setting up test environments is complex but, using Lab Management, a developer can use a self-service portal to rapidly create a new environment from a template – not just a VM, but the many interacting roles which make up that environment: a group of virtual machines with an identity (for example, an n-tier web application).  These environments may be copied, shared or checkpointed.  The Lab Environment Viewer allows the developer to view the various VM consoles in a single window (avoiding multiple Remote Desktop Connection instances) as well as providing access to checkpoints, allowing developers to switch between different versions of an environment and, because multiple environment checkpoints use the same IP address schema, supporting network fencing.  In short: Lab Management improves productivity.
  • Building often and releasing early.  Setting up daily builds is complex and Lab Management’s ability to provide clean environments is an important tool in the application development team’s arsenal.  Using VSTS, a developer can define builds including triggers (e.g. date and time, number of check-ins) and processes (input parameters, environment details, scripts, checkpoints, unit tests to run, etc.).  The traditional build cycle of develop/compile, deploy, run tests becomes develop/compile, restore environment, deploy, take checkpoint, run tests – significantly improving flexibility and reducing setup times. In short: Lab Management improves agility.

From an infrastructure perspective, Lab Management is implemented as a new role in Visual Studio Team System (VSTS), which itself is built on Team Foundation Server (TFS). Lab Management sits alongside Test Case Management (also new in Visual Studio 2010 – codenamed Camano), Build Management, Work Item Tracking and Source Control.

Vishal Mehotra, a Senior Lead Program Manager working on VSTS Lab Management in Microsoft’s India Development Center, explained to me that in addition to TFS, System Center Virtual Machine Manager (SCVMM) is required to provide the virtual machine management capabilities (effectively, VSTS Lab Management provides an abstraction layer on the environment using SCVMM). Whilst it’s obviously Microsoft’s intention that the virtualisation platform will be Hyper-V, because SCVMM 2008 can manage VMware Virtual Center, it could also be VMware ESX.  The use of enterprise virtualisation technologies means that the Lab Management environments are scalable and the VMs may be moved between environments when defining templates (e.g. to take an existing VM and move it from UAT into production, etc.).  In addition, System Center Operations Manager adds further capabilities to the stack.

Whilst the final product is some way off and the marketing is not finalised, it seems likely that Lab Management will be a separate SKU (including the System Center prerequisites).  If you’re looking to get your hands on it right now though you may be out of luck – unfortunately Lab Management is not part of the current CTP build for Visual Studio 2010 and .NET Framework 4.0.

This post really just scrapes the surface (and, as I’m not a developer, that’s about as far as I can take it).  To find out more, read about Lab Management in VSTS 2010 over at Somasegar’s Weblog or check out the video of the PDC 2008 session on improving code quality with VSTS 2010 Lab Management, presented by Principal Program Manager, Ram Cherala.


Microsoft E-Learning courses: the good, the bad and the ugly

The couple of weeks leading up to Christmas involved a lot of intense revision for me, as I prepared for the Microsoft exams to finish updating my MCSE on Windows Server 2003 to MCITP Enterprise Administrator.

When I set out to do this, I had originally intended to combine the tasks of reviewing John Savill’s Complete Guide to Windows Server 2008 with getting ready for my exams but it soon became apparent that I simply didn’t have enough time to work my way through the entire volume (excellent though it is!). Instead, I used the Microsoft-published exam preparation guides to identify the recommended Microsoft E-Learning courses.

If I’d written a review of the courses after the first couple of days it would have been a glowing recommendation – and in some respects perhaps I should be holding off on this review as I am somewhat battle-weary; however, having just taken two certification exams based on this study method it seems as good a time as any to assess the suitability of these courses.

The good

Starting out with the selection, there is a huge catalogue of courses available which mirror the Microsoft Official Curriculum instructor-led courses. The prices are not bad either, when compared with classroom training; however, in many ways, I prefer the interaction that a classroom environment provides.

The format of the courses is good – built up as a number of virtual classroom modules, with a mixture of demonstrations and animations (with transcripts), textual content, and puzzles/tests in each lesson. Each lesson ends with a self-test and there is a summary and a glossary at the end of each module. There’s also a full-text search capability.

It’s possible to synchronise the content with a local cache for offline viewing. Indeed, I only used the courses online for one day, when I was in the office and the proxy server wouldn’t let me download some new courses for offline working (the offline player can edit proxy settings in its options, but it is not exposed in Windows until a course has been successfully downloaded and launched – and online viewing required me to add microsoftelearning.com to Internet Explorer’s trusted sites list). It’s important to note, though, that the virtual labs must always be completed online as this functionality is not available in the offline player.

The bad

Somewhat annoyingly, the course overview (which is the same for each course) and the glossary are included in the progress count, so after completing all of the available lesson content, most of the courses I attended were marked as only partially completed (it is possible to mark a course as complete in the My Learning section of the Microsoft Learning website but this will not complete the course in your transcript).

I could almost forgive elements like this, but the next annoyance really affected my ability to learn. You see, I’m English, and I will admit that sometimes I find it difficult to listen to an American accent for a long period of time (that’s nothing personal – I’m sure the same happens in reverse). But the demonstrations and animations in these courses are recorded in an American monotone – and it doesn’t even seem to be human. After listening to a few of these, with misplaced paragraph breaks and identical pronunciation for recurring words regardless of sentence structure and intonation (or lack thereof), they actually become very difficult to concentrate on. Towards the end of my revision I stopped working through entire courses and instead concentrated on the introductions, summaries, and making sure I could complete the puzzles and self-tests at the end of each lesson – avoiding the computer-generated monotone entirely. Simply recording all of the demonstrations with a human voice (as most of the module introductions are) would be a vast improvement.

The ugly?

Then there are the animations – which are at best ugly and at worst confusing. Watching icons appear and disappear in a manner which at times appeared to be random whilst the computer was talking to me did not help at all. In the end, I nearly always resorted to reading the transcript.

Whilst the animations may be a design crime (as are many of the diagrams in Microsoft Official Curriculum courseware), even worse was the inaccuracy of some of the information presented – which shows it was produced by an outside agency (Element K) and sometimes suffers from a lack of technical quality assurance.

Let me give some examples:

  • Course 6519: one of the self-tests at the end of a lesson claims that NT 4.0 supports Kerberos (for that I would need Windows 2000 or later); and in the context of Active Directory database mounting, the module claims that one should “use a line printer daemon (LPD) utility, such as Active Directory Users and Computers, to view the data” (clearly LPD should have been LDAP…).
  • Course 6521: one of the reviews claims that only Active Directory Lightweight Directory Services uses an extensible storage engine (ESE) for its database store – contradicting the text elsewhere in the module (as well as being incorrect); and a self test asked me to “identify the feature that AD LDS supports but AD LDS does not support” (!).
  • Course 6524: the text refers to .PIX files, whilst the demonstrations clearly show that the extension is .PFX.
  • Course 6536: claims that “Hyper-V is supported only by the Windows Server 2008 Standard 64-bit edition” (64-bit yes, standard edition only – certainly not).
  • Course 6169: claims that “The various wireless networking standards are 802.11, 802.11b, 802.11a, 802.11g, 802.1X, and 802.11n” (802.1x is used for implementing network security but is not specifically a wireless networking standard).

There are typos too (sever instead of server, yes and no the wrong way around in test answers, etc.) as well as references to product names that have not existed since beta versions of Windows Server 2008 (e.g. Windows Server Virtualization). Other beta information has not been refreshed either – course 6529 refers to a 30-day grace period before Windows enters reduced functionality mode when it is actually 60 days (and RFM is much less brutal today than it was in the original versions of Windows Vista and early Windows Server 2008 betas). In another place, virtual machine (VM) snapshots are mixed up with Volume Shadow Copy Service (VSS) snapshots as the course suggests that VM snapshots are a backup and recovery solution (they most certainly are not!).

I could go on, but you get the message – almost every module has at least one glaring error. Mistakes like this mean that I cannot be 100% certain that what I have learned is correct – for that matter, how do I know that the Microsoft examinations themselves are not similarly flawed?

Summary

In the end, I don’t think it was just these courses that helped me pass the exams. Boot camps (and that’s what intense online training is the equivalent of) are all very well to cram information but they are no substitute for knowledge and experience. The outcome of running through these courses was a combination of:

  • Refreshing long-forgotten skills and knowledge on some of the lesser-used functionality in Windows Server 2008.
  • Updating skills for new features and functionality in Windows Server 2008.

Without several years of experience using the products I doubt that I would have known all of the answers to the exam questions – indeed I didn’t know them all (but the knowledge gained from the online training helped me to evaluate and assess the most likely of the presented options).

So, is this training worth it? Probably! Is it a complete answer to exam study and preparation? Possibly – but not through cramming 100 hours of training into a couple of weeks and expecting to retain all the knowledge. What these Microsoft E-Learning courses represent is a low-cost substitute for formal, instructor-led classes. There are some downsides (for example, the lack of interaction and the poor quality control – instructor-led courses benefit from the feedback that the instructors provide to allow improvements at each revision) but they are also self-paced, and the ability to go at my own speed means that, given sufficient time, I could work through a few of these each week and allow time for the knowledge to settle, backed up with some real-world experience. On that basis, they’re certainly worthy of a look but don’t expect them to provide all of the answers.

If you want to try one of the Microsoft E-Learning courses there are plenty available discounted (or even free). Afterwards, I’d be interested to hear what you think.


Christmas gadgets in the Wilson family

I’m writing this a couple of days after Christmas, as the great British public starts looking for bargains in the sales.  If you’re reading this blog, the chances are that technology plays a significant part in your life, in which case it’s likely that Father Christmas/Santa Claus/St. Nick/whoever distributes presents to your house brought you at least one gadget for Christmas.

The old saying is that it’s better to give than to receive, and I gave away a couple of gadget gifts this Christmas with which the recipients have been particularly pleased, so I wanted to tell you a little bit about them…

Pure Evoke-1S DAB digital radio

First up is the DAB digital radio that I bought for my wife.  Normally, she despairs of my penchant for expensive electronic items but a few weeks back she returned from a girly weekend in Sheffield and told me that some friends of ours had a digital radio and she’d really like one.  Naturally, I was pleased – I’d toyed with the idea for a while but I don’t listen to enough radio to justify it – I was even more pleased when it transpired that the model that had impressed her so was from Pure and it was the Evoke-1S.

Pure is one of the better-known digital radio producers in the UK and the Evoke-1S is the current version of a model that has been around for a while.  Whilst some consider the wooden finish to be a little tired, I much prefer it to the cheap-plastic-sprayed-with-metallic-effect-paint that seems to be the current trend for consumer electronics.  It sounds pretty good too.  One speaker means mono but this is a radio (not the ultimate in high fidelity sound presentation) and anyway, this device lives in our kitchen (which does not have the best acoustics either).  For what we wanted – access to digital radio in an attractive unit at a reasonable price – this unit is perfect.

How perfect? Well, the setup process was easy enough for Mrs. W to get up and running in a few seconds with no intervention from me – that’s a good start.  It also has an alarm, and a kitchen timer.  There’s provision for a second speaker, as well as an auxiliary input (I’ll probably get an iPod dock and then I can use it to catch up on downloaded episodes of The Archers).  Then there is a USB port for software updates (e.g. to DAB+, should it ever reach these shores).

The Evoke-1S is just one of many models available from Pure (for a little more money I thought the Evoke Flow looked good, featuring Internet radio amongst its capabilities) but it was also a bargain.  Available for around £96 from most electronics retailers (maybe less if you go for the Cherry version – we preferred the Maple finish), I bought ours from John Lewis Online.  John Lewis Online’s packaging is terrible (it’s just a strong plastic bag) so the box was a little beaten up when it reached us, but they were happy enough to offer me a discount by way of compensation.  The next day, I saw it on sale in PC World for £68.47 so I bought that (to guarantee stock) before getting John Lewis to price match (John Lewis Online won’t price match but the bricks-and-mortar stores are never knowingly undersold, so they refunded and resold the item to me, after which I returned the unit I’d bought in PC World).  Why the complicated refund and resale?  Because John Lewis offer a two-year warranty on all new electrical purchases and because, if I had to guess which of the two retailers will still be trading in two years’ time, John Lewis looks a more certain bet to me than PC World’s parent company, DSG International (it should be noted that I have no information to back that up – it’s purely a personal opinion).

To find out more about the Evoke-1S, there’s an FAQ on Pure’s support pages.

VTech Kidizoom Multimedia Digital Camera

My children have grown up in the glare of my digital cameras and we thought that my 4-year-old would enjoy one of his own (for the last year or so he’s been playing with an old disposable camera body that I glued shut and he loves taking pretend photos).

After looking into the various options for rugged digital cameras (i.e. those with rubberised bodies designed to withstand the inevitable knocks and bangs which will be inflicted by children), we settled on a VTech Kidizoom (which we got 20% off by using a voucher at ELC). What we hadn’t realised was just how big a hit this would be.  By the end of Christmas day he had taken over 400 pictures and it was pushing 700 by the end of Boxing Day!  Furthermore, he takes his camera everywhere with him (just like his Dad… although I don’t take mine to bed with me!) and looking at his pictures has given me a great insight into the things that interest a 4-year-old (pictures of Mummy, Daddy, his brother, visiting grandparents, uncles, cousins, etc., his toys, the food on the table, his teddy bears, the Christmas tree, the television, the view from the car window, boats on the lake in the local country park, the produce in the supermarket, etc., etc.).  I simply cannot stress how much my son loves this present – I have never seen anything hold his attention for so long.

Ironically, we nearly didn’t buy it for him – most of the reviews concentrate on the poor quality of the pictures, the inadequate flash, the fact that the pictures in the internal memory are lost when the (non-rechargeable) batteries run out, etc. and, having experienced the device now, I wanted to set the record straight.  After all, I think I’m qualified to do so: as a parent; an amateur photographer; and as an IT bloke.

Firstly, picture quality.  Yes, this is a 640×480 (0.3MP) camera and so the pictures are not great – that’s putting it mildly – actually they are pretty awful, with sludgy colours and high compression.  But it’s also in the hands of a 4-year-old!  He may take the odd picture that’s OK but these are unlikely to be the family album shots and he’s more than happy looking at them on the computer screen or the TV (the camera is supplied with USB and composite video leads for connection to a PC or a TV).

Here’s an example of one of the pictures – at 33% and a section at 100%:

[Sample image taken with the VTech Kidizoom at 33 percent, and a section of the same image at 100 percent]

Next, the flash – it’s not very powerful (probably for safety reasons… as little people are bound to hold the camera the wrong way around and take a self portrait) and it bleaches out anything close-up, but it does the job – sort of.  I’ve come to the conclusion that this camera is designed for pictures to be taken at a distance of around 1.5 metres (which is the sort of distance my son stands from his subject anyway!) but pictures taken outside are definitely better.

[Sample image taken with the VTech Kidizoom using flash, and a sample image taken outdoors]

The Kidizoom comes with 16MB of internal memory and the instructions do warn that it will be erased if the batteries (4xAA) run out, but it also has an SD card slot and the pictures on this card are safe in the event of power loss.  I’ve set my son’s camera to use an old 1GB SD card by default and that’s room for around 33,000 of his pictures (it was just an old card that I was using for ReadyBoost on the PC and is too small to be useful in any of my cameras).  Basically, RTFM and then losing photos when the batteries run out won’t be a problem.

[Sample image taken with the VTech Kidizoom with a border added in-camera]

I haven’t used the supplied software – both Windows and Mac OS X detected the internal memory and the SD card as removable drives and were happy to copy over the pictures.  The camera also includes a video mode and some games but we haven’t used them yet – at this point still photos are a big enough attraction.  It’s a bit too easy for the kids to turn the flash off (although I had to read the manual to work out how to turn it back on) and quite a few pictures seem to have had a novelty border added by accident, but these are minor issues given the market at which this camera is aimed. There’s also no EXIF data, and the date and timestamp seem to be added when the image is copied from the camera – not when it’s taken.

In summary, this camera is far from perfect but I also have to remember that my 2-year-old will get his hands on it too and it’s more than good enough to last the next couple of years until they can both be trusted with a “grown-up” digital compact.  In terms of entertainment value it’s been a huge hit – most children emulate their parents and mine are certainly happy to be snapping away like their Daddy.

Nikon D60 DSLR camera

My brother used to be a reasonable photographer but these days he hardly picks up his film camera (a Minolta X300 which, incidentally, I really enjoyed for its raw simplicity last time I used it).  In an attempt to set things straight this Christmas, his other half bought him a Nikon D60 kit, based on advice from me, including an 18-55VR lens (see DPReview for a review of the D60).  The lens is a standard kit lens – cheap, built of plastic, slow glass – but it’s enough to get him started (and he can borrow one of mine if he needs to).  He brought it over to my place on Boxing Day and I was impressed – in fact I would say that, as a consumer DSLR, the D60 is fantastic.  It’s not the top of Nikon’s range (far from it) but it matches or exceeds most of the features in my ageing D70, packs in more pixels (10.2MP), includes image sensor cleaning, and is smaller, lighter and more compact.  All in all, it’s a great DSLR – especially at around £300.

If you prefer Canon then all I have to go on is the fact that my Dad seems pretty pleased with his new 1000D, which appears to be broadly equivalent to the D60.  If you’re looking at any other brand for a DSLR I’d question why – Canon and Nikon are the market leaders which means there is a huge range of OEM and aftermarket support (accessories, etc.) and both offer plenty of scope to progress to a more advanced model, if required, at a later date.

Closing thoughts

Apologies to those who don’t find these items remotely interesting but this website comes up on enough Google searches that hopefully my comments will be of use to someone.  And if someone bought you something for Christmas that you think is fantastic and you’d like to share it with the world, please leave a comment on this post.

As for me – what gadgets did I receive this Christmas?  Nothing in particular (I bought myself a netbook a few weeks ago and most of what I asked for was books – like the excellent Landscape Photographer of the Year collections) but I do collect Pixar movies and I’m just about to sit down and enjoy the DVD of WALL-E that Father Christmas left in my stocking!


Passed Microsoft Certified IT Professional exam 70-647

That’s it. Done it! I’ve just passed the last exam I needed to take (70-647) in order to update my MCSE on Windows Server 2003 to MCITP: Enterprise Administrator for Windows Server 2008, before the vouchers I had for free exams expired and just in time for Christmas!

For anyone else thinking of upgrading their Microsoft certifications for Windows Server 2008, check out the post I wrote last year on Microsoft Learning and plans for Windows Server 2008 certification.

There’s also a PDF available which shows the various transition paths from earlier certifications.


Giving or receiving a PC as a Christmas present? Take an image of the drive first

Some quick advice for those of you about to open up new PCs bought for Christmas (for that matter, the advice is equally applicable whatever the occasion)… avoid the temptation to dive straight in and have a play. Instead, take an image of the hard drive as it arrived from the factory – yes, you can always use the manufacturer’s instructions to return the machine to a restored state but this can take a long time (as I found a few weeks back when I succumbed to temptation and had a play with my Lenovo IdeaPad S10e before imaging it!).

As I write this, I’m setting up a new Dell notebook for someone (an Inspiron 1525) and the first key I pressed (after the power button) was F2 to go into the BIOS. Here I changed a couple of settings (boot order and numlock key state), then booted again with F12 for the boot menu to use my device of choice (floppy disk drive, USB drive, or PXE network boot) to boot to my image capture software of choice (Symantec Ghost, Windows Deployment Services – name your poison) and take an image of the entire drive (not a partition).

Following this, I can configure the device as intended (remove the crapware, install some AV software, install an office suite, etc.) and then take another image when I’m done.

Of course, for corporate deployments it’s normal to blow away the manufacturer’s image and install something more appropriate to the organisation’s requirements but, for home and small business users, it’s perfectly acceptable to use the factory-supplied build and this might just save you some time if you ever need to return to square one.


Submitting podcasts to iTunes

One of the few things I managed to get done last week was to submit the enhanced podcast (AAC) version of the Coalface Tech RSS feed to Apple for inclusion in the iTunes podcast directory.

Actually it’s remarkably straightforward but here’s a few pointers for anyone who is getting started with this podcasting lark.

First up – you need to understand that there are two things called iTunes:

  • Apple’s online store with audio and video content (depending on whereabouts you live in the world).
  • Apple’s media player for Windows and Macintosh PCs, used by millions of iPod and iPhone owners worldwide (as well as many people with other devices, I’m sure).

Next – you need to understand that podcasts are generally distributed using an RSS feed (just like blogs but with enclosures containing the media files). The RSS feed is structured using XML.

You can subscribe directly to the RSS feed (and even view it in a browser), or you can use a podcast directory (such as the iTunes Store).

James had created the original XML for our RSS feeds (one for the MP3 version and one for the AAC version of the podcast) using a feed generator (I’m not sure which one he used but there is a basic podcast RSS generator available in the TD Scripts webmaster utilities). Not all feed generators support the iTunes-specific metatags though and it’s useful to know what these do.

Armed with the two feeds (one with iTunes metatags for the AAC feed and one without for the MP3 which is not on iTunes), I tested them in my iTunes client application (selecting Subscribe to Podcast… from the Advanced menu to see what metadata is displayed in the feed and the downloaded episodes). This let me tune the tags until I saw something that approximated the desired information.

Once I knew the XML was correctly formatted (tested in iTunes and in various web browsers), the final versions were uploaded to the web (in testing you can use any accessible HTTP server but for live deployment you probably want to think about providing a reasonably permanent URL and the media files themselves need to be somewhere that bandwidth is not a problem – for Coalface Tech, that hosting is kindly provided by Australian Personal Computer, Internode and Sun Microsystems but I also considered using Liberated Syndication).

Next up, it was time to submit the podcast to Apple for inclusion. There is a moderation process but within 24 hours we had received confirmation that we were live in the iTunes directory!

Coalface Tech in iTunes

And that was about it really – remarkably straightforward, especially when armed with Apple’s detailed instructions for making a podcast.

All I need to do each time we create a new episode is update the XML for the RSS feeds (create a new <item> section for each episode) and notify iTunes that we have posted a new episode (although it should automatically check every day).
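To make the shape of that per-episode <item> concrete, here’s a minimal sketch in Python using the standard library’s ElementTree module. The episode title, URL, dates and summary below are invented for illustration – they’re not the real Coalface Tech details:

```python
import xml.etree.ElementTree as ET

# Apple's podcast extension namespace for the iTunes-specific metatags
ITUNES_NS = "http://www.itunes.com/dtds/podcast-1.0.dtd"
ET.register_namespace("itunes", ITUNES_NS)

def make_episode_item(title, url, length_bytes, pub_date, duration, summary):
    """Build one <item> element for a podcast RSS feed (AAC enclosure)."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "pubDate").text = pub_date
    # The enclosure is what makes this a podcast rather than a plain blog feed:
    # it points at the media file itself
    ET.SubElement(item, "enclosure", {
        "url": url,
        "length": str(length_bytes),
        "type": "audio/x-m4a",
    })
    # iTunes-specific metatags; feed readers that don't understand
    # the namespace simply ignore them
    ET.SubElement(item, f"{{{ITUNES_NS}}}duration").text = duration
    ET.SubElement(item, f"{{{ITUNES_NS}}}summary").text = summary
    return item

# Hypothetical episode details, for illustration only
episode = make_episode_item(
    title="Episode 3",
    url="http://example.com/podcast/episode3.m4a",
    length_bytes=12345678,
    pub_date="Mon, 29 Dec 2008 09:00:00 GMT",
    duration="42:17",
    summary="James and Mark talk about virtualisation.",
)
print(ET.tostring(episode, encoding="unicode"))
```

Each new episode is just another <item> like this appended inside the feed’s <channel> element – which is why updating the feed by hand each time is perfectly manageable.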


Managing stored credentials from the Windows command prompt using cmdkey

For a few weeks now I’ve been meaning to blog about a command that is a reasonably recent addition to Windows – cmdkey.exe (thanks to John Craddock for highlighting this at a recent XTSeminars event).

Basically Microsoft’s cmdkey, introduced with Windows Server 2003 (and which should not be confused with Jason Hood’s companion for cmd.exe), is used to create, list and delete stored security credentials.

For example, I back up the notebook PC that I use for work to my Netgear ReadyNAS using SyncToy. My ReadyNAS does not support Active Directory but it does provide SMB/CIFS access. This means that I can authenticate directly against a share but the username and password do not match the cached domain credentials on the notebook PC.

Supplying credentials each time I need to connect (or forgetting to before attempting a sync) is inconvenient, so I used cmdkey to store the username and password that I use to connect to the share:

cmdkey /add:computername /user:username /pass:password

In this case cmdkey responded as follows:

CMDKEY: Credential added successfully.

Typing:

cmdkey /list

returns:

Currently stored credentials:

Target: computername
Type: Domain Password
User: username
and I can connect to a share without supplying any credentials:

net use h: \\computername\sharename

The command completed successfully.

Furthermore, this drive mapping (and the stored credentials) persists across reboots – when the computer restarts, H: is visible as a disconnected drive in Windows Explorer, but as soon as I double-click it I connect without being prompted for credentials.
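For scripted use, the add/list/delete operations are easy to wrap. A hedged sketch in Python that simply builds the command lines shown above (cmdkey also has a /delete switch for removing a stored credential; computername, username and password are placeholders):

```python
def cmdkey_args(action, target=None, user=None, password=None):
    """Build a cmdkey command line for the add/list/delete operations."""
    if action == "add":
        # cmdkey /add:computername /user:username /pass:password
        return ["cmdkey", f"/add:{target}", f"/user:{user}", f"/pass:{password}"]
    if action == "delete":
        # cmdkey /delete:computername removes the stored credential again
        return ["cmdkey", f"/delete:{target}"]
    if action == "list":
        return ["cmdkey", "/list"]
    raise ValueError(f"unknown action: {action}")

# On Windows, each list could be passed straight to subprocess.run()
print(cmdkey_args("add", "computername", "username", "password"))
```

(Obviously storing a plain-text password in a script has its own risks – prompting for it at run time would be safer.)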

Uncategorized

Camera raw support in Windows Vista and later

Most of my digital photography workflow takes place on a Mac, where I use Adobe Camera Raw and Bridge/Photoshop CS3 to handle camera raw images.  With my recent purchase of a netbook (small enough and light enough to take out with me on a shoot – and less expensive than a dedicated storage device like an Epson P-7000), it would be useful to view the images in Windows, but the Microsoft Raw Image Thumbnailer and Viewer for Windows XP has not been updated since 2005 and is not compatible with Windows Vista or later.

I did wonder if the technology had been absorbed into Windows Explorer, and it seems it has.  I found a forum post suggesting the use of Windows Photo Gallery followed by the installation of some codecs (this post has more information on raw support in Windows Vista), but it turns out that the camera raw codecs are also available for direct download (i.e. with no need for Windows Photo Gallery); after installation, the raw file contents are available in thumbnails, previews and applications.

Unfortunately the major manufacturers (Canon and Nikon) do not produce codecs for 64-bit Windows (i.e. for people running high-end workstations with lots of memory for editing large images…) but the 32-bit codecs are fine for my little netbook with 2.5GB of RAM and there is 64-bit support for Adobe digital negatives (.DNG).

During installation, the Canon codecs complained that the screen resolution was not high enough on the netbook (1024×576) and refused to install but that was easily overcome by connecting to an external monitor with a higher resolution (no such issue with the Nikon codecs).

Incidentally, whilst I was researching this blog post I found that Microsoft also has an interesting program called Pro Photo Tools, which includes the ability to geotag photos, edit metadata, convert between raw formats, TIFF, JPEG and HD Photo, and work with Sidecar (.XMP) files (for interoperability with Adobe products – i.e. Bridge).  It too relies on the installation of the relevant raw codecs, but it should fit in quite nicely for some basic metadata tagging on the netbook whilst still in the field, before transferring the images to the MacBook for any final tweaks when I get home.
Nikon raw image viewed in Microsoft Pro Photo Tools

Uncategorized

Tracking down and removing unwanted software bundled with a new PC

I’ve heard many comments over the years about the volume of crapware installed on new PCs but had not experienced it first hand until yesterday, when I was setting up a new Dell Inspiron 1525.

Even Macs come with some bundled software (e.g. trial versions of Microsoft Office and Apple iWork applications) but I was amazed by the volume of unrequired software that I needed to remove from this machine.

In fact, I think I now understand why people think Windows Vista is so bad. It’s not Vista at all – it’s all the various add-ons that the OEMs bundle, which confuse the user experience with bizarre interfaces and generally gunk up the operating system by loading additional applications into memory (Dell even has an application to present a poor imitation of the Mac OS X dock to Windows Vista users).  I don’t mind Microsoft Works (it’s useful if you don’t have a copy of Microsoft Office – although it should also be noted that alternative office suites are available); I don’t even mind a 30-day trial of a security suite (even though I think that McAfee, Symantec, et al are preying on the insecurities of vulnerable consumers); but, by way of an example, this is a list of all the items that I removed from the computer I was setting up:

  • Dell Dock
  • PCMService
  • Adobe Reader Speed Launcher
  • Google Toolbar Notifier
  • Tiscali Internet
  • Sonic Update Manager
  • Roxio Express Labeler
  • Remove Empty Program Folders
  • Google Desktop
  • McAfee Security Center
  • Google Toolbar for Internet Explorer
  • NetWaiting
  • QuickSet
  • Browser Address Error Redirector
  • Live! Cam Avatar Creator
  • Internet from BT
  • OutlookAddinSetup
  • MediaDirect
  • Dell-eBay
  • Dell Best of Web
  • Digital Line Detect
  • Roxio Creator DE
  • Microsoft Works
  • Microsoft Office Compatibility Pack
  • Microsoft Office PowerPoint 2007 Viewer
  • GoToAssist

So, how does one go about working out what’s safe to remove and what should stay?  A bit of Googling helps to find out what some of the more esoteric items on the list really are, but there were three tools I found particularly useful whilst cleansing this PC:

  1. First of all – the comically named PC Decrapifier is an excellent piece of software for identifying items that you may wish to remove.
  2. Next up, Autoruns is a Sysinternals tool which may be used to identify programs configured to run at Windows startup/logon and can help to find any remnants of the previous uninstallations.  As all that happens is the selection/deselection of a checkbox, the changes made in Autoruns are non-destructive.
  3. Whilst the first two tools are freestanding applications and do not require installation, the third one does – and somewhat ironically the default installation options include the Yahoo! Toolbar (another unnecessary addition).  Even so, CCleaner is useful for clearing away any unused files and registry items (and is easily uninstalled afterwards).

The final piece of the puzzle was removing the PowerPoint 2007 Viewer.  The Control Panel applet to uninstall or change a program didn’t display an uninstall button (just repair) and the source location was missing so even a repair didn’t work.  I downloaded and reinstalled a fresh copy but that still wouldn’t uninstall, so I dropped back to the command line:

powerpointviewer.exe /extract:./powerpointviewer

cd powerpointviewer

msiexec /x ppviewer.msi

Executing these commands extracts the contents of the PowerPoint Viewer installation package to a folder, changes directory into that folder and cleanly removes the application.  Having done that, I installed a full copy of Office on the computer (so the viewer was no longer required).
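For anyone facing the same stubborn uninstall on several machines, the three steps could be scripted. A sketch in Python, to be run on Windows with powerpointviewer.exe in the current directory (the /extract and /x switches are as used above; by default the function only assembles and prints the commands rather than running them):

```python
import subprocess

def uninstall_ppviewer(extract_dir="powerpointviewer", dry_run=True):
    """Extract the PowerPoint Viewer package and uninstall its MSI."""
    steps = [
        # Unpack the self-extracting installer into a folder
        ["powerpointviewer.exe", f"/extract:{extract_dir}"],
        # Uninstall the extracted MSI; cwd replaces the 'cd' step
        ["msiexec", "/x", "ppviewer.msi"],
    ]
    for i, cmd in enumerate(steps):
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, cwd=extract_dir if i == 1 else None, check=True)
    return steps

uninstall_ppviewer()
```

Setting dry_run=False would actually execute the commands (and, of course, only makes sense on a Windows machine with the viewer present).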

There is an argument that the payments OEMs receive for bundling this software help us to pay less for our computers, but it’s a lot of work just to strip a pre-installed machine back to the bare operating system plus any OEM-specific support utilities (the reason for not simply wiping the hard disk and starting over).  Thankfully, buying a new PC is not something I do too often – and removing the unnecessary items should help when I’m faced with the inevitable task of supporting this computer after it’s presented to a family member in a few days’ time…
