Microsoft's approach is subtly different – as should be expected with a product that's part of Visual Studio, it's focused on helping developers to avoid configuration drift and to perform repetitive system tests during the application development lifecycle, leaving the System Center management products to manage the movement of virtual machines between environments in the virtual infrastructure.
The VSTS approach attempts to address a number of fundamental issues:
Reproduction of bugs. It's a common scenario – a tester files a bug but the developer is unable to reproduce it so, after a few rounds of bugfix ping-pong, the incident is closed with a "no repro" status, resulting in poor morale on both sides. Lab Management allows the definition of test cases for manual testing (marking steps as passed or failed) and includes an action log of all steps performed by the tester. When an error occurs, the environment state can be checkpointed (including the memory, registry, operating system and software state), allowing the issue to be reproduced. A system of collectors is used to gather diagnostic data and, with various methods provided for recording tests (as a video, a checkpoint, an action log or an event log), it's possible to automate the bug management and tracking process, including details, system information, test cases and links to logs/checkpoints – all of this information is provided to the developer within the Visual Studio interface and the developer has access to the tester's environment. In addition, because each environment is made up of a number of virtual machines – rather than running all application tiers on a single box – so-called "double-hop" issues are avoided, whereby the application works on one box but issues appear when it's scaled out. In short: Lab Management improves quality.
Environment setup. Setting up test environments is complex but, using Lab Management, it's possible for a developer to use a self-service portal to rapidly create a new environment from a template – not just a single VM, but the many interacting roles which make up that environment: a group of virtual machines with an identity (for example, an n-tier web application). These environments may be copied, shared or checkpointed. The Lab Environment Viewer allows the developer to view the various VM consoles in a single window (avoiding multiple Remote Desktop Connection instances) as well as providing access to checkpoints, allowing developers to switch between different versions of an environment and, because multiple environment checkpoints use the same IP address schema, supporting network fencing. In short: Lab Management improves productivity.
Building often and releasing early. Setting up daily builds is complex and Lab Management's ability to provide clean environments is an important tool in the application development team's arsenal. Using VSTS, a developer can define builds including triggers (e.g. date and time, number of check-ins) and processes (input parameters, environment details, scripts, checkpoints, unit tests to run, etc.). The traditional build cycle of develop/compile, deploy, run tests becomes develop/compile, restore environment, deploy, take checkpoint, run tests – significantly improving flexibility and reducing setup times. In short: Lab Management improves agility.
From an infrastructure perspective, Lab Management is implemented as a new role in Visual Studio Team System (VSTS), which itself is built on Team Foundation Server (TFS). Lab Management sits alongside Test Case Management (also new in Visual Studio 2010 – codenamed Camano), Build Management, Work Item Tracking and Source Control.
Vishal Mehotra, a Senior Lead Program Manager working on VSTS Lab Management in Microsoft's India Development Center, explained to me that, in addition to TFS, System Center Virtual Machine Manager (SCVMM) is required to provide the virtual machine management capabilities (effectively, VSTS Lab Management provides an abstraction layer on the environment using SCVMM). Whilst it's obviously Microsoft's intention that the virtualisation platform will be Hyper-V, because SCVMM 2008 can manage VMware Virtual Center, it could also be VMware ESX. The use of enterprise virtualisation technologies means that the Lab Management environments are scalable and the VMs may be moved between environments when defining templates (e.g. to take an existing VM and move it from UAT into production, etc.). In addition, System Center Operations Manager adds further capabilities to the stack.
Whilst the final product is some way off and the marketing is not finalised, it seems likely that Lab Management will be a separate SKU (including the System Center prerequisites). If you're looking to get your hands on it right now though you may be out of luck – unfortunately Lab Management is not part of the current CTP build for Visual Studio 2010 and .NET Framework 4.0.
The couple of weeks leading up to Christmas involved a lot of intense revision for me, as I prepared for the Microsoft exams to finish updating my MCSE on Windows Server 2003 to MCITP Enterprise Administrator.
When I set out to do this, I had originally intended to combine the tasks of reviewing John Savill’s Complete Guide to Windows Server 2008 with getting ready for my exams but it soon became apparent that I simply didn’t have enough time to work my way through the entire volume (excellent though it is!). Instead, I used the Microsoft-published exam preparation guides to identify the recommended Microsoft E-Learning courses.
If I’d written a review of the courses after the first couple of days it would have been a glowing recommendation – and in some respects perhaps I should be holding off on this review as I am somewhat battle-weary; however, having just taken two certification exams based on this study method it seems as good a time as any to assess the suitability of these courses.
Starting out with the selection, there is a huge catalogue of courses available which mirror the Microsoft Official Curriculum instructor-led courses. The prices are not bad either, when compared with classroom training; however, in many ways, I prefer the interaction that a classroom environment provides.
The format of the courses is good – built up as a number of virtual classroom modules, with a mixture of demonstrations and animations (with transcripts), textual content, and puzzles/tests in each lesson. Each lesson ends with a self-test and there is a summary and a glossary at the end of each module. There’s also a full-text search capability.
It’s possible to synchronise the content with a local cache to provide offline viewing – indeed, I only used the courses online for one day (when I was in the office and the proxy server wouldn’t let me download some new courses for offline working – the offline player includes the ability to edit proxy settings in the options but is not exposed in Windows until after successfully downloading and launching a course – and online viewing required me to add microsoftelearning.com to Internet Explorer’s trusted sites list) but it’s important to note that the virtual labs must always be completed online (this functionality is not available in the offline viewer).
Somewhat annoyingly, the course overview (which is the same for each course) and the glossary are included in the progress count, so after completing all of the available lesson content, most of the courses I attended were marked as only partially completed (it is possible to mark a course as complete in the My Learning section of the Microsoft Learning website but this will not complete the course in your transcript).
I could almost forgive elements like this, but the next annoyance really affected my ability to learn. You see, I'm English, and I will admit that sometimes I find it difficult to listen to an American accent for a long period of time (that's nothing personal – I'm sure the same happens in reverse). But the demonstrations and animations in these courses are recorded in an American monotone – and it doesn't even seem to be human. After listening to a few of these, with misplaced paragraph breaks and identical pronunciation for recurring words, regardless of sentence structure and intonation (or lack thereof), they become very difficult to concentrate on. Towards the end of my revision I stopped working through entire courses and instead concentrated on the introductions, summaries, and making sure I could complete the puzzles and self-tests at the end of each lesson – avoiding the computer-generated monotone entirely. Simply recording all of the demonstrations with a human voice (as most of the module introductions are) would be a vast improvement.
Then there are the animations – which are at best ugly and at worst confusing. Watching icons appear and disappear in a manner which at times appeared to be random whilst the computer was talking to me did not help at all. In the end, I nearly always resorted to reading the transcript.
Whilst the animations may be a design crime (as are many of the diagrams in Microsoft Official Curriculum courseware), even worse was the inaccuracy of some of the information presented – which shows it was produced by an outside agency (Element K) and sometimes suffers from a lack of technical quality assurance.
Let me give some examples:
Course 6519: one of the self-tests at the end of a lesson claims that NT 4.0 supports Kerberos (for that I would need Windows 2000 or later); and in the context of Active Directory database mounting, the module claims that one should "use a line printer daemon (LPD) utility, such as Active Directory Users and Computers, to view the data" (clearly LPD should have been LDAP…).
Course 6521: one of the reviews claims that only Active Directory Lightweight Directory Services uses an extensible storage engine (ESE) for its database store – contradicting the text elsewhere in the module (as well as being incorrect); and a self-test asked me to "identify the feature that AD LDS supports but AD LDS does not support" (!).
Course 6524: .PIX files referred to in the text, whilst the demonstrations clearly showed that the extension is .PFX.
Course 6536: claims that "Hyper-V is supported only by the Windows Server 2008 Standard 64-bit edition" (64-bit yes, Standard edition only – certainly not).
Course 6169: claims that "The various wireless networking standards are 802.11, 802.11b, 802.11a, 802.11g, 802.1X, and 802.11n" (802.1X is used for implementing network security but is not specifically a wireless networking standard).
There are typos too (sever instead of server, yes and no the wrong way around in test answers, etc.) as well as references to product names that have not existed since beta versions of Windows Server 2008 (e.g. Windows Server Virtualization). Other beta information has not been refreshed either – course 6529 refers to a 30-day grace period before Windows enters reduced functionality mode when it is actually 60 days (and RFM is much less brutal today than it was in the original versions of Windows Vista and early Windows Server 2008 betas). In another place, virtual machine (VM) snapshots are mixed up with volume shadow service (VSS) snapshots as the course suggests that VM snapshots are a backup and recovery solution (they most certainly are not!).
I could go on, but you get the message – almost every module has at least one glaring error. Mistakes like this mean that I cannot be 100% certain that what I have learned is correct – for that matter, how do I know that the Microsoft examinations themselves are not similarly flawed?
In the end, I don’t think it was just these courses that helped me pass the exams. Boot camps (and that’s what intense online training is the equivalent of) are all very well to cram information but they are no substitute for knowledge and experience. The outcome of running through these courses was a combination of:
Refreshing long-forgotten skills and knowledge on some of the lesser-used functionality in Windows Server 2008.
Updating skills for new features and functionality in Windows Server 2008.
Without several years' experience of using the products I doubt that I would have known all of the answers to the exam questions – indeed I didn't know them all (but the knowledge gained from the online training helped me to evaluate and assess the most likely of the presented options).
So, is this training worth it? Probably! Is it a complete answer to exam study and preparation? Possibly – but not through cramming 100 hours of training into a couple of weeks and expecting to retain all the knowledge. What these Microsoft e-Learning courses represent is a low-cost substitute for formal, instructor-led classes. There are some downsides (for example, the lack of interaction and the poor quality control – instructor-led courses benefit from the feedback that instructors provide to allow improvements at each revision) but they are also self-paced and the ability to go at my own speed means that, given sufficient time, I could work through a few of these each week and allow time for the knowledge to settle, backed up with some real-world experience. On that basis, they're certainly worthy of a look but don't expect them to provide all of the answers.
If you want to try one of the Microsoft E-Learning courses there are plenty available discounted (or even free). Afterwards, I’d be interested to hear what you think.
I'm writing this a couple of days after Christmas, as the great British public starts looking for bargains in the sales. If you're reading this blog, the chances are that technology plays a significant part in your life, in which case it's likely that Father Christmas/Santa Claus/St. Nick/whoever distributes presents to your house brought you at least one gadget for Christmas.
The old saying is that giving is better than receiving and I gave away a couple of gadget gifts this Christmas with which the recipients have been particularly pleased, so I wanted to tell you a little bit about them…
Pure Evoke-1S DAB digital radio
First up is the DAB digital radio that I bought for my wife. Normally, she despairs of my penchant for expensive electronic items but a few weeks back she returned from a girly weekend in Sheffield and told me that some friends of ours had a digital radio and she'd really like one. Naturally, I was pleased – I'd toyed with the idea for a while but I don't listen to enough radio to justify it – and I was even more pleased when it transpired that the model that had impressed her so was from Pure and it was the Evoke-1S.
Pure is one of the better-known digital radio producers in the UK and the Evoke-1S is the current version of a model that has been around for a while. Whilst some consider the wooden finish to be a little tired, I much prefer it to the cheap-plastic-sprayed-with-metallic-effect-paint that seems to be the current trend for consumer electronics. It sounds pretty good too. One speaker means mono but this is a radio (not the ultimate in high fidelity sound presentation) and anyway, this device lives in our kitchen (which does not have the best acoustics either). For what we wanted – access to digital radio in an attractive unit at a reasonable price – this unit is perfect.
How perfect? Well, the setup process was easy enough for Mrs. W to get up and running in a few seconds with no intervention from me – that's a good start. It also has an alarm, and a kitchen timer. There's provision for a second speaker, as well as an auxiliary input (I'll probably get an iPod dock and then I can use it to catch up on downloaded episodes of The Archers). Then there is a USB port for software updates (e.g. to DAB+, should it ever reach these shores).
The Evoke-1S is just one of many models available from Pure (for a little more money I thought the Evoke Flow looked good, featuring internet radio among its capabilities) but it was also a bargain. Available for around £96 from most electronics retailers (maybe less if you go for the Cherry version – we preferred the Maple finish), I bought ours from John Lewis Online. John Lewis Online's packaging is terrible (it's just a strong plastic bag) so the box was a little beaten up when it reached us but they were happy enough to offer me a discount by way of compensation. The next day, I saw it on sale in PC World for £68.47 so I bought that (to guarantee stock) before getting John Lewis to price match (John Lewis Online won't price match but the bricks and mortar stores are never knowingly undersold and they refunded and resold the item to me, after which I returned the unit I'd bought in PC World). Why the complicated refund and resale? Because John Lewis offers a 2-year warranty on all new electrical purchases and because, if I had to guess which of the two retailers will still be trading in 2 years' time, John Lewis looks a more certain bet to me than PC World's parent company, DSG International (it should be noted that I have no information to back that up – it's purely a personal opinion).
My children have grown up in the glare of my digital cameras and we thought that my 4-year-old would enjoy one of his own (for the last year or so he's been playing with an old disposable camera body that I glued shut and he loves taking pretend photos).
After looking into the various options for rugged digital cameras (i.e. those with rubberised bodies designed to withstand the inevitable knocks and bangs which will be inflicted by children), we settled on a VTech Kidizoom (which we got 20% off by using a voucher at ELC). What we hadn't realised was just how big a hit this would be. By the end of Christmas Day he had taken over 400 pictures and it was pushing 700 by the end of Boxing Day! Furthermore, he takes his camera everywhere with him (just like his Dad… although I don't take mine to bed with me!) and looking at his pictures has given me a great insight into the things that interest a 4-year-old (pictures of Mummy, Daddy, his brother, visiting grandparents, uncles, cousins, etc., his toys, the food on the table, his teddy bears, the Christmas tree, the television, the view from the car window, boats on the lake in the local country park, the produce in the supermarket, etc., etc.). I simply cannot stress how much my son loves this present – I have never seen anything hold his attention for so long.
Ironically, we nearly didn't buy it for him – most of the reviews concentrate on the poor quality of the pictures, the inadequate flash, the fact that the pictures in the internal memory are lost when the (non-rechargeable) batteries run out, etc. and, having experienced the device now, I wanted to set the record straight. After all, I think I'm qualified to do so: as a parent; an amateur photographer; and as an IT bloke.
Firstly, picture quality. Yes, this is a 640×480 (0.3MP) camera and so the pictures are not great – that's putting it mildly; actually they are pretty awful, with sludgy colours and high compression. But it's also in the hands of a 4-year-old! He may take the odd picture that's OK but these are unlikely to be the family album shots and he's more than happy looking at them on the computer screen or the TV (the camera is supplied with USB and composite video leads for connection to a PC or a TV).
Here's an example of one of the pictures – at 33% and a section at 100%:
Next, the flash – it's not very powerful (probably for safety reasons… as little people are bound to hold the camera the wrong way around and take a self-portrait) and it bleaches out anything close-up, but it does the job – sort of. I've come to the conclusion that this camera is designed for pictures to be taken at a distance of around 1.5 metres (which is the sort of distance my son stands from his subject anyway!) but pictures taken outside are definitely better.
The Kidizoom comes with 16MB of internal memory and the instructions do warn that it will be erased if the batteries (4xAA) run out, but it also has an SD card slot and the pictures on this card are safe in the event of power loss. I've set my son's camera to use an old 1GB SD card by default and that holds around 33,000 of his pictures (it was just an old card that I was using for ReadyBoost on the PC and is too small to be useful in any of my cameras). Basically, RTFM and then losing photos when the batteries run out won't be a problem.
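Out of curiosity, the arithmetic behind that 33,000 figure checks out if you assume an average file size of around 30 KB per picture – my estimate for one of these heavily compressed 640×480 JPEGs, not a measured value:

```python
# Back-of-envelope check on the 1GB card capacity claim.
# The ~30 KB average picture size is an assumption, not a measurement.
card_bytes = 1_000_000_000          # 1 GB, using the decimal GB that card makers use
picture_bytes = 30_000              # assumed size of one 640x480 JPEG
pictures = card_bytes // picture_bytes
print(pictures)                     # 33333, i.e. roughly 33,000 pictures
```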
I haven't used the supplied software – both Windows and Mac OS X detected the internal memory and the SD card as removable drives and were happy to copy over the pictures. The camera also includes a video mode and some games but we haven't used them yet – at this point still photos are a big enough attraction. It's a bit too easy for the kids to turn the flash off (although I had to read the manual to work out how to turn it back on) and quite a few pictures seem to have had a novelty border added by accident but these are minor issues given the market at which this camera is aimed. There's also no EXIF data and the date and timestamp seems to be added when the image is copied from the camera – not when it's taken.
In summary, this camera is far from perfect but I also have to remember that my 2-year-old will get his hands on it too and it's more than good enough to last the next couple of years until they can both be trusted with a "grown-up" digital compact. In terms of entertainment value it's been a huge hit – most children emulate their parents and mine are certainly happy to be snapping away like their Daddy.
Nikon D60 DSLR camera
My brother used to be a reasonable photographer but these days he hardly picks up his film camera (a Minolta X300 which, incidentally, I really enjoyed for its raw simplicity last time I used it). In an attempt to set things straight this Christmas, his other half bought him a Nikon D60 kit, based on advice from me, including an 18-55 VR lens (see DPReview for a review of the D60). The lens is a standard kit lens – cheap, built of plastic, slow glass – but is enough to get him started (and he can borrow one of mine if he needs to). He brought it over to my place on Boxing Day and I was impressed – in fact I would say that, as a consumer DSLR, the D60 is fantastic. It's not the top of Nikon's range (far from it) but it matches or exceeds most of the features in my ageing D70, packs in more pixels (10.2MP), includes image sensor cleaning, and is smaller, lighter, and more compact. All in all, it's a great DSLR – especially at around £300.
If you prefer Canon then all I have to go on is the fact that my Dad seems pretty pleased with his new 1000D, which appears to be broadly equivalent to the D60. If you're looking at any other brand for a DSLR I'd question why – Canon and Nikon are the market leaders, which means there is a huge range of OEM and aftermarket support (accessories, etc.) and both offer plenty of scope to progress to a more advanced model, if required, at a later date.
Apologies to those who don't find these items remotely interesting but this website comes up on enough Google searches that hopefully my comments will be of use to someone. And if someone bought you something for Christmas that you think is fantastic and you'd like to share it with the world, please leave a comment on this post.
That’s it. Done it! I’ve just passed the last exam I needed to take (70-647) in order to update my MCSE on Windows Server 2003 to MCITP: Enterprise Administrator for Windows Server 2008, before the vouchers I had for free exams expired and just in time for Christmas!
Some quick advice for those of you about to open up new PCs bought for Christmas (for that matter, the advice is equally applicable whatever the occasion)… avoid the temptation to dive straight in and have a play. Instead, take an image of the hard drive as it arrived from the factory – yes, you can always use the manufacturer's instructions to return the machine to a restored state but this can take a long time (as I found a few weeks back when I succumbed to temptation and had a play with my Lenovo IdeaPad S10e before imaging it!).
As I write this, I’m setting up a new Dell notebook for someone (an Inspiron 1525) and the first key I pressed (after the power button) was F2 to go into the BIOS. Here I changed a couple of settings (boot order and numlock key state), then booted again with F12 for the boot menu to use my device of choice (floppy disk drive, USB drive, or PXE network boot) to boot to my image capture software of choice (Symantec Ghost, Windows Deployment Service, you name your poison) and take an image of the entire drive (not a partition).
Following this, I can configure the device as intended (remove the crapware, install some AV software, install an Office suite, etc.) and then take another image when I'm done.
Of course, for corporate deployments it’s normal to blow away the manufacturer’s image and install something more appropriate to the organisation’s requirements but, for home and small business users, it’s perfectly acceptable to use the factory-supplied build and this might just save you some time if you ever need to return to square one.
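As an aside, if your imaging tool of choice is a Linux boot disk rather than Ghost or WDS, plain dd will capture a raw image of the whole drive. The sketch below uses a scratch file as a stand-in for the disk device; on real hardware the input would be the device itself (e.g. /dev/sda, booted from a live USB, never the running system disk) and the image would be written to external storage:

```shell
# Create a small scratch file to stand in for the factory disk.
dd if=/dev/urandom of=/tmp/factory-disk.bin bs=1024 count=64 2>/dev/null
# Image the "disk" to a file, as you would with if=/dev/sda on real hardware.
dd if=/tmp/factory-disk.bin of=/tmp/factory.img bs=4096 2>/dev/null
# Verify the image matches the source before relying on it.
cmp -s /tmp/factory-disk.bin /tmp/factory.img && echo "image verified"
```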
Armed with the two feeds (one with iTunes metatags for the AAC feed and one without for the MP3 which is not on iTunes), I tested them in my iTunes client application (selecting Subscribe to Podcast… from the Advanced menu to see what metadata is displayed in the feed and the downloaded episodes). This let me tune the tags until I saw something that approximated the desired information.
Once I knew the XML was correctly formatted (tested in iTunes and in various web browsers), the final versions were uploaded to the web (in testing you can use any accessible HTTP server but for live deployment you probably want to think about providing a reasonably permanent URL and the media files themselves need to be somewhere that bandwidth is not a problem – for Coalface Tech, that hosting is kindly provided by Australian Personal Computer, Internode and Sun Microsystems but I also considered using Liberated Syndication).
Next up, it was time to submit the podcast to Apple for inclusion. There is a moderation process but within 24 hours we had received confirmation that we were live in the iTunes directory!
All I need to do each time we create a new episode is update the XML for the RSS feeds (create a new section for each episode) and notify iTunes that we have posted a new episode (although it should automatically check every day).
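For anyone building a similar feed, each episode is just another `<item>` element in the RSS XML, mixing standard RSS elements with the `itunes:*` extensions that the iTunes directory reads. A minimal sketch follows – every title, URL, size and date here is a placeholder, not the real Coalface Tech data:

```xml
<item>
  <title>Episode title goes here</title>
  <itunes:author>Coalface Tech</itunes:author>
  <itunes:summary>A sentence or two describing the episode.</itunes:summary>
  <enclosure url="http://example.com/podcast/episode.m4a"
             length="12345678" type="audio/x-m4a"/>
  <guid>http://example.com/podcast/episode.m4a</guid>
  <pubDate>Mon, 05 Jan 2009 09:00:00 GMT</pubDate>
  <itunes:duration>42:00</itunes:duration>
</item>
```

The `itunes:` prefix must be declared as a namespace on the feed's root `<rss>` element; aggregators that don't understand it simply fall back on the standard RSS elements.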
I’ve been meaning to blog about a command which is a reasonably recent addition to Windows for a few weeks now – cmdkey.exe (thanks to John Craddock for highlighting this at a recent XTSeminars event).
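Before the connection below will work without a prompt, cmdkey needs to have stored credentials for the target machine. A typical invocation looks like this (the server, user and password are placeholders rather than the exact values from my own setup):

```
cmdkey /add:computername /user:domainname\username /pass:P@ssw0rd
cmdkey /list
```

cmdkey /list confirms what is currently held in the credential store.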
and I can connect to a share without supplying any credentials:
net use h: \\computername\sharename
The command completed successfully.
Furthermore this drive mapping (and stored credentials) persists on reboot – when the computer is restarted, H: is visible as a disconnected drive in Windows Explorer but as soon as I double-click it I connect without a prompt to supply credentials.
Unfortunately the major manufacturers (Canon and Nikon) do not produce codecs for 64-bit Windows (i.e. for people running high-end workstations with lots of memory for editing large images…) but the 32-bit codecs are fine for my little netbook with 2.5GB of RAM and there is 64-bit support for Adobe digital negatives (.DNG).
During installation, the Canon codecs complained that the screen resolution was not high enough on the netbook (1024×576) and refused to install but that was easily overcome by connecting to an external monitor with a higher resolution (no such issue with the Nikon codecs).
Incidentally, whilst I was researching this blog post I found that Microsoft also has an interesting program called Pro Photo Tools, which includes the ability to geotag photos, edit metadata, convert between raw formats, TIFF, JPEG and HD Photo; and work with sidecar (.XMP) files (for interoperability with Adobe products – i.e. Bridge). It too relies on the installation of the relevant raw codecs but should fit in quite nicely for some basic metadata tagging on the netbook whilst still in the field, before transferring the images to the MacBook for any final tweaks when I get home.
Even Macs come with some bundled software (e.g. trial versions of Microsoft Office and Apple iWork applications) but I was amazed by the volume of unrequired software that I needed to remove from this machine.
In fact, I think that now I understand why people think Windows Vista is so bad. It's not Vista at all – it's all the various add-ons that the OEMs bundle, which confuse the user experience with bizarre interfaces and generally gunk up the operating system by loading additional applications into memory (Dell even has an application to present a poor imitation of the Mac OS X dock to Windows Vista users). I don't mind Microsoft Works (it's useful if you don't have a copy of Microsoft Office – although it should also be noted that alternative office suites are available); I don't even mind a 30-day trial of a security suite (even though I think that McAfee, Symantec, et al. are preying on the insecurities of vulnerable consumers); but, by way of an example, this is a list of all the items that I removed from the computer I was setting up:
Adobe Reader Speed Launcher
Google Toolbar Notifier
Sonic Update Manager
Roxio Express Labeler
Remove Empty Program Folders
McAfee Security Center
Google Toolbar for Internet Explorer
Browser Address Error Redirector
Live! Cam Avatar Creator
Internet from BT
Dell Best of Web
Digital Line Detect
Roxio Creator DE
Microsoft Office Compatibility Pack
Microsoft Office PowerPoint 2007 Viewer
So, how does one go about working out what's safe to remove and what should stay? A bit of Googling helps – to try and find out what some of the more esoteric items on the list really are – but there were three tools I found useful whilst cleansing this PC:
First of all – the comically named PC Decrapifier is an excellent piece of software for identifying items that you may wish to remove.
Next up, Autoruns is a Sysinternals tool which may be used to identify any programs configured to run during Windows startup/logon and can help to identify any remnants of the previous uninstallations. As all that happens is the selection/deselection of a checkbox, the changes made in Autoruns are non-destructive.
Whilst the first two tools are freestanding applications and do not require installation, the third one does – and somewhat ironically the default installation options include the Yahoo! Toolbar (another unnecessary addition). Even so, CCleaner is useful for clearing away any unused files and registry items (and is easily uninstalled afterwards).
The final piece of the puzzle was removing the PowerPoint 2007 Viewer. The Control Panel applet to uninstall or change a program didn't display an uninstall button (just repair) and the source location was missing so even a repair didn't work. I downloaded and reinstalled a fresh copy but that still wouldn't uninstall, so I dropped back to the command line:
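For completeness: as the next paragraph describes, the removal was actually a short sequence rather than a single command – the package is first extracted with an administrative install (that is what msiexec /a does) before being removed. Reconstructed here with a hypothetical extraction folder, the first steps would be something like:

```
msiexec /a ppviewer.msi TARGETDIR=C:\temp\ppviewer
cd C:\temp\ppviewer
```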
msiexec /x ppviewer.msi
Executing these commands extracts the contents of the PowerPoint Viewer installation package to a folder, changes directory into the folder and cleanly removes the application. After having done that, I installed a full copy of Office on the computer (so I no longer required the viewer).
There is an argument that the payment to the OEMs to bundle this software helps us to pay less for our computers but it's a lot of work just to strip a pre-installed OS back to the bare operating system, plus any OEM-specific support utilities (the reason for not just wiping the hard disk and starting over). Thankfully, buying a new PC is not something I do too often – and the removal of the unnecessary items should help me when I'm faced with the inevitable task of supporting this computer following its presentation to a family member in a few days' time…