Category: Technology

  • Microsoft Management Summit 2010 highlights

    This week sees the annual Microsoft Management Summit (MMS) taking place in Las Vegas, with over 3500 attendees from around the world, despite the many people stranded by the current flight restrictions in Europe.  According to Microsoft, that’s 50% up on last year – and those delegates have access to 120 breakout sessions to learn about Microsoft’s vision and technology for IT management – across client devices, the datacentre and the cloud.

    The keynote presentations are being streamed live but, for those who missed yesterday’s keynote (as I did) and who are waiting to hear today’s news, here are the main highlights from the event, as described by Paul Ross, a Group Product Marketing Manager for System Center and virtualisation at Microsoft.

    Cloud computing is a major trend in the IT industry and many customers are trying to balance new models for elastic computing with getting the best TCO and ROI from their existing investments.  There are those who suggest Microsoft doesn’t have a cloud strategy but it’s now 5 years since Ray Ozzie’s Internet Services Disruption memo, in which he set out Microsoft’s software plus services approach, and Steve Ballmer reinforced Microsoft’s Cloud Services vision earlier this year.

    For many years, Microsoft has talked about the Dynamic Systems Initiative (DSI), later known as Dynamic IT, and the transition to cloud services is in line with this – model-driven, service-focused, unifying servers and management, thinking about services instead of servers, and automated management in place of manual approaches. Meanwhile, new deployment paradigms (e.g. virtualisation in the data centre) see customers shifting towards private and public cloud environments.  But customers are experiencing a gap in the consistency of security models and application development between on-premise and cloud services – and Microsoft believes it holds the key to allowing customers to bridge that gap and provide consistency of infrastructure across the various delivery models.

    Some of the new products announced at this year’s MMS include the next version of System Center Virtual Machine Manager (SCVMM), slated for release in the second half of next year, which will take a service-centric approach to management – including new approaches to deploying applications. Alongside SCVMM, System Center Operations Manager (SCOM) will also be updated in the second half of 2011 – itself making the transition to a service-centric model.

    Before then, June 2010 will see the release to web of the Dynamic Infrastructure Toolkit for System Center which provides enterprise customers with the foundations for creating a private cloud with concepts such as on demand/self-service provisioning, etc.

    Today’s keynote will focus on the shift from device-centric computing to a user-centric approach.  Many organisations today operate separate infrastructures for different client access models – and there is a need for unification to manage IT according to end user requirements.  Central to this vision is the need to unify the products used for security and management of the infrastructure, reducing costs and focusing on user-centric client delivery for the cloud.

    Earlier this week, we heard about the beta for Windows Intune – offering security, management, Windows Update and MDOP benefits within a single subscription for small to medium-sized businesses.  Today’s headlines are enterprise-focused and will include the announcement of the beta for System Center Configuration Manager (SCCM) 2007 R3 – focused on power management and unified licensing for mobile devices alongside traditional desktop clients.  SCCM vNext (again, scheduled for the second half of 2011) will be focused on user-centric management – offering a seamless work experience regardless of whether applications are delivered via App-V, VDI, or a traditional application delivery approach.  In addition, SCCM vNext will incorporate mobile device management (currently in a separate product – System Center Mobile Device Manager), allowing a single infrastructure to be provided (so, to summarise: that’s licensing changes in SCCM R3, followed by the technology in the next release).

    In other news, we heard yesterday about the release of System Center Service Manager (SCSM) 2010 and System Center Data Protection Manager (SCDPM) 2010 – both generally available from June 2010.  SCSM is Microsoft’s long-awaited service desk product – with 57 customers in production already and around 3000 on the beta – which Microsoft hopes will disrupt a service desk market that they describe as being “relatively stale”.  Built as a platform for extension by partners, SCSM includes the concept of process packs (analogous to the management packs in SCOM) and Microsoft themselves are looking to release beta compliance and risk process packs from June, helping to grow out the product capabilities to cover a variety of ITIL disciplines.  As for SCDPM, the product gains new enterprise capabilities including client protection (the ability to back up and recover connected client systems) – and both SCSM and SCDPM are included within the Enterprise CAL and Server Management Suite Enterprise licensing arrangements.

    For some years now, Microsoft has been showing a growing strength in its IT management portfolio – and now that they are starting to embrace heterogeneous environments (e.g. Unix and Linux support in SCOM, ESX management from SCVMM), I believe that they will start to chip away at some of the territory currently occupied by “real” enterprise management products.  As for that image of a company that’s purely focused on Windows and Office running on a thick client desktop, whilst that’s still where the majority of its revenue comes from, Microsoft knows it needs to embrace cloud computing – and it’s not as far behind the curve as some may believe.  The cloud isn’t right for everyone – and very few enterprises will embrace it for 100% of their IT service provision – but, for those looking at a mixture of on-premise and cloud infrastructure, or at a blend of private and public cloud, Microsoft is in a strong position with a foot in each camp.

  • Introducing Windows Intune

    This is the week of the Microsoft Management Summit in Las Vegas and, as well as the whole load of System Center-related announcements that we can expect this week, Microsoft has formally announced the beta of a new cloud-based PC management service called Windows Intune.

    Designed for customers who have 25-500 PCs, Windows Intune is intended to provide a cloud-based desktop management service in the way that BPOS does for business productivity applications.  Aimed squarely at the mid-market, Windows Intune (formerly known as System Center Online Desktop Manager) allows smaller organisations to gain some insight into what’s happening in their PC estate, avoiding the high infrastructure costs associated with enterprise products (and even System Center Essentials needs a server on site).

    All that’s required on the PC is an Internet connection (and an agent, which Microsoft described as “lightweight”) but also included in the service is a license for Windows 7 Enterprise Edition and the MDOP technologies – that’s a single license purchase for a lot of functionality!  Microsoft is making the beta available today but interested customers will have to move quickly – it’s limited to 1000 users in the US, Canada, Mexico and Puerto Rico only – Europe and Asia will follow within a year.

    For those organisations that are not quite ready for Windows 7, the license with Intune can be downgraded to Windows XP Professional or Windows Vista Business.

    Administrators simply need an Internet connection and a Silverlight-capable browser to access a console which provides a system overview showing a rolled-up status including malware protection, updates, agent health (offline clients) and reports on operating system alerts (e.g. disk fragmentation) along with a number of workspaces – currently:

    • Computers – which may be organised into groups and subgroups (e.g. to assign policies and reports). Groups exist entirely within Intune and have nothing to do with Active Directory (a computer can exist within multiple groups). It’s also possible to drill down and expose details for each computer (updates, alerts, malware status, etc.).
    • Updates – a roll-up of all updates together with the ability to drill down on update type (i.e. security, critical, definition, service packs, update rollups, mandatory updates) and to filter to see which updates are waiting to be approved.
    • Malware protection – showing which clients have been infected and any resulting action – including integration with the endpoint protection encyclopedia (at the Microsoft Malware Protection Center).
    • Alerts – for malware protection, monitoring, notices, policy, remote assistance, system or updates.
    • Software – an automatic inventory that reports details about the machine itself and its installed software, which may be printed or exported as a CSV file.
    • Licenses – the ability to track licenses within Software Assurance (SA) agreements by entering the agreement numbers, correlating installed software with purchased software (for Microsoft products only).  Microsoft were keen to highlight that privacy will be taken seriously, with a third-party audit ensuring that the information is private to customers and not used by Microsoft to enforce its licensing.  In addition, the entering of SA agreement details is optional and the service will function without this information.
    • Policy – controlling how Intune and clients function, including agent settings (template driven, but not using Group Policy – indeed, Group Policy will override Intune in any conflict), tools settings, and firewall settings (Intune communicates over HTTP, and the agent installation will also open remote management functionality).
    • Reports – providing a snapshot of status.
    • Administration – each computer is identified by its agent download/installation, and multiple administrators may be defined for the service, with notifications on particular alerts (i.e. by e-mail).

    From a client experience perspective, the Windows Intune Tools can be used by an end user to request help via Easy Assist (by sending an urgent alert to the Intune service – this has to be user-initiated and the administrator cannot arbitrarily take control of a client) and the end user can also check their update status with regard to Windows Update and malware protection.

    Those who have worked with Microsoft Security Essentials may be interested to note that:

    • Windows Intune will work on servers, but this is not supported.
    • Malware protection is provided by the common malware protection engine (from Forefront) with the user interface from Microsoft Security Essentials (“at the moment”).  The use of the Forefront scanning engine allows for reporting and policy control that is not present in Microsoft Security Essentials.

    In summary, Windows Intune is intended as an easy-to-use, cloud-based solution for small to medium-sized businesses that requires little or no infrastructure and remains up to date.  It is not an enterprise solution (it’s certainly not a replacement for System Center Configuration Manager) but it is a useful way to license Windows 7 and prepare for Windows 8.

    For more information as the beta progresses, check out the Windows Intune Team Blog.

  • After hours at UK TechDays

    Over the last few years, I’ve attended (and blogged in detail about) a couple of “after hours” events at Microsoft – looking at some of the consumer-related things that we might do with our computers outside of work (first in May 2007 and then in November 2008).

    Tonight I was at another one – an evening event to complement the UK TechDays events taking place this week in West London cinemas – and, unlike previous after hours sessions, this one did not even try to push Microsoft products at us (previous events felt a bit like Windows, Xbox and Live promotions at times) – it just demonstrated a whole load of cool stuff that people might want to take a look at.

    I have to admit I nearly didn’t attend – the daytime UK TechDays events have been a little patchy in terms of content quality and I’m feeling slightly burned out after what has been a busy week with two Windows Server User Group evening events on top of UK TechDays and the normal work e-mail triage activities.  I’m glad I made it though and the following list is just a few of the things we saw Marc Holmes, Paul Foster and Jamie Burgess present tonight:

    • A discussion of some of the home network functionality that the guys are using for media, home automation, etc. – predictably a huge number of Microsoft media items (Media Center PCs, Windows Home Server, Xbox 360, etc.) but also the use of X10, Z-Wave or RFXcom for pushing USB or RF signals around for home automation purposes, as well as Ethernet over power line for streaming from Media Center PCs.  Other technologies discussed included: Logitech’s DiNovo Edge keyboard and Harmony One universal remote control; SiliconDust HD HomeRun for sharing DVB-T TV signals across Ethernet to PCs; using xPL to control home automation equipment.
    • Lego Mindstorms NXT for building block robotics, including the First Lego League – to inspire young people to get involved with science and technology in a positive way.
    • Kodu Game Lab – a visual programming language made specifically for creating games that is designed to be accessible for children and enjoyable for anyone.
    • Developing XNA games with XNA Game Studio and Visual Studio, then deploying them to Xbox or even running them in the Windows Phone emulator!  Other related topics included the use of the Freescale Flexis JM Badge board to integrate an accelerometer with an XNA game and GoblinXNA for augmented reality/3D games development.  There’s also a UK XNA user group.
    • A look at how research projects (from Microsoft Research) move into Labs and eventually become products after developers have optimised and integrated them.  Microsoft spent $9.5bn on research and development in 2009 and some of the research activities that have now made it into products include Photosynth (which became a Windows client application and is now included within Silverlight) and the Seadragon technologies, which also became a part of Silverlight (Deep Zoom) and are featured in the Hard Rock Cafe Memorabilia site.  A stunning example is Blaise Aguera y Arcas’ TED 2010 talk on the work that Microsoft is doing to integrate augmented reality maps in Bing – drawing on the Seadragon technologies to provide fluidity whilst navigating maps in 3D – but that environment can also be used as a canvas for other things, like streetside photos (far more detailed than Google Streetview).  In his talk (which is worth watching and embedded below), Blaise navigates off the street and actually inside Seattle’s Pike Place Market before showing how the Microsoft imagery can be integrated with Flickr images (possibly historical images for “time travel”) and even live video broadcasts.  In addition to that telepresence (looking from the outside in), points of interest can be used to look out when on the ground and get details of what’s around – and even to look up to the sky and see integration with the Microsoft Research WorldWide Telescope.
    • Finally, Paul spoke about his creation of a multitouch (Surface) table for less than £100 (using CCTV infrared cameras, a webcam with the IR filter removed and NUI software – it’s now possible to do the same with Windows 7) and a borrowed projector before discussing his own attempts at virtual reality in his paddock at home.

    Whilst I’m unlikely to get stuck into all of these projects, there is plenty of geek scope here – I may have a play with home automation and it’s good to know some of the possibilities for getting my kids involved with creating their own games, robots, etc. As for Blaise Aguera y Arcas’ TED 2010 talk, it was fantastic to see how Microsoft still innovates (I only wish that all of the Bing features were available globally… here in the UK we don’t have all of the functionality that’s available stateside).

  • Useful Links February 2010

    A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:

  • Backing up and restoring Adobe Lightroom 2.x on a Mac

    Over the last few days, I’ve been rebuilding the MacBook that I use for all my digital photography (which is a pretty risky thing to do immediately before heading off on a photography workshop) and one of the things I was pretty concerned about was backing up and restoring my Adobe Lightroom settings as these are at the heart of my workflow.

    I store my images in two places (Lightroom backs them up to one of my Netgear ReadyNAS devices on import) and, on this occasion I’d also made two extra backups (I really should organise one in the cloud too, although syncing 130GB of images could take some time…).

    I also back up the Lightroom catalog each time Lightroom runs (unfortunately the only option is to do this at startup, not shutdown), so that handles all of my keywords, develop settings, etc. What I needed to know was how to back up my preferences and presets – and how to restore everything.

    It’s actually quite straightforward – this is how it worked for me – of course, I take no responsibility for anyone else’s backups and, as they say, your mileage may vary.  Also, PC users will find the process similar, but the file locations change:

    I also made sure that the backups and restores were done at the same release (v2.3) but, once I was sure everything was working, I updated to the latest version (v2.6).
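    As a rough illustration of the kind of copy involved (not my exact process), here’s a minimal Python sketch that copies the preferences file and presets folder to a dated backup folder – the Lightroom 2 file locations and the backup destination are assumptions to verify against your own machine, and restoring is simply the reverse copy:

    import os
    import shutil
    import time

    HOME = os.path.expanduser("~")

    # Assumed locations for Lightroom 2 on Mac OS X - check these on your own machine
    SOURCES = [
        os.path.join(HOME, "Library/Preferences/com.adobe.Lightroom2.plist"),  # preferences
        os.path.join(HOME, "Library/Application Support/Adobe/Lightroom"),     # presets
    ]

    def backup(destination_root="/Volumes/Backup/Lightroom-settings"):
        # Keep each backup in a dated folder so that older copies are not overwritten
        dest = os.path.join(destination_root, time.strftime("%Y-%m-%d"))
        os.makedirs(dest, exist_ok=True)
        for source in SOURCES:
            target = os.path.join(dest, os.path.basename(source))
            if os.path.isdir(source):
                shutil.copytree(source, target)
            elif os.path.isfile(source):
                shutil.copy2(source, target)

    if __name__ == "__main__":
        backup()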

  • Reading EXIF data to find out the number of shutter activations on a Nikon DSLR

    A few years ago, I wrote about some digital photography utilities that I use on my Mac.  These days most of my post-processing is handled by Adobe Lightroom (which includes Adobe Camera Raw), with a bit of Photoshop CS4 (using plugins like Noise Ninja) for the high-end stuff, but these tools still come in useful from time to time.  Unfortunately, Simple EXIF Viewer doesn’t work with Nikon raw images (.NEF files) and so it’s less useful to me than it once was.

    Recently, I bought my wife a DSLR and, as I’m a Nikon user (I have a D700), it made sense that her body should fit my lenses so I picked up a Nikon refurbished D40 kit from London Camera Exchange.  Whilst the body looked new, I wanted to know how many times the shutter had been activated (DSLR shutter mechanisms have a limited life – about 50,000 for the D40) and the D40’s firmware won’t display this information – although it is captured in the EXIF data for each image.

    After some googling, I found a link to Phil Harvey’s ExifTool, a platform-independent library with a command line interface for accessing EXIF data in a variety of image formats. A few seconds later I had run the exiftool -nikon dsc_0001.nef command (exiftool --? gives help) on a test image and it told me a perfectly respectable shutter count of 67.  For reference, I tried a similar command on some images from my late Father’s Canon EOS 1000D but shutter count was not one of the available metrics – even so, ExifTool provides a wealth of information from a variety of image formats.
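    For anyone wanting to script this across a batch of images, here’s a minimal Python sketch that wraps ExifTool – it assumes that exiftool is installed and on the path, and that the camera writes a ShutterCount tag (Nikon bodies do; as noted above, the Canon EOS 1000D does not):

    import subprocess
    import sys

    def shutter_count(image_path):
        # -s3 prints just the value; -ShutterCount requests that single tag
        result = subprocess.run(
            ["exiftool", "-s3", "-ShutterCount", image_path],
            capture_output=True, text=True, check=True,
        )
        value = result.stdout.strip()
        return int(value) if value else None

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            count = shutter_count(path)
            print("%s: %s" % (path, count if count is not None else "no ShutterCount tag"))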

  • Safer Internet Day: Educating parents on Internet safety for their children

    A few weeks ago, I mentioned that today is European Safer Internet Day and, here in the UK a number of organisations are working with the Child Exploitation and Online Protection centre (CEOP) to educate parents and children in safe use of the Internet.  I don’t work for Microsoft but, as an MVP, I was invited to join in and tonight I’ll be delivering a session to parents at my son’s school, using Microsoft’s presentation deck (although it has to be said that this is not a marketing deck – it’s full of real-world examples and practical advice about protecting children and young people from the specific dangers the Internet can pose, whilst allowing them to make full use of the ‘net’s many benefits: turning it off is not the answer).

    The BBC’s Rory Cellan-Jones has reported some of the activities for Safer Internet Day, although the Open Rights Group’s suggestion that this is all about scoring a publicity hit for little cost is a little cynical – Microsoft has a social responsibility role to play and, by working with CEOP to produce an IE 8 browser add-in, the UK subsidiary’s activities are laudable.  If other browser makers want to follow suit, they can also work with CEOP (ditto for the social networking sites that have yet to incorporate the Report Abuse button).  Indeed, quoting from James O’Neill’s post this morning:

    “We are part of the UK Council for Child Internet Safety (UKCCIS) and Gordon [Frazer – Microsoft UK MD and VP Microsoft International]’s mail also said ‘This year as part of the ‘Click Clever Click Safe’ campaign UKCCIS will be launching a new digital safety code for children – ‘Zip It, Block It, Flag It’. Over 100 Microsoft volunteers will be out in schools in the UK teaching young people and parents alike about child online safety and helping build public awareness for simple safety tips.

    Our volunteering activities today mark our strong commitment to child online safety. Online safety is not only core to our business, as exemplified by particular features in Internet Explorer 8 (IE8) and our work in developing the Microsoft Child Exploitation Tracking System (CETS) which helps law enforcement officials collaborate and share information with other police services to manage child protection cases, but it is also an issue that our employees, many parents themselves, take very seriously. As a company we put a great deal of faith in our technology, however, we are also aware that the tools we provide have to be used responsibly.”

    Anyway, I digress – part of the presentation I’ll be giving this evening will include a fact sheet, produced by Microsoft, that I’ll leave with parents and I’d like to repeat some of the advice it contains here (with a few edits of my own…).

    Safety Considerations

    The Internet is a fantastic resource for young people but we must remember that, just as in the real world, there are potential dangers to consider:

    • Control – Personal information can be easily accessed if it is posted online. Consider what information about your child someone could access online.
    • Contact – Paedophiles use the Internet to meet young people and build up a relationship.  This often starts in a public environment such as a chat room or online game, with trust built up before the person becomes an online friend for 1-1 conversations.
    • Cyberbullying – Other people may make use of technology to bully a young person 24/7.  By using online technology a bully can gain an instant and wide audience for their bullying. Cyberbullying can include threats and intimidation as well as harassment and peer rejection.
    • Content – The Internet can contain inappropriate images of violence and pornography that you might be unhappy for your child to have access to.

    Top Tips for Parents

    These simple rules can help to keep children safe:

    • Keep your PC in an open space where possible to encourage communication.
    • Discuss the programs your children use.
    • Keep communication open with regards to who they are chatting to online.
    • Discuss their list of contacts and check they know all those they have accepted as friends.
    • Consider using the same technology so you can understand how it works.
    • Talk to your children about keeping their information and photos private using privacy settings on sites such as Bebo and Facebook.
    • Teach your children what personal information is and that they shouldn’t share it online with people they don’t know.
    • Make use of Parental Controls where available. These can allow you to control the amount of time your children are online, the sites they can access and the people they can talk to.  Controls are available for many products including Windows (Vista and 7), Mac OS X, Xbox and Windows Live (Family Safety), or more technical users might consider using an alternative DNS provider such as OpenDNS.

    Some useful links include:

    How to Get Help

    For Young People:

    For Adults:

    • Adults can speak to The Samaritans. The Samaritans provide confidential emotional support for people who are in emotional distress. If you are worried, feel upset or confused and just want to talk you can email the Samaritans or phone 08457 90 90 90.

    I forgot that presenting at a school where I have an association means that some of the people in the audience are my friends (blurring my personal/professional boundary…) but hey, there are some important messages at stake here.  If all goes well tonight, I’ll be contacting other schools in the area to do something similar.

    [Updated 24 November 2014: CBBC Stay Safe link updated; Metropolitan Police link added]

  • Raising parents’ Internet awareness

    UK-based readers of this blog who also subscribe to Microsoft’s UK TechNet Newsletter may have noticed a reference to the upcoming European Safer Internet Day. Quoting from the newsletter:

    “To support the day and the launch of the new digital code for children, Microsoft is offering all UK schools the opportunity to host their own parent’s awareness session. These virtual sessions offer the opportunity to host a parents evening with a web cast presentation led by a Microsoft volunteer to inform and educate parents on the technology their children are using and how they can keep them safe when online. To find out more or to book a presentation for your school please call Karina Gibson […]”

    [Microsoft TechNet Newsletter, 21 January 2010]

    I was able to see Karina present this session a few months ago, and I have to admit that I found it a moving and worthwhile experience – my children are still very young but it certainly taught me some of the issues that children and young people face in our online society and what parents can be doing to support safe Internet usage (turning it off is not the answer!). Consequently, I’m now liaising with my local schools and hope to be delivering at least one session soon. If you want to know more – contact Karina Gibson at Microsoft UK (I’ve left her contact details out of this blog post to avoid spam, but the Microsoft UK switchboard number is 0870 60 10 100).

  • Writing a macro to e-mail a worksheet from an Excel workbook

    I spent most of today trying to catch up with my expenses (in order to plug the rather large hole in my bank balance before Christmas).  I work for a large IT company, and to say that our systems are antiquated would be an understatement: I understand that my expense claims are entered onto a computer system prior to payment but, before that, I have to complete an Excel spreadsheet and mail a hard copy, with receipts attached, to a team that processes them (which is corporate-speak for wasting my time, their time, and my manager’s time to quibble over minor infringements).  Recently a new level of bureaucracy was added to the process and, before snail-mailing the hard copy to be processed, I also have to e-mail a soft copy to my manager for approval, using a pre-defined format for the mail subject header.

    You can probably tell by the tone of this post that I’m no fan of this process.  I understand from contacts at Microsoft, for example, that their system is entirely electronic, although paper receipts do also need to be submitted for audit purposes and I can’t see why we couldn’t do something similar.  Still, it could be worse: when I worked for a major fashion design, marketing and retail organisation a few years back, they insisted that I staple each receipt to a sheet of A4 paper first…

    Anyway, after messing up the process a couple of times today and recalling messages with incorrectly formatted subjects, I decided that it’s something that should be automated.  I’ve never written any Visual Basic for Applications (VBA) before, but, armed with a few code snippets from the web, I managed to write a macro (the whole thing took me about 30 minutes to pull together and debug):

    Sub SendToManager()
    '
    ' SendToManager Macro
    ' Macro to send the completed expense worksheet to one's Manager
    '
    ' Keyboard Shortcut: Ctrl+Shift+M
    '

        ' Copy the right-most worksheet into a new workbook and send it as an attachment
        ThisWorkbook.Sheets(ThisWorkbook.Sheets.Count).Copy

        With ActiveWorkbook

            ' Build the subject line from cells in the copied sheet, then mail it
            .SendMail Recipients:="mail@domain.com", Subject:="Expenses for approval: " & Range("C8").Value & ", " & Range("O8").Value & ", " & Format(Range("O9").Value, "Long Date") & ", " & Format(Range("Q48").Value, "Currency")

            ' Discard the temporary copy without saving
            .Close SaveChanges:=False

        End With

    End Sub

    The code is here for anyone who might find something similar useful… I’m sure that it will need modification to suit someone else’s requirements but the basic idea is there.  Basically, we create a copy of the right-most worksheet in the Excel workbook (I create a new sheet for each claim, working left to right…), send it to the specified recipient (one change might be to prompt for a user name) and format the subject with data from the sheet so that it eventually reads “Expenses for approval: name, employee number, claim date, claim value” before sending the mail.  Simple really.

    Here are a few links that helped me out in doing this:

  • Building a low-power server for 24×7 infrastructure at home: Part 2 (assembly and initial configuration)

    Yesterday I wrote about how I’d been looking to create a server that didn’t consume too much power to run my home infrastructure and finally settled on a mini-ITX solution.  This post continues the theme, looking at the assembly of the unit, installation of Windows Server, and finally, whether I achieved my goal of building a low-power server.

    As I commented previously, it’s been at least 10 years since I built a PC from scratch and it’s still a minefield of connectors and components.  I took the Travla C158 case and Intel D945GCLF2 board that I had purchased and added 512MB of DDR2 RAM and a 250GB Seagate Barracuda (ST3250620NS) that were not being used in any of my other machines.  I didn’t fit an optical drive, electing to use a USB-attached one for setup (more on that in a moment) and the case also has a slot for a card reader, which I really should consider filling (or blanking off).

    With all the components ready, this is the process I followed:

    1. Open the top cover of the case.
    2. Remove the media drive and hard drive holders.
    3. Fix the hard disk to its holder and refit.
    4. Fit the gasket that surrounds the various ports (supplied with the motherboard) to the case.
    5. Fit the motherboard and PCI riser.
    6. Fit a blanking plate for the (unused) PCI card slot.
    7. Install some DDR2 memory in the motherboard’s single memory slot.  Unfortunately the module that I used does not have a low-enough profile to allow the media drive holder to be refitted, so I’ll be looking for some more (512MB isn’t much for a modern operating system anyway).
    8. Connect the case fan to the jumper on the motherboard.
    9. Connect the side panel audio ports to the motherboard (the labelling on the connectors did not match Intel’s instructions for the motherboard but I followed Gabriel Torres’ Hardware Secrets article on installing frontal audio plugs – sound is not really a concern for me on a server).
    10. Connect the front panel connectors to the motherboard, using the pattern shown in the instructions (noting that the case I selected doesn’t have a reset button, so pins 5 and 7 are not connected).
    11. Connect the side panel USB ports to the motherboard (single jumper).
    12. Connect both the power jumpers (2×2 and 2×10) to the motherboard.
    13. Connect the SATA hard drive power and data cables (the data cable was supplied with the motherboard, along with an IDE cable that I did not use).
    14. Install the mounting kit, ready to fix the PC to the wall of my office (I also considered placing it in the void between the downstairs ceiling and upstairs floor… but decided it wasn’t really necessary to bury the machine inside the fabric of the house!).
    15. Check that the BIOS configuration jumper block is set to pins 1 and 2 (normal) and refit the top of the case, then boot the PC.
    16. Press F2 to enter the BIOS configuration utility and change the following values:
      • Set the date and time (on the Main screen).
      • Under Boot Configuration on the Advanced screen, enable the System Fan Control.
      • On the Power screen, set the action After Power Failure to Power On.
      • On the Boot screen, ensure that Boot USB Devices First is enabled.
    17. Connect a DVD drive and boot from a Windows setup DVD.

    I did manage to boot from my DVD drive once; however, I had left the wrong DVD in the drive, so I rebooted.  After rebooting I was unable to get the PC to boot from the external DVD drive (a Philips SPD3900T).  I tried different USB ports, I changed BIOS options, I even reset the BIOS jumper to pins 2 and 3 (which provides access to some extra settings in the BIOS) but nothing worked, so I configured a USB thumb drive to install Windows Server 2008 R2 and that booted flawlessly.  I later found that Windows didn’t recognise the DVD drive until I had reset its power (which may also have resolved my issues in a pre-boot environment); however, it’s all a bit odd (I hadn’t previously experienced any issues with this external DVD drive), and I do wonder if my motherboard has a problem booting from USB-attached optical media.
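    For anyone who hasn’t prepared a bootable USB stick for a Windows installation before, the sketch below (in Python, around diskpart and xcopy) shows the general idea rather than my exact steps – the disk number, drive letters and the optional bootsect step are all assumptions to verify first, and be warned that diskpart’s clean command wipes the selected disk:

    import os
    import subprocess
    import tempfile

    # diskpart commands: wipe the stick and create a single active NTFS partition
    DISKPART_SCRIPT = "\n".join([
        "select disk 1",   # assumption: the USB stick is disk 1 - confirm with 'list disk' first!
        "clean",
        "create partition primary",
        "select partition 1",
        "active",
        "format fs=ntfs quick",
        "assign",
        "exit",
    ])

    def prepare_usb():
        # diskpart /s <file> runs the commands non-interactively from a script file
        fd, script_path = tempfile.mkstemp(suffix=".txt", text=True)
        with os.fdopen(fd, "w") as f:
            f.write(DISKPART_SCRIPT)
        try:
            subprocess.run(["diskpart", "/s", script_path], check=True)
        finally:
            os.remove(script_path)

        # Copy the installation files from the DVD (assumed D:) to the stick (assumed F:)
        subprocess.run(r"xcopy D:\*.* F:\ /s /e /h", shell=True, check=True)

        # Some machines also need the boot code writing to the stick, for example:
        # subprocess.run(r"D:\boot\bootsect.exe /nt60 F:", shell=True, check=True)

    if __name__ == "__main__":
        prepare_usb()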

    The Windows Server setup process was smooth, and all of my devices were recognised (although I did need to set the screen resolution to something sensible), leaving just the configuration of the operating system and services (adding roles, etc.).

    With Windows Server 2008 R2 running, I decided to take a look at the power usage on the server and it seems to tick over at around 35W.  That’s not as low as I would like (thanks to the Intel 945GC chipset – the CPU itself only needs about 8W) but it’s a lot better than running my Dell PowerEdge 840 all day.  There are some other steps I can take too – I could potentially reduce hard disk power consumption by replacing my traditional hard drive with an SSD as the Barracuda pulls about 9W idle and 12W when seeking (thanks to Aaron Parker for that suggestion).  It may also be that I can do some work with Windows Server to reduce its power usage – although putting a server to sleep is probably not too clever!  A brief look at the energy report from powercfg.exe -energy indicates that the USB mass storage device may be preventing processor power management from taking place – and sleep is disabled because I’m using a standard VGA driver (vgapnp.sys).  Microsoft has written a white paper on improving energy efficiency and managing power consumption in Windows Server 2008 R2 and this blog post from the Windows Server Performance team looks at adjusting processor P-states.  It may be some time before I reach my nirvana of a truly low-power infrastructure server, but I’ll write about it if and when I do – and 35W is still a lot better than 100W.
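    As an illustration of the sort of tweaking involved, here’s a minimal Python sketch around powercfg.exe that generates the energy report and caps the maximum processor state to encourage lower P-states – the 60-second trace, the 80% cap and the report filename are arbitrary examples rather than recommendations, and it needs to run from an elevated prompt:

    import subprocess

    def energy_report(path="energy-report.html", seconds=60):
        # Trace the system for a while and write an HTML report of power efficiency issues
        # (check=True is deliberately omitted - powercfg may return a non-zero exit code
        # simply because it found issues to report)
        subprocess.run(
            ["powercfg.exe", "-energy", "-output", path, "-duration", str(seconds)]
        )

    def cap_processor_state(max_percent=80):
        # SCHEME_CURRENT / SUB_PROCESSOR / PROCTHROTTLEMAX are built-in powercfg aliases
        subprocess.run(
            ["powercfg.exe", "-setacvalueindex", "SCHEME_CURRENT",
             "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(max_percent)],
            check=True,
        )
        # Re-apply the current scheme so the new value takes effect
        subprocess.run(["powercfg.exe", "-setactive", "SCHEME_CURRENT"], check=True)

    if __name__ == "__main__":
        energy_report()
        cap_processor_state()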