Useful Links: May 2009

A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:

How a single sheet of MDF has changed my life

In my blog post last year about Microsoft Photosynth, I used my desk as an example. Even though I had cleared it up before taking the images for the photosynth, it still looked congested, as the various items of IT kit that I use for my work and hobbies vied for space on a desk that is best described as small but functional.

So, last Friday, I decided that something had to change. As well as planning a trip to Ikea for some new shelves to organise the assortment of items spreading across the office floor (most of which are waiting to be advertised on eBay, Freecycled or otherwise disposed of), I purchased a sheet of MDF. Nothing remarkable – just a single sheet of medium-density fibreboard, sized 1900x605x18mm.

Laying this across my desk and filing cabinet gave me a large expanse of flat work surface, upon which to place: the MacBook and external monitor that I use for digital photography; the notebook PC that I use for work; my Netgear ReadyNAS; the PC that I use for video production; my IP phone; and the mouse and keyboard that control the server under the desk (connected to the same monitors as the PC, using a Linksys KVM solution). Now I can reach them all, I can leave my graphics tablet and film scanner on the desk all the time, and I still have room for papers when I’m working.

My desk, in my home office

It might only overhang the edge of the desk by a few inches on either side, but I really cannot overstate how big a difference this simple change has made to my workspace!

A lovely new tablet PC?

I bumped into Aaron Parker at tonight’s Windows Server/Active Directory/Vista joint user group meeting and he showed me his latest toy: a Dell Latitude XT2 tablet PC. The great thing about this tablet is that it works with the stylus and with fingers, so you can use the multitouch features in Windows 7 but also use the stylus to write notes.

I’ve raved about tablets before but this one is sweet. In fact, Dell seem to have a lot of nice computers right now, although I tend to favour ThinkPads and the Lenovo ThinkPad X200 looks good too (besides which: my faith in Dell has never recovered from the D600 I used when I worked at Conchango; I have to use Fujitsu kit at work; and justifying a £1500 tablet PC would be a bit difficult in today’s economic climate)… anyway, Aaron’s written up his experiences of Windows 7 on the XT2 over at his Stealth Puppy blog.

Keeping an eye on FTP upload progress with hash printing

I spent most of yesterday working on some more How-Do-I? videos for Microsoft. The delivery mechanism for these is FTP, and the command-line FTP client in Windows is pretty basic, so it doesn’t show upload progress by default. When I’m uploading 50MB .zip files to a server, though, it’s nice to know that the transfer is still working.

Then I remembered something that Garry Martin had mentioned a few months back – the hash command.

Before starting an upload, I turned on hash printing, so that the FTP client prints # characters to the console during the upload, indicating progress. I still prefer the BSD version on my Mac, which shows progress as a percentage by default, but at least I could see that Windows XP was pushing the file to the server.

Windows FTP client with hash printing enabled
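Incidentally, the same trick can be scripted: Python’s standard ftplib lets you pass a callback to storbinary, which makes it easy to print a hash mark per block, much like the command-line client does. This is only a sketch – the host, credentials and filenames are placeholders, not a real server:

```python
import sys
from ftplib import FTP

def upload_with_progress(ftp, local_path, remote_name, block_size=2048):
    """Upload a file over FTP, printing one '#' per block sent
    (mimicking the Windows ftp client's 'hash' command)."""
    sent = 0

    def on_block(block):
        # storbinary calls this once per block transferred
        nonlocal sent
        sent += len(block)
        sys.stdout.write('#')
        sys.stdout.flush()

    with open(local_path, 'rb') as f:
        ftp.storbinary('STOR ' + remote_name, f,
                       blocksize=block_size, callback=on_block)
    print()  # newline after the row of hashes
    return sent

# Placeholder usage – ftp.example.com is not a real server:
# ftp = FTP('ftp.example.com')
# ftp.login('username', 'password')
# upload_with_progress(ftp, 'episode1.zip', 'episode1.zip')
```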

Handy Linux distro that can be built on the fly

A couple of days back, my friend Stuart and I were trying to configure a device via a serial port. You’re probably thinking that’s not so hard: just hook up a console cable, fire up a terminal emulator, make sure you have the right settings and you’re good to go, hey?

Except that neither of us had a serial port on our laptops… and the only PC we had available with a serial port wasn’t configured with an operating system (at least not one we had the password for).

Thanks to a great Linux distribution called Slax, we were able to build a boot CD that included minicom in just a few seconds; after downloading and burning it, we could boot the PC from CD. All it took then was to configure minicom to use /dev/ttyS0 (had we used a USB-to-serial adapter, it would have been /dev/ttyUSB0), with /var/lock for the lockfile, 9600 8N1 for the baud rate and parity, hardware flow control on and software flow control off, and we were connected to the console output (David Sudjiman described something similar to configure his Cisco router from Ubuntu).
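For reference, those settings can also be saved as minicom defaults rather than entered interactively each time. This is just a sketch of the relevant entries in a ~/.minirc.dfl file – the exact keys may vary between minicom versions, and the device path assumes a built-in serial port:

```
# ~/.minirc.dfl – minicom default settings (sketch)
pu port         /dev/ttyS0
pu baudrate     9600
pu bits         8
pu parity       N
pu stopbits     1
pu rtscts       Yes
pu xonxoff      No
```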

I’m sure I could have used an alternative, like Knoppix but the real beauty of Slax was the ability to create a custom build on the fly with just the modules that are required. I could even put it on a USB stick…

Reading around on the ‘net afterwards, I came across Van Emery’s Linux Serial Console HowTo, which turns things around the other way (using a serial port to get into a Linux machine). I thought it might be fun (in a retro sort of way) to hook up some dumb terminals to a Linux PC, but I’m not sure what I’d do with them… web browsing via Lynx? A bit of command-line e-mail? Definitely a geek project.

SharePoint Designer speeds up the process of checking in multiple files

After another day of playing with SharePoint, which varied between quite interesting and extremely tedious, I thought I’d share a little tip that my colleague, Alan Dodd, gave me tonight.

After copying a load of files to SharePoint in Explorer view, it’s a lot quicker to check them in using SharePoint Designer which, incidentally, is now a free download (although registration is required).

System Center Operations Manager 2007 R2 is released to manufacturing

In my recent blog post about the R2 wave of Microsoft virtualisation products, I mentioned the management products too – including System Center Operations Manager (SCOM) 2007 R2 (which, at the time, was at the release candidate stage). That changes today, as Microsoft will announce that SCOM 2007 R2 has been released to manufacturing.

New features in SCOM 2007 R2 include:

  • “Enhanced application performance and availability across platforms in the datacenter through cross platform monitoring, delivering an integrated experience for discovery and management of systems and their workloads, whether Windows, UNIX or Linux.
  • Enhanced performance management of applications in the datacenter with service level monitoring, delivering the ability to granularly define service level objectives that can be targeted against the different components that comprise an IT service.
  • Increased speed of access to monitoring information and functionality with user interface improvements and simplified management pack authoring. Examples include an enhanced console performance and dramatically improved monitoring scalability (e.g., over 1000 URLs can be validated per agent, allowing scaling to the largest of web-based workloads).”

Further details are available on the Microsoft website.

Server Fault

Jeff Atwood has a great blog called Coding Horror (I’ve written about it here before) and last year he (and some friends) started a new site for developers called Stack Overflow.

Stack Overflow is a Q&A site – but (and I thought this before I read it on Jeff’s blog post announcing Stack Overflow) it’s not like Experts Exchange, the site that charges money for access to community-generated content (thank goodness for Google’s cache), because it’s free – as community-generated content should be. Of course, not everyone will agree with my opinion there – I’m sure the people at Experts Exchange think they have a fine business model, but I think it stinks to charge money for something that has been generated by your users. Anyway, back to the point: Stack Overflow is a sort of forum-meets-wiki-meets-blog-meets-Digg site for software engineers to ask and answer questions, earning reputation points based on the value of their input.

So Stack Overflow is a great site for developers, but I’m an IT admin type… is there something similar for us? Well, no – not really. There hasn’t been, but now the Stack Overflow guys are launching Server Fault. You can hear more about it in Jeff Atwood’s recent interview on RunAs Radio, but it’s aimed at IT professionals and system administrators, running on the same concepts as Stack Overflow.

I wish Jeff and his cohorts all the best with Server Fault and I plan to spend some time over there myself. Right now the site is in semi-public beta – if you have an OpenID and you know the password then you can get in – and the full public beta is expected to commence next week.

Fast(er) entry of person/group names in SharePoint

Together with my colleague, Alan Dodd, I’ve spent a considerable amount of time this week cleansing data in a SharePoint list. Quite why I can’t import e-mail addresses from an Excel spreadsheet to a column of type “Person or Group” I don’t know, but we couldn’t seem to make it work…

Checking names or browsing the directory was tediously slow; then I found that if I pasted the e-mail address, rather than the person’s name, SharePoint immediately resolved that address to a name and let me move on to the next record… might be a timesaver for someone?

An introduction to business intelligence for IT Managers

A few weeks ago, I caught one of the IT Manager series of webcasts that Microsoft is running, where Andrew Fryer was introducing Business Intelligence for IT Managers (I’ll steer clear of the obvious joke about IT managers and intelligence there… I might want a job as an IT Manager one day…). This was an interesting presentation for a couple of reasons: it’s not a topic that I know well; and Andrew presented 290 slides in less than an hour (which sounds a lot, but it wasn’t – he used slides with just a few words or a picture, in rapid succession – and I like that style).

I can’t find the recorded version of the presentation online but this blog post attempts to capture the main points.

According to Wikipedia, Business Intelligence (BI) can be defined as follows:

“Business intelligence (BI) refers to skills, technologies, applications and practices used to help a business acquire a better understanding of its commercial context. Business intelligence may also refer to the collected information itself.”

That’s a bit of a mouthful, but basically it makes BI seem hard. So, let’s think about intelligence without the business context – is it: knowledge and understanding (we used to think the world was flat!); about meaning and context (some information can seem obvious without context); about foresight (to predict future events); the ability to solve complex problems; or the ability to make decisions?

We make decisions all the time – and some of those decisions are poor ones – so if intelligence is about making decisions (and decisions require people), what makes a good decision? The answer is information. The information provides the meaning and context together with the background information (knowledge and understanding, likelihood of a given event occurring, etc.) to allow someone to make the right decision.

Information has been used throughout history to share stories, to learn and to discover things. Over time, information has helped to provide answers and to unlock secrets, which allowed innovation. Information has provided answers – and answers have allowed people to make decisions.

In a business context, the information is derived from data (about people, products, places, etc.). Where there are questions (which products are best? how are the sales figures looking? how are my people?), some insight is required to provide meaning and to convert raw data to information.

That data needs to be stored – originally it was stored in paper files and later on computer disks and tapes – but it also needs to be managed. The advent of databases provided a means for data to be stored and managed, but business applications were needed to abstract the management of the database from business end users. These business applications provided a better way to collect data, easing the process of data entry and managing access, to ensure that those who needed access were able to find answers to business questions. But it wasn’t easy – the data was sourced from many locations. Reports were one approach, but they were one-dimensional and siloed, fragmented and lacking insight.

The advent of data warehouses allowed data from multiple locations to be organised, managed and accessed – or consumed. Now the business applications could analyse as well as report and the term “Business Intelligence” was born. Promising more access, from more locations, BI vendors created demand for more data. This led to businesses wanting faster access to data (improved performance). But as the volume of data exploded, so did the use of personal computers, and most data ended up in desktop productivity applications like Microsoft Excel and Access. There was no single version of the truth upon which to base decisions, the data was hard to maintain and the BI tools cost a lot of money, so vendors had to find a way to reduce costs and offer increased functionality. The result was a period of vendor consolidation in the BI tools space and the formation of a few BI platforms, from companies like Oracle, SAP, IBM and Microsoft, offering more tools and more functionality, for both online and offline access.

But, for all the promises, BI tools were still not working. Business users were still confused and the business couldn’t get the answers that were needed. It wasn’t about people – it was still about disparate systems, with access to data controlled by the IT department. An overstretched IT department. So business users started to circumvent IT departments, but the BI tools were not intuitive and the users didn’t have the time to be IT administrators. Suddenly BI was about usability, and turning data into the right format to be easily consumed by people, with that data managed by IT.

There’s not just the data in the databases to consider either – there is unstructured data too. That unstructured data comes from places like blogs, wikis, e-mail messages, documents, presentations, and videos – at one point analysts considered that 80% of business was conducted based on unstructured data.

So BI is about the right person, accessing the right data, at the right time – and it needs to be people-centric because it’s generally people that make decisions, not computers. Businesses need to do more to collaborate, search, and communicate about questions and answers in order to drive innovation. Even in today’s times of economic uncertainty, BI is still a priority at CxO level in order for businesses to do more with less, to provide better insight for better decision-making, for more people.

Reporting and scorecards are important components of the BI toolset, along with dashboards to display key performance indicators, for analysis. On the desktop we still use applications like Excel but the data lives in the warehouse. Other BI features include data mining (e.g. the shopping basket analysis that supermarket chains carry out using our loyalty cards). For unstructured data, we have portals for collaboration.

In today’s BI implementations the critical success factors are sponsorship (at a senior level in the company), a compelling need, a culture of analysis (rather than looking for divine inspiration) and, most importantly, partnership between the IT department and business users.

I don’t pretend to know anything about any of the specialist BI tools but, on the Microsoft infrastructure side, we already have some useful tools. Office gives us desktop applications like Excel, there are collaboration services in the form of SharePoint products and technologies, and we have a scalable database engine in SQL Server – there’s more information on Microsoft’s BI blog and you can learn more about the products on Microsoft’s BI website. There’s also advice on planning for BI in the SharePoint Server TechCenter; webcasts, videos, virtual labs and podcasts; and more advice for IT Managers and their teams on the TechNet website. Finally, if you just want the highlights and a bit of technical analysis, Andrew Fryer’s “Insufficient Data” blog is worth a read.