This post was originally published in 2005 and its contents may now be out-of-date.
Monthly Archives: September 2005
Wednesday 28 September 2005 – 18:30
Yesterday, Microsoft released Office 2003 Service Pack 2 (SP2).
SP2 is essentially a rollup of fixes for Office (full technical details can be found in Microsoft knowledge base article 887616), but it also includes enhancements to Outlook 2003’s Junk E-mail Filter to provide protection against phishing attacks, automatically evaluating incoming messages to see if they might be suspicious, potentially fraudulent, or part of a phishing scheme. For added security, messages moved to the Junk E-mail folder now have their links disabled and their format converted to plain text (any message later moved out of the Junk E-mail folder has its links re-enabled and its original format restored, unless the Junk E-mail Filter considers the message to contain suspicious links, in which case the links remain disabled).
Further details can be found at Microsoft Office Update.
Tuesday 27 September 2005 – 9:51
I just stumbled across a comment in one of Paul Thurrott’s Windows IT Pro magazine network WinInfo Daily Updates from a few weeks back that the next version of Microsoft Office (Office 12) might be called “Microsoft Office Vista”. If that does happen, I think it would be a really bad idea…
To put it simply, consumers (and business end-users) get confused about what software they use. That makes life harder for people like me. Whenever I start working with a new organisation, I am often amazed to find how many names a single critical line-of-business application is known by, and I’ve lost count of the number of times people have tried to tell me that they use Windows 97 or Office 98 (I know that there was an Office 98 – but that was for the Macintosh).
The last time Microsoft released a version of Office which included part of the operating system name in its product name (Office XP), consumers (and senior IT management – the non-technical ones) got confused and thought that Office XP was somehow tied to Windows XP (as far as I’m aware there is no such constraint). Similarly, I understand that Office 12 will be supported on both Windows XP and Windows Vista, so any Office product name including the Vista moniker could be confusing.
Personally I liked the old system of using numbers to describe products (the one that works well for the competition too) and think it should be the “Microsoft Office System, version 12” (well, version 11 really, because I seem to recall that the version numbers jumped from 4.3 to 6 a few years back as part of a game of “version number catchup”, but that’s too long ago to bother about now…). What about calling Windows Vista “Microsoft Windows, version 6”?
Monday 26 September 2005 – 18:54
Last week I needed to view a colleague’s project plan (in Microsoft Project) to make sure I hadn’t been stitched up – sorry, to ensure that all the activities had been captured in the correct sequence and within a realistic time frame. Because I didn’t have Microsoft Project installed on my PC, I had to go through the correct processes to get a licence allocated and the software installed.
I could have just got the CD out of my drawer and installed an illegal copy, but I was “being good” and my honesty cost my Manager’s budget £223.35 – and that’s with a heavy volume licence discount.
Literally two days after the software was installed, I attended an event where I was given a copy of Seavus Project Viewer. For anyone who’s not aware of this product’s existence (I wasn’t), it is an application which allows Microsoft Project (.MPP) files to be viewed by users who don’t have a copy of Microsoft Project installed. At only $39, this would have been substantially more cost-effective than licensing Microsoft Project so I thought I’d blog about it and save someone else from spending the money if they only need read-only access to project plans.
Friday 23 September 2005 – 17:05
Yesterday, I was at a very interesting presentation from Fujitsu-Siemens Computers. It doesn’t really matter who the OEM was – it was the concept that grabbed me, and I’m sure IBM and HP will also be looking at this and that Dell will jump on board once it hits the mass market. That concept was processor area networking.
We’ve all got used to storage area networks (SANs) in recent years – the concept being to separate storage from servers so that a pool of storage can be provided as and when required.
Consider an e-mail server with 1500 users and 100Mb mailbox limits. When designing such a system, it is necessary to separate the operating system, database, database transaction logs, and message transfer queues for recoverability and performance. The database might also be split for fast recovery of VIPs’ mailboxes, but the basic need is to provide up to 150Gb of storage for the database (1500 users x 100Mb). Then another 110% storage capacity is required for database maintenance and all of a sudden the required disk space for the database jumps to 315Gb (150Gb of data plus a further 165Gb of headroom) – and that doesn’t include the operating system, database transaction logs or message transfer queues!
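The sizing arithmetic above can be sketched in a few lines of Python. The figures (1500 users, a 100Mb mailbox limit, 110% maintenance headroom) are the example’s own; the function name and the decimal Mb-to-Gb conversion are illustrative assumptions:

```python
# Sketch of the mailbox database sizing from the example above.
# Figures (1500 users, 100Mb limit, 110% maintenance headroom) come from
# the post; the helper name and decimal units are illustrative.

def database_storage_gb(users, mailbox_limit_mb, maintenance_pct=110):
    """Provisioned database storage in Gb: the maximum theoretical
    mailbox data plus headroom for offline database maintenance."""
    data_gb = users * mailbox_limit_mb / 1000      # 1500 x 100Mb = 150Gb
    headroom_gb = data_gb * maintenance_pct / 100  # a further 110% = 165Gb
    return data_gb + headroom_gb                   # 315Gb in total

print(database_storage_gb(1500, 100))  # 315.0
```

Note that this is the maximum theoretical capacity; as discussed below, real-world factors such as single instance storage usually mean less is actually used.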
Single instance storage might reduce this number, as would the fact that most users won’t have a full mailbox, but most designers will provide the maximum theoretical capacity “just in case” because to provision it later would involve: gaining management support for the upgrade; procuring the additional hardware; and scheduling downtime to provide the additional storage (assuming the hardware is able to physically accommodate the extra disks).
Multiply this out across an organisation and that is a lot of storage sitting around “just in case”, increasing hardware purchase and storage management costs in the process. Then consider the fact that storage hardware prices are continually dropping and it becomes apparent that the additional storage could probably have been purchased at a lower price when it was actually needed.
Using a SAN, coupled with an effective management strategy, storage can be dynamically provisioned (or even deprovisioned) on a “just in time” basis, rather than specifying every server with extra storage to cope with anticipated future requirements. No longer is 110% extra storage capacity required on the e-mail server in case the administrator needs to perform an offline defragmentation – the administrator simply asks the SAN administrator to provision that storage as required from the pool of free space (which is still required, but is smaller than the sum of all the free space on all of the separate servers across the enterprise).
Other advantages include the co-location of all mission critical data (instead of being spread around a number of diverse server systems) and the ability to manage that data effectively for disaster recovery and business continuity service provision. Experienced SAN administrators are required to manage the storage, but there are associated manpower savings elsewhere (e.g. managing the backup of a diverse set of servers, each with their own mission critical data).
A SAN is only part of what Fujitsu-Siemens Computers are calling the dynamic data centre, moving away from the traditional silos of resource capability.
Processor area networking (PAN) takes the SAN storage concept and applies it to the processing capacity provided for data centre systems.
So, taking the e-mail server example further, it is unlikely that all of an organisation’s e-mail would be placed on a single server and as the company grows (organically or by acquisition), additional capacity will be required. Traditionally, each server would be specified with spare capacity (within the finite constraints of the number of concurrent connections that can be supported) and over time, new servers would be added to handle the growth. In an ideal world, mailboxes would be spread across a farm of inexpensive servers, rapidly bringing new capacity online and moving mailboxes between servers to marry demand with supply.
Many administrators will acknowledge that servers typically only average 20% utilisation and by removing all input/output (I/O) capabilities from the server, diskless processing units can be provided (effectively blade servers). These servers are connected to control blades which manage the processing area network, diverting I/O to the SAN or the network as appropriate.
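The consolidation argument above can be illustrated with a back-of-the-envelope calculation. Only the 20% average utilisation figure comes from the post; the server counts and the 70% target utilisation for a pooled farm are hypothetical:

```python
import math

# Back-of-the-envelope consolidation sketch. The 20% average utilisation
# figure comes from the post; the other numbers are hypothetical.
dedicated_servers = 50    # silo model: one workload per server
avg_utilisation = 0.20    # typical average utilisation per server
target_utilisation = 0.70 # a pooled blade farm can safely run hotter

# Total work actually being done, expressed in "fully busy servers"
busy_equivalent = dedicated_servers * avg_utilisation

# Diskless processing blades needed if that work is pooled across a PAN
blades_needed = math.ceil(busy_equivalent / target_utilisation)
print(blades_needed)  # 15 blades instead of 50 servers
```

The exact ratio depends on how spiky the workloads are and how much separation is required, but the direction of the saving is clear.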
Using such an infrastructure in a data centre, along with middleware (to provide virtualisation, automation and integration technologies) it is possible to move away from silos of resource and be completely flexible about how services are allocated to servers, responding to peaks in demand (acknowledging that there will always be requirements for separation by business criticality or security).
Egenera‘s BladeFrame technology is one implementation of processor area networking and last week, Fujitsu-Siemens Computers and Egenera announced an EMEA-wide deal to integrate Egenera Bladeframe technology with Fujitsu-Siemens servers.
I get the feeling that processor area networking will be an interesting technology area to watch. With virtualisation rapidly becoming accepted as an approach for flexible server provision (and not just for test and development environments), the PAN approach is a logical extension to this and it’s only a matter of time before PANs become as common as SANs are in today’s data centres.
Friday 23 September 2005 – 11:25
Having just said that Microsoft needs to be better at innovating if it is to survive another 30 years in business, it seems that the next version of Microsoft Office has surprised everyone with a new, simplified user interface that removes much of the toolbar clutter. The press release includes screenshots for Office 12 core applications as well as a description of why Microsoft made the changes (there’s also a review of the interface on Office Watch).
I laughed when I read the comments about the new interface on Mark Harrison’s blog (which I was alerted to by Rory) – one which says “I hope [Microsoft] makes it easy for me to put things back the way I want” and another (from Mark) saying “[let’s] have a dinosaur button to revert… to [the] old UI”! Could this be another case where Microsoft are forced to provide a “classic” interface to please those who don’t want to move with the times? The press release indicates that there are no plans to do so at present, but that doesn’t mean things won’t change before the product is released. Whilst I appreciate that, from a user familiarity perspective, many organisations will be reluctant to change (as there will be an associated training cost), the current UI has evolved over many years and is far too complex.
Interestingly, Microsoft say that this only applies to their authoring applications – Word, Excel, PowerPoint, Access and parts of Outlook. Traditional menus and toolbars will still be used in many areas of the Office suite.
I can’t wait to see what they do to Outlook.
Friday 23 September 2005 – 8:58
Microsoft turns 30 today. We tend to associate Information Technology (IT) with a rapidly expanding market of young start-up companies but, whilst 30 years is nothing compared to global giants like IBM, Hewlett-Packard (HP) and Fujitsu, it is still a significant milestone.
Microsoft has become ubiquitous – largely through its Windows operating system and Office productivity suite – but recently (and somewhat worryingly for someone who makes a living architecting solutions based on Microsoft technology) the company has been drifting. MSFT stock (which once rose at astronomical rates, splitting nine times between the company’s IPO in 1986 and 2003) has been virtually static in recent years, leading to a number of reports suggesting that the company has lost its way. Maybe it was because Bill Gates stepped down as CEO; maybe it was just the sheer size of the giant, which employs almost 60,000 staff in 100 countries and had annual revenues of $39.75bn in 2004/5 (up 8% on 2003/4), generating profits of $12.25bn (up 50%).
On the surface, these figures look great – 8% growth and 50% increase in profits. But a look at the figures for the last 10 years shows that growth has slowed from 49% in 1995/6.
The trouble is that Microsoft has been losing ground to young upstarts like Google (mission: “to organize the world’s information and make it universally accessible and useful”). Let’s face it, it was Microsoft that was the young upstart when Bill Gates and Paul Allen persuaded IBM to make MS-DOS the operating system for the first PC in 1981 (ousting CP/M). After being slow to embrace the Internet and a series of legal wrangles (some justified, others not), Microsoft was also late to embrace search technologies, whereas the current industry darling dominates with 36.5% of the web search market.
It didn’t help that for a period between 1995 and 2001, the flagship product (Windows) was split between the (unreliable and insecure) Windows 95, 98 and ME product line and the expensive business version, Windows NT (later Windows 2000). Since Microsoft finally converged the two product lines with the launch of Windows XP (which is still based on the Windows NT kernel) there has been a push towards delivery of a trustworthy computing platform and, despite its critics, I think Microsoft generally does pretty well there. If you have the largest market share you will get attacked by malware writers – that means Microsoft for PC operating systems and Nokia for mobile handsets!
The trouble is that since Windows 2000 and XP sorted out the security issues, operating system upgrades have been a little dull, with limited innovation. It doesn’t help that any bundling of middleware seems to result in a lengthy courtroom battle but without innovation, there is no reason for consumers to upgrade, and in the business market, where IT is a business tool (not the business itself), IT Managers are under pressure to reduce costs through standardisation. That often means standing still for as long as possible.
I really hope that Windows Vista/Longhorn and Office 12 are not the death of Microsoft. Microsoft’s mission is “enabling people and businesses to realize their full potential” and this week, in an attempt to realise its own potential, a massive re-organisation was announced, with the aim of making the giant more dynamic and hence better able to respond to the industry (let’s face it, Microsoft has never been the innovator, but it is very good at marketing other people’s ideas and making them work – even MS-DOS was licensed from Seattle Computer Products). Maybe the new organisation will help the timely delivery of products, but it’s amazing how the rising fortunes of the Mozilla Foundation’s Firefox browser have focused Microsoft on delivering a new version of Internet Explorer (after years of poor standards compliance) with very few new features, and how the desktop search functionality provided by Google (and others) has focused Microsoft’s attention in this space (even if the current MSN Search strategy appears to be failing). Maybe increased competition in the operating system market (come on Apple, give us OS X for the PC – not just Intel-based Macs, which are really just Apple PCs and could also run Windows…), in the shape of the major Linux distributions (Red Hat and Novell SuSE) or free UNIX distributions like the x86 version of Sun Solaris, will focus the giant on delivering great new features for Windows.
Microsoft was built on a dream of “a computer on every desk and in every home”. Despite all of the negative publicity that Microsoft tends to attract, it seems to me that (at least in the “developed” world) this dream has largely been realised. Let’s see what the next 30 years brings.
Wednesday 21 September 2005 – 22:03
Readers of this blog will be aware that I am a great fan of my iPod Mini (even if I do think Apple is a touch monopolistic in the digital media market). I also like Volkswagen cars. Last year my wife and I bought a Polo; I’ve had a few Golfs (one Mk II and a couple of Mk IVs); until recently I drove a 2004 Passat Estate 1.9TDI 130PS Highline (which I really liked); and I would love to own a 1960s Microbus (or even the 21st Century Microbus if it ever makes it to market).
My new employer’s car scheme doesn’t include Volkswagen so I have a Saab 9-3 SportWagon on order and as I mentioned in my recent post about the iPod Nano, it has a 3.5mm jack for connecting an MP3 player to the audio system which should come in very handy.
Now Volkswagen have gone one better and soon all of their new cars will offer a stereo system with iPod connectivity. Paul Thurrott reports that this will let “users manage the music on an iPod or other portable audio player through the stereo’s controls and display. The devices will plug into the car through a standard USB [connection], which virtually all MP3 players and portable storage devices use these days. Apple’s iPod is specifically supported with a special menu, but any USB-based device will work”.
Could this signal the death of the in-car CD-player?
Monday 19 September 2005 – 17:33
I often link to Wikipedia as I think it’s one of the best “for further reading” information sources on the Internet; however, today I stumbled across Wiktionary – a sister project which provides an online dictionary and thesaurus (although I must confess that, at the time of writing, the thesaurus category is slightly limited and I tend to use Microsoft Word for that feature).
I think that the wiki paradigm has a tremendous potential for information sharing (one which many companies seem totally blind to at present) and it turns out that the Wikimedia Foundation has a whole load of similar open content projects.
I’ll end this post with something I picked up from Wikiquote:
“I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones”.
[Linus Torvalds (at the 1991 launch of his Linux operating system)]
Monday 19 September 2005 – 17:03
A few minutes ago I published some rules for blogging and one of the guidance notes was to post regularly. That particular guidance could be interpreted a number of ways (once a week; daily; fast and frequent; or just whenever there is time) and this blog tends to fall into the last of those categories, but even so, regular readers might have noticed the output level drop recently. This is down to a number of things, including a recent holiday; returning to work to find a greatly increased workload; and a 10-month-old son with a cold (the result of which is a lack of sleep for his parents, directly impacting upon my desire to spend my evenings writing blog posts, even if it does seem to be affecting my Google PageRank).
Even if the quantity of posts has dropped slightly, I hope the quality is still there, so if you keep reading, I’ll keep blogging (but it might sometimes be a few days between posts).
Before I sign off, thanks to everyone who has left a comment against a post. As I highlighted recently when I added the rules for comments, I don’t have time to respond to every request for help; however, I do read all of the comments and it’s always good to hear when something I’ve written has been useful for someone else out there in cyberspace.