More on high ISO levels

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Chatting with a friend earlier this evening, I realised that I may have confused things slightly when talking about high ISO levels in my post explaining why not all megapixels are equal.

Higher ISO film has traditionally been used to take photographs at faster shutter speeds or in poor lighting conditions (it’s not always possible, or desirable, to use a flash), although high ISO films typically introduce grain into an image. Similarly, using a higher ISO setting on a digital camera can help in the same situations – albeit at the expense of introducing digital noise. That situation is changing, as modern DSLRs such as the Nikon D3 are reported to take acceptable images at very high ISO levels (e.g. ISO 6400 – that’s six stops faster than the ISO 100 daylight film used by most of us a few years back and four stops faster than the ISO 400 film that many consumers would have used for “action” shots).
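For anyone checking the stop arithmetic, here’s a quick sketch (assuming, as above, ISO 100 for daylight film and ISO 400 for consumer “action” film):

```python
from math import log2

def stops_between(iso_from: float, iso_to: float) -> float:
    """Each doubling of ISO sensitivity is one stop."""
    return log2(iso_to / iso_from)

print(stops_between(100, 6400))  # 6.0 - six stops faster than ISO 100 film
print(stops_between(400, 6400))  # 4.0 - four stops faster than ISO 400 film
```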

For those of us who can’t afford a D3, it’s worth noting that squeezing more and more megapixels onto a tiny sensor will increase digital noise. For the reasons I described in my original post (the type of sensor, the technical differences between pixels and photosites, the firmware and software supporting the imaging chip and even the size of the pixels), the only real answer is a larger sensor – which is why a full-frame DSLR will produce appreciably better low-light images than a digital compact camera or a cameraphone, and why my Canon Ixus 70 produces terrible night-time shots on its high ISO setting.

Microsoft Licensing: Part 1 (client and server)


A few weeks back, I found myself spending the evening in a conference room at Microsoft’s UK headquarters, listening to a presentation about software licensing. For those who say I should get a life – you’re probably right and I’m sure there are better things that I could have been doing on one of the UK’s rare sunny evenings, but I’ve missed this session before and, whilst I have a pretty good grip on the technology, it’s often handy to understand a bit about the minefield that is Microsoft’s software licensing policies.

I learnt too much that evening to repeat here in one blog post, so I’m planning on writing a series on this subject. This post is part one, in which I’ll attempt to explain the basic licensing concepts around clients and servers.

All Microsoft software products (even those offered free of charge) are subject to a license to use the software – an end user license agreement, or EULA. For many products, there are client and server components – and it’s important to license the operating system as well as the application.

A common misconception is that Windows client licenses (e.g. for XP or Vista) include the right to connect to Windows servers – in fact, a client access license (CAL) is required to use Windows Server functionality. Similarly, Microsoft Outlook is included within the Microsoft Office system, but a separate CAL is needed to connect to an Exchange Server system for e-mail and other collaborative technologies.

A CAL gives a client the right to access the services of the server. It is not software and is not “installed” on a server (although it may be recorded in certain circumstances). In addition, only one CAL is needed for a given device or user, regardless of how many servers running that product are accessed.

When considering client access licenses, for many products, there are two models:

  • Per-seat licensing – with a CAL required for each device that connects to the server.
  • Per-user licensing – whereby a user CAL covers a user who accesses or utilises the server service, regardless of the number of devices that they use.

Whilst user and device CALs cost the same as one another, for many organisations, a mix of per-seat and per-user licensing is appropriate – for example a sales team with a mixture of notebook PCs and mobile devices could use per-user licensing to cover all of their many devices whereas a warehouse with many users sharing a PC, or an office with shift workers would be better served with a per-seat model.
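To make that trade-off concrete, here’s a back-of-envelope sketch (the unit price is hypothetical – the point is that user and device CALs cost the same, so only the counts matter):

```python
CAL_PRICE = 30  # hypothetical price; user and device CALs cost the same

def per_device_cost(devices: int) -> int:
    """Per-seat: one CAL per device, however many users share it."""
    return devices * CAL_PRICE

def per_user_cost(users: int) -> int:
    """Per-user: one CAL per user, however many devices they each use."""
    return users * CAL_PRICE

# Sales team: 10 people, each with a notebook PC and a mobile device.
print(per_device_cost(20), per_user_cost(10))  # 600 vs. 300 - per-user wins
# Warehouse: 30 shift workers sharing 5 PCs.
print(per_device_cost(5), per_user_cost(30))   # 150 vs. 900 - per-seat wins
```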

Per-seat licensing is available for Windows Server, Exchange Server, Office Communications Server (OCS), Office SharePoint Server (MOSS), Project Server, SQL Server and Small Business Server (SBS).

The important thing to remember is that CALs are associated with a particular product version and that it’s the server that defines the CAL version that is required – i.e. when a Windows Server 2003 machine is upgraded to Windows Server 2008, the CALs must be upgraded too; however, in a mixed environment, CALs can be used to connect to servers running downlevel operating systems.
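That rule reduces to a simple predicate – a sketch, treating the year-based versions as numbers:

```python
def cal_covers_server(cal_version: int, server_version: int) -> bool:
    """The server dictates the CAL version required; newer CALs also
    cover servers running downlevel versions."""
    return cal_version >= server_version

print(cal_covers_server(2003, 2008))  # False - upgrade the CALs with the server
print(cal_covers_server(2008, 2003))  # True - 2008 CALs cover a 2003 server
```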

For volume license customers (only), a core CAL suite is available covering Windows Server, Exchange Server, Office SharePoint Server and System Center Configuration Manager. Always sold with software assurance, the core CAL suite is less expensive than buying all four individual CALs – it costs approximately 2-3 times the price of a single CAL.

Microsoft confused many customers with the 2007 wave of products (e.g. Exchange Server 2007) by introducing a new CAL model, with a standard CAL for basic functionality and an enterprise CAL for more advanced functionality (e.g. Exchange Server 2007 Managed Folders). The important points to remember are that:

  • The standard and enterprise CALs (a poor choice of nomenclature, in my opinion) have nothing to do with whether the server application is a standard or enterprise edition product – an enterprise edition product is not required in order to use an enterprise CAL, and either type of CAL can be used with either edition. If this is confusing, it may help to think of standard and enterprise CALs as “basic” and “advanced” respectively.
  • Enterprise CALs are additive – i.e. a standard CAL is required as well as the enterprise CAL (an enterprise CAL “adds to” the functionality associated with a standard CAL).

It’s also worth noting that there is no technical enforcement of standard or enterprise features when a user connects to a server product. As with all licensing, the responsibility rests with the customer to license their software correctly although, from a technical perspective, some advanced features need to be enabled manually and this would present an opportunity to record the use of enterprise functionality.

Select and Enterprise customers can buy an Enterprise CAL (ECAL) suite for twice the price of the core CAL. This includes:

  • Core CAL (with each component counting as a standard CAL).
  • Forefront Security Suite.
  • System Center Operations Management license (a CAL to allow a client to be managed using System Center Operations Manager).
  • Windows Rights Management Services CAL.
  • Office Communications Server standard and enterprise CALs.
  • Office SharePoint Server enterprise CAL.
  • Exchange Server enterprise CAL.

The ECAL suite is always sold with software assurance, and customers without a Select or Enterprise agreement can buy enterprise CALs for MOSS and Exchange Server to top up their core CALs.

In the next part of this series, I’ll look at products that are licensed without CALs (e.g. per-processor licensing) and special cases such as external connectivity and hosted environments.

Keeping up with developments in photography


I love to take photographs – and friends and family tell me I’m good at it – but it’s been a much-neglected hobby in recent years, which is part of the reason the planned photo gallery has never made it onto this website (it will one day). I do dream of one day making a living from creating fantastic images – making photographs rather than taking them (combining the art of creating a pleasing image that tells a story with the science of a technically perfect exposure) – and so I like to take in other people’s work for inspiration.

For many years, I read photographic magazines like Practical Photography but, over time, I grew tired of the features (except for the odd two-page spread of commentary accompanied by a stunning image from professionals like David Noton) and found it all a little repetitive.

More recently I found alternative titles that catered to my needs like Digital SLR Photography but I just don’t have enough time to read photography magazines, computer magazines, IT trade publications, RSS feeds, and still fit in the odd interesting book, so I’ve started to listen to a new photography podcast when I’m in the car – This Week In Photography (TWIP).

I’ve not always been a fan of Scott Bourne’s work but Alex Lindsay really knows his stuff and, so far, TWIP has managed to avoid some of the pitfalls that have resulted in other podcasts (notably This Week In Tech – TWIT) being removed from my iTunes subscription list: keeping the show times down to around an hour, largely staying on topic (sticking with the content, rather than indulging in the “personalities”) and covering the news with interesting and varied content. It’s also great that they use the chapter markings and enhanced functionality available in an AAC audio file (it’s really helpful for an audio feature about photography to be able to show some images) as well as mixing video content into the feed to demonstrate some of the concepts.

The TWIP podcast also has a great blog – but, whilst there are other excellent resources on the ‘net (like Shutterbug and DP Review), it’s the podcast format that works for me – an hour of audio whilst I’m in the car or out walking, interspersed with the odd short video. In the past I’ve tried other podcasts – like The Digital Story (audio) and PixelPerfect (video) – but I’m surprised to find that the mixture of audio and video in the same feed has really worked for me.

In the last few weeks I’ve learnt a whole load of new stuff – like creating high dynamic range images (remembering to shoot on a tripod to keep the camera steady, in aperture priority so that the depth of field stays constant between exposures); that not all megapixels are equal; that the rules of composition are different for panoramic images; how to stitch photos together in Adobe Photoshop (and that it may be necessary to adjust a stitched image, as the exposure may vary slightly between the edge and the centre); how to create a Photoshop Action to emulate the saturation of Fuji Velvia film; and that Lexar cards are optimised for Nikon cameras (which is lucky, as that’s what I use, although I’ve not been able to find any evidence to back up that claim).

Definitely recommended.

Not all megapixels are equal


I suppose it was inevitable, but photographic film products are slowly slipping away. I’d still like a Hasselblad XPan II, but they are rare (and expensive) – and even if I did get one, I’m not sure that I’d use it (when I bought a DSLR, I kept my film body but it’s been sitting in the kit bag ever since). Over the last few years, we’ve seen film companies struggle whilst the likes of Nikon and Canon announce record profits. I hope that film doesn’t die completely (there is something magical about developing photographic film with chemicals and producing prints) but film products will inevitably become expensive and niche.

Digital photography has many advantages but one of my main frustrations has been the lack of dynamic range that the current crop of cameras can capture. Whilst negative film has around 12 stops of latitude and slide film around 4.5 stops, digital cameras can be even more restrictive at times – in order to capture shadow detail, it’s often necessary to accept burnt-out highlights – although, just as when shooting slides, using neutral density filters can help (a lot).

Then there is the issue of noise. With film, we could push a couple of stops beyond the film’s intended speed and correct it at the processing stage – there was a corresponding increase in grain, and some colour shift too, but it helped to grab images in low-light situations or when there was fast-moving action. Try that on most digital cameras and you’ll see a lot of noise (the digital equivalent of grain, but far less attractive) although this is starting to change: the latest digital cameras are reaching new levels, with perfectly usable photos at high ISO settings and some reports of being able to shoot handheld at twilight and still capture a good image.

Meanwhile, the digital camera manufacturers have induced a state of megapixel madness. Consumers now know to look at the number of megapixels that a camera has, but it seems that not all megapixels are equal – the images from my 7Mpel compact camera are fine for snapshots, but nowhere near as good as the ones that my 6Mpel DSLR produces. It all comes down to the technology in use: the type of sensor (CCD or CMOS), the technical differences between picture elements (pixels) and photosites, the firmware and software supporting the imaging chip and even the size of the pixels. And after all of this, the quality of the lens through which the light must travel to reach the sensor is still a major factor (ditto for filters).
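A back-of-envelope calculation shows just how much difference pixel size makes (the sensor dimensions here are approximations – a typical 1/2.5″ compact sensor versus the APS-C-sized sensor in a 6Mpel DSLR like the D70):

```python
from math import sqrt

def pixel_pitch_um(width_mm: float, height_mm: float, megapixels: float) -> float:
    """Approximate pixel pitch in micrometres, assuming square pixels."""
    area_um2 = (width_mm * 1000) * (height_mm * 1000)
    return sqrt(area_um2 / (megapixels * 1e6))

print(round(pixel_pitch_um(5.76, 4.29, 7.0), 1))   # ~1.9 µm on the compact
print(round(pixel_pitch_um(23.7, 15.6, 6.1), 1))   # ~7.8 µm on the DSLR
```

On those figures, each DSLR photosite gathers roughly seventeen times the light of its compact-camera counterpart, which goes a long way towards explaining the difference in noise.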

A new DSLR is not an option for me (I have a four-year-old Nikon D70 that will last me a while longer – at least until Nikon release an FX-format prosumer SLR) so, for the time being at least, I’ll continue to use a tripod and long exposures in low light – and hopefully this summer I’ll have a go at creating high dynamic range (HDR) images (one image combined from multiple exposures) to increase the dynamic range.

A few iPhone bits and bobs


Yesterday, I found some notes I made when I was preparing my one month with the iPhone post last year – including a bunch of iPhone tips (parts 1, 2, 3 and 4).

If you’re based in the UK and you’re looking for free Wi-Fi courtesy of Apple’s agreement with The Cloud, they have a hotspot location tool on their website (I’m not sure if you can change the browser’s user-agent string and enter a phone number associated with an iPhone to gain access from any device, as AT&T users could at Starbucks outlets in the States until that service was removed).

Finally, I stumbled across what has to qualify as the best set of unboxing photos I’ve ever seen. Lego men unpacking consumer electronics is certainly geeky but somehow it’s very cool at the same time.

Microsoft Offline Virtual Machine Servicing Tool


In my recent article about the realities of managing a virtualised infrastructure, I mentioned the need to patch offline virtual machine images. Whilst many offline images will be templates, they may still require operating system, security or application updates to ensure that they are not vulnerable when started (or when a cloned VM is created from a template).

Now Microsoft has a beta for a tool that will allow this – imaginatively named the Offline Virtual Machine Servicing Tool. Built on the Windows Workflow Foundation and PowerShell, it works with System Center Virtual Machine Manager and either System Center Configuration Manager or Windows Server Update Services to automate the process of applying operating system updates through the definition of servicing jobs. Each job will:

  1. “Wake” the VM (deploy and start it).
  2. Trigger the appropriate update cycle.
  3. Shut down the VM and return it to the library.
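In outline, each servicing job amounts to something like this – a minimal sketch in which the functions are hypothetical stand-ins for the operations the tool drives through SCVMM and WSUS/Configuration Manager:

```python
def deploy_and_start(vm_id: str) -> None:
    print(f"Deploying {vm_id} from the library and starting it")

def trigger_update_cycle(vm_id: str) -> None:
    print(f"Triggering a WSUS/Configuration Manager update cycle on {vm_id}")

def shutdown_and_return(vm_id: str) -> None:
    print(f"Shutting down {vm_id} and returning it to the library")

def servicing_job(vm_ids):
    for vm_id in vm_ids:
        deploy_and_start(vm_id)      # 1. "wake" the VM
        trigger_update_cycle(vm_id)  # 2. apply the updates
        shutdown_and_return(vm_id)   # 3. back to the library

servicing_job(["template-web01", "template-sql01"])
```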

Although I haven’t tried this yet, it does strike me that there is one potential pitfall to be aware of – sysprepped images used as VM deployment templates will start into the Windows mini-setup wizard. I guess the workaround in such a scenario is to use tools from the Windows Automated Installation Kit (WAIK) to inject updates into the associated .WIM file and deploy VMs from an image, rather than by cloning sysprepped VMs.

Further details of the Offline Virtual Machine Servicing Tool beta may be found on the Microsoft Connect site.

Heterogeneous datacentre management from Microsoft System Center


Back in 2005, I quoted a Microsoft executive’s view of Microsoft’s support for heterogeneous environments through its management products:

“[it’s] not part of our DNA and I don’t think this is something that we should be doing.”

Well, maybe things are changing in the post-Gates Microsoft. I knew that System Center Virtual Machine Manager 2008 (named at last week’s Microsoft Management Summit) included support for managing VMware ESX Server, and that a future version should also be able to manage XenSource hosts, but what I had missed in the MMS press release was the System Center Operations Manager (SCOM) 2007 Cross Platform Extensions. These allow SCOM to manage HP-UX, Red Hat Enterprise Linux (RHEL), Sun Solaris and SUSE Linux Enterprise Server through management packs, with Novell, Quest and Xandros adding support for common applications like Apache, MySQL and (a real surprise) Oracle. Then, for those with existing investments in major enterprise management suites, there are SCOM connectors to allow interoperability between System Center and third-party products like HP OpenView and IBM Tivoli.

I really think this is a brave step for Microsoft – but also the right thing to do. There are very few Microsoft-only datacentres and, whilst I am no enterprise management expert, it seems to me that corporates don’t want one solution for each platform – and the big enterprise management suites are costly to implement. With System Center, people know what they are getting: a reasonably priced suite of products, with a familiar interface and a good level of functionality – maybe not everything that’s in Tivoli, Unicenter or OpenView, but enough to do the job. If the same solution that manages the Wintel systems can also manage the enterprise apps on Solaris (or another common Unix platform), then everyone’s a winner.

Xobni


Even though Inbox Zero has helped me gain some control over my e-mail, I still need all the help I can get. Last week, Simon Coles sent me an invitation for Xobni – a plugin for Microsoft Outlook that offers fast search, conversation threading, a social networking platform, and many other features designed to make email better – or as Xobni (inbox spelt backwards) like to put it:

“Xobni is the Outlook plug-in that helps you organize your flooded inbox.”

It’s already becoming very useful. Earlier today I couldn’t find a document that I was sure I’d been sent (and the Outlook 2007 search functionality didn’t seem to find it either). I used Xobni to highlight another e-mail from the same correspondent and there was the missing document – one of the listed files that we had exchanged – from where I could open the original e-mail, or the attachment that I was after.

Xobni will pull contact information out of e-mail messages (even if I don’t have an address book entry for a particular contact) and tells me who else my contacts correspond with. There’s also an analytics feature that lets me track the volume of e-mail I receive (and how long it takes me to process it), ranking my correspondents and telling me what time of day they tend to send me e-mail. It can also read my calendar and automatically highlight the times that I am available over the next few days, placing the details in a message, all ready to send. There’s VoIP integration too – although clicking on the Skype logo launched Office Communicator on my system (I don’t have Skype installed but I do have OCS). Finally, Xobni has its own built-in search capabilities, which I’ve used a few times this evening to track down long-lost e-mails based on the snippets of information that I could recall from the recesses of my mind.

In fact, the only niggle I found was in my work e-mail, where Xobni struggles to differentiate between first and last names. Our display names are formatted with the last name in front (e.g. “Wilson Mark”) and, even though the e-mail address is something like mark.wilson@country.companyname.com, Xobni thinks my name is “Wilson” – yet it has no such problem for contacts with conventional display names (like “Mark Wilson”) or with punctuation in the display name (such as “Wilson, Mark”).
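As a rough illustration of that last point (this is my own guess at a workaround, not how Xobni actually works), the SMTP address often holds the clue that a “Wilson Mark” style display name lacks:

```python
def guess_first_name(display_name: str, email: str) -> str:
    """Guess a first name from a display name, using the local part of the
    e-mail address (e.g. mark.wilson) to disambiguate the word order."""
    parts = display_name.replace(",", " ").split()
    local_part = email.split("@")[0].lower()
    if "." in local_part:
        first_token = local_part.split(".")[0]
        for part in parts:
            if part.lower() == first_token:
                return part
    return parts[0]  # fall back to assuming the first word is the first name

print(guess_first_name("Wilson Mark", "mark.wilson@country.companyname.com"))   # Mark
print(guess_first_name("Mark Wilson", "mark.wilson@country.companyname.com"))   # Mark
print(guess_first_name("Wilson, Mark", "mark.wilson@country.companyname.com"))  # Mark
```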

Xobni’s invitation-only period is over (although they are still bandying around the beta tag in true web 2.0 style) and the product is available for all to download. I’ve only been using it for a few days but I’m very impressed with the information that it gives me – even so, I’ll leave the full product review to those who know it best.

What I can say is that I reckon Xobni is pretty cool. It seems I’m not alone as Bill Gates demoed the product in his keynote at the 2008 Office Developers Conference and Xobni was selected for Microsoft’s Startup Accelerator Program but the founders are reported to have walked away from an outright takeover. If you use Microsoft Outlook for your mail, then Xobni is worth checking out and could save you a lot of time.

Microsoft 2.0


As someone who works closely with Microsoft, I’m very interested to see what happens to the company as Bill Gates steps aside. Changes are already afoot – anyone who has attended a recent marketing event (like the 2008 launch wave) will have heard about the idea of software plus services – and some of Microsoft’s partners need to start thinking about their own business models as hosted services become more and more attractive to corporates.

Two years ago, I wrote that I didn’t think the “webtop” would replace the desktop. I still think that is true – enterprises are not yet ready to store their data in “the cloud” – but things are starting to change and there seems little doubt that web services are the direction that we’re all heading in. Windows and Office will be here for a while yet, but Microsoft desperately needs to get a piece of the action if it is to stay relevant – hence their failed attempt to buy Yahoo!. Meanwhile, rather than follow the Google model of storing everything in cyberspace, Microsoft Live Mesh looks at how to make data accessible by connecting people, processes and technology – wherever they are.

Last week’s Windows Weekly podcast (episode 57) featured an interview with Mary Jo Foley, author of Microsoft 2.0: How Microsoft Plans to Stay Relevant in the Post-Gates Era. I’ve not read the book yet (it’s on my Amazon wishlist) but it may be interesting to track the accompanying blog – Microsoft 2.0.

Personally, I’m glad that Microsoft didn’t launch a hostile bid for Yahoo! and instead withdrew their offer. It seems pretty clear that the Yahoo! Inc. management team would rather have hit the self-destruct button than become part of Microsoft Corporation and, to me, that implies a degree of immaturity. Meanwhile, Microsoft can keep their cash and move forward with their software plus services model. When one of the world’s largest companies has to borrow money for a takeover, that’s not a good sign – and that’s an awful lot of money that they could do something useful with.