Photosynth

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A few months back, I heard something about Photosynth – a new method of modelling scenes that uses photographic images to build up a 3D representation – and yesterday I got the chance to have a look at it myself. At first, I just didn’t get it, but after seeing a few synths, I now think that this is really cool technology with a lot of potential for real-world applications.

It’s difficult to describe Photosynth, but it’s essentially a collage of two-dimensional images used together to create a larger canvas through which it’s possible to navigate in three dimensions (four, actually). It began life as a research project on “photo tourism” at the University of Washington, after which Microsoft Research and Windows Live Labs took it on to produce Photosynth, using technology gained with Microsoft’s acquisition of Seadragon Software. The first live version of Photosynth was launched yesterday.

Clearly this is not a straightforward application – it’s taken two years of development with an average team size of 10 people just to bring the original research project to the stage it’s at today – so I’ll quote the Photosynth website for a description of how it works:

“Photosynth is a potent mixture of two independent breakthroughs: the ability to reconstruct the scene or object from a bunch of flat photographs, and the technology to bring that experience to virtually anyone over the Internet.

Using techniques from the field of computer vision, Photosynth examines images for similarities to each other and uses that information to estimate the shape of the subject and the vantage point the photos were taken from. With this information, we recreate the space and use it as a canvas to display and navigate through the photos.

Providing that experience requires viewing a LOT of data though—much more than you generally get at any one time by surfing someone’s photo album on the web. That’s where our Seadragon™ technology comes in: delivering just the pixels you need, exactly when you need them. It allows you to browse through dozens of 5, 10, or 100(!) mega-pixel photos effortlessly, without fiddling with a bunch of thumbnails and waiting around for everything to load.”
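
For the curious, the “examines images for similarities” step is essentially feature matching followed by structure-from-motion. Here’s a rough sketch of the matching stage in Python using OpenCV – purely my own illustration, nothing to do with Photosynth’s actual implementation, and the file names are just placeholders:

```python
# A rough sketch of pairwise feature matching (my own illustration, not
# Photosynth code). Requires the opencv-python package; the file names are
# placeholders for two overlapping photos.
import cv2

def match_features(path_a, path_b, ratio=0.75):
    """Return point correspondences between two overlapping photos."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=5000)             # detect distinctive corners/blobs
    kp_a, desc_a = orb.detectAndCompute(img_a, None)
    kp_b, desc_b = orb.detectAndCompute(img_b, None)

    # Match descriptors and keep only unambiguous matches (Lowe's ratio test)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = []
    for pair in matcher.knnMatch(desc_a, desc_b, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])

    # Each matched pair is a constraint that a structure-from-motion solver can
    # use to estimate camera positions and a sparse 3D point cloud.
    return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in good]

if __name__ == "__main__":
    pairs = match_features("desk_01.jpg", "desk_02.jpg")
    print(f"{len(pairs)} candidate correspondences found")
```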

I decided to try Photosynth out for myself and the first thing I found was that I needed to install some software. On my Windows computer it installed a local application to create synths and an ActiveX control to view them. Creating a synth from 18 sample images of my home office desk took just a few minutes (each of the images I supplied was a 6.1 mega-pixel JPG taken on my Nikon D70) and I was also able to provide copyright/Creative Commons licensing information for the images in the synth:

Once it had been uploaded to the Photosynth site, I could add a description, view other people’s comments, get the links to e-mail/embed the synth, and provide location information. I have to say that I am truly amazed at how well it worked. Navigate to the webcam above my laptop and see how you can move around it to reveal the magnet on the board behind!

It’s worth pointing out that I have not read the Photosynth Photography Guide yet – this was just a set of test photos looking at different things on and around the desk. If you view the image in grid view you can see that there are three images it didn’t know what to do with – I suspect that if I had supplied more images around those areas then they could have worked just fine.

You may also notice a lack of the usual office artefacts (family photos, etc.) – they were removed before I created the synth, for privacy reasons, at the request of one of my family members.

My desk might not be the best example of this technology, so here’s another synth that is pretty cool:

In this synth, called Climbing Aegialis (by J.P.Peter) you can see a climber making his way up the rock face – not just in three dimensions – but in four. Using the . and , keys it’s possible to navigate through the images according to the order in which they were taken.

Potting Shed is another good example – taken by Rick Szeliski, a member of the team that put this product together:

Hover over the image to see a doughnut-shaped ring called a halo and click this to navigate around the image in 3D. If you use the normal navigation controls (including in/out with the mouse scrollwheel) it is possible to go through the door and enter the potting shed for a look inside!

There are also some tiny pixel-sized pin-pricks visible as you navigate around the image. These are the points that were identified whilst the 3D matching algorithm was running. They can be toggled on and off with the p key and, in this example, they are so dense in places that the image can actually be made out from the point cloud alone.

Now that the first release of Photosynth is up and running, the development team will transition from Windows Live Labs into Microsoft’s MSN business unit where they will work on using the technology for real and integrating it with other services – like Virtual Earth, where synths could be displayed to illustrate a particular point on a map. Aside from photo tourism, other potential applications for the technology include real estate, art and science – anywhere where visualising an item in three or four dimensions could be of use.

The current version of Photosynth is available without charge to anyone with a Windows Live ID and the service includes 20GB of space for images. The synths themselves can take up quite a bit of space and, at least in this first version of the software, all synths are uploaded (a broadband Internet connection will be required). It’s also worth noting that all synths are public, so photos will be visible to everyone on the Internet.

If you couldn’t see the synths I embedded in this post, then you need to install an ActiveX control (Internet Explorer) or plugin (Firefox). Direct3D support is also required so Photosynth is only available for Windows (XP or later) at the moment but I’m told that a Mac version is on the way – even Microsoft appreciates that many of the people who will be interested in this technology use a Mac. On the hardware side an integrated graphics card is fine but the number of images in a synth will be limited by the amount of available RAM.

Finally, I wanted to write this post yesterday but, following the launch, the Photosynth website went into meltdown – or as Microsoft described it “The Photosynth site is a little overwhelmed just now” – clearly there is a lot of interest in this technology. For more news on the development of Photosynth, check out the Photosynth blog.

At last… my wait for a white 3G iPhone upgrade is nearly over


After some initial scepticism, I bought an iPhone on the day it launched in the UK. It was great, but Wi-Fi coverage is limited (so is 3G for that matter) and I wanted faster browsing (GPS will also be nice), so I decided to upgrade. In order to fund this, I unlocked my first iPhone and sold it on eBay, two days before the new one was launched. As I paid £269 for the phone and sold it for just over £200 after PayPal and eBay charges (not bad for an 8-month-old handset), I was pretty pleased – that’s more than the upgrade will cost (£159), so you could say I made a small profit (except I also threw some almost-new accessories into the deal, so I guess I’m about even). However, for the last couple of months I’ve been paying an iPhone tariff and using an old Nokia handset with only limited data capabilities…

The problem is that I would like to have the white model – and, if you live in the UK, that’s only been available from Apple. Of the three authorised retailers (Apple, O2 and Carphone Warehouse), only O2 and Carphone Warehouse can handle upgrades – so no white iPhones for loyal customers then… only for new business customers via Apple. Until today!!!

This morning I made my twice-weekly (Monday and Friday) call to my local O2 store to see if they have white iPhones in stock or any idea when they might (“no”, in both cases) and then checked the web for news. Apple customer forum moderators seem to be deleting posts about this issue (as with most things that are not pro-Apple) so Neil Holmes’ White iPhone Blog is a good place to start (there’s a good thread at MacRumors on the subject too) and I found that, from today, Carphone Warehouse (CPW) have white iPhones available for pre-order. Yay!!!

Apple iPhone 3G available in white at Carphone Warehouse

(still no news from O2 though…)

Unfortunately the CPW web site doesn’t mention anything about upgrades, so I called CPW’s 0870 (national rate rip-off) sales number and spoke to someone who said I couldn’t upgrade until they had stock (expected later today). After reading Neil’s experiences, I called back and spoke to another CPW representative, who took my details, contacted O2 for authorisation and called me back to take payment. Frankly I was amazed when he called back – CPW have a very bad reputation for customer service (which is why I checked that my billing will still be through O2) but I have an order reference number and my shipment will be confirmed later today or over the weekend, for delivery on Tuesday.

To CPW’s credit, the guy who dealt with my order (Soi) was helpful, called me back as promised, asked if I wanted accessories (but wasn’t pushy when I declined) and didn’t try to force me to buy insurance (it was offered but there was no pressure when I said no) – so that means he was actually better than the O2 representative that sold me my last iPhone in an O2 store last November.

Once I have that shipment notification I’ll be a very happy boy. Until then I wait with more than just a little trepidation.

[Update: 22 August 2008 @18:17: Just received an e-mail to say that the order has been processed and will be despatched shortly… could it be coincidence that I phoned CPW within the last half hour to see what was happening with shipment?]

[Update: 26 August 2008 @09:42: It’s here! It even comes in a white box!]

Microsoft infrastructure architecture considerations: part 3 (controlling network access)


Continuing the series of posts on the architectural considerations for designing a predominantly-Microsoft IT infrastructure, based on the MCS Talks: Enterprise Infrastructure series, in this post, I’ll look at some of the considerations for controlling access to the network.

Although network access control (NAC) has been around for a few years now, Microsoft’s network access protection (NAP) is new in Windows Server 2008 (previous quarantine controls were limited to VPN connections).

It’s important to understand that NAC/NAP are not security solutions but are concerned with network health – assessing an endpoint and comparing its state with a defined policy, then removing access for non-compliant devices until they have been remediated (i.e. until the policy has been enforced).
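
To make that assess-and-remediate idea concrete, here’s a tiny conceptual sketch in Python – it bears no relation to the real NAP APIs or policy format, and the health checks are invented examples:

```python
# A conceptual sketch of the assess/compare/remediate loop described above.
# It bears no relation to the real NAP APIs; the checks are invented examples.
from dataclasses import dataclass

@dataclass
class EndpointState:
    firewall_on: bool
    av_signature_age_days: int
    patches_current: bool

HEALTH_POLICY = {
    "firewall enabled": lambda s: s.firewall_on,
    "antivirus signatures current": lambda s: s.av_signature_age_days <= 7,
    "security patches applied": lambda s: s.patches_current,
}

def assess(state):
    """Return the names of the policy checks this endpoint fails."""
    return [name for name, check in HEALTH_POLICY.items() if not check(state)]

def admit_or_remediate(state):
    failures = assess(state)
    if not failures:
        return "compliant: grant full network access"
    # Non-compliant devices are restricted until remediation brings them back
    # into line with the policy, after which they are re-assessed.
    return "non-compliant: restrict to remediation network (" + ", ".join(failures) + ")"

if __name__ == "__main__":
    print(admit_or_remediate(EndpointState(firewall_on=True,
                                           av_signature_age_days=30,
                                           patches_current=True)))
```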

The real question as to whether to implement NAC/NAP is whether or not non-compliance represents a business problem.

Assuming that NAP is to be implemented, then there may be different policies required for different groups of users – for example internal staff, contractors and visitors – and each of these might require a different level of enforcement; however, if the policy is to be applied, the enforcement options are:

  • DHCP – easy to implement but also easy to avoid by using a static IP address. It’s also necessary to consider the healthcheck frequency as it relates to the DHCP lease renewal time.
  • VPN – more secure but relies on the Windows Server 2008 RRAS VPN, so it may require a third-party VPN solution to be replaced. In any case, full-VPN access is counter to industry trends as alternative solutions are increasingly used.
  • 802.1x – requires a complex design to support all types of network user and not all switches support dynamic VLANs.
  • IPSec – the recommended solution – built into Windows, works with any switch, router or access point, and provides strong authentication and (optionally) encryption. In addition, unhealthy clients are truly isolated (i.e. not just placed in a VLAN with other clients, where they could potentially affect or be affected by other machines). The downside is that NAP enforcement with IPSec requires computers to be domain-joined (so it will not help with visitors’ or contractors’ PCs) and is fairly complex from an operational perspective, requiring implementation of the health registration authority (HRA) role and a PKI solution.

In the next post in this series, I’ll take a look at some of the architectural considerations for using virtualisation technologies within the infrastructure.

Microsoft improves support for virtualisation – unless you’re using a VMware product


Software licensing always seems to be one step behind the technology. In the past, I’ve heard Microsoft comment that to virtualise one of their desktop operating systems (e.g. using VMware Virtual Desktop Infrastructure) was in breach of the associated licensing agreements – then they introduced a number of licensing changes – including the Vista Enterprise Centralised Desktop (VECD) – to provide a way forward (at least for those customers with an Enterprise Agreement). Similarly, I’ve heard Microsoft employees state that using Thinstall (now owned by VMware and rebranded as ThinApp) to run multiple copies of Internet Explorer is in breach of the EULA (the cynic in me says that I’m sure they would have fewer concerns if the technology involved was App-V). A few years back, even offline virtual machine images needed to be licensed – then Microsoft updated their Windows Server licensing to include virtualisation rights but it was never so clear-cut for applications with complex rules around the reassignment of licenses (e.g. in a disaster recovery failover scenario). Yesterday, Microsoft took another step to bring licensing in line with customer requirements when they waived the previous 90-day reassignment rule for a number of server applications, allowing customers to reassign licenses from one server to another within a server farm as frequently as required (it’s difficult to run a dynamic data centre if the licenses are not portable!).

It’s important to note that Microsoft’s licensing policies are totally agnostic of the virtualisation product in use – but support is an entirely different matter.

Microsoft also updated their support policy for Microsoft software running on a non-Microsoft virtualisation platform (see Microsoft knowledge base article 897615) with an increased number of Microsoft applications supported on Windows Server 2008 Hyper-V, Microsoft Hyper-V Server (not yet a released product) or any third-party validated virtualisation platform – based on the Windows Server Virtualization Validation Program (SVVP). Other vendors taking part in the SVVP include Cisco, Citrix, Novell, Sun Microsystems and Virtual Iron… but there’s a rather large virtualisation vendor who seems to be missing from the party…

[Update: VMware joined the party… they were just a little late]

Outlook cached mode is not available on a server with Terminal Services enabled


I was putting together a demo environment earlier today and needed to publish a Terminal Services RemoteApp, so I installed Terminal Services (and IIS) on my Windows Server 2008 notebook. Later on, I noticed that Outlook was not working in cached mode and I found that offline store (.OST) files and features that rely on them are disabled when running Outlook on a computer with Terminal Services enabled.

I can see why cached mode on a terminal server would be a little odd (it’s fair enough to cache data on a remote client, but it’s also reasonable to expect that the terminal server would be in the data centre – i.e. close to the Exchange Server) – even so, why disable it completely? Surely administrators could be given the choice to enable it if circumstances dictate that it’s an appropriate course of action?

Oh well… I’ve since removed the Terminal Services role and Outlook is working well again.

Microsoft infrastructure architecture considerations: part 2 (remote offices)


Continuing from my earlier post which sets the scene for a series of posts on the architectural considerations for designing a predominantly-Microsoft IT infrastructure, in this post, I’ll look at some of the considerations for remote offices.

Geographically dispersed organisations face a number of challenges in order to support remote offices including: WAN performance/reliability; provisioning new services/applications/servers; management; remote user support; user experience; data security; space; and cost.

One approach that can help with some (not all) of these concerns is placing a domain controller (DC) in each remote location; but this has been problematic until recently because it increases the overall number of servers (it’s not advisable to co-locate other services on a domain controller because administration can’t be delegated to a local administrator on a domain controller and the number of Domain Admins should be kept to a minimum) and it’s a security risk (physical access to the domain controller computer makes a potential hacker’s job so much simpler). For that reason, Microsoft introduced read only domain controllers (RODCs) in Windows Server 2008.

There are still some considerations as to whether this is the appropriate solution though. Benefits include:

  • Administrative role separation.
  • Faster logon times (improved access to data).
  • Isolated corruption area.
  • Improved security.

whilst other considerations and potential impacts include:

  • The need for a schema update.
  • Careful RODC placement.
  • Impact on directory-enabled applications.
  • Possibility of site topology design changes.

Regardless of whether a remote office DC (either using the RODC capabilities or as a full DC) is deployed, server sprawl (through the introduction of branch office servers for a variety of purposes) can be combatted with the concept of a branch “appliance” – not in the true sense of a piece of dedicated hardware running an operating system and application that is heavily customised to meet the needs of a specific service – but by applying appliance principles to server design and running multiple workloads in a manner that allows for self-management and healing.

The first step is to virtualise the workloads. Hyper-V is built into Windows Server 2008 and the licensing model supports virtualisation at no additional cost. Using the server core installation option, the appliance (physical host) management burden is reduced with a smaller attack surface and reduced patching. Multiple workloads may be consolidated onto a single physical host (increasing utilisation and removing end-of-life hardware) but there are some downsides too:

  • There’s an additional server to manage (the parent/host partition) and child/guest partitions will still require management but tools like System Center Virtual Machine Manager (SCVMM) can assist (particularly when combined with other System Center products).
  • A good business continuity plan is required – the branch office “appliance” becomes a single point of failure and it’s important to minimise the impact of this.
  • IT staff skills need to be updated to manage server core and virtualisation technologies.

So, what about the workloads on the branch office “appliance”? First up is the domain controller role (RODC or full DC) and this can be run as a virtual machine or as an additional role on the host. Which is “best” is entirely down to preference – running the DC alongside Hyper-V on the physical hardware means there is one less virtual machine to manage and operate (multiplied by the number of remote sites) but running it in a VM allows the DC to be “sandboxed”. One important consideration is licensing – if Windows Server 2008 standard edition is in use (which includes one virtual operating system environment, rather than enterprise edition’s four, or datacenter edition’s unlimited virtualisation rights) then running the DC on the host saves a license – and there is still some administrative role separation as the DC and virtualisation host will probably be managed centrally, with a local administrator taking some responsibility for the other workloads (such as file services).

That leads on to a common workload – file services. A local file server offers a good user experience but is often difficult to back up and manage. One solution is to implement DFS-R in a hub-and-spoke arrangement and to keep the backup responsibility in the data centre. If the remote file server fails, then replication can be used to restore from a central server. Of course, DFS-R is not always ideal for replicating large volumes of data; however, the DFS arrangement allows users to view local and remote data as though it were physically stored in a single location, and there have been a number of improvements in Windows Server 2008 DFS-R (cf. Windows Server 2003 R2). In addition, SMB 2.0 is less “chatty” than previous implementations, allowing for performance benefits when using a Windows Vista client with a Windows Server 2008 server.

Using these methods, it should be possible to avoid remote file server backups and remote DCs should not need to be backed up either (Active Directory is a multi-master replicated database so it has an inherent disaster recovery capability). All that’s required is some method of rebuilding a failed physical server – and the options there will depend on the available bandwidth. My personal preference is to use BITS to ensure that the remote server always holds a copy of the latest build image on a separate disk drive and then to use this to rebuild a failed server with the minimum of administrator intervention or WAN traffic.
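
To illustrate the idea (though in practice this would be a BITS job rather than a script), here’s a rough Python sketch of a background, resumable pull of the latest build image – the URL and paths are made-up placeholders:

```python
# A rough illustration of the idea: keep pulling the latest build image in the
# background and resume where we left off if the WAN link drops. In practice
# this would be a BITS job; the URL, path and chunk size are made-up examples.
import os
import requests

IMAGE_URL = "http://deploy.hq.example.com/images/branch-server.wim"   # placeholder
LOCAL_PATH = r"D:\images\branch-server.wim"                           # placeholder
CHUNK = 1024 * 1024   # transfer in 1MB chunks to keep the WAN impact low

def sync_build_image():
    done = os.path.getsize(LOCAL_PATH) if os.path.exists(LOCAL_PATH) else 0
    headers = {"Range": f"bytes={done}-"}          # ask only for the missing bytes
    with requests.get(IMAGE_URL, headers=headers, stream=True, timeout=60) as r:
        if r.status_code == 416:                   # we already have the whole file
            return
        r.raise_for_status()
        mode = "ab" if r.status_code == 206 else "wb"   # append if the server resumed
        with open(LOCAL_PATH, mode) as f:
            for chunk in r.iter_content(CHUNK):
                f.write(chunk)

if __name__ == "__main__":
    sync_build_image()
```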

In the next post in this series, I’ll take a look at some of the considerations for using network access protection to manage devices that are not compliant with the organisation’s security policies.

Microsoft infrastructure architecture considerations: part 1 (introduction)


Last week, I highlighted the MCS Talks: Enterprise Architecture series of webcasts that Microsoft is running to share the field experience of Microsoft Consulting Services (MCS) in designing and architecting Microsoft-based infrastructure solutions – and yesterday’s post picked up on a key message about software as a service/software plus services from the infrastructure futures section of session 1: infrastructure architecture.

Over the coming days and weeks, I’ll highlight some of the key messages from the rest of the first session, looking at some of the architectural considerations around:

  • Remote offices.
  • Controlling network access.
  • Virtualisation.
  • Security.
  • High availability.
  • Data centre consolidation.

Whilst much of the information will be from the MCS Talks, I’ll also include some additional information where relevant, but, before diving into the details, it’s worth noting that products rarely solve problems. Sure enough, buying a software tool may fix one problem, but it generally adds to the complexity of the infrastructure and in that way does not get to the root issue. Infrastructure optimisation (even a self-assessment) can help to move IT conversations to a business level, as well as allowing the individual tasks that are required to meet the overall objectives to be prioritised.

Even though the overall strategy needs to be based on business considerations, there are still architectural considerations to take into account when designing the technical solution and, even though this series of blog posts refers to Microsoft products, there is no reason (architecturally) why alternatives should not be considered.

Core Configurator – download it whilst you can…


A few months ago, I wrote a post on customising Windows Server 2008 Server Core and Michael Armstrong tipped me off about a cool utility, written by former MVP Guy Teverovsky, called Core Configurator. I say former MVP, because Guy has given up that award to join Microsoft in Israel – and I’m not surprised, after his employer claimed it was their intellectual property (even though he developed it in his spare time) and asked him to remove it from the web.

Anyway, Core Configurator is intended to provide a GUI (strange as it may seem on server core) to aid in the initial setup tasks for a server core machine including:

  • Product activation.
  • Display configuration.
  • Date and time configuration.
  • Remote Desktop configuration.
  • Local user account management.
  • Firewall configuration.
  • WinRM configuration.
  • Networking.
  • Computer name and domain/workgroup membership.
  • Installation of server core features/roles.
  • Shutdown.
  • Reboot.

Because the tool has been removed from the web, it’s now pretty hard to get hold of, so download it while you can (there is another download location but this version has a slightly different filename and I cannot vouch for the file contents – i.e. I have not tested it). Once it’s gone, it’s gone – so don’t ask me where to get it if these links stop working.

Reviewing documents? Forget about review sheets and use the features in Word instead!


A few weeks back, I was taking part in a document review process where the prescribed format of the review involved recording all the document comments on a separate sheet and then sending them back for consideration. Describing where the change/comment applies (e.g. section 1.1, paragraph 4, it states “blah blah blah” but really it should be “something entirely different”; section 2, last paragraph, extraneous apostrophe in PC’s; etc.) is a very labour-intensive process for all the reviewers involved – it’s far easier to work through a Word document and add comments/tracked changes as required.

Today, I was on the receiving end of some comments on one of my designs and I had the opposite problem – several documents with comments embedded to wade through (and one on a review sheet for good measure… grrr).

The obvious issue with receiving several documents with embedded comments/changes is how to merge all of the separate review comments into one place – and it turns out that’s easily done using Word 2007’s built-in tools for combining and comparing documents (Word 2003 has similar functionality on the Tools menu – Compare and Merge Documents…).

Compare and combine tools in Microsoft Word 2007

Once I had all the review comments merged into a single document (which only took a few seconds), I could track changes, make my edits (the review pane is useful here to jump between comments) and send it back for final sign-off. A few minutes later I had confirmation that the changes were approved, following which I accepted the changes in the document, removed hidden metadata (using the document inspector) and published the document.
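
If you find yourself doing this regularly, the merge can even be scripted. Here’s a rough sketch using COM automation from Python (pywin32), assuming the Word object model’s Document.Merge method behaves as I expect – the file names are placeholders and it’s worth testing against your own version of Word first:

```python
# A sketch of automating the same merge with COM automation (pywin32). I'm
# assuming the Word object model's Document.Merge method here -- test against
# your own version of Word before relying on it. File paths are placeholders.
import win32com.client

MASTER = r"C:\reviews\design-v1.0.docx"
REVIEWS = [
    r"C:\reviews\design-v1.0-reviewerA.docx",
    r"C:\reviews\design-v1.0-reviewerB.docx",
]

word = win32com.client.Dispatch("Word.Application")
word.Visible = False
try:
    doc = word.Documents.Open(MASTER)
    for review in REVIEWS:
        doc.Merge(review)   # pull each reviewer's tracked changes/comments in
    doc.SaveAs(r"C:\reviews\design-v1.0-combined.docx")
    doc.Close()
finally:
    word.Quit()
```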

It’s all quite straightforward really – the trouble is that most of us still use our office applications in the same way that we did 15 years ago… and, dare I say it, aside from knowledge workers using word processing software on a PC instead of relying on secretarial staff, the basic process probably hasn’t changed much since the days of the typing pool…

Software as a Service – or Software plus Services?


There’s a lot of media buzz right now about cloud computing – which encompasses both “web 2.0” and Software as a Service (SaaS). Whilst it’s undeniable that web services are becoming increasingly important, I’ll stand by my comments from a couple of years ago that the “webtop” will not be in mainstream use any time soon and that those who are writing about the death of Microsoft Windows and Office are more than a little premature.

Even so, I was interested to hear Microsoft’s Kevin Sangwell explain the differences between SaaS and the Microsoft idea of software plus services (S+S) during the recent MCS Talks session on infrastructure architecture.

I’ve heard Microsoft executives talk about software plus services but Kevin’s explanation cuts through the marketing to look at what S+S really means in the context of traditional (on-premise) computing and SaaS:

Kevin made the point that there is actually a continuum between on premise and SaaS solutions:

Software delivery continuum and software services taxonomy

  • We all understand the traditional software element – where software is installed and operated in-house (or possibly using a managed service provider).
  • Building block services are about using web services to provide an API on which to build applications “in the cloud” – Amazon’s Simple Storage Service (S3) is an example. This gives developers something to hook into and on which to deliver a solution – for example, Jungle Disk uses the Amazon S3 platform to provide online storage and backup services (see the sketch after this list).
  • Attached services provide self-contained functionality – for example anti-spam filtering of e-mail as it enters (or exits) an organisation.
  • Finished services are those that operate entirely as a web service – salesforce.com is one often-quoted example and Google Apps would be another (not that Microsoft are ever likely to promote that one…).
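
To make the “building block” idea concrete, here’s a tiny illustration of using Amazon S3 as raw storage behind an application (using the boto3 Python SDK; the bucket and key names are placeholders):

```python
# A tiny illustration of a "building block" service: using Amazon S3 as raw
# storage behind an application (boto3 SDK; bucket and key names are
# placeholders and credentials are assumed to be configured in the environment).
import boto3

s3 = boto3.client("s3")
BUCKET = "example-backup-bucket"

# Store a file in the cloud...
s3.upload_file("report.docx", BUCKET, "backups/report.docx")

# ...and pull it back later, from anywhere with an Internet connection.
s3.download_file(BUCKET, "backups/report.docx", "restored-report.docx")
```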

S+S is about creating a real-world hybrid – not just traditional or cloud computing but a combination of software and services – for example an organisation may use a hosted Exchange Server service but they probably still use Microsoft Outlook (or equivalent software) on a PC.

So, would moving IT services off to the cloud make all the associated IT challenges disappear? Almost certainly not! All this would lead to is a disjointed service and lots of unhappy business users. SaaS and S+S do not usually remove IT challenges altogether but they replace them with new ones – typically around service delivery (e.g. managing service level agreements, integrating various operational teams, etc.) and service support (e.g. presenting a coherent service desk with appropriate escalation between multiple service providers and the ability to assess whether a problem relates to internal IT or the hosted service) but also in relation to security (e.g. identity lifecycle management and information rights management).

Kevin has written an article for The [MSDN] Architecture Journal on the implications of software plus services consumption for enterprise IT and, for those who are interested in learning more about S+S, it’s worth a read.