A Microsoft view on the consumerisation of IT (#ukitcamp)

This content is 12 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I never realised that my blog posts were feared. At least not until Microsoft’s Andrew Fryer (@deepfat) said he was less concerned about my event feedback on yesterday’s IT Pro Camp event than on my blog post! Well, all I can promise is to try and be objective, fair and balanced – which is what readers have come to expect around here – even if there is less Microsoft-focused content these days.

I went along to yesterday’s IT Pro Camp on Consumerisation as a result of a Twitter conversation that suggested I come and see what Microsoft is doing to embrace and support consumerisation.  To be fair, I should have known better. For the last 20 years, Microsoft has provided desktop (and back-office) systems to enterprises and the consumerisation megatrend threatens this hegemony. Sure, they also operate in the consumer space, but consumerisation is increasingly mobile and cross-platform which means that Microsoft’s dominance is weakening*.

What the UK TechNet team has done is to put together a workshop that looks at how Microsoft tools can be used to support consumerisation in the enterprise – and, at that level, it worked well (although I’m pretty sure the event synopsis changed at some point between me booking my place and it actually taking place).  Even so, I was naive to expect anything more than marketing. Indeed, I nearly went home at lunchtime as it was starting to feel like a big System Center Configuration Manager pitch and there was very little discussion of what is really meant by the consumerisation of IT.

There is little doubt in my mind that the event provided a great demo to show off a host of functionality in Microsoft’s products (and, to be fair, there is an increasing amount of cross-platform support too) but, time and time again, I was the awkward so-and-so who asked how I would implement a feature (for example Direct Access) in a cross-platform estate (e.g. for BYOD) and the answer was that it needs Windows.

So, earlier in the week I was slating Oracle for an event that basically said “buy more of our stuff” and this week… well, it’s just “stuff” from Redmond instead of (different) “stuff” from Redwood Shores, I guess.

Even so, there were some snippets within the product demos that I would like to call out – for example, Simon May (@simonster)’s assertion that:

“We need to be more permissive of what’s allowed on the network – it’s easier to give access to 80% most of time and concentrate on securing the 20%.”

In a nutshell, Simon is reinforcing the point I made earlier this month when I suggested that network access control was outdated and de-perimeterisation is the way forward (although Microsoft’s implementation of NAC – called Network Access Protection – did feature in a demonstration).  There was also a practical demonstration of how to segregate traffic (using IPsec) so that the crown jewels are safe in a world of open access and, although the Windows implementation is simpler through the use of Group Policy, this will at least work on other devices (Macs and Linux PCs at least – I’m not so sure about mobile clients).

Of course, hosted shared desktops (Remote Desktop Services) and virtual desktop infrastructure reared their ugly heads but it’s important to realise these are just tactical solutions – sticking plaster if you like – until we finally break free from a desktop-centric approach and truly embrace the App Internet, with data-centric policies for providing access.

There was no discussion of how to make the App Internet real (aside from App-V demos and SharePoint/System Center Configuration Manager application portals) but, then again, this was an IT Pro event and not for developers – so maybe a discussion on application architecture was asking a little too much…

Other topics included protection of mobile devices, digital rights management, and federation, featuring a great analogy from Simon, who described claims-based authentication as being a bit like buying a drink in a bar: asked to prove your age, you present a driving licence, which is trusted because the issuer (e.g. the DVLA in Great Britain) has gone through rigorous checks.
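Simon’s analogy maps neatly onto code: an issuer signs a set of claims, and a relying party trusts any claim carrying a valid signature from an issuer it already knows. Here’s a minimal sketch in Python – the issuer name, key and claim fields are invented for illustration, and real claims-based systems (e.g. SAML tokens from a federation service) use public-key signatures rather than a shared HMAC secret:

```python
import hashlib
import hmac
import json

# The issuer (the "DVLA" in the analogy) signs a set of claims with its key.
# The relying party (the "bar") trusts any claim bearing a valid signature
# from a known issuer - it never checks the underlying fact (your birthday).

ISSUER_KEYS = {"dvla": b"issuer-secret"}  # invented trust store for the sketch

def issue_token(issuer, claims):
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEYS[issuer], payload, hashlib.sha256).hexdigest()
    return {"issuer": issuer, "claims": claims, "sig": sig}

def verify(token):
    key = ISSUER_KEYS.get(token["issuer"])
    if key is None:
        return False  # unknown issuer: no trust relationship
    payload = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = issue_token("dvla", {"over_18": True})
print(verify(token))  # prints True
```

The point of the pattern is the same as in the bar: the relying party verifies the issuer’s signature, not the fact itself.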

Hopefully this post isn’t too critical – my feedback basically said that there is undoubtedly a lot of work that’s gone into creating the TechDays IT Pro Camps and for many people they will be valuable. Indeed, even for me (I haven’t been involved in Microsoft products, except as a user, for a couple of years now) it’s been a great refresher/update on some of the new technologies. But maybe IT architects have a different view? Or maybe it’s time for me to get more intimately involved in technology again?


* I don’t see Microsoft being pushed out any time soon – Windows still runs on a billion PCs worldwide and analysts haven’t given up hope on Windows Phone either – at least not based on an IDC event I attended recently.

Getting started with Raspberry Pi (#RasPi)

Much to my manager’s disgust (he has a programming background, whilst I’m an infrastructure guy “by trade” – although I did write code in my youth!), my Raspberry Pi arrived last week. Despite the botched launch, I still think this is one of the most exciting products we’ll see this year because, well, because it’s a fully functioning computer for around £25 (Model B) and that means the potential addressable market is enormous. Actually, that’s not quite right – the Pi is around £25 (plus VAT) and then you’ll need some peripherals – although they should be relatively easy to lay your hands on:

  • A micro-USB mobile phone charger (I use the one that came with my Nokia Lumia 800 but any 5V supply that can feed a micro-USB cable will do)
  • A USB keyboard
  • (Optionally) a mouse
  • (Optionally) some speakers
  • (Optionally) a USB hub (powered)
  • A wired network connection
  • An SD card
  • A display – but watch out as Raspberry Pi supports HDMI and composite out (RCA) – not VGA.

My monitors are mostly VGA (I have one that will take DVI) and my TV is far too old for HDMI (it’s a 14-year-old Sony Trinitron 32″ widescreen CRT!) so I set the Pi up to use the analogue connection to the TV.

Installing the operating system

I selected a Linux distro (the Raspberry Pi blog suggests that Fedora Remix is the recommended distro, as does the FAQ, although there is extensive discussion about whether to use Fedora or Debian; the Raspberry Pi quick start guide suggests that developers should use Debian and there are alternative downloads too). Eventually, I managed to install the Raspberry Pi Fedora Remix on my SD card (my Ubuntu machine recognised the SD card, but the Python version of the Fedora ARM Image Installer didn’t*; meanwhile my work laptop installed an image on the SD card but it wouldn’t boot – I suspect that’s down to the disk encryption software we use; finally I managed to run the Windows version of the Fedora ARM Image Installer on another Windows 7 PC).

Once I had an operating system installed, I booted and the RasPi picked up an IP address from my DHCP server, registered itself in DNS (raspi.domainname) and set to work expanding its disk to fill the 8GB SD card I’m using.

* Getting this installer to work involved installing the python-qt4 package from the Ubuntu Software Centre, then running ./fedora-arm-installer.

Switching displays

Unfortunately, standard-definition CRT TVs are no better at working with Raspberry Pis than they are with any other computer (except a games console) – and why I expected otherwise is a mystery…

With only part of the display visible via composite out (and not exactly easy to read) I started to investigate options for use of the HDMI port.  It turns out that HDMI to VGA is too expensive, but an HDMI to DVI cable cost just £2.39 at Amazon (thanks to Chromatix, The EponymousBob and GrumpyOldGit on the Raspberry Pi forums for sharing this info). With the RasPi hooked up to my only digital monitor, everything was much easier, although I did have to plug the cable directly into the monitor and I’m now waiting for delivery of a DVI-I female to female gender changer so that it’s a bit easier to swap the monitor cable between my computing devices.

So, what’s it like to use then?

Did I mention that the Raspberry Pi is a fully functioning computer for around £25? Well then, what’s not to like? Sure, performance is not lightning fast – the Raspberry Pi FAQs suggest:

“… real world performance is something like a 300MHz Pentium 2, only with much, much swankier graphics”

but that’s plenty for a bit of surfing, email and teaching my kids to write code.

I am finding though that I’m struggling a little with my chosen distro. For example, I haven’t yet managed to install Scratch and it doesn’t seem to be one of the recognised packages so I may have to resort to compiling from source – hardly ideal for getting kids started with coding. For that reason, I might switch to Debian (I’m downloading it as I write) but for now I’ll continue to explore the options that the Fedora Remix provides.

I’m sure there will be more RasPi posts on this blog but if you’re one of the thousands waiting for yours to arrive, hopefully this post will help to prepare…

And once the educational models are available, I’ll be encouraging my sons’ school to buy a lab full of these instead of a load more netbooks running Windows XP…

Big data according to the Oracle

After many years of working mostly with Microsoft infrastructure products, the time came for me to increase my breadth of knowledge and, with that, comes the opportunity to take a look at what some of the other big players in our industry are up to.  Last year, I was invited to attend the Oracle UK User Group Conference where I had my first experience of the world of Oracle applications; and last week I was at the Oracle Big Data and Extreme Analytics Summit in Manchester, where Fujitsu was one of the sponsors (and an extract from one of my white papers was in the conference programme).

It was a full day of presentations and I’m not sure that reproducing all of the content here makes a lot of sense, so here’s an attempt to summarise it… although even a summary could be a long post…

Big data trends, techniques and opportunities

Tim Jennings (@tjennings) from Ovum set the scene and explained some of the ways in which big data has the potential to change the way in which we work as businesses, citizens and consumers (across a variety of sectors).

Summing up his excellent overview of big data trends, techniques and opportunities, Tim’s key messages were that:

  1. Big data is characterised by volume, variety and velocity [I’d add value to that list].
  2. Big data represents a change in the mentality of analytics, away from precise analysis of well-bound sources to rough-cut exploratory analysis of all the data that’s practical to aggregate.
  3. Enterprises should identify business cases for big data and the techniques and processes required to exploit them.
  4. Enterprises should review existing business intelligence architectures and methods and plan the evolution towards a broader platform capable of handling the big data lifecycle.

And he closed by saying that “If you don’t think that big data is relevant to your organisation, then you are almost certainly missing an opportunity that others will take.”

Some other points I picked up from Tim’s presentation:

  • Big data is not so much unstructured as variably-structured.
  • The mean size of an analytical data set is 3TB (growing but not that huge) – don’t think you need petabytes of data for big data tools and techniques to be relevant.
  • Social network analytics is probably the world’s largest (free) marketing focus group!

Big Data – Are You Ready?

Following the analyst introduction, the event moved on to the vendor pitch. This was structured around a set of videos, which I’ve seen previously, in which a fictitious American organisation grapples with a big data challenge, using an over-sized actor (and an under-sized one) to prove their point. I found these videos a little tedious the first time I saw them, and this was the second viewing for me. For those who haven’t had the privilege, the videos are on YouTube and I’ve embedded the first one below (you can find the links in a post on Oracle’s Data Warehouse Insider blog).

The key points I picked up from this session were:

  • Oracle see big data as a process towards making better decisions based on four stages: decide, acquire, organise and analyse.
  • Oracle considers that there are three core technologies for big data: Oracle NoSQL, Hadoop, and R; brought together by Oracle Engineered Systems (AKA the “buy our stuff” pitch).


Had I been at the London event I would have been extremely privileged to see Doug Cutting, Hadoop creator and now Chief Architect at Cloudera, speak about his work in this field. Doug wasn’t available to speak at the Manchester event so Oracle showed us a pre-recorded interview.

For those who aren’t familiar with Cloudera (I wasn’t), it effectively packages open source big data technologies (Hadoop and related projects) into an enterprise solution, with support.

The analogy given was that of a “big data operating system” with Cloudera doing for Hadoop what Red Hat does for Linux.

Perhaps the most pertinent of Doug Cutting’s comments was that we are at the beginning of a revolution in data processing, where people can afford to save data and use it to learn, to get a “higher resolution picture of what’s going on and use it to make more informed decisions”.

Capturing the asset – acquire and organise

After a short pitch from Infosys (who have a packaged data platform, although personally, I’d be looking to the cloud…) and an especially cringeworthy spoof Lady Gaga video (JavaZone’s Lady Java) we moved on to enterprise NoSQL. In effect, Oracle has created a NoSQL database using the Berkeley DB key-value database and a Java driver (containing much of the logic to avoid single points of failure) that they claim offers a simple data model, scalability, high availability, transparent load balancing and simple administration.
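As a rough illustration of what the key-value model (and transparent distribution of keys across nodes) means in practice, here’s a toy sketch in Python – this is purely conceptual and not Oracle NoSQL’s actual API:

```python
import hashlib

# A toy key-value store: no schema, just keys mapped to opaque values,
# with each key hashed to one of several storage "nodes" so that load is
# spread and no single node holds everything. Real systems add
# replication on top so that a node failure doesn't lose data.

class TinyKVStore:
    def __init__(self, node_count=3):
        self.nodes = [{} for _ in range(node_count)]

    def _node_for(self, key):
        # Hash the key to pick a node - this is the "transparent
        # load balancing" seen by the client.
        digest = hashlib.md5(key.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def put(self, key, value):
        self._node_for(key)[key] = value

    def get(self, key, default=None):
        return self._node_for(key).get(key, default)

store = TinyKVStore()
store.put("user:42:lastlogin", "2012-04-27T09:00:00Z")
print(store.get("user:42:lastlogin"))  # prints 2012-04-27T09:00:00Z
```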

Above all, Oracle’s view is that, because it’s provided and maintained by Oracle, there is a “single throat to choke”.  In effect, in the same way that we used to say no-one got fired for buying IBM, they are suggesting no-one gets fired for buying Oracle.

That may be true, but it’s my understanding that big data is fuelled by low-cost commodity hardware (infrastructure as a service) and open source software – and whilst Oracle may have a claim on the open source front, the low-cost commodity hardware angle is not one that sits well in the Oracle stable…

Through partnership with Cloudera (which leaves some wondering if that will last any longer than the Red Hat partnership did?), Oracle is positioning a Hadoop solution for their customer base:

Oracle describe Cloudera as the Redhat for Hadoop, but also say they won't develop their own release; they said that for Linux originally
Debra Lilley

Despite (or maybe in spite of) the overview of HDFS and MapReduce, I’m still not sure how Cloudera sits alongside Oracle NoSQL but their “big data appliance” includes both options. Now, when I used to install servers, appliances were typically 1U “pizza box” servers. Then they got virtualised – but now it seems they have grown to become whole racks (Oracle) or even whole containers (Microsoft).
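For anyone who missed that overview, the MapReduce idea itself is simple enough to sketch in a few lines of Python: a map step emits (key, value) pairs, a shuffle groups them by key, and a reduce step folds each group into a result. Hadoop’s contribution is doing this reliably across a cluster of commodity machines; here it all runs in one process:

```python
from collections import defaultdict
from itertools import chain

# Classic MapReduce example: word counting.

def map_phase(line):
    # Emit a (word, 1) pair for every word in the input line.
    return [(word, 1) for word in line.split()]

def reduce_phase(grouped):
    # Fold each key's list of values into a single count.
    return {word: sum(counts) for word, counts in grouped.items()}

def mapreduce(lines):
    grouped = defaultdict(list)
    for key, value in chain.from_iterable(map_phase(l) for l in lines):
        grouped[key].append(value)  # the "shuffle": group values by key
    return reduce_phase(grouped)

print(mapreduce(["big data big deal", "big numbers"]))
# {'big': 3, 'data': 1, 'deal': 1, 'numbers': 1}
```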

Oracle’s view on big data is that we can:

  1. Acquire data with their Big Data Appliance.
  2. Organise/Analyse aggregated results with Exadata.
  3. Decide at “the speed of thought” with Exalytics.

That’s a lot of Oracle hardware and software…

In an attempt not to position Oracle’s more traditional products as old hat, the next presenter suggested that big data is complementary and not really about old and new but about familiar and unfamiliar. Actually, I think he has a point: at some point “big” data just becomes “data” (and gets boring again?). This session gave an overview of an information architecture challenge: new classes of data (videos and images, documents, social data, machine-generated data, etc.) create a divide between transactional data and big data – which is not really unstructured but better described as semi-structured – with sandboxes used to analyse and discover new meaning from the data.

Oracle has big data connectors to integrate with other (Oracle) solutions including: a HiveQL-based data integrator; a loader to move Hadoop data into Oracle 11g; a SQL-HDFS connector; and an R connector to run scripts with API access to both Hadoop and more traditional Oracle databases. There are also Oracle products such as GoldenGate to replicate data in heterogeneous data environments.

[My view, for what it’s worth, is that we shouldn’t be moving big data around, duplicating (or triplicating) data – we should be linking and indexing it to bridge the divide between the various silos of “big” data and “traditional” data.]

Finding the value – analyse and decide

Speaking of a race to gain insight, of analytics becoming the CIO’s top priority for 2013 and of business intelligence usage doubling by 2014, the next session looked at some business analytics techniques and characteristics, which can be summarised as:

  • I suspect something – a data scientist or analyst needs to find proof and turn it into a predictive model to deploy into a business process (classification).
  • I want to know if that matters – “I wish I knew” (visual exploration and discovery).
  • I want to make the best decision now – decisions at the speed of thought in the context of a business process.

This led on to a presentation about the rise of the data scientist and making maths cool (except it didn’t, especially with a demo of some not-very-attractive visualisations run on an outdated Windows XP platform) and an introduction to the R language for statistical analysis and visualisation.

Following this was a presentation about Oracle’s recently-acquired Endeca technology which actually sounds pretty interesting as it digests a variety of data sources and creates a data model with an information-discovery front-end that promises “the simplicity of search plus the power of BI”.

The last presentation of this segment looked at Oracle’s Exalytics in-memory database servers (a competitor to SAP HANA), bundling business intelligence software, adaptive in-memory caching (and columnar compression) with information discovery tools.


I learned a lot about Oracle’s view of big data but that’s exactly what it was – one vendor’s view on this massively hyped and expanding market segment. For me, the most useful session of the day was from Ovum’s Tim Jennings and if that was all I took away, it would have been worthwhile.

In fairness, it was good to learn some more about the Oracle solutions too but I do wish vendors (including my own employer) would sometimes drop the blatant product marketing and consider the value of some vendor-agnostic thought leadership. I truly believe that, by showing customers a genuine understanding of their business, the issues that they face and the directions in which business and technology are heading, the solutions will sell themselves if they truly provide value. On the other hand, by telling me that Oracle has a complete, open and integrated solution for everything and what I really need is to buy more technology from the Oracle stack and… well, I’d better have a good story to convince the CFO that it’s worthwhile…

Slidedecks and other materials from the Oracle Big Data and Extreme Analytics Summit are available on the Oracle website.

A collection of SharePoint shortcuts

I spent most of last Friday developing a business system with Microsoft Office SharePoint Server (2007).  I’ve worked on a few SharePoint sites over the years and I’m impressed at how much can be done using just standard functionality (lists, etc.) but, whilst the platform is powerful and flexible in many ways, it’s also intensely infuriating at times.

In developing this latest site, there were a few things that I had to Google for – and I’m hoping that posting them here might help others…

Changing the page layout

I created a new page using one of the templates provided for me by my IT department. Unfortunately I found that the webpart layout was a little too restrictive and I needed to change the page layout. I hunted around for a while (even after a colleague had told me to look for the page settings) and then I found a post by Shane Young that helped me out. As Shane describes, the steps are:

  1. “Browse to the page
  2. Click Site Actions, Edit Page
  3. From the tool bar click Page
  4. In the drop down list click Page Settings
  5. Now pick your Page Layout
  6. Click OK”

With the new page layout in place I was able to get the page looking (almost) how I wanted.

Hiding the Title column from forms

My site is built around a document library with a number of columns. One of the default columns is called Title and it’s not really that useful to me as it just duplicates the Name field (doubling up the details that users need to enter for a document in the library). I can always hide the column from list views but I can’t delete it completely, and the field still appears in forms. Sometimes, I repurpose Title by changing the column name but I can’t change the column type – it’s always a single line of text. Then I found John Owings’ post, which describes the steps to hide the Title column from forms:

  1. “From the list view click Settings [then] List Settings
  2. On the Settings Screen, under the ‘General Settings’ heading, click ‘Advanced Settings’
  3. On the Advanced Settings screen click ‘Yes’ for the value: ‘Allow Management of Content Types?’
  4. Click ‘OK’
  5. Now, back on the Settings Screen, under the ‘Content Types’ heading, click ‘Item’
  6. On the Content Type Management Screen, under the ‘Columns’ section, click on the ‘Title’ column
  7. On the next screen click the radio button for ‘Hidden (Will not appear in forms)’
  8. Click ‘OK’”

Internal anchors

Whilst I’m sure it’s possible to use inline CSS, my SharePoint pages resort to some awful HTML hacks at times, like using tables for layout (and then having to mess around with valign directives and other such code that I haven’t used in about ten years…). I probably shouldn’t admit to such awful practices but I also had to relearn something I’d forgotten many years ago – the use of internal anchors within a page.

It’s worth noting, though, that using SharePoint’s Rich Text Editor to create a link to #anchor actually created a link to http://server.domain.tld/layouts/RTE2PUEditor.aspx#anchor. I had to explicitly include the full pathname (e.g. http://server.domain.tld/Pages/Page.aspx#anchor) in the link to avoid this behaviour.
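For completeness, the markup involved is nothing exotic – something like this (the anchor name and page path here are only examples):

```html
<!-- the target: an anchor placed somewhere in the page body -->
<a name="section2"></a>

<!-- the link: use the page's full path, not just "#section2", otherwise
     the Rich Text Editor rewrites it relative to RTE2PUEditor.aspx -->
<a href="http://server.domain.tld/Pages/Page.aspx#section2">Jump to section 2</a>
```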

Short takes: the rise of the personal cloud; what’s in an app; and some thoughts on Oracle

A few highlights from my week that didn’t grow into blog posts of their own…

Oracle: complete, open and integrated

I was at an Oracle event earlier this week when I heard the following comment that amused me somewhat (non-attributed to protect the not-so-innocent):

“Oracle likes to say they are complete, open and integrated – and they are:

  • Complete – as in ‘we have a long price list’.
  • Open – as in ‘we’re not arrogant enough to think you buy everything from us’.
  • Integrated – as in ‘if we own both sides of the connection we’ll sell you the integration’.”

I don’t have enough experience of working with Oracle to know how true that is, but it certainly fits the impression I have of the company… I wonder what the Microsoft equivalent would be…

The rise of the Personal Cloud

I’ve been catching up with reading the paper copy of Computing that arrives every fortnight (and yes, I do prefer the dead tree edition – I wouldn’t generally read the online content without it). One of the main features on 22 March was about the rise of the personal cloud – a contentious topic among some, but one to ignore at your peril, as I highlighted in a recent post.

One quote I particularly liked from the article was this one:

“The personal cloud isn’t so much the death of the PC as its demotion. The PC has become just another item in a growing arsenal of access devices.”

Now all we need is for a few more IT departments to wake up to this, and architect their enterprise to deliver device-agnostic services…

What’s app?

In another Computing article, I was reading about some of the technologies that Barclays is implementing and it was interesting to read COO Shaygan Kheradpir’s view on apps:

“Many […] tasks that happen on the front-line are […] app-oriented […].

And what are apps? They are deep and narrow. They’re not like PC applications, which are broad and shallow. You want apps to do one, often complex, task.”

Sounds like Unix to me! (but also pretty much nails why mobile apps work so well in small packages.)

Network access control does its job – but is a dirty network such a bad thing?

Earlier this week, I was dumped from my email and intranet access (mid database update) as my employer’s VPN and endpoint protection conspired against me. It was several hours before I was finally back on the corporate network; meanwhile, I could happily access services on the Internet (my personal cloud) and even corporate email using my mobile phone.

Of course, even IT service companies struggle with their infrastructure from time to time (and I should stress that this is a personal blog – my comments are my own and not endorsed by my employer) but it raises a real issue: for years, companies have defended their perimeters and built up defence-in-depth strategies with rings of security. Perhaps that approach is less valid as end users (consumers) become increasingly mobile; what we really need to do is look at the controls on our data and applications – perhaps a “dirty” network is not such a bad thing if the core services (datacentres, etc.) are adequately secured?

I’m not writing this to “out” my employer’s IT – generally it meets my needs and it’s important to note that I could still go into an office, or pick up email on my phone – but I’d be interested to hear the views of those who work in other organisations – especially as I intend to write a white paper on the subject…

In effect, with a “dirty” corporate network, the perimeter moves from the edge of the organisation to its core and office networks are no more secure than the Wi-Fi access provided to guests today – at the same time as many services move to the cloud. Indeed, why not go the whole way and switch from dedicated WAN links to using the public Internet (with adequate controls to encrypt payloads and to ensure continuity of service, of course)? And surely there’s no need for a VPN when the applications are all provided as web services?

I’m not suggesting it’s a quick fix – but maybe something for many IT departments to consider in adapting to meet the demands of the “four forces of IT industry transformation”: cloud; mobility; big data/analytics and social business?

[Update: Neil Cockerham (@ncockerham) reminded me of the term “de-perimeterisation” – and Ross Dawson (@rossdawson)’s post on tearing down the walls: the future of enterprise tech is exactly what I’m talking about…]

Why I’m leaving Foursquare

For the last year or so, I’ve been religiously “checking in” to all venues on my business travels (not personal ones though) to try and get a handle on Foursquare. This and Farmville (long since forgotten) were part of a quest to understand two of the examples of gamification that were often quoted (back when gamification was the current buzzword).

Well, I have to admit, I don’t really see the advantage. Not to me at least.

  • I’ve been the mayor of a few places (I was even the mayor of Fujitsu’s UK HQ for a while, although I suspect the CEO may have a different view) but no-one has ever offered me a discount.
  • Not once have I been alerted to the fact that one of my friends was also at the same venue as me.
  • I frequently forget to check in at the station on the way home – Foursquare doesn’t let you edit your timeline.
  • Even as the mayor of a location I was unable to do anything about the “tip spam” – and Foursquare didn’t respond to my requests to remove the offending items either.

Meanwhile I have given Foursquare plenty of information about my travel patterns and the offices I visit. That information might be useful in a broader context but, as with every other “free” social platform, I am the product – not the customer – and I’m simply providing data for potential analysis and even sale. Foursquare, along with Google Latitude and Facebook Places, holds no interest for me any more (especially since Foursquare awarded me the “trainspotter” badge!)

So, in the words of the famous BBC “dragons”, I’m sorry, but “I’m out”.

[Updated 21:42 – added point about “tip spam”]