Weeknote 2024/04: Coffees, and staying curious

Another week, and lots of positive feedback from colleagues on these weeknotes, so they keep going. This time I’ve written it over the course of the week, rather than in one huge writing session at the weekend. I’m not sure it really helped… it’s still way too long. Anyway, here it is.

(I’m also slightly concerned that some people think I have too much time on my hands. I really don’t. I just stay up too late and don’t get enough sleep!)

This week at work

I struggle to write about work at the moment. I’m doing lots of cool stuff, but I don’t really want to tell competitors what Node4 is developing. Even so, it’s no secret that we’re driving forwards with our Digital delivery (that’s why Node4 bought TNP, risual, Tisski, and ThreeTwoFour) – and public cloud is a big part of that, particularly in the Microsoft space.

My presentation to the Node4 Go To Market community on our public cloud transformation capabilities seemed to go well. And it would be remiss of me not to say that, if you want to know more about how we can potentially help your organisation on its Microsoft Azure journey, I or my colleagues would be pleased to have a conversation. Feel free to get in touch on email, or book some time with me.

Beyond that, I joined an interesting call with IDC, looking at the European cloud market in 2024. And I’m just getting involved in a project with some cool tech to help address the ransomware challenge.

Most exciting though is that I’ve submitted a request to join Node4’s Innovate Leadership Development Pathway for 2024. This looks to be a great programme, run over several months, that results in an ILM qualification. The reason I’m excited is that, for the first time in a while, I feel that I’m in a role where I can exploit my leadership potential. I had a career diversion into management, because I thought I needed that experience. Then I got out of it, only to fall back into it (and was very unhappy for quite a long time). Management and leadership are very different things, and over the years I’ve learned that I want to be a leader, not a manager.

Coffees (virtual and IRL)

Much is made of “watercooler moments” as a reason to return to the office (RTO). Well, is there any reason that such moments can’t happen outside the office too?

In 2023, Matt Ballantine ran a “100 coffees” experiment to chat without any particular agenda. It was a big success, so it has rolled on into 2024 and currently stands at around 138 coffees. (I was number 49.) Incidentally, you don’t have to drink coffee: it’s about taking the time to chat with people, and other beverages are equally acceptable. Or, as Matt describes it in a post he wrote for his employer, Equal Experts, about the process and its benefits:

“Coffee here is a metaphor. A metaphor for being intentional about making space in our working days to create serendipity, build relationships, reflect, have new ideas, share old ideas and a wealth of other benefits that come from conversations without agenda.”

Matt Ballantine: “How to have coffee”

Earlier in the month I had some “coffees” with some colleagues I no longer work with on a daily basis. It was brilliant just to check in and see what they are up to, and to keep in touch with what’s going on in a different part of the organisation. This week, in addition to some “quick chats” with a couple of my peers, I met several people outside the company for “coffee”. Their roles included: a Chief Evangelist; a Managing Director; and a Digital Transformation Consultant.

One I hadn’t seen since we worked together over a decade ago. Another is part of a “coffee club” that Matt set up to encourage us to have a monthly conversation with someone we don’t normally talk to. And one has become a friend over the years that we’ve been catching up for coffee and occasional lunches. My own lack of confidence makes me think “what do I have to add to this conversation?”, but invariably I learn things. And I assume that the value of meeting up with no agenda to “just have a chat” goes both ways.

Some of the things we talked about

Our conversation topics were wide and varied. From family life to:

  • Recognising when to buy services vs. learning to do something yourself.
  • “Thought leadership” and qualitative vs. quantitative metrics – looking at the “who” not the size of the reach.
  • Next-generation content management systems.
  • How localisation is more than just translation – sometimes you might rearrange the contents on the page to suit the local culture.
  • How UK town centres seem to encourage chains to flourish over independent retailers.
  • The frustrations of being an end user in a world of corporate IT security (managed devices, classifying information, etc.)
  • Being proud of your kids.
  • What travel was like when we were young, when our location wasn’t being tracked, and when our parents must have been super-worried about where we were. (Is the world more dangerous, or just more reported?)
  • Finding your tribe by showing things in the background on virtual meetings.
  • Bad service and food vs. great coffee but no space. And on what makes a good English breakfast.
  • Parenting young adults and supporting their life decisions.
  • Publishing newsletters, weeknotes, blogs. Owning your own content, and why RSS is still wonderful.
  • Fountain pens, a place for everything (and everything in its place) – and why I’d like to be more like that… but have to accept I’m just not.
  • Four day weeks, balancing work, health and exercise (or lack of).

That’s the whole point. No agenda. See where the conversation leads. Get to know each other better. Learn new things. Build relationships.

And all three “coffees” ran out of time!

This week in tech

  • Here’s something I wrote a blog post about. I had intended there to be more posts, but I overestimated the amount of time I have for these things.
  • A couple of weeks ago, I mentioned I’d been looking at Calendly. It turned out to be a trial (I missed that) and I’d need to subscribe to keep some of the features that I set up. So, I guess that experiment didn’t work out…
  • I don’t understand why Google opening a new data centre in the UK is news. All of the hyperscalers already have data centres in the UK. This is just another one. I’m not sure that they contribute much to the economy though, except maybe in construction and through services consumed (electricity, water, etc.). As for the PM’s statement that “Google’s $1 billion investment is testament to the fact that the UK is a centre of excellence in technology and has huge potential for growth”: poppycock. It shows there is a demand for cloud computing services in the UK. It’s got nothing to do with excellence.
  • I found a new setting in Microsoft Teams that makes my video feed look like I’m using a decent camera! It’s so much better than the old background blur.

Some posts I liked elsewhere

  • On digital inclusion…
  • Of course, not everyone finds online easy. And we have to recognise that sometimes, for any age group, there’s a need for a human connection…

Life

Some readers may know that I have been using the Zoe personalised nutrition programme to see what insights I can get into my diet. I’ve tweeted a bit, and it deserves a longer blog post, but I found this article in the Times very interesting. Jay Rayner has a slightly less reverent view in The Guardian. (Kate Bevan shared both of these articles.)

And I have a holiday to look forward to… or at least a mini-break. Mrs W and I have just booked a long weekend in Tallinn for a few weeks’ time…

This week’s watching

After finishing our recent dramas, it was time to start something new. Several people had recommended Lessons in Chemistry (on Apple TV) and we’re really enjoying it. As an aside, we still have a long way to go on diversity, inclusion and equality but, oh my, we’ve come a long way since the 1950s.

This week’s listening

I listen to a lot of podcasts when I’m walking the dog, or when I’m driving alone. The Archers is the first on my list but please don’t judge me.

I also like to listen to The Bottom Line, though sometimes find Evan Davis’ views on modern work to be a little “traditional”. This week’s episode on e-commerce returns was fascinating, though I do wonder why no major UK retailers (e.g. Next, John Lewis) or online-only retailers like Amazon or even Wiggle wanted to take part…

I used to listen to The Rest Is Politics – it’s a great podcast but there is just too much of it – I found the volume of content overwhelming. But I did listen to the Rest Is Politics: Leading interview with Bill Gates. I was looking for a link to the podcast episode to share, but found it’s available on YouTube too, so you can watch or listen.

Some of the things I took away from the interview were:

  • It’s well-known that Bill Gates dropped out of Harvard, but it’s clear he was a very smart kid… he quietly mentions finishing his classes a year early.
  • I was interested in his responses to tough questions – like being asked whether his approach at Microsoft was “flattening competition not creating excellence”. And on monopolistic views of the world, and how Microsoft needed to lower prices to gain market share. Remember, the mission was to get a computer onto every desk and into every home.
  • On his position as a rich and powerful person, and why he follows the philanthropic path that he does of trying to kill malaria rather than direct giving to those in poverty.
  • On family, the impact he can have on his granddaughter’s future world, and the advantages/disadvantages of growing up with wealthy/famous parents.
  • On the future of AI.
  • On politicians he admires (and giving very guarded responses!).
  • His rather odd (IMHO) views on climate change.
  • On learning from Warren Buffett, and on a lifetime of staying curious.

Maybe that’s what I should call this blog… “staying curious”.

This week in the press

On the PR front, I had a brief quote in Digitalisation World’s Tech Transformations for 2024 article.

…and not in the press

After initially being flattered to be contacted by a major UK newspaper for comment on the importance of public sector work to Fujitsu, I declined to comment. I’m not sure if it was my media training or common sense, but it feels right. I had already written a brief post on LinkedIn, but a lot will have changed in the time since I left, and anything I can remember would already be in the public domain.

More thoughts on the Post Office Scandal

I was going to write about this last week, but I was still reeling from some of the comments I’d received on social media, so I mulled it over for a bit longer.

Understandably, this is a very emotive subject. Lives were ruined. Some who were affected took their own lives. It’s nothing short of a tragedy.

Even so, it was upsetting to be told last week on Twitter/X that anyone who has Fujitsu on their CV should never work again (or words to that effect). I was at ICL/Fujitsu for around 16 years, over one internship and two periods of employment. In common with most people there, I had nothing to do with (or knowledge of) Horizon, other than knowing of its existence in a separate business unit. And, in common with most people who saw the recent ITV drama, I was shocked and appalled.

I can’t defend Fujitsu – but I am going to use someone else’s words, because they sum up the situation about their future in the UK public sector market perfectly for me:

“A lot of innocent people [may] lose work at Fujitsu. All of us who have worked for outsourcing partners will know the nature of contracts means many will know nothing of other ongoing projects. Today many workers at Fujitsu [may] be ‘at risk’ for something they had no control over.”

From a technical perspective, I found this video from Dave Farley to be an excellent explanation of the types of technical issues in the Horizon system that led to accounting errors. Then add in believing the computer over the humans, together with an unhealthy dose of corporate mismanagement (as is being uncovered by the ongoing inquiry), and you get the full horror of the Post Office Scandal.

This week in photos

Looks like I didn’t take many, but I did wrap up the week with a nice dog walk in the winter sunshine.

Featured image by Engin Akyurt from Pixabay.

Using NFC tags to automate my home

Imagine a home with a smart thermostat to control the heating, smart lights to control the lighting, and smart sockets to control other electrical devices. Well, for some people, that doesn’t require a lot of imagination – it’s just the way things are!

My home isn’t quite like that. We still have an analogue thermostat switch on the hall wall (one day we might upgrade), but there are various smart sockets around and we do have some smart lighting. The smart sockets control things like the heater in my Man Cave, the Christmas tree lights (mid-December to January 6th only) or the fairy lights in the garden. And I wrote about the smart lights in a two-part series in 2021 (both parts are below).

NFC tags

Many of us are familiar with Near Field Communication (NFC), even if we don’t know it. It’s the technology used in contactless payment cards. To learn more, I thoroughly recommend watching Professor Hannah Fry’s Secret Genius of Modern Life TV episode about the bank card, on BBC iPlayer.

And those smartphones we carry everywhere with us, well, if they are NFC-enabled, they can read NFC tags to perform other operations. It’s not just for making electronic payments!

All you need to do is buy some tags and, as much as I try to avoid the big online marketplace that sells everything from A to Z, that’s where I picked mine up.

What follows is for iOS, as my family are all iPhone users (tested on 17.2.1). Android users can do similar things, but you’ll use a different app.

Shortcuts and automations

The iOS Shortcuts app has a section called Automation.

  1. Tap + to create a new automation.
  2. Scroll down to NFC.
  3. Tap Scan, scan your tag, and give it a name.
  4. Pick when to run the automation (immediately, or after confirmation) and whether to be notified.
  5. Select what the automation will do.

That’s it. Just touch the top of the iPhone to the tag and it will run your automation. Stick the tag to the desk, the wall, or wherever is handy.
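As an aside, a tag is just a passive chip with a unique ID and a small amount of rewritable (NDEF) storage – the Shortcuts automation only needs the ID, which is why blank tags work. If you’re curious to see what’s actually stored on a tag, here’s a minimal sketch in Python using the nfcpy library with a USB reader (my own illustration, not part of the Shortcuts setup – the 'usb' device path assumes a supported USB NFC reader is attached):

import nfc  # pip install nfcpy

def on_connect(tag):
    # Print any NDEF records stored on the tag (blank tags will have none)
    if tag.ndef:
        for record in tag.ndef.records:
            print(record)
    else:
        print("No NDEF data on this tag")
    return True  # keep the tag "connected" until it is removed

clf = nfc.ContactlessFrontend('usb')  # open the first USB NFC reader found
try:
    clf.connect(rdwr={'on-connect': on_connect})  # wait for a tag to be presented
finally:
    clf.close()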

Things I discovered

In this experiment, I did find out a few things…

  • I should have bought tags with sticky pads. Or maybe not – a glue stick seems to work pretty well for attaching them to things.
  • They don’t seem to work on metal things (like my desk lamp, or the metal switch sockets in the Man Cave). I guess the metal interferes with the signal, so you’ll need to stick them nearby instead.
  • The iOS Shortcuts app will integrate with many applications, but not directly with Alexa, it seems. I have tested a couple of workarounds though:
    • Play a recording that issues the Alexa command.
    • More elegantly, use the scripting option to Open the Alexa app, wait a second, speak “Alexa”, speak the Alexa command.
  • You can also create quite advanced no-code scripts to launch menus and ask for input – for example scan the same tag and ask whether to turn on the device or turn it off, then take action accordingly.

There’s more in this thread on Twitter/X…

Conclusion

NFC tags are cheap (especially when bought in bulk) and an effective way to automate tasks around the home (or at work, in the car, or wherever). There’s lots more that you can do with NFC tags and YouTube is full of videos to provide inspiration. Have fun!

Featured image from the Computer Science Wiki, used under a Creative Commons Attribution-NonCommercial-ShareAlike licence.

Smart lighting: Part 2 (adding Innr and IKEA Trådfri bulbs to my Philips Hue installation)

This content is 4 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

My blog posts are like buses. You wait years for one to come along, and then two arrive at once. The only problem is that they are four years late.

In part 1 of this series, I wrote about getting started with Zigbee lighting, in the form of Philips Hue. Unfortunately, although it’s widely supported, Hue can be expensive so I quickly started to add compatible devices to my network. Here’s what I found.

Coloured bulbs

Whilst I use white lights in communal areas, I have some coloured lamps in some of the bedrooms and in the home offices. I also have one on the landing outside my office, which can be linked to my Teams presence information to show if I’m busy, using Isaac Levin (@isaacrlevin)’s PresenceLight solution.

Rather than shelling out £50 for a coloured Philips Hue bulb, I used Innr smart bulbs (both B22 and GU10 formats). These are also Zigbee-based but are not Apple HomeKit certified. That means that they work with the Hue app, but not natively in iOS. I decided that I can live without that (even more so since I switched to Android).

Innr supports connecting its smart bulbs to a Philips Hue bridge (but not for Hue Sync).
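A nice side effect of hanging everything off the Hue bridge is that third-party bulbs can be driven through the bridge’s local REST API, just like the Philips ones – which, I assume, is essentially what tools like PresenceLight are doing under the covers. As a rough sketch in Python (the bridge address, API username and light number are placeholders for your own values; the username is generated by pressing the bridge’s link button):

import requests

BRIDGE = "192.168.1.10"         # placeholder: your Hue bridge IP address
USERNAME = "your-api-username"  # placeholder: key generated via the bridge link button

def set_state(light_id, **state):
    # hue is on a 0-65535 scale: 0 = red, 25500 = green, 46920 = blue
    url = f"http://{BRIDGE}/api/{USERNAME}/lights/{light_id}/state"
    response = requests.put(url, json=state, timeout=5)
    response.raise_for_status()
    return response.json()

# Turn light 4 red at full saturation and brightness ("I'm busy!")
set_state(4, on=True, hue=0, sat=254, bri=254)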

Low cost GU10s

I mentioned in my first post that I have some low-voltage MR16 bulbs in the house, for which I can’t find Zigbee replacements. Newer parts of the house (like the loft extension) have mains voltage GU10 fittings. For these, I used inexpensive IKEA Trådfri bulbs.

At the time, getting Trådfri working with Hue was a bit hit and miss but newer firmware seemed to improve this. The IKEA website even states that the Trådfri products can be used with Hue:

Do the IKEA smart lighting products work with the Philips Hue Bridge?

Yes, you can use the IKEA smart lighting products together with the Philips Hue Bridge.

How do I connect my IKEA smart lighting products to a Philips Hue Bridge?

If the software version of your IKEA smart lighting products is 1.2.x or later, you can connect them directly to a Philips Hue Bridge. Simply follow these steps:

– First, make sure that the light sources that you want to connect have an updated software version (1.2.x or later).

– Keep the light sources close to the Philips Hue Bridge.

– Search for new devices with the Philips Hue app.

– Do a factory reset of the light sources by toggling the main switch six times.

[…] If the software version of your products is not 1.2.x or later, you need to update it by using a TRÅDFRI gateway and the IKEA Home smart app.

IKEA Smart Lighting product support

A few things I found:

  1. Trådfri bulbs do seem to need to be physically close to the Hue Bridge in order to pair (as noted above).
  2. Some early firmware versions didn’t work so well with non-IKEA gateways (as noted above). I’ve had no real issues with my 2017 Week 44/46 and 2018 week 01 bulbs. You can find the version number on the packaging before purchase. According to the Hue software, these are all running software version 1.2.214.
  3. I couldn’t make IKEA Trådfri accessories (switches, etc.) work with the bulbs whilst the bulbs were paired with Hue. Your mileage may vary. I returned my Trådfri gateway to the store.
  4. Sometimes, the Trådfri bulbs will stop responding (remain on or off, regardless of control). This can usually be fixed by removing them from their fitting and then reconnecting them (basically rebooting the bulb). Later firmware may help.

Mixed messages

One side effect of the mixed system is that the Philips Hue software can only update its own equipment. It recognises the other equipment and will even tell me the software versions but updates would need the corresponding Innr or IKEA gateways and apps to be used. That’s a cost and level of complexity that I decided to manage without.

Software update in the Philips Hue app (Hue bulbs)
Software update in the Philips Hue app (Ikea Trådfri bulbs)
Software update in the Philips Hue app (Innr smart bulbs)

Smart Assistants

I mentioned that my cheaper bulbs are not compatible with Apple HomeKit, but I’ve had no problems working with Amazon Alexa via the Philips Hue skill. In truth, my home automation is a rat’s nest of Samsung SmartThings, TP-Link Kasa, SmartLife, Apple HomeKit and Amazon Alexa. I really need to look at sorting that out (maybe with Home Assistant). Watch out for a future blog post… hopefully it won’t be four years in the making.

Wrap-up

My experience with Zigbee smart bulbs from a variety of manufacturers has largely been positive. I still occasionally ask Alexa to turn on a light and find it doesn’t work because someone has switched off the circuit but that’s what us IT folks refer to as a “layer 8 problem” or an issue with the “wetware”. Whilst mixing manufacturers may present some challenges with updates, a Hue hub at the centre of a mixed network seems to work pretty well for me. After all, the likelihood of someone hacking my unpatched IKEA lightbulbs seems pretty minimal…

Acknowledgements

Featured image by soynanii from Pixabay.

Smart lighting: Part 1 (getting started with Philips Hue/Zigbee)

This content is 4 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Earlier today, I spotted a tweet from Karan Chadda (@kchadda) that reminded me of an unfinished blog post from 2017…

So, here’s part one of the story that never got posted…

A new Hue

Around four years ago, I began an experiment. Hot on the heels of success with a Wi-Fi activated smart socket (a TP Link HS-110), I thought I’d expand on my home’s Internet of Things (IoT) credentials with some smart lighting.

I should explain that my house is a fairly typical UK house: a 1990s-built, detached property, with some pretty uninspiring pendant lights in most rooms. The kitchen/dining room is a little different, as it has low-voltage MR16 spotlights. These were recommended by the electrician who worked on our extension in 2009.

I did some research, and decided that I wouldn’t go down the Wi-Fi route. Not only were the bulbs expensive but it’s not a great use for Wi-Fi (and at the time my home Wi-Fi performance was pretty flaky). Instead, I went for a Zigbee-based solution, with Philips Hue at its heart.

The Hue gateway is pretty easy to set up – it just needs a wired connection to the network. Most home routers have a few of these; my setup is a little more extensive, with PowerEthernet running to my office and other locations that are away from the Internet connection but have a need for wired network connections. With a gateway in place, it was just a case of strategic lightbulb swap-outs, taking out traditional bayonet-fit (B22) bulbs and replacing them with smart equivalents.

Smart lighting, not so smart users…

At this point I should explain that all the smart technology is useless if the circuits aren’t left powered on. And this has been the major flaw in my plan. Our family is divided between the geeks (myself and my eldest son) and the “normal” tech users (my wife and my youngest son). If I was being less charitable, I might put my wife into the laggards category but, to be fair, she’s happy to adopt technology when she can see its value.

For me, part of that value was the ability to set up routines so that lights turn on/off when we’re away from home. I also have one that turns all the lights off after everyone has gone to work/school (because physical switches appear to only work in one direction for my family – they can all turn lights on, but seemingly not off – a common complaint for fathers up and down the land, walking around houses turning lights off in empty rooms, even during daylight hours).

The biggest drawback I found is that I’ve yet to identify suitable Zigbee light switches for the UK market. That means that, when the circuit is switched off (usually when leaving the house or going to bed), the lights are no longer controllable in software. On the flip side, the less-technically-inclined family members can operate the lights as normal, with the only minor inconvenience being that, if a light has been turned off in software, they need to flick the switch off and on again to turn it on “manually”.

Those in other parts of the world may have more luck – have a listen to these podcast episodes or watch some of the videos on this channel.

Form factors and accessories

Over time, I’ve expanded the system and I now have smart bulbs in the communal areas (hall, stairs, landing, etc.) as well as in the home offices and some of the bedrooms.

Unfortunately, there are no suitable MR16 Hue-compatible bulbs, so the rooms with those lights still have traditional halogen (for dimmer-controlled rooms) or LED spotlights. I’ve also stuck with “normal” bulbs in the bathrooms.

I’ve added a Hue sensor in the garage storeroom (so the light comes on when we open the door) and a couple of Hue dimmers, one of which has moved between various rooms over the last couple of years but is currently in our loft room. For the dimmer, I bought a Samotech adapter that covers the original light switch (left switched on), whilst still allowing the Hue dimmer to attach magnetically.

Samotech adapter in use with a Philips Hue dimmer and a standard UK light switch

The verdict?

All in all, things are working well. After nearly four years I’ve only had one failed bulb (replaced under warranty after about a year). The Philips Hue system seems to be a widely supported platform, with plenty of integrations (e.g. to smart home assistants) and the use of third-party bulbs in places has helped me to keep costs down to a reasonable level (I’ll write about these in my next post).

Acknowledgements

Featured image by HeikoAL from Pixabay.

Getting started with Azure Sphere: Part 2 (integration with Azure services)

This content is 4 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Last week, I wrote about my experiences getting some sample code running on an Avnet Azure Sphere Starter Kit. That first post walked through installing the SDK, setting up my development environment (I chose to use Visual Studio Code), configuring the device (including creating a tenant, claiming the device, connecting the device to Wi-Fi, and updating the OS), and downloading and deploying a sample app.

Since then, I’ve managed to make some steps forward with the Element 14 out of the box demo by Brian Willess (part 1, part 2 and part 3). Rather than repeat Brian’s posts, I’ll focus on what I did to work around a few challenges along the way.

Working around compiler errors in Visual Studio Code using the command line

My first issue was that the Element 14 blogs are based on Visual Studio – not Visual Studio Code – and I was experiencing issues where Code would complain that it couldn’t find a compiler.

Thanks to my colleague Andrew Hawker, who was also experimenting with his Starter Kit (but using a Linux VM), I had a workaround: run CMake and Ninja from the command line, then sideload the resulting app package onto the device from the Azure Sphere Developer Command Prompt:

cmake ^
-G "Ninja" ^
-DCMAKE_TOOLCHAIN_FILE="C:\Program Files (x86)\Microsoft Azure Sphere SDK\CMakeFiles\AzureSphereToolchain.cmake" ^
-DAZURE_SPHERE_TARGET_API_SET="4" ^
-DAZURE_SPHERE_TARGET_HARDWARE_DEFINITION_DIRECTORY="C:\Users\%username%\AzureSphereHacksterTTC\Hardware\avnet_mt3620_sk" ^
-DAZURE_SPHERE_TARGET_HARDWARE_DEFINITION="avnet_mt3620_sk.json" ^
--no-warn-unused-cli ^
-DCMAKE_BUILD_TYPE="Debug" ^
-DCMAKE_MAKE_PROGRAM="ninja.exe" ^
"C:\Users\%username%\AzureSphereHacksterTTC\AvnetStarterKitReferenceDesign"
ninja
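rem Sideload the built image package onto the attached device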
azsphere device sideload deploy --imagepackage AvnetStarterKitReferenceDesign.imagepackage

I wasn’t able to view the debug output (despite my efforts to use PuTTY to read 192.168.35.2:2342), but I was confident that the app was working on the device, so I moved on to integrating with cloud services.
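In principle, that debug output is just a raw TCP stream from the device’s local IP address, so something like this Python sketch should do the same job as PuTTY (untested on my part – the address and port are simply the documented defaults):

import socket

# Read the Azure Sphere app's debug output as a raw TCP stream
with socket.create_connection(("192.168.35.2", 2342)) as sock:
    while True:
        data = sock.recv(1024)
        if not data:
            break  # connection closed
        print(data.decode(errors="replace"), end="")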

Brian Willess has since updated the repo so it should now work with Visual Studio Code (at least for the high level application) and I have successfully tested the non-connected scenario (part 1) with the changes.

Integration with Azure IoT Hub, device twins and Azure Time Series Insights

Part 2 of the series of posts I was working through is where the integration starts. The basic steps (refer to Brian Willess’ post for full details) were:

  1. Create an Azure IoT hub, which is a cloud-hosted back-end for secure communication with Internet of Things (IoT) devices, of which the Azure Sphere is just one of many options.
  2. Create and configure the IoT Hub Device Provisioning Service (DPS), including:
    • Downloading a certificate from the Azure Sphere tenant (using azsphere tenant download-CA-certificate --output CAcertificate.cer at the Azure Sphere Developer Command Prompt) and using this to authenticate with the DPS, including validation with the verification code generated by the Azure portal (azsphere tenant download-validation-certificate --output validation.cer --verificationcode verificationcode) and uploading the resulting certificate to the portal.
    • Creating an Enrollment Group, to enrol any newly-claimed device whose certificate is signed by my tenant. This stage also includes the creation of an initial device twin state, editing the JSON to include some extra lines:
      "userLedRed": false,
      "userLedGreen": false,
      "userLedBlue": true
    • The initial blue illumination of the LED means that we can see when the Azure Sphere has successfully connected to the IoT Hub.
  3. Edit the application source code (I used Visual Studio Code but any editor will do) to:
    • Uncomment #define IOT_HUB_APPLICATION in build_options.h.
    • Update the CmdArgs line in app_manifest.json with the ID Scope from the DPS Overview in the Azure portal.
    • Update the AllowedConnections line in app_manifest.json with the FQDNs from the DPS Overview (Global Device Endpoint) and the IoT Hub (Hostname) in the Azure portal.
    • Update the DeviceAuthentication line in app_manifest.json with the Azure Sphere tenant ID (which may be obtained using azsphere tenant show-selected at the Azure Sphere Developer Command Prompt).
  4. Build and run the app. I used the CLI as detailed above, but this should now be possible within Visual Studio Code.
  5. Use the device twin capabilities to manipulate the device, for example turning LEDs on/off – see the sketch after this list (though clearly there are more complex scenarios that could be used in real deployments!).
  6. Create a Time Series Insights resource in Azure, which is an analytics solution to turn IoT data into actionable insights.
    • Create the Time Series Insights environment using the existing IoT Hub with an access policy of iothubowner and consumer group of $Default.
  7. Add events inside the Time Series Insights to view the sensor readings from the Azure Sphere device.
Time Series Insights showing sensor data from an Azure Sphere device.
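Going back to step 5: as well as editing the desired properties in the Azure portal, the twin can be patched programmatically. Here’s a minimal sketch using the azure-iot-hub Python package (the connection string and device ID are placeholders – I used the portal for my own testing):

from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import Twin, TwinProperties

# Placeholders: use your own IoT Hub connection string and device ID
CONNECTION_STRING = "HostName=your-hub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=..."
DEVICE_ID = "your-azure-sphere-device"

registry_manager = IoTHubRegistryManager(CONNECTION_STRING)
twin = registry_manager.get_twin(DEVICE_ID)

# Patch the desired properties to light the red user LED
patch = Twin(properties=TwinProperties(desired={"userLedRed": True}))
registry_manager.update_twin(DEVICE_ID, patch, twin.etag)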

Time Series Insights can get expensive for a simple test project without any real value. I could quickly have used my entire month’s Azure credits, so I deleted the resource group used to contain my Azure Sphere resources before moving on to the next section…

Integration with Azure IoT Central

Azure IoT Central is a hosted IoT platform. It is intended to take away much of the underlying complexity and let organisations quickly build IoT solutions using just a web interface.

Following part 3 in Brian Willess’ Azure Sphere series, I was able to get my device working with IoT Central, both using the web interface to control the LEDs on the board and also pushing sensor data to a dashboard. As before, these are just the basic steps – refer to Brian Willess’ post for full details:

  1. Create a new IoT Central application.
  2. Select or create a template:
    • Use the IoT device custom template.
    • Either import an existing capability model (this was mine) or create one, adding interfaces (sensors, buttons, information, etc.) and capabilities.
    • Create custom views – e.g. for LED device control or for device metrics.
  3. Publish the template.
  4. Configure DPS:
    • Download a certificate from the Azure Sphere tenant using azsphere tenant download-CA-certificate --output CAcertificate.cer at the Azure Sphere Developer Command Prompt. (This is the same certificate already generated for the IoT Hub example.)
    • Upload the certificate to IoT Central and generate a validation code, then use azsphere tenant download-validation-certificate --output validation.cer --verificationcode verificationcode to apply this.
    • Upload the new validation certificate.
  5. Create a non-simulated device in IoT Central.
  6. Run ShowIoTCentralConfig.exe, providing the ID Scope and a shared access signature key for the device (both obtained from the Device Connection details in IoT Central) and the Device ID (from the device created in the previous step). Make a note of the details provided by the tool.
  7. Configure the application source code to connect to IoT Central (a sketch of the resulting manifest follows this list):
    • Uncomment #define IOT_CENTRAL_APPLICATION in build_options.h.
    • Update the CmdArgs line in app_manifest.json with the ID Scope obtained from the Device Connection details in IoT Central.
    • Update the AllowedConnections line in app_manifest.json with the FQDNs obtained by running ShowIoTCentralConfig.exe.
    • Update the DeviceAuthentication line in app_manifest.json with the Azure Sphere tenant ID (which may be obtained using azsphere tenant show-selected at the Azure Sphere Developer Command Prompt).
  8. Build and run the application.
  9. Associate the Azure Sphere device with IoT Central (the device created previously was just a “dummy” to get some configuration details). IoT Central should have found the real device but it will need to be “migrated” to the appropriate device group to pick up the template created earlier.
  10. Open the device and enjoy the data!
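For reference, after the edits in step 7 the app_manifest.json ends up looking something like this sketch (all of the IDs and hostnames are placeholders – yours come from your own DPS/IoT Central configuration):

{
  "SchemaVersion": 1,
  "Name": "AvnetStarterKitReferenceDesign",
  "ComponentId": "00000000-0000-0000-0000-000000000000",
  "EntryPoint": "/bin/app",
  "CmdArgs": ["0ne000000000"],
  "Capabilities": {
    "AllowedConnections": ["global.azure-devices-provisioning.net", "iotc-00000000.azure-devices.net"],
    "DeviceAuthentication": "your-azure-sphere-tenant-id"
  },
  "ApplicationType": "Default"
}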

I hadn’t expected IoT Central to cost much (if anything, because the first two devices are free) but I think the app I’m using is pretty chatty so I’m being charged for extra messages (30,000 a month sounds like a lot until you realise it’s only around 40 an hour on a device that’s sending frequent updates to/from the service). It seems to be costing just under £1/day (from a pool of credits) so I won’t be worrying too much!

What’s next for my Azure Sphere device?

Having used Brian Willess’ posts at Element 14 to get an idea of how this should work, I think my next step is to buy some external sensors and write some code to monitor something real… unfortunately, the sensors I want are on back order until the summer, but watch this space!

Getting started with Azure Sphere: Part 1 (setup and running a sample app)

This content is 4 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Late in 2019, I got my hands on an Azure Sphere Starter Kit, which I’ve been intending to use for an IoT project, using some of the on-board sensors for temperature and potentially an external one for humidity…

For those who aren’t familiar with Azure Sphere, it’s Microsoft’s Secure Internet of Things (IoT) solution using certified chips, a custom operating system and a security service. My device is an Avnet Azure Sphere MT3620 Starter Kit and this blog post focuses on getting it up and running with one of the sample applications that Microsoft provides, using Windows 10 (other options include Linux).

Installing Visual Studio Code and the Azure Sphere SDK

Having obtained the kit, my next stop was Microsoft’s Getting Started with Azure Sphere page. I downloaded and installed Visual Studio Code (I don’t really need the whole Visual Studio 2019 application – though I later found that a lot of the advice on the Internet assumes that’s what you’re using…) and then immediately found that there are two versions of the Azure Sphere Software Development Kit (SDK). According to the Microsoft docs, either can be used with Visual Studio Code, but I found that setup of the Azure Sphere SDK for Visual Studio failed when it couldn’t find Visual Studio (not really surprising), so I used the Azure Sphere SDK for Windows.

Connecting the hardware

I plugged in the Avnet Azure Sphere Starter Kit, using the supplied USB cable, and watched as Windows installed drivers after which a virtual network interface was present and three COM ports appeared in Device Manager.

Setting up my dev environment

Installing Visual Studio Code and the Azure Sphere SDK was only the first part of getting ready to create code for the device. I needed to install the Azure Sphere extension (easily found in the Extensions Marketplace):

The Azure Sphere extension also installs two dependencies:

  • C/C++
  • CMake Tools

I also needed to install CMake (in my case, version 3.17.1). Not really knowing what I was doing, I accepted the defaults but, on reflection, I probably should have let CMake add its directory to the system %PATH% variable (I later uninstalled and reinstalled CMake to do this, but could just have added C:\Program Files\CMake\bin to the Path in the user environment variables).

The final installation was Ninja. Windows Defender SmartScreen blocked this app, but I was later able to work around that, by unblocking in the properties for ninja.exe:

I missed the point in the Microsoft documentation that said I needed to manually add Ninja to the %PATH% environment variable but I later went back and added the folder that I copied ninja.exe to (which, for me, was C:\Users\%username%\Tools).

(The above steps were my second attempt – the first time I installed MinGW-W64 to work around issues when Visual Studio Code couldn’t find a compiler, together with several changes in settings.json. I later removed all of that and managed to compile and deploy a sample application using just the settings above…)

Configuring the Azure Sphere device for use

There are a few steps required to configure the device for use. These are all completed using the Azure Sphere Developer Command Prompt, which was installed earlier, with the SDK.

Creating an Azure Sphere tenant and claiming the device

Each Azure Sphere device must be “claimed” and associated with a “tenant”. I followed the Microsoft documentation to do this…

azsphere login --newuser user@domain.tld

After completing Multi-Factor Authentication (MFA) and confirming I wanted to allow Azure Sphere to use my account, I was logged in but with a warning that I don’t have access to any Azure Sphere tenants, so I created one:

azsphere tenant create --name "Mark Wilson"

Warning – more research required: I used a Microsoft Account, as per the Microsoft instructions, but am now concerned I should have used an Azure Active Directory (Organisational/Work or School) account (especially as Role Based Access Control is supported from Azure Sphere 19.10 onwards). As a device can only be claimed once and, once claimed, the device is permanently associated with the Azure Sphere tenant, I’m stuck with these settings now…

I then went ahead and claimed the device:

azsphere device claim

Connecting to Wi-Fi and updating the device operating system

I checked the current OS version on the device:

azsphere device show-deployment-status

As can be seen, not only is the OS out of date, but the device is not connected to a network, so I connected to Wi-Fi:

azsphere device wifi show-status
azsphere device wifi add --ssid "SSID" --psk password
azsphere device wifi show-status

Now, with network connectivity in place, the device had a fighting chance of an OS update. According to the Microsoft documentation:

The Azure Sphere device checks for Azure Sphere OS and application updates each time it boots, when it initially connects to the internet, and at 24-hour intervals thereafter. If updates are available, download and installation could take as much as 15-20 minutes and might cause the device to restart.

Configure networking and update the device OS

I tried several restarts using azsphere device restart with no success. In the end, I left the device connected overnight and, by the morning, it had updated to 20.03.

Finally, I enabled application development on the device, ready to download some code and deploy an application:

azsphere device enable-development

Downloading a sample app

My initial attempts to use the app that I wanted didn’t work, so I decided to test my setup with one of the Microsoft Quick Starts.

I needed to use git to clone the Azure Sphere Samples Repo, so that meant installing git. Then, from the Terminal in Visual Studio Code, I ran git clone https://github.com/Azure/azure-sphere-samples.git.

I then opened the Samples\HelloWorld\HelloWorld_HighLevelApp folder in Visual Studio Code, ready to build and deploy the app.

Building and deploying the app

Having set up my dev environment, set up the device and downloaded some sample code, I followed the instructions in the Visual Studio Code Azure Sphere Extension to run the following in the Command Palette: Azure Sphere: Configure Settings (selecting High-Level Application) and CMake: Build.

I was then able to build and deploy the sample app to my Azure Sphere device by starting a debug session (F5), and was rewarded with a blinking LED on the board!

Azure Sphere Starter Kit with blinking LED

I can also view the application status with azsphere device app show-status.

Next steps

The next step is to get the app I really wanted to use working on the device, making use of some of the on-board sensors and then integrating this with some of the Azure services. I’m having trouble compiling that code at the moment, so that blog post may be a while longer…


Dreaming of a better commute

This content is 10 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Travelling in and out of London this week for the course I’ve been attending has reminded me why working from home (mostly) is a huge blessing. At least 4 hours’ travel a day for a relatively simple 60-mile commute? No thank you!

I did, however, use two different routes with contrasting experiences and that made me think – why does it have to be this way? And what might it be like one day?

Commute route 1: Olney to London via Bedford (Thameslink/East Midlands Trains)

After driving to Bedford and finding a space in the car park (not always easy), the next question was: where are the ticket machines? The option to pay and display, with optional mobile phone/SMS/app payment, seems to have been replaced by a system to pay as you leave the car park on foot (albeit with an optional mobile app). It uses ANPR to recognise my car, but the user interface is confusing and there’s no option for contactless payment (surely a perfect use case for fast commodity transactions like this?). At £7.90 for a day’s parking (when the only reason you would ever park there is to catch a train!), it’s expensive too.

Then, at the station I bought a ticket – again falling foul of a confusing user interface (not helped by Thameslink’s corporate colours not really highlighting what I need to see). I switched to another machine and followed a different (but more familiar) purchase journey on the touch screen whilst another customer switched queues because of a broken card reader in the machine she was using.

Catching the train is simple, with frequent services but lots of stops and the (07:34) train is packed well before reaching London.

The good thing about this route (on Thameslink – not on East Midlands Trains) is that it goes right to the heart of the city (not the West End), although I change at Farringdon to get on the Underground towards Tower Hill. Sadly, with no barriers to pass through and crowds of commuters, I didn’t see an Oyster touch in/out machine – something I only realised after boarding the train. Wouldn’t it be good if there were more of these machines, or if you could swipe on the train? I touched out at the end of the journey but was charged the full fare, and it took a lot of time on the phone to sort out the overcharge…

Commute route 2: Olney to London via Milton Keynes (London Midland/Virgin Trains)

After a faster drive to Milton Keynes (MK is famous for its roundabouts but there’s a real benefit in the national speed limit grid road network), I park close to the station. The actual station parking is extortionate (so much so that I know some people who don’t pay, preferring to take the risk of an occasional fine) but off-street parking is available and half price if I pay by phone (£4.18).

I buy a ticket at the station but know to always allow time for queuing: there are 6 machines and 4 booths but that’s never enough! It’s 06:54 so I dash for the 06:55 London Midland service, but see that the (faster) 06:53 Virgin train has only just arrived (even though it’s showing as “on time”).

We set off towards London, only to be delayed by a vehicle striking a bridge at Watford and are overtaken by the slower London Midland service that I nearly caught earlier! Eventually, we get moving and arrive in London 20 minutes late…

A dream of a better commute

These real world stories are just single journeys and it could all be so different on another day. So let’s compare with what it could be like:

  • My calendar shows that I’m planning to be in London for the day.
  • My alarm wakes me with enough time to get ready, and the lights in the house gently warm up to wake me from my slumber.
  • I drive to the station and, as I park, my phone recognises my location and that I’m stationary, asks me if I need to pay for parking, and then takes care of the details.
  • Arriving on the station concourse, my digital personal assistant has pre-booked my train ticket and there’s a boarding pass on my phone. No paper tickets are required as the barriers can simply scan a QR code on my screen (or even use NFC?).
  • There’s a steady flow of trains (on time of course!) and as I switch to the Underground, payment is dealt with as I pass through turnstiles using a contactless payment card – and, even if I end up on the platform via a different route I can pick my boarding point (verified using location services) and ensure I’m correctly billed, using a smartphone app…
  • Realising there are delays on the line, my phone reschedules appointments as required, or otherwise ensures that contacts are aware I will be delayed.

It’s not difficult – all of this technology is available today, it just doesn’t quite work together… talk of an Internet of Things brings it tantalisingly close, but train companies, car park operators and other organisations still cling to outdated methods. So it seems I’ll be dreaming for a little while longer…

 

Short takes: tl;dr; online influence (#digitalsurrey); and the Internet of things (#cloudcamp)

This content is 12 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

It’s been another crazy week without any time for blogging so here are some quick highlights from the stuff I would like to have written about (and still might, time permitting!)

tl;dr

I was reading one of Matt Baxter-Reynolds’ articles on the Guardian website a few days ago and he gave a summary of the key points under the heading “tl;dr”. I hadn’t seen that before, but it turns out it’s an Internet meme – tl;dr is an abbreviation for “too long; didn’t read” – something that I suspect many of my blog posts suffer from. Maybe I’ll start including a tl;dr section in future…

Return on Influence

On Tuesday evening, Mark W Schaefer (@MarkWSchaefer) spoke at Digital Surrey about the use of influence marketing on the web. It was an enlightening talk and certainly something to consider as organisations increasingly judge our online influence in deciding how to (or whether to) react to and interact with us. My personal view is that Klout and its ilk are over-rated (Klout in particular is very much led by volume of online activity – if I go on holiday for a few days, my Klout takes a hit) but, if I were to give a “tl;dr” view on Mark’s talk it would probably include this diagram:

  1. Surround yourself with people who care about you (and your views) and have a pre-disposition to “move” (i.e. like, retweet, advertise, etc.) your content.
  2. Create unique and interesting content – have something to say (in order to make it “move”) – make it relevant, interesting, timely and entertaining.
  3. Be consistent in engagement – not just broadcasting but being authentically helpful and looking for opportunities to interact.

Common sense? Perhaps – but it’s how Mark suggests we build influence.  Read more in Mark’s book – Return On Influence: The Revolutionary Power of Klout, Social Scoring, and Influence Marketing.

(Jas Dhariwal has made a recording of Mark’s talk available.)

The Internet of things

The Internet of what? Well, depending on your source of technology reading material, you might have heard that we’re increasingly connecting lots of “things” to the Internet – sensors, for example – and Wednesday saw a CloudCamp Special in London on the Internet of things. As usual, the evening was introduced by Simon Wardley (@swardley) with his well-practised (but still interesting) talk on the cycle of innovation leading up to his vision of “augmented intelligence” supported by utility computing (cloud), big data, and intelligent mobile applications.

Then, on to the lightning talks with: Andy Bennet (@databasescaling)’s introduction to the Internet of things (it’s not new!); Raphael Cohn’s fascinating recital of how Smith Electric Vehicles overcame a major business issue in that “electric trucks rule, but batteries suck, and mobiles die”; Kuan Hon (@kuan0)’s rundown on cookie laws (which have a much broader impact than just websites); Paul Downey introducing us to the wonderful world of open source hardware (which is far more extensive than I ever imagined); and Chris Swan (@cpswan)’s review of the Internet of Things in some of his favourite science fiction novels. Oh yes, and a couple of guys from Betfair stood up and tried to plug their new application cloud, which I’m sure is very good but seemed a little too much like a vendor pitch to me…

Wrapping up with a panel discussion, before beer and pizza, it was a thoroughly agreeable way to spend the evening and I learned loads about the Internet of things… hopefully I’ll write some more on the topic over the coming weeks.

Starting to play with the Internet of things

This content is 13 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Unlike some people, who find it invasive, I love the concept of the Internet of things. I’m truly excited by some of the possibilities that a world driven by data opens up. Sure, there are issues to overcome (primarily around privacy and connectivity) – but anyone who believes their data isn’t already being captured by service providers (even if those providers don’t yet know how to handle the massive volumes of data) is in for a shock. So why not embrace the possibilities and use our increasingly smart world to our collective advantage?

In my recent presentation to the BCS Internet Special Interest Group, I referred to the Technology Strategy Board’s Future Internet Report, which talks about [emphasis added by me]:

“An evolving convergent Internet of things and services that is available anywhere, anytime as part of an all-pervasive omnipresent socio–economic fabric, made up of converged services, shared data and an advanced wireless and fixed infrastructure linking people and machines to provide advanced services to business and citizens.”

The report also acknowledges the need for more than just “bigger pipes” to handle the explosion in data volumes. We do need a capable access mechanism but we also need infrastructure for the personalisation of cloud services and for machine to machine (M2M) transactions; and we also need convergence to enable a transformational change in both public and private service delivery.

That’s the big picture but scaling back down to a personal level, one of my colleagues, David Gentle (@davegentle – who happens to be the main author of Fujitsu’s Technology Perspectives microsite) highlighted a site called Pachube to me last week. I first came across Pachube a few months back but [partly because it used to be a chargeable service (it became free at the start of this month)] it got added to my “list-of-things-to-have-a-better-look-at-one-day” (that day rarely comes, by the way!). This time I had a better look and I found it to be pretty cool.

Pachube is basically a cloud-based broker for connected devices with a web service to manage real-time data and a growing ecosystem of applications to feed and consume data. That sounded like it might need some programming (i.e. could be difficult for me these days) but then I found a method to hook an energy monitor up to the web, with no coding required!

I’ve written before about the EnergyFit (Current Cost) power meter that E-ON sent me. I wasn’t a fan of E-ON’s software so I hooked it up to Google PowerMeter for a while, but that service has closed down (along with Microsoft’s Hohm service – which I don’t think even made it to the UK). Using a USB to serial driver and a companion application I now have one of my computers feeding data from my Current Cost meter to the Pachube website, where it gets transformed into JSON, XML or CSV format and “magic” can be performed. I used the Mac OS X software versions of the driver and the application but there are also Windows (driver/application) and Linux (driver/application) variants that I have not tested. The process of setting up a Pachube feed has also changed slightly since the original guidance was written but the basic steps are:

  1. Install the USB-serial drivers.
  2. Install the application.
  3. Run the application and select the appropriate serial port (for me, on my Mac, that is /dev/tty.usb-serial).
  4. Create a feed (a push feed – and however many times I turn it private it seems to switch back to public…).
  5. Paste the XML version of the feed into the application.
  6. Set up a secure sharing (API) key (you probably don’t want to use the master key) and paste it into the application.
  7. Save preferences and wait for the application to start feeding data, at which point the feed should show as live.

The application I used and the Pachube website seem to work together to configure the datastreams within the feed (one for temperature and one for power) and it’s all set to go.
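Under the covers, the feed update is just an HTTP PUT to the Pachube v2 API with an API key header. Something like this Python sketch (the feed ID, key and datastream IDs are placeholders, and this is based on the API documentation rather than the companion application’s actual code):

import requests

FEED_ID = "12345"         # placeholder: your Pachube feed ID
API_KEY = "your-api-key"  # placeholder: a secure sharing key, not the master key

def update_feed(watts, temperature):
    # Push current values for the feed's two datastreams
    body = {
        "version": "1.0.0",
        "datastreams": [
            {"id": "power", "current_value": str(watts)},
            {"id": "temperature", "current_value": str(temperature)},
        ],
    }
    response = requests.put(
        f"http://api.pachube.com/v2/feeds/{FEED_ID}.json",
        json=body,
        headers={"X-PachubeApiKey": API_KEY},
        timeout=10,
    )
    response.raise_for_status()

update_feed(365, 21.5)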

Once the feed is live, there are a load of apps listed on the Pachube website with everything from graphs and visualisations to mapping tools and augmented reality. I decided to create a page to display some of these, starting out with a customisable PNG-based graph from my feed. That worked, so I added another, together with a PachuDial and a couple of PachuBlog gadgets (sadly, these are Flash-based, so don’t work on the iPad…). Next I created a second feed to consume the power usage from the first one and measure the associated carbon footprint.

Having played around with energy usage, I found that I could also use Pachube to monitor my Twitter account (a pull feed this time) – which might be useful too.

Now I’ve mastered the basics with my Current Cost meter, I might try some home automation using Arduino devices – although that looks to have quite a steep learning curve on the electronics front… In the meantime, you can see the Home electricity usage and Twitter statistics pages that I created using just the Pachube platform and some basic HTML.

[Update 30 November 2011: added comment about Pachube becoming free to use]