Getting started with Azure Sphere: Part 2 (integration with Azure services)


Last week, I wrote about my experiences getting some sample code running on an Avnet Azure Sphere Starter Kit. That first post walked through installing the SDK, setting up my development environment (I chose to use Visual Studio Code), configuring the device (including creating a tenant, claiming the device, connecting the device to Wi-Fi, and updating the OS), and downloading and deploying a sample app.

Since then, I’ve managed to make some steps forward with the Element 14 out of the box demo by Brian Willess (part 1, part 2 and part 3). Rather than repeat Brian’s posts, I’ll focus on what I did to work around a few challenges along the way.

Working around compiler errors in Visual Studio Code using the command line

My first issue was that the Element 14 blogs are based on Visual Studio – not Visual Studio Code and I was experiencing issues where Code would complain it couldn’t find a compiler.

Thanks to my colleague Andrew Hawker, who was also experimenting with his Starter Kit (but using a Linux VM), I had a workaround: run CMake and Ninja from the command line, then sideload the resulting app package onto the device from the Azure Sphere Developer Command Prompt:

cmake ^
-G "Ninja" ^
-DCMAKE_TOOLCHAIN_FILE="C:\Program Files (x86)\Microsoft Azure Sphere SDK\CMakeFiles\AzureSphereToolchain.cmake" ^
-DAZURE_SPHERE_TARGET_API_SET="4" ^
-DAZURE_SPHERE_TARGET_HARDWARE_DEFINITION_DIRECTORY="C:\Users\%username%\AzureSphereHacksterTTC\Hardware\avnet_mt3620_sk" ^
-DAZURE_SPHERE_TARGET_HARDWARE_DEFINITION="avnet_mt3620_sk.json" ^
--no-warn-unused-cli ^
-DCMAKE_BUILD_TYPE="Debug" ^
-DCMAKE_MAKE_PROGRAM="ninja.exe" ^
"C:\Users\%username%\AzureSphereHacksterTTC\AvnetStarterKitReferenceDesign"
ninja
azsphere device sideload deploy --imagepackage AvnetStarterKitReferenceDesign.imagepackage

I wasn’t able to view the debug output (despite my efforts to use PuTTY to read from 192.168.35.2:2342), but I was confident that the app was working on the device, so I moved on to integrating with cloud services.
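
For reference, as I understand it from the Azure Sphere documentation (I clearly hadn’t got this working at the time), a high-level app started for debugging exposes its debug output on TCP port 2342 of the device’s development interface at 192.168.35.2, which can be read with any raw TCP client. A minimal sketch, assuming the app has been restarted in debug mode and the Windows telnet client is enabled (PuTTY with a “Raw” connection to the same host and port should be equivalent):

REM Read high-level application debug output from the attached device
telnet 192.168.35.2 2342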

Brian Willess has since updated the repo so it should now work with Visual Studio Code (at least for the high-level application), and I have successfully tested the non-connected scenario (part 1) with the changes.

Integration with Azure IoT Hub, device twins and Azure Time Series Insights

Part 2 of the series of posts I was working through is where the integration starts. The basic steps (refer to Brian Willess’ post for full details) were:

  1. Create an Azure IoT hub, which is a cloud-hosted back-end for secure communication with Internet of Things (IoT) devices, of which the Azure Sphere is just one of many options.
  2. Create and configure the IoT Hub Device Provisioning Service (DPS), including:
    • Downloading a CA certificate from the Azure Sphere tenant (using azsphere tenant download-CA-certificate --output CAcertificate.cer at the Azure Sphere Developer Command Prompt) and uploading it to the DPS for authentication; then proving ownership by generating a verification code in the Azure portal, creating a validation certificate with azsphere tenant download-validation-certificate --output validation.cer --verificationcode verificationcode, and uploading that certificate to the portal.
    • Creating an Enrollment Group, to enrol any newly-claimed device whose certificate is signed by my tenant. This stage also includes the creation of an initial device twin state, editing the JSON to include some extra lines:
      "userLedRed": false,
      "userLedGreen": false,
      "userLedBlue": true
    • The initial blue illumination of the LED means that we can see when the Azure Sphere has successfully connected to the IoT Hub.
  3. Edit the application source code (I used Visual Studio Code, but any editor will do) to make the following changes (there’s a sketch of the edited app_manifest.json after this list):
    • Uncomment #define IOT_HUB_APPLICATION in build_options.h.
    • Update the CmdArgs line in app_manifest.json with the ID Scope from the DPS Overview in the Azure portal.
    • Update the AllowedConnections line in app_manifest.json with the FQDNs from the DPS Overview (Global Device Endpoint) and the IoT Hub (Hostname) in the Azure portal.
    • Update the DeviceAuthentication line in app_manifest.json with the Azure Sphere tenant ID (which may be obtained using azsphere tenant show-selected at the Azure Sphere Developer Command Prompt).
  4. Build and run the app. I used the CLI as detailed above, but this should now be possible within Visual Studio Code.
  5. Use the device twin capabilities to manipulate the device, for example turning LEDs on/off (though clearly there are more complex scenarios that could be used in real deployments!).
  6. Create a Time Series Insights resource in Azure, which is an analytics solution to turn IoT data into actionable insights.
    • Create the Time Series Insights environment using the existing IoT Hub with an access policy of iothubowner and consumer group of $Default.
  7. Add events inside the Time Series Insights to view the sensor readings from the Azure Sphere device.
Time Series Insights showing sensor data from an Azure Sphere device.
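
For reference, here’s roughly what the edited app_manifest.json from step 3 ends up looking like. This is a sketch based on the standard Azure Sphere manifest format rather than a copy of the repo’s file: the name and entry point are indicative, the ID Scope, IoT Hub hostname, component ID and tenant ID are placeholders, and I’ve left out the hardware capabilities (Gpio and so on) that the reference design also declares:

{
  "SchemaVersion": 1,
  "Name": "AvnetStarterKitReferenceDesign",
  "ComponentId": "00000000-0000-0000-0000-000000000000",
  "EntryPoint": "/bin/app",
  "CmdArgs": [ "0ne000XXXXX" ],
  "Capabilities": {
    "AllowedConnections": [
      "global.azure-devices-provisioning.net",
      "your-iot-hub.azure-devices.net"
    ],
    "DeviceAuthentication": "22222222-2222-2222-2222-222222222222"
  },
  "ApplicationType": "Default"
}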

Time Series Insights can get expensive for a simple test project without any real value. I could quickly have used my entire month’s Azure credits, so I deleted the resource group used to contain my Azure Sphere resources before moving on to the next section…
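
Incidentally, if you want to do the same tidy-up from the command line rather than clicking around the portal, the Azure CLI makes it a one-liner (the resource group name here is just an example of whatever you called yours):

REM Deletes the resource group and everything in it (IoT Hub, DPS, Time Series Insights)
az group delete --name rg-azure-sphere-demo --yes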

Integration with Azure IoT Central

Azure IoT Central is a hosted IoT platform. It is intended to take away much of the underlying complexity and let organisations quickly build IoT solutions using just a web interface.

Following part 3 in Brian Willess’ Azure Sphere series, I was able to get my device working with IoT Central, both using the web interface to control the LEDs on the board and pushing sensor data to a dashboard. As before, these are just the basic steps; refer to Brian Willess’ post for full details:

  1. Create a new IoT Central application.
  2. Select or create a template:
    • Use the IoT device custom template.
    • Either import an existing capability model (this was mine) or create one, adding interfaces (sensors, buttons, information, etc.) and capabilities.
    • Create custom views – e.g. for LED device control or for device metrics.
  3. Publish the template.
  4. Configure DPS:
    • Download a certificate from the Azure Sphere tenant using azsphere tenant download-CA-certificate --output CAcertificate.cer at the Azure Sphere Developer Command Prompt. (This is the same certificate already generated for the IoT Hub example.)
    • Upload the certificate to IoT Central and generate a validation code, then use azsphere tenant download-validation-certificate --output validation.cer --verificationcode verificationcode to apply this.
    • Upload the new validation certificate.
  5. Create a non-simulated device in IoT Central.
  6. Run ShowIoTCentralConfig.exe, providing the ID Scope and a shared access signature key for the device (both obtained from the Device Connection details in IoT Central) and the Device ID (from the device created in the previous step). Make a note of the details provided by the tool.
  7. Configure the application source code to connect to IoT Central:
    • Uncomment #define IOT_CENTRAL_APPLICATION in build_options.h.
    • Update the CmdArgs line in app_manifest.json with the ID Scope obtained from the Device Connection details in IoT Central.
    • Update the AllowedConnections line in app_manifest.json with the FQDNs obtained by running ShowIoTCentralConfig.exe.
    • Update the DeviceAuthentication line in app_manifest.json with the Azure Sphere tenant ID (which may be obtained using azsphere tenant show-selected at the Azure Sphere Developer Command Prompt; see the commands after this list).
  8. Build and run the application.
  9. Associate the Azure Sphere device with IoT Central (the device created previously was just a “dummy” to get some configuration details). IoT Central should have found the real device but it will need to be “migrated” to the appropriate device group to pick up the template created earlier.
  10. Open the device and enjoy the data!
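
As mentioned in the list above, two azsphere commands are handy while filling in app_manifest.json for both the IoT Hub and IoT Central builds. The first is the one referenced in step 7; the second is what I’d use to double-check the attached device’s details, though treat the exact command name as an assumption as it can vary between SDK releases:

REM Shows the selected Azure Sphere tenant (the ID goes into DeviceAuthentication)
azsphere tenant show-selected

REM Shows details of the currently attached device
azsphere device show-attached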

I hadn’t expected IoT Central to cost much (if anything, because the first two devices are free) but I think the app I’m using is pretty chatty so I’m being charged for extra messages (30,000 a month sounds like a lot until you realise it’s only around 40 an hour on a device that’s sending frequent updates to/from the service). It seems to be costing just under £1/day (from a pool of credits) so I won’t be worrying too much!

What’s next for my Azure Sphere device?

Having used Brian Willess’ posts at Element 14 to get an idea of how this should work, I think my next step is to buy some external sensors and write some real code to monitor something real… unfortunately the sensors I want are on back order until the summer but watch this space!

Weeknote 17/2020: Geeking out and taking advantage of the sunshine


Another week of socially-distanced, furloughed fun: here are some of the highlights…

“Playing” with tech: Azure Sphere

I took a break from exam study this week, partly because I had some internal meetings that made a big hole in the calendar and diverted my attention. Instead, I finally got my Azure Sphere Starter Kit IoT device working, with both Microsoft samples and with some more practical advice from Brian Willess at Element 14.

I’m blogging my progress (slightly behind the actual learning) but over the course of a few days, supported by Brian’s blog posts, I managed to get the sensor readings from my device working locally, with Azure IoT Hub and Time Series Insights, and then finally in Azure IoT Central.

The next stop is to try and write some code of my own rather than using other people’s – it’s been a while since I wrote any C/C++!

Blogging

I also wrote some blog posts:

Other geek stuff

I finished watching “Devs“. No spoilers here, but the ending did leave me a little flat…

I didn’t spot any SpaceX Starlink satellites, despite a few attempts and some very clear evenings. This website seemed particularly helpful, although the developer (@modeless) had to remove the Google Street View content when the site got popular.

Being “too British”

Thursday meant my usual trip to the local market, followed by the supermarket, buying provisions for my family and others. Because product availability in the supermarket is a bit “hit and miss” (and because I prioritise supporting local businesses over the big retailers, where I can), I bought some peppers (capsicums) from the market greengrocer. There was no price displayed but, as he bagged them, he said they were expensive… and he was not wrong: £3/lb, I think! But I was too embarrassed to say “no thanks at that price”, so I bought them anyway. Lesson learned…

To add insult to injury, when I got to Sainsbury’s they had plenty, at a much more reasonable price…

“On holiday”, in the garden

The week wrapped up with sunshine, low wind and reasonably high temperatures (19°C is not bad for April in England!). After a decent bike ride with my son (permissible under the current social distancing advice), I made the time to just relax a bit…

What a great way to end the week!

Getting started with Azure Sphere: Part 1 (setup and running a sample app)


Late in 2019, I got my hands on an Azure Sphere Starter Kit, which I’ve been intending to use for an IoT project, using some of the on-board sensors for temperature and potentially an external one for humidity…

For those who aren’t familiar with Azure Sphere, it’s Microsoft’s Secure Internet of Things (IoT) solution using certified chips, a custom operating system and a security service. My device is an Avnet Azure Sphere MT3620 Starter Kit and this blog post focuses on getting it up and running with one of the sample applications that Microsoft provides, using Windows 10 (other options include Linux).

Installing Visual Studio Code and the Azure Sphere SDK

Having obtained the kit, the next stop was Microsoft’s Getting Started with Azure Sphere page. I downloaded and installed Visual Studio Code (I don’t really need the whole Visual Studio 2019 application – though I later found that a lot of the advice on the Internet assumes that’s what you’re using…) and then immediately found that there are two versions of the Azure Sphere Software Development Kit (SDK). According to the Microsoft docs, either can be used with Visual Studio Code, but I found that setup for the Azure Sphere SDK for Visual Studio failed when it couldn’t find Visual Studio (not really surprising), so I used the Azure Sphere SDK for Windows.

Connecting the hardware

I plugged in the Avnet Azure Sphere Starter Kit, using the supplied USB cable, and watched as Windows installed drivers after which a virtual network interface was present and three COM ports appeared in Device Manager.

Setting up my dev environment

Installing Visual Studio Code and the Azure Sphere SDK was only the first part of getting ready to create code for the device. I needed to install the Azure Sphere extension (easily found in the Extensions Marketplace):

The Azure Sphere extension also installs two dependencies:

  • C/C++
  • CMake Tools

I also needed to install CMake (in my case, version 3.17.1). Not really knowing what I was doing, I followed the defaults but, on reflection, I probably should have let CMake add its directory to the system %PATH% variable (I later uninstalled and reinstalled CMake to do this, but could just have added C:\Program Files\CMake\bin to the Path in the user environment variables).

The final installation was Ninja. Windows Defender SmartScreen blocked this app, but I was later able to work around that by unblocking it in the properties for ninja.exe:

I missed the point in the Microsoft documentation that said I needed to manually add Ninja to the %PATH% environment variable but I later went back and added the folder that I copied ninja.exe to (which, for me, was C:\Users\%username%\Tools).
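
For anyone following along, the quick-and-dirty way to make both Path additions from a command prompt is something like the sketch below (the paths are the ones from my setup, so adjust to wherever CMake is installed and ninja.exe was copied). Be aware that setx truncates values over 1024 characters and writes the combined machine and user Path back into the user Path, so editing the Path through System Properties > Environment Variables, as I eventually did, is the cleaner option:

REM Appends the CMake and Ninja folders to the user Path (only affects new command prompts)
setx PATH "%PATH%;C:\Program Files\CMake\bin;C:\Users\%username%\Tools"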

(The above steps were my second attempt – the first time I installed MinGW-W64 to work around issues when Visual Studio Code couldn’t find a compiler, together with several changes in settings.json. I later removed all of that and managed to compile and deploy a sample application using just the settings above…)

Configuring the Azure Sphere device for use

There are a few steps required to configure the device for use. These are all completed using the Azure Sphere Developer Command Prompt, which was installed earlier with the SDK.

Creating an Azure Sphere tenant and claiming the device

Each Azure Sphere device must be “claimed” and associated with a “tenant”. I followed the Microsoft documentation to do this…

azsphere login --newuser user@domain.tld

After completing Multi-Factor Authentication (MFA) and confirming I wanted to allow Azure Sphere to use my account, I was logged in but with a warning that I don’t have access to any Azure Sphere tenants, so I created one:

azsphere tenant create --name "Mark Wilson"

Warning – more research required: I used a Microsoft Account, as per the Microsoft instructions, but am now concerned I should have used an Azure Active Directory (Organisational/Work or School) account (especially as Role Based Access Control is supported from Azure Sphere 19.10 onwards). As a device can only be claimed once and, once claimed, the device is permanently associated with the Azure Sphere tenant, I’m stuck with these settings now…

I then went ahead and claimed the device:

azsphere device claim

Connecting to Wi-Fi and updating the device operating system

I checked the current OS version on the device:

azsphere device show-deployment-status

The output showed that not only was the OS out of date, but the device was not connected to a network, so I connected it to Wi-Fi:

azsphere device wifi show-status
azsphere device wifi add --ssid "SSID" --psk password
azsphere device wifi show-status

Now, with network connectivity in place, the device had a fighting chance of an OS update and according to the Microsoft documentation:

The Azure Sphere device checks for Azure Sphere OS and application updates each time it boots, when it initially connects to the internet, and at 24-hour intervals thereafter. If updates are available, download and installation could take as much as 15-20 minutes and might cause the device to restart.

Configure networking and update the device OS

I tried several restarts using azsphere device restart with no success. In the end, I left the device connected overnight and, by the morning, it had updated to 20.03.

Finally, I enabled application development on the device, ready to download some code and deploy an application:

azsphere device enable-development

Downloading a sample app

My initial attempts to use the app that I actually wanted didn’t work, so I decided to test my setup with one of the Microsoft Quick Starts.

I needed to use git to clone the Azure Sphere Samples Repo, so that meant installing git. Then, from the Terminal in Visual Studio Code, I ran git clone https://github.com/Azure/azure-sphere-samples.git.

I then opened the Samples\HelloWorld\HelloWorld_HighLevelApp folder in Visual Studio Code, ready to build and deploy the app.

Building and deploying the app

Having set up my dev environment, set up the device and downloaded some sample code, I followed the instructions in the Visual Studio Code Azure Sphere Extension to run the following in the Command Palette: Azure Sphere: Configure Settings (selecting High-Level Application) and CMake: Build.

I was then able to build and deploy the sample app to my Azure Sphere device by starting a debug session (F5), and was rewarded with a blinking LED on the board!

Azure Sphere Starter Kit with blinking LED

I can also view the application status with azsphere device app show-status.

Next steps

The next step is to get the app I really wanted to use working on the device, making use of some of the on-board sensors and then integrating this with some of the Azure services. I’m having trouble compiling that code at the moment, so that blog post may be a while longer…

Further reading

When is agile not Agile? And why Waterfall is not always wrong!


In 2004 (when I started writing this blog), I was working for a company called conchango*. The developers talked a strange language – about Scrum and XP – and it was nothing to do with Rugby Union or Windows but it did have something to do with sprints…

That was my first encounter with Agile software development methodologies. Not being a coder, I haven’t done a huge amount of agile development, with the infrastructure projects I’ve been involved in generally being run using a traditional “waterfall” approach.

These days, things are different. There’s a huge push for Agile projects and the UK Government Digital Service’s Service Manual even says:

“You must use the agile approach to project management to build and run government digital services.

Agile methods encourage teams to build quickly, test what they’ve built and iterate their work based on regular feedback.”

Agile and government services: an introduction

There’s also a lot of confusion in the marketplace. Colleagues and clients alike are using the word “agile” in different ways. And there’s an undertone that agile is the one true way and waterfall is bad.

No!

Agile/agile/agility

Let’s start off by comparing uses of the word “agile” (in IT) and what they mean:

  1. Agile (big A) often relates to a methodology – for example APMG International’s AgilePM project management methodology or the AgileBA approach to business analysis – but really they have their roots in Agile software development, with the Agile Manifesto, written in 2001.
  2. When we talk about being agile (small a), it’s a mindset: the approach taken. Literally, being able to adapt to change and to move quickly. We might use Agile (big A) approaches to help increase our agility (small a).
  3. Agility is about reaction to change. Many businesses want to be agile. That doesn’t mean they only run projects with Agile approaches. It means they want the ability to flex and change in line with business requirements.
  4. And then there’s the UK public sector. Specifically Police, who for some reason refer to what the rest of us consider to be remote/mobile working as agile working (as shown in this Agile Working Policy from West Yorkshire Police). That’s just an anomaly.

So that’s Agile/agile/agility sorted then. There are Agile frameworks/methodologies/approaches to delivering outcomes in a more agile manner, to increase organisational agility.

Agile=good, waterfall=bad?

Now, waterfall. If Agile is the one true way, waterfall must be old hat and avoided at all costs, right?

Not at all.

Agile projects work well for quickly creating a minimum viable product (MVP) and iterating development – for example as a series of sprints. They are great when there is a known problem but the requirements are less clear. The solution can evolve in line with the definition of the requirements. The requirements may change as the solution develops: respond to market changes; adapt to new requirements; fail fast.

But some projects are less defined. In a 2018 blog post, Matt Ballantine (@ballantine70) referred to unknown problems with unknown solutions as tinkering. That seems fair – if you don’t know what the issue is, then you can’t have a solution!

Similarly, unknown problems with a known solution. That’s nonsense. Or “WTF?” as Matt so succinctly puts it in his 2×2 diagram:

Matt Ballantine's 2x2 diagram of which path to take, including Agile and Waterfall approaches
Matt Ballantine’s 2×2 for which path to take, including Agile and Waterfall approaches (used with kind permission).

You’ll see though, that there is a place for waterfall project management. Waterfall works when there is a known problem and a known solution. Instead of constantly iterating towards an end, work out the steps to go straight there. It will almost certainly be more efficient. Waterfall projects are based on the golden triangle of time/cost/quality (which together define scope). A known deliverable (scope) bounded by how fast/cheap/good you want it to be – and there’s always a trade-off.

So there we have it. Agile is not a silver bullet and there is still a place for waterfall projects.

What to use, when?

In my line of work, Cloud Transformation might appear to use a combination of Agile and Waterfall approaches. We might create a virtual datacentre in Azure or AWS and take an iterative approach to migrating workloads but that’s still really just Waterfall with incremental delivery – even if a Kanban approach is used to inject some urgency! Similarly migrating batches of mailboxes to the cloud is just iteration, as is a programme that’s adopting Office 365 workloads one by one. An Agile approach comes into its own when we think about Business Transformation, or Digital Transformation, where we can define an MVP and then use sprints to iterate development of a set of new business processes or the digital tools to deliver those processes in a new way.

Further reading

For a clear definition of Cloud, Business and Digital Transformation, see my blog post from last year: “Digital Transformation – it’s not about the Technology“.

* The small c is not a typo – that was the branding!

Weeknote 16/2020: new certifications, electronic bicycle gears, and a new geek TV series


Another week, another post with some of the things I encountered this week that might be useful/of interest to others…

Fundamentally certified

Last week, I mentioned I had passed the Microsoft Power Platform Fundamentals exam (and I passed the Microsoft Azure Fundamentals and Microsoft 365 Fundamentals exams several months ago). This week, I added Dynamics 365 Fundamentals to that list, giving me the complete set of Microsoft Fundamentals certifications.

That’s 3 exams in 7 “working” days since I was furloughed, so I think next week I’ll give the exams a bit of a rest, knock out some blog posts around the things I’ve learned and maybe play with some tech too…

Website move

Easter Monday also saw this website move to a new server. The move was a bit rushed (I missed some communications from my hosting provider) and had some DNS challenges, but we took the opportunity to force HTTPS and it seems a little more responsive to me too (though I haven’t run any tests). For a long time, I’ve been considering moving to Azure App Service – if only for reasons of geek curiosity – but the support I receive from my current provider means I’m pretty sure it will be staying put for the time being.

The intersection of cycling and technology

Those who follow me on Twitter are probably aware that for large parts of the year, I’m “Cyclist’s Dad”. At weekends in the autumn, I can usually be found in a muddy field somewhere (or driving to/from one), acting as pit crew, principal sponsor and Directeur Sportif for my eldest son – who loves to race his bike, with cyclo-cross as his favourite discipline.

This weekend, we should have been at Battle on the Beach (not technically cyclo-cross but still an off-road race) but that’s been postponed until the Autumn, for obvious reasons.

Instead, we’ve been having fun as my son upgraded his CX bike to electronic gears, using a Shimano Ultegra/GRX Di2 mix.

It’s all been his work – except a little help from Olney Bikes to swap over the bottom bracket (as I lack the tools for changing press-fit BBs) – and the end result is pretty spectacular (thanks also to Corley Cycles/@CorleyCycles for their help sourcing some brake hose inserts at short notice). I’ve never had the good fortune (or budget) for electronic shifting on my bikes, but having ridden his yesterday (long story involving a mid-ride puncture on my bike) I was blown away by the smooth shifting and by the difference made by all the components he’s swapped to save weight. Oh yes, and it’s finished with a gold chain. I mean, who doesn’t need a gold chain on their bike?

Electronic shifting has its critics but first impressions, based on a couple of off-road rides this weekend, are very positive. Maybe I need to get a couple of newspaper delivery rounds to start saving for upgrades on my bikes…

TV

Right, it’s getting late now and Sunday night is a “school night” (especially true since my Furlough Leave is being spent focusing on learning and development). I’m off to watch an episode of the BBC’s new drama, “Devs”, before bed. I’m 4 episodes in now and it’s a bit weird but it’s got me hooked…

Weeknote 15/2020: a cancelled holiday, some new certifications and video conferencing fatigue


Continuing the series of weekly blog posts, providing a brief summary of notable things from my week.

Cancelled holiday #1

I should have been in Snowdonia this week – taking a break with my family. Obviously that didn’t happen, with the UK’s social distancing in full effect but at least we were able to defer our accommodation booking.

It has been interesting though: being forced to be at home has helped me to learn to relax a little… there’s still a never-ending list of things that need to be done, but they can wait a while.

Learning and development

Last week, I mentioned studying for the AWS Cloud Practitioner Essentials Exam and this week saw me completing that training before attempting the exam.

It was my first online-proctored exam and I had some concerns about finding a suitable space. Even in a relatively large home (by UK standards), with a family of four (plus a dog) all at home, it can be difficult to find a room with a guarantee of not being disturbed. I’ve heard of people using the bathroom (and I thought about using my car). In the end, thanks to some advice from colleagues – principally Steve Rush (@MrSteveRush) and Natalie Dellar (@NatalieDellar) – as well as some help from Twitter, I managed to cover the TV and some boxes in my loft room, banish the family, and successfully pass the test.

With exam 1 under my belt (I’m now an AWS Certified Cloud Practitioner), I decided to squeeze another in before the Easter break and successfully studied for, and passed, the Microsoft Power Platform Fundamentals exam, despite losing half a day to some internal sales training.

In both cases, I used the official study materials from Amazon/Microsoft and, although they were not everything that was needed to pass the exams, the combination of these and my experience from elsewhere helped (for example having already passed the Microsoft Azure Fundamentals exam meant that many of the concepts in the AWS exam were already familiar).

Thoughts on the current remote working situation

These should probably have been in last week’s weeknote (when it wasn’t yet the school holidays, so we were also trying to educate our children) but recently it’s become particularly apparent to me that we are not living in times of “working from home” – this is “at home, during a crisis, trying to work”, which is very different:

Some other key points I’ve picked up include that:

  • Personal, physical and mental health is more important than anything else right now. (I was disappointed to find that even the local Police are referring to mythical time limits on allowed exercise here in the UK – and I’m really lucky to be able to get out to cycle/walk in open countryside from my home, unlike so many.)
  • We should not be trying to make up for lost productivity by working more hours. (This is particularly important for those who are not used to remote working.)
  • And, if you’re furloughed, use the time wisely. (See above re: learning and development!)

Video conference fatigue

Inspired by Matt Ballantine’s virally-successful flowchart of a few years ago, I tried sketching something. It didn’t catch on in quite the same way, but it does seem to resonate with people.

In spite of my feelings on social video conferencing, I still took part in two virtual pub quizzes this week (James May’s was awful whilst Nick’s Pub Quiz continues to be fun) together with trans-Atlantic family Zooming over the Easter weekend…

Podcast backlog

Not driving and not going out for lunchtime solo dog walks has had a big impact on my podcast-listening…

I now need to schedule some time for catching up on The Archers and the rest of my podcasts!

Remote Work Survival Kit

In what spare time I’ve had, I’ve also been continuing to edit the Remote Work Survival Kit. It’s become a mammoth task, but there are relatively few updates arriving in the doc now. Some of the team have plans to move things forward, but I have a feeling it’s something that will never be “done”, will always be “good enough” and which I may step away from soon.

Possibly the best action film in the world…

My week finished with a family viewing of the 1988 film, “Die Hard”. I must admit it was “a bit more sweary” than I remembered (although nothing that my teenagers won’t already hear at school), but whilst researching the film classification it was interesting to read how it was changed from an 18 to a 15 with the passage of time.

Weeknote 14/2020: Podcasting, furlough and a socially-distanced birthday


We’re living in strange times at the moment, so it seems as good an opportunity as ever to bring back my attempts to blog at least weekly with a brief precis of my week.

In the beginning

The week started as normal. Well, sort of. The new normal. Like everyone else in the UK, I’m living in times of enforced social distancing, with limited reasons to leave the house. Thankfully, I can still exercise once a day – which for me is either a dog walk, a run or a bike ride.

On the work front, I had a couple of conversations around potential client work, but was also grappling with recording Skills Framework for the Information Age (SFIA) skills for my team. Those who’ve known me since my Fujitsu days may know that I’m no fan of SFIA and it was part of the reason I chose to leave that company… but it seems I can’t escape it.

Podcasting

On Monday evening, I stood in for Chris Weston (@ChrisWeston) as a spare “W” on the WB-40 Podcast. Matt Ballantine (@Ballantine70) and I had a chat about the impact of mass remote working, and Matt quizzed me about retro computing. I was terrible in the quiz but I think I managed to sound reasonably coherent in the interview – which was a lot of fun!

Furlough

A few weeks ago, most people in the UK would never have heard of “Furlough Leave”. For many, it’s become common parlance now, as the UK Government’s Job Retention Scheme becomes reality for hundreds of thousands, if not millions of employees. It’s a positive thing – it means that businesses can claim some cash from the Government to keep them afloat whilst staff who are unable to work due to the COVID-19/Coronavirus crisis restrictions are sent home. In theory, with businesses still liquid, we will all have jobs to go back to, once we’re allowed to return to some semblance of normality.

On Tuesday, I was part of a management team drawing up a list of potentially affected staff (including myself), based on strict criteria around individuals’ current workloads. On Wednesday it was confirmed that I would no longer be required to attend work for the next three weeks from that evening. I can’t provide any services for my employer – though I should stay in touch and personal development is encouraged.

Social distancing whilst shopping for immediate and extended family

So, Thursday morning, time to shop for provisions: stock is returning to the supermarket shelves after a relatively small shift in shopping habits completely disrupted the UK’s “just in time” supply chain. That’s hardly surprising, with a nation prepared to stay in for a few weeks, no more eating at school/work, no pubs/cafés/restaurants, and the media fuelling chaos with reports of “panic buying”.

Right now, after our excellent independent traders (like Olney Butchers), the weekly town market is the best place to go with plenty of produce, people keeping their distance, and fresh air. Unfortunately, with a family of four to feed (and elderly relatives to shop for too), it wasn’t enough – which meant trawling through two more supermarkets and a convenience store to find everything – and a whole morning gone. I’m not sure how many people I interacted with but it was probably too many, despite my best efforts.

Learning and development

With some provisions in the house, I spent a chunk of time researching Amazon Web Services certifications, before starting to study for the AWS Cloud Practitioner Essentials Exam. It should be a six-hour course but I can’t speed up or slow down the video, so I keep stopping to take notes (depending on the presenter), which makes it slow going…

I did do some Googling though, and found that a combination of Soundflower and Google Docs could be used to transcribe the audio!

I also dropped into a Microsoft virtual launch event for the latest Microsoft Business Applications (Dynamics 365 and Power Platform) updates. There’s lots of good stuff happening there – hopefully I’ll turn it into a blog post soon…

#NicksPubQuiz

Saturday night was a repeat of the previous week, taking part in “Nick’s Pub Quiz”. For those who haven’t heard of it – Nick Heath (@NickHeathSport) is a sports commentator who, understandably, is a bit light on the work front right now so he’s started running Internet Pub Quizzes, streaming on YouTube, for a suggested £1/person donation. Saturday night was his sixth (and my family’s second) – with over 1500 attendees on the live stream. Just like last week, my friend James and his family also took part (in their house) with us comparing scores on WhatsApp for a bit of competition!

Another year older

Ending the week on a high, Sunday saw my birthday arrive (48). We may not be able to go far, but I did manage a cycle ride with my eldest son, then back home for birthday cake (home-made Battenberg cake), and a family BBQ. And the sun shone. So, all in all, not a bad end to the week.