Author: Mark Wilson

  • Working with legacy tech: accessing old web portals that use an insecure TLS version

    In my last post, I wrote about importing MiniDV tape content to a modern computer. That leads nicely into today’s topic… because modern computers tend not to have huge amounts of local storage. We generally don’t need it, because we store our files in the cloud, and only use the local drive as a cache. But what about when you’re importing large amounts of data (say video), and you want somewhere to stage it locally, with a little more space?

    I was about to throw away an old NetGear ReadyNAS Duo unit (that’s been in disgrace ever since a disk failure taught me the hard way that RAID1 is not a backup…), but then I thought it might be useful to stage some video content, before moving it somewhere safer.

    Accessing the ReadyNAS

    First problem was knowing what IP address it had. I managed to find that using NetGear’s RAIDar utility. But, to change the IP address (or any other settings), I needed to use the admin console. And that gave me a problem: my browser refused to connect to the site, saying that the connection was not secure and that it uses an unsupported protocol.

    Well, it’s better than a modern cutesy “Something went wrong” message. It gave me a clue as to the problem – SSL version or cipher mismatch – which sounds like out-of-date TLS. Indeed it is, and Gøran B. Aasen wrote about the challenge in March 2022, along with a potential solution for certain ReadyNAS devices.
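    Incidentally, if you want to confirm that sort of diagnosis, Python’s standard ssl module can probe which TLS versions a server will actually negotiate. This is just a quick sketch – the IP address in the usage comment is a placeholder for whatever RAIDar reports – and note that a modern OpenSSL build may refuse to offer TLS 1.0 from the client side too, which is rather the point.

```python
import socket
import ssl

def make_probe_context(version):
    """Build a client context pinned to a single TLS version.

    Certificate verification is disabled because devices like the
    ReadyNAS present self-signed certificates anyway.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    ctx.minimum_version = version
    ctx.maximum_version = version
    return ctx

def probe_tls(host, port=443, timeout=5.0):
    """Try each TLS version in turn and report which ones negotiate."""
    results = {}
    for name, version in [
        ("TLSv1.0", ssl.TLSVersion.TLSv1),
        ("TLSv1.1", ssl.TLSVersion.TLSv1_1),
        ("TLSv1.2", ssl.TLSVersion.TLSv1_2),
        ("TLSv1.3", ssl.TLSVersion.TLSv1_3),
    ]:
        ctx = make_probe_context(version)
        try:
            with socket.create_connection((host, port), timeout=timeout) as sock:
                with ctx.wrap_socket(sock) as tls:
                    results[name] = True
        except (ssl.SSLError, OSError):
            # Either the server refused this version, or (for TLS 1.0/1.1)
            # the local OpenSSL build refused to offer it at all.
            results[name] = False
    return results

# Usage (substitute the IP address that RAIDar reported):
# print(probe_tls("192.168.1.10"))
```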

    I’m not bothered about upgrading Apache to support TLS 1.2 – but I did still need to administer the device. I tried messing around with browser settings in Edge, Chrome and Firefox but had no luck. The transition period is over. TLS 1.0 is not supported at all. Then I had an idea… what if I installed an older browser version? And instead of installing it, what if I used a portable app version?

    Tada!

    Firefox 73.0.1 being used to access the ReadyNAS admin console

    So, here we go, Firefox 73.0.1 from PortableApps, via SourceForge. And I’m successfully accessing the ReadyNAS admin console.

    The risk statement

    For security folks who will tell me why this is a really bad idea: I know. So here’s the disclaimer. You’re only at risk while you’re using that old browser – and because it’s a portable app, it’s not installed on your system. And you’ll only use that old browser to access this one website, so when you’re not accessing the website, you’ll have closed it down, right? That way you’re taking a calculated risk and mitigating it by minimising the time the old software is running.

    As for publishing an internal IP on the web… yes, you’ve got me there…

    Featured image: author’s own.

  • Working with legacy tech: importing MiniDV tape content to a modern Mac

    It’s a common problem: aging media on legacy technology; valuable content that needs to be preserved. For me, that content is my travels to Australia and South Africa when I was in my 20s; getting married; and early days with our children (who are now adults). And whilst losing some of those videos (never watched or edited) would simply be annoying, the footage of the-tiny-people-who-grew-into-men would be devastating to lose.

    So I have a project, to import as much of the content as I can to a modern computing system. I even used it as a justification to buy myself a new MacBook Pro (it’s “man maths” – don’t ask*).

    What’s the problem?

    1. My digital video camera is a Sony DCR-TRV900E from circa 1998. The media are MiniDV cassettes (digital, but read serially). The interface is FireWire 400.
    2. My Mac has Thunderbolt 3 ports, using the USB-C form factor.

    I needed to connect these two things.

    And the hardware solution?

    The Internet is full of videos about this – and it’s a cable and adapter setup that is so Heath Robinson-esque it shouldn’t work. But it seems to. For now. My task is to import the content before it stops working!

    1. First up, the FireWire 400 cable from the camcorder. FireWire 400 is also known as IEEE 1394 or Sony i.LINK. I already had a suitable cable, possibly supplied with the camera.
    2. Then, a FireWire 400 to 800 adapter. I used this one from Amazon. That’s the inexpensive part.
    3. Next, FireWire 800 to Thunderbolt 2. Apple no longer sells this adapter so it’s expensive if you can find one. There are some very similar-looking ones on AliExpress, but they were only a little less expensive than the genuine Apple one that I found here: Apple Thunderbolt to FireWire 800 Adapter Dongle A1463. I paid £85 (ouch).
    4. Finally, Thunderbolt 2 to Thunderbolt 3. These are still available from Apple (Thunderbolt 3 (USB-C) to Thunderbolt 2 Adapter), but are another £49 so I saved a few quid by buying one second-hand.

    The whole setup looks like this (£150 of electronics… but priceless memories, remember…):

    How to import the footage

    1. Connect the Camcorder to the Mac, using the various cables and adapters.
    2. Put the Camcorder into Video Tape Recorder (VTR) mode and insert a tape. Rewind it to the start.
    3. Fire up iMovie on the Mac and create/open a library.
    4. Click Import Media (from the File menu). Select the camera (e.g. DCR-TRV900E) and you should see the first frame.
    5. Click Import.

    Then, sit back and wait as the contents of the tape are read, serially, and imported into iMovie. Other video editing packages may also work, but I used what I already had installed with macOS. Just remember that the transfer happens in real time – but it’s also an opportunity to get nostalgic about times past. You’ll also need to stop the import when you get to the end of the footage.
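    If you’re wondering how much staging space you’ll need for a project like this, standard-definition DV works out at roughly 13 GB per hour of footage, and the import takes as long as the footage runs. A rough back-of-an-envelope calculation in Python (the 13 GB/hour figure is an approximation, not an exact constant):

```python
# Planning helper for staging space: standard-definition DV runs at about
# 25 Mbit/s of video, which works out at roughly 13 GB per hour of footage
# once audio and overhead are included (approximate figure).
DV_GB_PER_HOUR = 13

def staging_estimate(tapes, minutes_per_tape=60):
    """Estimate disk space and transfer time for a pile of MiniDV tapes.

    MiniDV imports in real time, so the transfer takes as long as the
    footage runs.
    """
    hours = tapes * minutes_per_tape / 60
    return {
        "hours_of_footage": hours,
        "transfer_hours": hours,
        "approx_gigabytes": hours * DV_GB_PER_HOUR,
    }

# Ten 60-minute tapes: about 10 hours of importing and ~130 GB of space.
print(staging_estimate(tapes=10))
```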

    Now, does anyone have an Advanced Photo System (APS) film attachment for a Sony Coolscan IV ED film scanner?

    *I do have functioning Mac Minis from 2006 and 2012 with FireWire connections, but the 2006 Mini is long-unsupported and the versions of macOS and iMovie it has are alarmingly dated. The 2012 one is better, but I found I was getting application crashes on import. That drove me to invest in a solution that would work with my M4 MacBook Pro…

    Featured image: author’s own.

  • Sorting out my smart home

    A few years ago, I bought some Philips Hue lights. Then I extended the system with some Innr bulbs, and some IKEA Trådfri lights too. It’s mostly worked OK: the Philips and Innr stuff is flawless; the IKEA kit has its foibles (but was cheap).

    More recently, I started to look at Home Assistant and I installed a pre-built “Home Assistant Green” unit. One of the things I bought was a SkyConnect dongle, for Zigbee support. When I set up Home Assistant, it discovered my Hue bridge and all the lights, so I didn’t use the SkyConnect at first – until I started to add some sensors that Hue didn’t recognise.

    Consolidating Zigbee networks

    I couldn’t understand why my lights were not acting as Zigbee routers for the door/window sensors I had purchased – and then I realised that I actually have two Zigbee networks in the house now! So I started to move the Hue lights over to the Zigbee Home Assistant (ZHA) network that the SkyConnect had established.

    The next problem was, predictably, an IKEA bulb I’d purchased. When I originally joined it to Hue, I had problems with the official app and I needed to use TouchLink via the Hue Essentials app. I’m having no such luck on ZHA – I can’t even get the bulb into pairing mode… so it’s presently just a normal (not smart) lamp…

    Extending the reach

    With most of the Hue and Innr lamps now migrated to ZHA, I’ve started to add other Zigbee devices to the network. Firstly, I have some cheap (£3) temperature and humidity sensors from Tuya that I bought on AliExpress. One of these monitors the levels overnight in the Man Cave, turning on a radiator if necessary to keep the humidity down. I’ve also put a Tuya Zigbee signal repeater in the kitchen, to help boost the signal towards the garden (at least until I put in some more bulbs that will act as routers).
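    For anyone curious, that humidity rule is simple to express against Home Assistant’s REST API – though in practice it’s better built as a native automation in the Home Assistant UI. In this sketch the host, token, threshold, and the sensor/switch entity IDs are all placeholders that you’d substitute for your own:

```python
import json
import urllib.request

# Sketch of the Man Cave humidity rule, driven through Home Assistant's
# REST API. The host, token, and entity IDs below are placeholders:
# substitute your own instance, a long-lived access token, and real
# entity names from your setup.
BASE = "http://homeassistant.local:8123/api"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def radiator_should_run(humidity, threshold=65.0):
    """Pure decision logic: run the radiator when humidity is too high.

    The 65% threshold is an arbitrary example value, not from the post.
    """
    return humidity > threshold

def get_state(entity_id):
    """Read an entity's current state via /api/states/<entity_id>."""
    req = urllib.request.Request(f"{BASE}/states/{entity_id}", headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def call_service(domain, service, entity_id):
    """Call a service, e.g. switch.turn_on, via /api/services/..."""
    payload = json.dumps({"entity_id": entity_id}).encode()
    req = urllib.request.Request(
        f"{BASE}/services/{domain}/{service}", data=payload, headers=HEADERS
    )
    urllib.request.urlopen(req).close()

def keep_cabin_dry():
    """Check the sensor and switch the radiator accordingly."""
    humidity = float(get_state("sensor.man_cave_humidity")["state"])
    action = "turn_on" if radiator_should_run(humidity) else "turn_off"
    call_service("switch", action, "switch.man_cave_radiator")
```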

    I’ve since found that the SkyConnect is not necessarily the best Zigbee co-ordinator (people seem to recommend a Sonoff model instead), but it has the advantage of being supported by Nabu Casa – the creators of Home Assistant. And then there’s the choice between ZHA and some other competing approaches… let’s not go into that here.

    A couple more things…

    A year or so ago, I wrote about using NFC tags to automate some of the routines in my house. That’s all been moved across to Home Assistant now, with the advantage that the automations are no longer device-specific and anyone with the Home Assistant app can scan a tag.

    As for the app… after an incident last summer when the garden lights turned off (on a timer) and my wife wanted to switch them back on but didn’t know how, I gave access to the app. She is pleased that she now has “the power”. I am pleased that the system is usable by non-geeks. I do need to learn how to create Home Assistant dashboards though, because the list of connected devices and entities is getting quite extensive.

    I still can’t get my family to stop turning off the lights at the switch though… so I need to find some UK-compatible Zigbee smart switches that will be acceptable for use!

    Featured image by Gerd Altmann from Pixabay.

  • Monthly retrospective: January 2025

    Last year I tried a thing – another attempt at weeknotes. Weeknotes became monthly retrospectives. Monthly retrospectives sometimes became every two months… and then they dried up completely last summer. I’m sorry. I was busy and, to be honest, this blog is not as important to me as it once was.

    But then, an anonymous commenter said that they miss them and asked me to fill the gap to the end of 2024. That might happen (or it might join the great list of unwritten blog posts in the sky), but let’s have another go at the present. So, 31 January, 2025. Monthly retrospective…

    At work

    Things have really stepped up a gear at work. Last year I started to work on a future vision around which the Office of the CTO could structure its “thought leadership” content. Some important people supported it and I found myself co-presenting to our executive board. The next steps will remain confidential, but it’s no bad thing for me. And, the follow-on work has given me a lot of exposure to some of the marketing activities – my last fortnight has been full of market analysis and ideal client profiles.

    But the last fortnight was not just those things. I had the harebrained idea that, as productivity is one of the outcomes we seek for our clients, maybe we should “do something for National Productivity Week”. After writing a series of blog posts (see below), and a fun day recording video content with our brand team, it feels like a one-man social media takeover. In fact, we had so much content that some of it will now have to go out next week. But that’s OK – productivity is not just for one week of the year. These are the posts that are live on the Node4 website today:

    And the last post, next week, will be about building sustainable productivity approaches.

    There are also a couple of videos up on LinkedIn:

    And, earlier in the month (actually, it sneaked out on YouTube before Christmas but I asked for it to be pulled for an edit), there was this one. Not my best work… but it did lead to the purchase of a teleprompter which has made later videos so much easier!!!

    Learning

    Also on the work front, this month I completed my ILM Level 5 Award in Leadership and Management. Node4 runs this as part of a 7-month programme of workshops, with two coursework assignments that relate to four of the workshops. Over the last 7 months, I’ve covered:

    • Developing your personal leadership brand.
    • Inclusive leadership and motivation skills.
    • Managing and implementing strategic change.
    • Developing a High-performance team culture.
    • Manager as a coach.
    • Personal impact and emotional intelligence.
    • High impact presentations.

    At home

    Home Automation

    I bought myself a Shelly temperature and humidity monitor for the Man Cave. It’s Home Assistant compatible, of course, so it lets me use cheap overnight energy to stop the cabin from getting too cold or damp.

    Also on the home automation front, I picked up some cheap Tapo P100 smart plugs. Like my no-name Chinese ESP32-based plugs, they are a better form factor than my older Kasa HS100/110 plugs, so they don’t take space from the adjacent socket. But they lack any kind of energy usage reporting, so I should have bought a pack of the slightly more expensive P110 models instead. I also struggled to add them to Home Assistant: they were recognised but wouldn’t authenticate until I reset my TP-Link password (which seemed to be the workaround – even though the password stayed the same)!

    Getting away from it all

    Aside from the tech, Mrs Wilson and I got away to London for a weekend, to celebrate a friend’s birthday. We were almost blown away by the tail of Storm Éowyn at Primrose Hill viewpoint but had fun (I’d never been before, but it’s in so many films!).

    Tomorrow, I’m off to France for the UCI Cyclocross World Championships. Not competing of course (and disappointed that British Cycling is not sending a Women’s team or an U23 Men’s team). Just spectating. And probably consuming quite a lot of beer. And frites.

    Writing

    There have been some personal blog posts this month too:

    In pictures

    Some snaps from my Instagram:

  • Digital transformation is only as good as the supporting processes

    Earlier today, I received a penalty charge notice. I’d dropped my son at the airport a couple of weeks ago – and thought my car was registered to auto-pay the Heathrow terminal drop-off charge. Apparently it’s not, because APCOA wrote to me demanding £80 for my mistake, reduced to £40 if I paid within 14 days (five of which had already passed because they used a slow postal service). Hey ho. It was a mistake. One which I’ll hopefully not make again. It’s annoying though.

    It reminded me of another letter on my desk. You see my confusion about autopayment came about because I do have AutoPay set up for the various London charges – Congestion Charge, ULEZ, Dartford Crossing. All the Transport for London (TfL) ones but not Heathrow Airport, it would seem…

    A mystery charge based on flawed process

    Last month, I was checking the transactions on my credit card and I spotted an extra charge from TfL. It seemed strange, so I logged on to my account. My son had driven into the ULEZ one day (which I knew about and expected the charge for), but there was another charge that we didn’t recognise.

    Our car’s registration is KU07 ABC (it’s not really, but I’ve changed the details enough to tell the story, without publishing personal information). When I checked my online account, it showed a picture of KO07 ABC. But the ANPR had identified it as KD07 ABC. KD07 ABC is not a valid registration, so somewhere, either a human or an AI had decided that the charge should be allocated to our car. I suspect it was either based on the fact that our car had been driven in the ULEZ zone previously, or because someone has to check these things manually and they get very, very bored. Regardless, our Volkswagen Golf was not the Seat Ibiza in the photo.
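    Out of curiosity, here’s a speculative sketch of how that sort of allocation might happen: fuzzy-matching an unrecognised plate against registrations previously seen in the zone. To be clear, the plates and the matching logic here are entirely invented for illustration – this is not how TfL actually does it:

```python
from difflib import SequenceMatcher

# Speculative reconstruction of how a misread plate might be allocated:
# fuzzy-match the unrecognised registration against plates previously
# seen in the zone. All registrations here are invented, and this is
# purely illustrative -- not TfL's actual process.
SEEN_PLATES = ["KU07 ABC", "AB12 XYZ", "CD34 EFG"]

def closest_plate(misread, seen):
    """Return the best fuzzy match and its similarity ratio (0..1)."""
    scored = [(plate, SequenceMatcher(None, misread, plate).ratio()) for plate in seen]
    return max(scored, key=lambda pair: pair[1])

# The ANPR misreading "KD07 ABC" is closest to "KU07 ABC" -- so that is
# where the charge lands, right or wrong.
print(closest_plate("KD07 ABC", SEEN_PLATES))
```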

    The cheque’s in the post

    I contested the charge and was pleased to get an email a few days later that confirmed my complaint had been upheld, based on the evidence provided (TfL’s own photos from my account). But the part that amused me was this – the refund for this highly automated digital charging process was to be sent in the form of a cheque.

    So, I have a very analogue cheque for £12.50, to pay into my account (as it’s 2025, I shall do this with a digital photo), but all as the result of a digital process that doesn’t quite work…

    Postscript

    A couple of days after I wrote this post, my Nectar card was used fraudulently. Someone managed to spend 1000 points (I think that’s £5) but it wasn’t me.

    I contacted Nectar, who stopped the card and will issue another. But the process I had to go through was terrible. Before I could start an online chat session I needed to provide name, card number, and various other details. Then I reached an awful chat interface using Oracle software, which worked in a mobile browser but was in no way optimised for the screen I was using.

    The agent then proceeded to ask me for the same details I had already provided. By this point I was very nervous about phishing attempts and reluctant to provide any information. It turned out to just be a shockingly bad process.

    Featured image: Traffic sign image licensed under the Open Government Licence version 1.0 (OGL v1.0).

  • A few thoughts on the UK Government’s AI announcement

    Most of the text in this post previously appeared on my LinkedIn feed. I thought it should have been here…

    Sometimes, I read something on LinkedIn and repost or comment, before realising I’ve pretty much written an entire blog post. On my phone. Twice, because I navigated away and lost the first attempt. Maybe I should have put it here, but it probably gets seen by more people on LinkedIn. Still, I own this platform, so I’m putting it up for posterity.

    The post in question was one from the BBC’s Technology Editor, Zoe Kleinman. Zoe had posted insights about the UK Prime Minister’s “bold and ambitious plans to support the UK’s AI sector”.

    Zoe’s post and articles are well worth a read, but I wanted to add some more:

    “[…] I can see why the UK wants to position itself as an innovative place for growth, without being (quite as) reliant on US tech behemoths, but most of us have yet to establish what we want to use AI for.

    Sure, “AI” is the perceived answer to everything at the moment – and there are some very large companies with very deep pockets pouring billions into “AI” – but it’s an arms race. “Big tech” hasn’t worked out how to make money from its AI investments yet. The tech giants just want to make sure they have a big slice of that pie when we do finally get there.

    Putting aside the significant environmental and social challenges presented by AI (as mentioned in Zoe’s post […]), “we” (our companies and our countries) haven’t got a solid business case. We just know we can’t afford to be left behind…

    We’ve used some AI technologies in a variety of forms for years (for example, machine learning) – and the recent advances in generative AI (genAI) have democratised access to AI assistants and opened a huge opportunity. But genAI is just one type of AI, and we don’t fully understand the large language models that underpin it.

    One thing that sticks in my mind is something I heard on a recent podcast, when Johannes Kleske commented something along the lines of “when it’s in the future, it’s AI. Once we have worked out what to do with it, it’s just software.”

    More on the UK Prime Minister’s AI announcement

    Artificial Intelligence: Plan to ‘unleash AI’ across UK revealed [BBC News]

  • Are you in the UK and looking at using Apple AirPods Pro 2 as hearing aids? Read this first!

    I’m sorry for the clickbait headline, but the urgency is real, because I’m seeing people making purchasing decisions based on a technical feature that’s not available in the UK yet.

    If you’re a middle-aged man or woman, you may have noticed that it’s difficult to hear people in some social situations. I certainly have, and so have some of my friends. Generally in pubs and bars with hard surfaces and lots of background noise.

    I tell myself that I need to get a professional hearing test. I keep trying at Specsavers when I have my eyes tested but have struggled with appointment availability. And anyway, it’s not that bad. Plus I don’t have a couple of thousand pounds ready for buying hearing aids.

    Apple is bringing Hearing Health capabilities to the masses

    When I heard that Apple AirPods Pro 2 have hearing aid capabilities, I was very interested. A consumer tech device that might help me in those limited circumstances when I need to wear a hearing aid, without the financial outlay.

    It’s been possible to create an audiogram and use it with your AirPods (or other headphones) for a while, but there’s lots of excitement as Apple brings Hearing Health capabilities natively to the iPhone with iOS 18 and AirPods Pro 2. But, if you’re in the UK, you might want to hold off…

    Here’s the problem: AirPods Pro 2 do not yet have regulatory approval as hearing aids in the UK.

    They do in many other countries, but not here. Not at the time of researching this post in late-November 2024. But there is a global website, and a global ad campaign. Apple even says in the notes for this ad that:

    “The Hearing Test and Hearing Aid features are regulated health features that require approval and will be offered after authorization is received. Feature availability varies by region”

    Unfortunately, I’ve seen people (including those with profound hearing loss) saying they will ask Santa for some AirPods Pro for Christmas, based on advertising this feature.

    So, what can you do?

    1. Firstly – and I rarely give this advice to anyone – turn off automatic updates. Do not let your iPhone update to iOS 18.x; manually apply updates for 17.x instead. Of course, that means you won’t get other iOS 18 goodness, but Apple Intelligence isn’t available in the UK yet either (like the Hearing Aid feature, it’s “coming soon”).
    2. Then, download the Mimi app, find a quiet space and carry out a hearing test. Follow these instructions to save the audiogram to Apple Health and set up the Headphone Accommodations for your AirPods. Basically, you can get some of what Apple will bring to the UK, but only with older operating systems that don’t have the Apple capabilities built in (and turned on for other regions).
    3. Finally, keep an eye on the Apple website. This is the page that has the details on regional availability for the Apple Hearing Health features.

    One more thing

    The new Hearing Health features are for Apple AirPods Pro 2. I checked mine: they are listed on my receipt as “AirPods Pro (2nd generation)”. Is that the same thing? The short answer is “yes”, but it took me a while to get that information.

    I had an infuriating online chat with Apple Support, who seemed incapable of understanding my question, despite me providing serial numbers and product codes. Thankfully, I also found an Apple support article, which gave me the answer (yes). Mine are model number A3048 which is now called “AirPods Pro 2 with MagSafe Charging Case (USB-C)”. Why can’t they just say “the marketing folks changed the name”?

    Featured image by Miguel Angel Avila on Unsplash.

  • Microsoft Ignite 2024 on a page

    You probably noticed, but Microsoft held its Ignite conference in Chicago last week. As is normal now, there’s a “Book of News” for all the major announcements and the keynotes are available for online review. But there’s an awful lot to sort through. Luckily, CNET created a 15 minute summary of Satya Nadella’s keynote:

    Major announcements from Ignite 2024

    Last year, I wrote about how it was clear that Microsoft is all about Artificial Intelligence (AI) and this year is no different. The rest of this post focuses on the main announcements with a little bit of analysis from yours truly on what the implications might be.

    • Investing in security, particularly around Purview – Data governance is of central importance in the age of AI, and Microsoft announced updates to prevent oversharing, risky use of AI, and misuse of protected materials. With one of the major concerns being accidental access to badly-secured information, this will be an important development for those who make use of it. https://aka.ms/Ignite2024Security/

    • Zero Day Quest – A new hacking event with $4m in rewards. Bound to grab headlines! https://aka.ms/ZeroDayQuest

    • Copilot as the UI for AI – If there’s one thing to take away from Ignite, it’s that Microsoft sees Copilot as the UI for AI (it becomes the organising layer for work and how it gets done):

    1. Every employee will have a Copilot that knows them and their work – enhancing productivity and saving time.
    2. There will be agents to automate business processes.
    3. And the IT department has a control system to manage, secure and measure the impact of Copilot.

    • Copilot Actions – Intended to reduce the time spent on repetitive everyday tasks, these were described as “Outlook Rules for the age of AI” (but for the entire Microsoft 365 ecosystem). I’m sceptical about these but willing to be convinced – let’s see how well they work in practice. https://aka.ms/CopilotActions

    • Copilot Agents – If 2023-24 was about generative AI, “agentic” computing is the term for 2025. There will be agents within the context of a team – teammates scoped to specific roles – e.g. a facilitator to keep meeting focus in Teams and manage follow-ups/action items; a Project Management Agent in Planner, to create a plan and oversee task assignments/content creation; self-service agents to provide information, augmenting HR and IT departments to answer questions and complete tasks; and a SharePoint Agent per site, providing instant access to real-time information. Organisations can create their own agents using Copilot Studio – and the aim is that it should be as easy to create an agent as it is to create a document. https://aka.ms/AgentsInM365

    • Copilot Analytics – Answering criticism about the cost of licensing Copilot, Microsoft is providing analytics to correlate usage with business metrics. Organisations will be able to tune their Copilot usage to business KPIs and show how Copilot usage translates into business outcomes. https://aka.ms/CopilotAnalytics

    • Mobile Application Management on Windows 365 – Microsoft is clearly keen to push its “cloud PC” concept – Windows 365 – with new applications so that users can access a secure computing environment from iOS and Android devices. Having spent years working to bring clients away from expensive thin client infrastructure and back to properly managed “thick clients”, I’m not convinced about the “cloud PC”, but maybe I’m just an old man shouting at the clouds… https://aka.ms/WindowsAppAndroid

    • Windows 365 Link – A simple, secure, purpose-built access device (aka a thin PC). It’s admin-less and password-less, with security configurations enabled by default that cannot be turned off. The aim is that users can connect directly to their cloud PC with no data left locally (available from April 2025). If you’re going to invest in this approach, it could be a useful device – but it’s not a Microsoft version of a Mac Mini – it’s all about the cloud. https://aka.ms/Windows365Link

    • Windows Resiliency Initiative – Does anyone remember “Trustworthy Computing”? Well, the Windows Resiliency Initiative is the latest attempt to make Windows more secure and reliable. It includes new features like Windows Hotpatch, to apply critical updates without a restart across an entire IT estate. https://aka.ms/WinWithSecurity

    • Azure Local – A rebranding and expansion of Azure Stack to bring Azure Arc to the edge. Organisations can run mission-critical workloads in distributed locations. https://aka.ms/AzureLocal

    • Azure Integrated HSM – Microsoft’s first in-house security chip hardens key management without impacting performance. This will be part of every new server deployed on Azure starting next year. https://aka.ms/AzureIntegratedHSM

    • Azure Boost – Microsoft’s first in-house data processing unit (DPU), designed to accelerate data-centric workloads. It can run cloud storage workloads with 3x less power and 4x the performance. https://aka.ms/AzureBoostDPU

    • Preview NVIDIA Blackwell AI infrastructure on Azure – By this point, even I’m yawning, but this is a fantastically fast computing environment for optimised AI training workloads. It’s not really something that most of us will use. https://aka.ms/NDGB200v6

    • Azure HBv5 – Co-engineered with AMD, this was described as a new standard for high-performance computing and cited as being up to 8 times faster than any other cloud VM. https://aka.ms/AzureHBv5

    • Microsoft Fabric Databases – SQL Server is coming natively to Fabric. The aim is to simplify operational databases, as Fabric already did for analytical requirements, providing an enterprise data platform that serves all use cases and makes use of open source formats in the Fabric OneLake data lake. I have to admit, it does sound very interesting, but there will undoubtedly be some nuances that I’ll leave to my data-focused colleagues. https://aka.ms/Fabric

    • Azure AI Foundry – Described as a “first-class application server for the AI age”, unifying all models, tooling, safety and monitoring into a single experience, integrated with development tools as a standalone SDK and a portal, with 1800 models in the catalogue for model customisation and experimentation. https://aka.ms/MaaSExperimentation and https://aka.ms/CustomizationCollaborations

    • Azure AI Agent Service – Build, deploy and scale AI apps to automate business processes. Where Copilot Studio offers a graphical approach, this provides a code-first approach for developers to create agents, grounded in data, wherever it is. https://ai.azure.com/

    • Other AI announcements – There will be AI reports and other management capabilities in Foundry, including evaluation of models. Safety is important too, with tools to build secure AI, including Prompt Shields to detect/block manipulation of outputs, and risk/safety evaluations for image content.

    • Quantum computing – This will be the buzzword that replaces AI in the coming years. Quantum is undoubtedly significant but it’s still highly experimental. Nevertheless, Microsoft is making progress in the quantum arms race, with the “world’s most powerful quantum computer”, with 24 logical qubits – double the previous record. https://aka.ms/AQIgniteBlog

    Featured image: screenshots from the Microsoft Ignite keynote stream, under fair use for copyright purposes.

  • Putting AI to work: making content more accessible

    I’m really struggling with AI right now. On the one hand, it’s infuriating when it doesn’t help me with the task or activity that I prompt it to do, and hallucinates total garbage. I’m also concerned for humanity, as the blind faith that some have in AI seems to me to be pretty dangerous. But, on another level, if we strip away the overenthusiastic promises, and look at the little things that are now embedded in our productivity tools, it can be quite impressive.

    Multiple takes

    Today, I recorded a short video for an internal SharePoint site that supports some work I’m doing around Node4’s 2030 vision and strategy. It took about 8 takes and a lot of wrangling with technology for a simple talking head in front of a background. But I got there and, after an hour or so, I had the minute or so of content I wanted. I didn’t use professional video tools – just the things I have in Microsoft 365 and Windows 11 – I recorded myself in Teams against a custom background I’d created in PowerPoint, used Clipchamp to trim the video, and uploaded to Stream.

    Copilot has entered the room

    Then the cool stuff started. Microsoft Stream generated captions from the video. They weren’t quite right – but the edits were minimal and it let me make them. That means that I now have a working transcript and closed captions – assistive technologies to aid viewers.
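As an aside, closed captions like the ones Stream produces are, under the hood, simple WebVTT text files: a `WEBVTT` header followed by timed cues. Here’s a minimal sketch of building one from a list of caption cues – the function names and the example text are my own, purely for illustration:

```python
def format_timestamp(seconds: float) -> str:
    """Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm)."""
    whole = int(seconds)
    millis = round((seconds - whole) * 1000)
    hours, rem = divmod(whole, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{hours:02}:{minutes:02}:{secs:02}.{millis:03}"


def build_webvtt(cues: list[tuple[float, float, str]]) -> str:
    """Build the text of a WebVTT caption file from (start, end, text) cues."""
    lines = ["WEBVTT", ""]
    for start, end, text in cues:
        lines.append(f"{format_timestamp(start)} --> {format_timestamp(end)}")
        lines.append(text)
        lines.append("")  # blank line terminates each cue
    return "\n".join(lines)


captions = [
    (0.0, 4.5, "Hello, and welcome to this short introduction."),
    (4.5, 9.0, "Today we'll look at our vision and strategy."),
]
print(build_webvtt(captions))
```

The point being: once a service has generated (and you’ve corrected) the transcript, the caption file is just plain text that any player or assistive technology can consume.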

    But the impressive part was when I asked Copilot to summarise the content. This is what it came up with.

    Key Topics:

    • Introduction to Vision2030: Mark Wilson, one of the Technology and Innovation Directors in Node4’s Office of the CTO, introduced Vision2030, a forward-looking strategy that outlines key trends and technologies expected to define the future of mid-market businesses. 0:00
    • Focus of Vision2030: Mark Wilson explained that Vision2030 focuses on integrating emerging innovations across all technology verticals in which Node4 operates, aiming to empower clients to anticipate and adapt to future technology trends, navigate complexity, and increase competitiveness and resilience. 0:21
    • Creating Lasting Value: Mark Wilson emphasized that Vision2030 is not just about adopting new technologies but also about helping clients create lasting value. 0:39
    • Future Content and Sales Plays: Mark Wilson mentioned that over the coming weeks and months, Node4 will create more content that builds on the core theme of enabling productivity and directly links to their sales plays, ensuring a clear connection from Vision through go-to-market to delivery. 0:45
    • Commitment to Mid-Market Businesses: Mark Wilson reiterated Node4’s commitment to guiding mid-market businesses through the transformation, ensuring they are well-positioned to succeed in the future digital economy. 0:57

Spot on. Sure, it had an edited transcription to work from, but now my colleagues don’t even need to watch the video. (Which raises the question of why I recorded it in the first place – to which the answer is choice.)

    Changing the medium

So now, let’s take this a bit further… out of Copilot and Stream and into the real implications of this technology. Starting with a couple of observations:

    • When I’m driving, Apple CarPlay reads my messages to me. Or, I ask Siri to send a message, or to take a note.
• When I’m in a group messaging situation, some people seem to have a propensity to create short-form audio.

I used to think that WhatsApp voice messages were the spawn of the devil. Why should I have to listen to someone drone on for 30 seconds when I could read a text message much more quickly? Is it because they couldn’t be bothered to type? Then someone suggested it might be because they struggle with writing. That puts a different lens on things.

    Create and consume according to our individual preferences

    Now, with technologies like this we can create content in audio/video or written form – and that same content can be consumed in audio/video or written form. We can each use our preferred methods to create a message, and the recipient can use their preferred method to consume it.

    This is the stuff that really makes a difference – the little things that make someone’s life easier – all adding up to a bigger boost in our individual productivity, or just getting things done.

    Featured image – author’s own screenshot

  • Put the big rocks in first

    Put the big rocks in first

    This post previously appeared on my LinkedIn feed. I thought it should have been here…

A few weeks ago, I heard Michelle Minnikin refer to “big rocks first” on the WB-40 Podcast. It rang a bell with me – an approach to prioritising activities – first the big rocks, then the pebbles, then the sand. First attributed to Stephen Covey, it’s based on the story of a professor demonstrating to his class that they need to focus on the important things first, in order of priority, and then fit the minutiae of life around them. I’ve linked a 2-minute video at the end of this post that tells the story.

It seems I’ve used the analogy a lot recently – firstly helping someone manage the things that are making them anxious; now it seems that I’ll be doing the same with my son, prioritising activities to prepare for his A-Levels; and it works in a business context too, when setting goals to achieve strategic aims.

    So, whether it’s helping with mental health, learning about time management, or simply determining the priorities to achieve success, think about your rocks, pebbles, and sand.

    And for a slightly longer (and older) video, here’s a practical demonstration featuring Stephen Covey himself: