Microsoft Ignite | The Tour: London Recap

One of the most valuable personal development activities in my early career was a trip to the Microsoft TechEd conference in Amsterdam. I learned a lot – not just technically but about making the most of events to gather information, make new industry contacts, and generally top up my knowledge. Indeed, even as a relatively junior consultant, I found that dipping into multiple topics for an hour or so gave me a really good grounding to discover more (or just enough to know something about the topic) – far more so than an instructor-led training course.

Over the years, I attended further “TechEd”s in Amsterdam, Barcelona and Berlin. I fought off the “oh Mark’s on another jolly” comments by sharing information – incidentally, conference attendance is no “jolly” – there may be drinks and even parties but those are after long days of serious mental cramming, often on top of broken sleep in a cheap hotel miles from the conference centre.

Microsoft TechEd is no more. Over the years, as the budgets were cut, the standard of the conference dropped and in the UK we had a local event called Future Decoded. I attended several of these – and it was at Future Decoded that I discovered risual – where I’ve been working for almost four years now.

Now, Future Decoded has also fallen by the wayside and Microsoft has focused on taking its principal technical conference – Microsoft Ignite – on tour, delivering global content locally.

So, a few weeks ago, I found myself at the ExCeL conference centre in London’s Docklands, looking forward to a couple of days at “Microsoft Ignite | The Tour: London”.

Conference format

Just as at TechEd and Future Decoded (in the days before I had to spend my time between keynotes on stand duty!), the event was broken up into tracks with sessions lasting around an hour. Because that was a solid hour of content (Microsoft event talks are often scheduled as an hour plus 15 minutes of Q&A), it was pretty intense, and opportunities to ask questions were generally limited to grabbing the speaker after their talk or visiting the “Ask the Experts” stands in the main hall.

One difference from Microsoft conferences I’ve previously attended was the lack of “level 400” sessions: every session I saw was level 100-300 (mostly 200/300). That’s fine – it’s the level of content I would expect – but there may be some who are looking for more detail and, if it’s depth you’re after, Ignite doesn’t seem to be the place.

Also, I noticed that Day 2 had fewer delegates and lacked some of the “hype” from Day 1: whereas the Day 1 welcome talk was over-subscribed, the Day 2 equivalent was almost empty and light on content (not even giving airtime to the conference sponsors). Nevertheless, it was easy to get around the venue (apart from a couple of pinch points).

Personal highlights

I managed to cover 11 topics over two days (plus a fair amount of networking). The track format of the event was intended to let a delegate follow a complete learning path but, as someone who’s a generalist (that’s what Architects have to be), I spread myself around to cover:

  • Dealing with a massive onset of data ingestion (Jeramiah Dooley/@jdooley_clt).
  • Enterprise network connectivity in a cloud-first world (Paul Collinge/@pcollingemsft).
  • Building a world without passwords.
  • Discovering Azure Tooling and Utilities (Simona Cotin/@simona_cotin).
  • Selecting the right data storage strategy for your cloud application (Jeramiah Dooley/@jdooley_clt).
  • Governance in Azure (Sam Cogan/@samcogan).
  • Planning and implementing hybrid network connectivity (Thomas Maurer/@ThomasMaurer).
  • Transform device management with Windows Autopilot, Intune and OneDrive (Michael Niehaus/@mniehaus and Mizanur Rahman).
  • Maintaining your hybrid environment (Niel Peterson/@nepeters).
  • Windows Server 2019 Deep Dive (Jeff Woolsey/@wsv_guy).
  • Consolidating infrastructure with the Azure Kubernetes Service (Erik St Martin/@erikstmartin).

In the past, I’d have written a blog post for each topic. I was going to say that I simply don’t have the time to do that these days but by the time I’d finished writing this post, I thought maybe I could have split it up a bit more! Regardless, here are some snippets of information from my time at Microsoft Ignite | The Tour: London. There’s more information in the slide decks – which are available for download, along with the content for the many sessions I didn’t attend.

Data ingestion

Ingesting data can be broken into:

  • Real-time ingestion.
  • Real-time analysis (see trends as they happen – and make changes to create a competitive differentiator).
  • Producing actions as patterns emerge.
  • Automating reactions in external services.
  • Making data consumable (in whatever form people need to use it).

Azure has many services to assist with this – take a look at IoT Hub, Azure Event Hubs, Azure Databricks and more.
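
As a hedged sketch of the ingestion side (my own example – the session didn’t include code, and all names, locations and SKUs are illustrative), standing up an Event Hub to receive a real-time stream with the Az PowerShell module might look something like this:

```powershell
# Sketch only: provision an Event Hub for real-time ingestion with the Az.EventHub module.
# Resource names, location and SKU are illustrative; parameter names can vary slightly
# between module versions.
Connect-AzAccount

$rg  = 'rg-ingest-demo'
$loc = 'uksouth'

New-AzResourceGroup -Name $rg -Location $loc

# The namespace is the container for one or more event hubs
New-AzEventHubNamespace -ResourceGroupName $rg -Name 'ehns-ingest-demo' -Location $loc -SkuName Standard

# The event hub itself – partitions allow consumers to read the stream in parallel
New-AzEventHub -ResourceGroupName $rg -NamespaceName 'ehns-ingest-demo' -Name 'telemetry' -PartitionCount 4
```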

Enterprise network connectivity for the cloud

Cloud traffic is increasing whilst traffic that remains internal to the corporate network is in decline. Traditional management approaches are no longer fit for purpose.

Office applications use multiple persistent connections – these cause challenges for proxy servers, which generally degrade the Office 365 user experience. Remediation is possible, with:

  • Differentiated traffic – follow Microsoft advice to manage known endpoints, including via the Office 365 IP address and URL web service (see the sketch after this list).
  • Let Microsoft route traffic (data is in a region, not a place). Use DNS resolution to egress connections close to the user (a list of all Microsoft peering locations is available). Optimise the route length and avoid hairpins.
  • Assess network security using application-level security, reducing IP ranges and ports and evaluating the service to see if some activities can be performed in Office 365, rather than at the network edge (e.g. DLP, AV scanning).
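
Here’s the sketch referred to above – pulling the current highest-priority (“Optimize”) endpoints from the Office 365 IP address and URL web service with PowerShell (my own example; the service just expects any GUID as a client request ID):

```powershell
# Query the Office 365 IP address and URL web service for the worldwide instance
$clientRequestId = [guid]::NewGuid().ToString()
$uri = "https://endpoints.office.com/endpoints/worldwide?clientrequestid=$clientRequestId"

$endpoints = Invoke-RestMethod -Uri $uri

# Category values are Optimize, Allow and Default; "Optimize" traffic is the
# latency-sensitive set that should bypass proxies and inspection devices
$endpoints |
    Where-Object { $_.category -eq 'Optimize' } |
    Select-Object serviceArea, urls, ips, tcpPorts
```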

For Azure:

  • Azure ExpressRoute is a connection to the edge of the Microsoft global backbone (not to a datacentre). It offers two lines for resilience and two peering types at the gateway – private and public (Microsoft) peering.
  • Azure Virtual WAN can be used to build a hub for a region and to connect sites.
  • Replace branch office routers with software-defined (SDWAN) devices and break out where appropriate.
Microsoft global network

Passwordless authentication

Basically, there are three options:

  • Windows Hello.
  • Microsoft Authenticator.
  • FIDO2 Keys.

Azure tooling and utilities

Useful resources include:

Selecting data storage for a cloud application

What to use? It depends! Classify data by:

  • Type of data:
    • Structured (fits into a table)
    • Semi-structured (may fit in a table but may also use outside metadata, external tables, etc.)
    • Unstructured (documents, images, videos, etc.)
  • Properties of the data:
    • Volume (how much)
    • Velocity (change rate)
    • Variety (sources, types, etc.)
Item                Type             Volume  Velocity  Variety
Product catalogue   Semi-structured  High    Low       Low
Product photos      Unstructured     High    Low       Low
Sales data          Semi-structured  Medium  High      High

How to match data to storage:

  • Storage-driven: build apps on what you have.
  • Cloud-driven: deploy to the storage that makes sense.
  • Function-driven: build what you need; storage comes with it.
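
By way of illustration (my own sketch, not session content), the unstructured product photos in the table above would sit naturally in blob storage, while the semi-structured catalogue would be more at home in a service such as Azure Cosmos DB. Provisioning the blob side with the Az module might look like this:

```powershell
# Sketch only: a general-purpose v2 storage account whose blob containers could hold
# unstructured data such as product photos. Names and SKUs are illustrative.
$rg  = 'rg-storage-demo'
$loc = 'uksouth'

New-AzResourceGroup -Name $rg -Location $loc

$account = New-AzStorageAccount -ResourceGroupName $rg -Name 'stproductphotos001' `
    -Location $loc -SkuName Standard_LRS -Kind StorageV2

# A private container for the photos themselves
New-AzStorageContainer -Name 'product-photos' -Context $account.Context
```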

Governance in Azure

It’s important to understand what’s running in an Azure subscription – consider cost, security and compliance:

  • Review (and set a baseline):
    • Tools include: Resource Graph; Cost Management; Security Center; Secure Score.
  • Organise (housekeeping to create a subscription hierarchy, classify subscriptions and resources, and apply access rights consistently):
    • Tools include: Management Groups; Tags; RBAC.
  • Audit:
    • Make changes to implement governance without impacting people/work. Develop policies, apply budgets and audit the impact of the policies.
    • Tools include: Cost Management; Azure Policy.
  • Enforce:
    • Change policies from audit to enforcement, add resolution actions and enforce budgets.
    • Consider what will happen in cases of non-compliance.
    • Tools include: Azure Policy; Cost Management; Azure Blueprints (see the sketch after this list).
  • (Loop back to review)
    • Have we achieved what we wanted to?
    • Understand what is being spent and why.
    • Know that only approved resources are deployed.
    • Be sure of adhering to security practices.
    • Opportunities for further improvement.
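
Here’s that sketch: assigning the built-in “Allowed locations” policy at subscription scope with Azure PowerShell (my own illustration – the session didn’t include code, and property paths vary a little between Az.Resources versions):

```powershell
# Sketch only: assign the built-in "Allowed locations" policy at subscription scope.
# Treat property paths and parameter values as illustrative.
$subscriptionId = (Get-AzContext).Subscription.Id
$scope = "/subscriptions/$subscriptionId"

# Find the built-in policy definition by display name
$definition = Get-AzPolicyDefinition |
    Where-Object { $_.Properties.DisplayName -eq 'Allowed locations' }

# Restrict deployments to UK regions only (example values)
$parameters = @{ listOfAllowedLocations = @('uksouth', 'ukwest') }

New-AzPolicyAssignment -Name 'allowed-locations' -DisplayName 'Allowed locations (UK only)' `
    -Scope $scope -PolicyDefinition $definition -PolicyParameterObject $parameters
```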

Planning and implementing hybrid network connectivity

Moving to the cloud allows for fast deployment but planning is just as important as it ever was. Meanwhile, startups can be cloud-only but most established organisations have some legacy and need to keep some workloads on-premises, with secure and reliable hybrid communication.

Considerations include:

  • Extension of the internal protected network:
    • Should workloads in Azure only be accessible from the Internal network?
    • Are Azure-hosted workloads restricted from accessing the Internet?
    • Should Azure have a single entry and egress point?
    • Can the connection traverse the public Internet (compliance/regulation)?
  • IP addressing:
    • Existing addresses on-premises; public IP addresses.
    • Namespaces and name resolution.
  • Multiple regions:
    • Where are the users (multiple on-premises sites); where are the workloads (multiple Azure regions); how will connectivity work (should each site have its own connectivity)?
  • Azure virtual networks:
    • Form an isolated boundary with secure communications.
    • Azure-assigned IP addresses (no need for a DHCP server).
    • Segmented with subnets.
    • Network Security Groups (NSGs) create boundaries around subnets.
  • Connectivity:
    • Site-to-site (S2S) VPNs at up to 1Gbps (see the sketch after this list):
      • Encrypted traffic over the public Internet to the GatewaySubnet in Azure, which hosts VPN Gateway VMs.
      • 99.9% SLA on the Gateway in Azure (not the connection).
      • Don’t deploy production workloads on the GatewaySubnet; /26, /27 or /28 subnets recommended; don’t apply NSGs to the GatewaySubnet – i.e. let Azure manage it.
    • Dedicated connections (Azure ExpressRoute): private connection at up to 10Gbps to Azure with:
      • Private peering (to access Azure).
      • Microsoft peering (for Office 365, Dynamics 365 and Azure public IPs).
      • 99.9% SLA on the entire connection.
    • Other connectivity services:
      • Azure ExpressRoute Direct: a 100Gbps direct connection to Azure.
      • Azure ExpressRoute Global Reach: using the Microsoft network to connect multiple local on-premises locations.
      • Azure Virtual WAN: branch to branch and branch to Azure connectivity with software-defined networks.
  • Hybrid networking technologies: S2S VPN, Azure ExpressRoute and Azure Virtual WAN, as above.
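
Here’s the sketch referred to above for the S2S VPN option – a rough outline (my own, with placeholder names, address spaces and shared key) of the Azure-side objects described in the list:

```powershell
# Sketch only – names, address spaces and the shared key are placeholders.
$rg  = 'rg-hybrid-demo'
$loc = 'uksouth'

New-AzResourceGroup -Name $rg -Location $loc

# Virtual network with a dedicated GatewaySubnet (/27 as recommended)
$gwSubnet = New-AzVirtualNetworkSubnetConfig -Name 'GatewaySubnet' -AddressPrefix '10.1.255.0/27'
$vnet = New-AzVirtualNetwork -Name 'vnet-hub' -ResourceGroupName $rg -Location $loc `
    -AddressPrefix '10.1.0.0/16' -Subnet $gwSubnet

# Public IP and IP configuration for the VPN gateway VMs Azure places in the GatewaySubnet
$pip = New-AzPublicIpAddress -Name 'pip-vpngw' -ResourceGroupName $rg -Location $loc -AllocationMethod Dynamic
$ipConfig = New-AzVirtualNetworkGatewayIpConfig -Name 'gwipconfig' `
    -SubnetId ($vnet.Subnets | Where-Object Name -eq 'GatewaySubnet').Id -PublicIpAddressId $pip.Id

# The gateway itself (this step can take a long time to complete)
$gateway = New-AzVirtualNetworkGateway -Name 'vpngw-hub' -ResourceGroupName $rg -Location $loc `
    -IpConfigurations $ipConfig -GatewayType Vpn -VpnType RouteBased -GatewaySku VpnGw1

# The on-premises end: the router/firewall public IP and the address space behind it
$onPrem = New-AzLocalNetworkGateway -Name 'lng-head-office' -ResourceGroupName $rg -Location $loc `
    -GatewayIpAddress '203.0.113.10' -AddressPrefix '192.168.0.0/16'

# The encrypted IPsec connection over the public Internet
New-AzVirtualNetworkGatewayConnection -Name 'cn-head-office' -ResourceGroupName $rg -Location $loc `
    -VirtualNetworkGateway1 $gateway -LocalNetworkGateway2 $onPrem `
    -ConnectionType IPsec -SharedKey 'replace-with-a-strong-shared-key'
```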

Modern Device Management (Autopilot, Intune and OneDrive)

The old way of managing PC builds:

  1. Build an image with customisations and drivers
  2. Deploy to a new computer, overwriting what was on it
  3. Expensive and time-consuming – and the device already has a perfectly good OS

Instead, how about:

  1. Unbox PC
  2. Transform with minimal user interaction
  3. Device is ready for productive use

The transformation is:

  • Take OEM-optimised Windows 10:
    • Windows 10 Pro and drivers.
    • Clean OS.
  • Plus software, settings, updates, features, user data (with OneDrive for Business).
  • Ready for productive use.

The goal is to reduce the overall cost of deploying devices. Ship to a user with half a page of instructions…

Windows Autopilot overview

Autopilot deployment is cloud-driven and will eventually be centralised through Intune:

  1. Register device:
    • From OEM or Channel (manufacturer, model and serial number).
    • Automatically (existing Intune-managed devices).
    • Manually, using a PowerShell script to generate a CSV file with the serial number and hardware hash, which is then uploaded to the Intune portal (see the sketch after this list).
  2. Assign Autopilot profile:
    • Use Azure AD Groups to assign/target.
    • The profile includes settings such as deployment mode, BitLocker encryption, device naming, out of box experience (OOBE).
    • An Azure AD device object is created for each imported Autopilot device.
  3. Deploy:
    • Needs Azure AD Premium P1/P2
    • Scenarios include:
      • User-driven with Azure AD:
        • Boot to OOBE, choose language, locale, keyboard and provide credentials.
        • The device is joined to Azure AD, enrolled to Intune and policies are applied.
        • User signs on and user-assigned items from Intune policy are applied.
        • Once the desktop loads, everything is present (including file links in OneDrive) – the time taken depends on the software being pushed.
      • Self-deploying (e.g. kiosk, digital signage):
        • No credentials required; device authenticates with Azure AD using TPM 2.0.
      • User-driven with hybrid Azure AD join:
        • Requires Offline Domain Join Connector to create AD DS computer account.
        • Device connected to the corporate network (in order to access AD DS), registered with Autopilot, then as before.
        • The user signs on to Azure AD and then to AD DS during deployment; if both use the same UPN, it keeps things simple for users!
      • Autopilot for existing devices (Windows 7 to 10 upgrades):
        • Backup data in advance (e.g. with OneDrive)
        • Deploy generic Windows 10.
        • Run Autopilot in user-driven mode (hardware hashes can’t be harvested on Windows 7, so a JSON config file is placed in the image – the offline equivalent of a profile. Intune will ignore the unknown device and Autopilot will use the file instead; after Windows 10 is deployed, Intune will notice a PC in the group and apply the profile, so it will still work if the PC is reset in future).
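
Here’s the sketch referred to in the registration step – capturing the hardware hash manually with the Get-WindowsAutoPilotInfo script from the PowerShell Gallery (the output path is my own example):

```powershell
# Run in an elevated PowerShell session on the device being registered
Set-ExecutionPolicy -Scope Process -ExecutionPolicy RemoteSigned -Force
Install-Script -Name Get-WindowsAutoPilotInfo -Force

# Writes serial number and hardware hash to a CSV, which is then uploaded to the Intune portal
Get-WindowsAutoPilotInfo.ps1 -OutputFile 'C:\Temp\AutopilotDevice.csv'
```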

Autopilot roadmap (1903) includes:

  • “White glove” pre-provisioning for end users: QR code to track, print welcome letter and shipping label!
  • Enrolment status page (ESP) improvements.
  • Cortana voiceover disabled on OOBE.
  • Self-updating Autopilot (update Autopilot without waiting to update Windows).

Maintaining your hybrid environment

Common requirements in an IaaS environment include wanting to use a policy-based configuration with a single management and monitoring solution and auto-remediation.

Azure Automation allows configuration and inventory; monitoring and insights; and response and automation. The Azure Portal provides a single pane of glass for hybrid management (Windows or Linux; any cloud or on-premises).

For configuration and state management, use Azure Automation State Configuration (built on PowerShell Desired State Configuration).
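
As a small illustration of the PowerShell DSC underpinnings (my own minimal example, not from the session), a configuration that keeps IIS installed might look like this, and could then be imported into an Automation account:

```powershell
# Minimal DSC configuration: ensure the Web-Server (IIS) role is present on target nodes
configuration WebServerBaseline {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost' {
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
    }
}

# Assuming the configuration above is saved as WebServerBaseline.ps1, upload it to an
# Automation account (resource names are illustrative)
Import-AzAutomationDscConfiguration -SourcePath '.\WebServerBaseline.ps1' `
    -ResourceGroupName 'rg-automation' -AutomationAccountName 'aa-hybrid' -Published
```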

Inventory can be managed with Log Analytics extensions for Windows or Linux. An Azure monitoring agent is available for on-premises servers or other clouds. Inventory is not instant though – it can take 3-10 minutes for Log Analytics to ingest the data. Changes can be visualised (for state tracking purposes) in the Azure Portal.

Azure Monitor and Log Analytics can be used for data-driven insights, unified monitoring and workflow integration.

Responding to alerts can be achieved with Azure Automation runbooks, which store scripts in Azure and run them in Azure. Scripts can use PowerShell or Python, so both Windows and Linux are supported. A webhook can be triggered with an HTTP POST request, and a hybrid runbook worker can be used to run runbooks on-premises or in another cloud.
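
Triggering a runbook from its webhook really is just an HTTP POST – for example (the URI is a placeholder for whatever Azure generates when you create the webhook, and the body is my own example):

```powershell
# Placeholder URI: use the one generated when the webhook is created (Azure only shows it once).
# The body is passed to the runbook via its $WebhookData parameter.
$webhookUri = 'https://s1events.azure-automation.net/webhooks?token=REDACTED'
$body = @{ VMName = 'vm-app01'; Action = 'restart' } | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $webhookUri -Body $body
```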

It’s possible to use the Azure VM agent to run a command on a VM from the Azure portal, without logging in to the VM!
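
That’s the Run Command feature – callable from the portal or, as a sketch, from PowerShell (resource, VM and script names here are my own examples):

```powershell
# Run an ad-hoc script inside a VM via the VM agent – no RDP/SSH session required
Invoke-AzVMRunCommand -ResourceGroupName 'rg-app' -VMName 'vm-app01' `
    -CommandId 'RunPowerShellScript' -ScriptPath '.\Get-ServiceHealth.ps1'
```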

Windows Server 2019

Windows Server strategy starts with Azure. Windows Server 2019 is focused on:

  • Hybrid:
    • Backup/connect/replicate VMs.
    • Storage Migration Service to migrate unstructured data into Azure IaaS or another on-premises location (from 2003+ to 2016/19).
      1. Inventory (interrogate storage, network security, SMB shares and data).
      2. Transfer (pairings of source and destination), including ACLs, users and groups. Details are logged in a CSV file.
      3. Cutover (make the new server look like the old one – same name and IP address). Validate before cutover – ensure everything will be OK. Read-only process (except change of name and IP at the end for the old server).
    • Azure File Sync: centralise file storage in Azure and transform existing file servers into hot caches of data.
    • Azure Network Adapter to connect servers directly to Azure networks (see above).
  • Hyper-converged infrastructure (HCI):
    • The server market is still growing and is increasingly SSD-based.
    • Traditional rack looked like SAN, storage fabric, hypervisors, appliances (e.g. load balancer) and top of rack Ethernet switches.
    • Now we use standard x86 servers with local drives and software-defined everything. Manage with Admin Center in Windows Server (see below).
    • Windows Server now has support for persistent memory: DIMM-based; still there after a power-cycle.
    • The Windows Server Software Defined (WSSD) programme is the Microsoft approach to software-defined infrastructure.
  • Security: shielded VMs for Linux (VM as a black box, even for an administrator); integrated Windows Defender ATP; Exploit Guard; System Guard Runtime.
  • Application innovation: semi-annual channel updates are designed for containers. Windows Server 2019 is the latest LTSC release, so it includes the 1709/1803 additions:
    • Enable developers and IT Pros to create cloud-native apps and modernise traditional apps using containers and micro services.
    • Linux containers on Windows host.
    • Service Fabric and Kubernetes for container orchestration.
    • Windows subsystem for Linux.
    • Optimised images for server core and nano server.

Windows Admin Center is core to the future of Windows Server management and, because it’s based on remote management, servers can be core or full installations – even containers (logs and console). Download from http://aka.ms/WACDownload

  • 50MB download, no need for a server. Runs in a browser and is included in Windows/Windows Server licence
  • Runs on a layer of PowerShell. Use the >_ icon to see the raw PowerShell used by Admin Center (copy and paste to use elsewhere).
  • Extensible platform.

What’s next?

  • More cloud integration
  • Update cadence is:
    • Insider builds every 2 weeks.
    • Semi-annual channel every 6 months (specifically for containers):
      • 1709/1803/1809/19xx.
    • Long-term servicing channel
      • Every 2-3 years.
      • 2016, 2019 (in September 2018), etc.

Windows Server 2008 and 2008 R2 reach the end of support in January 2020 but customers can move Windows Server 2008/2008 R2 servers to Azure and get 3 years of security updates for free (on-premises support is chargeable).

Further reading: What’s New in Windows Server 2019.

Containers/Azure Kubernetes Service

Containers:

  • Are fully-packaged applications that use a standard image format for better resource isolation and utilisation.
  • Are ready to deploy via an API call.
  • Are not Virtual machines (for Linux).
  • Do not use hardware virtualisation.
  • Offer no hard security boundary (for Linux).
  • Can be more cost effective/reliable.
  • Have no GUI.

Kubernetes is:

  • An open source system for auto-deployment, scaling and management of containerized apps.
  • Container Orchestrator to manage scheduling; affinity/anti-affinity; health monitoring; failover; scaling; networking; service discovery.
  • Modular and pluggable.
  • Self-healing.
  • Designed by Google based on a system they use to run billions of containers per week.
  • Described in “Phippy goes to the zoo”.

Azure container offers include:

  • Azure Container Instances (ACI): containers on demand (Linux or Windows) with no need to provision VMs or clusters; per-second billing; integration with other Azure services; a public IP; persistent storage.
  • Azure App Service for Linux: a fully-managed PaaS for containers including workflows and advanced features for web applications.
  • Azure Kubernetes Service (AKS): a managed Kubernetes offering.
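
As a rough sketch of how AKS removes the cluster-management burden (my own example, not session content – names and node count are illustrative), standing up a small managed cluster and connecting kubectl to it might look like this:

```powershell
# Sketch only: a small managed Kubernetes cluster
$rg = 'rg-aks-demo'
New-AzResourceGroup -Name $rg -Location 'uksouth'

# Azure manages the control plane; you only run (and pay for) the worker nodes
New-AzAksCluster -ResourceGroupName $rg -Name 'aks-demo' -NodeCount 2 -GenerateSshKey

# Merge the cluster credentials into ~/.kube/config so kubectl can talk to it
Import-AzAksCredential -ResourceGroupName $rg -Name 'aks-demo'
kubectl get nodes
```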

Wrap-up

So, there you have it. An extremely long blog post with some highlights from my attendance at Microsoft Ignite | The Tour: London. It’s taken a while to write up so I hope the notes are useful to someone else!

Strava Art

It’s no secret that I enjoy cycling – and I’m also a bit of a geek. My cycling and my tech come together in various places but Strava is one of the most obvious… I have been known to say “If it’s not on Strava, it didn’t happen” (of course, that’s in jest – but I do get annoyed if my GPS traces get messed up).

There’s something quite wonderful about maps too. Maybe this is another side of my geekiness but I love looking at a good map. So, what if you could have a map of a ride you’re particularly proud of turned into a piece of art to display on your wall?

As it happens – you can do exactly that. Cyced produce high quality Strava Art for runners and cyclists. So, when Angus from Cyced asked me if I’d like to review their service, I was interested to give it a try.

I settled on the ride I did with my son last year to raise money for his trip to the Kandersteg International Scout Centre and I provided a Strava link (a GPX file would have been another option). Soon afterwards, Angus sent me a PDF proof to review and, a couple of days after I confirmed the edits, the final print arrived.

I was impressed by how well it was wrapped – indeed, I’ve never had an “unboxing experience” quite like this for a piece of artwork: wrapped in tissue paper; sandwiched between sheets of heavy-duty card; all inside a sturdy card envelope. It would be pretty difficult for my postie to accidentally bend this package!

The print itself is really high quality and the simple (Strava-inspired) colours look amazing – greys, whites and orange highlights. The print that Angus created for me is A4 but there are A3 and A2 options too. I’m now considering buying a second print for the “Man Cave” when I have a suitably big ride to be proud of (or maybe with last year’s London Revolution route).

If you’re looking for something a little different for your wall – a piece of art to celebrate a ride or a run, or maybe a present for a runner/cyclist friend or family member – I’d recommend checking out the Strava Art from Cyced. As the website says, it’s “worth more than just a kudos”!

Full disclosure: Cyced provided me with an A4 print in exchange for this blog post but that doesn’t influence my review. Everything I’ve written is my true opinion – but it’s nice to have the artwork for my son to keep as a memento of our 70 miles of MTBing along the Grand Union Canal towpath last summer!

Caching OneDrive for Business content when Files On-Demand is enabled

Not surprisingly, given who I work for, I’m a heavy user of Microsoft technologies. I have a Microsoft Surface Pro, running the latest versions of Windows 10 and Office 365 ProPlus, joined to Azure Active Directory and managed with Intune. I use all of the Office 365 productivity apps. I AM A MICROSOFT POWER USER!

Enough of the drama! Let’s bring this down a level…

…I’m just a guy, using a laptop, trying to get a job done. It’s a tool.


Most of my files are stored in OneDrive for Business. There’s a lot more space there than the typical SSD has available, so Microsoft introduced a feature called Files On-Demand, whereby you see the whole list of files but each one is only actually downloaded when you try to access it.

That sounds great, unless you travel a lot and work on trains and other places where network connectivity is less than ideal.

In my case, I have around 50GB of data in OneDrive and 90GB of free space on my Surface’s SSD so I have the potential to cache it all locally. I used to do this by turning off Files On-Demand but the latest build I’m running has disabled that capability for me.

It’s not feasible to touch every file to force it to be cached, and I thought about asking my admins to reverse the setting that forces the use of Files On-Demand, but then I found another way around it…

If I right-click on a OneDrive file or folder in Windows Explorer there’s the option to “Always keep on this device”. [Update: Peter Bryant (@PJBryant) has flagged a method using the command line too – it seems there are new attributes P and U for Files On-Demand]
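
For reference, the command-line method appears to use the new pinned/unpinned attributes – something like this (the path is just an example, and behaviour may vary between OneDrive builds):

```powershell
# Pin everything under a folder so it is always kept on this device (+p),
# recursing through subfolders (/s) and including the folders themselves (/d).
# Point the (example) path at whichever OneDrive folder you want cached.
attrib +p "$env:USERPROFILE\OneDrive - Contoso\Projects\*" /s /d

# To release the space again, clear the pin and mark the content as online-only (+u)
attrib -p +u "$env:USERPROFILE\OneDrive - Contoso\Projects\*" /s /d
```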

By applying this to one of the top-level folders in my OneDrive, I was able to force the files to be cached – regardless of whether Files On-Demand is enabled or not. Now, I can access all of the files in that folder (and any subfolders), even when I’m not connected to the Internet.

Defining multiple RADIUS servers for Aruba Wi-Fi


I’ve spent some time over the last few months working with a customer who is building a complete greenfield IT infrastructure, in preparation for launching a new business. It’s been a rare privilege to work without piles of technical debt (of course, it’s never completely that simple – there is data to bring across and there are some core systems that will tie back into the parent organisation) but there have been some challenges along the way too.

One of these was when the customer’s network partner asked for a RADIUS server to be added to our identity solution (to support 802.1x-based authentication for Wi-Fi clients). In itself, that wasn’t too big an ask – we could use Windows Servers running Microsoft Network Policy Server (NPS), across two Azure regions. Unfortunately, we also needed to provide resilience and the network partner was suggesting that they could only configure one IP address in their HP-Aruba cloud controllers. Azure Load Balancers only work within a region and DNS round robin is not exactly smart, so the other consultants working on the solution and I were left scratching our heads.

Luckily, for me, having a reasonably large Twitter network meant I could ask for help – and the help came (thanks to @Tim_Siddle and others)!

We were able to take the information about server groups to our networking partner, who advised us that the cloud controllers lacked the server groups capability until recently (it was only a feature on physical controllers) but that it had now been added.

Other people responded to say they had had similar issues in the past, so this might be useful for others who are trying to configure a certificate-based authentication solution for Wi-Fi with Microsoft NPS servers.
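
If it helps anyone building something similar, the NPS side can be scripted too – a minimal sketch of my own (the client name, address and shared secret are placeholders):

```powershell
# Run on each NPS server (in both Azure regions) so either can authenticate the controller
New-NpsRadiusClient -Name 'Aruba-Controller' -Address '10.10.20.5' `
    -SharedSecret 'replace-with-a-strong-shared-secret'
```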

Further reading

Enabling RADIUS Server Authentication [Aruba]

Short takes: Canon Selphy ink cassette issues; a new Samsung 4K monitor; tracking down digital copies of Ikea assembly instructions


A collection of snippets from this week’s life with tech…

Ink cassette/cartridge issues with a Canon Selphy photo printer

My son has a Canon Selphy CP780. It’s been great for printing the odd 6×4″ photo on demand but it recently started playing up, struggling to feed paper and then complaining that its ink cassette needed to be tightened. I couldn’t remove the cassette (it was stuck) – but this YouTube video helped:

Unfortunately, even after releasing and replacing the cassette with a new one, I was getting errors to say that it was empty. With quite a stock of paper and ink in the cupboard (enough for 72 prints), I decided to replace the printer with the latest model: Aldi had a CP1300 on offer for £89 this week but that offer has now passed – you should be able to pick one up for around £99 at John Lewis (and elsewhere).

A new 4K monitor

I’ve been wanting to get a decent, large, high-resolution monitor for photo editing for a while now. The Mac Mini that I use only supports 1440p (2560×1440), but Picture In Picture/Picture By Picture (PIP/PBP) capabilities would be useful to display 1080p (1920×1080) output from another PC alongside it – and both my work PC (Microsoft Surface Pro 3) and my personal PC (MacBook) can output at full 4K/UHD (3840×2160).

I considered a 28″ Samsung 4K UHD monitor (LU28E590) – currently about £289 – but the reviews suggested a lot of customers had received faulty screens. Then I saw a newer, 32″ model: the U32J590, giving me a larger panel for £379.

Initial impressions are good – and a former colleague asked me to send him a pic with two documents side by side at 100% – looks like this could be a useful work tool!

Finding digital copies of Ikea product instructions

My recent loft conversion means I’ve bought a lot of products from Ikea. Generally, I keep a digital copy of the assembly instructions and get rid of the paper ones, but sometimes they aren’t easy to find on the UK website. Then I found a trick:

  1. Take the URL from a working document – for example https://www.ikea.com/gb/en/doc/assembly_instructions/best%C3%A5__aa-1402080-9_pub.pdf
  2. Look at the paper document that you want a copy of and look for a code on the last page – for example AA-1402080-9.
  3. Edit the URL from step 1, replacing the document code with the one from step 2, and you should find the document you are after – in this case https://www.ikea.com/gb/en/doc/assembly_instructions/best%C3%A5__aa-1402080-9_pub.pdf.

 

Saving money on mobile calls and data whilst travelling in the USA


Those who follow me on Twitter (@markwilsonit) may have noticed that I’ve been in the USA for the last couple of weeks. Normally, my overseas travel is in Europe, where the European Union (EU) Roaming Regulations mean I can use my bundled data and minutes – at least until Britain leaves the EU. Outside the European Economic Area (EEA) though, I need to pay for international-rate mobile data, voice and messaging.

As some degree of Internet access is pretty much required (and certainly very useful) when travelling these days, I considered buying a pay-as-you-go (PAYG) SIM in the States. A little bit of Internet research told me that wasn’t cheap either – mobile data is really expensive over there (though not as expensive as roaming with my normal UK SIM – 20p per MB doesn’t sound too bad until you realise that’s £200 for a GB!). A monthly contract was out of the question too, without a US-registered credit card.

Then I heard about UK mobile operator Three’s Go Roam offer. This allows you to use your data overseas (in 71 countries, within fair use limits) and to make UK calls and send texts to UK numbers as if you were at home. Following the advice from my local Three store, I took a monthly rolling contract and then served notice (30 days) the day after it had been activated, moving my SIM onto a PAYG basis, so I only had to pay for one month’s service.

So that was data sorted, but what about calls to non-UK numbers whilst I was away? My accommodation included Wi-Fi, so WhatsApp was useful whilst connected but that didn’t help with calling US numbers. I do have 60 minutes of Skype calls included in my Office 365 Home subscription though – and they were fine for making the odd call to US numbers whilst travelling!

I’m sure that these “hacks” have saved me tens, if not hundreds, of pounds during my US trip – and they might help others too. And for those who have a pay monthly contract with Vodafone, their Global Roaming option may be worth considering – paying a flat-rate £6 each day that you use the phone abroad could work out a little more expensive than the Three Go Roam deal but is likely to be much cheaper than standard roaming tariffs.

Poorly-targeted InMail on LinkedIn…


A good chunk of the email I receive is either:

  1. Spam from SEO specialists who can’t even present a well-written email (so why would I let them loose on my website?).
  2. Spam from people who want to advertise on my website or write content to link to their client’s dubious sites (no thanks).
  3. LinkedIn requests from recruiters I’ve never even spoken to (read on).

Now, let me be clear: there are some good recruiters out there – people who build rapport and work on relationships. Maybe one day we’ll work together, maybe we won’t, but when I hear my peers talking about recruiters that I know, I know those recruiters are well-connected within our industry – and they will be my first port of call if I find myself looking for work (or to recruit).

Then there’s stuff like this, a real email, received tonight via LinkedIn’s InMail feature. I’ve changed the names to protect the guilty but apart from that, it’s a facsimile:

“Hi Mark,

[Do I know you?]

A leading global provider of retail software solutions is seeking an experienced EPOS Architect to join the European Portfolio team in a key leadership role at the heart of a massive digital transformation programme.

[Doesn’t appear to be very well researched: I’m an Enterprise Architect, not an EPOS Architect… I know very little about EPOS systems. Sure, maybe EPOS might be part of something I do put together but I’m no EPOS specialist. Well, it starts with E and ends with Architect – so it must be related! Does this recruiter even know what they are recruiting for?]

You’ll be working closely with the technical leadership of tier 1 global retailers such as huge retailer name removed, and leading national retailers across Europe to shape and deliver next generation cloud and on premise point of sale systems.

[Minor point but it’s “on-premises”, FFS. It’s a place, not an idea.]

An excellent package of £75,000 – £100,000 + car + bonus is on offer, plus extensive European travel to the headquarters of the continent’s leading businesses.

[Since when was “extensive European travel to the headquarters of the continent’s leading businesses” a perk? This is the sort of benefit dreamed up by people who never leave their office. What it generally means is “spend lots of time away from home travelling economy class to a business park but never really see the city you’re going to…”]

Further details: website/Job/Detail/epos-solution-architect-leeds-en-GB

[So it’s in Leeds. Leeds is 3 hours from where I live]

For a fully confidential discussion, contact someone.i.dont-know@recruiter.co.uk

 

Someone Else
Senior Recruitment Consultant @ leading global specialist recruitment group | Specialising in Testing across Yorkshire | someone.else@recruiter.com

[Why am I getting email on a Friday evening from one person I don’t know asking me to contact someone else I don’t know? Mind you, if their specialism is “Testing across Yorkshire”, maybe that explains the poor targeting of this role to a guy 150 miles away in Milton Keynes…]”

Luckily, I’m not looking for work (or to hire anyone) at the moment but, when I am, this agency will not be on my list… sadly, this is not an isolated incident.

Bringing engineering to life with some Key Stage 2 schoolchildren and K’nex


Last year, I signed up as a STEM Ambassador. With my employer’s backing, I can now volunteer to take part in events that are intended to bring Science, Technology, Engineering and Maths (STEM) subjects to life and demonstrate their value in life and in careers.

I receive regular invitations to take part in events but, until recently, I hadn’t been able to make them fit around my calendar. Then, a few weeks ago, I saw an invitation to run an engineering workshop with some Year 4-6 students as part of a school Science Day. The brief was to give a short presentation on:

  • What is STEM?
  • Why STEM skills are important
  • The story of what I was like at school and what I wanted to do for a job
  • What I do now
  • What I enjoy about my job

and then to facilitate an activity, breaking the children into small teams with a box of K’nex to build a tower that could support a small object.

I was pretty nervous about the activity – after all, I’m not a teacher! I spent quite a bit of time tuning the presentation and, taking advice from my own children (who are in years 6 and 8), making sure there were lots of images (that’s my style anyway) and animations. Unfortunately, when I arrived at the school, the animations were useless: PowerPoint 2010 didn’t like my 2016-based graphics, so I quickly removed all of the transitions and animations – and the moral of the story there is: don’t take advice from your 13-year-old…

I ran two workshops, each with a class of around 28 children. The teachers were present at all times (dealing with any disruptive children) and I found I just needed to be myself, to answer the children’s questions (which, of course, ranged from “what age can you start being an engineer?” to “what car do you drive?”) and to guide them during the activity.

I set out the activity as a challenge, with requirements and materials:

STEM engineering challenge, with requirements and materials

but I didn’t tell the children how to make a tower strong.

Time to test the towers

Only after we had tested the towers did we spend some time talking about the things they had done to make them work (and all of the teams had managed this themselves, whether they did it consciously or not).

Making towers strong and stable

It was fantastic to see how each group approached the activity – each team had different ideas for how they might use the K’nex. Some children had played with it before whilst others needed some advice on how to make the connectors and rods fit together but almost every team completed the challenge successfully. The one team that didn’t complete the task had struggled because they had divided into two smaller groups and ended up with two short towers – that gave me an opportunity to talk about teamwork and also about project management (managing to time!).

I came away from school that afternoon with a great buzz. It’s wonderful to hear children say things like “I like your lessons – they’re fun!” and “Are you coming back next year?”. And, if you want to know more about STEM Ambassadors (either getting someone involved in an activity or event – or perhaps becoming one yourself), check out the website.

Microsoft Surface Pro 3 refuses to power on: fixed with a handful of elastic bands


This week didn’t start well (and it hasn’t got much better either) but Monday morning was a write-off, as the Microsoft Surface Pro 3 that I use for work wouldn’t “wake up”.

I’d used it on Friday, closed the “lid” (i.e. closed the tablet against the Type Cover) and left it on a table all weekend. Come Monday and it was completely dead. I tried charging it for a while. I tried Power and Volume Up/Down combinations. I tried holding the power button down for 30 secs (at which point the light on the charging cable flashed, but that was all).

After speaking to colleagues in our support team, it seemed I’d tried everything they could think of and we were sure it was some sort of battery failure (one of my customers has seen huge levels of battery failure on their Surface Books, suspected to be after they were kept in storage for an extended period without having been fully shut down).

I was ready for a long drive to Stafford to swap it for another device, hoping that OneDrive had all of my data synced and that I didn’t get the loan Dell laptop with the missing key (I’m sure that’s a warning to look after our devices…).

Then I found a post on the Windows Central Forums titled “Surface Pro 3 won’t turn back on! – possible solution when all hope is lost”.

All hope was indeed lost. This had to be worth a read?

“My SP3 mysteriously stopped working yesterday morning. (Keep reading to the end for the solution that worked for me and maybe you too!)

It was fine the night before. […]

I spent the morning attempting to reboot the SP3. I thought maybe my charger wasn’t working even though I did see a white LED light on the adapter that connects to the Surface. I tried the hard reset, the 2-button reset, every combination of the volume up and down with the power button.

[…]

Finally, this morning, I caved in and call MS support. The tech said she would charge me $30 for a remote over the phone troubleshooting. I declined as I’ve tried everything I’ve found on the internet. Instead, I scheduled app with the MS store support in Garden City, NY (Roosevelt Field Mall).

I had the first or second app: 11:15am. The tech, I think his name was Adam, young guy in his 20’s. I told Adam my issue and that I’ve tried everything. I even had a USB LED light to show that the battery in my case wasn’t the problem. The USB LED light lit up for a few seconds when I pressed power. He said the problem was internal hardware and they there was no way to fix it. Since my SP3 was out of warranty, the only solution from MS was full replacement for $500. But, since I needed my files, a replacement won’t do me any good. So, the only other solution was have it sent to a third party data recovery place for $1000! They would basically destroy the SP3 and MS would then be unable to replace it.

Talk about bad options. Neither one seemed practical. I asked Adam if he’s seen this type of problem with any of the Surfaces before. He said maybe one or twice before. I was about to leave when another guy walked with his Surface, sat down next to me and said his Surface won’t boot up. I looked at Adam and I didn’t believe this was a rare issue with the Surface. MS probably train their techs to say that because they don’t want a class action law suit on their hand.

Anyway, just before I left, Adam, did say something, almost accidentally that I picked up. He said some guy had used a rubber band to hold down the power button for about a day and eventually the Surface woke up from sleep.

When I came home this afternoon, I was sure I had a $1100 paper weight with me. With nothing to lose, I took out some rubber bands and popsicle stick. I placed the popsicle stick flat against the power button and used the rubber band to apply pressure to keep the power button depressed the whole time. I can see the USB light connected to my Surface coming on and off as the power cycled. No sign of the Surface waking up.

Came back from dinner (that’s 5 hours later) and noticed the USB light didn’t come on and off any more. But still no sign the Surface was back. My 8 yr old sons comes into my office sees the contraption and says “what’s this” and pulls the popsicle stick off the Surface. I wasn’t even paying attention.

Lo and behold! the F—ing Surface logo flashed on the screen and booted up!!!!!
I immediately plugged in the charger and a backup HD and copied all my files!”

I was struggling to find any elastic bands at home but then, as the day’s post landed on my doormat, I thought “Royal Mail. Rubber bands!” and chased the postie down the street to ask if she had any spares. She was more than happy to give me a handful and so this was my setup (I don’t know what a “popsicle stick” is, but I didn’t need one):

A couple of hours later, I removed the bands and tried powering on the Surface Pro. I couldn’t believe it when it booted normally:

So, if your Surface Pro 3 (or possibly another Surface model) fails to power on, you might want to try this before giving up on it as a complete battery failure.

Explaining Office 365, with particular reference to the crossover between OneDrive, SharePoint and Teams


For most of my career, I’ve worked primarily with Microsoft products. And for the last three years, I’ve worked in a consulting, services and education organisation that’s entirely focused on extracting value for our customers from their investments in Microsoft technology (often via an Enterprise Agreement, or similar). So, living in my Microsoft-focused bubble, it’s easy to forget that there are organisations out there for whom deploying Microsoft products is not the first choice. And I’ve found myself in a few online conversations where people are perplexed about Office 365 and which tool to use when.

I used to use the Office 365 Wheel from OnPoint solutions until I discovered Matt Wade’s “Periodic Table of Office 365”, which attempts to describe Office 365’s “ecosystem of applications in the cloud” in infographic format:

The web version even lets you select by licence – so, for most of my customers, Enterprise E3 or E5.

But, as I said, I’ve also been in a few discussions recently where I’ve tried to help others (often those who are familiar with Google’s tools) to understand where SharePoint, OneDrive for Business and Microsoft Teams fit in – i.e. which is used in what scenario?

A few weeks ago, I found myself trying to do that on the WB-40 Podcast WhatsApp group, where one member had asked for help with the various “file” constructs and another had replied that “not even Microsoft” knew that. Challenge accepted.

So, in short form for social media, I replied to the effect that:

  1. Teams is unfinished (IMHO) but built on top of Office 365 Groups (and very closely linked to SharePoint).
  2. SharePoint can be used for many things including a repository for team-based information – regardless of what those teams are (projects, hierarchy, function).
  3. OneDrive is a personal document store.

In effect, OneDrive can be used to replace “home drives” and SharePoint provides wider collaboration features/capabilities when a document moves from being “something I’m working on” to “something I’m ready to collaborate on”. Teams layers over that to provide a chat-based workspace and more.

And then I added a caveat to say that all of the above is the way we work and many others do but there is not one single approach that fits all. And don’t even get me started with Yammer…

The key point for me is that organisations really should have an information management strategy and associated architecture, regardless of the technology choices made.

And, just in case it helps, this is how one UK Government department approaches things (I would credit my source, but don’t want to get anyone into trouble):

They split up documents into a lifecycle:

  1. Documents start life with a user, so can go in OneDrive.
    • As the user collaborates with colleagues those colleagues can gain shared access to the document in OneDrive.
    • They proposed the use of 2-year deletion policies on all OneDrive for Business files [I would question why… storage is not an issue with Enterprise versions of Office 365, and arbitrary time-based deletion is problematic when you go back to a document for a reference and find it’s gone…] (a sketch of how such a policy might be implemented follows this list).
  2. If the original document leads to a scoped piece of work, then the documents are moved to an Office 365 Group, as that neatly fits in with a number of resources that are common to collaboration: Planner, Calendar, File Storage (SharePoint), etc. And O365 Groups underpin Teams.
    • However, this type of data is time limited.
    • They proposed the use of 2-year deletion policies on all O365 Groups [again, why?].
  3. If a document became part of organisational policy/guidance, etc. then the proposal was to create permanent SharePoint sites for document management or potentially to move such documents to the organisation’s Intranet service [which could be running on SharePoint Online], or other relevant location.
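
And here’s the sketch mentioned above – how a 2-year OneDrive deletion policy might be implemented with the Security & Compliance Center PowerShell cmdlets (a hedged example of my own; names and values are illustrative and, as I said, I’d question the approach):

```powershell
# Sketch only: requires a Security & Compliance Center PowerShell session.
# A retention policy covering all OneDrive for Business locations...
New-RetentionCompliancePolicy -Name 'OneDrive 2-year deletion' -OneDriveLocation All

# ...with a rule that deletes content older than two years (730 days)
New-RetentionComplianceRule -Name 'OneDrive 2-year deletion rule' `
    -Policy 'OneDrive 2-year deletion' `
    -RetentionDuration 730 -RetentionComplianceAction Delete
```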

So, you can see the lifecycle properties:

  1. User (limited need to know).
  2. Group (wider need to know).
  3. Organisation (everyone can know).

This plan has the potential to allow the organisation to manage data in a better way and minimise the cost of the additional storage required for SharePoint. But core to that is turning the idea that OneDrive for Business is for personal use on its head. It’s a valid place to store business data, but users should manage the lifecycle of that data better – and this needs to be plain for users to understand, so they can spend the minimum amount of time managing the data.

[i.e. they don’t like the idea that OneDrive for Business is a personal data store – it’s a data store provided to users as part of their job and they don’t like “personal” being part of that definition. My two penn’orth is that the limits of “personal” and “work” are increasingly eroded, but I can see that organisations have legal and regulatory concerns about the data held in systems that they manage.]

So, which Office 365 tool to use? There is no “one size fits all” but some of the above may help when you’re defining a strategy/architecture for managing that information…