Thoughts on the use of Sway as a presentation tool

A couple of weeks ago, I gave a short talk on adopting cloud services at Milton Keynes Geek Night (MKGN). I’ll admit to being a little nervous – the talk was supposed to be five minutes (and I had more to say than would ever have fitted – I later learned it’s pretty rare for anyone to stick to their allotted time) and I’m not used to speaking to an audience larger than a meeting room-full (a typical MKGN audience at the current venue is around 100). Just to make things a little harder for myself, I decided to use Microsoft Sway for my visual aids.

For those who are unfamiliar with Sway: I got excited about it when it first previewed in 2014. Since then, it’s shipped and is available as part of Office 365 or as a standalone product. It’s a tool for presenting content from a variety of sources in a visually-appealing style that works across platforms and form factors.

Even though Sway has an app for Windows 10, some of the content (e.g. embedded tweets) relies on having an Internet connection at the time of presenting. Wi-Fi at conferences is notoriously bad and 3G/4G at the MKGN venue is not much better (although it did hold up for me on the night). So, with that and the 7Ps in mind, I had PowerPoint and PDF fallback plans, but I persisted with Sway.

I’m still not sure Sway is a presentation tool though…

You see, as I swiped and clicked my way through, the audience saw everything I saw. I prefer the simplicity of a picture, with my notes on my screen – I talk, the audience listens, the image reinforces the message. Sway didn’t work for me like that. Indeed, Sway falls into what Matt Ballantine recently described as the latest whizz-bang tool, in a post about a request he was given to knock up a few slides of PowerPoint:

“PowerPoint [… is …] rarely used to perform the task it was designed to do […] The latest whizz-bang tool is the answer! Prezi, Sway or whatever it is that the cool kids are using. Actually, though, the answer probably lies as much in new skills that people need to develop to communicate in a Digital era. Questions like:

  • Who is your audience?
  • What is the message that you are trying to deliver?
  • Where will they be?
  • How will they consume your content?
  • How can you extend the conversation?”

We use Sway at work for weekly updates on what’s been happening in the company – internal communications that used to make use of lengthy HTML emails (I almost never used to read to the end) became more immersive and easier to engage with. And that’s where I think Sway fits – as a tool for communications that are read asynchronously. Not as a tool for presenting a message to an audience in real time.

You can judge for yourself whether Sway works as a presentation tool by taking a look at the Sway I used for my MKGN talk.

Adventures on a Brompton bike: my first London commute

Those who know me well know that I have a collection of bikes in my garage. Fans of the Velominati will be familiar with rule #12, which states:

“Rule #12: The correct number of bikes to own is n+1.

While the minimum number of bikes one should own is three, the correct number is n+1, where n is the number of bikes currently owned. This equation may also be re-written as s-1, where s is the number of bikes owned that would result in separation from your partner.”

So, it was with great delight that I recently persuaded my wife it would be a great idea for me to buy a new bike. Maybe not the super-light road bike that I might like (I need a super-light Mark before that makes sense anyway) but a commuter. A folding bike to take on the train. A Brompton.

My employer doesn’t take part in a Cycle to Work scheme and Bromptons are pretty pricey (so saving the tax would make a big difference) but I did my research and snapped up a second-hand example with “only 100 miles on the clock” on eBay (checking first to see if it was reported as stolen, of course!). So, on Monday, I was very excited to return home from work to find that my “new” bike (bike number four) had arrived.

For those familiar with Brompton specs, it’s an M3L. I’d like an S6L or S6R but this will do very nicely instead. (If you don’t know what that means, there’s a useful configurator on the Brompton website.)

Yesterday was my first trip to London with the Brommie, so how did it go?

Look out!

Well, my hi-vis purchases from Wiggle haven’t arrived yet, and being brightly coloured is a good idea. Nipping up the inside of large vehicles is a very bad idea that’s likely to get you killed but, if you’re confident in traffic, the Brompton is responsive and handles remarkably well.

The biggest problem I had was whilst riding off the end of a bus lane, when a motorist decided that was his (perfectly legal) cue to change lanes in front of me but clearly hadn’t seen me coming. My bell is pretty pathetic for warning car drivers (even with open windows) but my shout of “look out!” worked better. As did my brakes, hastily applied as I brought the Brompton to a skid stop a few inches from the door of the car (don’t tell Mrs W…). No harm done so off we rode/drove. I might invest in an air horn though…

London roads

In common with the rest of the UK, London’s roads are poorly surfaced in places and pretty congested at times. But there are plenty of cycle lanes in central London – including the ability to ride through roads that are closed to motorised traffic (sometimes contra-flow). My normal walking route from Euston to Whitehall through Bloomsbury and Seven Dials worked really well but the reverse was less straightforward. I’ve also ordered some free cycle route maps from Transport for London, so I’ll see if they inspire some nifty short-cuts.

I know some people are critical of the system with painted bike lanes being far less satisfactory than dedicated infrastructure but this is Britain and there’s not a lot of space to share between different types of road user! Even so, with bikes becoming more and more common, I’m sure that motorists are more used to cyclists sharing the road (I have some experience on “Boris Bikes” in London too, prior to buying my Brompton bike).

Folding, carrying, etc.

Watch any experienced Brompton bike user and they fold/unfold their bike in seconds. I currently take a bit more time… though by the end of the day I was starting to get the hang of it! There’s advice on the website (as well as in the manual).  I have to admit it’s a bit heavy to lug around (up stairs, etc.) and I felt like I was Ian Fletcher in an episode of W1A as I walked into the lift but that’s OK. And joking about my cycling attire (I was only wearing a helmet but that didn’t stop the lycra jokes) amused my colleagues and customer!

Sweaty

Clothing could be an issue. I was wearing a suit, with a rucksack on my back to hold my laptop etc. and my coat. That turned out to be a bad idea. I was dripping wet when I got to work… so I’ll need a different luggage solution and maybe a change of clothes (or I may need to see if I can get away without the suit, or at least the jacket…)

Suspension

Next up, the suspension. My Brompton arrived with the standard suspension block but Brompton recommend the firm version for those over 80kg or “who cycle more aggressively and are prepared to sacrifice some comfort”! So, at lunchtime I headed over to Brompton Junction to get a replacement suspension block of the firm variety (the store bike mechanic told me that even lighter people need it as the standard is just too soft). I also picked up a pump as it was missing from my bike (some retailers fit one as standard but maybe not all do) and took a look at some luggage. Expensive but nice. After mulling it over all day, I’ve ordered a waxed cotton shoulder bag which should be in my local branch of Evans Cycles (together with the front carrier block) for collection tomorrow…

So was it worth it?

I live 12 miles from the local railway station, which would be a bit far on a Brompton (it takes 45 mins on a road bike) so I’ll still be driving that part of my journey. Once off the train though, using the bike instead of walking cut my London travel down from about 45 minutes each way to around 15. So saving 30 minutes, twice a day (on the days when I’m in town) gives me back an hour in my day (if I avoid the temptation to use it for work…) – together with more exercise. And I can use the bike and take the train to the office in Stafford now instead of a 200-mile round trip (catching up with some work, reading, or even some sleep on the train). Sounds like a result to me.

Have I been pwned?

You’re probably aware that LinkedIn suffered a major security breach, in which something like 164,611,595 sets of user credentials were stolen. Surprisingly, you won’t find anything about this in LinkedIn’s press releases.

In less enlightened times (and before I started using LastPass), I may have re-used passwords. That’s why breaches like the one at LinkedIn are potentially so bad: re-using credentials means someone could log in as me somewhere else – I could be pwned.

Microsoft Regional Director and MVP, Troy Hunt (@troyhunt) has set up an extremely useful site called HaveIBeenPwned. Entering your email address (yes, that means trusting the site) checks it against a number of known lists and yes, it seems mine was compromised in three hacks (at LinkedIn, Adobe and Gawker). In all of those cases, I’ve since changed my passwords and for popular sites – where they offer the option – I’ve started to use second factor authentication solutions (Azure MFA has been on my Office 365 subscription for a long time, I use Google two-step verification too and, since tonight, I’ve added LinkedIn’s two-step verification and Facebook Login Approvals).
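Incidentally, the site also exposes an API, so the same check can be scripted. Here’s a minimal PowerShell sketch – the v2 breachedaccount endpoint and the JSON response shape are assumptions based on the public API documentation at the time of writing, and may change:

```powershell
# Sketch: query HaveIBeenPwned for breaches affecting an account.
# Endpoint and response shape assumed from the public API documentation.
function New-BreachQueryUri {
    param([string]$Account)
    # URL-encode the account name into the breachedaccount endpoint
    "https://haveibeenpwned.com/api/v2/breachedaccount/" + [uri]::EscapeDataString($Account)
}

function Get-BreachNames {
    param([string]$Account)
    try {
        # The API returns a JSON array of breach objects, each with a Name property
        (Invoke-RestMethod -Uri (New-BreachQueryUri $Account) -Headers @{ "User-Agent" = "hibp-sketch" }) |
            ForEach-Object { $_.Name }
    } catch {
        # A 404 (not found) means the account does not appear in any known breach
        @()
    }
}
```

Calling Get-BreachNames "someone@example.com" would then list the breach names (e.g. LinkedIn, Adobe, Gawker) or return nothing if the address is clean.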

So, I guess the two points of this post are:

  1. For heavens sake stop re-using passwords on multiple sites – you can’t rely on the security of others.
  2. Turn on 2FA where it’s available.

Hopefully one day soon, passwords will be consigned to the dustbin of technology past…

Scripting Azure VM build tasks: static IP addresses, BGInfo and anti-malware extensions

Following on from yesterday’s blog post with a pile of PowerShell to build a multiple-NIC VM in Azure, here are some more snippets of PowerShell to carry out a few build-related activities.

Setting a static IP address on a NIC

$RGName = Read-Host "Resource Group"
$VNICName = Read-Host "vNIC Name"
$VNIC=Get-AzureRmNetworkInterface -Name $VNICName -ResourceGroupName $RGName
$VNIC.IpConfigurations[0].PrivateIpAllocationMethod = "Static"
Set-AzureRmNetworkInterface -NetworkInterface $VNIC

Installing BGInfo

$RGName = Read-Host "Resource Group"
$VMName = Read-Host "Virtual Machine Name"
$Location = Read-Host "Region/Location"
Set-AzureRmVMExtension -ExtensionName BGInfo -Publisher Microsoft.Compute -Version 2.1 -ExtensionType BGInfo -Location $Location -ResourceGroupName $RGName -VMName $VMName

Installing Microsoft Antimalware

This one is a little more difficult – the script is a development of Mitesh Chauhan’s work, entitled Installing Microsoft Anti Virus Extension to Azure Resource Manager VM using Set-AzureRmVMExtension.

It’s worth reading Mitesh’s post for more background on the Microsoft Anti Virus Extension (IaaS Antimalware) and also taking a look at the Security Health in the Azure Portal (currently in preview), which will highlight VMs that have no protection (amongst other things).

Mitesh’s script uses a simple settings string, or for more complex configuration, it reads from a file. I tried to use a more complex setting and it just resulted in PowerShell errors, suggesting this wasn’t proper JSON (it isn’t):

$AntiMalwareSettings = @{
    "AntimalwareEnabled" = $true;
    "RealtimeProtectionEnabled" = $true;
    "ScheduledScanSettings" = @{
        "isEnabled" = $true;
        "day" = 1;
        "time" = 180;
        "scanType" = "Full"
    };
    "Exclusions" = @{
        "Extensions" = ".mdf;.ldf;.ndf;.bak;.trn;";
        "Paths" = "D:\\Logs;E:\\Databases;C:\\Program Files\\Microsoft SQL Server\\MSSQL\\FTDATA";
        "Processes" = "SQLServr.exe;ReportingServicesService.exe;MSMDSrv.exe"
    }
}

Set-AzureRmVMExtension : Error reading JObject from JsonReader. Current JsonReader item is not an object: Null. Path '', line 1, position 4.

If I use the JSON form it’s no better:

$AntiMalwareSettings = {
    "AntimalwareEnabled": true,
    "RealtimeProtectionEnabled": true,
    "ScheduledScanSettings": {
        "isEnabled": true,
        "day": 1,
        "time": 180,
        "scanType": "Full"
    },
    "Exclusions": {
        "Extensions": ".mdf;.ldf;.ndf;.bak;.trn",
        "Paths": "D:\\Logs;E:\\Databases;C:\\Program Files\\Microsoft SQL Server\\MSSQL\\FTDATA",
        "Processes": "SQLServr.exe;ReportingServicesService.exe;MSMDSrv.exe"
    }
}

Set-AzureRmVMExtension : Unexpected character encountered while parsing value: S. Path '', line 0, position 0.
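With hindsight, one approach that might have avoided the external file altogether (untested at the time, so treat it as a sketch) is to build the settings as a PowerShell hashtable and serialise it to a proper JSON string with ConvertTo-Json, passing -Depth explicitly so the nested sections aren’t flattened:

```powershell
# Sketch: serialise a nested settings hashtable to a JSON string for -SettingString
$settings = @{
    "AntimalwareEnabled" = $true;
    "RealtimeProtectionEnabled" = $true;
    "ScheduledScanSettings" = @{
        "isEnabled" = $true;
        "day" = 1;
        "time" = 180;
        "scanType" = "Full"
    }
}

# An explicit -Depth makes sure the nested hashtables are fully serialised
$AntiMalwareSettings = $settings | ConvertTo-Json -Depth 3
$AntiMalwareSettings
```

The resulting string is valid JSON and should be acceptable to the -SettingString parameter.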

So the actual script I used is below:

# Install Microsoft AntiMalware client on an ARM based Azure VM
# Check note at the end to be able to open up the SCEP antimalware console on the server if there are problems.
# Author – Mitesh Chauhan – miteshc.wordpress.com (updated by Mark Wilson - markwilson.co.uk)
# For Azure PowerShell 1.0.1 and above
# See https://miteshc.wordpress.com/2016/02/18/msav-extension-on-azurearm-vm/

# Log in with credentials for subscription
# Login-AzureRmAccount

# Select your subscription if required (or default will be used)
# Select-AzureRmSubscription -SubscriptionId "Your Sub ID here"

$RGName = Read-Host "Resource Group"
$VMName = Read-Host "Virtual Machine Name"
$Location = Read-Host "Region/Location"

# Use this (-SettingString) for simple setup
# $AntiMalwareSettings = '{ "AntimalwareEnabled": true, "RealtimeProtectionEnabled": true }';

# Use this (-SettingString) to configure from a JSON file
$AntiMalwareSettings = Get-Content '.\MSAVConfig.json' -Raw

$allVersions= (Get-AzureRmVMExtensionImage -Location $location -PublisherName "Microsoft.Azure.Security" -Type "IaaSAntimalware").Version
$typeHandlerVer = $allVersions[($allVersions.count)-1]
$typeHandlerVerMjandMn = $typeHandlerVer.split(".")
$typeHandlerVerMjandMn = $typeHandlerVerMjandMn[0] + "." + $typeHandlerVerMjandMn[1]

Write-Host "Installing Microsoft AntiMalware version $typeHandlerVerMjandMn to $VMName in $RGName ($Location)"
Write-Host "Configuration:"
$AntiMalwareSettings

# For the -SettingString parameter, specify whichever option you want: the simple settings string above or the JSON file contents.
Set-AzureRmVMExtension -ResourceGroupName $RGName -VMName $vmName -Name "IaaSAntimalware" -Publisher "Microsoft.Azure.Security" -ExtensionType "IaaSAntimalware" -TypeHandlerVersion $typeHandlerVerMjandMn -SettingString $AntiMalwareSettings -Location $location

# To remove the AntiMalware extension
# Remove-AzureRmVMExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name "IaaSAntimalware"

# If you get an error saying the admin has restricted this app, navigate to "C:\Program Files\Microsoft Security Client"
# Run "C:\Program Files\Microsoft Security Client\ConfigSecurityPolicy.exe cleanuppolicy.xml"
# Or simply drag the cleanuppolicy.xml file above onto the ConfigSecurityPolicy.exe to sort it and you should be in.

The MSAVConfig.json file contains the JSON version of the anti-malware settings:

{
    "AntimalwareEnabled": true,
    "RealtimeProtectionEnabled": true,
    "ScheduledScanSettings": {
        "isEnabled": true,
        "day": 1,
        "time": 180,
        "scanType": "Full"
    },
    "Exclusions": {
        "Extensions": ".mdf;.ldf;.ndf;.bak;.trn",
        "Paths": "D:\\Logs;E:\\Databases;C:\\Program Files\\Microsoft SQL Server\\MSSQL\\FTDATA",
        "Processes": "SQLServr.exe;ReportingServicesService.exe;MSMDSrv.exe"
    }
}

Building a multiple NIC VM in Azure

I recently found myself in the situation where I wanted to build a virtual machine in Microsoft Azure (Resource Manager) with multiple network interface cards (vNICs). This isn’t available from the portal, but it is possible from the command line.

My colleague Leo D’Arcy pointed me to Samir Farhat’s blog post on how to create a multiple NIC Azure virtual machine (ARM). Samir has posted his script on the TechNet Gallery but I made a few tweaks in my version:

#Variables
$VMName = Read-Host "Virtual Machine Name"
$RGName = Read-Host "Resource Group where to deploy the VM"
$Region = Read-Host "Region/Location"
$SAName = Read-Host "Storage Account Name"
$VMSize = Read-Host "Virtual Machine Size"
$AvailabilitySet = Read-Host "Availability Set ID (use Get-AzureRMAvailabilitySet to find this)"
$VNETName = Read-Host "Virtual Network Name"
$Subnet01Name = Read-Host "Subnet 01 Name"
$Subnet02Name = Read-Host "Subnet 02 Name"
$cred=Get-Credential -Message "Name and password for the local Administrator account"
 
# Getting the Network
$VNET = Get-AzureRmVirtualNetwork | where {$_.Name -eq $VNETName}
$SUBNET01 = Get-AzureRmVirtualNetworkSubnetConfig -Name $Subnet01Name -VirtualNetwork $VNET
$SUBNET02 = Get-AzureRmVirtualNetworkSubnetConfig -Name $Subnet02Name -VirtualNetwork $VNET
 
# Create the NICs
$NIC01Name = $VMName+'-NIC-01'
$NIC02Name = $VMName+'-NIC-02'
Write-Host "Creating" $NIC01Name
$VNIC01 = New-AzureRmNetworkInterface -Name $NIC01Name -ResourceGroupName $RGName -Location $Region -SubnetId $SUBNET01.Id
Write-Host "Creating" $NIC02Name
$VNIC02 = New-AzureRmNetworkInterface -Name $NIC02Name -ResourceGroupName $RGName -Location $Region -SubnetId $SUBNET02.Id
 
# Create the VM config
Write-Host "Creating the VM Configuration"
$VM = New-AzureRmVMConfig -VMName $VMName -VMSize $VMSize -AvailabilitySetId $AvailabilitySet
$pubName="MicrosoftWindowsServer"
$offerName="WindowsServer"
$skuName="2012-R2-Datacenter"
Write-Host " - Setting the operating system"
$VM = Set-AzureRmVMOperatingSystem -VM $vm -Windows -ComputerName $vmName -Credential $cred -ProvisionVMAgent -EnableAutoUpdate
Write-Host " - Setting the source image"
$VM = Set-AzureRmVMSourceImage -VM $vm -PublisherName $pubName -Offer $offerName -Skus $skuName -Version "latest"
#Adding the VNICs to the config, you should always choose a Primary NIC
Write-Host " - Adding vNIC 1"
$VM = Add-AzureRmVMNetworkInterface -VM $VM -Id $VNIC01.Id -Primary
Write-Host " - Adding vNIC 2"
$VM = Add-AzureRmVMNetworkInterface -VM $VM -Id $VNIC02.Id
 
# Specify the OS disk name and create the VM
$DiskName=$VMName+'-OSDisk'
Write-Host " - Getting the storage account details"
$SA = Get-AzureRmStorageAccount | where { $_.StorageAccountName -eq $SAName}
$OSDiskUri = $SA.PrimaryEndpoints.Blob.ToString() + "vhds/" + $DiskName + ".vhd"
Write-Host " - Setting up the OS disk"
$VM = Set-AzureRmVMOSDisk -VM $VM -Name $DiskName -VhdUri $osDiskUri -CreateOption fromImage
Write-Host "Creating the virtual machine"
New-AzureRmVM -ResourceGroupName $RGName -Location $Region -VM $VM

Upgraded Azure support for Enterprise Agreement customers

I recently found myself in a situation where I tried to log a support request on my customer’s Microsoft Azure subscription, only to find that they didn’t have any eligible support agreements in place.

You'll need to buy a support plan before you can submit a technical support request

That seemed strange because, from 1 May 2016, Microsoft has been offering a 12-month support upgrade to all customers that have, or intend to buy, Microsoft Azure services on an Enterprise Agreement (EA), except those customers with a Premier support contract.

Digging a little deeper, I found that:

“Microsoft will begin upgrade for existing Azure customers on Enterprise Agreement on May 1, 2016, and plans to complete the upgrades by September 30, 2016. New customers will be upgraded within 30 days of account activation. Customers will be notified by email upon being upgraded. For more information, please talk with your account manager or contact EA Azure Support through the Enterprise Portal”

But the Enterprise Agreement Support Offer page that contains this information is subtitled “to activate, contact your Microsoft account team”, so I contacted my customer’s account team. Initially, they said that the customer needed to contact their Microsoft Licensing Solution Provider (LSP), who was equally confused, but I pushed a little harder and the account team investigated further, before arranging the necessary support.

So, if you’re an EA customer and you can’t wait until September to get an upgrade to your Azure support agreements, it may just be worth a chat with your Microsoft account team.

Short takes: deleting bit.ly Bitlinks; backing up and restoring Sticky Notes; accessing cmdlets after installing Azure PowerShell

Another collection of short notes to add to my digital memory…

Deleting bit.ly links

Every now and again, I spot some spam links in my Twitter feed – usually prefixed [delicious]. That suggests to me that there is an issue in Delicious or in Twitterfeed (the increasingly unreliable service I use to read certain RSS feeds and tweet on my behalf) and, despite password resets (passwords are so insecure), it still happens.

A few days ago I spotted some of these spam links still in my bit.ly links (the link shortener behind my mwil.it links, who also own Twitterfeed) and I wanted to permanently remove them.

Unfortunately, according to the “how do I delete a Bitlink” bit.ly knowledge base article – you can’t.

Where does Windows store Sticky Notes?

Last Friday (the 13th), I wrote about saving my work before my PC was rebuilt.

One thing I forgot about was the plethora of Sticky Notes on my desktop so, today, I was searching for advice on where to find them (in my backup) so that I could restore them.

It turns out that Sticky Notes are stored in user profiles, under %appdata%\Microsoft\Sticky Notes, in a file called StickyNotes.snt. Be aware though, that the folder is not created until the Sticky Notes application has been run at least once. Restoring my old notes was as easy as:

  1. Run the Sticky Notes desktop application in Windows.
  2. Close Sticky Notes.
  3. Overwrite the StickyNotes.snt file with a previous copy.
  4. Re-open Sticky Notes.
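The same steps can be scripted. Here’s a minimal PowerShell sketch – the backup destination is whatever you choose (the helper names here are my own, not a standard API):

```powershell
# Sketch: back up and restore the Sticky Notes store file.
# The store path is as described above; the backup destination is up to you.
$StickyNotesStore = "$env:APPDATA\Microsoft\Sticky Notes\StickyNotes.snt"

function Backup-StickyNotes {
    param([string]$Destination)
    # Close Sticky Notes first so the .snt file isn't locked
    Copy-Item -Path $StickyNotesStore -Destination $Destination
}

function Restore-StickyNotes {
    param([string]$Source)
    # Run Sticky Notes once (to create the folder), close it, then restore
    Copy-Item -Path $Source -Destination $StickyNotesStore -Force
}
```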

Azure PowerShell installation requires a restart (or explicit loading of modules)

This week has involved a fair amount of restoring tools/settings to a rebuilt PC (did I mention that mine died in a heap last Friday? If only the hardware and software were supplied by the same vendor – oh they are!). After installing the Azure PowerShell package from the SCCM Software Center, I found that cmdlets returned errors like:

Get-AzureRmResource : The term 'Get-AzureRmResource' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

After some RTFMing, I found this:

“This can be corrected by restarting the machine or importing the cmdlets from C:\Program Files\WindowsPowerShell\Modules\Azure\XXXX\ as following (where XXXX is the version of PowerShell installed):

import-module "C:\Program Files\WindowsPowerShell\Modules\Azure\XXXX\azure.psd1"
import-module "C:\Program Files\WindowsPowerShell\Modules\Azure\XXXX\expressroute\expressroute.psd1"”

Adventures with robocopy.exe

It’s been a while since I had to make copies of large numbers of files in complex directory structures from the Windows command prompt but, faced with the need to take a backup within a command line environment (the WinRE Command Prompt), I needed to refresh my Windows command line skills.  There’s loads of advice out there on the Internet (most of it subjective) but the general consensus seems to be that the Extended Copy command (xcopy.exe) is deprecated and has been replaced in recent versions of Windows by the Robust File Copy command (robocopy.exe). Of course, there are many alternatives but they are not natively provided in WinRE!

(Some of the more useful articles I found are Nicholas Tyler’s reply on Stack Overflow, Oliver Muchai’s reply on Super User and Scott Hanselman’s blog post from 2007.)

Robocopy has loads of options but the ones I selected in the end were:

robocopy sourcefolder targetfolder /MIR /ZB /XJ /R:3 /W:1 /log:filename.txt

to make a mirror copy of the data, in restartable mode (to survive network glitches), using backup mode in the case of an access denied error, to exclude Junction Points, to retry 3 times on failure, waiting 1 second each time (compared with the defaults of 1 million and 30 seconds respectively) and to log to the chosen file.

The /XJ switch was added as a late addition after some abortive attempts ended up with recursive Application Data folders. Some people have erroneously referred to this as a bug in Robocopy – actually it’s caused by Windows’ attempts to prevent application developers from writing to system locations (forcing them to write to the user profile instead), as described by “DaddyMan” in a Microsoft forum post:

“The Application Data folder is actually a junction, which points back to its parent folder.
[%username]\AppData\Local\Application Data\
points to
[%username]\AppData\Local\”

and by Shawn Keene (@LtCmdrKeene) in another, similar, post:

“[Any time] an application tries to save a file to a naughty location (such as C:\Windows or C:\Program Files), Windows will force the actual save to end up at a place inside your user folder instead (C:\Users\Username\LocalSettings\VirtualStore\Program Files).  It tricks the program into thinking that the file really did go to the Program Files folder, but in reality it’s somewhere inside your user folder.

This [virtualisation] (tricking the program) is required so that badly-created apps that save to naughty locations will still work.  The alternative is that the program tries to save and then crashes when it can’t access the Program Files folder.  If Windows didn’t do this, the program would require administrator access every time it runs — which is very insecure, plus would make the program impossible to use in corporate environments where users aren’t allowed to be administrators.

Rest assured that the multiple layers you are seeing are a result of folder redirection and [virtualisation] (also known as junction points).  There’s no need to clean these up or correct it, and you are well advised to avoid exploring those files.”

Finally, I needed to remove the folders that I had accidentally created with recursive Application Data folders inside (I counted 25 in one case!). Neither Windows nor the Windows Command Prompt (del and rmdir commands) could do this, resulting in “too long” errors, but “Aaron” on Super User has the answer (a variation on the method Bob Coss described in a comment on one of my own old blog posts):

“Create an empty directory with mkdir empty, then use robocopy empty\ "Application Data" /mir, which will remove the whole directory tree. Then issue a rmdir empty and rmdir "Application Data" to clean up and you’re done.”

Windows 10 PC stuck in BitLocker loop (and recovering details of open tabs in the Edge browser)

I try not to reboot my PCs too often – frankly I thought I’d left the days of daily reboots behind with Windows 95 – but, faced with a display driver bug on my Surface Pro 3 (that seems to be triggered by the Azure Portal), a change of password that led to repeated authentication prompts (and OneDrive refusing to sync), together with some software updates pushed to my PC from SCCM, I had little choice this afternoon.

Unfortunately that “quick reboot to get things working again” turned into a disaster, with an hour-long support call, followed by a desperate attempt to recover the last few hours’ work.

Stuck in a BitLocker loop

After rebooting, I found that a Windows 10 update hadn’t properly applied. Each time I entered my BitLocker PIN, I was faced with a message that invited me to use the BitLocker key to recover my PC. My IT support team gave me my key… and then after a restart we went round the loop again. We tried hard resets, turning the TPM on and off in the BIOS and more, until I found a TechNet wiki article that seemed to describe the issue (or at least something very like it).

To terminate this BitLocker recovery loop, I needed to suspend BitLocker from within the Windows Recovery Environment (WinRE). That’s OK, as long as you have the recovery key. Following the advice in the article linked above, I chose the “Skip this drive” link at the bottom of the page that requests entry of the recovery key, before selecting Advanced options/Troubleshoot/Advanced options/Command Prompt.

Next, I disarmed BitLocker using the following commands:

manage-bde -status c:
manage-bde -unlock c: -rp recoverypassword
manage-bde -protectors -disable c:

With BitLocker disabled, I hoped to be able to restart the PC and boot Windows, but unfortunately it was still not playing ball. I’ll be driving to the office on Monday for someone to take a look at my PC and I suspect a rebuild will be on the cards…

Work in progress

Despite the support team’s assurances that all of my data is on servers, I’m pretty sure it’s not. All of my data until I changed my password is on servers but anything since then has been failing to sync. If the sync engine can’t authenticate, I’m pretty sure I must be working from a local copy – which will be lost if the PC is rebuilt!

The items of most concern to me were some scripts I’d finally got working this afternoon; and any notes in OneNote.  I wrote last year about issues with OneNote and OneDrive (now overcome by doing it properly) but goodness knows where the unsynced changes are (again, I found a backup, but it doesn’t have the latest changes in it).

Again, using the WinRE Command Prompt, I backed up the files I thought were most likely to be missed. I tracked down the scripts that I’d finally completed and that had led to a few late nights this week (phew!) – and made a backup copy of my user profile, just in case.

The last worry for me was my browser. Forced by policy to use a Microsoft browser, I had lots of open tabs in Edge, as well as a few in Internet Explorer. The ones in Edge included the various posts I’d found that had helped me to complete my scripts – and I wanted to go back through them to blog about what I found…

Edge does recover sessions after a crash but, with a potential PC rebuild on the cards, I’m not sure I’ll ever get the chance, so I tried tracking down the location of the recovery data. Brent Muir’s fascinating look at Windows 10 – Microsoft Edge Browser Forensics told me where to find the recovery files (in %userprofile%\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe\AC\MicrosoftEdge\User\Default\Recovery\Active) but they are binary. Gleb Derzkij’s answer to a Stack Overflow post looked useful but I couldn’t get it to work. What I could do, though, was open each of the (115!) .dat files in the Active Recovery folder using Notepad, find enough information in there to identify the URIs, then manually copy and paste them to a text file (ready to open when I’m back at my PC).

So that’s recaptured my work and the PC is ready to be completely razed to the ground if necessary. And the moral of the story? Never apply updates on Friday the 13th!

Reset the password for a Windows virtual machine in Azure

Imagine the scenario: you have a virtual machine running in Azure but something’s gone wrong and you don’t have administrative credentials to log in to Windows. That’s a more common occurrence than you might expect, but there is a workaround: Azure has an option to reset the local administrator password.

Unfortunately, that capability hasn’t been implemented yet in the management portal for Azure Resource Manager but it is available in Microsoft Azure PowerShell.

Reset Password - Coming Soon

I found the following commands worked for me (based on a blog post by Dan Patrick). They reset the built-in administrator account on the defined server in the defined resource group, renaming it to DisabledAdmin (at which point it’s no longer disabled; once the server is unlocked and an alternative administrator has been created, the built-in account can be disabled again), with a GUID for the password:

$rgName = "Example-Resource-Group"
$vmName = "SERVERxxx"
$extName = "VMAccessAgent"
$userName = "DisabledAdmin"
$password = [guid]::newguid()
$location = "westeurope"
Set-AzureRmVMAccessExtension -ResourceGroupName $rgName -VMName $vmName -Name $extName -UserName $userName -Password $password -Location $location

(of course, you’ll need to take a note of that GUID if you want to log in to the account!).
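Since [guid]::newguid() returns a Guid object rather than a string, one small tweak (my own, not from Dan’s original post) is to capture it as a string and echo it, so there’s no risk of losing the new password:

```powershell
# Capture the GUID as a string so it can be recorded (and used to log in later)
$userName = "DisabledAdmin"
$password = [guid]::NewGuid().ToString()
Write-Host "Password for $userName is: $password"
```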

The VM Access Extension can be called anything you like (the MSDN reference for Set-AzureRmVMAccessExtension gives more information); however, as noted in the Microsoft Azure documentation (How to reset the Remote Desktop service or its login password in a Windows VM):

“You can reset remote access to your VM by using either Set-AzureRmVMExtension or Set-AzureRmVMAccessExtension.

“Both commands add a new named VM access agent to the virtual machine. At any point, a VM can have only a single VM access agent. To set the VM access agent properties successfully, remove the access agent set previously by using either Remove-AzureRmVMAccessExtension or Remove-AzureRmVMExtension. Starting from Azure PowerShell version 1.2.2, you can avoid this step when using Set-AzureRmVMExtension with a -ForceRerun option. When using -ForceRerun, make sure to use the same name for the VM access agent as set by the previous command.”

So, by using a known name for the VM Access Extension (VMAccessAgent), I can avoid potential issues later.