Securely wiping hard disks using Windows

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

My blog posts might be a bit sporadic over the next couple of weeks – I’m trying to squeeze the proverbial quart into a pint pot (in terms of my available time) and am cramming like crazy to get ready for my MCSE to MCITP upgrade exams.

I’m combining this Windows Server 2008 exam cramming with a review of John Savill’s Complete Guide to Windows Server 2008 and I hope to publish my review of that book soon afterwards.

One of the tips I picked up from the book this morning, as I tried to learn as much as I could about BitLocker drive encryption in an hour, was John’s tip for securely wiping hard drives using a couple of Windows commands:

format driveletter: /fs:ntfs /x

will force a dismount if required and reformat the drive, using NTFS.

cipher /w:driveletter:

will remove all data from the unused disk space on the chosen drive.
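For the record, here’s how those two steps might be wrapped up in a script. This is a quick Python sketch of my own (the helper names are made up, and, since format and cipher are destructive, it defaults to a dry run that just prints the commands rather than executing them):

```python
import subprocess

def build_wipe_commands(drive_letter: str) -> list[list[str]]:
    """Construct the two-step wipe: reformat as NTFS, then overwrite free space.

    Only builds the command lines so they can be reviewed first - nothing runs.
    """
    drive = f"{drive_letter.rstrip(':').upper()}:"
    return [
        # Reformat the volume as NTFS, forcing a dismount if required
        ["format", drive, "/fs:ntfs", "/x"],
        # Overwrite all unused space on the freshly formatted volume
        ["cipher", f"/w:{drive}"],
    ]

def wipe_drive(drive_letter: str, dry_run: bool = True) -> None:
    """Run (or just print) the wipe sequence. Windows only when dry_run=False."""
    for cmd in build_wipe_commands(drive_letter):
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
```

After the format, the whole volume is free space, so the cipher /w pass effectively overwrites everything.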

I don’t know how this compares with third party products that might be used for this function but I certainly thought it was a useful thing to know. This is not new to Windows Server 2008 either – it’s certainly available as far back as Windows XP and possibly further.

For more tips like this, check out the NTFAQ or John’s site at Savilltech.com.

How Microsoft and RSA plan to protect our sensitive data


Mention Microsoft and security in the same sentence and most people will scoff but these days it’s really a bit unfair… Windows security has come a long way (it still has a way to go too) but nevertheless, many of the customers that I deal with run third party solutions (often at great cost) rather than trust their data security to Microsoft.

Then there’s digital rights management (DRM) – we hear a lot about how DRM is applied to music and video downloads but little about the real practical use of this technology – making sure that only those who are entitled to see a particular item of data (for example medical records or financial details) are able to access it.  Microsoft has rights management services built into Windows as one of the many identity and access solutions but it seems to me that very few organisations use this capability.  Perhaps a few of the frequent and high profile Government data security mishaps would be mitigated if DRM was applied to their data…

Today, Microsoft and RSA – a well-respected security company, now absorbed into EMC – announced an expansion of their technology partnership.  Under the terms of this partnership, Microsoft will license the RSA Data Loss Prevention (DLP) classification engine in order to trigger policy-based controls over information.

Tom Corn, Vice President of Product Management and Marketing for RSA’s Data Security Group, explained that organisations have a requirement to share information without limiting accessibility – striking a balance between security and accessibility.  Slating existing point products as costly, complex and not addressing the problem he explained how:

  1. Protection is an end-to-end problem and the data moves around – existing products only act at certain points in the data exchange.
  2. Infrastructure components lack visibility of the data sensitivity – context is required to classify data and take appropriate actions.
  3. Existing tools and controls lack identity awareness, making it difficult to tie protection to identity.
  4. Management – security policies often exist as binders on shelves and may be written by different groups within an organisation (e.g. security, or operations) leading to a disconnected approach.  All too often the management policies are infrastructure-centric (e.g. laptop security policy, Internet security policy) rather than information-centric (e.g. credit card data storage policy).

Meanwhile, John (JG) Chirapurath, Director of Identity and Security at Microsoft spoke about how Microsoft is licensing DLP to build it into products such as Exchange Server and Office SharePoint Server to provide content awareness, then providing identity awareness through components such as Active Directory Rights Management Services (AD RMS) to allow collaboration (which relies on knowledge of identity) whilst protecting intellectual property.  By “building in” and not “bolting on”, Microsoft believes that it can provide an end-to-end solution, supported with centralised management for information-centric policies for usage, protection and access.

Under the terms of the agreement, RSA will launch DLP v6.5 later this month with full integration to AD RMS and, as new versions of products come to market, eventually the entire infrastructure will make use of the DLP technology.  Customers are able to protect their investment as the core engine and policy formats exist today and, as the core DLP technologies are adopted into the Microsoft platform, RSA will continue to develop complementary products (e.g. advanced management consoles).

Microsoft were unwilling to disclose any further details of their roadmap for integrating the DLP product into their products but did comment that the claims-based identity platform codenamed Geneva (formerly Zermatt) is a key part of Microsoft’s identity strategy and that there would be clear advantages in using Windows CardSpace to unlock business to consumer (B2C) scenarios for data exchange.  There was also a hint that management would be possible from RSA’s products and from the Forefront integrated security system product (codenamed Stirling).

All in all, this is a positive step on the part of Microsoft and EMC/RSA.  What remains to be seen is how willing business and Government customers are to invest in protecting their data.  Right now we have a business problem and a technology solution but it seems to me there is an apparent lack of desire to implement the technology and supporting processes.  Let’s hope that by integrating technologies like DLP into the core IT infrastructure, our personal details can remain confidential as we increasingly collaborate online.

Windows Vista and Server 2008 SP2 is opened up to the public, target release date announced


After the storm of announcements from Microsoft at PDC, WinHEC and TechEd EMEA it’s been a quiet few weeks but, for those who haven’t seen, Microsoft announced that the Windows Vista and Server 2008 Service Pack 2 beta will be opened up to a wider audience, starting with TechNet and MSDN subscribers at 14:00 tomorrow (I guess that’s Redmond time, so 22:00 here in the UK) and then via a broader customer preview programme (CPP) on Thursday (4 December).

This release is intended for technology enthusiasts, developers, and administrators who would like to test SP2 in their environments and with their applications prior to final release and, for most customers, Microsoft’s advice is to wait until the final release prior to installing this update.

Full details of the changes in the SP2 beta may be found in Microsoft’s Windows Server TechCenter.

Microsoft also announced the date that they are aiming for (not a firm commitment) – SP2 should be expected in the first half of 2009.

NetBooks, solid state drives and file systems


Yesterday, I wrote about the new NetBook PC that I’ve ordered (a Lenovo IdeaPad S10). In that post I mentioned that I had some concerns about running Windows 7 on a PC with a solid state drive (SSD) and I wanted to clarify something: it’s not that Windows 7 (or any other version of Windows) is inherently bad on SSD, it’s just that there are considerations to take into account when making sure that you get the most out of a solid state drive.

Reading around various forums it’s apparent that SSDs vary tremendously in quality and performance. As a consequence, buying a cheap NetBook with a Linux distro on it and upgrading the SSD to a larger device (the Linux models generally ship with lower capacity SSDs than their more expensive Windows XP brethren) is not necessarily straightforward. Then there’s the issue of form factor – not all SSDs use the same size board.

Another commonly reported issue is that NTFS performance on an SSD is terrible and that FAT32 should be used instead. That rings alarm bells with me, because FAT32 does not include any file-level access control lists and has a maximum file size of 4GB – so it’s no good for storing DVD ISOs (not that you’ll fit many of those on the current generation of SSDs anyway, and most NetBooks do not ship with an optical drive).

The reason for poor NTFS performance on SSDs may be found in a slide deck from the 2008 Windows Hardware Engineering Conference (WinHEC), where Frank Shu, a Senior Program Manager at Microsoft, highlights:

  • The alignment of NTFS partition to SSD geometry is important for SSD performance in [Windows]
    • The first Windows XP partition starts at sector #63; the middle of [an] SSD page.
    • [A] misaligned partition can degrade [the] device’s performance […] to 50% caused by read-modify-write.

It sounds to me as if those who are experiencing poor performance on otherwise good SSDs may have an issue with the partition alignment on their drives (whilst SSDs come in a smaller package, are resistant to shocks and vibration, use less power and generate less heat than mechanical hard drives, SSD life and performance vary wildly). Windows 7 implements some technologies to make best use of SSD technology (read more about how Windows 7 will, and won’t, work better with SSDs in Eric Lai’s article on the subject).
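The arithmetic behind the alignment warning is easy to check. This little Python sketch (my own illustration, assuming a typical 4KB NAND page size) shows why the old Windows XP starting offset of sector 63 straddles a page boundary, while a 1MB-aligned partition does not:

```python
SECTOR_SIZE = 512        # bytes per logical sector
SSD_PAGE_SIZE = 4096     # a typical NAND page size (varies by drive)

def is_aligned(start_sector: int, page_size: int = SSD_PAGE_SIZE) -> bool:
    """True if the partition's byte offset falls on an SSD page boundary."""
    return (start_sector * SECTOR_SIZE) % page_size == 0

# Windows XP starts the first partition at sector 63: 63 * 512 = 32,256 bytes,
# which lands mid-page - so writes can trigger read-modify-write cycles
assert not is_aligned(63)

# Windows Vista and later align at 1MB (sector 2048): 1,048,576 bytes, page-aligned
assert is_aligned(2048)
```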

In addition, at the 2007 WinHEC, Frank Shu presented three common issues with SSDs:

  • Longer setup time for command execution.
  • SSD write performance.
  • Limited write cycles for NAND flash memory (100,000 write cycles for single-level cell (SLC) devices and 10,000 write cycles for multi-level cell (MLC) devices).

(He also mentioned cost – although this is dropping as SSDs become more prevalent in NetBooks and other PC devices aimed at highly-mobile users).
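Those write-cycle figures lend themselves to a back-of-envelope endurance estimate. This sketch is my own arithmetic (assuming perfect wear-levelling and ignoring write amplification, so real-world figures will be lower) but it shows why the limit is less scary than it sounds:

```python
def endurance_years(capacity_gb: float, write_cycles: int,
                    gb_written_per_day: float) -> float:
    """Rough drive lifetime assuming perfect wear-levelling:
    total writable data divided by daily writes, expressed in years."""
    total_writes_gb = capacity_gb * write_cycles
    return total_writes_gb / gb_written_per_day / 365

# A 16GB SLC drive (100,000 cycles) with 10GB written per day:
slc = endurance_years(16, 100_000, 10)   # roughly 438 years
# The same drive with MLC flash (10,000 cycles):
mlc = endurance_years(16, 10_000, 10)    # roughly 44 years
```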

In short, SSD technology is still very new and there are a lot of factors to consider (I’ve just scraped the surface here). I’m sure that in the coming years I’ll be putting SSDs in my PCs but, as things stand at the end of 2008, it’s a little too soon to make that jump – even for a geek like me.

Incidentally, Frank Shu’s slide decks on Solid State Drives – Next Generation Storage (WinHEC 2007: WNS-T432) and Windows 7 Enhancements for Solid-State Drives (WinHEC 2008: COR-T558) are both available on the ‘net and worth a look for anyone considering running Windows on a system with an SSD installed.

Why Lenovo’s S10 seemed like a good idea(pad) to me


I try to keep my work and home life on different computers. It doesn’t always work, but that’s the idea anyway. The problem I find is that, every time I’m away from home (which is when I get most of my blogging done), I find myself carrying around two laptops and, even without any peripherals (power adapters, etc.), that’s 4.5kg of luggage. Any sensible person would use an external hard disk for one of the workloads but… there you go…

I’ve been watching developments with small form-factor PCs (so called “NetBooks”) for a while now and over the weekend I took the plunge. Tomorrow morning I’m expecting a delivery of a Lenovo IdeaPad S10 to slip in my bag alongside the Fujitsu-Siemens S7210 that I use for work.

So why did I choose the Lenovo?

  • In terms of build quality, my IBM ThinkPad is by far and away the best notebook PC I’ve ever had (better than the various Toshiba, Compaq, Dell and Fujitsu-Siemens units I’ve used – and certainly better than my Apple MacBook) – I’m hoping that Lenovo have continued that quality as they’ve taken on the former IBM PC business (the reviews I’ve read certainly indicate that they have).
  • I want to use this NetBook with Windows 7 – and I know it can work (this is the model that Steven Sinofsky showed in a keynote at Microsoft’s 2008 Professional Developers Conference).
  • I was impressed with Windows 7 running on Paul Foster’s Acer Aspire One, but the keyboard is just too small for my fat fingers.
  • The Lenovo S10 has an ExpressCard slot (so it should work with my Vodafone 3G card – and yes, I know I can get a USB version but I’d need to convince my employers of the need for an upgrade, which would not be an easy sell when they give me a perfectly good laptop with an ExpressCard slot to use…).
  • I also seriously considered the Dell Mini 9 (especially when they mis-priced it on their website for £99 last week – incidentally, the resulting orders were not fulfilled) but I’m not convinced that using a pre-release operating system on a solid state drive is really a good idea – I could easily kill the drive within a few months. Meanwhile, the Lenovo has a traditional 160GB hard disk and the 10.2″ screen (rather than 9″) translates into more space for a larger keyboard without noticeably increasing the size of the computer (for those who still want a 9″ model, Lenovo have announced an S9 but I’ve seen no sign of it in the UK yet). Another option that I discounted was the Samsung NC10 – which has a better battery and one more USB port but no ExpressCard slot.
  • The equivalent Asus and Acer models may be less expensive but the big names (IBM, Dell, HP as well as Samsung and Toshiba) are all reducing their prices – and by waiting for the reduction in the UK’s VAT rate to take effect the price was £292.25 for the S10 at eBuyer with free shipping (although I paid another tenner for next-day delivery).

I’m sure my sons will be amused when yet another computer appears on my desk (my wife may be slightly less so…) but I’m thinking of this as an early Christmas present to myself…

Further reading

Here are some of the posts that I found useful before deciding to buy this PC:

Useful Links: November 2008


A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:

More Xtremely Technical seminars scheduled for spring 2009


A couple of weeks back, I was lucky enough to attend one of John Craddock and Sally Storey’s XTSeminars on Windows Server 2008 (those who were at the inaugural Active Directory User Group meeting would have got a taster). I’d blogged about the event beforehand and it really was an excellent use of my time – I can’t overstate how good these seminars are (think 2 whole days of detailed presentations and demonstrations, diving pretty deep in places, with none of the marketing overhead you would have in a Microsoft presentation).

If the credit crunch hasn’t hit your training budget yet, then you might want to consider one of the workshops that are scheduled for the spring and the current dates are:

  • 25-26 February 2009, Microsoft Active Directory Internals.
  • 11-12 March 2009, Active Directory Disaster Prevention and Recovery.
  • 18-19 March 2009, Windows Server 2008.

If you do decide that you’re interested in one of these sessions and you book onto it – please mention my name (or even get in touch with me to let me know) – it won’t make any difference to your booking process but it will help me if they know you heard about the seminars on this blog!

Getting to grips with presenting using Microsoft Office Live Meeting


This morning, I gave a technical presentation to a fairly large group (around 60 people). Nothing special there – I ought to be able to do that by this stage in my career – but this was a presentation with a difference… it was conducted via Microsoft Office Live Meeting 2007 (using the BT Conferencing service).

Now, the fact that this was done over the web was great: 60 fewer individual journeys in order to meet somewhere mutually convenient (resulting in direct environmental and financial cost benefits, as well as time savings); one less conference room (more financial benefit); and I didn’t need to take a load of equipment with me for a demonstration (although I could have done all the demos for this session on my laptop).

I’ve attended many Live Meetings where other people are presenting but I’ve never led one before and what hadn’t struck me until we did a dry run to test the technology was the impact of not being able to see my audience. With 60 people each connecting individually, many of them behind a corporate proxy server that won’t let SIP-based audio pass let alone video, webcams (even RoundTable devices) were out of the question. In effect, I was talking to my computer for just over an hour and hoping that people were still interested. It’s not a nice way to present – I rely on my audience’s body language to know that people are interested, that they understand what I’m saying, that I’m not going too fast, or too slow – and, even though Live Meeting has the facility for people to provide feedback, when you’re presenting your content and balancing slides, notes and demos, watching the seating chart to see if someone has turned their flag to red, or the Q&A panel to see if someone has a really pertinent question is just not very practical.

Despite that, it worked pretty well. Apart, that is, from me having too much content once I’d taken into account that people had joined the call late (as is normal in the organisation where I work) and that, even though I’d booked a 75 minute slot, people tend to think in hours and would start to drop off the call at the 60 minute point… never mind, we live and learn.

I don’t want to suggest that I’m now some sort of presentation God (I’m certainly not – although I do enjoy this sort of thing) – what I’d really like to get across in this blog post are the discoveries I made on a fairly steep learning curve with Live Meeting over the last few days, in the hope that they may be useful for someone else.

The first challenge was scheduling the meeting. It’s useful to know that Live Meeting can schedule meetings from the client application (which integrates with Microsoft Office functionality – for instance the Outlook Calendar) but that there is also a web interface – and the web interface is where options such as recording the meeting, whether or not to include audio, presenter feedback, etc. are configured. There are also two types of meeting: scheduled; or meet now.

It’s also worth knowing a bit about how the audio content works. I know from trying to watch Microsoft webcasts over Live Meeting when connected to the corporate network that our proxy servers do not allow the audio portion to pass, so I need to work from home or a hotel to use audio with Live Meeting (hence the panic when my ADSL line went down last night). For that reason, I wanted people in an office to be able to dial in to a voice conferencing service and, whilst BT Conferencing’s Live Meeting service is linked to BT MeetMe to provide this functionality, MeetMe has a maximum of 40 participants. BT were quite happy to sell me a managed event call as an alternative but I’m not even empowered to order anything more than the most trivial of expenses these days without management approval (even staying in a half-decent hotel needs director-level sign-off), so I didn’t want to jump through hoops to explain why a lowly solution architect was holding a meeting with a high number of attendees. A bit of lateral thinking led me to a solution – I also have a voice conferencing account with Genesys and, whilst I didn’t want to have to install their software for the webcast, the audio portion of their Meeting Centre does allow 125 participants to join the call. So, after telling everyone behind the firewall to dial a different number and to put their phones on mute, we were in business. The one downside was that I needed to wear headphones with a microphone for the Live Meeting audio (for the recording) and to use a hands-free speaker phone for the voice conferencing at the same time.

Next up – how to present the slides. In my first attempt at getting Live Meeting to work, I shared my screen and showed PowerPoint that way. It really hit my computer’s performance and the quality was awful. The correct way to do it is to go to the Content menu in Live Meeting, select Share, then Upload File (View Only) – or alternatively select Manage from the Content menu and then click the button to upload a file. Live Meeting will convert the file to its own format, before uploading and scanning for any security issues but, even though this feature is intended to work for various Office file formats, PDFs, multimedia and HTML files, if you use a 64-bit operating system (I do) then only PowerPoint will work.

Live Meeting also lets you do things like white-boarding, application sharing and even desktop sharing. I used the application sharing functionality to share a remote desktop connection for some demos and also created some polls to get a feel for my audience’s experience (the idea being that I could pitch the presentation accordingly – all the more important without a direct feedback mechanism).

And, since the meeting ended, I’ve found that I could have set the colour depth when sharing applications and also viewed the screen resolution of other meeting participants in order to pick something appropriate.

So, what else did I learn?

  • I’d definitely recommend using a co-presenter. One of my colleagues facilitated the meeting and was also acting as a presenter in Live Meeting. That meant he could monitor things like the Q&A panel to deal with any urgent questions, connection difficulties, etc.
  • The 6 Ps (or just Practice Practice Practice, for those who are not familiar with the slightly less polite version) – aside from all the normal planning and preparation that I would put into a presentation, there was the effort put into making sure that the technology would work. Here, again, my co-presenter Mike was really helpful (“Can you hear me over Live Meeting?” – “No” – “What about now?” – “That’s better!”; “How do the slides look in the Live Meeting client?”, etc.)
  • Give yourself plenty of time before the session to upload the slides and generally prepare. My 15.5MB PowerPoint 2007 presentation was just over twice that size when converted to Live Meeting format, and took a while to upload over an ADSL line. Then there may be polls to set up, applications to get ready for sharing, etc.
  • When presenting PowerPoint slides, you can turn thumbnails on/off in the Content menu, but there is no equivalent to PowerPoint’s Presenter View to access speaker notes. As a consequence, it might be handy to export the PowerPoint presentation to a Word document and print it before starting the meeting.
  • If you like to point things out on your slides (and I do), then the annotation tools may come in handy with a pointer, highlighter, and other tools too.
  • If you’re planning on recording a meeting, don’t forget to click the record button! (and make sure people know that they are being recorded – so they can opt out if they’re not comfortable with that). Whilst on the subject of recordings, by far and away the biggest disappointment for me was that, even though there are two versions available for each recording (for viewing or for download), neither one is perfect:
    • The Microsoft Office Live Meeting High Fidelity Presentation does not need any add-ins to play but I found there were some substituted fonts, the demonstrations using shared applications were not recorded and the slide animations did not work correctly.
    • The Microsoft Office Live Meeting Replay is much better, but does not show slide animations (so some slides will appear with lots of graphics on top of one another) and it requires the “Microsoft Office Live Meeting Replay Wrapper” to be installed from the download page.
  • As a result of the above, it might be necessary to refrain from using some PowerPoint features (e.g. slides with lots of animations) as they may not present well in the recorded version of the Live Meeting – one of my more complex slides wasn’t looking too good during the presentation either (although it seems to be OK on the Live Meeting replay).
  • If you use polls to solicit feedback from the audience, you can extract that data later. It took some time to work out how – in the end I found out that the web console has the ability to generate reports (it’s possible to report on the names of attendees and the time that they connected, disconnected, their IP address, Live Meeting client type, etc.) and those reports include the poll data.

This is just scraping the surface of what’s possible with Live Meeting – there’s a lot more functionality available (meeting lobby, breakout rooms, etc.) but this summarises the basics that I had to get to grips with over the last few days. Sadly the online help provided by Microsoft is very superficial (BT do provide some additional help as part of their service and I’m sure other providers do something similar) but a bit of patience and a well-targeted Google search should help to fill in the gaps.

Great customer service from my ISP – and a useful BT exchange status checker


There are a lot of bad things written about UK Internet service providers, so I’m really glad to have a positive tale to tell tonight. My ISP (Plusnet) may not be the least expensive but, after my friend Alex’s experience of moving from Nildram to Virgin and seeing the line speed on his connection drop from 6Mbps to less than 512Kbps (on the same line, with the same equipment at his end) and Virgin telling him that was the most the line would support, I’m reluctant to switch ISPs – especially as my ADSL connection is normally rock solid with the router reporting a connection speed around 7Mbps.

Unfortunately, as I’m here burning the midnight oil, putting the finishing touches on a presentation I’m supposed to be delivering via Live Meeting (over said ADSL line) tomorrow morning – my connection has gone down. Arghhh! After restarting almost all the equipment on my home LAN, I noticed that my router’s PPP interface had not picked up an IP address, despite showing operational DSL status. I called Plusnet, expecting a lengthy wait, only to be surprised as my call was answered within seconds of selecting the technical support option from the ACD prompt. The really helpful tech support guy that I spoke to (Jake) was just working through checking my router settings when he noticed that my local telephone exchange has a major service outage – detected at 22:26 this evening (just before I got home) and due to be cleared by 00:26 (by when I had hoped to be in bed…). Not to worry – at least I know it’s a problem at the exchange.

The good news is that Plusnet also has an exchange status checker in the user tools section of their website – and even though my ADSL line is down, I can use my iPhone’s mobile data connection to access the status reports.

It’s currently a minute past midnight and the connection’s not back up yet… but at least I can feel better as I keep track of BT whilst they fix the line.

[Update: 00:14 and the line is back up… just enough time to publish this blog post and catch some zeds before an early start tomorrow.]

Recording VoIP calls using Wireshark


Gary Marshall writes about how the UK Government plans to pour billions of pounds (as if they weren’t wasting enough money already) into recording all of our telephone calls. Well, funnily enough, I want to do the same thing… and it turns out to be remarkably easy – at least it is if you’re using a VoIP phone.

First of all, I should point out that, depending on where you live, it might be illegal to record phone calls without consent. In my case, I recorded a call from my desk phone to the voicemail on my mobile phone. As I was both the caller and the receiver I think it’s safe to say that there was consent – even if it does sound a bit mad. This was a proof of concept – the real use case I have in mind is for the Coalface Tech podcasts, as last time James and I tried to record one over Skype there was just too much lag (and interference… although that might have been a local problem). Using the Cisco 7940 on my desk in the UK to call a landline in Oz via my Sipgate account shouldn’t be too bad (and won’t cost too much either). What follows is a recipe for recording the call.

Ingredients

* If a softphone were used on the same computer as the packet capture, then it should be possible to capture the network traffic without needing to use a hub.

Method

  1. Install and configure Wireshark.
  2. Ensure that the computer being used for packet capture can see the phone traffic (i.e. that they are both connected to the hub – not a switch, unless port spanning or a tap are in use).
  3. Using Wireshark, start capturing traffic on the appropriate interface.
  4. Once the call(s) to be captured have been made, end the capture.
  5. In Wireshark, select VoIP calls from the Statistics menu – details of all captured calls should be listed.
  6. At this point, it’s also possible to graph the traffic and to play back the call (once decoded) – either one or both streams of the conversation.
  7. That’s enough to play back the call but to record it a different approach is required. Return to the list of captured packets and select the first RTP packet in a conversation.
  8. From the Statistics menu, select RTP and then Stream Analysis… This will show the packets in either direction.
  9. Click the Save payload… button to save to file – .au format with both streams is probably most useful.
  10. The .au format is generally used for UNIX-generated sound files and can be played in Windows Media Player (see Microsoft knowledge base article 316992). Alternatively convert it to another format using whatever tools are appropriate (I used Switch on a Mac to convert from .au to .mp3).
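For the curious, the RTP packets that Wireshark analyses have a simple fixed 12-byte header (defined in RFC 3550), and parsing one by hand is straightforward. This Python sketch is my own illustration (not part of Wireshark) of pulling out the fields that the Stream Analysis window displays:

```python
import struct

def parse_rtp(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550) from a UDP payload."""
    if len(packet) < 12:
        raise ValueError("too short to be RTP")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    csrc_count = b0 & 0x0F              # optional contributing-source IDs
    header_len = 12 + 4 * csrc_count
    return {
        "version": b0 >> 6,              # always 2 for RTP
        "payload_type": b1 & 0x7F,       # 0 = PCMU (G.711 u-law), 8 = PCMA
        "sequence": seq,                 # used to order/deduplicate packets
        "timestamp": timestamp,          # sample clock (8kHz for G.711)
        "ssrc": ssrc,                    # identifies the stream (direction)
        "payload": packet[header_len:],  # the audio samples themselves
    }

# Build a fake G.711 u-law packet (20ms = 160 samples) to demonstrate:
fake = struct.pack("!BBHII", 0x80, 0x00, 1, 160, 0xDEADBEEF) + b"\xff" * 160
info = parse_rtp(fake)
```

Concatenating the payloads in sequence order, per SSRC, is essentially what Wireshark does before writing the .au file.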

Results

I’m not sharing the full packet capture for security reasons but I have made the MP3 version of the RTP recording available.

Conclusion

Recording VoIP calls seems remarkably simple – given sufficient access to the network. Implementing IPSec should prevent such packet sniffing on the local network but, once a VoIP call is out on the ‘net, who knows who might be listening?

Acknowledgements

Whilst researching for this post, I found the following very useful: