Spreading some link love

The rel="nofollow" attribute on HTML anchors was supposed to help prevent comment spam. Unfortunately, as Michael Hampton explains at length, NoFollow hasn’t worked – at least not based on the volume of comment spam that Akismet has removed since I moved to WordPress ([[akismet_counter]] spam comments detected as you read this post).

“U comment. I follow.”

Randa Clay has created an alternative – the I Follow Movement – for sites that acknowledge the contribution that commenting makes to the blogosphere (avoiding the need to specifically add links to a blogroll in order to spread some link love). I figure that if NoFollow is not preventing comment spam, the least I can do is let the information people leave here in comments work for them in the search engines (at the risk that a few spam comments will still make it through).

Following Owen’s example, I’ve implemented the DoFollow WordPress plug-in on this site so URLs in comments will now (hopefully) be picked up by the Googlebot, Slurp, MSNbot, Teoma and others. Incidentally, if I specifically add rel="nofollow" to a link, it still works – so it’s still possible to block links that you really don’t want the bots to follow (robots.txt directives are unaffected too).
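By way of a quick illustration (the URLs below are placeholders), the difference is simply whether or not the attribute is present on the anchor:

<!-- explicitly marked nofollow: bots should not follow or credit this link -->
<a href="http://example.com/dubious-site/" rel="nofollow">a link I don't vouch for</a>

<!-- a normal comment link, which the DoFollow plug-in now leaves for the bots to follow -->
<a href="http://example.com/commenter-site/">a commenter's site</a>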

So, please, comment away – and consider doing the same on your site.

The Photoshop book for digital photographers

It’s been a busy year. My family blog hasn’t been updated in a very long time and we’ve been accumulating digital photos of the boys at an alarming rate. Last night my wife and I went through some of them to work out which ones to print (we still have paper-based albums because they are easier to look at) and we still have a lot left to sort out.

I don’t print the photos at home because the high street labs can do it more cost-effectively (sure, they screw up the colours more than I would like for some of my work, but remember we’re only talking about the family album here). Even so, there are some edits that need to be made before I send the photos to the lab and, whilst the free tools supplied with Windows or OS X would help me, I prefer the control that a tool like Adobe Photoshop gives me.

The trouble is, Photoshop is not always intuitive. I want to understand what I’m doing but half the time I don’t – and the local adult education Photoshop classes run in the daytime (when I’m at work). That’s where The Photoshop book for digital photographers comes in handy. I asked Santa to bring me this as a Christmas present a couple of years back and it’s been great.

The main difference between this book and any other Photoshop book that I’ve seen is that instead of telling me what the various features in Photoshop are and how to use them, it takes me through worked examples (like instant red eye removal, colour-correcting images, or stitching panoramas together), with illustrations. I suppose now I need a traditional manual to teach me how Photoshop works (I’m considering buying the Adobe Photoshop CS3 Classroom in a Book) but this book gets me going – in effect it teaches me how to do things, rather than why a particular method works.

I still have to ask my friend Alex for help on the more complex stuff (he does pre-press work for a living and really knows his way around Photoshop, XPress, etc.) but at least with this book I can be self-sufficient for 95% of my digital photo edits. I should probably point out that the version of the book I’m using is based on Photoshop 7.0, but the techniques still seem to work for me with CS2.

If only real life were like Photoshop, I could use the book’s techniques to remove the dark circles under my eyes, whiten my teeth, remove the love handles and generally slim and trim myself. Sadly, life’s not like that – so another big push with Weight Watchers and some more exercise it’ll be then…

Trying to get Red Hat Enterprise Linux to accept a DVD-based repository

I use Windows computers every day, I run my home stuff on a Mac and I want to continue to develop my Linux skills – so, I decided to build a Linux server at home. Out came my Red Hat Enterprise Linux 5 installation DVD and a short while later I had a working server. Great. Next, I wanted to customise the installed packages (the installer had given me the option to customise later, which I had accepted) – I fired up the Package Manager and…

…that’s right, a big empty white space in the browse list – the only listed packages were those that had been installed at setup time.

It seems that yum/pirut cannot read the RHEL installation DVD. After some googling, I decided to set up a new repository and created a file in /etc/yum.repos.d called rhel-dvd.repo, the contents of which were:

[dvd]
mediaid=1170972069.396645
name=DVD for RHEL5
baseurl=file:///media/RHEL_5%20i386%20DVD
enabled=1
gpgcheck=0

(the mediaid=1170972069.396645 line is the first line from the .discinfo file on the RHEL DVD, based on a comment on Jeremy Katz’s site.)
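For reference, this is roughly what I was doing to mount the disc and test the repository – a sketch, assuming that /dev/cdrom maps to the DVD drive and that the mount point matches the baseurl above (and note that, as described below, it didn’t get me all the way there):

# create a mount point matching the baseurl in rhel-dvd.repo and mount the disc
mkdir -p "/media/RHEL_5 i386 DVD"
mount /dev/cdrom "/media/RHEL_5 i386 DVD"
# flush any cached metadata, then see whether yum will read the new repository
yum clean all
yum list available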

Cannot open/read repomd.xml file for repository: dvd

It seemed to recognise my DVD as an installation source but not as a valid repository so, after digging a little deeper, I found that mediaid= requires yum 3.1.2 or later and I ended up in dependency hell (exactly what rpm is supposed to avoid).

This is crazy – it seems that Red Hat expect me to install everything from the Red Hat Network (RHN) – what about servers that do not have a connection to the Internet (or to an RHN proxy/satellite server)? Surely installation from the RHEL DVD should be an option (I suppose it is, technically, if I know what every RPM is for – that’s where the pirut browse capability is so useful).
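To be fair, individual packages can be installed straight from the disc with rpm if you already know which ones you need – for example (packagename.rpm is a placeholder, and on the RHEL 5 media the RPMs live in a directory such as Server):

# install a single package directly from the mounted DVD - note that, unlike yum,
# rpm will not resolve dependencies for you; they must be given on the same command line
rpm -ivh "/media/RHEL_5 i386 DVD/Server/packagename.rpm"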

For once, I give in. I could spend hours on this issue (I’ve already spent a few too many) but it’s Friday evening now and my bad IT day has turned into a bad IT week. I need to put the kids to bed and then have a quiet evening in with a large glass (or two) of wine.

In the meantime, if anyone has any ideas on how to get yum/pirut to recognise a CD/DVD as valid installation media, please leave a comment.

Non-existent fax extension causes Outlook error

My clean installation of Windows Vista and Office 2007 has been presenting me with a strange error the first time that I reply to an e-mail in Outlook:

Microsoft Office Outlook

The Add-in “FaxExtension” (C:\Windows\System32\fxsext32.dll) cannot be loaded and has been disabled by Outlook. Please contact the Add-in manufacturer for an update. If no update is available, please uninstall the Add-in.

After clicking OK, everything is fine until the next time I open Outlook and reply to a message. It’s all a bit odd, because I don’t have a fax extension installed. Then I found a newsgroup post which commented that sometimes deleting the FaxExtension key from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Exchange\Client\Extensions will prevent this error from occurring.

I checked the registry and sure enough, there was the key, with a value of "4.0;C:\\Windows\\System32\\fxsext32.dll;1;00000100000000".

I shut down Outlook, removed the offending key, restarted Outlook and haven’t seen the message since. Guess that’s a bug then.
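For anyone who prefers the command line to regedit, something like this should achieve the same result – a sketch (FaxExtension is actually a value under the Extensions key, hence the /v switch; close Outlook first, run from an elevated command prompt and, as ever, edit the registry at your own risk):

rem remove the orphaned FaxExtension value that Outlook is trying to load
reg delete "HKLM\SOFTWARE\Microsoft\Exchange\Client\Extensions" /v FaxExtension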

Do IT qualifications really matter?

A few days back, I received an e-mail from a young man in Pakistan who had found my website on the Internet and wanted some advice. This is what he had to say (edited for grammar and spelling):

“I have a Bachelor’s degree in Computer Sciences and am studying for MCSE certification.

[…]

My question to you, as a newbie in the networking field, is: are certifications necessary to jump in and fly high in this field and, if so, do I have to stick to Microsoft or can I do a mixture of Cisco and Microsoft certifications? Lots of “thinktanks” here in Pakistan say that a person with MCSE, CCNA and CCNP certifications is a much-needed guy for IT companies.

I am sooooooooooooooo confused as to where I should move.”

The reason I’m blogging about this is that he raised some interesting points. I too have a bachelor’s degree in Computer Studies and I don’t consider that it’s been of any practical use to me in my work. The process of leaving home and going to university helped me progress from home life to becoming an independent young man (actually, it was a polytechnic when I started my course – reflecting the vocational nature of its tuition – but don’t get me started on how all the technical colleges and polytechnics have become “universities” and what a bad idea that is) and it set me up with some valuable first-hand experience of managing personal finances (i.e. debt… and that was 13 years ago – I feel really sorry for today’s young graduates, who have no access to grants and have to pay tuition fees too).

My degree was simply a means to join the career ladder at a certain level. Please don’t misunderstand me – I’m sure it has opened some doors that might otherwise have been closed (or would at least have been harder to force my way through) but it was by no means essential to reaching the position that I hold today (perhaps I should have aimed higher?) and I have not used any of the Computer Studies skills that I learnt along the way, so I could have studied anything (given the amount of writing I do today, perhaps I should have studied English, or journalism? Who knows – back then I didn’t know what I wanted to do with my life!).

IT certifications are similar. I hold a variety of IT certifications but none of that matters if I don’t have the experience to back up the qualifications. Sometimes you have to admit your shortcomings too – I didn’t feel comfortable being flown in to one potential customer as an expert earlier this week because I haven’t done anything practical with the associated technology for a long time now. The customer would have seen through me and that would have damaged both my credibility and my employer’s.

I learnt a few days back that a colleague, whose advice and experience I hold in very high regard, holds no IT certifications. Equally, I have friends and colleagues who left school at 16 or 18 and that hasn’t prevented them from reaching the same (or a higher) position within the company as me.

I understand that the UK government has a target for 50% of all school leavers to go to university (why? Do 50% of all jobs require a degree? How about 50% or more of all school leavers going on to some form of further or higher education – whether that be vocational or academic?). When I meet new graduates I recognise how wet behind the ears I was when I started out all those years ago. Which nicely illustrates my point: it doesn’t matter how highly qualified you are – what really counts is experience, even if the company does still insist that you have the letters after your name before you can get through the door.

Windows fast user switching + Zone Alarm = bad IT day

My poor colleagues had to put up with a lot of complaining yesterday. I was having a bad IT day (one of those days when nothing seems to go well) and it seems to be continuing today.

I recently rebuilt my company notebook PC to run Windows Vista and Office 2007. That’s going well but then there’s all the stuff that goes on top (anti-virus software, corporate VPN client, etc.). My colleague and trusted advisor, Garry, helped me to get all that in place, an administrator added my machine to the corporate domain and before I left last night I logged on so that I had a profile for my domain account with cached user credentials (for working at home today).

It should have been fine but I didn’t log out from my original account because I was in the middle of something – I used the fast user switching feature instead and then waited… and waited… and waited… as Windows tried to set up my profile.

In the end I gave up and logged out, only to find a load of Zone Alarm messages had popped up under the original account.

“Blah blah blah is trying to do something… do you want to allow this?” I don’t know – probably! Just let me get on with logging in.

Today it’s more of the same, as switching back to my old (non-domain) profile to run Windows Easy Transfer resulted in the same problem.

I think Garry was quite disturbed to see how I (and another colleague) quickly tired of reading these incessant firewall popups and just clicked the “allow” button (and the “don’t bug me again” checkbox) every time – which proves a point I made about firewall messages almost two years ago. And anyway, what’s wrong with the Windows Firewall? If I didn’t have to use Zone Alarm to meet VPN access policies then I wouldn’t. Grrr.

The good news is that Windows Easy Transfer was really useful for migrating my application settings from my old profile to the new domain profile (I didn’t use it for the files as it’s easier to just drag and drop them in Explorer).

Windows Server UK user group

Scotty McLeod has been working hard to get the Windows Server User Group UK website off the existing SharePoint platform and onto something more appropriate (SharePoint is great for some things but it was not great for the user group website – hey, even the SharePoint blogs use Community Server! Come to think of it, so do most of the Microsoft blog sites!).

Anyway, head over to http://www.winserverteam.org.uk/. It’s still a work in progress but, over the coming weeks and months, I’m hoping it will grow into a lively discussion area (backed up with regular meetings) for UK-based IT professionals who are interested in the development of the Windows Server platform.

Some more about Terminal Services Gateway Servers

In an earlier post, I mentioned Austin Osuide’s recent Windows Server User Group presentation on Terminal Services Gateway Server and what follows is some of the detail from that session.

Terminal Services Gateway Server is a server role in Windows Server 2008 – effectively a protocol translator that allows authorised users to remotely access resources on a corporate LAN using RDP over HTTPS.

Up until now, it’s been necessary to open TCP port 3389 to allow RDP traffic through the corporate firewall but, by encapsulating the RDP traffic within an SSL-secured tunnel, control may be exercised over which computers (and hence which applications) users can connect to and from. Other advantages include the fact that there is no need for a VPN infrastructure, so connectivity can be gained from any PC, anywhere (home, hotel, business partner or client premises, mobile or wireless hotspot). Then there are other advantages for IT organisations looking to reduce costs…

Consider a large outsourcer, with many support teams, each supporting a single customer’s infrastructure. What if one team (with appropriately scaled resources) could manage multiple networks? Maybe even in an offshore scenario? As an IT professional, I’m not keen on this and, as a customer, I would be concerned about the potential impact on security – but what if my managers could convince the customer that they can maintain security in a global infrastructure such as this? Using technologies such as network access protection (NAP), the health of any connected device can be ensured, and Terminal Services Gateway servers can be deployed to control who can connect to which computers – perhaps only to defined administrative servers with a controlled application set.

The process for connection is as follows:

  1. The client tunnels the RDP connection through HTTPS.
  2. The Terminal Services Gateway strips off the HTTPS encapsulation and forwards the request to the terminal server/remote desktop (if the request passes the appropriate policy checks – connection authorisation policies control who can connect and resource authorisation policies control what they can connect to, using user-defined or built-in groups for servers).
  3. The remote machine believes that the request has come directly from the client and responds appropriately.
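To illustrate what this looks like from the client side, a Remote Desktop Connection (v6.x) .rdp file can point at the gateway with entries along these lines – a sketch with placeholder server names (I haven’t verified every value, so treat the settings as indicative rather than definitive):

full address:s:server.corp.example.com
gatewayhostname:s:tsgateway.example.com
gatewayusagemethod:i:1
gatewaycredentialssource:i:0

As I understand it, gatewayusagemethod:i:1 tells the client always to use the gateway, with the connection to tsgateway.example.com running over HTTPS (TCP port 443) as described above.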

It all sounds straightforward enough but, as Austin explained, there are some gotchas too:

  • As when tunnelling RPC over HTTPS with Outlook and Exchange, the certificate must be recognised as valid – there is no manual option to trust a site if there are certificate issues (as there would be when browsing the Internet). There are two possible options:
    1. Establish a corporate public key infrastructure and install the appropriate certificates on the client. The downside to this is where clients don’t allow certificates to be installed by users (e.g. in a kiosk scenario).
    2. Alternatively, purchase a certificate from a trusted certification authority.
  • At least initially, few organisations will have a PKI based on Windows Server 2008 and, due to the removal of the XEnroll ActiveX control from Windows Vista (see Microsoft knowledge base article 922706), Windows Vista and Windows Server 2008 computers cannot use the Windows 2000/2003 CA web interface (or indeed the equivalent interfaces on the Thawte or VeriSign websites). It should be possible to craft an appropriate web server certificate using the MMC Certificates snap-in, but the common name for the server needs to be fully qualified and the MMC tools insert an unqualified name in a computer certificate. Thankfully there is another method – using the certreq.exe command line tool and a .inf file with the certificate template information (the syntax is described in Microsoft knowledge base article 321051 and certutil -csplist will list the trusted cryptographic service providers) – for example:

    [Version]
    Signature="$Windows NT$"

    [NewRequest]
    Subject = "CN=servername.domainname.tld"
    KeySpec = 1
    KeyLength = 2048
    Exportable = TRUE
    MachineKeySet = TRUE
    SMIME = False
    PrivateKeyArchive = FALSE
    UserProtected = FALSE
    UseExistingKeySet = FALSE
    ProviderName = "Microsoft RSA SChannel Cryptographic Provider"
    ProviderType = 12
    RequestType = PKCS10
    KeyUsage = 0xa0

    [EnhancedKeyUsageExtension]
    OID=1.3.6.1.5.5.7.3.1
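
Assuming the template above is saved as request.inf (the file names here are purely illustrative), the request can then be generated with certreq.exe and, once the CA has issued the certificate, installed into the machine store – a sketch:

    rem generate a PKCS #10 request from the .inf template
    certreq -new request.inf request.req
    rem submit request.req to the CA out-of-band and then, once issued,
    rem install the certificate into the computer's certificate store
    certreq -accept servername.cer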

In terms of best practice, Austin had some more advice to give:

  • Use a dedicated Terminal Services Gateway Server.
  • Consider placing the gateway server behind an ISA server.
  • Terminate the SSL connection in the DMZ and put the Terminal Services Gateway Server on the corporate network.
  • VPNs may still have a place in the infrastructure – Terminal Services Gateway servers are best used where no local copy of the data is required or where bandwidth issues mean that the user experience over a VPN is poor.

Further information

Microsoft Terminal Services.
Microsoft Terminal Services team blog.
Terminal Services in the Windows Server 2008 Technical Library.

James May’s 20th Century

I’ve just spent an hour in front of the gogglebox watching James May’s 20th Century. “What’s that?”, you may ask. It’s a new BBC/Open University television programme looking at technology and how it’s changed the way in which we live over the last hundred-or-so years.

Tonight featured two episodes: the first looking at how the world became smaller with the development of air and motorised road travel and how, ironically, it was not supersonic air travel that became the accepted means to “shrink” our planet but computers, fibre optics and the Internet; and the second looking at how the space race grew from one man’s dreams to a desire for military supremacy and eventually to a means to communicate (bit of a theme running here…) – I never realised just how many satellites are in orbit around the world.

Anyway, for UK readers with even the remotest interest in technology (and if you don’t have that, I’m surprised that you’re reading this blog), it’s fascinating viewing. Even my wife was interested, although as our toddler son is asking more and more “who?”, “what?”, “when?” and “why?” questions, this could be the science lesson that she needs in order to be able to keep up!

Why the banks just don’t get IT

Identity theft worries me. It doesn’t stop me sleeping at night but nevertheless it does worry me.

It seems that each time I log in to a banking website, the security has been “enhanced” with yet another item that I fail to enter correctly and then have to call the helpdesk to get my account unlocked – and I’m an IT guy… what about the “normal” users? (They probably write down the details somewhere!)

Mark James has written an interesting article about this issue – and how the answer is really quite simple – if only the banks would apply the same security approach to consumer banking as corporates do for remote access.