Finding that elusive Microsoft support site

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

The Microsoft website is not always the easiest place to find things – especially when in the middle of a crisis.

Blake Hall has published a comprehensive list of Microsoft support resources on the Industry Insiders blog.

Well worth checking out next time you’re researching an issue, if only for the advice on how best to search the knowledge base.

Up and running with WSUS

I’ve been meaning to upgrade my Software Update Services (SUS) installation to Windows Server Update Services (WSUS) for some time now, but the recent rebuild of my SUS server forced the issue.

I’m pleased to report that the WSUS installation was reasonably straightforward. Because I had already installed Windows Server 2003 SP1, there was no need to install BITS 2.0 or Microsoft .NET Framework v1.1 SP1; I just needed to make the Windows Server 2003 machine an application server (i.e. install IIS, enable COM+ for remote transactions, enable Microsoft DTC for remote access and enable ASP.NET) – all done through the Configure Your Server Wizard (because I was feeling lazy). Installing WSUS was simply a case of following the setup routine (which included setting up the MSDE database).

Once installed, I set up the synchronisation schedule and performed a manual synchronisation (just to get things going). I also elected to automatically approve critical and security updates. The WSUS installation had automatically updated the Group Policy template file and because Active Directory already had the GPO settings for the previous SUS installation, it was pretty much configured, although I did need to amend the intranet Microsoft update server locations to include the custom port number (http://servername:8530) and enable client-side targeting for the All Computers group. The final steps were to select the products for which to receive updates and to approve updates for detection/installation.
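
For reference, the Group Policy settings end up as registry values under HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate on each client. A minimal sketch of the equivalent settings is shown below – servername is a placeholder and the values assume the default WSUS custom port of 8530:

  Windows Registry Editor Version 5.00

  ; WSUS server and statistics server locations (hypothetical server name)
  [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
  "WUServer"="http://servername:8530"
  "WUStatusServer"="http://servername:8530"
  ; Client-side targeting: the client places itself in a target group
  "TargetGroupEnabled"=dword:00000001
  "TargetGroup"="All Computers"

  ; Tell Automatic Updates to use the intranet server defined above
  [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
  "UseWUServer"=dword:00000001

Once the settings are in place, running wuauclt /detectnow on a client forces a detection cycle against the new server rather than waiting for the next scheduled check.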

That was it. WSUS up and running and clients receiving updates. My first impressions are that WSUS is slightly more complex than SUS was, but also more capable. I’m just waiting to get some real world experience with a hierarchy of update servers on a live network now!

Microsoft’s view on managing heterogeneous environments

It was interesting to hear Kirill Tatarinov (Microsoft Corporate VP for Enterprise Management) comment (at last Friday’s UK re-run of the key Microsoft Management Summit 2005 presentations) on Microsoft’s support for heterogeneous environments through its management products (especially as they are finally waking up to the idea that organisations want to run – and do run – non-Microsoft guest operating systems under Virtual Server).

At both the partner breakfast briefing and the main event, the message was basically that Microsoft will embrace other environments but will not (for example) write Linux agents for Microsoft Systems Management Server (SMS), Microsoft Operations Manager (MOM), or any other Microsoft management product. To quote Tatarinov:

“[it’s] not part of our DNA and I don’t think this is something that we should be doing.”

Microsoft’s view is that products should be scalable and interoperable, providing open interfaces (e.g. WS-Management) alongside technologies such as the MOM connector framework and the SMS software development kit (SDK) to work with other products in the management space.

That may be a smart move – if only to avoid another lawsuit for supposed anti-competitive behaviour – but it also helps Microsoft to present itself as a team player, at a time when people are starting to take SMS seriously, when MOM is really gaining traction, and when the whole area of systems management for infrastructure built on Microsoft technologies is finally being addressed through the dynamic systems initiative (DSI).

Microsoft’s Dynamic Systems Initiative

The Microsoft Management Summit is one of Microsoft’s annual conferences and last Friday, the most popular presentations were re-run in the UK. Microsoft clearly took the event seriously, bringing across from Redmond the Corporate VP for Enterprise Management (Kirill Tatarinov); the Systems Management Server and Operations Manager Program Managers (Bill Anderson and Vlad Joanavic); and a Director of Product Management for Enterprise Management (Michael Emanuel).

Largely due to the quality of the speakers, the event was well worth attending – particularly Michael Emanuel’s Dynamic Systems Initiative (DSI) presentation. I’ve seen DSI presentations before, but this was inspirational – not least because of the charismatic way in which he described the differences between desired and actual states as “ought-ness” and “is-ness” (with associated “was-ness”, “could-ness”, “good-ness” and “should-ness”).

I’ll try to explain it all below (with a few additions from previous DSI presentations)!

It is generally accepted that infrastructure costs fall rapidly whilst performance rises (a derivative of Moore’s Law). What is less well known is that as infrastructure costs drop, the costs associated with supporting those systems rise. Typically, 70% of an organisation’s IT budget is spent on maintenance, with just 30% on new systems. The trouble is that our increasingly well-connected, but highly distributed, IT systems are becoming incredibly complex. Add to that the organisational complexity of coordinating infrastructure architects, developers, systems administrators, service architects, business stakeholders, testers, IT management and even outsourced/offshore partners – wouldn’t it be great to do something to control the management costs and let them track the decreasing cost of the infrastructure?

[Image: IT complexity and cost]

Businesses tend to be dynamic. All too often, IT is not. Microsoft’s answer is the DSI, which is about helping IT organisations to capture and use knowledge to design more manageable systems and automate ongoing operations, resulting in reduced costs and more time for IT to focus on what is most important to the business.

It sounds logical enough… so why don’t we do this already? Basically because IT infrastructure architects and IT operations managers don’t tend to talk the same language! In general, designers think about scalability, security and identity but gloss over the management element. With 80% of the cost of a project committed by design decisions at the end of the design phase (but only 8% of the cost incurred), it is all too often too late to change things when they reach production and don’t fit well within an operational model. DSI is about encouraging a full lifecycle view so that operational awareness can be built into applications and services right from the initial design, using models to capture knowledge (i.e. bottling what is known for re-use) throughout the lifecycle.

The key is that systems should be designed for operations with manageability architected into the system from the outset. To do this, there are two fundamental building blocks required:

  • A generic way in which to model knowledge – the systems definition model (SDM).
  • A generic way in which to communicate with a system – WS-Management.

The SDM is basically a manifest which provides a single source of information on a system, describing:

  • What “it” is.
  • What “it” is capable of doing.
  • What “it” needs to achieve these capabilities.
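
To make that a little more concrete, an entirely hypothetical sketch of the kind of information such a manifest captures might look like this (illustrative only – it does not follow the real SDM schema):

  <!-- Hypothetical manifest sketch; not the actual SDM schema -->
  <system name="OrderProcessing">
    <!-- What "it" is -->
    <description>Three-tier order processing service</description>
    <!-- What "it" is capable of doing -->
    <capabilities>
      <endpoint name="SubmitOrder" protocol="http" />
    </capabilities>
    <!-- What "it" needs to achieve these capabilities -->
    <requirements>
      <host type="WebServer" minimumMemoryMB="512" />
    </requirements>
  </system>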

WS-Management is a web services implementation of Web-Based Enterprise Management (WBEM), developed as part of the Web Services Interoperability Organization’s WS-* architecture as a joint effort by AMD, BMC Software, Dell, Intel, Microsoft, Sun Microsystems and WBEM Solutions. The first Windows implementation (WS-Management itself is heterogeneous) will be made available later this year as part of Windows Server 2003 Release 2 (R2).
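
To give a flavour of what this looks like on the wire, a WS-Management request is just a SOAP 1.2 envelope with addressing and resource headers. The sketch below is a rough WS-Transfer Get request (the resource URI is a WinRM-style example, some required headers are omitted for brevity, and namespace URIs varied between draft versions):

  <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
              xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/08/addressing"
              xmlns:wsman="http://schemas.dmtf.org/wbem/wsman/1/wsman.xsd">
    <s:Header>
      <wsa:To>http://servername:80/wsman</wsa:To>
      <wsa:Action>http://schemas.xmlsoap.org/ws/2004/09/transfer/Get</wsa:Action>
      <wsman:ResourceURI>
        http://schemas.microsoft.com/wbem/wsman/1/wmi/root/cimv2/Win32_OperatingSystem
      </wsman:ResourceURI>
    </s:Header>
    <s:Body />
  </s:Envelope>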

Meanwhile, Microsoft is slowly moving the existing models within its management products over to SDM in support of the DSI and sees Visual Studio as a tool for defining the holistic structure of the application, services and system – considering management at design time to integrate service requirements during development.

By combining the application designer’s feature/functionality view of the world with the IT operations manager’s data centre policies and constraints, SDM models can be defined and fed through a validation process to identify errors; but a development environment in itself is not enough. Knowledge is the key to management and the diagram below shows a desired state (models, constraints, policy, prescriptive guidance, SLAs, patches) being replicated down (Emanuel refers to this as “ought-ness”) and an actual state (inventory, metrics, events, alerts, compliance, service level, results – the “is-ness”) being replicated up. The art of management is resolving conflicts between the “ought-ness” and the “is-ness” states. Furthermore, this management is not performed using an expensive tool but is actually the knowledge held by administrators and operators, which needs to be re-used. The DSI vision is self-managing systems, so that every application is delivered with a model which can be deployed across every Windows system.

[Image: Managing systems]

SDM models are held in a models database and applied through each of the Microsoft operations framework (MOF)/IT infrastructure library (ITIL) workflows to synchronise with reality. Operational systems feed this information into a data warehouse which stores a point in time view of this reality (the “was-ness”). Taking this a step further, by applying “what-if scenarios” (“could-ness”) to this historic state, the potential (“good-ness”) of what should be (“should-ness”, or future “ought-ness”) can be modelled (i.e. capacity planning).
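
In today’s terms this is essentially desired-state management. As a toy illustration (entirely hypothetical – no Microsoft product works exactly this way), reconciling the “ought-ness” with the “is-ness” boils down to comparing two views of the same system and flagging the drift:

  using System;
  using System.Collections.Generic;

  class DriftDetector
  {
      static void Main()
      {
          // "Ought-ness": the desired state from models, policy and SLAs
          Dictionary<string, string> desired = new Dictionary<string, string>();
          desired.Add("ServicePack", "SP1");
          desired.Add("AntiVirusService", "Running");

          // "Is-ness": the actual state from inventory, metrics and events
          Dictionary<string, string> actual = new Dictionary<string, string>();
          actual.Add("ServicePack", "RTM");
          actual.Add("AntiVirusService", "Running");

          // The art of management: resolving conflicts between the two states
          foreach (KeyValuePair<string, string> setting in desired)
          {
              string current;
              if (!actual.TryGetValue(setting.Key, out current) || current != setting.Value)
              {
                  Console.WriteLine("Drift: {0} is '{1}' but ought to be '{2}'",
                      setting.Key, current == null ? "missing" : current, setting.Value);
              }
          }
      }
  }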

Of course, Microsoft is a product and technology company and so they have products which map on to this approach. Looking at the MOF model, each quadrant has associated products:

  • Changing: Microsoft Systems Management Server.
  • Operating: Microsoft Operations Manager; Microsoft System Center Data Protection Manager.
  • Supporting: Microsoft Visual Studio 2005 Team System; Microsoft Business Solutions CRM.
  • Optimising: Microsoft System Center Capacity Manager; Microsoft System Center Reporting Manager.

To summarise, DSI consists of a number of core technical principles:

  • Software platforms and tools that enable knowledge of an IT system (architectural intent; operational environment; IT policies; resource needs; across platforms)…
  • …to be captured in software models (MOM management packs; software update manifests; SDMs)…
  • …that can be created, modified and operated upon across the IT lifecycle (develop, operate, analyse/act).

In terms of product, Microsoft has currently defined three waves of products to support the move to dynamic systems:

  • System Center Wave 1 is happening now and consists of:
    • Microsoft System Center Capacity Manager 2006 (codenamed Indy).
    • Microsoft System Center Reporting Manager 2005.
    • Microsoft Systems Management Server 2003 (service pack 1).
    • Microsoft System Center Data Protection Manager 2006.
    • Microsoft Operations Manager 2005.
    • Microsoft Visual Studio 2005.
    • Microsoft Windows Server 2003 R2 WS-Management.
  • System Center Wave 2 should happen around 2006-2007 and includes:
    • Windows Server (codenamed Longhorn).
    • Microsoft System Center Capacity Manager v2.
    • Microsoft Operations Manager v3.
    • Microsoft Systems Management Server v4.
    • Microsoft System Center Reporting Manager v2.
  • System Center Wave 3 is due around 2008-2009, and is when the various strands of the DSI can finally be pulled together.

Happy Birthday Microsoft

Microsoft turns 30 today. We tend to associate Information Technology (IT) with a rapidly expanding market of young start-up companies but, whilst 30 years is nothing compared with global giants like IBM, Hewlett-Packard (HP) and Fujitsu, it is still a significant milestone.

Microsoft has become ubiquitous – largely through its Windows operating system and Office productivity suite – but recently (and somewhat worryingly for someone who makes a living architecting solutions based on Microsoft technology) Microsoft has been drifting. MSFT stock, which once rose at astronomical rates (splitting nine times between the company’s IPO in 1986 and 2003), has been virtually static in recent years, leading to a number of reports suggesting that the company has lost its way. Maybe it was because Bill Gates stepped down as CEO; maybe it was just the sheer size of a giant which employs almost 60,000 staff in 100 countries and had annual revenues of $39.75bn in 2004/5 (up 8% on 2003/4), generating profits of $12.25bn (up 50%).

On the surface, these figures look great – 8% growth and 50% increase in profits. But a look at the figures for the last 10 years shows that growth has slowed from 49% in 1995/6.

The trouble is that Microsoft has been losing ground to young upstarts like Google (mission: “to organize the world’s information and make it universally accessible and useful”). Let’s face it, it was Microsoft that was the young upstart when Bill Gates and Paul Allen persuaded IBM to make MS-DOS the operating system for the first PC in 1981 (ousting CP/M). After being slow to embrace the Internet and a series of legal wrangles (some justified, others not), Microsoft was also late to embrace search technologies, whereas the current industry darling dominates with 36.5% of the web search market.

It didn’t help that for a period between 1995 and 2001, the flagship product (Windows) was split between the (unreliable and insecure) Windows 95, 98 and ME product line and the expensive business version, Windows NT (later Windows 2000). Since Microsoft finally converged the two product lines with the launch of Windows XP (which is still based on the Windows NT kernel) there has been a push towards delivery of a trustworthy computing platform and, despite its critics, I think Microsoft generally does pretty well there. If you have the largest market share you will get attacked by malware writers – that means Microsoft for PC operating systems and Nokia for mobile handsets!

The trouble is that, since Windows 2000 and XP sorted out the security issues, operating system upgrades have been a little dull, with limited innovation. It doesn’t help that any bundling of middleware seems to result in a lengthy courtroom battle; but without innovation there is no reason for consumers to upgrade and, in the business market, where IT is a business tool (not the business itself), IT Managers are under pressure to reduce costs through standardisation. That often means standing still for as long as possible.

I really hope that Windows Vista/Longhorn and Office 12 are not the death of Microsoft. Microsoft’s mission is “enabling people and businesses to realize their full potential” and this week, in an attempt to realise its own potential, a massive re-organisation was announced, with the aim of making the giant more dynamic (and hence able to respond to the industry – let’s face it, Microsoft has never been the innovator but it is very good at marketing other people’s ideas and making them work – even MS-DOS was licensed from Seattle Computer Products). Maybe the new organisation will help the timely delivery of products, but it’s amazing how the rising fortunes of the Mozilla Foundation’s Firefox browser have focused Microsoft on delivering a new version of Internet Explorer (after years of poor standards compliance and very few new features), and how the desktop search functionality provided by Google (and others) has focused Microsoft’s attention in this space (even if the current MSN Search strategy appears to be failing). Maybe increased competition in the operating system market (come on Apple, give us OS X for the PC – not just Intel-based Macs, which are really just Apple PCs and could also run Windows…) in the shape of the major Linux distributions (Red Hat and Novell SuSE) or free UNIX distributions like the x86 version of Sun Solaris will focus the giant on delivering great new features for Windows.

Microsoft was built on a dream of “a computer on every desk and in every home”. Despite all of the negative publicity that Microsoft tends to attract, it seems to me that (at least in the “developed” world) this dream has largely been realised. Let’s see what the next 30 years brings.

Service packs, feature packs and releases – how they should work

The various Microsoft product groups issue service packs, feature packs and releases. This is all very well, but the terms mean different things to different people and are confusing. Then, last Friday, Paul Thurrott reported in the Windows IT Pro magazine network WinInfo Daily Update that Virtual Server 2005 SP1 will now become Virtual Server 2005 release 2 (R2). This might sound like a trivial name change, but what it means is that legal users of Virtual Server 2005 (a basically good product, but with a few fairly significant bugs) will need to purchase R2, rather than install a free service pack.

If Microsoft follows this path they are going the way of Apple, who issue point version upgrades to their OS X operating system and have the audacity to charge existing users for a full product (there is no upgrade available).

In my opinion:

  • Service packs should fix bugs (security or otherwise), with critical patches released in advance of a rolled-up, regression-tested service pack. Ideally, service packs should also have a predictable timescale (e.g. 6 months after product release, then every 12 months from then on until the product reaches end of life).
  • Feature packs should offer new features for an established product. I don’t believe that there should have been any additional features included with Windows XP SP2 (e.g. the Windows Firewall) – instead SP2 should have been a set of bug fixes (alleviating some of the deployment issues associated with new technology) and additionally Microsoft should have offered a free feature pack for Windows XP which provided the extra security features. In this way, users can stay at the latest supported product release (service pack level) but choose which feature packs to add. Security features and other important updates should be free of charge. Others which enhance a product might carry a small charge.
  • Mid-life releases (e.g. Windows Server 2003 R2) are all very well as a marketing mechanism for rolling the latest service packs into a product for new users, but should not preclude existing users from gaining from the latest service pack/feature pack updates. If a product really warrants a new licence, then it should carry a new (major) version number!

Following this model, Virtual Server 2005 R2 should really be a service pack, with an additional feature pack for the new features which Microsoft plans to ship (of which there are precious few details at present). As for supporting Linux as a guest operating system – it either works or it doesn’t – Microsoft needs to make up its mind as to whether it is a supported guest or not (if they are smart they will say “yes” – that way users can have a virtual Linux guest running on a Windows host if they need the best of both worlds, with Microsoft still gaining licence revenues for the host operating system and the virtualisation software).

The Microsoft view of connected systems

A few weeks back I was at a breakfast briefing on connected systems (Microsoft’s view of web services and BizTalk Server), delivered by David Gristwood (one of Microsoft UK’s Architect Evangelists). Even though I’m not a developer, I think I understood most of what David had to say (although many of my colleagues’ blogs will undoubtedly have more to offer in this subject area).

David explained how the need to connect applications has led to a shift towards service orientation, as applications have longer lifetimes and no longer consist of just a single executable program. Consequently, there are requirements for application customisation and integration (generally loosely coupled), with the four tenets of a service oriented architecture (SOA) being:

  • Explicit boundaries.
  • Autonomous services (i.e. other services do not have to wait for your schedule).
  • Shared schema and contract (not class).
  • Compatibility based on policy (generally written in XML).

The Web Services Interoperability Organization’s WS-* architecture is about providing a framework for web services with broad industry support (in the same way that the Open Systems Interconnection (OSI) 7-layer network model has become the industry model for networking).

[Image: WS-I web services standards stack]

Basic web services operate well, but it is easy to break their interoperability. As such, WS-I is concerned with identifying the lowest common denominator – the basic profile (BP), or core set of specifications, that provides the foundation for web services.

When developing web services, Visual Studio 2005 (codenamed Whidbey) will represent a huge step forward with the Microsoft .NET Framework v2.0 including numerous improvements in the web services protocol stack and ASMX (ASP.NET web services) representing an ongoing evolution towards the Windows communication foundation (formerly codenamed Indigo).

Although coding first and using web methods is still a good way to produce web services, there is a move towards interface-based service contracts – first designing the interface using the web services description language (WSDL) and then adding contracts. The new application connection designer (ACD) (codenamed Whitehorse) is a visual tool for dragging and dropping connections which represent service contracts, allowing the generation of skeleton projects and the basic code required to implement/consume contracts.

In terms of standards and interoperability, this code is WS-I BP 1.1 compliant by default (and hence fits well into the WS-* architecture), whilst ASMX web services automatically support simple object access protocol (SOAP) 1.1 and 1.2.
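
As a minimal sketch of the attribute-based ASMX model (the service name and method below are hypothetical), a web service is just an annotated class in an .asmx code-behind file:

  using System.Web.Services;

  // Hypothetical service, for illustration only
  [WebService(Namespace = "http://example.org/orders/")]
  public class OrderService : WebService
  {
      // Marking a method with [WebMethod] exposes it over SOAP,
      // with the WSDL contract generated automatically
      [WebMethod]
      public string GetOrderStatus(int orderId)
      {
          return "Shipped";   // placeholder implementation
      }
  }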

Web services enhancements (WSE) is a fully supported download which sits on top of ASMX and extends the existing web services support within the Microsoft .NET Framework. WSE is a set of classes to implement on-the-wire standards and is actually an implementation of several WS-* specifications including WS-Addressing and WS-Security, to provide end-to-end message-level security (in a more sophisticated manner than SOAP over HTTP). The current version is WSE 2.0 SP3, and WSE 3.0 will be released with Visual Studio 2005 (due to a dependency on the Microsoft .NET Framework v2.0), with new features including:

  • Message transmission optimization mechanism (MTOM) for binary data transfer, replacing SOAP with attachments and WS-Attachments/direct Internet message encapsulation (DIME).
  • Enhancements to WS-Security/WS-Policy.

It should be noted that there are no guarantees that WSE 2.0 and 3.0 will be wire-level or object-model compatible, but there will be side-by-side support for the two versions. WSE 3.0 is likely to be wire-compatible with the Windows communication foundation (which will ultimately replace WSE around the end of 2006).
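
To give a feel for the programming model, the sketch below adds a WS-Security username token to an outgoing request using the WSE 2.0 classes (OrderServiceWse is a hypothetical proxy generated from the service’s WSDL with WSE support enabled, so it derives from WebServicesClientProtocol):

  using Microsoft.Web.Services2;
  using Microsoft.Web.Services2.Security.Tokens;

  class Client
  {
      static void Main()
      {
          // Hypothetical WSE-enabled proxy class, for illustration only
          OrderServiceWse proxy = new OrderServiceWse();

          // Attach a username token to the request's SoapContext; WSE
          // serialises it into the WS-Security SOAP header on the wire
          UsernameToken token =
              new UsernameToken("alice", "secret", PasswordOption.SendHashed);
          proxy.RequestSoapContext.Security.Tokens.Add(token);

          string status = proxy.GetOrderStatus(42);
      }
  }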

The Windows communication foundation itself is about productivity (writing less code), interoperability (binary, highly-optimised interoperability between computers, dropping to WS-I BP 1.1 if required) and service oriented development. Implemented as a set of classes, the Windows communication foundation takes messages, transforms them, maps them to a structure and pushes them to the receiving code.

To illustrate the productivity gains, using an example cited by Microsoft, an application implemented using Visual Studio .NET 2003 consisting of 56296 lines of code (20379 lines of security, 5988 lines for reliable messaging, 25507 lines for transactions, and 4442 lines for infrastructure) was reduced using WSE to 27321 lines of code (10 lines for security, 1804 lines for reliable messaging, and no change to the 25507 lines for transactions) and reduced further using the Windows communication foundation to just 3 lines of code (1 line for security, 1 line for reliable messaging and 1 line for the transactions)! This sounds extreme to me; but even an infrastructure architect like myself can appreciate that less code means easier management.
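
To illustrate, a service contract in the pre-release Windows communication foundation builds looks broadly like the sketch below (the names are hypothetical and the API was still subject to change); security, reliable messaging and transactions are then enabled declaratively through attributes and binding configuration, which is where the dramatic reduction in hand-written plumbing code comes from:

  using System.ServiceModel;

  // Hypothetical contract, for illustration only
  [ServiceContract]
  public interface IOrderService
  {
      [OperationContract]
      string GetOrderStatus(int orderId);
  }

  public class OrderService : IOrderService
  {
      public string GetOrderStatus(int orderId)
      {
          return "Shipped";   // placeholder implementation
      }
  }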

[Image: Evolution of the Microsoft .NET Framework]

In terms of a roadmap, the Windows communication foundation will supersede existing connected systems technologies (e.g. ASMX), but other technologies will continue to exist, supported by the Windows communication foundation (e.g. enterprise services, .NET remoting, COM, COM+ and MSMQ).

Another tool in Microsoft’s integration arsenal is the SQL Server 2005 Service Broker, which will provide a SQL-to-SQL binary data messaging protocol, allowing developers who are familiar with the database programming model to think about queues as databases and to take data from queues as a kind of hanging query/result set. Over time, this will be adapted to use the Windows communication foundation, so that it will run on top of the Service Broker protocol before eventually allowing the Windows communication foundation to become the transport for WS-* interoperability.
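
A rough sketch of that model in SQL Server 2005 T-SQL might look like the following (the queue and service names are hypothetical, and the built-in [DEFAULT] contract is used to keep the example short):

  -- Hypothetical names, for illustration only
  CREATE QUEUE OrderQueue;
  CREATE SERVICE OrderIntakeService ON QUEUE OrderQueue ([DEFAULT]);

  -- A receiving program blocks on the queue, much like the
  -- "hanging query" described above
  WAITFOR (
      RECEIVE TOP (1) conversation_handle, message_type_name, message_body
      FROM OrderQueue
  ), TIMEOUT 5000;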

[Image: Web services integration]

Of course, Microsoft’s most significant integration product for connected systems is BizTalk Server. At 1.5 million lines of C# code, BizTalk Server 2004 is one of the largest Microsoft .NET products written to date (although SQL Server 2005 will exceed this at around 3 million lines). BizTalk Server allows the mesh of point-to-point web service (and other) connections to be replaced with a BizTalk Server “hub”.

[Image: Microsoft BizTalk Server]

Another advantage of this approach is the ability to take business processes out of (potentially unstable) code and allow BizTalk’s orchestration model to handle them.

BizTalk Server 2004 is the first Microsoft .NET incarnation of the product (the previous two versions were not .NET applications). Built on the ASP.NET application stack and including WS-I v1.0 support (and a v2.0 adapter), BizTalk Server 2004 is integrated with Visual Studio .NET 2003 and the Office System 2003, with additional features including business activity monitoring, human workflow services and a business rules engine. BizTalk Server 2006 is due to follow the SQL Server 2005 and Visual Studio 2005 launch in November 2005 and (according to Microsoft) will offer simplified setup, migration and deployment, comprehensive management and operations, and business user empowerment. An adapter for the Windows communication foundation is also expected later in 2006. Future versions of BizTalk Server will be built natively on the Windows communication foundation and will offer support for the next version of Windows Server (codenamed Longhorn) as well as the dynamic systems initiative (DSI).

Ultimately, the Windows communication foundation will become a transport layer for connected systems, with BizTalk Server providing orchestration. With continued support for WS-* standards, truly connected systems may well become a reality for many organisations.

Links

WS-I overview presentation

Microsoft and SAP alliance site

I don’t know anything about enterprise resource planning (ERP) products, except that SAP are a big player in this space (and that Microsoft runs its business on SAP with a 1.7TB SQL Server database – pretty much the only non-Microsoft product in use there). Last night, Mat Stephen mentioned the Microsoft/SAP Alliance website and, having taken a look this morning, the technology section (including details of how to integrate SAP and Microsoft products) looks pretty useful to me.

Towards operational excellence on the Microsoft platform

I was sorting out my den/office this weekend and came across a Microsoft operational excellence resource CD. The concept seems quite good (although the content seemed a little out of date, even bearing in mind that it had sat in a pile of “stuff to look at when I have time” for 10 months); however, the operational excellence section of the Microsoft website is worth a look.

Watch out for cookies when using the Microsoft AntiSpyware beta

Last year I blogged about Microsoft’s acquisition of Giant Software and I’ve been using their AntiSpyware Beta since it was made available in January; but last week I was looking at the inordinate amount of spam my Dad receives and that got me thinking about the overall security on his PC (which has my e-mail addresses in the address book!). After installing Lavasoft Ad-Aware SE Personal, I found that the Microsoft AntiSpyware Beta product he had been using was doing a pretty good job, but there were a load of tracking cookies which it had not identified. Today, I ran the same tests on two of my PCs and found the same.

As the Microsoft product is based on Giant’s well-regarded software I decided to look a bit deeper…

It turns out that although the Giant version of the product scans for cookies, the Microsoft version does not as they are not regarded as a threat (despite Ad-Aware classifying them as critical objects). In their information for Giant AntiSpyware users who have active subscriptions, Microsoft says:

“Giant AntiSpyware detects and removes cookies from your computer. Because many Web sites require the use of cookies to enable a great user experience, Windows AntiSpyware (Beta) does not remove cookies.”

So are cookies a threat? The answer is both “Yes” and “No”. Quoting from an HP article on where spyware hides:

“Cookies can help users streamline online transactions, remember browsing preferences and user profiles, and personalize pages. Many users don’t realize that cookies can be used to compile data so companies can construct a profile about the websites they visit and the web banner advertisements they click through. This information is mined so companies can deliver targeted ads.

Some websites respectfully use temporary cookies (session cookies) that disappear when you close the browser. Many more websites use persistent cookies that remain on your hard drive indefinitely. Microsoft Internet Explorer and Netscape Navigator, the two most popular browsers, still send out existing cookies even if you’ve disabled cookies in your browser settings. This means you must delete cookie files manually to keep from being tracked by third-party ad networks and spyware providers.”

And from the privacy.net cookie demo:

“Some common uses for Internet cookies are:

  • An anonymous code given to you so the web site operator can see how many users return at a later time. These cookies are configured to stay on your system for months or years and are called “persistent” cookies.
  • A code identifying you. This usually occurs after a registration. The site could keep a detailed account of pages visited, items purchased, etc. and even combine the information with information from other sources once they know who you are.
  • A list of items you purchased. This is often used in “shopping cart” web sites to keep track of your order. Often cookies of this type ‘expire’ as soon as you log out or after a short time. These are called “session” cookies.
  • Personal preferences. This can be anonymous or linked to personal information provided during a registration.

Cookies are supposed to be only accessible from the site that placed them there. However, in some cases cookies from other sites show up in the log files so it is not a secure way to authenticate a user.”

So you can see that session cookies are fine. So are some persistent cookies (e.g. the one which tells the BBC website where I live so it can give me localised information); but most of the ones I found were tracking cookies for advertising sites. These are not good and I urge Microsoft to include cookie detection in the release version of Microsoft AntiSpyware (perhaps using the SpyNet AntiSpyware community to distinguish between good and bad cookies?).

Finally, for anyone worrying about what happens when their version of the Microsoft AntiSpyware Beta expires at the end of July, Microsoft has started to push updates and one of my PCs upgraded itself to version 1.0.614 today, which expires at the end of December. The others are still on 1.0.501 but I expect to see them do the same over the next few weeks.

Further reading

Adware/Spyware thread (pcreview.co.uk)
Cookie demonstration (privacy.net)
Microsoft AntiSpyware: Torn Apart