Microsoft SQL Server overview

This content is 7 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I wrote this post a few months ago… and it crashed my blog. Gone. Needed to be restored from backup…

…hopefully this time I’ll have more luck!

One of the advantages of being in the MVP Reconnect programme is that I occasionally get invited to webcasts that open my eyes to technology I’ve not had a lot to do with previously. For many years, one of the big holes in my knowledge was around Microsoft SQL Server. That was until I saw Brian Kelley (@kbriankelley)’s “Brief overview of SQL Server”. The content’s not restricted, so I thought I’d republish some of it here for others who are getting their head around the major on-premises components of the Microsoft Data Platform.

SQL Server Editions

There are several editions of SQL Server available and these are the key differences (updated for 2017):

  • Express Edition (the successor to MSDE) is a free version, with some limitations around database size, etc.
  • Standard Edition lacks some enterprise features but has high availability and suits many application workloads.
  • Enterprise Edition is the full-functionality product (but can be expensive).
  • Developer Edition (not licensed for use in production) offers the full feature set but can also run on a client operating system, whereas Enterprise Edition will only run on server operating systems.
  • Web Edition has reduced functionality and is intended for public websites (only available to service providers).
  • Compact Edition is another free version, intended for embedded databases in ASP.NET websites and Windows desktop applications.

Although SQL Server is often thought of as an RDBMS product, it’s really a suite of systems under the SQL Server name. Usually the name means the database engine, but there are many parts, each of which has a distinct setup (i.e. you don’t need the database service for SQL Server Analysis Services and vice versa).

SQL Server Analysis Services (SSAS)

SSAS is an online analytical processing (OLAP) tool intended for data warehousing and data mining.

One advantage of OLAP is the ability to run overnight jobs that pre-generate calculations used for roll-ups (e.g. totals, averages, etc.). It can provide fast results to business users who would otherwise need complex calculations against a transactional system (e.g. sales data by region, month, quarter, etc. can be prepared ahead of time).
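To make the idea concrete, here’s a minimal T-SQL sketch of the kind of roll-up an OLAP cube would pre-calculate overnight (in SSAS these aggregates are pre-computed in the cube rather than run at query time; the table and column names here are invented for illustration):

```sql
-- Illustrative only: the sort of aggregate an OLAP cube would pre-calculate overnight.
-- dbo.Sales, Region, SalesMonth and Amount are hypothetical names.
SELECT Region,
       SalesMonth,
       SUM(Amount) AS TotalSales,
       AVG(Amount) AS AverageSale
FROM dbo.Sales
GROUP BY ROLLUP (Region, SalesMonth);   -- adds per-region sub-totals and a grand total
```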

SSAS is comparable to IBM Cognos or Oracle Essbase (normally packaged with Hyperion for accounting, etc.).

Some SSAS jargon includes:

  • Star schema/snowflake schema – warehouse database designs that differ from transactional (normalised) designs. These structures can be built in the RDBMS, with SSAS used on top of them.
  • Cubes
  • Dimensions
  • Tabular model
  • Data analysis expressions (DAX) – the formula and query language used with the tabular model

SQL Server Integration Services (SSIS)

SSIS (since 2005) is heavily used for extract, transform and load (ETL) workloads – i.e. to get data from a source, manipulate it and pass it to a destination. It can be used to build a data warehouse and then data marts, or to move data between systems. Basically, it’s a back-end batch processing system that prepares data for analysis.

SSIS is a replacement for Data Transformation Services (DTS). It’s not limited to SQL Server for source/destination so can talk to Oracle, Excel spreadsheets, other ODBC connections, etc.

The drag and drop interface is very powerful, with the full functionality and flexibility of Microsoft .NET behind it.

SSIS is comparable with Informatica (or Clover, etc.).

Some SSIS jargon includes:

  • Packages (whatever is processing, contains all the logic)
  • Tasks (what’s being carried out)
  • Dataflow tasks (how you go from source to destination – could be multiples)
  • Transformation (manipulating data)
  • Business Intelligence Markup Language (BIML)

SQL Server Reporting Services (SSRS)

SSRS was introduced in 2005 and became so popular that it was ported back to SQL Server 2000!
It is a reporting engine, used to publish reports in-browser. Early versions were built on IIS but, since 2008, SSRS has run directly on http.sys.

SSRS can be integrated with SharePoint (with report security based on SharePoint security) or run in native, standalone mode, where a browser-based portal is used to browse folders, find reports and run a report with parameters. Printing used to be via an ActiveX control but, since 2016, reports print to PDF (or open in a PDF reader).

There are two ways to build reports: Report Builder (a web-based tool for BA-type power users) or Report Designer (a full product for complex designs). There is also a subscription capability so users can subscribe to reports and have them delivered on a schedule.

SSRS can be compared with SAP BusinessObjects and Tableau.

SSRS jargon includes:

  • Reports
  • Data sources
  • Datasets
  • ReportServer (API to integrate with other products)
  • Native mode vs. integrated mode (SharePoint)

SQL Server Database Engine

The SQL Server database engine is what most people think of when SQL Server is mentioned.
It is traditionally a relational database management system (RDBMS) although it now contains many other database capabilities. It was originally derived from a Sybase product (until SQL Server 6.5).

SQL Server supports both multiple databases per instance (queries can connect to and join across them) and multiple instances per server (since 2000) – the first is the default instance, and then named instances can be created.
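As a quick illustration of joining across databases on the same instance (the database, table and column names are hypothetical), queries simply use three-part names:

```sql
-- Hypothetical example: joining tables in two databases on the same instance
SELECT o.OrderID, c.CustomerName
FROM Sales.dbo.Orders AS o
JOIN Crm.dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;
```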

SQL Server uses a SQL language variant called T-SQL for interaction. A GUI is provided in SQL Server Management Studio but it’s also possible to interact via PowerShell.

SQL Server also has a scheduler (the SQL Server Agent), which can alert on success/failure and allows the creation of elaborate scheduling routines with notifications and the ability to run code. The database engine itself is comparable with IBM DB2, Oracle, PostgreSQL, Sybase, MySQL and MariaDB.
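For a feel of how an Agent job is scripted, here’s a hedged sketch using the msdb stored procedures (the job name, schedule and maintenance procedure are all invented; check the exact parameters against the documentation for your version):

```sql
USE msdb;
GO
-- Create a job with a single T-SQL step and a nightly schedule (illustrative names)
EXEC dbo.sp_add_job      @job_name = N'Nightly maintenance';
EXEC dbo.sp_add_jobstep  @job_name  = N'Nightly maintenance',
                         @step_name = N'Rebuild indexes',
                         @subsystem = N'TSQL',
                         @command   = N'EXEC dbo.usp_RebuildIndexes;';  -- hypothetical procedure
EXEC dbo.sp_add_jobschedule @job_name = N'Nightly maintenance',
                            @name = N'Every night at 02:00',
                            @freq_type = 4,            -- daily
                            @freq_interval = 1,
                            @active_start_time = 20000; -- HHMMSS: 02:00:00
EXEC dbo.sp_add_jobserver @job_name = N'Nightly maintenance';  -- target the local server
GO
```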

SQL Server 2016 features include:

  • High availability options, including Always On failover clusters; Always On availability groups (which are more flexible because they don’t have to replicate and fail over everything); Database mirroring (one database on multiple systems; deprecated now in favour of availability groups); log shipping.
  • Several encryption options, including built-in encryption functions (certificates, asymmetric keys, symmetric keys); Enterprise Edition also has Transparent Data Encryption (TDE) to encrypt the database at rest and stop copies of the database files from being loaded elsewhere (see the sketch after this list); connection encryption (SSL/TLS since 2005); and Always Encrypted, new for 2016 – encryption and decryption happen in the client driver, so it is transparent to the application but the plain text is never visible to SQL Server, with data stored in encrypted form within the database.
  • SQL Server and Windows authentication (server-based or Active Directory). A server can be configured for Windows authentication only, or for mixed mode (Windows plus SQL Server logins), but not for SQL Server logins alone.
  • Replication options to move data between servers.
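To make the TDE bullet above more concrete, a minimal setup looks roughly like this (the password, certificate and database names are placeholders, and in practice the certificate must be backed up before relying on it):

```sql
-- In master: create a database master key and a certificate to protect the encryption key
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!Passw0rd';   -- placeholder password
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';    -- back this certificate up!
GO
-- In the user database: create the database encryption key and turn TDE on
USE SalesDB;                                                    -- placeholder database
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE SalesDB SET ENCRYPTION ON;
GO
```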

Other security features include audit objects (who did what?); granular security permissions; login auditing (failed logins are written to the SQL Server error log text file and to the Application event log); dynamic data masking (mask data depending on who needs to see it – e.g. store social security numbers but only show part of the value; this is obfuscation only, as the data is still held in clear text); and row-level security (to filter which rows a user can see). A sketch of the last two follows.
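Here’s a hedged sketch of what dynamic data masking and row-level security look like in T-SQL (the tables, columns and predicate logic are invented for illustration):

```sql
-- Dynamic data masking: reveal only the last four characters of a (hypothetical) SSN column
ALTER TABLE dbo.Employees
    ALTER COLUMN SSN ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XX-",4)');
GO

-- Row-level security: a predicate function plus a security policy filter rows per user
CREATE FUNCTION dbo.fn_TerritoryPredicate (@TerritoryOwner AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS allowed
           WHERE @TerritoryOwner = USER_NAME();   -- only rows "owned" by the current user
GO
CREATE SECURITY POLICY TerritoryFilter
    ADD FILTER PREDICATE dbo.fn_TerritoryPredicate(TerritoryOwner) ON dbo.Sales
    WITH (STATE = ON);
```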

Each new version brings performance enhancements, e.g. columnstore indexes, in-memory OLTP tables, query optimisation.

New Technologies in 2016 include:

  • JSON support. Query and return data in JSON format (see the sketch after this list). Developers have been able to work with XML (and SOAP endpoints) since 2005, but SOAP endpoints are now deprecated and JSON (which is popular for RESTful systems) is supported instead.
  • Master data services.
  • Polybase (not to be confused with a clustering solution – it’s about talking to other data sources, e.g. Hadoop, Cloudera and Azure storage, to be expanded to include Oracle, Teradata, Mongo, Spark and more).
  • R Services/R Server (R within the database and also R Server for data science/big data queries).
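As an illustration of the JSON support mentioned in the first bullet above (the table, columns and document contents are invented):

```sql
-- Return query results as JSON
SELECT OrderID, CustomerName, OrderDate
FROM dbo.Orders               -- hypothetical table
FOR JSON PATH;

-- Shred a JSON document back into rows and columns
DECLARE @json nvarchar(max) = N'[{"OrderID":1,"CustomerName":"Contoso"}]';
SELECT *
FROM OPENJSON(@json)
     WITH (OrderID int '$.OrderID', CustomerName nvarchar(100) '$.CustomerName');
```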

2017 builds on 2016 to include:

  • Linux and Docker support. Starting with SQL Server 2017, SQL Server is available for either Windows or Linux systems and it’s available as an installable application or for Docker containers.
  • SQL Server R Services has been renamed SQL Server Machine Learning Services, to reflect support for Python in addition to R.

There are many more features in the Microsoft documentation but these are the most significant updates.

But what about the cloud?

This post provided a quick run-down of some of the major on-premises SQL Server components but, just as with Microsoft’s other products, there are cloud alternatives too. I’m planning a follow-up post to cover these so watch this space!

More on NoSQL, Hadoop and Microsoft’s entry to the world of big data

This content is 13 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Yesterday, my article on Microsoft’s forays into the world of big data went up on Cloud Pro. It’s been fun learning a bit about the subject (far more than is in that article – because big data is a big theme in my work at the moment) and I wanted to share some more info that didn’t fit into my allotted 1000 words.

Microsoft Fellow Dr David DeWitt gave an excellent keynote on Day 3 of the SQL PASS 2011 summit last month and it’s a great overview of how Hadoop works. Of course, he has a bias towards use of RDBMS systems but the video is well worth watching for its introduction to NoSQL, the differences between key value stores and Hadoop-type systems, and the description of the Hadoop components and how they fit together (skip the first 18 minutes and, if the stream doesn’t work, try the download – the deck is available too). Grant Fritchey and Jen McCown have written some great notes to go with Dr DeWitt’s keynote too. For more about when you might use Hadoop, Jeremiah Peschka has a good post.

Microsoft’s SQOOP implementation is not the first – Cloudera have been integrating SQL and Hadoop for a couple of years now. Meanwhile, Buck Woody has a great overview of Microsoft’s efforts in the big data space.

I also mentioned Microsoft StreamInsight (formerly code-named “Austin”) in the post (the Complex Event Processing capability inside SQL Server 2008 R2) and Microsoft’s StreamInsight Team has posted what they call “the basics” of event processing. It seems to require coding, but is probably useful to anyone who is getting started with this stuff. For those of us who are a little less code-oriented, Andrew Fryer’s overview of StreamInsight (together with a more general post on CEP) is worth a read, together with Simon Munro’s post on where StreamInsight fits in.

Shortly after I sent my article to Cloud Pro’s Editor, I saw Mike Walsh’s “Microsoft Loves Your Big Data” post. I like this because it cuts through the press announcements and talks about what is really going on: interoperability; and becoming a player themselves. Critically:

“They aren’t copying, or borrowing or trying to redo… they are embracing”

And that is what I really think makes a refreshing change.

SQL Server and Hadoop – unlikely bedfellows but a powerful combination

This content is 13 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Big Data is hard to avoid – what does Microsoft’s embrace of Hadoop mean for IT Managers?

There are two words that seem particularly difficult to avoid at the moment: big data. Infrastructure guys instinctively shy away from data but such is its prevalence that big data is much more than just the latest IT buzzword and is becoming a major theme in our industry right now.

But what does “big data” actually mean? It’s one of those phrases that, like cloud computing before it, is being “adopted” by vendors to mean whatever they want it to mean.

The McKinsey Global Institute describes big data as “the next frontier for innovation, competition and productivity” but, put simply, it’s about analysing masses of unstructured (or semi-structured) data which, until recently, was considered too expensive to do anything with.

That data comes from a variety of sources including sensors, social networks and digital media and it includes text, audio, video, click-streams, log files and more. Cynics who scoff at the description of “big” data (what’s next, “huge” data?) miss the point that it’s not just about the volume of the data (typically many petabytes) but also the variety and frequency of that data. Some even refer to it as “nano data” because what we’re actually looking at is massive sets of very small data.

Processing big data typically involves distributed computer systems and one project that has come to the fore is Apache Hadoop – a framework for development of open-source software for reliable, scalable distributed computing.

Over the last few weeks though, there have been some significant announcements from established IT players, not all of whom are known for embracing open source technology. This indicates a growing acceptance of big data solutions in general and specifically of solutions that include both open- and closed-source elements.

When Microsoft released a SQL Server-Hadoop (SQOOP) Connector, there were questions about what this would mean for CIOs and IT Managers who may previously have viewed technologies like Hadoop as a little esoteric.

The key to understanding what this would mean would be understanding the two main types of data: structured and unstructured. Structured data tends to be stored in a relational database management system (RDBMS), for example Microsoft SQL Server, IBM DB2, Oracle 11G or MySQL.

By structuring the data with a schema, tables, keys and all manner of relationships, it’s possible to run queries (with a language like SQL) to analyse the data, and techniques have developed over the years to optimise those queries. By contrast, unstructured data has no schema (at least not a formal one) and may be as simple as a set of files. Structured data offers maturity, stability and efficiency but unstructured data offers flexibility.

Secondly, there needs to be an understanding of the term “NoSQL”.  Commonly misinterpreted as an instruction (no to SQL), it really means not only SQL – i.e. there are some types of data that are not worth storing in an RDBMS.  Rather than following the database model of extract, transform and load (ETL), with a NoSQL system the data arrives and the application knows how to interpret the data, providing a faster time to insight from data acquisition.

Just as there are two main types of data, there are two main types of NoSQL system: key/value stores (like MongoDB or Windows Azure Table Storage) can be thought of as NoSQL OLTP; Hadoop is more like NoSQL data warehousing and is particularly suited to storing and analysing massive data sets.

One of the key elements towards understanding Hadoop is understanding how the various Hadoop components work together. There’s a degree of complexity so perhaps it’s best to summarise by saying that the Hadoop stack consists of a highly distributed, fault tolerant, file system (HDFS) and the MapReduce framework for writing and executing distributed, fault tolerant, algorithms. Built on top of that are query languages (like Hive and Pig) and then we have the layer where Microsoft’s SQOOP connector sits, connecting the two worlds of structured and unstructured data.
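To show what that query-language layer looks like, here’s an illustrative HiveQL fragment (Hive’s SQL-like language) – the table, columns and HDFS path are all invented:

```sql
-- Illustrative HiveQL (not T-SQL): table, columns and HDFS path are invented
CREATE EXTERNAL TABLE web_clicks (
    user_id  STRING,
    url      STRING,
    click_ts STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/web_clicks';        -- files already sitting in HDFS

-- The query is compiled into MapReduce jobs that run across the cluster
SELECT url, COUNT(*) AS clicks
FROM web_clicks
GROUP BY url
ORDER BY clicks DESC
LIMIT 10;
```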

The trouble is that SQOOP is just a bridge – and not a particularly efficient one either: working on SQL data in the unstructured world involves subdividing the SQL database so that MapReduce can work correctly.

Because most enterprises have both structured and unstructured data, we really need tools that allow us to analyse and manage data in multiple environments – ideally without having to go back and forth. That’s why so many vendors are jumping on the big data bandwagon – and a SQOOP connector is not the only work Microsoft is doing in the big data space.

In our increasingly cloudy world, infrastructure and platforms are rapidly becoming commoditised. We need to focus on software that allows us to derive value from data to gain some business value. Consider that Microsoft is only one vendor, then think about what Oracle, IBM, Fujitsu and others are doing. If you weren’t convinced before, maybe HP’s Autonomy purchase is starting to make sense now?

Looking specifically at Microsoft’s developments in the big data world, it therefore makes sense to see the company get closer to Hadoop. The world has spoken and the de facto solution for analysing large data sets seems to be HDFS/MapReduce/Hive (or similar).

Maybe Hadoop’s success comes down to HDFS and MapReduce being based on work from Google whilst Hive and Pig are supported by Facebook and Yahoo respectively (i.e. they are all from established Internet businesses).  But, by embracing Hadoop (together with porting its tools to competitive platforms), Microsoft is better placed to support the entire enterprise with both their structured and unstructured needs.

[This post was originally written as an article for Cloud Pro.]

An introduction to business intelligence for IT Managers

This content is 15 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

A few weeks ago, I caught one of the IT Manager series of webcasts that Microsoft is running, where Andrew Fryer was introducing Business Intelligence for IT Managers (I’ll steer clear of the obvious joke about IT managers and intelligence there… I might want a job as an IT Manager one day…). This was an interesting presentation for a couple of reasons: it’s not a topic that I know well; and Andrew presented 290 slides in less than an hour (which sounds a lot, but it wasn’t – he used slides with just a few words or a picture, in rapid succession – and I like that style).

I can’t find the recorded version of the presentation online but this blog post attempts to capture the main points.

According to Wikipedia, Business Intelligence (BI) can be defined as follows:

“Business intelligence (BI) refers to skills, technologies, applications and practices used to help a business acquire a better understanding of its commercial context. Business intelligence may also refer to the collected information itself.”

That’s a bit of a mouthful, but basically it makes BI seem hard. So, let’s think about intelligence without the business context – is it: knowledge and understanding (we used to think the world was flat!); about meaning and context (some information can seem obvious without context); about foresight (to predict future events); the ability to solve complex problems; or the ability to make decisions?

We make decisions all the time – and some of those decisions are poor ones – so if intelligence is about making decisions (and decisions require people), what makes a good decision? The answer is information. The information provides the meaning and context together with the background information (knowledge and understanding, likelihood of a given event occurring, etc.) to allow someone to make the right decision.

Information has been used throughout history to share stories, to learn and to discover things. Over time, information has helped to provide answers and to unlock secrets, which allowed innovation. Information has provided answers – and answers have allowed people to make decisions.

In a business context, the information is derived from data (about people, products, places, etc.). Where there are questions (which products are best? how are the sales figures looking? how are my people?), some insight is required to provide meaning and to convert raw data to information.

That data needs to be stored – originally it was stored in paper files and later on computer disks and tapes – but it also needs to be managed. The advent of databases provided a means for data to be stored and managed but business applications were needed to abstract the management of the database from business end users. These business applications provided a better way to collect data, easing the process of data entry and managing access, to ensure that those who needed access were able to find answers to business questions. But it wasn’t easy – the data was sourced from many locations. Reports were one approach, but they were one-dimensional and siloed, fragmented and lacking insight.

The advent of data warehouses allowed data from multiple locations to be organised, managed and accessed – or consumed. Now the business applications could analyse as well as report and the term “Business Intelligence” was born. Promising more access, from more locations, BI vendors created demand for more data. This led to businesses wanting faster access to data (improved performance). But as the volume of data exploded, so did the use of personal computers, and most data ended up in desktop productivity applications like Microsoft Excel, and Access. There was no single version of the truth upon which to base decisions, the data was hard to maintain and the BI tools cost a lot of money, so vendors had to find a way to reduce costs and offer increased functionality. The result was a period of vendor consolidation in the BI tools space and the formation of a few BI platforms, from companies like Oracle, SAP, IBM and Microsoft, offering more tools and more functionality, for both online and offline access.

But, for all the promises, BI tools were still not working. Business users were still confused and the business couldn’t get the answers that were needed. It wasn’t about people – it was still about disparate systems, with access to data controlled by the IT department. An overstretched IT department. So business users started to circumvent IT departments, but the BI tools were not intuitive and the users didn’t have the time to be IT administrators. Suddenly BI was about usability, and turning data into the right format to be easily consumed by people, with that data managed by IT.

There’s not just the data in the databases to consider either – there is unstructured data too. That unstructured data comes from places like blogs, wikis, e-mail messages, documents, presentations, and videos – at one point analysts considered that 80% of business was conducted based on unstructured data.

So BI is about the right person, accessing the right data, at the right time – and it needs to be people-centric because it’s generally people that make decisions, not computers. Businesses need to do more to collaborate, search, and communicate about questions and answers in order to drive innovation. Even in today’s times of economic uncertainty, BI is still a priority at CxO level in order for businesses to do more with less, to provide better insight for better decision-making, for more people.

Reporting and scorecards are important components of the BI toolset, along with dashboards to display key performance indicators, for analysis. On the desktop we still use applications like Excel but the data lives in the warehouse. Other BI features include data mining (e.g. the shopping basket analysis that supermarket chains carry out using our loyalty cards). For unstructured data, we have portals for collaboration.

In today’s BI implementations the critical success factors are sponsorship (at a senior level in the company), a compelling need, a culture of analysis (rather than looking for divine inspiration) and, most importantly, partnership between the IT department and business users.

I don’t pretend to know anything about any of the specialist BI tools but, on the Microsoft infrastructure side, we already have some useful tools. Office gives us desktop applications like Excel, there are collaboration services in the form of SharePoint products and technologies, and we have a scalable database engine in SQL Server – there’s more information on Microsoft’s BI blog and learn more about the products on Microsoft’s BI website. There’s also advice on planning for BI in the SharePoint Server TechCenter, webcasts, videos, virtual labs and podcasts and more advice for IT Managers and their teams on the TechNet website. Finally, if you just want the highlights and a bit of technical analysis, Andrew Fryer’s “Insufficient Data” blog is worth a read.

No more heroes {please}

This content is 17 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

That’s it.  A single reference to [IT] heroes.  No more – because I didn’t count how many times that word was used at the 2008 Global Launch event today but I certainly didn’t have enough fingers and toes to keep a tally – and now I’m tired of hearing it.

Although those of us at the UK launch had already heard from a variety of Microsoft executives (including Microsoft UK Managing Director, Gordon Frazer, and Microsoft’s General Manager for the Server and Tools Division, Larry Orecklin) and customers, the highlight was the satellite link-up to the US launch event with Microsoft CEO, Steve Ballmer.  Unfortunately, before we got to hear the big man speak, we had to listen to the warm-up act – Tom Brokaw, who it would seem is a well-known television presenter in the States, but totally unknown over here.  He waffled on for a few minutes with the basic premise being that we are in a transformational age in the history of our world and that the definition of our time and generation comes from unsung heroes (damn, that’s the second time I’ve used the word) – not celebrities.

So.  Windows Server 2008, Visual Studio 2008, SQL Server 2008.  Three new products – one released last year, one earlier this month, and another due later in 2008 in Microsoft’s largest ever launch event with 275,000 people expected to attend events across the globe and another million online at the virtual launch experience website.  Ballmer described them as "The most significant [products] in Microsoft’s history" and "enablers to facilitate the maximum impact that our industry can have".  But what does that mean for you and me – the people that Microsoft likes to refer to with the H word who implement their technology in order to execute this change on an unsuspecting world?

I’ve written plenty here before about Windows Server 2008, but the 2008 global launch wave is about more than just Windows.  For years now, Microsoft has been telling us about dynamic IT and over the last few years we have seen many products that can help to deliver that vision.  The 2008 global launch wave is built around four areas:

  1. A secure and trusted foundation.
  2. Virtualisation.
  3. Web and developer productivity.
  4. Business intelligence (and user experience).

So, taking each of these one at a time, what do the 2008 products offer?

A secure and trusted foundation

Security and reliability are always touted as benefits for the latest version of any product, but in the case of Windows Server there are some real benefits.  The Server Core installation option results in a smaller codebase, meaning a reduced attack surface.  The modular design of IIS (and indeed the role-based architecture for Windows Server) means that only those components that are required are installed. Read-only domain controllers allow for secure deployment of directory servers in branch office situations that previously would have been a major security risk.

Availability is increased with enhancements to failover clustering (including new cluster validation tools), SQL data mirroring and the new resource governor functionality in SQL Server 2008 which allows resources to be allocated to specific workloads.
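As a rough sketch of the resource governor idea (the pool, group, login and limit values below are invented for illustration):

```sql
-- Cap a hypothetical reporting workload at 20% CPU (run in master)
CREATE RESOURCE POOL ReportingPool WITH (MAX_CPU_PERCENT = 20);
CREATE WORKLOAD GROUP ReportingGroup USING ReportingPool;
GO
-- Classifier function decides which workload group each new session joins
CREATE FUNCTION dbo.fn_classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    RETURN (CASE WHEN SUSER_SNAME() = N'report_user'   -- hypothetical login
                 THEN N'ReportingGroup'
                 ELSE N'default' END);
END;
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```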

On the compliance and governance front, there is network access protection, federated rights management, and transparent SQL data encryption.

Microsoft is also keen to point out that their database platform has seen significantly fewer critical vulnerabilities in recent history than Oracle.

Finally, although not strictly security-related, Microsoft cites 40% of data centre costs relating to power and that Windows Server 2008 consumes 10% less power than previous versions of Windows Server, when running the same workload.

Virtualisation

Microsoft’s view on virtualisation is broader than just server virtualisation, encompassing not just the new Hyper-V role that will ship within 180 days of Windows Server 2008 release but also profile virtualisation (document redirection and offline files), client virtualisation (Vista Enterprise Centralised Desktop), application virtualisation (formerly SoftGrid) and presentation virtualisation (Terminal Services RemoteApp), all managed in one integrated, unified manner with System Center.

As for VMware‘s dominance of the server virtualisation space – I asked Larry Orecklin how Microsoft would combat customer perceptions around Microsoft’s lack of maturity in this space. His response was that "the proof is in the pudding" and that many customers are running Hyper-V in beta with positive feedback on performance, scalability and ease of use.  Microsoft UK Server Director, Bruce Lynn added that Hyper-V is actually the tenth virtualisation product that Microsoft has brought to market.

In Steve Ballmer’s keynote, he commented that [customers] have told Microsoft that virtualisation is too hard and too expensive – so Microsoft wants to "democratise virtualisation" – to switch from the current situation where less than 10% of servers are virtualised to a world where 90% are.  Their vision is for a scalable and performant hypervisor-based virtualisation platform, with minimal footprint, interoperability with competitive platforms, and integrated management tools.

Web and developer productivity

At the core of Windows Server 2008 is IIS 7.0 but Visual Studio extends the vision for developer productivity when creating rich web applications including support for AJAX, JavaScript IntelliSense, XAML, LINQ, entity-level data access and multi-targeting.

From a platform perspective, there are improvements around shared configuration, administrative delegation and scalability.

Combined with Silverlight for a rich user experience and Expression Blend (for designers to interact with developers on the same code), Microsoft believes that their platform is enabling customers to provide better performance, improved usability and a better experience for web-based applications.  It all looks good to me, but I’m yet to be convinced by Silverlight, or for that matter Adobe AIR – this all seems to me like a return to the days when every site had a Shockwave/Flash intro page and I’d like to see a greater emphasis on web standards.  Still, at least IIS has new support for running PHP without impacting on performance now – and Visual Studio includes improved CSS styling support.

Business intelligence

Ballmer highlighted that business intelligence (BI) is about letting users engage with applications – providing not just presentation but insight – getting at the data to provide business value.  Excel is still the most popular business intelligence tool, but combined with other products (e.g. SharePoint and PerformancePoint), the Microsoft BI story is strengthened.

SQL Server 2008 is at the core of the BI platform providing highly performant and scalable support for data warehousing with intelligence for both structured and unstructured data.  SQL Server reporting services integrates with Office applications and the ability to store spatial data opens new possibilities for data-driven applications (e.g. the combination of non-relational data and BI data to provide location awareness).
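To illustrate the new spatial support, the geography type can store points and measure the distance between them (the coordinates here are approximate and purely illustrative):

```sql
-- Two points (latitude, longitude, SRID 4326) and the distance between them in km
DECLARE @luton geography = geography::Point(51.8747, -0.3683, 4326);
DECLARE @nice  geography = geography::Point(43.6584,  7.2159, 4326);
SELECT @luton.STDistance(@nice) / 1000 AS ApproxDistanceKm;   -- STDistance returns metres
```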

Putting it all together

So, that’s the marketing message – but what does this mean in practice?  Microsoft used a fictitious coffee company to illustrate what could be done with their technology but I was interested to hear what some of their TAP customers had been up to.  Here in the UK there were a number of presentations from well-known organisations that have used 2008 launch wave products to solve specific business issues.

easyJet have carried out a proof of concept that they hope to develop into an improved travel portal for their customers.  As a low-fares airline, you might expect anything more than the most basic website to be an expensive extravagance but far from it – 98% of easyJet’s customers book via the web, and if the conversion rate could be increased by 1% then that translates into £17m of revenue each year.

The easyJet proof of concept uses a Silverlight and AJAX front end to access Microsoft .NET 3.5 web services and SQL Server 2008.  Taking a starting point of, for example, London Luton, a user can select a date and see the lowest prices to all available destinations on a map.  Clicking through to a destination reveals a Microsoft Virtual Earth map with points of interest within a particular radius.  Streaming video is added to the mix, along with the ability to view hotel details using TripAdvisor and book online.

The proof of concept went from design to completion in just 6 weeks.  Windows Server 2008 provided IIS 7.0 with its modular design and simplified configuration.  SQL Server 2008 allowed the use of geospatial data.  And Visual Studio 2008 enhanced developer productivity, team collaboration and the overall user experience.

Next up was McLaren Electronic Systems, using SQL Server 2008 to store telemetry data transmitted in real time from Formula 1 racing cars.  With microwave signals bouncing off objects and data arriving out of sequence, the filestream feature allows data to be streamed into a relational database for fast access.  Tests have shown that for files above 2MB this technology will out-perform a traditional file system.  Formula 1 may sound a little specialised to relate to everyday business but as McLaren explained, a Formula 1 team will typically generate 3TB of data in a season.  That’s a similar volume to a financial services company, or a warehousing and logistics operation – so the technology is equally applicable to many market sectors.
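A minimal sketch of a FILESTREAM-enabled table (this assumes FILESTREAM has already been enabled on the instance and a FILESTREAM filegroup exists in the database; the table and column names are invented):

```sql
-- A FILESTREAM column stores large binary data on the NTFS file system,
-- while the database engine keeps it transactionally consistent with the row data
CREATE TABLE dbo.TelemetryRuns (
    RunId   UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    CarId   INT NOT NULL,
    RawData VARBINARY(MAX) FILESTREAM NULL
);
```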

The John Lewis Partnership is using Windows Server 2008 for its branch office infrastructure.  Having rolled out Windows Server 2003, they would like to reduce the number of servers (and the carbon footprint of their IT operations) at the same time as doubling the number of stores.  Security is another major consideration, with the possibility of data corruption if power is removed from a server and a security breach if a directory server is compromised.

By switching branch servers to Windows Server 2008 read-only domain controllers (DCs), John Lewis can combine the DCs with other branch office functions (print, DHCP, System Center Configuration Manager and Operations Manager) to remove one server from every store.  The reduction in replication traffic (AD replication is all one-way from the centre to the RODCs) allows for a reduction in data centre DCs too.  Windows Server 2008 also facilitates improved failover between data centres in a disaster recovery scenario.  Other Windows Server technologies of interest to John Lewis include Server Core, 64-bit scalability and clustering.

The University of Cambridge is making use of the ability to store spatial data in SQL Server 2008 to apply modern computing to the investigation of 200 year-old theories on evolution.  And Visual Studio 2008 allowed the construction of the associated application in just 5 days.  As Professor John Parker and his self-confessed "database geek" sidekick, Dr Mark Whitehorn explained, technologies such as this are "allowing the scientific community to wake up to business intelligence".

Finally, the Rural Payments Agency (the UK government agency responsible for paying agricultural subsidies) is using Microsoft Application Virtualization and Terminal Services to provide an ultra-thin client desktop to resolve application conflicts and allow users to work from any desk.

Roadmap

Microsoft never tells us a great deal about the roadmap (at least not past the next year or so) but the 2008 launch wave includes a few more products yet.  Visual Studio 2008 and Windows Server 2008 have already shipped.  SQL Server 2008 will be available in the third quarter of 2008 (with a community technology preview today) and the Hyper-V role for Windows Server will ship within 180 days of Windows Server (although I have heard rumours it may be a lot closer than that).  In the summer we will see a new release of Windows Small Business Server as well as a new product for SMEs – Windows Essential Business Server – and, at the other end of the computing spectrum, Windows High Performance Computing Server.  Finally, a new version of Silverlight will ship at some point this year.

Summary

I may not be a fan of the HEROES happen {here} theme but that’s just marketing – I’ve made no secret of the fact that I think Windows Server 2008 is a great product.  I don’t have the same depth of experience to comment on Visual Studio or SQL Server but the customer presentations that I heard today add credence to Microsoft’s own scenario for a dynamic, agile, IT infrastructure to reduce the demands for maintenance of the infrastructure and drive out innovation to support the demands of modern business. 

Mark Wilson {United Kingdom}

IT Forum ’05 highlights: part 1

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Microsoft UK IT Forum Highlights
A few years back, I used to try and persuade my employer to send me to Microsoft TechEd Europe each year, on the basis that lots of 75 minute presentations on a variety of topics provided a better background for me than a few days of in depth product training (I can build experience later as I actually use the technology). The last time I attended TechEd was back in 2001, by which time it had become more developer-focused and the IT Forum was being positioned as the infrastructure conference (replacing the Microsoft Exchange Conference). For the last couple of years, I haven’t been able to attend the IT Forum due to family commitments (first it clashed with the birth of my son and subsequently it’s been in conflict with his birthday, as it is again this year) but luckily, Microsoft UK has been re-presenting the highlights from IT Forum as free-of-charge TechNet events (spread over two days) and I’ve managed to take some time out to attend them.

Yesterday’s event covered a variety of topics. Unfortunately there was no concept of different tracks from which I could attend the most relevant/interesting sessions, so some of it went completely over my head. One of those topics was upgrading to SQL Server 2005, so apologies to the presenter – I was the guy nodding off on the front row.

In the next few paragraphs, I’ll highlight some of the key points from the day.

Upgrading to SQL Server 2005
Presented by Tony Rogerson, SQL Server MVP and UK SQL Server Community leader, this session gave useful information for those looking at upgrading from SQL Server 2000 (or earlier) to SQL Server 2005. I’ve blogged previously with a SQL Server 2005 overview, why SQL Server 2005 is such a significant new product and on the new management tools but the key points from Tony’s presentation were:

  • Upgrades (in-place upgrades) are supported, preserving user data and maintaining instance names in a largely automated fashion, as are side-by-side migrations (mostly manual, copying data from an old installation to a new and then decommissioning the old servers).
  • SQL Server versions prior to 7.0 cannot be migrated directly and SQL Server 7.0/2000 need to be updated to the latest service pack levels before they can be migrated. For SQL Server 2000 that is SP4, which might break some functionality for SP3A users, so the upgrade needs to be carefully planned.
  • The database engine (including subcomponents like the SQL Agent, tools, etc.), analysis services, reporting services and notification services can all be upgraded, and data transformation services can be migrated to integration services.
  • All product editions can be upgraded/migrated (32/64-bit, desktop, workgroup, personal, standard, developer or enterprise editions), as can all SQL Server 7.0/2000 released languages.
  • A smooth upgrade requires a good plan, breaking tasks into:
    • Pre-upgrade tasks.
    • Upgrade execution tasks.
    • Post-upgrade tasks (day 0, day 30, day 90).
    • Backout plan.
  • Microsoft provides the SQL Server 2005 Upgrade Advisor as a free download to analyse instances of SQL Server 7.0 and SQL Server 2000 in preparation for upgrading to SQL Server 2005. This can be used repeatedly until all likely issues have been resolved and the upgrade can go ahead.
  • Migration provides for more granular control over the process than an upgrade would and the presence of old and new installations side-by-side can aid with testing and verification; however it does require new hardware (although a major investment in a SQL Server upgrade would probably benefit from new hardware anyway) and applications will need to be directed to the new instance. Because the legacy installation remains online, there is complete flexibility to fail back should things not go to plan.
  • Upgrades will be easier and faster for small systems and require no new hardware or application reconfiguration; however the database instances will remain offline during the upgrade and it’s not best practice to upgrade all components (e.g. analysis services cubes).
  • Upgrade tips and best practices include:
    • Reduce downtime by pre-installing setup pre-requisites (Microsoft .NET Framework 2.0, SQL Native Client and setup support files) – some of these are needed for the Upgrade Advisor anyway.
    • If planning a migration using the copy database wizard, place the database in single-user mode (to stop users from modifying the data during the upgrade – see the sketch after this list) and make sure that no applications or services are trying to access the database. Also, do not use read-only mode (this will result in an error) and note that the database cannot be renamed during the operation.
    • Be aware of the reduced attack surface of SQL Server 2005 – some services and features are disabled for new installations (secure by default) – the surface area configuration tools can be used to enable or disable features and services.
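For the single-user mode tip above, the usual commands are along these lines (the database name is a placeholder):

```sql
-- Prevent other connections while the database is copied
ALTER DATABASE MyAppDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- ...run the copy database wizard / migration...

-- Return the database to normal use afterwards
ALTER DATABASE MyAppDb SET MULTI_USER;
```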

Leveraging your Active Directory for perimeter defence
Presented by Richard Warren, an Internet and security training specialist, I was slightly disappointed with this session, which failed to live up to the promises that its title suggested. After spending way too much time labouring Microsoft’s usual points about a) how packet filtering alone is not enough and ISA Server adds application layer filtering and b) ISA Server 2004 is much better and much easier to use than ISA Server 2000, Richard finally got down to some detail about how to use existing investments in AD and ISA Server to improve security (but I would have liked to have seen more real-world examples of exactly how to implement best practice). Having been quite harsh about the content, I should add that there were some interesting points in his presentation:

  • According to CERT, 95% of [computer security] breaches [were] avoidable with an alternative configuration.
  • According to Gartner Group, approximately 70% of all web attacks occur at the application layer.
  • Very few organisations are likely to deploy ISA Server as a first line of defence. Even though ISA Server 2004 is an extremely secure firewall, it is more common to position a normal layer 3 (packet filtering) firewall at the network edge and then use ISA Server behind this to provide application layer filtering on the remaining traffic.
  • Users who are frightened of IT don’t cause many problems. Users who think they understand computers cause most of the problems. Users who do know what they are doing are few and far between. (Users are a necessary evil for administrators).
  • Not all attacks are malicious and internal users must not be assumed to be “safe”.
  • ISA Server can be configured to write its logs to SQL Server for analysis.
  • Active Directory was designed for distributed security (domain logon/authentication and granting access to resources/authorisation) but it can also store and protect identities and plays a key role in Windows manageability (facilitating the management of network resources, the delegation of network security and enabling centralised policy control).
  • Using ISA Server to control access to sites (both internal and external), allows monitoring and logging of access by username. If you give users a choice of authenticated access or none at all, they’ll choose authenticated access. If transparent authentication is used with Active Directory credentials, users will never know that they needed a username and password to access a site (this requires the ISA Server to be a member of the domain or a trusted domain, such as a domain which only exists within the DMZ).
  • ISA Server’s firewall engine performs packet filtering and operates in kernel mode. The firewall service performs application layer filtering (extensible via published APIs) and operates in user mode.
  • SSL tunnelling provides a secure tunnel from a client to a server. SSL bridging involves installing the web server’s certificate on the ISA Server, terminating the client connection there and letting ISA server inspect the traffic and handle the ongoing request (e.g. with another SSL connection, or possibly using IPSec). Protocol bridging is similar, but involves ISA server accepting a connection using one protocol (e.g. HTTP) before connecting to the target server with another protocol (e.g. FTP).

Microsoft Windows Server 2003 Release 2 (R2) technical overview
Presented by Quality Training (Scotland)‘s Andy Malone, this session was another disappointment. Admittedly, a few months back, I was lucky to be present at an all day R2 event, again hosted by Microsoft, but presented by John Craddock and Sally Storey of Kimberry Associates, who went into this in far more detail. Whilst Andy only had around an hour (and was at pains to point out that there was lots more to tell than he had time for), the presentation looked like Microsoft’s standard R2 marketing deck, with some simple demonstrations, poorly executed, and it seemed to me that (like many of the Microsoft Certified Trainers that I’ve met) the presenter had only a passing knowledge of the subject – enough to present, but lacking real world experience.

Key points were:

  • Windows Server 2003 R2 is a release update – approximately half way between Windows Server 2003 and the next Windows Server product (codenamed Longhorn).
  • In common with other recent Windows Server System releases, R2 is optimised for 64-bit platforms.
  • R2 is available in standard, enterprise and datacenter editions (no web edition) consisting of two CDs – the first containing Windows Server 2003 slipstreamed with SP1 and the second holding the additional R2 components. These components are focused around improvements in branch office scenarios, identity management and storage.
  • The new DFSR functionality can provide up to 50% WAN traffic reduction through improved DFS replication (using bandwidth throttling and remote differential compression, whereby only file changes are replicated), allowing centralised data copies to be maintained (avoiding the need for local backups, although one has to wonder how restoration might work over low-speed, high latency WAN links). Management is improved with a new MMC 3.0 DFS Management console.
  • There is a 5MB limit on the size of the DFS namespace file, which equates to approximately 5000 folders for a domain namespace and 50,000 folders for a standalone namespace. Further details can be found in Microsoft’s DFS FAQ.
  • Print management is also improved with a new MMC 3.0 Print Management console, which will auto-discover printers on a subnet and also allows deployment of printer connections using group policy (this requires the use of a utility called pushprinterconnections.exe within a login script, as well as a schema update).
  • Identity and access management is improved with Active Directory federation services (ADFS), Active Directory application mode (ADAM – previously a separate download), WS-Management and Linux/Unix identity management (incorporating Services for Unix, which was previously a separate download).
  • For many organisations, storage management is a major problem with typical storage requirements estimated to be increasing by between 60% and 100% each year. The cost of managing this storage can be 10 times the cost of the disk hardware and Microsoft has improved the storage management functionality within Windows to try and ease the burden.
  • The file server resource manager (FSRM) is a new component to integrate capacity management, policy management and quota management, with quotas now set at folder level (rather than volume) and file screening to avoid storage of certain file types on the server (although the error message if a user tries to do this just warns of a permissions issue and is more likely to confuse users and increase the burden on administrators trying to resolve any resulting issues).
  • Storage manager for SANs allows Windows administrators to manage disk resources on a SAN (although not with the granularity that the SAN administrator would expect to have – I’ve not seen this demonstrated but believe it’s only down to a logical disk level).
  • In conclusion, Windows Server 2003 R2 builds on Windows Server 2003 with new functionality, but with no major changes so as to ensure a non-disruptive upgrade with complete application compatibility, and requiring no new client access licenses (CALs).

Management pack melee: understanding MOM 2005 management packs
Finally, a fired up, knowledgeable presenter! Gordon McKenna, MOM MVP is clearly passionate about his subject and blasted through a whole load of detail on how Microsoft Operations Manager (MOM) uses management packs to monitor pretty much anything in a Windows environment (and even on other platforms, using third-party management packs). There was way too much information in his presentation to represent here, but Microsoft’s MOM 2005 for beginners website has loads of information including technical walkthroughs. Gordon did provide some additional information though which is unlikely to appear on a Microsoft website (as well as some that does):

  • MOM v3 is due for release towards the end of this year (I’ve blogged previously about some of the new functionality we might see in the next version of MOM). It will include a lightweight agent, making MOM more suitable for monitoring client computers as well as a Microsoft Office management pack. MOM v3 will also move from a server-centric paradigm to a service-centric health model in support of the dynamic systems initiative and will involve a complete re-write (if you’re going to buy MOM this year, make sure you also purchase software assurance).
  • There are a number of third-party management packs available for managing heterogeneous environments. The MOM management pack catalogue includes details.
  • The operations console notifier is a MOM 2005 resource kit utility which provides pop-up notification of new alerts (in a similar manner to Outlook 2003’s new mail notification).

A technical overview of Microsoft Virtual Server 2005
In the last session of the day, Microsoft UK’s James O’Neill presented a technical overview of Microsoft Virtual Server 2005. James is another knowledgeable presenter, but the presentation was an updated version of a session that John Howard ran a few months back. That didn’t stop it from being worthwhile – I’m glad I stayed to watch it as it included some useful new information:

  • Windows Server 2003 R2 Enterprise Edition changes the licensing model for virtual servers in three ways: firstly, by including 4 guest licenses with every server host licence (total 5 copies of R2); secondly by only requiring organisations to be licensed for the number of running virtual machines (currently even stored virtual machine images which are not in regular use each require a Windows licence); finally, in a move which is more of a clarification, server products which are normally licensed per-processor (e.g. SQL Server, BizTalk Server, ISA Server) are only required to be licensed per virtual processor (as Virtual Server does not yet support SMP within the virtual environment).
  • The Datacenter edition of the next Windows Server version (codenamed Longhorn) will allow unlimited virtual guests to be run as part of its licence – effectively mainframe Windows.
  • Microsoft is licensing (or plans to licence) the virtual hard disk format, potentially allowing third parties to develop tools that allow .VHD files to be mounted as drives within Windows. There is a utility to do this currently, but it’s a Microsoft-internal tool (I’m hoping that it will be released soon in a resource kit).
  • As I reported previously, Microsoft is still planning a service pack for Virtual Server 2005 R2 which will go into beta this quarter and to ship in the Autumn of 2006, offering support for Intel virtualization technology (formerly codenamed Vanderpool) and equivalent technology from AMD (codenamed Pacifica) as well as performance improvements for non-Windows guest operating systems.

Overall, I was a little disappointed with yesterday’s event, although part 2 (scheduled for next week) looks to be more relevant to me with sessions on Exchange 12, the Windows Server 2003 security configuration wizard, Monad, Exchange Server 2003 mobility and a Windows Vista overview. Microsoft’s TechNet UK events are normally pretty good – maybe they are just a bit stretched for presenters right now. Let’s just hope that part 2 is better than part 1.

SQL Server 2005 Express Edition: now available for download

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Back in February, I blogged about SQL Server 2005 Express Edition – the successor to the Microsoft SQL Server Desktop Engine (MSDE). Now that SQL Server 2005 has been launched, SQL Server 2005 Express Edition is available for download from the Microsoft website. For further reading, check out my SQL Server 2005 overview and new tools for managing and administering SQL Server 2005 posts.

Microsoft developer road trip CD

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

In a few days time, Microsoft are launching a whole host of products including Visual Studio 2005 and SQL Server 2005. To complement the launch events and associated webcasts, Microsoft have produced the Microsoft developer road trip CD – a 45 minute audio download featuring information on Visual Studio 2005, SQL Server 2005 and the .NET Framework 2.0 from Microsoft’s own developer and platform group experts.

Designed to be listened to on the move, either in the car or on your favourite media player, this should allow you to get up to speed on what these new product releases are all about in just a few short trips.

Having trouble accessing a recently installed instance of MSDE? Make sure the MSSQLSERVER service is started!

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

I just spent ages trying to work out why I couldn’t access the Microsoft SQL Server Desktop Engine (MSDE) instance that an application had just installed… it may sound obvious, but make sure the MSSQLSERVER service is started!

In my defence, if an application requires a server or service restart after installation, I expect it to tell me that’s what it needs, but the version of MSDE 2000 SP3A downloaded by Altiris Notification Server 6.0 didn’t seem to do that. I had similar problems a few weeks back whilst playing around with Community Server. On both occasions, I thought it might be a problem with my security credentials so I downloaded the SQL Server Web Data Administrator but that couldn’t access the database either. It was only once I’d checked that all the MSDE services were running (MSSQLSERVER was not) that everything jumped into life.

Another tip whilst I’m on the subject – MSDE 2000 SP3A requires a strong password to be set for the sa user. If using a repackaged version of MSDE (as I was), try extracting the package and examining the setup files to find the SA password that has been set as part of the application installation (e.g. using the SAPWD= option for setup.exe).

SQL Server 2005 overview

This content is 19 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

As part of my recent quest to learn about SQL Server from an infrastructure perspective, I’ve been attending a number of events, one of which was Michael Platt‘s keynote at the May 2005 Microsoft Technical Roadshow – SQL Server 2005 for IT Pros.

In my opinion, SQL Server 2005 (codenamed Yukon) is probably Microsoft’s most significant product release since Windows 2000. Central to the data management functionality within the Microsoft platform, this new version of SQL Server, due for release in November, is a massive rewrite (around 3 million lines of C# code), including a huge number of new features with improvements in three key areas, all underpinned by new levels of security and reliability:

  • Developer productivity:
    • Visual Studio 2005 integration.
  • Enterprise data management:
    • SQL Server database.
    • SQL Server replication services.
  • Business intelligence (BI):
    • SQL Server integration services (SSIS) – replacing data transformation services (DTS).
    • SQL Server analysis services.
    • SQL Server reporting services.
    • SQL Server notification services.

In terms of developer productivity, the Microsoft .NET Framework common language runtime (CLR) is now part of the SQL engine, allowing database-side coding using any Microsoft .NET language. XML support is no longer a bolt-on component, with XML now supported natively, feeding into the web services strategy, with integrated web services features including the new SQL Service Broker for messaging between SQL Server and other systems.
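As a small example of the native XML support (the document and element names are invented):

```sql
-- Store and query XML natively with the xml data type
DECLARE @doc xml;
SET @doc = N'<order id="42"><item sku="ABC" qty="3"/></order>';

SELECT @doc.value('(/order/@id)[1]', 'int')       AS OrderId,
       @doc.value('(/order/item/@qty)[1]', 'int') AS Quantity;
```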

Looking at enterprise data management: availability and security is enhanced with support for database mirroring and online operations as well as data security and privacy (encryption and enhanced auditing); manageability is improved with a self-tuning database, fast recovery and restore, and a host of management tools; and the platform scales to cover a huge range of devices from smartphones to massive 64-bit installations using technologies such as partitioning and snapshots to allow the system to scale in line with business growth.

For business intelligence, all of the previously separate tools are pulled together into a single front-end management interface with two components – Management Studio for management and BI Studio for design – both of which are closely integrated with Visual Studio. SQL Server 2005 has more real-time analytical support built into the platform as well as a comprehensive extract, transform and load (ETL) capability for the entire enterprise, supporting heterogeneous databases.

In the past, DTS used the database engine to carry out the transform whereas SSIS carries out the transform itself, with some standard transforms included, but also expandable with custom transformations. With SQL Server 2005, the SQL Server reporting services reporting solution is supplemented with a report builder for user generated reports and a multidimensional expressions (MDX) builder for developers; providing an interactive enterprise reporting environment, integrated with Office System applications. Analysis Services allows the integration of relational databases and OLAP cubes to cache the OLAP data in synchronisation with relational updates and perform analysis on real time data. Finally, data mining capabilities (previously the domain of expensive high-end technologies) become available for general purpose use in SQL Server 2005 offering exploration, pattern discovery and pattern description.


In line with the Microsoft policy of using its new products internally (a process which they refer to, somewhat delightfully, as “dogfooding”), Microsoft is already using SQL Server 2005 heavily for its enterprise applications (e.g. their SAP R/3 1.7TB database, the staging data warehouse for all Microsoft data and the Microsoft sales revenue and reporting system – basically all of Microsoft’s business-critical applications!). Looking at the metrics for just one of those applications – the SAP R/3 system in Redmond which handles Microsoft’s global financial, human resources, sales and distribution resources – this 1.7TB database, spread over 25 production servers, services 2,500 named users (over 57,000 users in total, with between 200 and 600 using the system concurrently), handling 300,000 SAP transactions a day and 100,000 batch jobs a month whilst maintaining greater than 99.9% SAP availability and less than 0.5 second response time – on beta code!

In my opinion, SQL Server 2005 will be Microsoft’s most significant product release since Windows 2000 and is central to the Microsoft platform. The constant battle between Microsoft and Oracle will continue for some time to come but this new version of SQL Server (which has been a long time coming) might finally help to persuade IT Directors that Microsoft is a serious contender in the enterprise data management space.

For more information about SQL Server 2005, I recommend the Microsoft TechNet SQL Server Tech Center, Mat Stephen’s blog, and Jamie Thomson’s blog (particularly for SSIS). SQL Server 2005 is due for release in November 2005, with four product editions – Express, Standard, Workgroup and Enterprise.