A list of items I’ve come across recently that I found potentially useful, interesting, or just plain funny:
Monthly Archives: April 2010
A few days ago, I reviewed John Savill’s Complete Guide to Windows Server 2008 and now I’d like to introduce another book for Windows Server administrators – William Stanek‘s Windows Server 2008 Administrator’s Pocket Consultant 2nd Edition, published by Microsoft Press.
Whereas John’s book is a heavyweight reference volume for use by designers and administrators alike, the Administrator’s Pocket Consultant series is intended to be transportable – although it does push the definition of “pocket-sized” somewhat at almost 700 pages. Nevertheless, this book works through many of the key activities that a Windows Server administrator can expect to perform, organised in a logical flow: an overview; deployment; managing servers; monitoring services, processes and events; automating tasks, policies and procedures; enhancing security; using and administering Active Directory, including user and group account management; managing file systems, drives, volume sets and arrays, including file screening and storage reporting; sharing data (and auditing access); backing up and recovering data; managing TCP/IP; and administering network services such as printing, DHCP and DNS.
It’s the sort of book that can be used by an experienced administrator to just dip in and refresh their memory on a particular topic, or read from start to finish for an administrator who is new to Windows Server or updating their administration skills to include Windows Server 2008 R2.
And R2 is an important distinction – this book is not just in its second edition, it has been updated to cover Windows Server 2008’s second release (Windows Server 2008 R2), in that the Windows Server 2008 topics have been updated to include any changes that R2 has brought. What this book doesn’t cover, though, is some of the newer Windows Server roles and features – like administering Remote Desktop Services. And, whereas it talks in detail about the Distributed File System, new R2 features such as BranchCache barely get a mention, nor does DirectAccess.
It’s a difficult balance – after all, this is a pocket consultant – i.e. a smallish book to consult when you need to know something – but I’d really like such a guide to include all of Windows Server’s functionality – even if it can’t drill down into detail on them all (after all, it doesn’t claim to be a complete reference).
At the end of the day, this book aims to be practical, portable, and to provide answers for day-to-day administration of Windows Server 2008 R2. On the whole, it achieves that goal – it is well laid out, with plenty of illustrations and lists of actions to take for a given scenario – but it misses some key functionality that many Windows Server administrators will encounter.
If you’re looking for a portable Windows Server reference book, Windows Server 2008 Administrator’s Pocket Consultant 2nd Edition is available from all major booksellers – and you can use the code MVPT894 for a 40% discount on this book at Microsoft Press until the end of April 2010.
Microsoft’s Solution Accelerators have been around for a while now and, as the name suggests, are intended to accelerate the deployment of solutions built on Microsoft technology. Each solution accelerator is a free download from the Microsoft website but they don’t seem to be as well-known as they should be – with many IT organisations still producing their own documentation or purchasing third-party tools that duplicate this free-of-charge functionality.
One of the earliest solution accelerators I worked with was the Business Desktop Deployment (BDD) toolkit, which has since made major advances in its maturity and is now known as the Microsoft Deployment Toolkit (MDT). This is just one of the more commonly used accelerators though – the full list of solution accelerators covers a diverse set of technologies, from using Windows PE to create a malware removal kit, to migrating custom Unix applications, to discovering the ports used by Windows Server System products, to planning for payment card industry (PCI) compliance.
As the complete set of solution accelerators is so extensive, and growing, it’s not practical to go into detail about each one but here are just a few that technical architects and administrators might find useful:
- Infrastructure Planning and Design (IPD) Guides: intended to complement product documentation by focusing on infrastructure design options, each guide leads the reader through critical infrastructure design decisions, in the appropriate order, evaluating the available options for each decision against its impact on critical characteristics of the infrastructure. The IPD Series highlights when service and infrastructure goals should be validated with the organization and provides additional questions that should be asked of service stakeholders and decision makers.
- Microsoft Assessment and Planning (MAP) Toolkit: an agentless toolkit that finds computers on a network and performs a detailed inventory using Windows Management Instrumentation (WMI) and the Remote Registry Service. The data and analysis provided by this toolkit can significantly simplify the planning process for migrating to a variety of Microsoft products, including device driver availability and recommendations for hardware upgrades. MAP can also be used to gather performance metrics from computers being considered for virtualisation, before modelling a library of potential host hardware and storage configurations for “what-if” analysis.
- Microsoft Deployment Toolkit (MDT): MDT is the recommended process and toolset for automating Windows desktop and server deployment, providing unified tools and processes in a common deployment console, together with guidance documents for reduced deployment time and standardised desktop and server images, along with improved security and ongoing configuration management. MDT can integrate with System Center Configuration Manager (SCCM) 2007 and Windows deployment tools for zero touch deployment and, for those without an SCCM infrastructure, MDT makes use of Windows deployment tools for lite touch deployments.
- Microsoft Security Compliance Manager: Intended to reduce the time and cost associated with hardening the security of an infrastructure, this solution accelerator provides access to the complete database of Microsoft-recommended security settings so that baselines can be created and exported in multiple formats, including .XLS, Group Policy objects (GPOs), Desired Configuration Management (DCM) packs, or Security Content Automation Protocol (SCAP), to automate the security baseline deployment and compliance verification process.
- Service Level Dashboard Management Pack for System Center Operations Manager (SCOM): This dashboard integrates with SCOM 2007 R2 to assist in tracking, managing, and reporting on line-of-business (LOB) application service level compliance, displaying a list of applications and their performance and availability against a target service level agreement (SLA).
- Microsoft Operations Framework (MOF): Providing practical guidance for IT organisations, MOF reflects a single, comprehensive IT service lifecycle to help IT professionals connect service management principles to everyday IT tasks and activities in order to ensure alignment between IT and the business. Where ITIL is descriptive and describes “what to do”, MOF is prescriptive and provides the “how to do it” guidance.
- IT Compliance Management Guide: Intended for IT managers, professionals, and partners who configure Microsoft products to address specific IT governance, risk, and compliance (GRC) requirements, implementation of the recommendations in this series of guides allows enforcement and management of IT GRC requirements to be shifted onto the underlying Microsoft technologies.
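The agentless assessment approach described for the MAP toolkit above boils down to collecting an inventory of each machine and rolling it up against the requirements of a target product. Here is a minimal sketch of that roll-up step – the machine records, field names and thresholds are all invented for illustration and bear no relation to MAP’s actual (far richer) data model:

```python
# Illustrative readiness roll-up: check a hypothetical inventory against
# hypothetical minimum hardware requirements for an upgrade target.
MIN_RAM_MB = 1024
MIN_DISK_GB = 16

inventory = [
    {"name": "PC-001", "ram_mb": 2048, "disk_gb": 80},
    {"name": "PC-002", "ram_mb": 512,  "disk_gb": 40},
    {"name": "PC-003", "ram_mb": 4096, "disk_gb": 10},
]

def assess(machine):
    """Return a list of blocking issues for this machine (empty = ready)."""
    issues = []
    if machine["ram_mb"] < MIN_RAM_MB:
        issues.append("insufficient RAM")
    if machine["disk_gb"] < MIN_DISK_GB:
        issues.append("insufficient disk")
    return issues

report = {m["name"]: assess(m) for m in inventory}
ready = [name for name, issues in report.items() if not issues]
print(f"{len(ready)} of {len(inventory)} machines ready: {ready}")
```

The value of a tool like MAP is that it gathers the inventory remotely (via WMI and the Remote Registry Service) and already knows the requirements for each Microsoft product, so none of this has to be hand-rolled.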
There are also a huge number of specific solution accelerators for given technology scenarios: like servicing offline virtual machines, applying the principle of least user access (LUA) to user accounts on Windows XP, using Windows security and directory services with Unix, or server and domain isolation using IPSec and Group Policy, as well as product operations guides for Active Directory, DNS, DHCP, file services, print services, etc. and migration guidance for scenarios such as Novell NetWare to Windows Server or Oracle on Unix to SQL Server on Windows. These are just a few examples, so check out the full list of Microsoft Solution Accelerators for more options.
For more information on solution accelerators (e.g. new releases and updates), register for Microsoft’s Solution Accelerator Notifications newsletter.
I spent last week hanging out in West London at the Microsoft UK TechDays events in order to learn something more about the technologies I work with and something new about a few others. Thursday’s SQL Server/Business Intelligence sessions definitely fell into the latter category and I saw some pretty cool stuff, including PowerPivot.
Formerly codenamed Gemini, PowerPivot is part of the SQL Server 2008 R2 release but does not really need SQL Server. It is intended to provide business end users with access to distributed data. Available for both Excel (2010 only) and for SharePoint, it lets Excel power users know where data is, how to get to it, and how to share it.
PowerPivot allows users to pull in large quantities of data from disparate sources for fast analytics. It can store hundreds of millions of rows of data in Excel and runs analysis services on the client, with data stored in memory – not SQL. Column-based compression is used to reduce the data size on disk (the actual ratio depends on whether that data is textual or numeric – numeric compresses well, and that’s generally the sort of data that is used for this type of reporting). There is a limit of 4GB of address space and a 2GB file limit on disk but Microsoft state that’s not a limitation of PowerPivot – these restrictions mean that reports can be deployed to SharePoint later.
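The column-based compression principle is easy to illustrate with a toy run-length encoder: a column of repetitive values (typical of sorted numeric keys and measures) collapses to a handful of runs, whereas high-cardinality free text barely compresses at all. This is only an analogy – PowerPivot’s actual engine uses its own dictionary and run-length techniques – but the shape of the saving is the same:

```python
def run_length_encode(column):
    """Collapse consecutive repeated values into [value, count] pairs."""
    encoded = []
    for value in column:
        if encoded and encoded[-1][0] == value:
            encoded[-1][1] += 1
        else:
            encoded.append([value, 1])
    return encoded

# A sorted numeric column (e.g. a year key) repeats heavily...
numeric_column = [2008] * 500 + [2009] * 300 + [2010] * 200
# ...whereas free text rarely repeats consecutively.
text_column = [f"customer-{i}" for i in range(1000)]

numeric_encoded = run_length_encode(numeric_column)
text_encoded = run_length_encode(text_column)

print(len(numeric_encoded))  # 3 runs represent 1000 values
print(len(text_encoded))     # 1000 runs - no saving at all
```

This is why the compression ratio depends so heavily on whether the data is numeric or textual.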
In order to make PowerPivot easy for business users to use, Microsoft has come up with a set of Data Analysis eXpressions (DAX) that provide a simplified set of analysis functions using Microsoft Excel-like syntax. Business users can perform DAX functions and standard Excel calculations on the data, including the use of Excel slicers (a new feature, so that end users do not need to understand pivot tables in order to filter the data). Pivot tables, slicers and graphs can be built up to create a dashboard, which is just a few clicks away from being saved to SharePoint.
Once published, users can view a PowerPivot Gallery in SharePoint – including previews of reports – and an existing report can be used as a data source for a new report, using Atom for the publishing/syndication. Taking that a step further, PowerPivot for SharePoint allows pivots to be published as web applications for a team – and administrators can track usage of the dashboards that users create to discover those “apps” that are becoming business critical, in order to transition them to a state that is properly cleansed/governed.
Whilst SharePoint is the only supported platform for publishing PowerPivot reports (and obtaining management data), the data consumed by the pivots may originate from a variety of sources. It’s worth noting though that, if SharePoint is used, an Enterprise SharePoint platform is required in order to provide the Excel Services capability.
Whilst some are concerned about bringing together data from disparate sources, PowerPivot does not represent anarchy. The data may not have been cleansed (e.g. using fuzzy logic) but, if it needs to be governed with proper stewardship, it can always be brought into the data warehouse.
As for securing the data – if users can access the data, they can access it regardless of PowerPivot. Organisations should look to use rights management and other security mechanisms to protect data from information leakage.
In summary, PowerPivot allows:
- Analysis of external data within the context of corporate data.
- Analysis of large data sets beyond the limits of Excel.
- Sharing of insights.
- Consumption of reports as data sources.
- Easy access to data, without a major IT project.
- The ability to gather business requirements (e.g. identify commonly used reports) prior to implementing a fully managed reporting solution.
This week sees the annual Microsoft Management Summit (MMS) taking place in Las Vegas, with over 3500 attendees from around the world, even though there are many people stranded by the current flight restrictions in Europe. According to Microsoft, that’s 50% up on last year – and those delegates have access to 120 breakout sessions to learn about Microsoft’s vision and technology for IT management – across client devices, the datacentre and the cloud.
The keynote presentations are being streamed live but, for those who missed yesterday’s keynote (as I did) and who are waiting to hear today’s news, here are the main highlights from the event, as described by Paul Ross, a Group Product Marketing Manager for System Center and virtualisation at Microsoft.
Cloud computing is a major trend in the IT industry and many customers are trying to balance new models for elastic computing with trying to get the best TCO and ROI from their existing investments. There are those who suggest Microsoft doesn’t have a cloud strategy but it’s now 5 years since Ray Ozzie’s Internet Services Disruption memo, in which he set out Microsoft’s software plus services approach – and Steve Ballmer reinforced Microsoft’s cloud services vision earlier this year.
For many years, Microsoft has talked about the Dynamic Systems Initiative (DSI), later known as Dynamic IT, and the transition to cloud services is in line with this – model driven, service focused, unifying servers and management, thinking about services instead of servers, and automated management in place of manual approaches. Meanwhile, new deployment paradigms (e.g. virtualisation in the data centre) see customers shifting towards private and public cloud environments. But customers are experiencing a gap in the consistency of security models and application development between on-premise and cloud services – and Microsoft believes it is the key to allowing customers to bridge that gap and provide consistency of infrastructure across the various delivery models.
Some of the new products announced at this year’s MMS include the next version of System Center Virtual Machine Manager (SCVMM), slated for release in the second half of next year, and which will take a service-centric approach to management – including new approaches to deploying applications. Alongside SCVMM, System Center Operations Manager (SCOM) will also be updated in the second half of 2011 – itself making the transition to a service-centric model.
Before then, June 2010 will see the release to web of the Dynamic Infrastructure Toolkit for System Center which provides enterprise customers with the foundations for creating a private cloud with concepts such as on demand/self-service provisioning, etc.
Today’s keynote will focus on the shift from device-centric computing to a user-centric approach. Many organisations today operate separate infrastructures for different client access models – and there is a need for unification to manage IT according to end user requirements. Central to this vision is the need to unify the products used for security and management of the infrastructure, reducing costs and focusing on user-centric client delivery for the cloud.
Earlier this week, we heard about the beta for Windows Intune – offering security, management, Windows Update and MDOP benefits within a single subscription for small to medium-sized businesses. Today’s headlines are enterprise-focused and will include the announcement of the beta for System Center Configuration Manager (SCCM) 2007 R3 – focused on power management and unified licensing for mobile devices alongside traditional desktop clients. SCCM vNext (again, scheduled for the second half of 2011) will be focused on user-centric management – offering a seamless work experience regardless of whether applications are delivered via App-V, VDI, or a traditional application delivery approach. In addition, SCCM vNext will incorporate mobile device management (currently in a separate product – System Center Mobile Device Manager), allowing a single infrastructure to be provided (so, to summarise: that’s licensing changes in SCCM R3, followed by the technology in the next release).
In other news, we heard yesterday about the release of System Center Service Manager (SCSM) 2010 and System Center Data Protection Manager (SCDPM) 2010 – both generally available from June 2010. SCSM is Microsoft’s long-awaited service desk product – with 57 customers in production already and around 3000 on the beta – which Microsoft hopes will disrupt a service desk market that they describe as being “relatively stale”. Built as a platform for extension by partners, SCSM includes the concept of process packs (analogous to the management packs in SCOM) and Microsoft themselves are looking to release beta compliance and risk process packs from June, helping to grow out the product capabilities to cover a variety of ITIL disciplines. As for SCDPM, the product gains new enterprise capabilities including client protection (the ability to back up and recover connected client systems) – and both SCSM and SCDPM are included within the Enterprise CAL and Server Management Suite Enterprise licensing arrangements.
For some years now, Microsoft has been showing a growing strength in its IT management portfolio – and now that they are starting to embrace heterogeneous environments (e.g. Unix and Linux support in SCOM, ESX management from SCVMM), I believe that they will start to chip away at some of the territory currently occupied by “real” enterprise management products. As for that image of a company that’s purely focused on Windows and Office running on a thick client desktop, whilst that’s still where the majority of its revenue comes from, Microsoft knows it needs to embrace cloud computing – and it’s not as far behind the curve as some may believe. The cloud isn’t right for everyone – and very few enterprises will embrace it for 100% of their IT service provision – but, for those looking at a mixture of on-premise and cloud infrastructure, or at a blend of private and public cloud, Microsoft is in a strong position with a foot in either camp.
Thanks to everyone who attended the Windows Server User Group events last week with guest speakers Joey Snow and Dan Pearson (we hope you manage to get home soon).
For those who were interested in the slide decks, you can find links to them below:
- Windows Server migration
- BranchCache deep dive
- Windows crash dump analysis
- Windows performance troubleshooting and analysis
A couple of years back, I was invited out to the Microsoft Campus in Redmond to learn about Windows Server 2008. It was a fantastic week – not just because it was my first trip to Redmond but also because I met so many great people – many of whose work I had been reading in books, magazines and on the ‘net for years. One example was John Savill, who, at the time, was working on a book… a rather big book as it turns out – and his publishers sent me a copy to review.
It’s taken me some time (I did plan to use it for my MCSE to MCITP:EA upgrade in 2008) but here’s what I found when I read John Savill’s Complete Guide to Windows Server 2008, published by Addison Wesley.
At over 1700 pages, this is not a lightweight read. Having said that, its title of “complete guide” is pretty accurate – going right back to a history of Windows (although using the abbreviation of WNT for Windows NT is not something I’ve seen anywhere else, and was somewhat confusing). Although the book is written in a style that makes it very readable, its size means that it’s not something that can easily be read in bed, or on the train, or anywhere really – and that means it’s most useful as a reference book (a digital copy is available to purchasers of the hardback edition, but only for 45 days… not really much use for a book this size).
But what a reference book it is! I’ve read many texts on deploying Windows and none have ever taken me through a network trace of a PXE boot, removing the need to press F12, or the structure of the XML that describes a Windows image. Sure, we now have tools like the Microsoft Deployment Toolkit but John explores Windows Deployment Services (and the Windows Automated Installation Kit) in great detail – just the sort of detail I would need if I was an administrator looking to discover how Windows works and how to make it work for me. These are just a few highlights from one example of the 24 chapters (plus a how-to quick reference and index) – indeed, I’ll list them here to show the breadth of coverage for this book:
- Windows 101: Its origins, present, and the services it provides
- Windows Server 2008 fundamentals: navigation and getting started
- Installing and upgrading Windows Server 2008
- Securing a Windows Server 2008 deployment
- File system and print management features
- Advanced networking services
- Remote access/securing and optimising the network
- Terminal Services
- Active Directory Domain Services (introduction)
- Designing and installing Active Directory
- Managing Active Directory and advanced concepts
- Active Directory Federated Services, Lightweight Directory Services, and Rights Management
- Server core
- Distributed File System
- Deploying Windows
- Managing and maintaining Windows Server 2008
- Highly available Windows Server 2008
- Virtualisation and resource management
- Troubleshooting Windows Server 2008 and Vista environments
- Group policy
- The command prompt and PowerShell
- Connecting Windows Server to other environments
- Internet Information Services
Each chapter goes into great detail, with plenty of screen shots, and command line output; yet remains extremely readable because the approach taken is to set the scene, before drilling down into the detail – rather than swamping the reader with a mountain of technical know-how.
If I had one tiny criticism, I’d say that there were a (very) few occasions when it left me hanging by referring me to the Microsoft website for more information (e.g. for details of storing BitLocker encryption keys in Active Directory); however, in general, this book provided me with the right balance between readability and technical detail – and I would not hesitate to recommend this text to anyone who works with, or is looking to learn about, Windows Server 2008.
This is the week of the Microsoft Management Summit in Las Vegas and, as well as the whole load of System Center-related announcements that we can expect this week, Microsoft has formally announced the beta of a new cloud-based PC management service called Windows Intune.
Designed for customers who have 25-500 PCs, Windows Intune is intended to provide a cloud-based desktop management service in the way that BPOS does for business productivity applications. Aimed squarely at the mid-market, Windows Intune (formerly known as System Center Online Desktop Manager) allows smaller organisations to gain some insight into what’s happening in their PC estate, avoiding the high infrastructure costs associated with enterprise products (and even System Center Essentials needs a server on site).
All that’s required on the PC is an Internet connection (and an agent, which Microsoft described as “lightweight”) but also included in the service is a license for Windows 7 Enterprise Edition and the MDOP technologies – that’s a single license purchase for a lot of functionality! Microsoft is making the beta available today but interested customers will have to move quickly – it’s limited to 1000 users in the US, Canada, Mexico and Puerto Rico only – Europe and Asia will follow within a year.
For those organisations that are not quite ready for Windows 7, the license with Intune can be downgraded to Windows XP Professional or Windows Vista Business.
Administrators simply need an Internet connection and a Silverlight-capable browser to access a console which provides a system overview showing a rolled-up status including malware protection, updates, agent health (offline clients) and reports on operating system alerts (e.g. disk fragmentation) along with a number of workspaces – currently:
- Computers – which may be organised into groups and subgroups (e.g. to assign policies and reports). Any groups are completely inside Intune and are nothing to do with Active Directory (a computer can exist within multiple groups). It’s also possible to drill down and expose details for each computer (updates, alerts, malware status, etc.).
- Updates – a roll-up of all updates together with the ability to drill down on update type (i.e. security, critical, definition, service packs, update rollups, mandatory updates) and to apply filters to see which updates are waiting to be approved.
- Malware protection – showing which clients have been infected and any resulting action – including integration with the endpoint protection encyclopedia (with the Microsoft Malware Protection Center).
- Alerts – for malware protection, monitoring, notices, policy, remote assistance, system or updates.
- Software – an automatic inventory reports details about the machine itself and installed software, which may be printed or exported as a CSV file.
- Licenses – the ability to track licenses within Software Assurance (SA) agreements by entering the agreement numbers, correlating installed software with purchased software (for Microsoft products only). Microsoft were keen to highlight that privacy will be taken seriously, with third-party audit ensuring that the information is private to customers and not used by Microsoft to enforce its licensing. In addition, the entering of SA agreement details is optional and the service will function without this information.
- Policy – controlling how Intune and clients function, including agent settings (template driven, but not using Group Policy – indeed, Group Policy will override in any conflict), tools settings, and firewall settings (Intune communicates over HTTP, and the agent installation will also open remote management functionality).
- Reports – providing a snapshot of status.
- Administration – each computer is identified by a download/installation and multiple administrators may be defined for the service, with notifications on particular alerts (i.e. by e-mail).
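The licensing correlation described in the Licenses workspace above is, at its heart, a reconciliation of installed counts against purchased entitlements. A hypothetical sketch of that idea – the product names and counts below are invented, and Intune’s real logic is of course tied to SA agreement data:

```python
# Invented example data: purchased entitlements vs. discovered installs.
purchased = {"Office 2007": 100, "Visio 2007": 10}
installed = {"Office 2007": 112, "Visio 2007": 8, "Project 2007": 3}

def reconcile(purchased, installed):
    """Return {product: surplus_or_deficit}; a negative value means under-licensed."""
    products = set(purchased) | set(installed)
    return {p: purchased.get(p, 0) - installed.get(p, 0) for p in products}

position = reconcile(purchased, installed)
under_licensed = sorted(p for p, delta in position.items() if delta < 0)
print(under_licensed)
```

The interesting design point is what Intune promises *not* to do with the result: the data stays private to the customer and is not used by Microsoft to enforce its licensing.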
From a client experience perspective, the Windows Intune Tools can be used by an end user to request help via Easy Assist (by sending an urgent alert to the Intune service – this has to be user-initiated and the administrator cannot arbitrarily take control of a client) and the end user can also check their update status with regard to Windows Update and malware protection.
Those who have worked with Microsoft Security Essentials may be interested to note that:
- Windows Intune will work on servers, but this is not supported.
- Malware protection is provided by the common malware protection engine (from Forefront) with the user interface from Microsoft Security Essentials (“at the moment”). The use of the Forefront scanning engine allows for reporting and policy control that is not present in Microsoft Security Essentials.
In summary, Windows Intune is intended as an easy-to-use cloud-based solution for small-medium businesses that requires little or no infrastructure and remains up-to-date. It is not an enterprise solution (it’s certainly not a replacement for System Center Configuration Manager) but it is a useful way to license Windows 7 and prepare for Windows 8.
For more information as the beta progresses, check out the Windows Intune Team Blog.
After much speculation (including some from myself), Microsoft has announced that Office 2010 has been released to manufacturing. I’ve been using the community technology preview, beta, and release candidate versions of the product since last summer and I have to say that there are quite a few features that have become productivity enhancements for me – and that it’s pretty unusual for a mature product to include this sort of innovation.
I’m not going to run through all the features in this post (I did talk about a few of them in an earlier post and I hope to expand on that when I get time) but I thought I’d call out a few pointers that might be useful for organisations looking at deploying the new Office products, based on the presentation by Reed Shaffner, Senior Product Manager for Microsoft Office, at the UK TechDays Office 2010 event on 13 April 2010.
First up is the fact that, aside from a slightly larger memory footprint, the specifications to run Office 2010 are unchanged from 2007. That means that the same Windows Vista-class PCs that can run Windows 7 will also cope well with Office 2010 – although there are other reasons to look at updating PCs, as the age of the device is a factor in overall TCO, at least according to Gartner.
All volume licensing editions of Office 2010 require activation – that means that enterprises need to be looking at deploying a key management service (KMS) in order to avoid abuse of multiple activation keys (MAKs).
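One reason KMS suits enterprises is that a KMS host only starts issuing activations once a minimum number of clients have contacted it. A toy model of that gating behaviour – the threshold value here is purely illustrative, since the real minimums differ by product (Windows clients and Office use different values):

```python
# Toy model of KMS-style activation gating on a minimum client count.
ACTIVATION_THRESHOLD = 5  # illustrative only; real thresholds vary by product

class KmsHost:
    def __init__(self, threshold=ACTIVATION_THRESHOLD):
        self.threshold = threshold
        self.clients = set()

    def request_activation(self, client_id):
        """Register the client; activation succeeds once the count threshold is met."""
        self.clients.add(client_id)
        return len(self.clients) >= self.threshold

host = KmsHost()
results = [host.request_activation(f"pc-{i}") for i in range(1, 7)]
print(results)  # early requests fail until the threshold is reached
```

This count-based gate is part of what makes KMS harder to abuse than a MAK, which activates any machine that presents the key.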
Office 2010 has a new 64-bit product edition – although, unless the ability to access large amounts of memory is important, Microsoft recommends running 32-bit (even on 64-bit versions of Windows) due to incompatibilities with plugins. And, talking of incompatibilities, Office 2010 is designed for tighter integration with App-V, such that, if App-V v4.6 is used to deploy Office applications, many of the previous application virtualisation issues are fixed (e.g. the hooks to open files in SharePoint, send to the OneNote print driver, send to mail, etc.) by using proxies to activate them when Office applications are running virtualised.
Office 2010 has a number of security features – recognising that, as the operating system is better protected, malware attacks have moved up the stack to the application layers, concentrating on weaknesses in file formats and document parsing. For example, if the Office file validation checks fail, a warning is displayed and the user has to go into Office Backstage and explicitly select to ignore warnings and edit anyway. More commonly, documents that have originated elsewhere may open in Protected View and give the user the option to enable editing. In addition, applications such as PowerPoint remove any content that they do not understand (i.e. which is potentially harmful) from the document structure.
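Office file validation performs a much deeper structural check than this, but the basic idea – does the file’s on-disk container match what its extension claims? – can be sketched by checking a file’s leading “magic” bytes (the signatures below are the well-known ones for the legacy OLE compound file and the ZIP container used by Office Open XML):

```python
# Toy illustration of container-format validation via leading magic bytes.
import os

OLE_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # legacy binary .doc/.xls/.ppt
ZIP_MAGIC = b"PK\x03\x04"                        # Office Open XML (ZIP container)

EXPECTED = {".doc": OLE_MAGIC, ".xls": OLE_MAGIC,
            ".docx": ZIP_MAGIC, ".xlsx": ZIP_MAGIC}

def looks_valid(path):
    """True if the file's leading bytes match the container its extension claims."""
    magic = EXPECTED.get(os.path.splitext(path)[1].lower())
    if magic is None:
        return False  # unknown extension - treat as suspect
    with open(path, "rb") as f:
        return f.read(len(magic)) == magic
```

A file renamed to .docx but actually containing, say, an executable would fail a check like this and could then be opened in Protected View rather than being trusted outright.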
When preparing a document for sharing, in addition to the option to inspect a document for hidden properties, there is a new accessibility checker that can ensure a document complies with various accessibility standards – including the Authoring Tool Accessibility Guidelines (ATAG) and the Web Content Accessibility Guidelines (WCAG) v2.0. This accessibility checker educates the user on how and why to fix a document – and is enforceable via group policy.
Document compatibility is another concern and, whilst third party tools are available from partners such as Converter Technology, there are Microsoft tools that can help too:
- The Microsoft Assessment and Planning (MAP) toolkit v5.0 includes an Office 2010 Readiness Assessment which looks at hardware compatibility and provides recommendations and assessments in order to plan an Office 2010 migration, for example identifying add-ins and interfaces, tagging applications that are known to be incompatible, and mitigating VBA/macro code.
- The Office Migration Planning Manager (OMPM) is a collection of tools to prepare an environment for migration to Microsoft Office 2010. OMPM can be used to scan for file format issues, identify potential macro issues, and to migrate legacy Office files to the Office Open XML format.
- The Office Environment Assessment Tool (OEAT) is another scanner that can be run against multiple systems and provide rolled-up reports on the overall environment.
- The Office Code Compatibility Inspector (OCCI) compares existing (legacy) code against the Microsoft Office 2010 object model to identify possible compatibility issues, scanning macros and inserting comments where problems are found.
- Other resources are also available for organisations looking at Office 2010 deployment.
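To illustrate the kind of file-format inventory that OMPM’s scanner produces, here is a toy Python sketch (this is not OMPM itself, and the extension lists are simply the well-known legacy binary and Office Open XML formats) that walks a directory tree and tallies the two categories:

```python
import os
from collections import Counter

# Well-known extensions for legacy binary Office formats and for the
# newer Office Open XML formats (a simplified, illustrative list).
LEGACY = {".doc", ".xls", ".ppt"}
OPENXML = {".docx", ".xlsx", ".pptx"}

def scan_office_files(root):
    """Walk a directory tree and tally legacy vs. Open XML Office files."""
    counts = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext in LEGACY:
                counts["legacy"] += 1
            elif ext in OPENXML:
                counts["openxml"] += 1
    return counts
```

A report like this gives a rough sense of how much conversion work a migration to the Open XML formats would involve; the real OMPM scanner goes much further, inspecting the files themselves for conversion and macro issues.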
Finally, just as with Office 2007, Microsoft plans to produce an Office Productivity Hub – in both standalone HTML and SharePoint 2010 formats – to provide self-help content for end users (effectively a help portal for Office), along with an interactive command reference guide (for help in getting to know the Fluent/ribbon user interface) and an enterprise learning framework (to help organisations develop a training and communication plan for employees during deployment of Office). All of these are expected to become available over the next few weeks.
Over the last few years, I’ve attended (and blogged in detail about) a couple of “after hours” events at Microsoft – looking at some of the consumer-related things that we might do with our computers outside of work (first in May 2007 and then in November 2008).
Tonight I was at another one – an evening event to complement the UK TechDays events taking place this week in West London cinemas – and, unlike previous after hours sessions, this one didn’t even try to push Microsoft products at us (previous events felt a bit like Windows, Xbox and Live promotions at times) – it just demonstrated a whole load of cool stuff that people might want to take a look at.
I have to admit I nearly didn’t attend – the daytime UK TechDays events have been a little patchy in terms of content quality, and I’m feeling slightly burned out after what has been a busy week, with two Windows Server User Group evening events on top of UK TechDays and the normal work e-mail triage. I’m glad I made it though, and the following list is just a few of the things we saw Marc Holmes, Paul Foster and Jamie Burgess present tonight:
- A discussion of some of the home network functionality that the guys are using for media, home automation etc. – predictably a huge amount of Microsoft media items (Media Center PCs, Windows Home Server, Xbox 360, etc.) but also the use of X10, Z-Wave or RFXcom for pushing USB or RF signals around for home automation purposes, as well as Ethernet over power line for streaming from Media Center PCs. Other technologies discussed included: Logitech’s DiNovo Edge keyboard and Harmony One universal remote control; SiliconDust HD HomeRun for sharing DVB-T TV signals across Ethernet to PCs; using xPL to control home automation equipment.
- Lego Mindstorms NXT for building block robotics, including the First Lego League – to inspire young people to get involved with science and technology in a positive way.
- Kodu Game Lab – a visual programming language made specifically for creating games that is designed to be accessible for children and enjoyable for anyone.
- Developing XNA games with XNA Game Studio and Visual Studio, then deploying them to Xbox or even running them in the Windows Phone emulator! Other related topics included the use of the Freescale Flexis JM Badge board to integrate an accelerometer with an XNA game and GoblinXNA for augmented reality/3D games development. There’s also a UK XNA user group.
- A look at how research projects (from Microsoft Research) move into Labs and eventually become products, after developers have optimised and integrated them. Microsoft spent $9.5bn on research and development in 2009 and some of the research activities that have now come to life include Photosynth (which became a Windows client application and is now included within Silverlight) and the Seadragon technologies, which also became a part of Silverlight (Deep Zoom) and are featured on the Hard Rock Cafe Memorabilia site. A stunning example is Blaise Aguera y Arcas’ TED 2010 talk on the work that Microsoft is doing to integrate augmented reality maps into Bing – drawing on the Seadragon technologies to provide fluidity whilst navigating maps in 3D – and that environment can be used as a canvas for other things, like streetside photos (far more detailed than Google Streetview). In his talk (which is worth watching and embedded below), Blaise navigates off the street and actually inside Seattle’s Pike Place Market, before showing how the Microsoft imagery can be integrated with Flickr images (possibly historical images, for “time travel”) and even live video broadcasts. In addition to this telepresence (looking from the outside in), points of interest can be used to look out from ground level and get details of what’s around – even looking up at the sky shows integration with the Microsoft Research WorldWide Telescope.
- Finally, Paul spoke about his creation of a multitouch (Surface) table for less than £100 (using CCTV infrared cameras, a webcam with the IR filter removed and NUI software – it’s now possible to do the same with Windows 7) and a borrowed projector before discussing his own attempts at virtual reality in his paddock at home.
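The xPL protocol mentioned above is one of the more approachable pieces of the home automation picture: it is a plain-text message format broadcast over UDP (the standard xPL port is 3865), so it is easy to experiment with from any language. A minimal Python sketch follows – the source identifier and X10 device address are made up for illustration:

```python
import socket

XPL_PORT = 3865  # standard UDP port for xPL traffic

def build_xpl_command(source, schema, body, target="*"):
    """Build an xPL command message in the protocol's plain-text block format."""
    lines = ["xpl-cmnd", "{", "hop=1", f"source={source}", f"target={target}", "}",
             schema, "{"]
    lines += [f"{key}={value}" for key, value in body.items()]
    lines.append("}")
    return "\n".join(lines) + "\n"

def send_xpl(message):
    """Broadcast an xPL message on the local network (assumes a broadcast-capable interface)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(message.encode("ascii"), ("255.255.255.255", XPL_PORT))
    sock.close()

# Example: an x10.basic command to switch on the device at address a1
# ("mwhite-demo.lounge" is a hypothetical vendor-device.instance identifier).
msg = build_xpl_command("mwhite-demo.lounge", "x10.basic",
                        {"command": "on", "device": "a1"})
```

An xPL hub or bridge (such as the RFXcom-based kit mentioned above) listening on the network would pick up the broadcast and translate it onto the X10 powerline or RF signal.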
Whilst I’m unlikely to get stuck into all of these projects, there is plenty of geek scope here – I may have a play with home automation and it’s good to know some of the possibilities for getting my kids involved with creating their own games, robots, etc. As for Blaise Aguera y Arcas’ TED 2010 talk, it was fantastic to see how Microsoft still innovates (I only wish that all of the Bing features were available globally… here in the UK we don’t have all of the functionality that’s available stateside).