Springboard Events: Windows 7 Application Compatibility: Your Questions Answered (Part 1)

Here are the details on the latest Springboard event from Microsoft.

Date: Thursday, June 18th
Time: 11:00am Pacific Time

Windows 7 is approaching fast and, from the application standpoint, is very similar to Windows Vista.  We’re going to examine Windows 7 application compatibility not only from the perspective of moving from Windows Vista, but also for those coming from Windows XP.  Join us to discuss the most common challenges around application compatibility when coming from a legacy operating system, why changes were made along the way, the compatibility technologies inside the OS, and methods for getting incompatible applications to run on Windows 7.  Along the way we’ll share tips and tricks, demonstrate free tools to analyze and fix applications, and answer your specific questions about application compatibility live.

In Part 2 of this Virtual Round Table discussion (planned for later this summer/fall), we’ll discuss the options and approaches for using virtualization tools in depth to address application incompatibilities – including presentation virtualization, desktop virtualization and application virtualization.  We’ll be sending out more details and posting information to www.microsoft.com/springboard for Part 2 as the dates are finalized.

Xbox Live Disconnected and Windows 7 Beta/RC Media Center Extender

Since I got my Xbox 360 recently, it’s become a central part of my home entertainment.  Not only am I playing games on it, but I’ve hooked up the media extender to a Windows 7 RC PC.  The PC has my MP3 and movie collection on it.  This means that instead of having the PC in my sitting room and watching on a monitor or creating a cable nightmare, I can stream the media over wifi from the PC to the Xbox and then on to the TV.

I was up early yesterday morning and was going to watch the second half of a movie I’d yet to finish.  I fired up the Media Center connection on the Xbox 360 and it failed, saying I hadn’t logged in.  Being an engineer, I went about diagnosing.  The Xbox and router were restarted.  I verified the Xbox was on the wifi.  I verified internet access was OK via the PC and that the Xbox was talking to the PC.  But why the hell did the Xbox 360 want to talk to Xbox Live for an internal operation like streaming a movie from my PC to the Xbox?

I went onto the Xbox status page.  It took an eternity to load and eventually I learned that there was a scheduled maintenance window that would last up to 24 hours.  Why the hell didn’t we get a notification?  I saw someone posting that Gold members did get a mail.  I sure didn’t.  I know if I brought our hosting service offline without telling our clients, I’d get the boot.

Anyway, on Twitter, “whoisaaron” sent me a link.  There was another link from there to the XBox Live Operations blog; a new site.  There I read the following regarding the media extender failing to connect to a Windows 7 pre RTM (beta or RC) media center PC:

“You’re getting an error that states ‘This game requires a connection to Xbox LIVE’.

This requirement is unique to our pre-release operating systems such as the Windows 7 Beta.  Released OSes like Windows Vista – or Windows 7 when it ships later this year – do not have this requirement.  If you were surprised by this, we’re sorry.

Again, to be perfectly clear: the final version of Windows 7 will not require a connection to Xbox LIVE to use the extender functionality”.

So, you should have been OK with XP or Vista.

Microsoft IIS Search Engine Optimization Toolkit Beta

Microsoft has released a beta (pre-production and for test only) toolkit for bumping up the position of the sites on your IIS servers in search results, i.e. Search Engine Optimisation or SEO.  This toolkit includes “the Site Analysis module, the Robots Exclusion module, and the Sitemaps and Site Indexes module, which let you perform detailed analysis and offer recommendations and editing tools for managing your Robots and Sitemaps files.

Site Analysis Features

  • Fully featured crawl engine named ‘iisbot’
    • Configurable number of concurrent requests to allow users to crawl their Web site without incurring additional processing. This can be configured from 1 to 16 concurrent requests.
    • Support for Robots.txt, allowing you to customize the locations where the iisbot should analyze and which locations should be ignored.
    • Support for Sitemap files allowing you to specify additional locations to be analyzed.
    • Support for overriding ‘noindex’ and ‘nofollow’ metatags to allow you to analyze pages to help improve customer experience even when search engines will not process them.
    • Configurable limits for analysis, maximum number of URLs to download, and maximum number of kilobytes to download per URL.
    • Configurable options for including content from only your directories or the entire site and sub domains.
  • View detailed summary of Web site analysis results through a rich dashboard
  • Feature rich Query Builder interface exposing large amounts of data
  • Quick access to common tasks
  • Display of detailed information for each URL
  • View detailed route analysis showing unique routes to better understand the way search engines reach your content
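To give a feel for what the ‘noindex’/‘nofollow’ override above is working around, here’s a small Python sketch (the HTML snippet is invented; this is not the toolkit’s code) that extracts robots directives from a page’s meta tags using the standard library:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the directives from <meta name="robots"> tags, e.g. the
    'noindex' and 'nofollow' hints that tell search engines to skip a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            content = a.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

finder = RobotsMetaFinder()
finder.feed('<head><meta name="robots" content="noindex, nofollow"></head>')
print(finder.directives)  # ['noindex', 'nofollow']
```

A normal search engine would stop at those directives; the toolkit’s override lets its own crawler analyze the page anyway.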

Robots Exclusion Features

  • Display of robots content in a friendly user interface
  • Support for filtering, grouping, and sorting
  • Ability to add ‘disallow’ and ‘allow’ paths using a physical view of your Web site
  • Ability to add ‘disallow’ and ‘allow’ paths using a logical view of your Web site from the result of site analysis processing
  • Ability to add sitemap locations

Sitemap and Sitemap Index Features

  • Display of sitemaps and sitemap index files in a simple user interface
  • Support for grouping and sorting
  • Ability to add/edit/remove sitemap and sitemap index files
  • Ability to add new URL’s to sitemap and sitemap index files using a physical view of your Web site
  • Ability to add new URL’s to sitemap and sitemap index files using a logical view of your Web site from the result of site analysis processing
  • Ability to register a sitemap or sitemap index into the robots exclusion file”
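To illustrate the robots exclusion rules the toolkit edits, here’s a quick Python sketch (the paths are made up) showing how a crawler such as iisbot would interpret a Robots.txt file, using the standard library’s robotparser:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical Robots.txt with one 'disallow' and one 'allow' path.
robots_txt = """\
User-agent: iisbot
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The crawler must skip the disallowed path but may fetch everything else.
print(rp.can_fetch("iisbot", "http://example.com/private/report.aspx"))  # False
print(rp.can_fetch("iisbot", "http://example.com/default.aspx"))         # True
```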

There is a concern that something like this would cause an undue load on the IIS server/sites.  Microsoft responds to this with instructions for using a configurable setting.  This allows you to set the maximum number of concurrent requests created by the SEO Toolkit.
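The idea behind that setting is simple.  As a hypothetical sketch (not the toolkit’s actual implementation), a crawler can clamp its worker pool to the 1–16 range the toolkit allows:

```python
from concurrent.futures import ThreadPoolExecutor

def crawl(urls, fetch, max_concurrent=8):
    """Fetch each URL with at most max_concurrent requests in flight,
    clamped to the 1-16 range the SEO Toolkit permits."""
    workers = max(1, min(16, max_concurrent))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves the input order of the URLs.
        return list(pool.map(fetch, urls))

# Stand-in fetch function so the example runs without a web server.
print(crawl(["/a", "/b", "/c"], lambda url: "GET " + url))
```

Setting the limit to 1 effectively serialises the crawl, which keeps the load on a production server minimal at the cost of a slower analysis.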

Windows Media Services 2008 R2 Release Candidate

Microsoft has released an updated version of the RC for Windows Media Services (WMS) 2008 R2.  You may have noticed that some components for IIS are no longer included in the Windows Server media, e.g. FTP and WebDAV.  Why?  Microsoft appears to have taken a similar strategy with IIS to the one they took with Live Essentials for Windows 7.  Their thinking is that separating the release schedules will allow them to update the web components more frequently, e.g. we’ve already had two RTM releases of FTP (7.0 and 7.5) for IIS 7.X.  It also allows the product teams to keep working on something until they have it right, rather than being tied to the broader Server schedule.

Windows Media Services 2008 R2 RC promises:

  • “More Options. Installs on all Windows Server 2008 R2 Release Candidate (6.1.7100) editions, including the following:

    • Windows Server 2008 R2 Datacenter
    • Windows Server 2008 R2 Enterprise
    • Windows Server 2008 R2 Standard
    • Windows Server 2008 R2 Foundation – new option for WMS!
    • Windows Web Server 2008 R2
  • More Value. All unicast streaming features are now available on all versions of Windows Server 2008 R2 (see table, below).
  • Higher Reliability. Already-proven on-demand and live streaming reliability further improves in Windows Server 2008 R2.
  • Easier Deployment. Fewer installation packages makes deploying WMS simpler:

    • One package for all editions of Windows Server 2008 R2
    • Two packages for the WMS Remote Server Administration Tool (for use on 32-bit and 64-bit Windows 7 clients)
  • Better Virtualization. Hyper-V for Windows Server 2008 R2 improves performance and manageability of streaming VMs.
  • Reduced Power Consumption. Windows Server 2008 R2 can dynamically adjust power usage as streaming loads change.
  • Multicast for Silverlight. With the new Silverlight Multicast Plug-in, WMS can now deliver multicast streams to Silverlight”.
You should check out the MS post to see the feature differences between running WMS on Server Standard vs the Enterprise/Datacenter editions.

Microsoft also reminds us that the “easiest way to get started with the WMS 2008 R2 is by running the Microsoft Web Platform Installer 2.0 Beta on your Windows Server 2008 R2 Release Candidate server”.

IIS Database Manager Release Candidate

This is going to be a pretty popular addition to IIS 7.X for application developers.  The idea is that the Database Manager scans for database connections in your web.config file and lets you perform basic administrative tasks via a web console instead of logging in via SQL Management Studio.  No ports need to be opened for administration.  That’s a tidy solution for remotely hosted websites.  However, I’d still recommend revealing as little as possible to the net; you’re better off logging into the hosted server via RDP and using the Management Studio console.  But if you want, you can use this solution.  It’ll definitely be a player for internal deployments.  Features include:

  • Add, rename, drop, and edit tables
  • View and manage primary keys, indexes and foreign keys
  • Data editing
  • Establish connections to multiple databases
  • Create and execute queries
  • Create, alter and delete stored procedures
  • Create, alter and delete views
  • Manage stored procedures
  • Manage both local and remote databases from your machine
  • IIS 7.0 Manager provides remote management capabilities with a clean firewall-friendly option for managing a remote SQL Server
  • Exposes a public extensibility platform that enables the development of providers to support other databases
  • Compatible with SQL 2008 and 2005
It’s a Release Candidate (not for production yet).  MS are promising some documentation soon.
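To give a feel for what “scans for database connections in your web.config” means, here’s a small Python sketch; the web.config fragment, connection name and paths are all invented for illustration, and the Database Manager itself does this inside IIS Manager rather than in a script:

```python
import xml.etree.ElementTree as ET

# Hypothetical web.config fragment with one connection string entry.
WEB_CONFIG = """\
<configuration>
  <connectionStrings>
    <add name="ShopDb"
         connectionString="Data Source=.\\SQLEXPRESS;Initial Catalog=Shop;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>
"""

def connection_strings(config_xml):
    """Return a {name: connectionString} map of every <add> entry
    in the <connectionStrings> section."""
    root = ET.fromstring(config_xml)
    return {
        add.get("name"): add.get("connectionString")
        for add in root.findall("./connectionStrings/add")
    }

print(connection_strings(WEB_CONFIG))
```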

Synthetic Transaction Monitoring

HP posted a blog entry where they talk about a survey they did last year – probably with very large customers.  They asked the companies if they used synthetic transaction monitoring.

What the hell is that?  The idea is that your monitoring solution can instruct an agent or agents to perform tasks against a business service to get a real-world view of its health and performance.  This user perspective gives you so much more than just testing the up/down status of the web service or the SQL services.  It’s not a replacement for those monitors, but it does complete the monitoring solution.

According to HP, “about half the respondents said they did. This ties in with a recent Aberdeen study that found 57% of companies didn’t do user experience monitoring”.

If you’re using Microsoft System Center Operations Manager 2007 or 2007 R2 then you have the ability to do this out of the box for your web applications.  Not only that, but you can also test TCP ports and databases using OLE DB.

OpsMgr includes a set of monitoring templates.  Using a simple wizard you can very quickly capture a browser session.  You then assign this new monitor to an agent or agents and define how often it will run.  Those agents will perform the web browsing session you just captured.  If the session fails or is too slow, an alert is raised.  You can go on to build a distributed application using this new monitor, combining the health of web services, SQL services, network devices and the user perspective for the application you are monitoring.  You can take this even further and use the new SLA monitoring pack to see if the application meets the availability requirements of your business or customers.
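In spirit, a synthetic web transaction boils down to something like this Python sketch (OpsMgr’s captured browser sessions are far richer; the thresholds here are invented for illustration):

```python
import time
import urllib.request

def synthetic_check(url, max_latency=2.0, timeout=5.0,
                    opener=urllib.request.urlopen):
    """One synthetic transaction: fetch the URL, time it, and classify
    the result as 'healthy', 'slow' or 'failed' - the states a
    monitoring agent would raise alerts on."""
    start = time.monotonic()
    try:
        with opener(url, timeout=timeout) as response:
            status = response.status
    except Exception:
        return ("failed", None)
    latency = time.monotonic() - start
    if status != 200:
        return ("failed", latency)
    if latency > max_latency:
        return ("slow", latency)
    return ("healthy", latency)
```

An agent would run this on a schedule and feed the state changes into the alerting pipeline; the OpsMgr wizard handles the scheduling and alerting for you.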

If you’re feeling really adventurous then try this.


In this solution we have a web service that provides an application to customers on the Internet.  The monitoring solution is in the same site as the web service.  How can you get a user’s perspective on that when you don’t take the Internet into account?  The solution is simple: lease a virtual hosted server from a service provider.  Punch a hole through the firewalls to allow the encrypted agent traffic from the new hosted agent server to an OpsMgr management server.  The agent will be certificate-enabled so it can reside outside of the AD forest, i.e. in a workgroup.  Now, if there is an Internet connection issue at the web server site, you will get an alert from the agent.

We use synthetic web transactions at work for our customers.  We know immediately when there’s an issue.  In fact, we recently had an alert where a developer had made a mistake in a website.  Without the alert, it’s possible that this might have gone unnoticed by the hosting company for some time, thus losing customers.

Check out the solution and test it out.  Once you’re happy, add it to your alert subscriptions and you’ll soon see how powerful this solution is.

EDIT 1:

I should have added something in here.  What do you do if you’ve got no monitoring solution, or if you’re in a (dedicated/virtual server) hosting environment with no monitoring – or worse, monitoring you don’t trust?  You can do something pretty similar to the above using an outsourced monitoring service.


In this solution the owner of a hosted web service has subscribed to an outsourced monitoring solution.  The OpsMgr server(s) perform normal agent monitoring using x.509-enabled agents or an OpsMgr Gateway on the hosted servers.  That requires just one TCP port to be opened outbound on the hosting firewall.  The OpsMgr server also performs synthetic transactions against the web site(s).  Optionally, the customer could just get synthetic transaction monitoring to verify that the hosting operator is living up to their contracted SLA requirements.  You could even do this with a cheap’n’cheerful web hosting plan.

The Final Chapter of the VMware/Hyper-V Crash

The story is over.  The guy behind the YouTube video of Hyper-V crashing, who claimed it was unstable, finally admitted he was wrong and removed the video.  It turns out that the VMware lab didn’t meet the minimum requirements for running Windows Server 2008.  When MS brought their lab down to the same configuration, they were able to reproduce a crash of a VM.  After some digging they found that a year-old, publicly available patch fixed the situation.

The lesson here is: don’t believe the FUD that VMware marketing is putting out there, e.g. “Hyper-V = ESXi”.  Yes, MS marketing creates plenty of lame ducks too, but I’ve never seen them do anything like VMware.

Troubleshooting Virtual Machine Manager 2008

The VMM team published a guide on how to troubleshoot VMM 2008 in 7 different scenarios:

1. I have a Windows host that is in a not-responding state.  Use VMMCA 2008 to check the host for commonly known issues.
2. A Windows host shows a status of “Needs Attention”.  Use VMMCA 2008 to check this host for commonly known issues.
3. I am using SCVMM 2008 to manage a VMware environment, but I am unable to create new virtual machines in the VMware environment.  Use VMMCA 2008 to evaluate the Virtual Center environment for commonly known issues.
4. I am using the integration between SCVMM 2008 and Operations Manager, but I am unable to see a particular VM or host in the integrated mapping view.  Use VMMCA 2008 to check the host or virtual machine for an Operations Manager agent.
5. I am using SCVMM 2008 to do P2V conversions and the job is failing.  Use VMMCA 2008 to check the P2V source for commonly known issues.
6. I tried to create a virtual machine and it failed during customization and installation of virtual guest services.  Use VMMCA on the Windows host to find commonly known issues.
7. I know that VMM and Hyper-V hosts require certain updates for steady VMM operations.  Use VMMCA to check your VMM server and Hyper-V hosts for required updates.

No IE In European Distribution Of Windows 7

In a move prompted by the badgering of those unelected, dictatorial Eurocrats, Microsoft announced a couple of days ago that they will not be including Internet Explorer in the European distributions of Windows 7.  OEMs will be free to include any browser they feel like, or none at all.

To me, this is another case of taxpayers’ money being wasted.  The European Union central “government” continues to bloat with the hiring of staff from just a couple of countries (what percentage of their staff are non-native French speakers?) and wastes money and time on things as trivial as web browsers, crisp flavours, and defining what is chocolate.

Right now, there is nothing to stop anyone from choosing whatever web browser or media player they want.  I doubt the inclusion of IE has impacted the success of Firefox, Safari or Chrome.  In fact, they seem to do quite well.

How much of a waste of time is this?  How many times have you seen a Windows N edition?  According to The Register, “sales of Windows XP N represented just 0.005 per cent or 1/20,000th of one per cent of overall XP sales in Europe by April 2006”.  Those were probably installs in Brussels or Paris.

There’s some thinking that the grey suits in the EU are having a hissy fit because of this announcement.  It pre-empts their decision on what they would do to MS.  What was MS to do?  They will RTM Windows 7 in July and the OEMs will get immediate access to it to prepare their builds.  Are MS and the OEMs to wait on some civil servants (and we know how hard-working and schedule-conscious they are, right?) to make up their minds while these businesses lose billions of Euros?  That’s pure nonsense.

Odds are, IE will be available to customers via Automatic Updates as an optional download if the OEM doesn’t include it.

Exchange 2010 Looking Very Interesting

I’m not really an Exchange person.  In my jobs, I’ve either worked in places with Lotus Notes (and dedicated mail admins) or in environments where Exchange was tiny.

I’ve been attending Nathan Winters’ sessions on Exchange 2010 today in Microsoft Ireland.  Of interest:

  • The Outlook client sends all communications to Exchange via the CAS.  It doesn’t talk directly to the Mailbox servers.  The exception is public folders.
  • DAG: Exchange mailbox servers replicate data around the network (LAN or WAN) for fault tolerance.  It’s not Windows Clustering (that’s gone from Exchange now).  Up to 16 copies of your data.
  • Built-in basic archiving.  Don’t use PSTs stored on file servers because (a) you’re just moving the problem and (b) you’re wasting expensive primary disk.
  • New storage model optimised for SATA disk.  This bigger-disk approach uses streams, so there is no more single instance storage.  No need to use a SAN; JBOD might be the best approach.  Using these bigger and cheaper disks will make Exchange 2010 more scalable and cheaper.  It’s a requirement for unified communications.
  • No more single copy clusters, i.e. the cluster with shared disk – a single point of failure.  Use JBOD and DAG for fault tolerance.  Probably cheaper anyway, e.g. I can get 35TB of HP JBOD for €21,000 (recently priced) vs €13,000 for a basic HP cluster kit with next to no disk in it (priced 1 year ago).
  • Easier federation via Microsoft online services for integrating different Exchange organisations, e.g. company mergers or inter-company partnerships.
  • LOADS of new end-user functionality.