Windows NTBackup For Vista/Longhorn

A while back, I posted a document that went through backups on Vista using the native backup functionality.  While reading it, you’ll notice that backups have changed a lot.  No longer are we using a BKF file that simulates a tape; instead we’re using a VHD file (virtual disk) that we can mount either in the operating system (using VHDMount) or in a virtual machine.  We can assume that "Longhorn" will use a similar tool, if not the same one.  But what do you do if you need to restore from an old Windows NTBackup backup?
 
Yesterday, Microsoft released a "Windows NT Backup – Restore" utility.  If you enable the Removable Storage Manager feature then you can use this tool to recover files from your old backups.
 
If you are using Vista:
  1. Click the Start button, click Control Panel, click Programs, and then click Turn Windows Features on or off. If you are prompted for an administrator password or confirmation, type the password or provide confirmation.
  2. Select the check box next to Removable Storage Management to turn the feature on, and then click OK.
If you are using "Longhorn":
  1. Click Control Panel and then click Administrative Tools.
  2. Open Server Manager, navigate to the Features Summary section, and click Add features.
  3. Select the check box next to Removable Storage Manager, click Next, and then click Install.
Now you can install the utility.  The icon for it will be in "All Programs … Windows NT Backup – Restore Utility".  The download supports IA64, x64 and x86 builds of Windows Vista and "Longhorn".
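
If you’d rather script the "Longhorn" feature installation than click through Server Manager, there is a command-line counterpart, servermanagercmd.exe.  A sketch (I haven’t confirmed the exact feature ID for Removable Storage Manager, so query for it first rather than trusting a guess):

```
rem List the available features and find the Removable Storage entry
servermanagercmd -query | findstr /i "removable"

rem Install it using the ID reported by the query above
servermanagercmd -install <Feature-ID-from-query>
```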
 
Credit: Bink.

New Microsoft Branch Office Promotion

Yesterday, Microsoft announced a new offering for branch office server deployment.  More details below.  Let me just quickly look at branch office server deployment first.
 
The ideal for any infrastructure with branch offices is that there are no servers in these offices.  Costs such as hardware and software are the obvious ones, and you might be surprised to learn that they are the small costs.  The hidden costs are the management of these machines:
 
  • Maintenance: more machines = more administrative effort = less time spent on engineering and projects.
  • Security: Machines need regular security maintenance.  Company data must be physically secured, not just logically.  DC’s should only be placed in locations with computer room security.
  • Complexity: More machines, more applications and more custom configurations mean more complexity, which means more failures and more firefighting.
  • Backups: Small branch offices cannot afford dedicated IT staff.  The result is that a secretary or a PA usually does the backups.  Are you really sure you can recover from a disaster?  There’s more expense on hardware, tape media and software licensing.  Then there’s administrative time spent on fixing or explaining things over and over again.  Don’t forget the essential off-site storage for tapes … what good is disaster recovery if the tapes burn with the building or the building is inaccessible?

Windows 2003, Windows 2003 R2, Microsoft DSI/System Center,  Terminal Services (and partners) and some 3rd party solutions have offered alternatives:

  • A "Wide Area Data Network" approach is possible with a product such as the Riverbed Steelhead.  This TCP/IP optimisation product offers you the ability to completely remove servers from all of your branch offices, thus reducing all of the associated costs.  It’s not an option for everyone, so there are other solutions too.
  • We can use a hands-off management approach thanks to the System Center family of products.  MOM 2005 allows us to know what is happening on servers everywhere.  Fault and performance monitoring is possible, not only for Microsoft products but also for products where vendors have developed management packs (Citrix, Dell, HP), 3rd parties have developed solutions (for UNIX, Linux, EMC, etc.) and you can create your own custom management packs.  SMS 2003 allows you to completely manage the configuration of your servers from a central location.
  • Automated Deployment Services can be used to remotely build a server from an image using a PXE network service.  With Remote Desktop enabled, you can then completely finish the build from a central location.
  • Security can be maintained centrally.  WSUS and the SMS 2003 Inventory Tool for Microsoft Updates offer two ways to deploy updates and report on their deployment.  Other tools such as Microsoft Baseline Security Analyser or the SMS 2003 Scan Tool for Vulnerability Assessment allow you to scan your security configurations and report on them centrally.
  • Terminal Services, Citrix, 2X and ProPalms are just some of the server-based computing solutions that allow you to run "thin branches" that would contain no servers, just terminals and maybe laptops.  All servers would be placed in hub offices or the HQ.  Users’ computing activity appears local but all of their processing is done centrally.  All of the data resides centrally, so backups would contain more data, but the amount of administrative effort actually decreases hugely (e.g. 2 file servers in HQ instead of 1 server in every branch) if engineered correctly.
  • Backups become much easier.  If you follow server centralisation then the return is obvious.  If you need to maintain file servers in branches then you can take advantage of Windows 2003 R2’s Distributed File System (DFS) and DFS Replication.  Data can be replicated from branch offices to central file servers.  The data exists in the same logical namespace and is accessed by applications and users in the same way, whether they are in the branch office or in the central office.  Security is maintained the same way as always and is replicated automatically.  Files are replicated at block level, and block replication repetition (why replicate the same block 100 times because it’s on the file system 100 times?) is avoided by Remote Differential Compression.  Now administrators can back up the central replica and no backups need to take place in the branch office.  This is suitable for archive backups and disaster recovery.  Operational backups can be handled by Volume Shadow Copy, both in the branch and the central office.  With some education, users (or power users) can be taught how to use the Previous Versions client to recover files from their file servers without resorting to tapes or to calling IT.  Data Protection Manager is a product that should also be looked at here.
  • Disaster Recovery: A branch with 5 users may generate a lot of cash.  But do you really want a DR site for it?  You may want some functionality to maintain business operations, and there may be regulatory requirements for a DR operation.  But do you really need a full replica incurring WAN costs, hardware, software, space rental, etc.?  It’s also more stuff to be managed.  We’ve already looked at how we can replicate file servers to central offices for backups.  We can also use that for DR.  When DR is invoked, why not use the replica in the central site?  It has two-way replication and is accessed in exactly the same way via the logical architecture that abstracts the physical location of the data from users and applications.  Users can either access the data via RAS, web-facing Terminal Services (via a Citrix gateway or alternatives), travel to the central site, etc.  DR is now available for those branches without any additional costs.
  • Remote branch management is made easier with Windows 2003 R2 tools such as the Print Management Console and File Server Resource Manager.  Printers are the millstone around the neck of helpdesks everywhere.  Using PMC you can deploy printers to users or computers via Group Policy.  That all but eliminates the calls asking "how do I connect to my local printer?".  The console itself can be used to monitor the status of those printers.  FSRM can be used to control your file shares.  You can use real folder-level quotas to control usage of file systems.  Rules can be implemented to control the types of data being stored.  Does your organisation really want to offer an iPod backup service for its employees?  I think not.
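
The "replicate each unique block only once" idea behind DFS Replication is worth a moment.  Here’s a toy Python sketch of my own (an illustration only; real Remote Differential Compression uses variable-size, content-defined chunks and signature exchanges, not fixed 4 KB blocks):

```python
import hashlib

BLOCK_SIZE = 4096  # toy fixed-size blocks; RDC actually uses variable-size chunks

def blocks_to_send(data: bytes, remote_hashes: set) -> list:
    """Return only the (offset, block) pairs the remote replica lacks."""
    to_send = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in remote_hashes:
            to_send.append((offset, block))
            remote_hashes.add(digest)  # never queue the same block twice
    return to_send

# A file holding the same 4 KB block 100 times ships that block once, not 100 times.
data = (b"A" * BLOCK_SIZE) * 100
print(len(blocks_to_send(data, set())))  # prints 1
```

The point is the saving on the wire: identical blocks cross the WAN once, no matter how many times they appear on the file system.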

That’s just a quick sample of what’s on offer.  Many organisations have started down the road of eliminating branch office server computing.  Some just cannot.  I worked with a retail operation that had 200+ branches in the UK and Ireland and was centrally managed from Dublin.  They could not afford for server-based computing to be offline in the event of WAN outages.  Therefore, a server was placed in every branch.  SMS 2003 with the 1E SMSNomad product was deployed to manage all branches.  The solution appeared to work well.  But think of the costs of deploying all that software.

Microsoft announced a new product bundle offering yesterday.  This SKU is intended for those enterprises that have many branch offices that require branch office server computing.  The bundle offers to reduce server software costs by up to 43%.  It includes:

  • Microsoft Windows Server 2003 R2 Standard Edition
  • Microsoft Internet Security and Acceleration Server (ISA Server) 2006 Enterprise Edition
  • Microsoft System Center Operations Manager 2007 Enterprise Operations Management License (OML) – a license to manage this machine
  • Microsoft Systems Management Server 2003 R2 Server Configuration Management License (CML) – a license to manage this machine
  • Microsoft Virtual Server 2005 R2

This bundle will be available from February 1st, 2007 until January 31st, 2008.  Microsoft goes on to say that "the promotion includes 10-pack licenses for each of the included products".  I think that they are saying you get 10 CAL’s for Windows Server 2003 R2.  You’ll want to verify that with your assigned LAR (not necessarily your direct reseller – they tend not to know the official lines on these things). 

In addition, Packeteer Inc. is offering a 30 percent discount on its iShared FlexInstall wide area file services (WAFS) software product.  WAFS is not the way I’d go; it only optimises file server data.  Riverbed’s Steelhead appliance optimises all TCP/IP traffic: SQL, Oracle, Lotus Notes, Exchange, file servers, HTTP and lots more.  Just ask the UK’s Royal Navy, who started trialling them last year for their command and control systems in warships and submarines.

SMS 2003 SP3 Public Beta

SMS 2003 SP3 is now available for public beta testing.  As usual you’ll get the security fixes (I don’t remember any) and bug fixes for the product.  Microsoft will also be adding new functionality that results from the AssetMetrix acquisition.  This new functionality includes:

  • License Reporting: identify installed software like you’ve never been able to do before.  No more trawling through executable audits or relying just on "Add/Remove Programs".
  • Software Consolidation: Application audits can be grouped into categories, thus enabling more informed reporting.
  • Upgrade Planning: Identify key applications so that you can plan regression testing for future application upgrades.

Vista support is added:

  • Deploy Vista updates
  • Deploy applications to Vista
  • Perform hardware and software audits of Vista

The OS Deployment Feature Pack has already been updated and released, so you can actually deploy Vista images using SMS 2003.

The supported operating systems for this service pack are:

  • Windows 2000 Service Pack 4
  • Windows Server 2003
  • Windows XP
  • Windows Vista

You can download this beta release from the Microsoft Connect site.

Windows Server Virtualisation Calculator

The Windows Server Division Weblog posted a link to this tool today.  In Microsoft’s words:

The Windows Server Virtualization Calculator provides two ways to estimate the number and cost of Windows Server Standard Edition, Enterprise Edition and Datacenter Edition licenses needed for your virtualization scenarios to help you determine the most cost-effective edition of Windows Server.

One of the MCS team also blogged about this online tool and reminded us that this tool will also be useful for those considering VMware technologies instead of Microsoft’s solution.

Internet Explorer 7 Deployment Guide Beta 1

Microsoft has just released part one of a 3 part series deployment and maintenance guide for IE7.  In their words:

This is the beta 1 release of the Internet Explorer 7 Deployment Guide, and contains only the first of three parts. Please submit feedback to IEdeployfb@microsoft.com.
This guide helps you to plan and carry out a deployment of Microsoft Internet Explorer 7 on Windows XP and Windows Server 2003. The guide describes the system requirements and deployment methods, as well as the techniques to maintain and support Internet Explorer 7 after deployment.

Speaking from experience, IE7 deployment with WSUS was both silent and pain-free.  I’ve been using IE7 since last spring and have loved it.  Tabbed browsing and the RSS reader have been great additions.

Minasi Newsletter: January 2007

This month, Mark has included an absolute tonne of information on how to use the Windows System Image Manager from the Windows Automated Installation Kit for creating automated deployments of Vista.  In case you don’t know, WSIM is used to create the new format of answer file that is required for silent, automated Vista installations.

While working with WSIM, you may also want to check out some of my earlier posts.

How To Improve Windows Server Based Computing Performance

Whether you use Terminal Services, Citrix, 2X, ProPalms or something else, the core of performance optimisation is based in Windows.  There’s a Microsoft KB article that details some basic steps that will help you get the most out of your servers. 

It starts with getting the hardware right.  If you’re buying now, you’ll get 64-bit processors.  That’s a good start:

  • Dual CPU’s with Dual Core or Quad Core support.
  • Memory – 2-4 GB RAM.
  • Optional: DVD + Floppy.
  • RAID adapter with at least 128 MB of RAM that supports RAID 1 with a hot spare disk.
  • Backup battery for the RAID adapter.
  • Three disks of at least 74 GB Ultra SCSI 3 15,000 RPM or 74 GB SAS 15,000 RPM (RAID 1 + hot spare).
  • Dual Power Supply.
  • Remote Management Adapter.
  • Dual network adapter, 1–10 Gb (server adapter), with an option for "teaming" (Fibre Channel network adapters recommended).

A quick note here.  Memory is a very interesting subject and it’s usually the bottleneck in deciding how many users you can load onto a Terminal Server.  Note that 32-bit applications are very memory inefficient on 64-bit operating systems.  64-bit operating systems are capable of addressing much more RAM.  Have a read of Bernhard Tritsch’s (Terminal Services author and MVP) "Big Iron Test".

Next up is the operating system.  Obviously you go with Windows 2003 now.  Windows Longhorn will offer some serious upgrades which may accelerate its deployment.  Do you go 32-bit or 64-bit?  Having a 64-bit CPU gives you the option of either.  As always, do some testing:

  • Are the applications that will be used on the Terminal Server supported when running under the x64 runtime or WOW64 under the x64 runtime?  Remember that 16-bit applications will not run on a 64-bit OS.
  • Did tests show any improvement or degradation in server performance when you ran them on a 64-bit OS?
  • Does the current server hardware support the 32-bit runtime and/or the x64 runtime?

You’ll also want to make sure that you run the latest service pack, currently SP1 for Windows 2003.  Some optimisations include:

  • Use a dedicated server for Terminal Server tasks.  Don’t think "I’ve got a server with loads of RAM and CPU – why not install SQL on it".  That will kill the server.  You bought that hardware to replace PC’s, not other servers.
  • Verify that third party products are supported under Terminal Server environment.  Watch out for dodgy applications – they sometimes require "application silos" where servers are dedicated to particular applications.
  • Consider using "User Profile Hive Cleanup Service".
  • Look at using a large page file.  You will want to know how to overcome the 4,095 MB paging file size limit in Windows.
  • You should also look into how to determine the appropriate page file size for 64-bit versions of Windows Server 2003.
  • Optimise graphics performance (Control Panel -> "System" -> "Advanced") and change "Visual Effects", "Adjust for best performance of:" and "Memory usage".
  • Optimise memory management by editing "boot.ini" file.
  • Use the latest client … RDP, Citrix, terminal OS, etc.
  • Consider implementing QoS (Quality of Service) or Class of Service to boost RDP sessions over the network.
  • Use low resolution for RDP display and consider disabling RDP features such as Auto Network drive mapping, Audio etc.
  • Use as few GPO’s (Group Policy Objects) as possible.  Check out loopback processing … very useful if you have users who have both full and thin client requirements and need differing policies depending on where they have logged in.
  • Do not use batch file scripts.  PowerShell, VBScript, WMI and Windows Power Tools offer more options and better performance.
  • Use printer drivers signed by Microsoft.
  • If at all possible, only redirect the primary printer on full clients.  Try to configure printer mapping so that logons do not wait for it.
  • Look at printer optimisation technology such as Riverbed, ThinPrint, etc., when printers are across a WAN from the Terminal Servers.  Some Citrix alternative technologies include optimisation solutions.
  • If you enable NLB (Network Load Balancing), check that the current network equipment can handle NLB traffic.
  • Do not use remote "Roaming Profiles" for Terminal Server access.  In fact, it might be worth not using roaming profiles at all.  Check out a free alternative called Flex Profiles.
  • Disable unnecessary services/options in the user GUI (Graphical User Interface) such as Wallpaper, Active Desktop, Screen Saver, etc.
  • Use a Terminal Services license server that is local to the Terminal Servers.  MS PSS call #1: make sure you configure the right type of CAL in the TS configuration on the Terminal Servers and that it matches the CAL’s on the license server.
  • There’s a recommendation to consider disabling the use of web browsers.  That’s not all that realistic.  What you can do is use a proxy filter to prevent unwanted bandwidth eaters.
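
Several of the RDP-side tips above (low display resolution, disabling audio and device redirection) can be baked into the .rdp connection file you distribute to users.  A sketch, with illustrative values (the field names are standard .rdp settings; pick values to suit your network):

```
desktopwidth:i:1024
desktopheight:i:768
session bpp:i:16
audiomode:i:2
redirectdrives:i:0
redirectprinters:i:0
redirectcomports:i:0
redirectsmartcards:i:0
```

Here audiomode:i:2 means "do not play audio", and the redirect* settings switch off the corresponding device mappings for the session.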

Test, test, test.  Even when you go into production, you should retain a test environment.  You may even need a development environment if you have internally developed applications.

So Microsoft Software Isn’t Stable, Secure or Scalable?

I’ve been a programmer, consultant, administrator and contractor.  During all those years since 1996 I keep hearing the same old tune from people … "Microsoft software isn’t stable, it isn’t scalable and it isn’t secure".  Hmmm.  Let’s have a look at that.

It Isn’t Stable

I ran a network with 160-odd Windows 2003 and a handful of Windows 2000 servers.  We had around 12 Solaris machines which ran our front office and back office applications.  The critical applications that were on those UNIX platforms were far from stable.  In fact, they were clustered, and the clustering was not only a huge cost but failed to work correctly.  We also ran Lotus Notes, usually the latest builds.  We had a cracking Lotus Notes team led by one of the best Lotus freaks in Ireland.  We saw many funnies there despite that team’s efforts.  On the MS side?  Was it perfect?  Nope.  But we were stable.  Services did not go down during the day.  We were able to stick to previously agreed maintenance windows.

It Isn’t Secure

Here’s the one that makes me really laugh out loud.  I’ve asked people why they use ISA Server as a proxy but, instead of using this economic product (around €1,000 and no CAL’s required) as their firewall, they cough up countless amounts of money for something like Checkpoint, whose licensing makes Dick Turpin look like a saint.  The usual line is "I won’t use a Microsoft firewall because it isn’t secure".  I usually respond with "What attack on ISA made you feel that way?".  There is never a response.  Since ISA 2000, you can count the number of security patches for the ISA family with fewer digits than are on your hands.  Can you truly say the same for Cisco or Checkpoint?  Plus, ISA is manageable and understands your user accounts.  It can be made fault tolerant and is cost effective.  Not only does it do the usual port blocking, etc., but it understands the applications passing through it and can actually intercept malformed packets that are an attack on your network.

Then we get to patching.  Penguin lovers can be quiet here.  When is the last time you saw a fully patched Linux or UNIX network?  How did they deploy the patches?  Microsoft has a responsive solution for getting patches out to the public, and they have provided 3 mechanisms (Windows Update on each machine, WSUS and SMS) for deploying updates.  With these tools, your Microsoft network can be secured within 24 hours with minimal business impact or manual effort.

It Isn’t Scalable

Maybe this one was true in the past.  SQL 2000 (certainly 2005), Exchange 2003 and Windows 2003 easily took care of all scalability problems.  When Microsoft ran Exchange 2003, they had 3 or 4 clusters for the 55,000 users across the globe in 3 sites.  Each cluster was made up of 6 HP DL380’s: 4 being active, 1 for recoveries and 1 as a failover node.  That’s 6 * 4 = 24 servers for 55,000 users, with room for failover, for probably one of the busiest email networks in the world.  That’s impressive if you ask me.

OK … It’s Too Expensive

We all hear headlines about how organisations allegedly dump MS to go with the Penguin way of life.  The Munich government made headlines back in 2003 with their decision to snub Steve Ballmer.  He warned them that he was giving them a great price for their needs and that their Linux solution would end up costing more.  They had the whole arrogance thing going on and didn’t listen.  A year later we heard that their Linux project licensing was costing around 30% more than what MS had quoted them for licensing.  That worked out well.  I guess they never considered user familiarity, training, manageability, deployment, product integration, etc.

As an example, here’s a case study where the London Stock Exchange adopted Microsoft technology.  You’re not going to find many more sites where cost, scalability, security and stability are going to be more important.