Acronis Disk Director Server

We have a number of Windows Server 2003 machines that were installed by an IBM partner back before I joined the company.  This company is marketed as “experts” in servers, storage and virtualisation in Ireland.  The first thing I did when I joined was audit the systems to see what I needed to do to make them fit for management … and then fit for purpose.  Unsurprisingly, I found the C: drives were too small at 10GB.  Once you add in service packs, security updates, etc., that just won’t do.  That was one of around a dozen major faults I found with that company’s work, most of which I spent some time sorting out last year.  I’ve been slowly working through the disk sizing issue.

I’ve been quick to “un-recommend” this IBM partner to people when talking to them in person.  Strangely, IBM Ireland is very quick to recommend them for major infrastructure projects.  I found them to be amateurs, e.g. the TCP/IP DNS settings on the domain controllers were pointing at IOL’s public DNS servers!

By the way, I now make C: a minimum of 40GB on all servers.  I used to go for 30GB but W2008 requires a 40GB partition.  W2008 only takes somewhere around 10GB, but I think MS are leaving plenty of space for service packs and security updates; a wise move, I believe.

We purchased Acronis Disk Director Server to resize the C: partitions on this small set of servers.  I’d used it years before and it worked a treat.  I installed it a while back on the first of the servers.  Yesterday I ran a job to expand the C: volume.  SQL 2005 was installed on the next volume, G:.  I shrunk G: (it’s a tiny database) and expanded C:.  I committed the operation and rebooted.  I was remotely located (but a short spin in the car from the servers).

The server rebooted twice and was back online.  Voila!  The C: drive was now 40GB.  The OpsMgr agent came out of maintenance mode and then a series of alerts came in.  Oops!  Acronis had renamed the G: volume to D:.  SQL had failed to start and a series of application services followed suit.  I changed the drive letter back to G: and things were OK.  OpsMgr raised no further alerts and an inspection by the application’s manager showed everything was working OK.
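If you hit the same drive-letter shuffle, you don’t need Acronis to put it back; Disk Management will do it in a couple of clicks, or a quick diskpart session from an elevated command prompt does the same.  A minimal sketch (the volume number and letter here are examples, so check the list volume output on your own server first):

diskpart
rem list the volumes and note the number of the one that lost its letter
list volume
rem select that volume (3 is just an example)
select volume 3
rem hand it back the letter the application expects
assign letter=G
exit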

That was the backup drive in the application “cluster” (not a Windows cluster but an application cluster).  I’ll be hitting the primary machine in a couple of weeks once I’m sure the dust has settled OK.

Acronis seems to be one of those companies that you think should have a bigger name.  I like their disk management stuff.  I know their cloning solution is loved by people who use it, e.g. I’m told it’s proven to be a fine P2V and V2P solution when the native VMware or MS products can’t do the job.  Their products are pretty economical when you consider the engineering time for alternative solutions, so give them a look to see what they can do for you.

Windows Server 2008 R2 Foundation Edition

It’s officially been confirmed by the Windows Server team that a new edition will join the Windows Server family in the R2 release.  It will be referred to as “Foundation” and is described as an “entry-level server”.  Paul Thurrott reported on it a while ago.  The name does indeed imply that it will be targeted at emerging markets, as is the plan for Windows 7.

Windows Server 2008 R2 Beta Evaluation VHD Images

Microsoft has released Hyper-V VHD images of the beta releases for Windows Server 2008 R2 Core and Full installations.  Strangely, they are RAR files.

These VHDs will only run on Hyper-V.  Don’t worry, you can always install them on a Full installation of an MSDN or TechNet licensed Hyper-V server (so you can use the local console) or on the free Hyper-V Server 2008.  Be sure your host hardware is Hyper-V compliant.
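If you’re not sure whether a host qualifies, one quick check is the Sysinternals Coreinfo tool, which reports whether the CPU exposes hardware-assisted virtualisation (a sketch, assuming you’ve already downloaded coreinfo.exe; you still need virtualisation and DEP/NX enabled in the BIOS):

rem run from the folder coreinfo.exe was extracted to
coreinfo -v
rem an asterisk beside VMX (Intel) or SVM (AMD) means the CPU has the feature; a dash means it doesn't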

Windows Server 2008 R2 Best Practices Analyser

MS has included a BPA or Best Practices Analyzer with Windows Server 2008 R2.  It looks at technologies including:
 
  • Active Directory Certificate Services
  • Active Directory Domain Services
  • DNS Server
  • Web Server (IIS)
  • Remote Desktop Services

There is some information on using the BPA in the documentation for Windows Server 2008 R2.

Microsoft.com Now Running Windows Server 2008 R2 Beta

I just read a post by David Lowe (Microsoft Redmond; originally from here in Ireland, he was the keynote speaker at the W2008 Ireland launch).  Apparently, a significant portion of microsoft.com is running on Windows Server 2008 R2, Hyper-V and Core Installation on HP hardware.  Performance has improved compared to Windows Server 2008.  Odds are there will be some stats published in a few months’ time.

When Does Windows 2000 Server Extended Support End?

I just saw someone asking for help with removing Conficker from an NT4 server.  Extended support for that ended back in 2004 or thereabouts … with a lot of notice.  That means no patches, no support and no security fixes for that operating system.

That made me think: when does extended support for W2K Server end?  I looked it up and found the date for termination of extended support is 13/July/2010.  That means MS will fix security issues until then.  Normal bug fixes ended back in 2005 with the end of mainstream support.

So if you have Windows 2000 Servers you should start planning.  Talk to your application vendors and apply pressure for W2008 support.  You’ll need new hardware or virtual machines because you cannot do an in-place upgrade.  If you don’t upgrade then you risk losing support not only from MS but from others as well, e.g. anti-malware or backup software vendors.  Imagine not having a security patch that every other OS has, not having protection of your system and/or data, and catching something like Conficker.  That’s a pricey gamble if you ask me.

Core Configurator Is Back!

Sick of trying to learn messy NETSH commands and the like and just want to do your job of deploying W2008 Core Installation servers?  Back when W2008 was first released, a free tool called Core Configurator was released to the Net.  Using a very simple and lightweight GUI, it translated your clicks into commands.  This allowed you to configure your Windows Server 2008 Core Installation servers with the minimum of fuss.  Unfortunately, the author left his employers and they enacted a little-used Intellectual Property clause in his employment contract.  The product disappeared from the download location but with a little hard work you could find it elsewhere.
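For context, these are the kinds of raw commands Core Configurator wraps up behind its GUI.  A rough sketch of configuring networking and joining a domain on a Core box from the command line (the interface name, addresses, computer name and domain are all made-up examples):

rem set a static IP address, subnet mask and default gateway on the NIC
netsh interface ipv4 set address name="Local Area Connection" static 192.168.1.50 255.255.255.0 192.168.1.1
rem point the NIC at the internal DNS server - not at your ISP's public DNS!
netsh interface ipv4 add dnsserver name="Local Area Connection" address=192.168.1.10 index=1
rem rename the computer (reboot before joining the domain)
netdom renamecomputer %computername% /newname:CORE01
rem join the domain - /passwordd:* prompts for the password instead of putting it on the command line
netdom join CORE01 /domain:demo.internal /userd:demo\administrator /passwordd:*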

Read your employment contract.  You’ll see that it often has an IP clause saying that anything you generate, even outside of work on your own machines, is the property of your employer if you sign the contract.  Yeah, I know, it sucks.  Before I joined the company, I made sure my employers waived that one for the writing work that I’ve been involved with.  What I do in my own time is mine to do with as I want.

The owners of the IP rights to Core Configurator have released a new version.  It’s free for non-commercial use and costs $99 for a site license – they included product activation GRRRR!

Tolly Report: Performance of SMB2 In Vista and W2008

The Tolly Group has written a new report (there was a previous one back in the pre-RTM days) on the performance of SMB2 in Windows Vista and Windows Server 2008.

If you didn’t know, there’s a significant amount of change in the TCP stack and SMB2 to improve the performance of data transfer, and particularly file transfer, between W2008 and Vista machines, e.g. W2008-Vista, Vista-Vista, W2008-W2008.  I’ve written a chapter on the subject in the Mark Minasi book, "Mastering Windows 2008: Essential Technologies".  Between a new TCP stack that handles more data transfer with less overhead and a new SMB protocol that handles those greater loads as well as doing some metadata caching, things are much better over high-latency (e.g. WAN or Internet) links.
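As a small illustration of the new stack, Vista and W2008 expose the receive window auto-tuning behaviour through netsh.  This is just a sketch of the relevant commands; the default is fine and you’d normally only touch it when troubleshooting:

rem show the global TCP settings, including the Receive Window Auto-Tuning Level
netsh interface tcp show global
rem normal is the default on Vista and W2008
netsh interface tcp set global autotuninglevel=normal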

The benefits are seen across many scenarios.  Using Outlook over HTTPS from a Vista client to access a W2008-hosted Exchange server over the Internet is faster.  Using SharePoint over the WAN is faster.  Using a W2008 iSCSI-based server is much faster.  I recently did a test while I was at TechEd EMEA in Barcelona with a proof-of-concept box that I had hosted in our data centre in Dublin.  It was running W2008 Server Standard with WSS 3.0.  I used my Vista laptop to browse and upload/download files.  It was almost like I was in the same building as the server.  I was very impressed.

I’m not saying that the improvements will be a cure-all for centralising all your servers out of the branch office.  But it’s definitely a step forward and it improves some services which are already suitable for consolidation and centralisation, such as Exchange and SharePoint.  You might want to look at things like VDI, Terminal Services (and the TS-based solutions) and web interfaces as end-user access technologies if you have other services to centralise or consolidate.

Windows 7 and Windows Server 2008 R2 are bringing things to another level.  I was chatting with Mark Minasi at TechEd after one of the presentations and he thought that these were the branch office solution releases.  I’d agree with that.  Windows Vista and W2008 gave us the foundation in the new TCP stack, and W7 and W2008 R2 build on that to give us the solutions.

The most interesting of these is BranchCache.  The idea is pretty simple.  In a server-less office, when a Win7 client accesses a file or HTTP service hosted on a remote W2008 R2 server, it can cache the content it downloads.  Each piece of content is uniquely identified using a hashing algorithm.  If another W7 client in the same office network requires files from a remote W2008 R2 server, it broadcasts on the network with the IDs (in sync with the server telling it what blocks make up the download).  If another W7 client has cached the data then it’s transmitted from one W7 client to the other over the LAN – no unnecessary data transfer over the WAN.  Data security is maintained by ensuring the user/client accessing the data has rights on the download’s ACL.  You can optimise this by using a W2008 R2 server in the branch office to act as a hosted cache server.  Instead of wasteful LAN broadcasts, the branch office client will just consult the caching server to see if it has the files and download from there (if it has permission based on the original hosting server).  This process is not used for saving changes.  That’s not perfect but most data transfers across high-latency links are read-only anyway.
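To give an idea of how this will surface on the client, Windows 7 exposes a branchcache context in netsh.  A rough sketch (Group Policy is the more likely deployment method, and the hosted cache server name here is made up):

rem put the Windows 7 client into distributed (peer-to-peer) caching mode
netsh branchcache set service mode=distributed
rem or point it at a hosted cache server in the branch office instead
netsh branchcache set service mode=hostedclient location=branchcache01.demo.internal
rem check which mode the client is in and how the cache is doing
netsh branchcache show status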

This has the potential to really change branch office infrastructure on a Windows network.  It will be interesting to see how cross-WAN data transfers will compare between Windows 7-Windows Server 2008 R2 with BranchCache and Windows XP-Windows Server 2003.  If you’ve used Windows Server DFS-R with cross-file RDC and monitored the results then you’ll have an idea of what to expect.  It hopefully won’t be too dissimilar to the results from appliances provided by Riverbed and Citrix (although they optimise all unencrypted and unsigned TCP traffic, not just SMB and HTTP, and they do it in both directions).

How To Test Outbound Mail Flow With IIS

I was asked to see if there was a problem with an IIS SMTP server.  I’ve been asked this one many times in the past.  The first thing I do is check DNS (just like with Active Directory).  I verified I could look up external MX records.  Next I checked firewall rules and connectivity.  I could telnet to an external mail server on port 25, e.g. telnet mailsvr.destdomain.com 25
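For completeness, this is roughly what those two checks look like from the command line.  The domain, server and addresses are placeholders, and in the telnet session the numbered lines are the remote server’s responses:

nslookup -type=MX destdomain.com
telnet mailsvr.destdomain.com 25
220 mailsvr.destdomain.com ESMTP
ehlo internaldomain.com
250 mailsvr.destdomain.com Hello
mail from:<testuser@internaldomain.com>
250 2.1.0 Sender OK
rcpt to:<tbill@externaldomainname.com>
250 2.1.5 Recipient OK
quit
221 2.0.0 Service closing transmission channel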

Next up is a tool called SMTPDiag.  I’ve used this one a good bit for this sort of problem.  You need to know which things to ignore in this scenario where there is no Exchange server.  In the past, when I was with a hosting company, we had a customer who reported an issue with his server sending email.  SMTPDiag did some tests to verify SMTP was OK and then sent test mails.  The destination that was reported in the trouble ticket bounced the tests with an SMTP code … the sender was being rejected for being a spammer.  It turned out he probably was being identified as a spammer due to the large amounts of mail he sent.  In my case today, everything was good.
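If you haven’t used it, SMTPDiag is a free download from Microsoft and runs from the command line.  A typical invocation looks something like this (the addresses are placeholders and /v turns on verbose output):

smtpdiag testuser@internaldomain.com tbill@externaldomainname.com /v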

The final step was to manually send a mail.  You can use Notepad to write an email and drop it into the SMTP pickup folder.  Open Notepad and add something like:

to:tbill@externaldomainname.com
from:testuser@internaldomain.com
subject:This is a test.

this is a test. The date is 23/01/2009 and the time is 15:06.

Save this in C:\Inetpub\mailroot\Pickup without a file extension (wrap the name in quotes in the Save dialog, e.g. "testmail", so Notepad doesn’t add .txt).  The file will disappear when SMTP has sent it.  My test is to send it to somewhere like GMail or Hotmail.  That worked fine for me on this ticket.
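If you’d rather avoid Notepad’s habit of tacking .txt onto filenames, you can build the same file from the command prompt and then move it into the pickup folder in one go (the addresses and path are the same examples as above):

echo to:tbill@externaldomainname.com> %TEMP%\testmail
echo from:testuser@internaldomain.com>> %TEMP%\testmail
echo subject:This is a test.>> %TEMP%\testmail
echo.>> %TEMP%\testmail
echo this is a test.>> %TEMP%\testmail
rem move the finished file in so the SMTP service doesn't grab it half-written
move %TEMP%\testmail C:\Inetpub\mailroot\Pickup\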

Everything seemed fine.  That’s often the case with the email-related tickets I’ve had.  The server I’m responsible for works fine but the destination server(s) have issues.  More often than not, it’s been antivirus servers acting up or slowing things down.