Monday, February 17, 2014

Generate a Certificate for Exchange 2010 using Microsoft PKI

If creating a certificate request for OWA (and other associated services) with the intention of processing this request on a Microsoft enterprise PKI structure, you'll meet a series of challenges. Here is how to overcome them in the order you will encounter them.

After creating the request and attempting to submit it, you will first see this error message:

ASN1 bad tag value met. 0x8009310b (ASN: 267).



This error occurs because the request is, by default, encoded in Unicode, while 2008r2 and lower PKI can only process ANSI. To convert, open the request with Notepad, select "File->Save As", and change the encoding to ANSI before saving.
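If you'd rather skip Notepad, the same re-encode can be done from PowerShell. This is a sketch; the file name matches the example used later in this article, so substitute your own:

```powershell
# Re-encode the certificate request from Unicode to ANSI/ASCII
# (base64 request content is plain ASCII, so no data is lost)
Get-Content .\NameOfMyRequest.req | Out-File -Encoding ascii .\NameOfMyRequest-ansi.req
```

Submit the re-encoded copy instead of the original.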



If you submit at this point, you will see:

The request contains no certificate template information. 0x80094801 (-2146875391)
Denied by Policy Module 0x80094801, the request does not contain a certificate template extension or the Certificate Template request attribute.


This is because Microsoft enterprise PKI does not process unqualified (by means of a template) requests. We need to force a template (WebServer will work) by using the following command:

certreq -submit -attrib "CertificateTemplate:WebServer" NameOfMyRequest.req

Select your CA. If your template is configured correctly, the cert request will be successful. If not, you will receive the warning:

Certificate not issued (Denied) Denied by Policy Module  0x80094800, The request was for a certificate template that is not supported by the Active Directory Certificate Services policy: Web Server.

The requested certificate template is not supported by this CA. 0x80094800 (-2146875392)


To remediate this issue you will need to create/enable your template as desired. I created a custom 2k8 compatible Web Server template that allows for exporting the private key (for Web Application Proxy) and it worked well. For more information, see Section 4 of my article.

After saving the issued certificate, use either PowerShell or the management console to finish processing the request, and then enjoy your e-net i-mail over the hinterlandnet.


Sunday, February 9, 2014

Windows Server Failover Clustering Quorum Behavior Guide

"A Republic Quorum, if you can keep it." - Ben Franklin

Your WSFC Quorum is like a Republic, or more accurately, a Democracy. There are many articles out there regarding Quorum voting logic but most are somewhat lengthy. I decided to set out to see how few words I could effectively explain Quorum rules in, so here we go. Don't count this part. Or this. Wait... er... start counting.... NOW.

Windows 2008 and higher:



  • A Quorum is the act of n nodes agreeing on a majority. 
  • A node is a cluster member, shared disk, or fileshare witness. 
  • A cluster can have a shared disk or fileshare, but not both. 
  • A majority is defined by greater than 50% consensus (a tie is not a majority).
  • Fractional vote requirements are rounded up to the nearest integer (a majority of 5 is 2.5, which rounds up to 3).
  • There is a legacy quorum method called "disk only" wherein one (defined quorum) disk is the only vote. This is considered obsolete because it creates a single point of failure. 
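The majority rule above reduces to simple arithmetic. A tiny sketch (the function name is mine, not part of any cluster API):

```powershell
# Votes required for a majority: strictly more than 50% of total votes,
# so fractional requirements round up (5 votes -> 2.5 -> 3 required)
function Get-VotesRequired([int]$TotalVotes) {
    [math]::Floor($TotalVotes / 2) + 1
}

Get-VotesRequired 5   # 3: a 5-vote cluster survives the loss of 2 votes
Get-VotesRequired 4   # 3: a 4-vote cluster survives the loss of only 1 (a 2-2 tie is not a majority)
```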

Windows 2008/r2 with Hotfix 2494036 or Higher:

"NodeWeight" was added to revoke a node's voting privileges (NodeWeight=0). You can use this for nodes in a different site or to ensure that the shared disk/fileshare witness casts the deciding vote. This is generally used in cross-site clusters.
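Assuming the hotfix (or 2012 and higher) and the FailoverClusters PowerShell module, the property can be inspected and set like this; the node name is an example:

```powershell
Import-Module FailoverClusters

# Show the current vote configuration
Get-ClusterNode | Format-Table Name, NodeWeight

# Strip the vote from a node in the remote site (example node name)
(Get-ClusterNode "DR-NODE1").NodeWeight = 0
```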

Windows 2012 or Higher:


Dynamic Clustering
"Dynamic Clustering" changes the nodeweight of a downed cluster member and effectively reduces the number of participating nodes by one. This works under the following circumstances:
  • Prior to the outage, the cluster has achieved quorum (normal under most circumstances)
  • Nodes must go down one at a time so the remaining nodes can agree to remove the downed member. If multiple nodes go down simultaneously, the dynamic removal will not take place. 
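You can watch dynamic quorum at work by comparing the configured vote with the runtime vote the cluster is actually counting (2012 and higher expose this as DynamicWeight):

```powershell
# NodeWeight is what you configured; DynamicWeight is what the cluster
# is actually counting right now after any dynamic adjustments
Get-ClusterNode | Format-Table Name, State, NodeWeight, DynamicWeight
```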

Closing


Windows Server Failover Clustering is an excellent option for SQL Server, Hyper-V, and other services. Hopefully this understanding of cluster failover behavior enables you to design solutions that better meet the needs of your clients.

Note: It is important to consider how a Quorum is formed when considering patching strategy. 

References


TechNet: Understanding Quorum Configurations in a Failover Cluster
Aeval Shah's Blog: Windows Server 2012 Failover Clustering Dynamic Quorum
Configure Cluster Quorum NodeWeight Settings
Microsoft Support: Cluster NodeWeight hotfix for 2008/r2

Tuesday, January 28, 2014

Cloud Authentication Primer: Basic Active Directory Federation Services Setup

Cloud computing. Cool stuff eh? Just pay your (not quite) local provider and all your problems are a thing of the past.

How are we going to authenticate/authorize our users? I dunno, but sign the contract now and you get a special price for a limited time!

Fortunately you're in luck. Microsoft released Active Directory Federation Services (ADFS) all the way back with 2003 r2 and released the much appreciated version 2.0 a bit after the release of 2008 r2 (installable as an upgrade). Version 3.0 has just shipped with 2012 r2, and each release has brought many welcome new features.

Put (probably too) simply, ADFS allows you to extend your Active Directory space to other platforms, including Azure, Amazon EC2,  and other cloud services including those supporting SAML. By doing so, you can grant your already set up internal users access to new services located elsewhere. ADFS is also required for some Microsoft products including the new Web Application Proxy, which I'll be covering later. This isn't the only way to accomplish AD auth to other parties (AD Replica; IDaaS), but it's a great place to start.

The main components of ADFS are as follows:

  • Federation Service: This service, on one to many servers, facilitates the core functionality: sending/receiving authentication requests to/from third parties. 
  • Federation Service Proxy: Sits in a perimeter network (DMZ) and sends requests to Federation Service servers on the interior of your network on behalf of clients outside the interior. 
Highly simplified view of ADFS In Use; I'm guessing Bob is cool with that. 

With that primer set, let's walk through a basic install/test of a single instance of Active Directory Federation Services 3.0 on Windows 2012 r2. I will not be covering the proxy or services integration at this time.

Requirements


To complete this walkthrough, you'll need:
  • An Active Directory domain you would like to extend
  • 2012 r2 media and licensing taken care of. 
  • Domain Admin privileges; minor AD updates will be necessary and you'll need a new service account.
  • The ability to create new trusted certificates from either an internal or external source. (Need a new PKI setup? See my article here.)
  • At least one VM/Physical machine; One for the ADFS server and another as a shared database infrastructure if you desire. More on those options below. 
  • Desired server on the internal network and domain joined with access to Active Directory servers. 
  • Access to add records to DNS. (May not be required but I recommend it) 
  • (If applicable) Ability to create a new DB in a shared infrastructure. Note this is only a requirement if you do not intend to use the Windows Internal Database. 

Sizing


ADFS, like Active Directory and most related services, is not very demanding from a hardware perspective. A modestly sized VM is generally a good solution in most cases and scaling usually is realized horizontally, dictated by geographic concerns. You can use either the built in Windows Internal Database for ADFS or place the database on a shared infrastructure if desired. I will be using a shared DB infrastructure for this example. For more information on sizing, see this link. 

Step 1: Install ADFS Binaries

  1. Log on to the machine you would like to set up as an ADFS server with your administrative logon. 
  2. If not launched already, launch Server Manager and click Manage-> Add Roles and Features
  3. Click Next on the Before you begin page. 
  4. Select Role-based or feature-based installation and click Next.


  5. Select the appropriate server and click Next.


  6. Select the Active Directory Federation Services role and click Next. (Note that despite serving up some services via HTTPS, IIS is not needed)


  7. No additional features will be needed; click Next.
  8. Read the ADFS notes and then click Next to proceed.


  9. Click Install to execute. When completed, do not configure ADFS yet.


Step 2 : Acquire Certificates 


For ADFS to function correctly you'll need at least one certificate. The certificates needed are for service communications and Token-signing. For the purpose of this tutorial I will not be replacing the auto-generated Token-signing certificate because the install will work fine without doing so. For a large scale, production installation, however, I highly recommend installing a custom Token-signing certificate after installation. For more information on how to do so, see this TechNet link.

As for the service communications certificate, this must be created to proceed. It can be either an internal PKI or third party cert, so long as all clients intending to use ADFS trust the certificate. Obviously, if you're working with a large third party (say salesforce.com) you'll need another third party's certificate (DigiCert, for example). If you're working across your enterprise only, you can use your own PKI, but you will need to ensure all devices (mobile included, for Workplace Join) trust your CA, and your CRL needs to be accessible externally (see Publish Root CA CRL & CRT to Web). For the purposes of this example I'll be using the company PKI, so expect specifics of certificate application to differ accordingly.

As for the certificate name you should pick the eventual external name of the ADFS service. Should you configure an ADFS proxy in the future, Microsoft instructions mandate the use of the same certificate. For this reason plan accordingly. Other SANs are necessary as well; we'll cover that below:


  1. Apply for a certificate; if using internal Microsoft AD PKI the web server template will work fine. 
  2. Set the common name to your desired external connection URL. For this example I'll be using fs.companyname.com
  3. Add the following DNS SANs (subject alternative names)
    1. fs.companyname.com (Yes, it's in the CN but needed here again to work properly)
    2. internalservername.internaldomain.lan (Unless you plan on splitting DNS internally) 
    3. enterpriseregistration.companyname.com (This is needed should you ever want to utilize Workplace Join and won't hurt you even if you don't) 


  4. Ensure the private key is marked as exportable, as you'll likely need this cert again for an ADFS Proxy.


  5. Acquire the certificate and ensure it is installed correctly in the personal store of the local computer. 
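If you prefer building the request from the command line rather than the MMC, a certreq INF file along these lines covers the names above. All names are the examples from this article; adjust to your environment:

```
[NewRequest]
Subject = "CN=fs.companyname.com"
Exportable = TRUE            ; private key needed later for the ADFS Proxy
KeyLength = 2048
MachineKeySet = TRUE

[Extensions]
2.5.29.17 = "{text}"         ; Subject Alternative Name
_continue_ = "dns=fs.companyname.com&"
_continue_ = "dns=internalservername.internaldomain.lan&"
_continue_ = "dns=enterpriseregistration.companyname.com"

[RequestAttributes]
CertificateTemplate = WebServer
```

Generate the request with certreq -new adfs.inf adfs.req and submit it to your CA as usual.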

Step 3: Configure ADFS


With your cert in hand we're ready to configure ADFS. Be prepared to create a new service account for this step. 
  1. On Server Manager on the ADFS server, click the flag and then select Configure the federation server on this server.


  2. On the Welcome screen, select Create the first federation server in a federation server farm and click Next.


  3. For Connect to AD DS ensure you have the appropriate user selected. If it's the current user you won't need to change anything. Click Next


  4. On Specify Service Properties select the SSL certificate you imported earlier or import it now if you have yet to do so. 
  5. Ensure the Federation Service Name matches the external URL reference associated with the SSL certificate name. 
  6. Set the Federation Service Display Name to what you would like your users to see when using the service explicitly, i.e. "My Awesome Company". Click Next.


  7. As for the Service Account, I recommend using a Group Managed Service Account if your domain supports it (Windows 2012). If not, fall back to a Managed Service Account or standard Service Account. As I'll be using a managed service account we'll need to do the following; if not, substitute your own account creation needs for the sub steps below. 
    1. To facilitate the gMSA, open a PowerShell prompt as administrator and execute Add-KdsRootKey -EffectiveTime (Get-Date).AddHours(-10)


    2. Close Powershell and return to the Wizard
  8. Select the appropriate service account type and specify a name. If using a gMSA, the Wizard can create it for you. Be sure you give it a name that suits your standards (I use GS_ where G=Group, S=Service, _=meaningful) and click Next


  9. Select if you would like to use the Windows Internal Database or a SQL install/instance on a different server. In my example I'm using a dedicated SQL server. Click Next.


  10. On the Review Options page you will be presented with the information you have just entered. Ensure it is correct and click the View Script button. This is the Powershell equivalent for this installation; save it for later reference and perhaps use for another member in the farm. Click Next.


  11. The Pre-Requisite checks will run. Provided they pass, click Configure


Step 4: Configure Networking

  1. Add a DNS record to your internal network to point the public URL you specified in the SSL certificate to the internal address of the ADFS server. 
  2. If testing workplace join internally, add the appropriate enterprise registration DNS entry as well. 
  3. If you proceed further with an ADFS Proxy you will need to add an external DNS record at that time. 
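With the DnsServer module (2012 and higher), the records from steps 1 and 2 can be added from PowerShell; zone, host, and address are example values:

```powershell
# Point the public ADFS name at the internal ADFS server (example values)
Add-DnsServerResourceRecordA -ZoneName "companyname.com" -Name "fs" -IPv4Address 10.0.0.50

# Optional: Workplace Join discovery record
Add-DnsServerResourceRecordA -ZoneName "companyname.com" -Name "enterpriseregistration" -IPv4Address 10.0.0.50
```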

Step 5: Test and Troubleshoot


After installation you can perform a simple test by navigating to:

https://fs.(companyname).com/adfs/ls/idpinitiatedsignon

Where the fs.(companyname).com represents the URL specified in the DNS SAN/CN of the cert. This should present you with the following logon screen: 

All you need to do is click the Sign In button and you will receive a login prompt. If entered correctly and all is working you should get the message You are signed in. when completed. 



If you have any additional problems I can offer the following: 

  • The most interesting event logs are stored in Event Viewer->Applications and Services Logs->AD FS->Admin. 
  • If the configuration fails with a cryptic error message, ensure you don't have anything taking up port 80 or port 443. If either is taken the config routine crashes. 
  • If you need to re-do the installation for any reason that is fine, but make sure you overwrite the database. The easiest way to do so is to use the powershell script we mentioned above, but add the -OverwriteConfiguration switch. 
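As a sketch, the script saved from the Review Options page boils down to something like the following, and re-running it with the overwrite switch resets a failed install. The thumbprint, names, and account are examples from this article, not a definitive command line:

```powershell
# Re-run the farm configuration, overwriting the previous (failed) database
Install-AdfsFarm `
    -CertificateThumbprint "<thumbprint of the fs.companyname.com cert>" `
    -FederationServiceName "fs.companyname.com" `
    -FederationServiceDisplayName "My Awesome Company" `
    -GroupServiceAccountIdentifier "MYDOMAIN\GS_ADFS$" `
    -OverwriteConfiguration
```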

In Closing

Clearly this is just the beginning. The next logical step would be to setup an ADFS Proxy and then establish relationships with your SaaS providers. We'll have more on that in the future. To get started now, check these thoroughly vetted and highly appreciated links, now with 86% less sand.

Edit: Dan Salmon from the good folks at RBA Consulting sent over this excellent Microsoft link that lists many SSO ready SaaS offerings. A great place to start if you're looking...

ADFS

WindowServer: ADFS Overview
TechNet: Planning for Federation Server Capacity
TechNet Security Forum: ADFS SSL Certificates
ADFS Product Support Blog: SSL, Token Signing, and Client Authentication Certs
MSDN Social: ADFS Token Cert - 3rd Party Cert Required?
TechNet: Certificate Requirements for Federation Services
TechNet: Token Signing Certificates
TechNet: Federated Web SSO Example
TechNet: Installing the ADFS Web Agent component of ADFS
Jeffrey Schwartz: ADFS 2.0 Open Doors to the Cloud
Example in Action: ServiceNow ADFS/SAML config

Thanks for reading!

Tuesday, December 17, 2013

Monitoring Windows: Granular Service Rights in an Enterprise Environment


Foreword


I'm a big fan of Splunk and similar tools for network and systems monitoring. This new generation of machine data analysis brings a myriad of product and monitoring opportunities. These systems work by indexing large swaths of machine logging (logs, performance information, etc.) and providing an incredibly effective interface for reporting on that information. To achieve this, however, you need to provide very broad access to target systems.


On Windows platforms this can be difficult. While Splunk provides some great documentation, it suffers from the same access (and granting) problem as other platforms needing similar access. Generally this problem is solved in one of two ways:

  • Grant local administrator access on every machine to be monitored to a single service account. Clearly this won't fly in organizations concerned with granting the least amount of access needed for their application portfolio. A single service account/group that has administrator everywhere is a significant security risk. 
  • Use Group Policy to grant rights, ACLs, etc. The problem with this approach is that some things (WMI, for example) cannot be addressed by GPO at all, and user rights assignments are not granular: each right defined must list every account/group that receives it. Rights set via Group Policy are not additive; they overwrite all existing entries, including required system or other application entries. Say you need to define "act as part of the operating system" for your monitoring software; you would then need to centrally define every account/group that has that right, and any machine that needs that right defined differently would need an override GPO. While this is possible, many very large organizations find it difficult to execute, making central management of thousands of machines impractical.
Neither of these solutions is desirable in a medium to large sized enterprise due to security requirements and management complexities. To address this issue in a more granular way than the two options listed above, we can use scripts to automate local configuration where necessary.

I'll be using Powershell scripts to perform the following tasks:
  • Add Active Directory groups/users to local groups
  • Grant WMI rights to Active Directory groups/users
  • Grant local user rights to  Active Directory groups/users
Assuming the affected security principals are not set elsewhere, we can set these items with scripts that can be triggered via group policy. I'll be using the GPO startup script functionality, but you can feel free to use any other trigger mechanism you prefer (logon script, etc.). Note that this solution will work for products similar to Splunk that require similar rights.

Execution


First things first, we need to establish what we'll be doing and principals to which we want to grant access.

Assumptions


  • Splunk indexer (or your other monitoring tool) installed and ready to go
  • You have administrative rights on all machines in question and Active Directory rights sufficient to create the service account/group(s)
  • You would rather use the Splunk Indexer to collect information than install the Splunk Universal Forwarder on your target machines (also a good option). 
  • Powershell 2.0 or newer & Windows 2003 or newer. This could be done with batch/executables or VBScript but I won't be covering that.

Service Account/Group Setup


Per the Splunk instructions, create a service account to run Splunk. This will require administrative access on the machine running Splunk, but nothing else (yet). It would be best to use a managed service account but if you cannot a standard account will be fine.



Splunk's instructions call for an Active Directory group to contain the service account. This isn't really necessary, but there are potential advantages, so we'll stick to it.

After creating/configuring that service account, create the service account group and make your service account a member. For the purposes of our demonstration I've named this group "Splunk Accounts"



You should monitor this group to ensure no other accounts are added since doing so would grant those accounts unwanted rights.

Now let's configure services. The service account will need to be a local administrator on machines running the Splunk software (splunkd, splunkweb) directly (more on this another time). After granting local admin, stop the splunkd and splunkweb services, change the logon to your new service account, and start them back up.
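The service swap can be scripted as well; sc.exe handles the logon change since the native service cmdlets of that era could not. Account name and password are placeholders:

```powershell
# Stop Splunk, change the service logon to the new account, start it back up
Stop-Service splunkweb, splunkd

# Note the required space after each '=' in sc.exe syntax (example credentials)
sc.exe config splunkd  obj= "MYDOMAIN\svc_splunk" password= "ExamplePassword1!"
sc.exe config splunkweb obj= "MYDOMAIN\svc_splunk" password= "ExamplePassword1!"

Start-Service splunkd, splunkweb
```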

Now let's tackle the interesting part: granting granular rights to the service account group.

Objective 


Our objective is to grant the appropriate rights to the service account group on every machine in our enterprise that we wish to monitor. Per the documentation from Splunk, the minimum rights to collect Event Log and WMI data remotely on a Windows platform are: 

Misc

  • WMI rights under root/cimv2
  • DCOM launch permissions

User Rights

  • Access this Computer from the Network
  • Act as part of the operating system (I'd like to confirm this one; not sure why)
  • Log on as a batch job
  • Log on as a service
  • Profile System Performance
  • Replace a process level token (not sure about this one either)

Local Group Membership

  • Distributed COM Users
  • Event Log Readers
  • Performance Log Users

I'm sure you can see why the official documentation prefers local admin rights. :)  Fortunately, we've got automation to do our heavy lifting.

Scripts


Let's examine the scripts we'll be using. I've prepared four scripts for this task: 
  • Powershell script to grant WMI and DCOM rights (Grant-WMIACL.ps1)
  • Powershell script to grant User Rights (Grant-Rights.ps1)
  • Powershell script to add local group membership and tie the others together (Setup-Splunkuser.ps1)
  • Batch file to launch powershell script correctly on all platforms (SetupSplunkuser.bat)

Grant-WMIACL.ps1


As discussed, this script will grant the appropriate WMI and DCOM access. This is based heavily on the work of Steve Lee and Karl Mitschke. Note that I've left options in the script and commented out to change the level of access given to the WMI objects, including the ability to toggle inheritance. I've left these in to help you decipher the SDDL structure. That said, the line that is not commented out in this will work fine for our application. This script could be run remotely (hence the $strComputer="localhost" line) but we'll be using it locally (more on that below).


Grant-WMIACL.ps1 (Click to Expand)  + 

She's a beaut eh? Now for Grant-Rights.ps1:

This one is huge because we're using full .NET code in the beginning. That code is a combination of snippets from here, here, and here. Note the comments displaying the short names for some of the rights we can assign, which I've left in to facilitate applying this approach to systems other than Splunk. More can be found here.


Grant-Rights.ps1 (Click to Expand)  + 


Don't let the length of that code scare you off; the only part you need to understand is the last few lines. Now let's examine Setup-Splunkuser.ps1:

This script does a few things: defines the target for these privileges, adds it to a series of groups, and ties the other scripts together. The one thing you'll definitely want to modify is the entry at the top, $SplunkAcct="MYDOMAIN\Splunk Accounts". This should be the domain and group you created to host the service account(s). The next line, $localGroups="Distributed COM Users","Event Log Readers","Performance Log Users", defines the local groups to which you want to add the AD group, each in quotes and separated by commas. What I've set it to here is perfect for Splunk, but feel free to change it for your needs.
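To give a flavor of the group-membership portion, here's a sketch of the PowerShell 2.0-compatible ADSI technique a script like this typically uses. The variable values match the example above; this is not the full script:

```powershell
$SplunkAcct  = "MYDOMAIN\Splunk Accounts"   # example domain group
$localGroups = "Distributed COM Users","Event Log Readers","Performance Log Users"

# The WinNT ADSI provider wants the account path as domain/name
$adsiAcct = $SplunkAcct -replace "\\", "/"

foreach ($group in $localGroups) {
    # Add the AD group to each local group; swallow "already a member" errors
    try {
        ([ADSI]"WinNT://./$group,group").Add("WinNT://$adsiAcct,group")
    } catch { }
}
```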


Setup-Splunkuser.ps1 (Click to Expand)  + 

Now for the final piece to the puzzle: SetupSplunkuser.bat:

This script exists to provide compatibility with 2008 (not R2) and lower. Native Powershell scripts run correctly on 2008R2 and higher, but a batch launched script will run from 2003 up with no problem. This will be the script we'll use to kick everything off.

powershell.exe -nologo -noexit -file %~dp0\Setup-SplunkUser.ps1

Now that we have our scripts in place, we need to do the hard part. Run them everywhere we would like in the enterprise.

Running the Scripts on Target Machines


As discussed earlier, there are several ways to accomplish this. The solution I'll be using is Group Policy by utilizing the Startup Script functionality in the Computer portion of the policy. Setting up the group policy is easy, the difficult part is getting the scripts to run correctly, but we'll worry about that when we get there. 

Determine Your Target Machines


If you want to configure Splunk access rights across all machines in your domain you can plan on using the default domain policy GPO to roll out these changes. If you're reading this article, however, I suspect you want to be more targeted with your application of changes.

I recommend using security filtering against the desired groups of computers. If you don't already have groups you would like to use, create a security group in Active Directory and add your desired machines to it. To do this on a large scale, you can use something like this.

After you have your target group(s) set up, you need to configure the group policy you plan on using to execute the script. If this GPO doesn't yet exist, create it and apply the security filtering listed above. Edit the GPO using GPMC:

  1. After opening the GPO, navigate to "Computer Configuration->Policies->Windows Settings->Scripts (Startup/Shutdown)


  2. Select "Startup". 
  3. Click "Show Files..."; this will open a folder on your sysvol volume associated with this Group Policy Object. 


  4. After modifying accordingly (Setup-Splunkuser.ps1) copy all four scripts into the folder. These will be automatically replicated to all domain controllers in the domain. 


  5. Back on the "Startup Properties" screen, ensure the "Scripts" tab is selected (not Powershell Scripts) and click "Add". 
  6. Type or browse to SetupSplunkUser.bat. And click "OK" twice. 


This is now triggered to run on reboot for all impacted machines, but we're not done yet. We still have to navigate the oddly challenging world of running Powershell scripts in the Windows environment.

Running Powershell Scripts in a Distributed Environment - The Problem


The hurdle with this solution is part of a larger problem: by default, Powershell scripts are super scary and won't run in your environment for two reasons:

  • By default, running scripts requires those scripts be signed. 
  • By default, scripts will not run from the "internet", and for some reason the UNC path to the domain controller that may have just authenticated you is considered the "internet" by Windows. 

More here. There are two ways to solve this problem. 
  • Secure Way/PKI(Recommended): If you have an internal PKI you can sign all your scripts AND add the signer cert(s) to the trusted publisher store on all targeted machines in Active Directory. 
  • Not-So Secure Way/Disable Security: You can run scripts from your DC by setting the execution policy in Powershell to "Unrestricted" and adding your domain controller sysvol share to the trusted sites zone in Internet Explorer configuration. 
I will briefly discuss each solution. 

Secure Solution - PKI


This is the preferred and most secure method. While you could do this with certificates from a public authority, it is recommended that you use your own Active Directory integrated PKI for this. Here is a very high-level overview of the steps:

  1. Acquire a code signer cert from your PKI. If using a Windows PKI you can use the "Code Signing" template, but it really should be customized first. Export or save the public key (.cer) because we'll need it later. 
  2. Sign each script with the following code (substituting scriptname.ps1 with the location of your script): 

    #This assumes you have only one code signing cert
    #use Get-ChildItem cert:\CurrentUser\My -CodeSigningCert to see all and change the index if necessary
    $cert=@(Get-ChildItem cert:\CurrentUser\My -CodeSigningCert)[0]
    #Do this for each script and change scriptname.ps1 to the actual script name
    Set-AuthenticodeSignature .\scriptname.ps1 $cert

  3. Copy the scripts into the sysvol share as outlined above. 
  4. Edit the GPO you would like to use to distribute the script... the same one you're using to run the scripts above would be the perfect match. 
  5. Navigate to "Computer Configuration->Policies->Windows Settings->Security Settings->Public Key Policies->Trusted Publishers"
  6. Right click->Import and select your public key (.cer) corresponding to your code signing cert. 
Close the GPO and you should now be good. Once the certificate propagates to the target machines they'll run the signed scripts without issue. 

Alternative Solution A - Disable Security for this Script

  1. Edit the SetupSplunkUser.bat script and change the line to:

    powershell.exe -ExecutionPolicy Bypass -nologo -noexit -file %~dp0\Setup-SplunkUser.ps1

  2. Upload the file to your sysvol share as described above. 
  3. If you have issues running the other Powershell scripts that are called from the first, you may have to use some trickery: such as this.

Alternative Solution B - Disable Security Permanently

  1. Edit the GPO you would like to use to distribute the script... the same one you're using to run the scripts would be the perfect match. 
  2. Navigate to "Computer Configuration->Policies->Administrative Templates->Windows Components->Windows Powershell"
  3. Edit "Turn on Script Execution" and change it to "Enabled" and "Allow all scripts". Click OK and close the GPO. 
  4. Add your domain sysvol path (ensure you use your domain name, not your domain controller name) to the trusted sites zone: see here and here. OR change the UACAsInternet property; see here

That should do it! Remote eventlog, WMI, and most other types of remote collection should now work using Splunk.

Troubleshooting


If you have issues getting or determining if the scripts are running successfully, try the following:

  • Try executing SetupSplunkUser.bat from an elevated command prompt on one of the target machines. Note any issues. 
  • To audit running in a test group, you can add a line like "ran script Setup-SplunkUser.ps1" | Out-File "c:\temp\temp.txt" to create a log of the script execution on each machine. 
  • Script execution should be logged in the Application Log of each server. 
With this information you should be able to better control the service rights of Splunk or similar software in a large enterprise environment and still keep your auditors happy. For an added bonus, you could use Splunk itself for security auditing now that you're done. 


Sunday, December 8, 2013

Windows 2012 to 2012R2 In-Place Upgrade (Not recommended) Wipes out Network Teams, vSwitches


Preparing for an in-place upgrade. @ NFL I own the rights to this? :)

Word to the wise: if you plan on doing an in-place upgrade from server 2012 to 2012R2 be prepared to re-create all your OS level network teams. If you upgrade a Hyper-V host and your teams were used for switches, be ready to re-create those as well.


I don't recommend doing an in-place upgrade in the first place for reasons like this, but I figured I'd give it a shot on one of the hosts in my lab. Every OS level network team was wiped out, but easily recreated.


Other than this I haven't found any other in-place upgrade issues... yet, and that's why I'm always hesitant to recommend the practice. It's rare that you find out what the issues are until they become a problem.

Any issues you've encountered? Leave 'em below!

Monday, October 28, 2013

Windows 8.1 Exhibits Gaming Performance Deficit

Microsoft just released Windows 8.1, and with that I subjected the latest version of the operating system to some gaming performance testing. The results illustrate a small decline in the performance of Windows when moving from 8.0 to 8.1.



System as Tested


  • Asus z87-Pro Motherboard
  • Intel i7-4770K @ 4.7GHz core / 4.4GHz uncore
  • Corsair Dominator Pro 16GB (2x8GB) @ 2133, 10-11-11-27
  • Nvidia GTX 670 2GB @ 995/3204
    • ForceWare 320.49 on both 8.0 and 8.1
  • Sound Blaster ZxR Rendering in 5.1
  • "Upgrade Install" of 8.1 from 8.0. 8.0 was installed fresh. 
  • All drivers the same 
  • App/Service footprint nearly identical

Testing Methodology


For the last 13 years or so, I've used a benchmarking script to test gaming performance. The script performs automated testing and logs results from each game to a central file. It runs each game n times (i.e., as many as I would like) and then computes a median (rather than an average). I have added games to it periodically since its inception.
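The median-over-n-runs approach can be sketched as follows. This is a hypothetical illustration (the function name and data layout are my assumptions, not the actual script, which is a Windows batch/automation script):

```python
from statistics import median

def summarize_runs(results):
    """Collapse repeated benchmark runs into a per-game median FPS.

    `results` maps game name -> list of FPS scores from n runs; the
    median is used instead of the mean so one outlier run (a stutter,
    a background task) can't skew the reported number.
    """
    return {game: median(scores) for game, scores in results.items()}

# Illustrative scores only, not measured data
runs = {
    "Quake 3": [903.1, 904.4, 905.0],
    "Doom 3": [337.9, 338.0, 339.2],
}
print(summarize_runs(runs))  # {'Quake 3': 904.4, 'Doom 3': 338.0}
```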

I say all this because you'll notice there are a few old games listed. I have found including these results useful to judge CPU/Memory performance and identify bottlenecks specific to different APIs.

For these tests, each game was run at least three times (most more) for each combination of settings. Four setting combinations apply to most games: every pairing of 0x/4x anti-aliasing with 0x/16x anisotropic filtering. Normally these results would be split out, but for the purposes of this platform comparison they have been combined. Bioshock Infinite does not support AA, so it is unaffected by the AA setting. All tests are performed at 1600x1200, which maintains compatibility with the older titles while deviating from the 1080p effective pixel count by less than ten percent.
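The sub-ten-percent pixel count claim checks out with a little arithmetic:

```python
# Compare the test resolution's pixel count to 1080p's
test_res = 1600 * 1200   # 1,920,000 pixels
full_hd  = 1920 * 1080   # 2,073,600 pixels
deviation = (full_hd - test_res) / full_hd * 100
print(f"{deviation:.1f}% fewer pixels than 1080p")  # 7.4% fewer pixels than 1080p
```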

For the first set of results there were over 250 test runs. These runs have been compared in total between Windows 8.0 and Windows 8.1 on a per game basis to generate a % performance change.

Games tested as follows: (in order of age, oldest->newest)
  • Quake 3 (1999, OpenGL)
  • Comanche 4 (2001, D3D8)
  • Doom 3 (2004, OpenGL)
  • Serious Sam 2 (2005, OpenGL)
  • Company of Heroes (2006, DX10)
  • Crysis (2007, DX10, Very High settings)
  • Dirt 2 (2009, DX11, Ultra settings)
  • Alien vs. Predator (2010, DX11)
  • Bioshock Infinite (2013, DX11, Ultra settings)
  • Unigine Valley (2013, DX11)

Results 


As illustrated in the bar chart below, the results indicate that Windows 8.1 lags in performance when compared to Windows 8.0 original release.



| Game (all scores are medians of multiple runs) | 8.0 4xAA/16xAni | 8.1 4xAA/16xAni | % Diff | 8.0 0xAA/0xAni | 8.1 0xAA/0xAni | % Diff | 8.0 4xAA/0xAni | 8.1 4xAA/0xAni | % Diff | 8.0 0xAA/16xAni | 8.1 0xAA/16xAni | % Diff |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Quake 3 | 904.4 | 904.73 | 0.04 | 912.37 | 910.97 | -0.15 | 903.43 | 903.8 | 0.04 | 911.1 | 908.6 | -0.27 |
| Comanche 4 | 198.47 | 198.5 | 0.02 | 198.53 | 198.97 | 0.22 | 198.33 | 199.87 | 0.78 | 199.5 | 198.93 | -0.29 |
| Serious Sam 2 | 486.7 | 469.6 | -3.51 | 549.3 | 536.2 | -2.38 | 521.5 | 507.5 | -2.68 | 547.9 | 538.7 | -1.68 |
| Doom 3 | 338 | 334.7 | -0.98 | 473.2 | 472.1 | -0.23 | 339.8 | 337.7 | -0.62 | 468.5 | 472.7 | 0.90 |
| Company of Heroes (DX10) | 261.55 | 246.5 | -5.75 | 290.25 | 268.05 | -7.65 | 267.3 | 248.7 | -6.96 | 285.15 | 264.05 | -7.40 |
| Crysis Island (DX10) | 63.06 | 62.12 | -1.49 | 81.16 | 79.59 | -1.93 | 66.03 | 64.73 | -1.97 | 77.82 | 76.32 | -1.93 |
| Dirt 2 (DX11, Ultra) | 129.46 | 127.62 | -1.42 | 167.47 | 165.05 | -1.45 | 142.65 | 139.97 | -1.88 | 155.99 | 154.13 | -1.19 |
| Alien vs. Predator (DX11) | 63 | 62.2 | -1.27 | 110.3 | 108.6 | -1.54 | 65.4 | 64.6 | -1.22 | 104.6 | 103.3 | -1.24 |
| Bioshock Infinite (DX11, Ultra) | 100.4 | 99.29 | -1.11 | 99.73 | 99.11 | -0.62 | 99.53 | 97.93 | -1.61 | 100.2 | 98.48 | -1.72 |
| Unigine Valley (DX11) | 58.63 | 58.28 | -0.60 | 79.75 | 79.67 | -0.10 | 60.13 | 59.08 | -1.75 | 78.25 | 77.6 | -0.83 |
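The % Diff columns are a simple percent change from the 8.0 score to the 8.1 score, which you can verify against any row of the table:

```python
def pct_diff(old, new):
    """Percent change from the Windows 8.0 score to the 8.1 score."""
    return (new - old) / old * 100

# Quake 3 at 4xAA/16xAni: 904.4 -> 904.73
print(round(pct_diff(904.4, 904.73), 2))  # 0.04
# Company of Heroes at 0xAA/0xAni: 290.25 -> 268.05
print(round(pct_diff(290.25, 268.05), 2))  # -7.65
```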


While most percentages are relatively low, the trend is consistent, and the large sample size adds to the reliability of the data. I find this a bit troubling, since looking at the 8.1 changes reveals no obvious reason performance degradation should be introduced. Note there is still work to do; I'd like to get additional opinions, including ATI results.

Now let's take a look at Futuremark 3DMark Advanced Edition. These tests were run a couple of times each and averaged, using default settings. (1080p/720p/720p)





| 3DMark Test | Win 8.0 | Win 8.1 | % Diff |
|---|---|---|---|
| Fire Strike | 6560 | 6502 | -0.88 |
| Cloud Gate | 25138 | 24331 | -3.21 |
| Ice Storm | 179351 | 176476 | -1.60 |


Again, consistent downturns in performance.

This is where I was going to publish this article, but as my finger hovered over the "Publish" button I had one lingering concern: What if I'm wrong? What if I missed a setting or something in the upgrade that might have this impact?

So Then We Did it Over Again


I wouldn't feel right publishing this article without doing a clean install and establishing absolute parity across all operating systems involved. For the sake of completeness, I included a Windows 7 install as well. What I found stunned me... 8.1 isn't the start of a downturn, it's a continuation. On a smaller subset of tests I performed the same number of runs (an average of 14 runs per title). The operating systems/drivers were as follows:

  • Windows 7 SP1 fully patched, clean install
  • Windows 8 fully patched, clean install
  • Windows 8.1 fully patched, clean install
  • Nvidia driver version 327.23 on all platforms
Here are the results using Windows 7 as the baseline:
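Normalizing against a baseline OS works like this (the scores below are illustrative only, not the measured results):

```python
def normalize(scores, baseline="Windows 7"):
    """Express each OS's score as a percentage of the baseline result."""
    base = scores[baseline]
    return {os_name: round(s / base * 100, 1) for os_name, s in scores.items()}

# Hypothetical FPS numbers for one title
scores = {"Windows 7": 63.0, "Windows 8": 62.1, "Windows 8.1": 61.4}
print(normalize(scores))  # {'Windows 7': 100.0, 'Windows 8': 98.6, 'Windows 8.1': 97.5}
```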



As you can see, Windows 8.1 lags behind Windows 8 and Windows 8 lags behind Windows 7. This is a troubling trend. Here's 3dMark (at least 3 runs per test):



The "Fire Strike" test gives us the only real hope of progress in Windows 8, but Windows 8.1 gives that gain right back. Overall, more troubling results.

This, coupled with the lingering mouse and input issues, makes for a bit of a shaky upgrade to 8.1 from a gaming perspective. As I outlined in my previous article, I believe it's time for Microsoft to step up their game (hah!) in light of new market pressures. This trend is unfortunate, since a lot of minor changes in 8.1 (UI, etc.) are a step in the right direction. Let's hope they iron out this performance issue quickly.

Update (11/4/2013): It looks as if Microsoft has acknowledged the mouse issue, which I expect means a fix is coming. Let's hope performance and other issues are addressed as well.

Update (11/9/2013): Microsoft has released a patch to deal with mouse issues.