Whilst moving over to our new TFS 2012 system I have been editing build templates, pulling the best bits from the selection of templates we used in 2010 into one master build process to be used for most future projects. In doing this I hit a couple of problems; it turns out the cure is the same for both.
Problem 1: When adding custom activities to the toolbox, Visual Studio crashes
See the community activities documentation for the process to add items to the toolbox; when you get to the step to browse for the custom assembly, you get a crash.
Problem 2: When editing a process template in any way, the process is corrupted and the build fails
When the build runs you get the following error (amongst others):
The build process failed validation. Details:
Validation Error: The private implementation of activity '1: DynamicActivity' has the following validation error: Compiler error(s) encountered processing expression "BuildDetail.BuildNumber".
Type 'IBuildDetail' is not defined.
It turns out the issue that caused both these problems was that the Visual Studio class library project I was using to host the XAML workflow for editing was targeting .NET 4.5, the default for VS2012. I changed the project to target .NET 4.0, rolled the XAML file back to an unedited version, reapplied my changes and all was OK.
Yes, I know it is strange, as you never build the containing project, but the targeted .NET version is passed around Visual Studio for building lists and the like, hence the problem.
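The fix amounts to a one-line change in the project file that hosts the XAML. A minimal sketch of the relevant fragment (the surrounding context is illustrative; `TargetFrameworkVersion` is the standard MSBuild property VS2012 sets):

```xml
<!-- Fragment of the class library .csproj hosting the build XAML -->
<PropertyGroup>
  <!-- Was v4.5 (the VS2012 default); change to v4.0 so the
       TFS 2012 build assemblies resolve correctly when editing -->
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
</PropertyGroup>
```

You can make the same change through the project properties page (Application > Target framework) rather than editing the file by hand.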
I am pleased to say that Black Marble will be doing a session at the Leeds Sharp user group on the evening of the 30th of August on the Samsung SUR40 with Microsoft PixelSense (what used to be called a Surface 2).
Should be an interesting session as we will be bringing one of our rental units along for everyone to have a go on.
DDD10 registration opens today and, from past experience, will probably close about 10 minutes later (server overloads allowing). If you can’t make the date, or don’t fancy the trip down south to TVP, remember that DDDNorth in October at Bradford University has many of the same speakers and is located in a far more convenient Yorkshire location.
Whilst moving all our older test Hyper-V VMs into a new TFS 2012 Lab Management instance I have had to address a few problems. I have already posted about the main one, cross domain communications. This post aims to list the other workarounds I have used.
MTM can’t communicate with the VMs
When setting up an environment that includes existing VMs it is vital that the PC running MTM (Lab Center) can communicate with all the VMs involved. The best indication I have found that you will not have problems is a simple Ping. If you are creating a SCVMM environment you need to be able to Ping the fully qualified machine name as it has been picked up by Hyper-V, e.g. server1.test.local. If creating a standard environment you only need to be able to Ping the name you specify for the machine, e.g. server1 or maybe server.corp.com.
If Ping fails then you can be sure that the MTM create environment verify step will also fail. The most likely reasons both are failing are:
- There are DNS issues: the VM names are missing, leases have expired, they are not in the domains expected or they are just plain wrong. I found the best solution for me is to edit the local hosts file on the PC running MTM. Just add the name and fully qualified name along with the correct IP address. You should then be able to Ping the VM (unless there is a firewall issue, see below). The hosts file entries are only needed on the MTM PC whilst the environment is created; once the environment is set up they can be removed.
- File and print sharing needs to be opened through the firewall on the VM (Control Panel > Firewall > Allow applications through firewall).
- Missing or out-of-date Hyper-V integration services on the VM. This only matters if a SCVMM environment is being created, as this is how the fully qualified name is found. This is best spotted in MTM, as you get an error on the Machine properties tab. The fix is to reinstall the integration services via Hyper-V Manager (Actions > Insert Integration Services Disk, and maybe run the setup on the VM if it does not start automatically).
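The hosts file workaround from the first bullet above amounts to a couple of lines. The IP address and machine names here are hypothetical examples; on the MTM PC the file lives at C:\Windows\System32\drivers\etc\hosts:

```
# Temporary entries while the Lab Management environment is created
192.168.10.21   server1
192.168.10.21   server1.test.local
```

Remember you need to edit the file as an administrator, and the entries can be deleted once the environment is up and running.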
Can’t see a running VM in the list of available VMs
When composing an environment from running VMs, one problem I had was that though a VM was running it did not appear in the list in MTM. This turned out to be because the VM had metadata associating it with a different environment (in my case one dating back to our TFS 2010 instance).
This is easy to fix: in SCVMM or Hyper-V Manager open the VM settings and make sure the name/note field (red box below) is empty.
Once the settings are saved you will have to wait a little while before SCVMM picks up the changes and lets your copy of MTM know the VM is available.
I don’t know about your systems, but historically we have had VMs running in test domains that are connected to our corporate LAN, allowing our staff and external testers to access them from their development PCs or through our firewall after providing suitable test domain credentials. These test setups are great candidates for the new TFS 2012 Lab Management feature, standard environments. It does not matter if they are hosted as physical devices, Hyper-V or VMware.
However, the use of separate domains raises issues of cross domain authentication, irrespective of the virtualisation technology. It is always a potentially confusing area. If we want to use the deployment and testing features of Lab Management, what we need to achieve is a Test Agent on each VM that talks to a Test Controller registered to a TFS Team Project Collection. Not too easy when spread across multiple domains.
With TFS 2012 the whole process of getting agents to talk to their controller was greatly eased. Lab Management does it for you much of the time if you provide it with a corp\tfslab domain account that is a member of the Project Collection Test Service Accounts group in TFS.
The scenarios can be summarised as follows:
| Scenario | How to achieve it |
| --- | --- |
| Your test VMs are in either a SCVMM managed or standard environment but are joined to your corp domain | Lab Management wires it all up automatically using your corp\tfslab account |
| Your test VMs are in either a SCVMM managed or standard environment that is not domain joined, i.e. just in a workgroup | Lab Management wires it all up automatically using your corp\tfslab account |
| Your test VMs are in a SCVMM managed network isolated environment | Lab Management wires it all up automatically using your corp\tfslab account |
| Your test VMs are in either a SCVMM managed (not network isolated) or standard environment and are in their own test domain | You have to do some work |
If, like me, you end up with the fourth scenario, the key is to provide a test controller within the test domain. This must be configured to talk back to TFS on the corp domain. This can all be done with local machine accounts on the test controller and TFS server with matching names and passwords, what I think of as shadow accounts.
So, for example, we have the following scenario of a corp domain with a DC and various TFS servers and controllers, and a test domain containing three servers.
So the process to get the test agents on the test domain talking to TFS on the corp domain is as follows:
- On the TFS server (called tfsserver.corp.com in the above graphic):
  - Open Control Panel > Computer Management and create a new local user called tfslabshadow. Set the password, and tick that the user does not need to change it at first login and that it does not expire.
  - In the TFS administration console add the new user tfsserver\tfslabshadow to the Project Collection Test Service Accounts group.
- On a machine (called server.test.local in the above graphic) within the test domain (note this can be any VM running Windows other than the DC):
  - Open Control Panel > Computer Management and create a new local user called tfslabshadow with the same password as the matching account on the tfsserver.
  - Add this user to the local Administrators group on that server.
  - Log in as this user.
  - Install the Visual Studio 2012 Test Controller.
  - When the installation is complete the configuration tool will launch. Set the service to run as tfslabshadow and register it to connect to the TFS server with this account too.
  - Note 1: When you first load the configuration tool you need to browse for the TFS server and enter its URL. If your shadow accounts are working correctly you should not need to enter any other credentials at this point.
  - Note 2: You can enter the local user name in either the .\tfslabshadow or server\tfslabshadow format.
  - If you have all the settings correct then you should be able to apply the changes without any errors and the new test controller will be registered. Any errors at this point are usually fairly clear; you probably forgot to place a user in some group somewhere.
- From a PC running Test Manager 2012 (MTM) on the corp domain:
  - Go into the Lab Center.
  - Create a new environment (SCVMM or standard) containing the machines in the test domain (or open an existing environment if you have one that was not correctly configured).
  - On the Advanced tab you should be able to select the new test controller that is hosted within the test domain.
  - Make any other setting changes you require (remember, on the Machines tab, to enter the test domain login credentials; they will have defaulted to your current ones). When you are done, select Verify. I had problems here due to DNS entries: from the PC running MTM I could ping server, but MTM was trying to communicate using the name server.test.local. To get around this I added an entry to my local hosts file. I have also seen VMs that are not registered in DNS at all; again, a local hosts file entry fixes the problem. This is only required for the initial verification and deployment/configuration; once that is done the hosts entries can be removed if you want.
  - Once verification passes, save the changes; after a short wait the environment should finish configuring itself, showing no errors.
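Before running the Verify step it is worth checking that every name MTM will use actually resolves from the PC running MTM. A small sketch of that pre-flight check (the machine names in the example are hypothetical; substitute the names from your own environment):

```python
# Pre-flight DNS check for a Lab Management environment: confirm each
# candidate machine name resolves from this PC before MTM's Verify step.
import socket


def check_names(names):
    """Return a dict of name -> resolved IP address, or None where lookup fails."""
    results = {}
    for name in names:
        try:
            results[name] = socket.gethostbyname(name)
        except socket.gaierror:
            # Name did not resolve; a hosts file entry (or a DNS fix) is needed
            results[name] = None
    return results


if __name__ == "__main__":
    # Hypothetical example: the short and fully qualified names MTM will try
    for name, ip in check_names(["server", "server.test.local"]).items():
        print(f"{name}: {ip if ip else 'FAILED - add a hosts entry or fix DNS'}")
```

Any name that reports a failure here will also fail MTM's verification, so fix it first.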
So I hope I have provided a step-by-step guide to help you get around issues with cross domain testing in Lab Management. However, it is still important to remember the exceptions:
- As we are using local machine accounts, you cannot have the TFS server or the test controller running on a domain controller (a DC cannot have local machine accounts). If your environment is a single box that is a DC then you either have to set up a cross domain two-way trust between test and corp, or rebuild the environment as a workgroup or network isolated environment.
- The shadow account cannot have the same name as the corp\tfslab account, e.g. tfslab. If you try to use the same name, the resolution of the two local machine accounts will fail, as at the TFS server end it will not be able to decide whether to use corp\tfslab or tfsserver\tfslab.
For more details on this general area see MSDN
On site recently I had a problem where I could not access the Site Settings in Reporting Services if I used Internet Explorer from a client PC. IE worked fine on the server and other browsers were OK on the client; just not IE. Initially I thought it was just rights, but that was not the case.
It turns out this is down to Kerberos negotiation, as discussed in the MSDN article. To fix the issue on this site, where we did not need Kerberos, we just disabled Kerberos negotiation in the [Program files]\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services\ReportServer\RSReportServer.config file, e.g.
<RSWindowsKerberos /> <RSWindowsNTLM />
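For context, those elements sit inside the AuthenticationTypes section of the file. A sketch of the edited section as I recall it (check against your own RSReportServer.config before editing), with RSWindowsNegotiate removed to stop the Kerberos negotiation:

```xml
<Authentication>
  <AuthenticationTypes>
    <!-- RSWindowsNegotiate removed to disable Kerberos negotiation -->
    <RSWindowsKerberos />
    <RSWindowsNTLM />
  </AuthenticationTypes>
</Authentication>
```

Restart the Reporting Services service after saving the change.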
If you need Kerberos you will need to sort out the SPNs as detailed in the MSDN post.
I am really happy to say that I have had my Microsoft MVP for Visual Studio ALM re-awarded; it is a privilege to get to work with such a great group of people as I have met via the MVP programme.