
Saturday, 31 October 2009

My Ultimate .NET Development Tools 2010 Edition

Here is my 2010 updated list of development tools that I prefer to use when doing .NET development.  I have deliberately not included any third-party control/report libraries, focusing instead on the tools that help me craft high-quality code quickly and effectively.

Categories

  • IDE = Develop/generate/refactor code within the VS IDE or separate IDE
  • SCM = Software Configuration Management (Source Control etc.)
  • TDD = Test Driven Development
  • DBMS = Database Management Systems
  • CI = Continuous Integration
  • FR = Frameworks (Persistence, AOP, Inversion of Control, Logging etc.)
  • UT = Utility Tools
  • CA = Code Analysis (Static + Dynamic)
  • TC = Team Collaboration (Bug tracking, Project management etc.)
  • MD = Modelling
  • QA = Testing Tools
  • DP = Deployment (Installations etc.)

 

Tools

* = free/open source
  1. [IDE] Visual Studio 2010 Premium Edition
  2. [IDE] ReSharper for refactoring, running unit tests and so much more
  3. [IDE] CodeSmith for generating code.  Also consider T4 with Clarius’s Visual T4 Editor.  
  4. [IDE]* GhostDoc for inserting xml code comments
  5. [IDE] Altova Xml Suite for any xml related work.  XmlPad is the best, free alternative I know of.
  6. [DBMS] SqlServer 2008 for DBMS
  7. [SCM]* Subversion for source control
  8. [SCM]* TortoiseSVN as windows shell extension for Subversion
  9. [SCM] VisualSVN for integration of TortoiseSVN into VS.  AnkhSVN is the best, free alternative I know of.
  10. [SCM]* KDiff3 for merging
  11. [TDD]* NUnit as preferred xUnit testing framework
  12. [TDD]* Moq as mocking framework.
  13. [TDD] NCover for code coverage stats
  14. [CI]* TeamCity as build server
  15. [CI]* MSBuild Extension Pack for additional MSBuild tasks.
  16. [FR]* log4net as logging framework.  Also see Log4View for an excellent UI for the log files.
  17. [FR]* ANTLR and ANTLRWorks for creating custom DSLs.
  18. [FR] PostSharp as Aspect Oriented Programming framework
  19. [FR]* Ninject as IoC container
  20. [FR]* RunSharp for generating IL at run-time
  21. [FR] MindScape LightSpeed as my Object-Relational-Mapper.  NHibernate is the best free alternative I’m aware of. 
  22. [UT]* Reflector to drill down to the guts of any code library (also check-out the nice plug-ins)
  23. [UT] Silverlight Spy to dissect any Silverlight application.
  24. [UT] RegexBuddy for managing those difficult regular expressions.  Regulator is the best, free alternative I know of. 
  25. [UT]* LINQPad as an easy way to query SQL databases using LINQ and as a general scratchpad application to test C#/VB.NET code snippets.
  26. [UT]* Fiddler to debug all your HTTP traffic in IE.   Also see the neXpert plugin for monitoring performance problems.
  27. [UT]* Firebug to assist with testing web applications running in Firefox. Also see YSlow add-on for performance testing and Web Developer add-on for additional Firefox web development tools.
  28. [CA]* FxCop to enforce .NET coding guidelines
  29. [CA] NDepend to get all the static code metrics I'd ever want
  30. [CA] ANTS Profiler for performance and memory profiling
  31. [MD] Enterprise Architect to do UML Modelling and Model Driven Design if required. Alternatively use Visio with these simple templates
  32. [MD]* FreeMind as mind mapping tool
  33. [TC]* ScrewTurn Wiki for team collaboration
  34. [QA]* Eviware soapUI for functional and load testing of SOA web services
  35. [QA]* Telerik WebAii Testing Framework for automated regression testing of Web 2.0 apps
  36. [DP]* Windows Installer XML (WiX) for creating Windows Installers

Migrating blog onto Blogger

I am in the process of migrating my old blog onto Blogger, as the current host has become extremely slow and is attracting a lot of spam.  Those of you using my FeedBurner subscription won’t have to do a thing, as I’ve rerouted it to my new Blogger site.  I’ll be re-posting some of the most popular content from my old blog to ensure the continuity of the information going forward.  Sorry for the inconvenience :-|

Integrating your Silverlight Test Run Results into TeamCity

We’ve been using TeamCity with great success at our company to do continuous integration.  We have build configurations defined for building and deploying our On Key suite of software;  for running our suites of tests and also for generating our static code analysis metrics. 

Problem

One of the problems we faced was integrating the Silverlight UI test results generated by the Microsoft Silverlight Test Framework into TeamCity.  TeamCity comes with out-of-the-box support for automatically picking up and displaying test results in the NUnit and MSTest report formats.  However, the Microsoft Silverlight Test Framework works differently from NUnit and MSTest in that it runs as a Silverlight application, so the usual Silverlight security sandbox restrictions apply to it as well.  This makes it harder to get the test results out of the framework, as the results are not written to a file the way NUnit or MSTest results are.  Fortunately, TeamCity allows you to integrate the test results of any framework by writing special service message commands as part of your build output.  TeamCity listens for these commands and interprets and displays the information as part of your build results on the portal.
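For reference, these service messages are just specially formatted lines in the build output.  The test-reporting ones look roughly like the following (the test names here are made up; the exact attributes are described in the TeamCity documentation):

```
##teamcity[testSuiteStarted name='Silverlight UI Tests']
##teamcity[testStarted name='StaffTests.CanCreateStaffMember']
##teamcity[testFailed name='StaffTests.CanCreateStaffMember' message='Expected 1 but was 0' details='stack trace']
##teamcity[testFinished name='StaffTests.CanCreateStaffMember' duration='125']
##teamcity[testSuiteFinished name='Silverlight UI Tests']
```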

A Solution That Works

So we initially solved the problem by automating the test run and screen scraping the results from the Silverlight test run page.  From the screen-scraped results we created TeamCity service messages and wrote these messages as part of our build output.  I got the screen-scraping idea from this blog post by Jonas Follesoe.  This worked, but it was a clumsy solution at best: parsing the HTML and looking for specific divs to find out whether the tests had failed was very error-prone.  We also had no guarantee that the HTML format of the test report would stay the same between subsequent releases of the Silverlight Test Framework.  The real show-stopper, however, was an issue we ran into with TeamCity when tests failed.  The test exception would contain characters that are invalid in XML, so the XML-RPC communication between the build agent and the TeamCity server would fail due to the invalid characters in the XML stream.  This left the build in a loop, as the TeamCity server was not able to pick up any response from the build agent it was trying to run the build on.

A Solution That Rocks

Instead of trying to panelbeat the existing solution, I decided to investigate alternatives and started by using trusty Reflector to look at the source code of the Silverlight Test Framework.  I wanted to see whether there was a more elegant way to report on the test results.  (Btw, you don’t need to use Reflector – the source code of the Silverlight Test Framework is included as part of the Silverlight Toolkit.)  Sure enough, I discovered that the framework already included the necessary extensibility points for plugging in your own reporting mechanism.   The LogProvider base class provides the core API for creating your own logger, which the Silverlight Test Framework will then call into for you to process the test results.  There is also a TestReportService that seems like the mechanism to use to write the log output by exporting it via a service call.

So the idea was to implement my own TeamCityServiceMessageLogProvider that would write the test results as service messages that TeamCity understands.  The implementation turned out to be quite straightforward (download from here).  The logger opens an IsolatedStorageFileStream and simply writes the test results it receives to the stream.  You plug the logger into the Silverlight Test Framework by adding it as an additional LogProvider in your App startup:

private void Application_Startup(object sender, StartupEventArgs e)
{
   AppSettings.Initialize();

   UnitTestSettings settings = UnitTestSystem.CreateDefaultSettings();
   settings.LogProviders.Add(new TeamCityServiceMessageLogProvider());

   RootVisual = UnitTestSystem.CreateTestPage(settings);
}
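The invalid-character problem from the screen-scraping attempt is worth guarding against inside the logger as well: TeamCity requires certain characters in service message values to be escaped.  Here is a minimal sketch of such a helper (the class and method names are my own, not part of any framework; the escape rules come from the TeamCity documentation):

```csharp
// Hypothetical helper for emitting TeamCity service messages with properly
// escaped values, so that exception text containing quotes, pipes or
// newlines cannot corrupt the build log.
public static class ServiceMessage
{
    public static string Escape(string value)
    {
        if (value == null) return string.Empty;
        return value
            .Replace("|", "||")   // the escape character itself: must be done first
            .Replace("'", "|'")
            .Replace("\n", "|n")
            .Replace("\r", "|r")
            .Replace("[", "|[")
            .Replace("]", "|]");
    }

    public static string TestFailed(string name, string message)
    {
        return string.Format("##teamcity[testFailed name='{0}' message='{1}']",
                             Escape(name), Escape(message));
    }
}
```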
So this took care of getting the results in the right format, but I still had to figure out how to get the results out of the Silverlight Test Framework into TeamCity.

Some further reflectoring showed that these extensibility areas also seemed to already exist within the Silverlight Test Framework, but I was unable to get them wired up and working correctly.  So I resorted to implementing my own solution by using a WebClient to upload the streamed results onto our web server for reporting to TeamCity. The first thing I had to do was to hook into the Publishing event of the Test framework to allow my custom logger to upload the results.  For this I had to implement the ITestSettingsLogProvider interface that provides an Initialize method that will be invoked by the Silverlight Test Framework.  I then hooked into the Publishing event as follows:

public void Initialize(TestHarnessSettings settings)
{
  Settings = settings;
  UnitTestHarness testHarness = Settings.TestHarness as UnitTestHarness;
  if (testHarness != null)
  {
    testHarness.Publishing += ((sender, e) => PostTestResults());
  }
  
  Store = IsolatedStorageFile.GetUserStoreForApplication();
  Stream = Store.CreateFile(LogFile);
  Writer = new StreamWriter(Stream);
}
In the PostTestResults method, I upload the results onto the server:
private void PostTestResults()
{
   WebClient client = new WebClient();
   client.OpenWriteCompleted += (sender, e) =>
                                {
                                    Stream input = e.UserState as Stream;
                                    Stream output = e.Result;

                                    byte[] buffer =  new byte[4096];
                                    int bytesRead = 0;
                                    input.Position = 0;
                                    while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
                                    {
                                        output.Write(buffer, 0, bytesRead);
                                    }
                                    output.Close();
                                    input.Close();
                                };

   Writer.Flush();
   client.OpenWriteAsync(new Uri("http://localhost/OK52/UnitTest.aspx?Results=true", UriKind.Absolute), "POST", Stream);
}
Notice the use of the Results=true query string parameter.

On the server side, we have a UnitTest.aspx web page that we use as the start page for our Silverlight tests.  A nice feature that I picked up from another excellent Jonas Follesoe post is that you can pass query string parameters on to the test page and use these as initialization parameters for your Silverlight application.  We use this to restrict the suite of tests being run via the Tag feature of the Silverlight Test Framework – e.g. http://localhost/OK52/UnitTest.aspx?Tag=Staff to run only the Staff tests.  We also use this to configure the user that logs on to the system – e.g. http://localhost/OK52/UnitTest.aspx?Tag=Staff&UserName=Carel&Password=secret.  When reporting the test results, we post to the same UnitTest.aspx page but send through the Results=true query string parameter to indicate that we want to upload the test results onto the server.  The results are then written to a file on the server for further processing, as illustrated below:

public void Page_Load(object sender, EventArgs e)
{
  if (string.IsNullOrEmpty(Request.QueryString["Results"]))
  {
    if (!string.IsNullOrEmpty(Request.QueryString["Tag"]))
    {
      Tests.InitParameters = "Tag=" + Request.QueryString["Tag"] + ",";
    }
    if (!string.IsNullOrEmpty(Request.QueryString["UserName"]))
    {
      Tests.InitParameters += "UserName=" + Request.QueryString["UserName"] + ",";
    }
    if (!string.IsNullOrEmpty(Request.QueryString["Password"]))
    {
      Tests.InitParameters += "Password=" + Request.QueryString["Password"] + ",";
    }
    if (!string.IsNullOrEmpty(Request.QueryString["DB"]))
    {
      Tests.InitParameters += "DB=" + Request.QueryString["DB"] + ",";
    }
  }
  else
  {
    StreamReader inStream = new StreamReader(Context.Request.InputStream);
  
    string filePath = Server.MapPath(@"~\Logs\SLTests.log");
    FileStream outstream = File.Open(filePath, FileMode.Create, FileAccess.Write);
    
    // Read from the input stream in 4K chunks and save to the output stream.
    // Note: we read characters, so we must write the number of bytes the
    // UTF-8 encoding actually produces, not the number of characters read.
    const int bufferLen = 4096;
    char[] buffer = new char[bufferLen];
    int charsRead = 0;
    while ((charsRead = inStream.Read(buffer, 0, bufferLen)) > 0)
    {
      byte[] bytes = Encoding.UTF8.GetBytes(buffer, 0, charsRead);
      outstream.Write(bytes, 0, bytes.Length);
    }
    outstream.Close();
    inStream.Close();
  }
}          
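On the Silverlight side, the InitParameters built up by the page arrive in Application_Startup via the StartupEventArgs.InitParams dictionary.  A rough sketch of how the Tag parameter could be applied when creating the test page (TagExpression is the framework's tag-filtering property; the exact wiring of the other parameters into our AppSettings is omitted here):

```csharp
private void Application_Startup(object sender, StartupEventArgs e)
{
    UnitTestSettings settings = UnitTestSystem.CreateDefaultSettings();
    settings.LogProviders.Add(new TeamCityServiceMessageLogProvider());

    // e.InitParams contains the name/value pairs from Tests.InitParameters.
    string tag;
    if (e.InitParams.TryGetValue("Tag", out tag))
    {
        settings.TagExpression = tag;  // run only the tests carrying this tag
    }

    RootVisual = UnitTestSystem.CreateTestPage(settings);
}
```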
Once the file has been uploaded, it is a simple matter of echoing the test results back to the TeamCity server by writing it to the NUnit TestContext: 
[TestFixture]
[Category("UI")]
public class RunOnKey5ClientIntegrationTests : BaseTest
{
  private const string OnKeyUri = "http://localhost:80/OK52/UnitTest.aspx";
  private const string LogPath = @"..\..\..\Pragma.OnKey5.Server\Logs\";
  private const string SilverlightTestsLog = "SLTests.log";
  
  [Test]
  public void RunSilverlightTests_UsingTeamCityServiceMessageLogger_ToShowResultsOnTeamCityPortal()
  {
    // Clear the old results
    if (File.Exists(LogPath + SilverlightTestsLog))
    {
      File.Delete(LogPath + SilverlightTestsLog);
    }
  
    using (FileSystemWatcher watcher = new FileSystemWatcher(LogPath, SilverlightTestsLog))
    {
      Manager.LaunchNewBrowser();
  
      // Navigate to the VDir hosting the tests
      ActiveBrowser.NavigateTo(new Uri(OnKeyUri));
  
      watcher.EnableRaisingEvents = true;
      WaitForChangedResult result = watcher.WaitForChanged(WatcherChangeTypes.Created);
      if (!result.TimedOut)
      {
          // Echo the TeamCity service messages so that the test results can be picked up by the portal
          string[] lines = File.ReadAllLines(LogPath + SilverlightTestsLog, Encoding.UTF8);
          foreach (string line in lines)
          {
              TestContext.Out.WriteLine(line);
          }
      }
    }
  }
}
You’ll notice that the whole test run is managed as an NUnit test fixture that is flagged with “UI” as a Category.  This allows me to use nunit-console.exe to run just these tests via the /include=UI command line parameter.  I use the FileSystemWatcher class to wait until the SLTests.log file has been published and then echo the results to TeamCity by writing them to the NUnit TestContext.
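The TeamCity build step then simply invokes the console runner against the assembly containing the fixture; something along these lines (the assembly name here is illustrative):

```
nunit-console.exe /include=UI Pragma.OnKey5.IntegrationTests.dll
```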

Conclusion

I hope you’ll find this information useful for publishing the results of your own Silverlight test runs during your automated builds.  Jeff Wilcox has mentioned that more of the build automation extensibility points of the Silverlight Test Framework will be made known in future releases.  Until then, this is one way of doing it.