
Tuesday, 30 November 2010

Pragma On Key – Release 3 Milestone Reached

I’m watching the final TeamCity daily build as I type this, and I’m super excited about the next major milestone that we’ve accomplished for Pragma On Key, our company’s Enterprise Asset Management System (EAMS). It’s been 30 months of hard work to rewrite and extend the system using Silverlight and .NET. We started off as early adopters with the Silverlight betas and .NET 3.5, and with the latest build we are using Silverlight 3 and .NET 4.0 as the technology platform. We evaluated Silverlight 4, but due to some unresolved memory issues in the framework we decided to stick with Silverlight 3 for now. We hope to upgrade to Silverlight 4 soon hereafter.

I’m very proud of the way the team pulled through all the difficult times to end up with what I feel is a solid platform to build on for the future. As always, there are still a lot of things we can and will improve on going forward, but I think we have come a long way since the start of the project. We adopted a new SDLC (Scrum), learned a completely new technology skill set (C#, Silverlight, TDD, CI, etc.) and close to doubled the team size in the process. Currently we have 2 Analysts, 5 Testers, 13 Developers (me included) and a Software Development Manager. I can honestly say that within the 2.5 years we had only one or two isolated incidents with the team dynamics. I think this is a great testament to the quality of the people working here at Pragma.

We are now entering the final phases of testing the new release of On Key at a big international client who is planning to go live with the system at the end of February 2011. So after more than 20 000 CI builds, 230 tested builds, 26 820 commits and just more than 9 000 unit and integration tests, it is great to finally see the new version of On Key coming of age! We’ve got some really nifty features included in this release and I think customers will enjoy the new web-based interface as well. I hope to be blogging a bit more about this and our internal development environment in the future. Well done to all involved!


Sunday, 21 November 2010

TransactionScope Fluent Interface

During the past 2 weeks I’ve been spending a lot of time investigating the performance of Pragma On Key, our company’s Enterprise Asset Management System (EAMS). Part of the investigation was to resolve some issues we were encountering with deadlocks occurring for some of the longer-running transactions. We are using the TransactionScope class added in .NET 2.0 to mark blocks of code in our Application Controllers as participating in a transaction. Out of the box, the default IsolationLevel used by a new TransactionScope instance is Serializable. This ensures the best data integrity by preventing dirty reads, non-repeatable reads and phantom rows. However, all of this comes at the cost of concurrency, as the range lock acquired by the DBMS prevents other transactions from updating or deleting rows in the range. So I set off to carefully consider the locking requirements of the different transactions to make sure we acquire the right level of locking for each of them.

I’ve always found the different settings of the TransactionScope class to be rather confusing. I always need to remind myself what the different items of the IsolationLevel and TransactionScopeOption enumerations actually mean. There is also the tricky scenario where, when passing in a new TransactionOptions instance to set a different IsolationLevel, you need to remember to set the transaction Timeout value if you are using a custom Timeout setting in your web.config file. So I decided to create a TransactionScopeBuilder class with a fluent interface to construct new TransactionScope instances, with the idea of hiding this complexity and providing a more intent-revealing interface for what the actual TransactionScope will do. I’ll start by showing some example usages of the builder and then conclude with the source code for the builder itself.
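To give a flavour of the idea, here is a condensed sketch of the shape such a builder can take. The method names and defaults below are illustrative only, not the exact On Key implementation:

using System;
using System.Transactions;

// Hypothetical sketch of a fluent TransactionScope builder (illustrative names and defaults).
public class TransactionScopeBuilder
{
    private IsolationLevel _isolationLevel = IsolationLevel.Serializable;   // same default as TransactionScope itself
    private TransactionScopeOption _scopeOption = TransactionScopeOption.Required;
    private TimeSpan _timeout = TransactionManager.DefaultTimeout;          // honour the configured default timeout

    public TransactionScopeBuilder WithIsolationLevel(IsolationLevel level)
    {
        _isolationLevel = level;
        return this;
    }

    public TransactionScopeBuilder JoinAmbientTransaction()
    {
        _scopeOption = TransactionScopeOption.Required;
        return this;
    }

    public TransactionScopeBuilder AlwaysStartNewTransaction()
    {
        _scopeOption = TransactionScopeOption.RequiresNew;
        return this;
    }

    public TransactionScopeBuilder WithTimeout(TimeSpan timeout)
    {
        _timeout = timeout;
        return this;
    }

    public TransactionScope Build()
    {
        // Carry both the isolation level and the timeout across, so a custom
        // timeout configured elsewhere is not silently lost.
        var options = new TransactionOptions
        {
            IsolationLevel = _isolationLevel,
            Timeout = _timeout
        };
        return new TransactionScope(_scopeOption, options);
    }
}

// Example usage:
// using (var scope = new TransactionScopeBuilder()
//     .WithIsolationLevel(IsolationLevel.ReadCommitted)
//     .AlwaysStartNewTransaction()
//     .WithTimeout(TimeSpan.FromMinutes(5))
//     .Build())
// {
//     // ... data access work ...
//     scope.Complete();
// }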

Thursday, 30 September 2010

Understanding MSIL Memory Leaks

We are using a custom RuntimeDataBuilder class that dynamically constructs a .NET type in memory using Reflection.Emit to create simple property getters/setters for the information that we receive from a generic WCF Data Service. The service exposes a "DataSet like" structure consisting of 1..m DataTableDefinition classes, each containing 1..m DataTableColumnDefinition classes. When the information is received client side, we generate the dynamic type with its property setters/getters to improve the performance and facilitate binding on our Silverlight client. All of this works fine.
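For context, the following is a rough, simplified sketch of the kind of type generation involved; the names are hypothetical and it targets the full framework API, so it is not our actual RuntimeDataBuilder:

using System;
using System.Collections.Generic;
using System.Reflection;
using System.Reflection.Emit;

// Hypothetical sketch: emit a type with one public property per column definition.
public static class DynamicTypeFactory
{
    public static Type CreateRowType(string typeName, IDictionary<string, Type> columns)
    {
        var assemblyBuilder = AppDomain.CurrentDomain.DefineDynamicAssembly(
            new AssemblyName("DynamicRowTypes"), AssemblyBuilderAccess.Run);
        var moduleBuilder = assemblyBuilder.DefineDynamicModule("MainModule");
        var typeBuilder = moduleBuilder.DefineType(typeName, TypeAttributes.Public);

        foreach (var column in columns)
        {
            // Backing field plus a simple get/set property pair per column.
            var field = typeBuilder.DefineField("_" + column.Key, column.Value, FieldAttributes.Private);
            var property = typeBuilder.DefineProperty(column.Key, PropertyAttributes.None, column.Value, null);

            var getter = typeBuilder.DefineMethod("get_" + column.Key,
                MethodAttributes.Public | MethodAttributes.SpecialName | MethodAttributes.HideBySig,
                column.Value, Type.EmptyTypes);
            var getIl = getter.GetILGenerator();
            getIl.Emit(OpCodes.Ldarg_0);
            getIl.Emit(OpCodes.Ldfld, field);
            getIl.Emit(OpCodes.Ret);

            var setter = typeBuilder.DefineMethod("set_" + column.Key,
                MethodAttributes.Public | MethodAttributes.SpecialName | MethodAttributes.HideBySig,
                null, new[] { column.Value });
            var setIl = setter.GetILGenerator();
            setIl.Emit(OpCodes.Ldarg_0);
            setIl.Emit(OpCodes.Ldarg_1);
            setIl.Emit(OpCodes.Stfld, field);
            setIl.Emit(OpCodes.Ret);

            property.SetGetMethod(getter);
            property.SetSetMethod(setter);
        }

        return typeBuilder.CreateType();
    }
}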

I’m currently trying to figure out how the memory management for these custom dynamic types works. I’m thinking that we might be causing a possible memory leak if we regenerate the type. The reason I am contemplating this is that the user can change the query parameters, which may then result in more or less information coming across the wire. It therefore invalidates the previous type that I created, and I want to make sure that I’m able to free up the memory used by this previous type definition, as this can happen quite frequently. From this article on MSDN we gather that if you are using Lightweight Code Generation (LCG) the code is allocated on the managed heap, which will be reclaimed by the GC when there is nothing holding a reference to it. But LCG only seems to apply to dynamic methods. My concern is for the Type with all its property getters/setters that is now not required anymore. If this is allocated on the unmanaged heap, our only hope for reclaiming the memory seems to be to make sure that the type is loaded into a temporary AppDomain that we can unload when it is not required anymore. This opens up a whole new can of worms for inter-AppDomain communication, which I would rather avoid.

So my question is whether any of my readers can shed some more light on this topic. Is my assumption correct, or is there another way of reclaiming the memory? I also posted the question to StackOverflow if you are interested in checking out the responses over there. Thx :-)

Wednesday, 01 September 2010

Ultimate .NET Development Tools 2010 Edition

Here is my updated 2010 list of development tools that I prefer to use when doing .NET development.  I specifically do not include any third party control/report libraries.

Categories

  • IDE = Develop/generate/refactor code within the VS IDE or separate IDE 
  • SCM = Software Configuration Management (Source Control etc.)
  • TDD = Test Driven Development Tools
  • DBMS = Database Management Systems
  • CI = Continuous Integration
  • FR = Frameworks (Persistence, AOP, Inversion of Control, Logging etc.)
  • UT = Utility Tools
  • CA = Code Analysis (Static + Dynamic)
  • TC = Team Collaboration (Bug tracking, Project management etc.)
  • MD = Modelling
  • QA = Testing Tools
  • DP = Deployment (Installations etc.)

 

Tools

* = free/open source
  1. [IDE] Visual Studio 2010 Premium Edition
  2. [IDE] ReSharper for refactoring, unit test runner and so much more
  3. [IDE] CodeSmith for generating code.  Also consider T4 with Clarius’s Visual T4 Editor.  
  4. [IDE]* GhostDoc for inserting xml code comments
  5. [IDE] Altova Xml Suite for any xml related work.  XmlPad is the best, free alternative I know of.
  6. [DBMS] SQL Server 2008 as DBMS
  7. [SCM]* Subversion for source control
  8. [SCM]* TortoiseSVN as windows shell extension for Subversion
  9. [SCM] VisualSVN for integration of TortoiseSVN into VS.  AnkhSVN is the best, free alternative I know of.
  10. [SCM]* KDiff3 for merging
  11. [TDD]* NUnit as preferred xUnit testing framework
  12. [TDD]* moq as mock framework.
  13. [TDD] NCover for code coverage stats
  14. [CI]* TeamCity as build server
  15. [CI]* MSBuild Extension Pack for additional MSBuild tasks.
  16. [FR]* log4net as logging framework.  Also see Log4View for an excellent UI for the log files.
  17. [FR]* ANTLR and ANTLRWorks for creating custom DSL’s.
  18. [FR] PostSharp as Aspect Oriented Programming framework
  19. [FR]* Ninject as IoC container
  20. [FR] MindScape LightSpeed as my Object-Relational-Mapper.  NHibernate is the best free alternative I’m aware of. 
  21. [UT]* Reflector to drill down to the guts of any code library (also check-out the nice plug-ins)
  22. [UT] Silverlight Spy to dissect any Silverlight application.
  23. [UT] RegexBuddy for managing those difficult regular expressions.  Regulator is the best, free alternative I know of. 
  24. [UT]* LINQPad as an easy way to query SQL databases using LINQ and as a general scratchpad application to test C#/VB.NET code snippets.
  25. [UT]* Fiddler to debug all your HTTP traffic in IE.   Also see the neXpert plugin for monitoring performance problems.
  26. [UT]* Firebug to assist with testing web applications running in Firefox. Also see YSlow add-on for performance testing and Web Developer add-on for additional Firefox web development tools.
  27. [CA]* FxCop to enforce .NET coding guidelines
  28. [CA] NDepend to get all the static code metrics I'd ever want
  29. [CA] ANTS Profiler for performance and memory profiling
  30. [MD] Enterprise Architect to do UML Modelling and Model Driven Design if required. Alternatively use Visio with these simple templates
  31. [MD]* FreeMind as mind mapping tool
  32. [TC]* ScrewTurn Wiki for team collaboration
  33. [QA]* Eviware soapUI for functional and load testing of SOA web services
  34. [QA]* Telerik WebAii Testing Framework for automated regression testing of Web 2.0 apps
  35. [DP]* Windows Installer XML (WiX) for creating Windows Installers

Tuesday, 31 August 2010

Threading in C# – Free E-book Updated

This is just a quick post to highlight the fact that Joseph Albahari, author of the excellent LINQPad and the C# 4.0 in a Nutshell book, has recently updated his free Threading in C# e-book. The e-book contains some really great, concise content on all the latest threading-related topics in .NET. The latest version includes coverage of new .NET 4.0 constructs like ThreadLocal<T> and Lazy<T> as well as a whole chapter dedicated to the new Parallel Programming extensions to the .NET framework. Highly recommended!
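As a quick taste of two of those constructs, here is a tiny illustrative snippet of my own (not taken from the e-book):

using System;
using System.Threading;

class ThreadingTeasers
{
    // Lazy<T> defers construction until the value is first accessed.
    static readonly Lazy<byte[]> Buffer = new Lazy<byte[]>(() => new byte[1024 * 1024]);

    // ThreadLocal<T> gives each thread its own lazily created instance.
    static readonly ThreadLocal<Random> Rng =
        new ThreadLocal<Random>(() => new Random(Thread.CurrentThread.ManagedThreadId));

    static void Main()
    {
        Console.WriteLine(Buffer.IsValueCreated);  // False - nothing allocated yet
        Console.WriteLine(Buffer.Value.Length);    // first access allocates the buffer
        Console.WriteLine(Rng.Value.Next(100));    // per-thread Random, safe to use concurrently
    }
}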

Sunday, 29 August 2010

Log4View – Getting the most out of your log4net/log4j log files

If you are using log4net or log4j for writing log files, do yourself a favour and get the whole team and production support a copy of Log4View as soon as possible. It really is a wonderful tool that sits on top of your log4net or log4j files to give you a bird's-eye view of what’s happening in your application. Need to monitor your servers remotely as they are running? No problem, just add a reference to their TCP log appender, which allows you to configure log4net to log all statements to a TCP port. This gives you the flexibility of remotely monitoring the server as the application is running in production (provided you open the port on the firewall, of course).
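As a rough illustration of the kind of wiring involved, here is a minimal programmatic sketch using the UdpAppender that ships with log4net together with the log4j XML layout (which Log4View can receive); the Log4View TCP appender is configured along similar lines, typically via the log4net XML configuration. The address and port are example values only:

using System.Net;
using log4net.Appender;
using log4net.Config;
using log4net.Layout;

// Minimal sketch: stream log events to a remote viewer such as Log4View.
public static class RemoteLoggingSetup
{
    public static void Configure()
    {
        var appender = new UdpAppender
        {
            RemoteAddress = IPAddress.Parse("192.168.0.10"), // machine running the log viewer (example)
            RemotePort = 8080,                               // port the viewer listens on (example)
            Layout = new XmlLayoutSchemaLog4j()              // log4j XML format understood by Log4View
        };
        appender.ActivateOptions();
        BasicConfigurator.Configure(appender);
    }
}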

One of the most powerful features IMO is the grouping feature in the message view.  Look at the following screenshot that shows how easy it is to group messages according to session id/thread id or any other custom log information added to your log files.  Simply drag the columns you want to group by into the Group By area above the grid:

[Screenshot: Log4View message view with messages grouped by session id]

This specific grouping allows us to easily inspect the separate requests associated with a certain user in the system. Highly recommended!

RunSharp – IL Generation for Dummies

We’ve been doing some interesting work over the past 2-3 weeks on creating a Rule Engine for our company’s flagship product, On Key. The Rule Engine needs to evaluate rules entered by the end user at run-time, according to our pre-defined grammar, to determine a true/false answer. This gives us tremendous flexibility in that end users are able to specify under which conditions certain actions should occur within the system. The main requirements for the technical solution were that it should be as fast as possible while also catering for rules being changed at run-time by the end users.

Before heading off to create our own custom solution, we did some research into existing solutions to the same kind of problem. Some of the solutions we came across were:

  1. Flee
  2. NCalc
  3. Irony
  4. Dynamic method generation using Expression Trees 

After looking at all of these solutions, we decided rather to create our own, to give us the ultimate control and flexibility over extending the Rule Engine going forward. Setting up the grammar using the excellent ANTLR and ANTLRWorks was quite easy to do. ANTLR takes care of generating the C# code that does the lexing, parsing and type checking of the rules entered by the user. Thereafter we moved on to the run-time evaluation/compilation of the rules represented in the Abstract Syntax Tree created by ANTLR. For this purpose, we created 3 tree walkers/visitors on our AST to compare against each other:

  1. Interpreter – Evaluate the rules dynamically as we walk the AST
  2. C# Code Generator – Walk the tree and generate C# code that is compiled using the C# compiler into an assembly
  3. IL Code Generator – Walk the tree and generate dynamic IL code in memory (a minimal example of this approach is sketched below)
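To give an idea of what option 3 boils down to underneath RunSharp’s friendlier API, here is a minimal, hypothetical example (not our actual AST walker) that compiles the trivial rule "value > 10" into a delegate with a DynamicMethod:

using System;
using System.Reflection.Emit;

public static class SimpleRuleCompiler
{
    // Compile the rule "value > 10" into a Func<int, bool> using raw IL emission.
    public static Func<int, bool> CompileGreaterThanTen()
    {
        var method = new DynamicMethod("GreaterThanTen", typeof(bool), new[] { typeof(int) });
        var il = method.GetILGenerator();

        il.Emit(OpCodes.Ldarg_0);     // push the incoming value
        il.Emit(OpCodes.Ldc_I4, 10);  // push the constant 10
        il.Emit(OpCodes.Cgt);         // compare: value > 10
        il.Emit(OpCodes.Ret);         // return the boolean result

        return (Func<int, bool>)method.CreateDelegate(typeof(Func<int, bool>));
    }
}

// Usage:
// var rule = SimpleRuleCompiler.CompileGreaterThanTen();
// bool result = rule(42);   // true

In the real walkers the IL emitted is of course driven by the ANTLR AST nodes being visited, but the basic mechanics are the same.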

Tuesday, 01 June 2010

C# Coding Standards Using ReSharper

I found the following great blog post by Tim Lloyd on how to set up coding standards for your development environment using ReSharper 5.0. We’ve done pretty much the same thing in our environment, although I wasn’t aware of the ReSharper Settings Manager, which I will test-drive soon, as I have found sharing settings between team members to be a real pain. It seems like JetBrains will look at this for v6 though. Things we are doing in addition to the content mentioned by Tim are:

  1. Using Agent Smith to ensure the correctness of the spelling in our resource files etc.
  2. Using a more advanced layout template to order all our code into the same structure when doing Code Cleanup

It works great if everybody in the team follows the agreed-upon procedure. I find that this makes code reviews focus on the right things – the actual business logic!

Monday, 19 April 2010

Code Metrics

NOTE: This is a repost of an old post as I am moving over to the Blogger platform

I've been wanting to do a post about code metrics for quite a while now - mostly to organize my thoughts on the topic, as it is something that I want to introduce at work, but also to get some feedback from other people as to whether and how they are using metrics to assist them in crafting quality code. After reading Jeremy Miller's post on the topic, I thought I might as well take the plunge. I'll start by musing over which metrics I have found useful, then continue by looking at tool support for generating these metrics, and finish off by considering when to use them.

Wednesday, 13 January 2010

Vacancy: Intermediate C# Software Engineer; Cape Town - South Africa

We’ve been struggling to get some good CVs from recruitment agencies for an intermediate to senior level C# Software Engineer. I’m hoping that I might find an interested reader who fits our requirements. The position is for a one-year contract starting as soon as possible.

So if you are in the Cape Town area and looking for a great opportunity to work in an agile environment (Scrum) using the latest technologies (Silverlight 3.0, .NET 3.5, SQL Server 2008) with a strong focus on quality (TDD), send me your CV. You can find more details on the position, as well as our company and contact details, in the job specification here. Please include an indication of your availability as well as your current monthly remuneration.

Please note:

  • If you hear nothing from me, assume that we are not interested, i.e. we’ll contact you.
  • Please no recruitment agencies.