I had an interesting chat with a team member today about which XML parsing API from the .NET Framework to use for a particular situation. He mentioned that using the XPathNavigator with an XmlReader would not load the document into memory. However, after some quick research I found this to be incorrect. XPathNavigator differs from XmlDocument in that it is optimized for XSLT processing and the XPath data model, but it is similar in that the entire document is read into memory. XPathNavigator appears to use a cursor-based model for traversing the document while XmlDocument uses a tree-based model.
The rule of thumb I have come up with is this: read-only, forward-only XML parsing APIs tend to read from a stream and do not store the entire document in memory. Random-access XML parsing APIs, which typically use a tree-based or cursor-based model, traverse an in-memory representation of the document, which means the entire document is stored in memory.
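To make the distinction concrete, here is a minimal C# sketch (the file name, element names, and XPath expression are made up for illustration). The XmlReader loop pulls one node at a time from the stream, while XPathDocument reads the whole document into memory before its XPathNavigator cursor can move over it.

```csharp
using System;
using System.Xml;
using System.Xml.XPath;

class XmlApiComparison
{
    static void Main()
    {
        // Forward-only, read-only: nodes are pulled from the stream one
        // at a time, so the full document never sits in memory.
        XmlTextReader reader = new XmlTextReader("orders.xml");
        while (reader.Read())
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "Order")
                Console.WriteLine(reader.GetAttribute("id"));
        }
        reader.Close();

        // Random access: XPathDocument reads the entire document into
        // memory, and XPathNavigator is a cursor over that in-memory store.
        XPathDocument document = new XPathDocument("orders.xml");
        XPathNavigator navigator = document.CreateNavigator();
        XPathNodeIterator orders = navigator.Select("/Orders/Order/@id");
        while (orders.MoveNext())
            Console.WriteLine(orders.Current.Value);
    }
}
```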
I have been using NUnit since its early days, but it has only been in the last six months that I have made the move to test driven development. TDD is the practice of writing production code based on unit test cases that are written first. And using a framework such as NUnit enables the developer to write automated unit tests that exercise the features of an application at a functional level, thus encouraging test driven development.
TDD is a paradigm shift for many developers, as it was for me. Sure, I have been unit testing my code for years. I would employ techniques such as console applications, shell programs, and other types of test harnesses to test the functions of my code. But until recently I never started the development of a module by writing unit tests first.
Test driven development is an up-front investment in the quality of your code. That investment is in the form of unit tests written before the production code. Many developers today write the production code first (after the design phase of course), followed by commenting the code, and then by testing it. And in many cases the later work of commenting and testing never happens because bug fixes become the developer’s top priority. But by investing in the quality of the code at the start, developers will find that bugs are reduced, the classes and functions of the code are better designed, and the overall quality of the work is greatly improved. Best of all, refactoring code becomes less of a risk with TDD because regression testing at the unit level is possible.
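To make the test-first idea concrete, here is a small NUnit sketch (the class names and tax rate are hypothetical). The test is written before InvoiceCalculator exists and drives its design; the production code below it is just enough to make the test pass.

```csharp
using NUnit.Framework;

[TestFixture]
public class InvoiceCalculatorTests
{
    [Test]
    public void TotalIncludesSalesTax()
    {
        // Written first: this test pins down the constructor and the
        // Total method before any production code exists.
        InvoiceCalculator calculator = new InvoiceCalculator(0.07m);
        Assert.AreEqual(107.00m, calculator.Total(100.00m));
    }
}

// The simplest production code that makes the test pass.
public class InvoiceCalculator
{
    private decimal taxRate;

    public InvoiceCalculator(decimal taxRate)
    {
        this.taxRate = taxRate;
    }

    public decimal Total(decimal amount)
    {
        return amount + (amount * taxRate);
    }
}
```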
Here are some tips I would like to share that will hopefully help you make the move to test driven development:
<ul>
<li>Just do it. Force yourself to write your unit tests first. Over time the quality of your unit tests will improve, writing unit tests will be easier, and it will become almost second nature for you to start with a unit test.</li>
<li>When testing data changes to a database, start a transaction in the SetUp of the unit test and roll back the transaction in the TearDown (see the sketch after this list). This provides an easy way to clean up the data after each unit test.</li>
<li>Writing a unit test prior to writing the production code will give you insight into the design of your class or function. Use this insight to improve on the design prior to writing production code.</li>
<li>Use composite unit testing to test implementors of interfaces.</li>
<li>Write a unit test for each defect reported. This provides a great way to ensure a problem never reoccurs. And for those who already have existing applications but no unit tests, this approach will give you an introduction to writing unit tests.</li>
</ul>
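As a rough sketch of the transaction tip above (the connection string, table name, and SQL are placeholders), a fixture might look like this: each test runs inside a transaction that the TearDown rolls back, so no test data survives.

```csharp
using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class CustomerDataTests
{
    private SqlConnection connection;
    private SqlTransaction transaction;

    [SetUp]
    public void SetUp()
    {
        // Open a connection and start a transaction before every test.
        connection = new SqlConnection("server=(local);database=TestDb;Integrated Security=SSPI");
        connection.Open();
        transaction = connection.BeginTransaction();
    }

    [TearDown]
    public void TearDown()
    {
        // Roll back whatever the test changed, then close the connection.
        transaction.Rollback();
        connection.Close();
    }

    [Test]
    public void InsertCustomerAddsOneRow()
    {
        SqlCommand command = new SqlCommand(
            "INSERT INTO Customers (Name) VALUES ('Test Customer')",
            connection, transaction);
        Assert.AreEqual(1, command.ExecuteNonQuery());
    }
}
```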
I just read an article about a new service called DidTheyReadIt. In a nutshell, this service places a small Web bug in HTML formatted e-mails that acts as a tracking device. When the e-mail recipient reads the e-mail, a script on the DidTheyReadIt site logs the action. It will even log how long it took to read the e-mail message. Based on the logged information, the e-mail sender can know whether you read the e-mail or not. Fortunately this service only works with e-mail readers that support HTML formatted e-mails, which is something I have never liked. Unfortunately most people view e-mail messages as HTML.
HTML e-mails look nice, but too much can happen behind the scenes that the message reader does not know about, such as tracking when the e-mail message was read. I use Eudora as my preferred e-mail reader. It allows me to turn off viewing e-mail as HTML. I can still see the e-mail and read it, but it is not rendered as HTML. This means the e-mail message might not look as nice, but it also means there is no unknown activity happening behind the scenes when I read the e-mail. And I still have the option to view the message as HTML if I like.
I forgot to mention that pictures from Laura’s graduation have been posted in the gallery on thecave.com. Enjoy.
Melanie and I have just returned from our first of three double wedding weekends for this year. Congratulations to Brooke and Josh on their marriage Saturday, and congratulations to Alex and Meg on their marriage on Sunday.
Just came across this dandy project called RAIL - Runtime Assembly Instrumentation Library. RAIL provides “an API that allows CLR assemblies to be manipulated and instrumented before they are loaded or executed.” Imagine, if you will, loading an assembly, inspecting it, altering the IL, and executing the mutated assembly or even saving it. This has some interesting applications for aspect-oriented programming.
I have become a firm supporter of writing unit tests for most if not all code I write, and NUnit is an ideal framework for implementing unit tests. But one problem I have had repeatedly is finding a good way to test specific implementations of an interface. My approach has always been to write a new test fixture for each implementor of an interface. But the problem with this approach is that I end up duplicating unit tests for each implementor, and if I need to write a new test I have to copy it to multiple fixtures.
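One way to cut down on that duplication, sketched here with a made-up IIntStack interface and two toy implementations, is to move the shared tests into an abstract base fixture and have each implementor's fixture supply only a factory method. NUnit picks up the inherited tests in each concrete fixture, so a new test added to the base class runs against every implementor.

```csharp
using System.Collections;
using NUnit.Framework;

// Hypothetical interface and implementations used only for illustration.
public interface IIntStack
{
    void Push(int value);
    int Pop();
}

public class ArrayListStack : IIntStack
{
    private ArrayList items = new ArrayList();
    public void Push(int value) { items.Add(value); }
    public int Pop()
    {
        int value = (int)items[items.Count - 1];
        items.RemoveAt(items.Count - 1);
        return value;
    }
}

public class StackWrapperStack : IIntStack
{
    private Stack items = new Stack();
    public void Push(int value) { items.Push(value); }
    public int Pop() { return (int)items.Pop(); }
}

// Shared tests live in the abstract fixture; each implementor's fixture
// only supplies the factory method.
public abstract class IntStackTests
{
    protected abstract IIntStack CreateStack();

    [Test]
    public void PopReturnsLastPushedValue()
    {
        IIntStack stack = CreateStack();
        stack.Push(42);
        Assert.AreEqual(42, stack.Pop());
    }
}

[TestFixture]
public class ArrayListStackTests : IntStackTests
{
    protected override IIntStack CreateStack() { return new ArrayListStack(); }
}

[TestFixture]
public class StackWrapperStackTests : IntStackTests
{
    protected override IIntStack CreateStack() { return new StackWrapperStack(); }
}
```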
This week at TechEd, Microsoft announced Visual Studio Team System. It includes a number of features to bring the development team deeper into the application life cycle, such as unit testing, profiling, and code coverage. It sounds exciting, but a couple of things I have read bother me.
Class diagramming will not use UML. Instead Microsoft has created a new notation which it says is needed to support two-way application design. The other point that worries me is that Visual Studio Team System integrates tightly with SQL Server 2005. My assumption here is that, as a team tool, the database it uses most likely has to be a centralized database accessible by all team members. This is just an assumption at the moment, but as a developer who uses a laptop and who works disconnected from a network from time to time, such as when flying, I wonder how this will change the development experience, or worse, how it will limit it. I guess I will have to wait until I get my hands on the bits to see what will happen.
Here’s a fun game to play while walking the streets of New York, hipster bingo.
I’m currently working on a customer project with 16+ VS.NET projects in a single VS.NET solution file. The single solution file is handy in that all the source code that makes up the complete customer solution is available in one place. The solution includes multiple C# projects, unit test projects, database projects, and one Reporting Services project. Also, our build environment relies on NAnt scripts, so each project that produces an assembly also has its own .build script.
A single solution approach is nice because I have access to everything that makes up the solution, from build scripts to C# source code to stored procedures. But finding the right project or file is becoming more challenging each day as more is added to the solution. This is where Enterprise Template Projects come to the rescue.
An Enterprise Template Project is a project type available in Visual Studio .NET Enterprise Edition or greater, and it can contain any type of file, including other project types. I started out by creating a set of enterprise template projects that represent the groupings I want. For example, I created template projects named Applications, BusinessServices, Databases, Frameworks, NAntScripts, and UnitTests. Within each of these projects I added references to the existing project files appropriate to that grouping. UnitTests contains references to the C# projects responsible for producing our NUnit unit test assemblies. BusinessServices contains references to projects that make up the middle tier of the solution. Frameworks contains references to framework projects, and so on.
NAntScripts is interesting in that it does not contain references to other projects but instead contains references to the .build script files found in other projects. I like this because many times when I am working on the build scripts I need to modify more than one script. Before making this grouping I was jumping from project to project in search of the right .build script file. Now I have references to all the script files in one place, making it easier to find a particular script.
With this new grouping I am able to find the projects and files I need to work on more efficiently. The grouping has proven itself to be a huge time saver for me, especially when working with a large single solution file.
One additional note on using enterprise template projects to organize your single solution: to prevent scattering .etp files throughout the project directories containing the source code, I recommend creating a single directory to store all of the .etp files. References within an enterprise template project do not have to fall under the template project; the files referenced within the .etp can be stored anywhere on the hard drive. This means the .etp file does not have to dirty up the project directory where the source code actually resides.