Last year I wrote a blog post about developing Tridion component templates using .NET. Since then I have been developing component templates for an implementation that uses dynamic component publishing. One thing about templating that needs improving is automated testing. Many Tridion templating classes are not testable, which unnecessarily restricts the options for automated testing. I decided to write two blog posts outlining the options that you do have for implementing automated testing when writing templates.
This first post will be an introduction covering the testing strategy and the different approaches for automated testing. I will wrap up with my personal preferences on which approach to use for testing templates.
The next post will cover technical example implementations for automated testing of Tridion component templates. There is also a question on Stack Overflow about this topic, so be sure to check it out.
The testing strategy: why is it useful?
Most people have strong associations with the word strategy. So before we go any further, let's settle on what strategy means within the context of this series of blog posts:
“The determination of the basic long-term goals and objectives, and the adoption of courses of action and the allocation of resources necessary for carrying out these goals”
Creating a testing strategy is about making fact-based decisions when setting the goals and objectives for testing. Testing is one of the most important tools for managing risk when developing software. Risks have to be identified, and choices have to be made on how to manage these risks, thus requiring a strategy. The strategy should be revisited for every project, since the risks are different for every Tridion implementation. Consider the following implementations:
- A Tridion implementation that uses dynamic component publishing. The component templates are usually simple data transformations that publish XML to the broker. The templates don’t have complex business logic, since that is usually handled by another web application that consumes content from the broker database. The component templates usually consist of one or maybe two building blocks. There is no heavy interaction between the template building blocks.
- A Tridion implementation that uses pages. The page and component templates are likely to be more complex. The page and component templates consist of more building blocks. This also means there is likely more interaction between template building blocks.
Both Tridion implementations are quite different and require different testing strategies in order to manage the risks that the implementation has to deal with. I will now move to the core subject, automated testing. There are two base approaches for automated testing: unit-testing and integration testing.
The testing strategy: unit-testing a template
The first approach is unit-testing: testing small pieces of code within a template building block independently. This is done by properly isolating the code you want to test and avoiding interactions with the untestable Tridion classes inside the TOM (Tridion object model).
Advantages: The tests all cover a small portion of code, making it easy to pinpoint a mistake. The tests don’t have to rely on any external data in order to pass.
Limitations: This approach to testing is vulnerable to the standard unit-testing pitfall. The risk is that all the individual pieces work correctly, but somewhere in the interaction it all goes horribly wrong.
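To make the unit-testing approach concrete, here is a minimal sketch in C#. The SummaryFormatter class and its Truncate method are hypothetical stand-ins for logic you would extract from a template building block; because the method takes plain strings rather than TOM objects, it can be tested with an ordinary NUnit test and no Tridion dependencies.

```csharp
using NUnit.Framework;

// Hypothetical helper: pure logic extracted from a template building block,
// kept free of TOM classes so it can be tested in isolation.
public static class SummaryFormatter
{
    public static string Truncate(string text, int maxLength)
    {
        if (string.IsNullOrEmpty(text) || text.Length <= maxLength)
            return text;
        // Cut at maxLength, trim any dangling whitespace, and add an ellipsis.
        return text.Substring(0, maxLength).TrimEnd() + "...";
    }
}

[TestFixture]
public class SummaryFormatterTests
{
    [Test]
    public void Truncate_ShortensLongText()
    {
        Assert.AreEqual("Lorem ip...", SummaryFormatter.Truncate("Lorem ipsum dolor", 8));
    }

    [Test]
    public void Truncate_LeavesShortTextAlone()
    {
        Assert.AreEqual("short", SummaryFormatter.Truncate("short", 8));
    }
}
```

The design choice here is the point: the more template logic you can pull out into plain, TOM-free methods like this, the more of your building block becomes unit-testable.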
The testing strategy: integration-testing a template
The second approach is the integration testing strategy where you test a bigger, functional unit of code. This functional unit can consist of many smaller units. A functional unit can be the template building block from start to finish.
Advantages: The tests each cover a larger portion of code in order to test a functional unit. This means the interaction between different pieces of code is also tested, increasing the chance of noticing a mistake in the early stages of development.
Limitations: Since the tests cover a large portion of code, it is harder to pinpoint the cause of a mistake and thus increases the time required to fix it. The test setup is generally more complex because it requires a dataset, so maintaining the test setups will increase the amount of time required to maintain test code.
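As a rough sketch of what an integration test for a data-transformation template could look like: the ComponentXmlBuilder class, its Transform method, and the sample input below are all hypothetical stand-ins for your own building block code, not real TOM APIs. The idea is to run the whole transformation against a known input and assert against the XML it produces.

```csharp
using System.Xml.Linq;
using NUnit.Framework;

[TestFixture]
public class ComponentXmlBuilderIntegrationTests
{
    [Test]
    public void PublishedXml_ContainsTitleField()
    {
        // A known test input, standing in for the component content the
        // building block would normally receive from the package.
        string inputXml = "<Content><title>My article</title></Content>";

        // Run the functional unit from start to finish (hypothetical method).
        string output = ComponentXmlBuilder.Transform(inputXml);

        // Assert against the output rather than against internal state.
        XDocument doc = XDocument.Parse(output);
        Assert.AreEqual("My article", doc.Root.Element("title").Value);
    }
}
```

Note that all the assertions target the end result, which is what makes this an integration test: any mistake in any of the smaller units involved in the transformation will surface in the output.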
The two approaches mentioned above can be used exclusively, but they can also be combined, negating each other’s limitations and creating a more robust, all-round testing strategy. Personally, I feel that an integration-testing approach is the more pragmatic and logical approach for data transformations. Making assertions against the output is much more efficient to code, and also easier for other developers to understand.
Implementing an integration test combined with unit-tests at appropriate places makes it easier to pinpoint a mistake within a functional unit. This is only recommended in more complex cases.
Time to wrap up the introduction post. In my next post I will be going over some technical implementations for automated testing of Tridion component templates.