Automated testing for Tridion templates: Followup

Earlier this week I released two blog posts on automated testing for Tridion templates. The idea to blog about this subject came from this Stack Overflow thread, where several members of the Tridion community were discussing the topic. The technical solutions I offered in my last blog post each have their specific limitations. A post on the aforementioned thread by Nuno Linhares triggered my interest: he mentioned a web service that the Template Builder uses to debug templates.

After some research it turns out this web service is a real solution for achieving automated testing for templates. In essence, you feed the web service some basic data, the template is executed and the package with its content is returned to you. The whole thing was easy to set up, and I created a demo solution to share with the community. All you have to do is add a reference to the service and you're good to go; check the config file for the location of the service definition on the Tridion content management machine.
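To give an idea of what a test against this service can look like, here is a minimal sketch based on my demo solution. The client type, operation name and return type below are assumptions standing in for whatever the generated service reference exposes; check the generated proxy and the config file for the actual names, and replace the tcm URIs with items from your own environment.

using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace Tridion.Templates.IntegrationTests
{
    [TestClass]
    public class TemplateDebugServiceTests
    {
        [TestMethod]
        public void ComponentTemplate_RendersTitle()
        {
            //Hypothetical client type and operation: the real names come from the
            //service reference generated against the content manager web service.
            var client = new TemplateDebugServiceClient();

            //Feed the service the template and the item to render it with;
            //the tcm URIs are placeholders.
            string output = client.DebugTemplate("tcm:106-12345-32", "tcm:106-273325");

            //The template is executed server-side and the package content is
            //returned, so assertions can be made directly against the output.
            StringAssert.Contains(output, "<title>Test</title>");
        }
    }
}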

Implementing automated testing this way has the following advantages and limitations:

Advantages: This solution is quite easy to set up and can be used right away, since it does not require any changes to the existing templates. And since we use a web service provided by Tridion, we can be sure the output is reliable.

Limitations: This solution only enables automated testing for complete template building blocks or templates. Testing individual functions within a template building block is not possible with this solution alone.

Wrapping up

This approach is a very pragmatic way to test Tridion templates. Normally I would be put off by the fact that individual functions cannot be tested, but since templates are data transformations, making assertions on the output is pretty much all you need.

Using the same web service as the Template Builder turned out to be a very sweet deal. I wonder why Tridion doesn't document and offer this solution as a way to automate testing for templates. I would be happy to write a more detailed article plus a demo project for the Tridion World platform if I were asked ;)

Automated testing for Tridion templates: Technical

This is the second post on automated testing for Tridion component templates. The previous post was an overview about the usefulness of a testing strategy and the different approaches to automated testing in general.

This post will focus on actually implementing integration testing for Tridion templates. I will cover integration testing since this approach is well suited for testing template building blocks, which are basically data transformations.

Before we dive into the technical bit, let's take a moment to discuss creating the test input for integration tests. Integration tests require data, which has to be coded by hand or pulled from somewhere. Both are valid choices, but it is a choice between stability and maintainability:

  • When favoring stability, the input has to be local and coded by hand in order to avoid a dependency on any external system. When using hand-coded input, changing the tests to accommodate new requirements takes longer, since the input will likely have to change as well.
  • When favoring the ability to change tests quickly, the input can be pulled from an external component. Since the input isn't coded by hand, it takes less time to change the automated tests. The downside is that you create a dependency on the external system; downtime of that system causes tests to fail.

Now that we have the options for test input covered, let's dive into the technical bits. I have two possible implementations for integration testing lined up:

Integration testing using TOM classes with Microsoft Fakes

The first implementation creates the integration test input using a local test setup with the normal TOM (Tridion Object Model) classes. Since these classes are not developed to be testable, I had to use the Microsoft Fakes framework to make this work. This framework enables the testing of classes that are not designed to be testable. Using Fakes I was able to create the item fields for a component. See the example code below:

using System;
using System.Collections.Generic;
using System.Reflection;
using System.Xml;

using Microsoft.QualityTools.Testing.Fakes;
using Microsoft.VisualStudio.TestTools.UnitTesting;

using Tridion.ContentManager;
using Tridion.ContentManager.ContentManagement;
using Tridion.ContentManager.ContentManagement.Fields;
using Tridion.ContentManager.ContentManagement.Fields.Fakes;
using Tridion.ContentManager.Fakes;

namespace Tridion.Deloitte.Libraries.Tests.ContentArticle
{
    [TestClass]
    //Use the default visual studio debugger: http://blog.degree.no/2012/09/visual-studio-2012-fakes-shimnotsupportedexception-when-debugging-tests/
    public class ItemFieldsTests
    {
        [TestMethod]
        public void ItemFieldTest()
        {
            using (ShimsContext.Create())
            {
                //Session constructor is bypassed completely to prevent interaction with Tridion. 
                ShimSession.Constructor = x => { };
                Session session = (Session)Activator.CreateInstance(typeof(Session), new object[] { });

                //Finalize is suppressed to prevent interaction with Tridion. 
                GC.SuppressFinalize(session);

                //IdentifiableObject constructor is bypassed completely to prevent interaction with Tridion. 
                ShimIdentifiableObject.ConstructorTcmUriSession = (x, uri, sess) => { };
                Schema schema = (Schema)Activator.CreateInstance(typeof(Schema), new object[] { TcmUri.UriNull, session });

                //Create item fields without any interaction with Tridion using Fakes.
                ItemFields fields = GetItemFields(schema, session);

                //Assert the fields were actually created.
                Assert.AreEqual(1, fields.Count);
            }
        }

        private ItemFields GetItemFields(Schema schema, Session session)
        {
            List<ItemField> fields = new List<ItemField> { GetItemField(session) };

            ShimItemFields.ConstructorSchema = (x, y) =>
            {
                FieldInfo fieldsField = typeof(ItemFields).GetField("_fields", BindingFlags.NonPublic | BindingFlags.Instance);
                if (fieldsField != null)
                {
                    fieldsField.SetValue(x, fields);
                }
            };

            ItemFields itemFields = (ItemFields)Activator.CreateInstance(typeof(ItemFields), new object[] { schema });

            var shimFields = new ShimItemFields(itemFields);
            shimFields.ItemGetString = x => GetItemField(session);

            return itemFields;
        }

        private ItemField GetItemField(Session session)
        {
            ShimItemField.ConstructorXmlElementSession = (x, y, z) => { };

            const BindingFlags bindingFlags = BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.CreateInstance;
            XmlDocument emptyDoc = new XmlDocument();
            var arguments = new object[] { emptyDoc.DocumentElement, session };

            Type type = typeof(SingleLineTextField);
            var itemField = (ItemField)Activator.CreateInstance(type, bindingFlags, null, arguments, null);

            var shimField = new ShimItemField(itemField);
            shimField.NameGet = () => "Title";

            var field = itemField as SingleLineTextField;
            field.Value = "This is a title field";

            return itemField;
        }
    }
}

The basic implementation above is pretty decent, but you have to code the item fields by hand instead of just using a local XML file with component XML generated by Tridion. As discussed earlier, coding the input by hand affects the time it takes to change the tests.

To overcome this, I extended the code above to generate item fields from component and schema XML files generated by Tridion. Again, I used Microsoft Fakes to change the behavior of the TOM classes in order to make this scenario testable.
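A stripped-down sketch of that direction, building on the shims from the test class above; the file path and field handling are illustrative only and omit most of the Fakes configuration that made the real thing so unwieldy:

private ItemField GetItemFieldFromXmlFile(Session session)
{
    //Component XML previously saved from Tridion to a local file (path is illustrative).
    var componentDoc = new XmlDocument();
    componentDoc.Load("TestData/ContentArticle.Component.xml");

    //Grab the field element to convert; the element name matches the schema field name.
    var titleElement = (XmlElement)componentDoc.DocumentElement.GetElementsByTagName("Title")[0];

    //As before, the ItemField constructor is shimmed so no Tridion interaction takes place.
    ShimItemField.ConstructorXmlElementSession = (x, y, z) => { };

    const BindingFlags bindingFlags = BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.CreateInstance;
    var itemField = (ItemField)Activator.CreateInstance(
        typeof(SingleLineTextField), bindingFlags, null, new object[] { titleElement, session }, null);

    var shimField = new ShimItemField(itemField);
    shimField.NameGet = () => titleElement.LocalName;

    //The field value now comes from the XML file instead of being coded by hand.
    ((SingleLineTextField)itemField).Value = titleElement.InnerText;

    return itemField;
}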

The end result worked fine, but it's not something I would recommend other developers to try. Microsoft Fakes is a very powerful framework, but relying on it too heavily has drawbacks in terms of readability. A great deal of Fakes configuration was required to make the scenario testable, making the code complex, hard to read and hard to understand. On top of that, coding the tests and configuring Fakes properly required extensive interaction with the inner workings of Tridion.

In conclusion, taking this route for automated testing has the following advantages and limitations:

Advantages: This implementation makes the TOM classes testable, so the template building blocks themselves do not have to be changed in order to start creating automated tests.

Limitations: This implementation requires the more expensive editions of Visual Studio and additional tooling for decompiling and debugging the Tridion libraries. Coding the test setup requires a lot of interaction with the inner workings of Tridion, and the large amount of Microsoft Fakes configuration makes the code complex, hard to read, and hard to understand.

Test input using Tridion Core Service classes

The second approach creates the integration test input using a local test setup with the Tridion core service classes. This might not seem like the most obvious approach at first, but it does offer certain advantages. The core service classes are perfect for creating test input since they are simple data classes that can be created without any additional effort. Converting the data into fields is also very easy, since there is a library by Frank van Puffelen that handles that nicely. See the example code below:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Tridion.ContentManager;
using Tridion.ContentManager.CoreService.Client;

namespace Tridion.Deloitte.Libraries.Tests
{
    [TestClass]
    public class ComponentFieldsTests
    {
        [TestMethod]
        public void TestComponentFieldCreation()
        {
            const string fieldName = "Title";
            const string expectedTitle = "Test";

            ComponentData componentData = new ComponentData { Id = TcmUri.UriNull };
            componentData.Content = string.Format(
                "<Content xmlns=\"Tridion.Deloitte.Libraries.Tests\"><Title>{0}</Title></Content>", expectedTitle);

            ItemFieldDefinitionData definition = new SingleLineTextFieldDefinitionData();
            definition.Name = fieldName;

            SchemaFieldsData schemaFieldsData = new SchemaFieldsData();
            schemaFieldsData.NamespaceUri = "Tridion.Deloitte.Libraries.Tests";
            schemaFieldsData.RootElementName = "Content";
            schemaFieldsData.Fields = new [] { definition };

            ComponentFields contentFields = ComponentFields.ForContentOf(schemaFieldsData, componentData);

            Assert.AreEqual(expectedTitle, contentFields[fieldName].Value);
        }
    }
}

The code above is really simple, but you have to code the fields by hand instead of just using content from Tridion. However, this approach offers the freedom to either code the input yourself or pull it from the core service, since it's core service classes you're using in the first place. See the example code below:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Tridion.ContentManager;
using Tridion.ContentManager.CoreService.Client;

namespace Tridion.Deloitte.Libraries.Tests
{
    [TestClass]
    public class ComponentFieldsTests
    {
        [TestMethod]
        public void TestComponentFieldCreationWithCoreService()
        {
            const string fieldName = "Title";
            const string expectedTitle = "Test";

            ISessionAwareCoreService client = TridionCoreServiceProvider.InitClient();
            var schemaFieldsData = client.ReadSchemaFields("tcm:106-269739-8", true, new ReadOptions());
            var componentData = (ComponentData)client.Read("tcm:106-273325", new ReadOptions());

            ComponentFields contentFields = ComponentFields.ForContentOf(schemaFieldsData, componentData);
            Assert.AreEqual(expectedTitle, contentFields[fieldName].Value);
        }
    }
}

The code above is also really simple, and it saves you the time of coding the input yourself. This reduces the time required to change the tests to accommodate new requirements. There is a risk in creating a dependency on an external system, since downtime of that system causes tests to fail, but because this external system is actually another Tridion component the risk is limited.

From a testing standpoint, using the core service classes is a much better approach. However, it might not be favorable for existing implementations with their set of existing template building blocks, which would have to be changed to accommodate automated testing this way. I would not recommend changing a large set of existing template building blocks; the costs will most likely be higher than the benefits that automated testing will bring.

For new implementations, using the core service classes is a valid choice if automated testing is a requirement. The Tridion Object Model classes map easily to the core service classes, since they are basically the same. In the code package I will include classes that map fields from a component to testable fields; this download should offer a decent start.
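As an impression of what such a mapping can look like (a sketch, not the actual contents of the code package), the component's content and metadata XML can simply be copied onto the core service data classes:

using Tridion.ContentManager.ContentManagement;
using Tridion.ContentManager.CoreService.Client;

namespace Tridion.Deloitte.Libraries.Mapping
{
    //Sketch of a mapper from a TOM component to the core service data classes,
    //so template code can work against the testable classes shown above.
    public static class ComponentDataMapper
    {
        public static ComponentData ToComponentData(Component component)
        {
            return new ComponentData
            {
                Id = component.Id.ToString(),
                Title = component.Title,
                Content = component.Content != null ? component.Content.OuterXml : null,
                Metadata = component.Metadata != null ? component.Metadata.OuterXml : null
            };
        }
    }
}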

In conclusion, taking this route for automated testing has the following advantages and limitations:

Advantages: The test setup is simple and doesn't require any trickery. Data can easily be pulled from the core service, which reduces the time required to change the tests to accommodate new requirements. Overall this is the best approach from a testing standpoint.

Limitations: In order to use this approach existing templates have to be changed, which is a serious limitation for existing implementations with their set of existing template building blocks.

Wrapping up

Tridion should definitely be looking at automated testing for future versions of their product. The Tridion Object Model classes are a nightmare to work with from a testing perspective. I touched on several possible solutions, but they all have their limitations; there is a lot of room for improvement.

Tridion caters to enterprise clients, and those clients expect a certain quality from their developers and consultants. We all have to work together in order to deliver that quality. A documented and supported approach to automated testing would greatly benefit the overall quality.

As I noted in my first post, the technical solutions I described are not the only ways to achieve automated testing. The Stack Overflow thread I linked at the beginning of my first post mentions several others, but not in much detail. If there is a good approach I didn't cover, please say so in the comments; I will definitely look into it.

Automated testing for Tridion templates: Introduction

Last year I wrote a blog post about developing Tridion component templates using .Net. I have been developing component templates for an implementation that uses dynamic component publishing, and the one thing about templating that needs improving is automated testing. Many Tridion templating classes are not testable, which unnecessarily restricts the options for automated testing. I decided to write two blog posts outlining the options you do have for implementing automated testing when writing templates.

This first post will be an introduction covering the testing strategy and the different approaches for automated testing. I will wrap up with my personal preferences on which approach to use for testing templates.

The next post will cover technical example implementations for automated testing of Tridion component templates. There is also a thread on Stack Overflow about this topic; be sure to check it out.

The testing strategy: why is it useful?

Most people have strong associations with the word strategy, so before we go any further, let's settle on what strategy means within the context of this series of blog posts:

“The determination of the basic long-term goals and objectives, and the adoption of courses of action and the allocation of resources necessary for carrying out these goals”

Creating a testing strategy is about making fact-based decisions when setting the goals and objectives for testing. Testing is one of the important tools for managing risk when developing software: risks have to be identified and choices have to be made on how to manage them, which requires a strategy. The strategy should be revisited for every project, since the risks are different for every Tridion implementation. Consider the following implementations:

  • A Tridion implementation that uses dynamic component publishing. The component templates are usually simple data transformations that publish XML to the broker. The templates don't contain complex business logic, since that is usually handled by another web application that consumes content from the broker database. The component templates usually consist of one or maybe two building blocks, and there is no heavy interaction between the template building blocks.
  • A Tridion implementation that uses pages. The page and component templates are likely to be more complex and to consist of more building blocks, which also means there is likely more interaction between template building blocks.

Both Tridion implementations are quite different and require different testing strategies to manage the risks each has to deal with. I will now move to the core subject: automated testing. There are two base approaches to automated testing: unit testing and integration testing.

The testing strategy: unit-testing a template

The first approach is unit testing: testing small pieces of code within a template building block independently. This is done by properly isolating the code you want to test and avoiding interactions with the untestable Tridion classes inside the TOM (Tridion Object Model).
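As a quick illustration (the helper below is made up, not taken from a real implementation), a small piece of logic pulled out of a template building block can be tested without touching a single Tridion class:

using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace Templating.Tests
{
    //A made-up helper of the kind that can be extracted from a template building block
    //so it no longer depends on the TOM classes.
    public static class SlugHelper
    {
        public static string ToSlug(string title)
        {
            return title.Trim().ToLowerInvariant().Replace(" ", "-");
        }
    }

    [TestClass]
    public class SlugHelperTests
    {
        [TestMethod]
        public void ToSlug_LowersCaseAndReplacesSpaces()
        {
            Assert.AreEqual("my-article-title", SlugHelper.ToSlug(" My Article Title "));
        }
    }
}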

Advantages: The tests all cover a small portion of code, making it easy to pinpoint a mistake. The tests don’t have to rely on any external data in order to pass.

Limitations: This approach is vulnerable to the standard unit-testing pitfall: the risk that all the individual pieces work correctly, but somewhere in the interaction it all goes horribly wrong.

The testing strategy: integration-testing a template

The second approach is integration testing, where you test a bigger, functional unit of code. This functional unit can consist of many smaller units; for templating, a functional unit can be the template building block from start to finish.

Advantages: The tests cover a larger portion of code in order to test a functional unit. This means the interaction between different pieces of code is also tested, increasing the chance of noticing a mistake in the early stages of development.

Limitations: Since the tests cover a large portion of code, it is harder to pinpoint the cause of a mistake, which increases the time required to fix it. The test setup is generally more complex because it requires a dataset, so maintaining the test setups will increase the amount of time required to maintain test code.

Wrapping up

The two approaches mentioned above can be used exclusively, but they can also be combined, negating each other's limitations and creating a more well-rounded, robust testing strategy. Personally I feel that an integration testing approach is the more pragmatic and logical approach for data transformations: making assertions against the output is much more efficient to code, and also easier for other developers to understand.

Combining an integration test with unit tests at appropriate places makes it easier to pinpoint a mistake within a functional unit. This is only recommended in more complex cases.

Time to wrap up the introduction post. In my next post I will go over some technical implementations for automated testing of Tridion component templates.

Developing Tridion 2011 .Net component templates

Recently I had the opportunity to develop component templates for a Tridion 2011 implementation. In this article I want to share my thoughts on the new templating features, provide some quick tips, and wrap up with some templating limitations that should be addressed in future versions of Tridion.

My previous Tridion component template development experience was all based on VBScript and the templating features of Tridion 5.3. Since I have plenty of experience with .Net development, I was confident the switch to Tridion 2011 template development would be a smooth one. It's worth mentioning that I only developed component templates, since the implementation used a dynamic publishing model (only component presentation XML is published to the database). The overall experience with the new templating features was very positive; the biggest improvements I would like to point out are:

  • Better productivity, thanks to the ability to work in Visual Studio, a professional IDE which offers syntax highlighting, code completion and other nifty tools.
  • Faster bug fixing because of remote debugging and the logging functionality offered by the Tridion libraries.

The overall improvement of the template development experience has been quite substantial for me. Being able to use a full-fledged IDE and having decent debugging options makes a world of difference to my productivity as a developer.

Tridion .Net templating quick tips

When getting started with .Net template development, there are some things you should look at, as they will have a positive effect on productivity and possibly save some frustration:

  • If you have the possibility, install Visual Studio on the same machine as the Tridion Content Manager server. This allows for easier debugging of your templates, without the risk of network or domain issues leaving you unable to use remote debugging. If you cannot get remote debugging working on the customer network and you cannot install Visual Studio on the Tridion Content Manager, consider setting up a separate development environment where you can, because debugging will boost productivity greatly. There is documentation on how to get remote debugging working on SDL Live Content.
  • Download the AssemblyUploader2 plugin for Visual Studio. This plugin allows for fast uploading of DLL files into Tridion and will save you a lot of manual, repetitive upload tasks.
  • Making the switch from VBScript to .Net, I needed to get familiar with the new TOM.Net API. Two things make this process easier and faster: the most important is to obtain the Tridion 2011 SP1 TOM.Net API documentation provided by Tridion; in addition, use the Visual Studio object browser to quickly navigate the Tridion libraries and see what functionality they expose.

Tridion .Net templating limitations

As stated in the introduction, there are some limitations to Tridion .Net template development which I encountered during the project:

  • The most noticeable for me is the limited support for automated testing. Tridion classes used for templating are not testable, making it harder to isolate code for testing. During training at Tridion, the trainer did mention that Tridion will look at this for future versions.
  • A limitation worth mentioning is that all templating code has to be uploaded to Tridion in a single DLL file, so all the code has to be maintained within one Visual Studio project. I found this out the hard way: template execution failed when I referenced a second project containing utility/support classes, even though both DLL files were uploaded into Tridion correctly. This limitation can be frustrating when you want to reuse code from utility/support classes that reside in a different project. Fortunately there is a workaround: Visual Studio offers the choice to add classes from a different project to the current project as a link. This way you can use the code from other projects within your templating project without duplicating the code.

Wrapping up

Tridion still supports VBScript templates, and the reason is simple: upgrading Tridion should not break any existing templates and implementations. Even though VBScript is not officially deprecated, I strongly recommend making the switch to .Net for your next template development project. The switch has to be made sometime; better to make it soon and start seeing the benefits of the new .Net template development features for yourself.

Telligent Community 6.0 custom tagging solution

In this article I will briefly describe a custom component I developed for integrating two web platforms at a client: the website and their community (running Telligent Community 6.0). The website uses a tagging system which matches news articles and seminars to content pages; the website itself is developed in .Net Web Forms, and the tagging is managed via the Tridion Content Management System (5.3). The tagging system will be expanded in the near future by pulling content from the community and matching it to content pages on the website using the same tagging system.

The community is developed using Telligent Community 6.0, which gave me the opportunity to work with this new product. The Telligent platform has evolved greatly compared to its predecessor, Community Server, especially when it comes to customization and integration. The easiest way to integrate Telligent Community 6.0 is through its REST API, which is quite elaborate, as the documentation will show.

Unfortunately, while the Telligent API is pretty elaborate, there are no endpoints available for pulling content by tag. I hope these will be available in the future, since the data structure for storing tags within Telligent is pretty straightforward.

When developing the integration component we went for a simple integration with maximum performance. Since the website can communicate directly with the Telligent Community 6.0 database, we chose not to develop a web service but to integrate the code into the website directly. We used standard Linq2Sql with a repository pattern to hold the objects, and developed some business logic to sort the results by relevancy. Nothing especially fancy from a technical perspective, but I always find it fun to integrate platforms.
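As a rough sketch of that business logic (the class and property names are hypothetical and do not reflect the actual Telligent schema or our production code), the relevancy sort simply orders posts by the number of tags they share with the content page:

using System.Collections.Generic;
using System.Linq;

namespace Website.Community
{
    //Hypothetical shape; the real entities come from the Linq2Sql data context
    //mapped onto the Telligent Community database.
    public class CommunityPost
    {
        public string Title { get; set; }
        public List<string> Tags { get; set; }
    }

    public static class RelevancySorter
    {
        //Order posts by the number of tags they share with the content page,
        //which serves as a simple relevancy score.
        public static IEnumerable<CommunityPost> ByTagOverlap(IEnumerable<CommunityPost> posts, ICollection<string> pageTags)
        {
            return posts
                .Where(post => post.Tags.Any(pageTags.Contains))
                .OrderByDescending(post => post.Tags.Count(pageTags.Contains));
        }
    }
}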

Since I will be developing a full community in Telligent Community 6.0 at a later period, there may be more articles regarding Telligent custom widget development on the way.

The Web Forms vs MVC debate

Currently Microsoft ships two frameworks for developing web applications on .Net: Asp.Net Web Forms and Asp.Net MVC. Microsoft is developing and supporting both frameworks alongside each other, emphasizing that developers have to pick the right tool for the task at hand. Despite the position Microsoft takes on the subject, there has been debate within the .Net community as to which framework is the better one for developing web applications. In this article I discuss my views on the subject, which I consider an important one and thus a good first post for an Asp.Net blogger.

Why use frameworks?

Before we get into comparing the Web Forms framework to the MVC framework, I want to address the benefits of using application development frameworks in the first place. First and foremost, a framework helps me be an efficient developer by taking care of the general overhead associated with developing web applications, letting me focus on the client-specific challenges. Secondly, a framework can help me deliver high-quality applications by supporting and encouraging the use of design patterns and best practices.

At this point in time approximately 80% of all Asp.Net implementations use Web Forms. This is no surprise, since Web Forms has been around considerably longer and has been quite successful in the past. But the MVC framework seems to be on the rise, and having worked with both frameworks, I can see why it is gaining momentum: it does very well in a lot of the areas where the Web Forms framework has been having trouble.

The Asp.Net Web Forms framework

The Asp.Net Web Forms framework has been around for quite some time, but it's still maintained and even extended by Microsoft, the last release being part of the .Net Framework 4. Despite these efforts, as it stands now, using the Web Forms framework does not help me be an efficient developer. Although this might sound a bit harsh, it should be said: the purpose the Web Forms framework was created for simply doesn't serve the current needs of enterprise web application developers.

The original reason for creating the Web Forms framework was to offer desktop developers a smooth transition to web development. This design goal brought technical difficulties, because desktop applications are stateful while the web uses the inherently stateless HTTP protocol. To overcome these challenges, Web Forms employs an event-driven postback/viewstate development model to shield developers from the stateless web. However, this development model is fragile and is known to cause problems as complexity scales.

Aside from the technical issues, the Web Forms framework suffers from a lack of control, mainly over the way it generates output. The Web Forms framework relies on controls to build applications, and these controls generate output over which the developer has only very limited control. This makes front-end and client-side development more difficult, and poses challenges when working on accessibility and HTML standards. While new options for controlling the output were added in the .Net Framework 4 release, it's still limited.

The second point where the Web Forms framework falls short is in helping me deliver quality applications. In all honesty it has to be noted that delivering quality is still mostly the responsibility of the developer, not the framework. However, a framework can inherently support best practices and encourage developers to use them. There is no such support when working with the Web Forms framework: important best practices like separation of concerns, loose coupling and testability have to be implemented by the developers themselves. This shortcoming was remedied somewhat by the MVP (Model View Presenter) design pattern, but that effort has nothing to do with the framework and is not enforced by it in any way.
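For readers unfamiliar with the pattern, a bare-bones illustration of MVP in a Web Forms context (the names are made up); the view interface is what allows the presentation logic to be unit-tested without spinning up a page:

//Bare-bones MVP illustration; the names are made up.
public interface IProductView
{
    string ProductName { set; }
}

public class ProductPresenter
{
    private readonly IProductView _view;

    public ProductPresenter(IProductView view)
    {
        _view = view;
    }

    public void Display(string productName)
    {
        //Presentation logic lives in the presenter instead of the code-behind,
        //so it can be tested against a fake IProductView.
        _view.ProductName = productName.Trim();
    }
}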

The Asp.Net MVC framework

The Asp.Net MVC framework was developed by Microsoft as an alternative to Asp.Net Web Forms for building web applications. The framework utilizes the MVC (Model View Controller) design pattern and is built on important object-oriented principles and best practices like separation of concerns, loose coupling and testability. I'm not going to explain these best practices in this article, but there's a wealth of documentation on what they are and how to harness their power. The MVC framework also respects web standards, so the inherently stateless nature of the HTTP protocol is not bypassed.

The first point where the MVC framework is really successful: helping me be an efficient developer. The reason the MVC framework succeeds where the Web Forms framework fails is that it serves the current needs of enterprise web application developers. It takes care of the common overhead without compromising on flexibility. This is where it beats the Web Forms framework: the MVC framework gives developers total control over everything that is going on.

The second point where the MVC framework is successful: helping me deliver quality applications. The framework inherently works with the MVC design pattern, giving separation of concerns out of the box. Separation of concerns makes it easier to manage complexity and to make changes to isolated parts of the application. Loose coupling is incorporated into the MVC framework itself, enabling developers to swap even the core components of the framework with custom replacements to change its behavior without affecting other components. This allows for amazing flexibility and makes working with the best practice of loose coupling easy. Testability is also relatively easy with MVC; this best practice is emphasized by Microsoft, since unit testing has been getting a lot of attention lately.
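To make the testability point concrete, here is a minimal sketch (the repository interface, fake and controller are made up for illustration) of unit-testing a controller action without a web server:

using System.Web.Mvc;
using Microsoft.VisualStudio.TestTools.UnitTesting;

//Minimal sketch; the repository interface and fake are made up for illustration.
public interface IProductRepository
{
    string GetName(int id);
}

public class ProductController : Controller
{
    private readonly IProductRepository _repository;

    public ProductController(IProductRepository repository)
    {
        _repository = repository;
    }

    public ActionResult Details(int id)
    {
        string name = _repository.GetName(id);
        if (name == null)
        {
            return HttpNotFound();
        }
        return Content(name);
    }
}

[TestClass]
public class ProductControllerTests
{
    private class FakeRepository : IProductRepository
    {
        public string GetName(int id) { return id == 1 ? "Widget" : null; }
    }

    [TestMethod]
    public void Details_ReturnsNotFound_ForUnknownId()
    {
        var controller = new ProductController(new FakeRepository());

        ActionResult result = controller.Details(42);

        Assert.IsInstanceOfType(result, typeof(HttpNotFoundResult));
    }
}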

Wrapping up

Responsibility for the overall quality of the application still lies with the developers. They have to use best practices and make sound decisions when developing, whatever framework they use. Working with MVC won't guarantee success; it just makes it easier to be successful. If you have had success with the Web Forms framework and still do, you can continue using it. However, I would advise everyone doing Asp.Net development to at least fiddle around with the MVC framework a little; it might just be the thing for you.