Telligent Community 6.0 custom caching

In this article I will briefly describe a custom caching approach I developed for custom Telligent Community 6.0 widgets. Understanding caching and anticipating cache-related issues is important to keep the load time of widgets and pages in check.

Widgets within Telligent are cached for five seconds out-of-the-box. This strikes a balance between the performance gained by not re-evaluating the widget code on every request and the risk of the widget content becoming stale. While the five-second cache is fine for most widgets, some widgets require additional caching to reduce load. Examples are widgets that perform many expensive database queries or connect to external APIs.

To illustrate how many expensive database queries can cause high load times for widgets, I will describe a widget I recently developed for a client. This custom widget provides an overview of the amount of content within the community, giving the user an indication of how active the groups are. The community has ten groups, with a total of fifty subgroups. These groups contain roughly five hundred threads and twenty-five hundred replies. Querying all this content causes high load on the database server and increases the load time of the widget. We remedied this problem by enhancing the caching. While this is not the only possible improvement, this article will only discuss caching.

There are two approaches known to me to increase caching within Telligent. The first approach is changing the standard Telligent caching through the caching.config. For more information see the documentation. I would like to point out that changes to the standard Telligent caching affect everything that is put into the cache. If you require only a portion of the widgets to be cached for a longer period of time, changing the standard Telligent caching will probably yield undesirable results in other parts of the community.

The second approach requires customization through an extension, but it gives you more fine-grained control over caching than changing the standard caching does. If you want to cache a portion of the widgets for ten minutes and leave the rest on the standard five-second caching, this approach is the way to go. If this is your first time writing an extension for Telligent Community 6.0, this article by Adam Seabridge is a good read. Basically, we want to write our own extension that handles the caching for us and lets us determine how long an object remains in the cache. The extension looks like this:

using System;

using Telligent.Evolution.Extensibility.Caching.Version1;

namespace KvKVelocityScriptExtensions
{

    public class CachingVelocityScriptExtensions : Telligent.Evolution.Extensibility.UI.Version1.IScriptedContentFragmentExtension
    {

        public string Name
        {
            get { return "KvK Cache Extension Plugin"; }
        }

        public string ExtensionName
        {
            get { return "KvKCachingExtension"; }
        }

        public string Description
        {
            get { return "Caching Extension Class For Use In Custom Widgets"; }
        }

        public void Initialize()
        {
            // No initialization is needed for this extension.
        }

        public object Extension
        {
            // Pass back an instance of the CachingService, which exposes the caching methods to the widgets.
            get { return new CachingService(); }
        }
    }

    public class CachingService
    {
        #region GroupStatistics

        public const string KvKGroupListWidgetCacheKeyPrefix = "KVKGROUPLISTWIDGET{0}";

        public GroupStatistics GetGroupStatistics(int groupId)
        {
            string cacheKey = string.Format(KvKGroupListWidgetCacheKeyPrefix, groupId);

            // Returns null when no statistics for this group are (or are no longer) in the cache.
            return (GroupStatistics)CacheService.Get(cacheKey, CacheScope.All);
        }

        public void PutGroupStatistics(int groupId, int minutesToCache, int forumThreadCount, int forumThreadReplyCount, DateTime lastForumThreadReply)
        {
            string cacheKey = string.Format(KvKGroupListWidgetCacheKeyPrefix, groupId);
            GroupStatistics groupStatistics = new GroupStatistics
            {
                ForumThreadCount = forumThreadCount, 
                ForumThreadReplyCount = forumThreadReplyCount, 
                LastForumThreadReply = lastForumThreadReply
            };

            CacheService.Put(cacheKey, groupStatistics, CacheScope.All, null, TimeSpan.FromMinutes(minutesToCache));
        }

        #endregion
    }

    public class GroupStatistics
    {
        public int ForumThreadCount { get; set; }
        public int ForumThreadReplyCount { get; set; }
        public DateTime LastForumThreadReply { get; set; }
    }

}

As the code above shows, we make a simple extension with a few simple functions that interact with the Telligent caching. As you cannot cast objects within Velocity script, the functions within your caching extension have to return concrete instances of objects, in this case an instance of the GroupStatistics class. You have to write a small amount of additional code when you want to cache an additional type of object. Now that we have our simple extension added to Telligent Community 6.0, we can start using it from Velocity script, where it is exposed under its ExtensionName, like this:

#foreach($group in $groups)
	#set($groupStatistics = $KvKCachingExtension.GetGroupStatistics($group.Id))
	#if ($groupStatistics)
		## Use cached results.
	#else
		## Execute normal widget code.
		## Insert results into cache.
		$KvKCachingExtension.PutGroupStatistics($group.Id, 15, $threadCount, $replyCount, $lastReactionDate)
	#end
#end
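As noted above, caching an additional type of object only takes another strongly typed Get/Put pair in the CachingService. Here is a minimal sketch of the pattern, assuming a hypothetical ActivityStatistics class; the class, its properties and the cache key are purely illustrative:

public class ActivityStatistics
{
    public int ActiveMemberCount { get; set; }
    public DateTime LastActivity { get; set; }
}

public const string KvKActivityWidgetCacheKeyPrefix = "KVKACTIVITYWIDGET{0}";

public ActivityStatistics GetActivityStatistics(int groupId)
{
    string cacheKey = string.Format(KvKActivityWidgetCacheKeyPrefix, groupId);
    return (ActivityStatistics)CacheService.Get(cacheKey, CacheScope.All);
}

public void PutActivityStatistics(int groupId, int minutesToCache, int activeMemberCount, DateTime lastActivity)
{
    string cacheKey = string.Format(KvKActivityWidgetCacheKeyPrefix, groupId);

    // Velocity script cannot construct .NET objects, so Put takes primitives and builds the object here.
    ActivityStatistics activityStatistics = new ActivityStatistics
    {
        ActiveMemberCount = activeMemberCount,
        LastActivity = lastActivity
    };

    CacheService.Put(cacheKey, activityStatistics, CacheScope.All, null, TimeSpan.FromMinutes(minutesToCache));
}

The methods and the constant go inside the CachingService class, so they are exposed to Velocity alongside the GroupStatistics methods.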

While this solution takes some additional effort, it adds great value over the standard Telligent caching in that it gives developers the power to cache widgets individually, according to their needs. While caching is not the answer to every performance issue, adding additional caching in the right places can reduce load on the servers and help keep page loads in check. If you have any questions feel free to leave a comment or contact me.

Asp.Net MvC 3 from scratch: Models

This is the third article in the series where we create a book review application from scratch using Asp.Net MvC 3. In this article we're going to start developing the data layer for our application. Since this goal is fairly large, I will be chopping it up into three articles; this article will get our data layer up and running. The upcoming two articles will cover unit-testing and proper integration with the rest of the application through dependency injection.

The data layer

The main concern of the data layer is to manage and persist the data of the application. We will build our data layer using the Entity Framework 4.0, an object-relational mapping tool built by Microsoft. The newest version of the framework offers a lot of neat new features, the most prominent being the “code first” approach. This feature allows us to build our data layer by hand coding the data classes (the models) first. The database is generated later using the models as the blueprint. The benefits of this approach: the models are clean, not generated by any tool and not directly coupled to the Entity Framework. To utilize this approach we have to install an extension for the Entity Framework called “EFCodeFirst”, which you can install through the NuGet package manager.
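If you prefer the Package Manager Console over the NuGet GUI, the install is a one-liner (assuming the package is still published under this id):

Install-Package EFCodeFirst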

The basics

Before we get started: the data layer can be created inside a separate project in our solution. This makes it easier to physically recognize the existence of a separate layer, and dividing the application into multiple projects helps with structuring as the application grows larger. All right then, let's get started with coding our models. I will start with the basic models we need at this point, the book and the review:

public class Book
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Author { get; set; }
    public string ISBN { get; set; }
}

public class Review
{
    public int Id { get; set; }
    public Book Book { get; set; }
    public string Content { get; set; }
    public DateTime Created { get; set; }
}

Now that we have our basic models in place, we have to make the Entity Framework aware of their existence. The code first way to do this is to define a context that derives from DbContext. The models themselves live inside the context as DbSets:

public class EntityContext : DbContext
{
    public DbSet<Review> Reviews { get; set; }
    public DbSet<Book> Books { get; set; }
}

As mentioned before, the database is generated using the models as the blueprint. Personally, I like to start with a standard set of dummy data every time I run my application. The database is recreated by the Entity Framework on every run, after which the dummy data is inserted. We have to code the dummy data ourselves, so the example provided here is pretty basic:

public class EntityBaseData : DropCreateDatabaseAlways<EntityContext>
{
    protected override void Seed(EntityContext context)
    {
        // The initializer saves the changes after Seed has run.
        context.Books.Add(new Book { Author = "Oskar uit de Bos", ISBN = "1234567", Name = "Book1" });
    }
}

Notice that our class inherits from the class DropCreateDatabaseAlways, which tells the Entity Framework to drop and recreate a fresh database on every run. Register the initializer in the global.asax like this:

protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();

    RegisterGlobalFilters(GlobalFilters.Filters);
    RegisterRoutes(RouteTable.Routes);

    DbDatabase.SetInitializer(new EntityBaseData());
}

The last step is setting up the connection to the database. By default the Entity Framework looks for a connection string with a name that matches our custom DbContext name, in our case EntityContext. Let's give the Entity Framework what it wants and add the following connection string to the web.config file:

<connectionStrings>
    <add name="EntityContext"
         connectionString="data source=.\SQLEXPRESS;Database=BookReviews5;Integrated Security=SSPI;"
         providerName="System.Data.SqlClient" />
</connectionStrings>

The next step

While we have our basic setup covered, there is little to no structure at this point, so let's apply some. Most of you should be familiar with the concept of repositories: they encapsulate the mechanisms for storing, retrieving and querying data, hiding them from the rest of the application. Repositories act as gateways and guardians for our data, preventing data access logic from being scattered all over our data layer. Once we have some basic structure in place, we have testability to think about. While there are many different approaches to testing a data layer, I will take a strict unit-testing approach, so no integration testing with the database.

The focus of the unit-testing lies with the data access logic inside the repositories. To enable testing without the database, we will abstract the link to the database (our custom DbContext) away from the repositories into a separate component. Let's start with this separate component, since we will need it when we start working on our repositories.

I named this component EntityProvider, since its main function is to provide the application’s data (entities) to our repositories. The basic idea is that every version of the EntityProvider exposes data through several IDbSet interfaces. The first implementation, our main implementation, works with the actual database. The implementation looks like this:

public interface IEntityProvider
{
    IDbSet<Review> Reviews { get; }
    IDbSet<Book> Books { get; }

    void PersistChanges();
}

public class SqlEntityProvider : IEntityProvider
{
    private readonly DbContext _context;

    public SqlEntityProvider()
    {
        _context = new EntityContext();
    }

    public IDbSet<Review> Reviews
    {
        get { return _context.Set<Review>(); }
    }

    public IDbSet<Book> Books
    {
        get { return _context.Set<Book>(); }
    }

    public void PersistChanges()
    {
        _context.SaveChanges();
    }
}
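Because the repositories will depend only on the IEntityProvider interface, a unit test can later swap in a provider that never touches the database. As a small preview of the unit-testing article, here is a rough sketch of such a fake provider; the InMemoryDbSet<T> helper is my own illustration, a minimal IDbSet<T> backed by a plain list, not part of the Entity Framework:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Data.Entity;
using System.Linq;
using System.Linq.Expressions;

// A minimal IDbSet<T> over an in-memory list, just enough for repository tests.
public class InMemoryDbSet<T> : IDbSet<T> where T : class
{
    private readonly List<T> _items = new List<T>();
    private readonly IQueryable<T> _query;

    public InMemoryDbSet()
    {
        // The queryable wraps the live list, so queries see later additions.
        _query = _items.AsQueryable();
    }

    public T Add(T entity) { _items.Add(entity); return entity; }
    public T Remove(T entity) { _items.Remove(entity); return entity; }
    public T Attach(T entity) { _items.Add(entity); return entity; }
    public T Create() { return Activator.CreateInstance<T>(); }
    public TDerived Create<TDerived>() where TDerived : class, T { return Activator.CreateInstance<TDerived>(); }

    // Find by key is not needed for these tests.
    public T Find(params object[] keyValues) { throw new NotImplementedException(); }

    public ObservableCollection<T> Local { get { return new ObservableCollection<T>(_items); } }

    public IEnumerator<T> GetEnumerator() { return _items.GetEnumerator(); }
    IEnumerator IEnumerable.GetEnumerator() { return _items.GetEnumerator(); }

    public Type ElementType { get { return _query.ElementType; } }
    public Expression Expression { get { return _query.Expression; } }
    public IQueryProvider Provider { get { return _query.Provider; } }
}

// A fake provider that keeps everything in memory; PersistChanges is a no-op.
public class FakeEntityProvider : IEntityProvider
{
    private readonly IDbSet<Review> _reviews = new InMemoryDbSet<Review>();
    private readonly IDbSet<Book> _books = new InMemoryDbSet<Book>();

    public IDbSet<Review> Reviews { get { return _reviews; } }
    public IDbSet<Book> Books { get { return _books; } }

    public void PersistChanges() { }
}

With this fake in place, a test can construct a repository around a FakeEntityProvider, add a few books and assert on the results without ever opening a database connection.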

Now on to the first repository. As mentioned before, repositories act as gateways and guardians for our data. Since there is a basic set of functionality that every repository should have, I have created a simple interface. The first implementation, our book repository, will only implement the basic functionality defined by the interface. Our first repository looks like this:

public interface IRepository<T> where T : class
{
    IEnumerable<T> All();

    void Change(int id, T entity);
    void Add(T entity);
    void Remove(T entity);
}

public class BookRepository : IRepository<Book>
{
    private IEntityProvider Provider { get; set; }

    public BookRepository(IEntityProvider provider)
    {
        Provider = provider;
    }

    public IEnumerable<Book> All()
    {
        return Provider.Books.AsEnumerable();
    }

    public void Change(int id, Book entity)
    {
        Book item = Provider.Books.FirstOrDefault(e => e.Id == id);
        if (item == null)
        {
            // Nothing to change when the book does not exist.
            return;
        }

        item.Name = entity.Name;
        item.ISBN = entity.ISBN;
        item.Author = entity.Author;

        Provider.PersistChanges();
    }

    public void Add(Book entity)
    {
        Provider.Books.Add(entity);
        Provider.PersistChanges();
    }

    public void Remove(Book entity)
    {
        Provider.Books.Remove(entity);
        Provider.PersistChanges();
    }
}

Wrapping up

We got a lot of work done; time to wrap up this first article on the models. We have created basic models and wired them up to the Entity Framework using the code first approach. For more in-depth information on the code first approach, check out this resource. We also created our first testable repository, which contains basic functionality. But how about some results for all our hard work? A quick test is adding the following code to the index action of the home controller:

public ActionResult Index()
{
    SqlEntityProvider provider = new SqlEntityProvider();
    BookRepository repository = new BookRepository(provider);
    var result = repository.All();

    ViewBag.Message = result.FirstOrDefault().Author;

    return View();
}

Note that this is not the proper way of integrating the repositories with the controllers; it is simply a quick and dirty way to see results. The proper, loosely coupled way to integrate these components is through dependency injection, which will be covered in a later article. I expect the next article to be up within a week or so; until then, happy coding. Feel free to ask questions, feedback is much appreciated. Full source for all that we have done so far is available here.

Asp.Net MvC 3 from scratch: Routing

This is the second article in the series where we create a book review application from scratch using Asp.Net MvC 3. In the first article I did a general introduction and created the project. In this article we are going to discuss the importance of automated testing, and we are finally going to write some code: tests for our routes.

The routing mechanism is vital to the mapping of requests within our application. This makes subjecting routes to automated testing a valid investment of time. Automated testing of routes requires the help of a mocking framework. There are several frameworks available; I will be using MOQ for this article series.

We will be installing MOQ with the help of the NuGet package manager. The easiest way to install third party libraries using NuGet is through the GUI. Right-click References inside the unit-test project and click “Add package reference”. In the next screen select “All” under the “Online” section on the left, then enter your search in the upper right corner as shown in the screenshot below. MOQ is the fourth result in the search results; click it and then click install.

This is just a simple example of the power of NuGet; installing complex libraries with dependencies is just as easy. You can use the same approach to update your third party libraries by selecting “All” under the “Updates” section on the left. No more manual importing of libraries, hurray!
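If you prefer the keyboard over the GUI, the NuGet Package Manager Console achieves the same install with a single command:

Install-Package Moq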

Unit-testing

We will be writing tests during the development of our application. Now that we have MOQ installed, let's talk about the importance of automated testing in the form of unit-testing. If you have limited or no experience with automated testing or unit-testing, I recommend reading up on at least the basics of the subject before continuing; check out my resource page for recommended reading.

Before we get to why we should unit-test, let's set a definition for unit-testing. The clearest definition I have come across is one made by Roy Osherove: a unit-test is an automated piece of code that invokes the method or class being tested and then checks some assumptions about the logical behavior of that method or class. A unit-test is always written using a unit-testing framework. It is fully automated, trustworthy, readable and maintainable.

But why should we as developers be bothered with unit-testing in the first place? Basically because we should care about the quality of the web applications we develop. Delivering quality can be a challenge with a constantly evolving application; unit-testing provides a means to enhance and maintain the quality of our web application. The three major advantages offered by unit-testing are:

  • Unit-testing allows for easy refactoring. Readily-available unit tests make it easy for the programmer to check whether a piece of code is still working properly after a change. If a change causes a fault, it can be quickly identified and fixed.
  • Unit-testing makes for great technical documentation. Developers somehow need to convey the functionality offered by a class or method to other developers. One of the possibilities is looking at the unit-tests to gain a basic understanding of a class or method.
  • Unit-testing promotes good coding. An important aspect of good coding is loose coupling. Loose coupling is a prerequisite for testing code with dependencies because it enables the replacement of dependencies with test doubles (stubs, mocks). This is the only way to control how dependencies will behave under test, which is vital for unit-testing.

Having discussed the advantages of unit-testing, it's only fair to point out that unit-testing is not able to cover all the bases; it has its limitations and you should not rely solely on one technique. The first limitation of unit-testing is that it does not cover integration between different parts of the system. The second limitation is that the test scenarios are created by the developer. Somehow end-users tend to be creative and use/abuse the application in ways that developers do not expect, and such scenarios are therefore not covered by unit-testing.

While unit-testing does not cover all the bases, it is very important for the overall quality of the application. Just do not rely solely on unit-testing; some form of integration testing and user testing is important to prepare an application for “the real world”. Now let's do some unit-testing of our own.

Routing

As mentioned before, we will start with the automated testing of our routes, since the routing mechanism is vital to the mapping of requests within our application.

The challenge with testing routes is that the routing mechanism expects a request from the user, in the form of an HttpContext instance. This is a dependency, which we are going to replace with a special test double called a mock, using the MOQ framework. We configure the mock to behave the way we want, faking a request from the user and giving us control over how the dependency behaves.

With the mock in place, we call the routing mechanism, passing in our mocked object. The routing mechanism then returns the route data, which contains the information on how the routing mechanism will map the request. We can check the contents of the route data to see how the request would be mapped. Now let's see the code:

using System.Web;
using System.Web.Routing;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

namespace BookReviews.Web.Tests.Routes
{
    public static class RouteTestHelpers
    {
        public static Mock<HttpContextBase> BuildMockHttpContext(string url)
        {
            // Create a mock instance of the HttpContext class.
            var mockHttpContext = new Mock<HttpContextBase>();

            // Decorate our mock object with the desired behaviour.
            var mockRequest = new Mock<HttpRequestBase>();
            mockHttpContext.Setup(x => x.Request).Returns(mockRequest.Object);
            mockRequest.Setup(x => x.AppRelativeCurrentExecutionFilePath).Returns(url);

            var mockResponse = new Mock<HttpResponseBase>();
            mockHttpContext.Setup(x => x.Response).Returns(mockResponse.Object);
            mockResponse.Setup(x => x.ApplyAppPathModifier(It.IsAny<string>())).Returns<string>(x => x);

            return mockHttpContext;
        }

        public static RouteData GetRouteDataForMockedRequest(string url)
        {
            var routes = new RouteCollection();
            MvcApplication.RegisterRoutes(routes);

            // Create a mock instance of the HttpContext class with the desired behaviour.
            var mockHttpContext = BuildMockHttpContext(url);

            return routes.GetRouteData(mockHttpContext.Object);
        }
    }

    [TestClass]
    public class RouteTests
    {
        [TestMethod]
        public void TestReviewControllerIndexRoute()
        {
            RouteData routeData = RouteTestHelpers.GetRouteDataForMockedRequest("~/review/index");

            // Check if the route was mapped as expected.
            Assert.IsTrue(routeData != null && routeData.Values["controller"].ToString() == "review");
            Assert.IsTrue(routeData != null && routeData.Values["action"].ToString() == "index");
        }
    }
}
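Since the helper methods do the heavy lifting, covering another route only takes a few lines. For example, assuming the default route registered by the MVC 3 project template (with Home/Index as the defaults), a test for the root URL could look like this:

[TestMethod]
public void TestDefaultRoute()
{
    RouteData routeData = RouteTestHelpers.GetRouteDataForMockedRequest("~/");

    // An empty path should fall through to the defaults: the Home controller's Index action.
    Assert.IsTrue(routeData != null && routeData.Values["controller"].ToString() == "Home");
    Assert.IsTrue(routeData != null && routeData.Values["action"].ToString() == "Index");
}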

The code should be pretty straightforward given the comments and the explanation provided earlier. As you may have noticed, pretty much all of the code is reusable, so adding tests for new routes takes very little effort. In the next article we will start coding a lot more, beginning with our model. So this is it for the second article! Feel free to ask questions, feedback is much appreciated. Get the full source code for our progress so far from here!

Asp.Net MvC 3 from scratch: Introduction

Web development is best suited for those who like continuous learning; you have to keep up with the constant changes in technologies and available frameworks. To get a grip on new technologies and frameworks I often find myself writing code simply to learn and experiment. Now I have decided to combine this experimenting with my blogging by developing a series of articles based on building a web application using Asp.Net MvC 3.

The Asp.Net MvC 3 framework

I chose to develop my application with the Asp.Net MvC framework because it enables me to develop high quality web applications. The framework utilizes successful design patterns (Model View Controller) and is built on important object oriented principles and best practices like separation of concerns, loose coupling and testability. While it may take some getting used to for those coming from Asp.Net Web Forms, the initial learning investment will prove worthwhile.

The Asp.Net MvC framework recently had its third major release. While many new features have been added, there are two that got me really excited. First is the Razor view engine, which offers a cleaner, “easy on the eyes” syntax for the views. Razor comes with full IntelliSense support. The second feature is the NuGet package manager, which enables developers to easily manage third party libraries and their dependencies from within Visual Studio. NuGet makes installing and updating all third party libraries a breeze. We will be working with both of these new features during this article series. Having discussed the framework, let's talk about the application we are going to build.

The Application

The concept behind the application is pretty straightforward. Community demo applications like NerdDinner proved that a simple concept can be effective for those wanting to learn a framework. We are going to develop a community website centered on book reviews. We start simple with the core functionality: write, tag and submit book reviews. As the series progresses, functionality will be extended. We will develop responses to book reviews with a badge/kudos system, and we should also implement some membership mechanism like OpenID integration. But before we get carried away, let's actually start by creating a new project!

Project creation

Time to fire up Visual Studio and get coding! Make sure you have installed Asp.Net MvC 3, otherwise it will not show up in the project templates when creating a new project. In the project creation screen select the Asp.Net MvC 3 project template. Name the project BookReviews.Web and the solution BookReviews. In the second screen of the project creation, select the following settings:

  1. Project template: Internet application
  2. View engine: Razor
  3. Create a unit-test project: Yes
  4. Test project Name: BookReviews.Web.Tests
  5. Unit-test framework: Visual Studio Unit Test

Wrapping up

After creating the project your Solution Explorer should look like the screenshot on the right, containing the web project along with a unit-test project. The unit-test project offers the opportunity for automated testing, which we will start with in the next article. We will be using automated testing for our routes, ensuring that requests made by users get mapped properly within our application. So this is it for the first article; the next one will get posted within a few days! Feel free to ask questions, feedback is much appreciated.

Generating bulk SQL data scripts

Most web developers who build Asp.Net web applications come across SQL Server. While developers may have less direct contact with databases than before due to ORM (object-relational mapping) tools, most of us still work with databases on a regular basis. Today was one of those days: I was faced with generating a couple of large SQL data export scripts. These scripts have to be executed by a service provider, and since I don't have access to their machines, delivering scripts is the only simple way.

There are many tools for generating large SQL data export scripts, but not all are suited for managing large/bulk scripts. SQL Data Compare, a tool by Red Gate, generates scripts that are rather large, without an option to split the generated script. Since SQL Data Compare was my only tool for managing SQL data, I had to search for additional tooling, and I found the wonderful BCP utility.

It turns out the BCP utility is included in the SQL Server install, and it's specifically made for dealing with large bulk exports and imports. It's old-school command-line, which might not make it the most user-friendly tool, but damn, is it fast. And it's available on every machine that has SQL Server installed. The basic usage of the BCP utility is clearly explained in this article; it's pretty straightforward and the tool is awesome, so use it! That is all, happy coding!
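To give a flavor of the tool, here is a minimal export/import round trip; the server, database and table names are placeholders, -T uses your Windows credentials and -c exports in plain character format:

REM Export a table to a data file using a trusted connection.
bcp MyDatabase.dbo.MyTable out MyTable.dat -S .\SQLEXPRESS -T -c

REM Import the same file on the target server.
bcp MyDatabase.dbo.MyTable in MyTable.dat -S TargetServer -T -c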