Wednesday, September 27, 2006

Problems with the DataGridViewComboBoxColumn

I've been playing with the new DataGridView that comes with Windows.Forms 2.0. It's really nice because it fully supports in-place editing and supplies a range of built-in cell editors. Unfortunately they don't behave exactly as you'd expect (or maybe it's just me), and I've just spent a very frustrating morning with much googling trying to get some simple functionality to work that should have been a ten-minute job. The problem is with the combo box cell editor (the DataGridViewComboBoxColumn column type). Now, with a standard Windows combo box you can assign its DataSource property to anything that implements IList; by default it displays each item's ToString() value in the drop-down list and the SelectedItem property is the item that's selected, as you'd expect. This makes it really, really easy (one line of code easy) to get the user to choose a particular object from a list. There are also DisplayMember and ValueMember properties that let you choose the property that's displayed and the property whose value is returned from SelectedValue, but I've never used these, since the default behaviour is exactly what I need 99% of the time.
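
Just to show what I mean, here's a minimal sketch of that one-line binding with a plain ComboBox (the Person class is made up for this sketch and isn't part of the example below):

using System;
using System.Windows.Forms;

namespace ComboBoxSketch
{
    // A made-up item type, just for this sketch.
    public class Person
    {
        string _name;

        public Person(string name)
        {
            _name = name;
        }

        public string Name
        {
            get { return _name; }
        }

        public override string ToString()
        {
            return _name;
        }
    }

    public class SketchForm : Form
    {
        public SketchForm()
        {
            ComboBox combo = new ComboBox();

            // The one line that does the work: the drop-down shows each item's
            // ToString() and SelectedItem is the actual Person object.
            combo.DataSource = new Person[] { new Person("freddy"), new Person("Alice") };

            combo.SelectedIndexChanged += delegate
            {
                Person selected = (Person)combo.SelectedItem;
                Text = selected.Name; // show the selection in the title bar
            };

            Controls.Add(combo);
        }
    }
}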

However, the DataGridViewComboBoxColumn doesn't work like this. Although it will display the ToString() value if you don't set the DisplayMember, something goes wrong internally when it tries to look up the selected item, so you have to set DisplayMember to a public property of your class. Even worse, if you don't set the ValueMember property the default behaviour is to return the DisplayMember value; there's no way of getting the actual item itself. The only workaround is to add a property to your class that returns itself and set that property as the ValueMember. Of course, if your item isn't something you're able to change (such as one of the framework classes) you'll have to kludge together a container object that holds a reference to your item (there's a sketch of what I mean below, after the main example). It's really awkward. Here's a little example I used to try and get it to work...

First, here's the class I want to bind to my DataGridView, note that the Child property is of type Child:

using System;

namespace DataGridViewTest
{
    public class Thing
    {
        string _name;

        public string Name
        {
            get { return _name; }
            set { _name = value; }
        }

        Child _child;

        public Child Child
        {
            get { return _child; }
            set { _child = value; }
        }
    }
}

Here's the Child class; you can see that I've implemented a 'This' property that returns a reference to itself:

using System;

namespace DataGridViewTest
{
    public class Child
    {
        string _name;

        public string Name
        {
            get { return _name; }
            set { _name = value; }
        }

        public Child(string name)
        {
            _name = name;
        }

        public Child This
        {
            get { return this; }
        }
    }
}

And here's the form code. In the constructor we bind a list of things to the DataGridView using a P&P DataGridViewBinding, set _childColumn.ValueMember to the This property so that a Child is returned into the Child property of Thing, set _childColumn.DisplayMember to the Name property of Child (because the default ToString() behaviour doesn't work for the drop-down list), and finally bind the DataSource of the child column to a list of children. showToolStripMenuItem_Click is the event handler for a menu item that just displays the things that have been created, and GetChildren simply constructs a list of children.

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using Microsoft.Practices.SoftwareFactories.ServiceFactory.Editors.Grid;

namespace DataGridViewTest
{
    public partial class Form1 : Form
    {
        List<Thing> _things = new List<Thing>();
        DataGridViewBinding _thingsBinding = new DataGridViewBinding();

        public Form1()
        {
            InitializeComponent();
            _childColumn.ValueMember = "This";
            _childColumn.DisplayMember = "Name";

            _thingsBinding.Bind(_dataGrid, _things);
            _childColumn.DataSource = GetChildren();
        }

        private void showToolStripMenuItem_Click(object sender, EventArgs e)
        {
            StringBuilder message = new StringBuilder();
            foreach(Thing thing in _things)
            {
                message.Append(string.Format("{0}, {1}\r\n", thing.Name, thing.Child.Name));
            }
            MessageBox.Show(message.ToString());
        }

        private Child[] GetChildren()
        {
            return new Child[]{
                new Child("freddy"),
                new Child("Alice"),
                new Child("Leo"),
                new Child("Ben")
            };
        }
    }
}
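
And for completeness, here's the kind of wrapper kludge I mentioned above for when you can't add a 'This' property to the item type itself. It's only a sketch; the DateTimeWrapper name and the choice of DateTime as the wrapped type are purely for illustration:

using System;

namespace DataGridViewTest
{
    // A sketch of the container-object workaround for types you can't modify
    // (DateTime is just an arbitrary framework type used for illustration).
    public class DateTimeWrapper
    {
        DateTime _value;

        public DateTimeWrapper(DateTime value)
        {
            _value = value;
        }

        // Point DisplayMember at this.
        public string Display
        {
            get { return _value.ToShortDateString(); }
        }

        // Point ValueMember at this so the cell value is the wrapper itself.
        public DateTimeWrapper This
        {
            get { return this; }
        }

        // The wrapped item is then available from the chosen wrapper.
        public DateTime Value
        {
            get { return _value; }
        }
    }
}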

I noticed that all the example code in MSDN is devoted to binding the DataGridView to DataSets. I wonder if it's because of this DataSet-centric mindset that the Windows.Forms team is making such basic implementation errors?

Monday, September 25, 2006

Discovering GetService

Whilst looking at the Patterns & Practices GAT code, particularly custom wizard pages, I kept seeing lines like this:

IDictionaryService dictionaryService = GetService(typeof(IDictionaryService)) as IDictionaryService;
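
As far as I can tell, IDictionaryService is just a simple key/value store that the recipe framework uses to share values between the wizard pages and the recipe arguments. Here's a rough, self-contained sketch of the call pattern (the component and the "ProjectName" key are made up, this isn't GAT code):

using System;
using System.ComponentModel;
using System.ComponentModel.Design;

namespace GetServicePlay
{
    // A made-up component that stores and retrieves a value via IDictionaryService,
    // if its container happens to provide one.
    public class WizardPageLikeComponent : Component
    {
        public void RememberProjectName(string projectName)
        {
            IDictionaryService dictionaryService =
                GetService(typeof(IDictionaryService)) as IDictionaryService;

            if (dictionaryService != null)
            {
                dictionaryService.SetValue("ProjectName", projectName);
            }
        }

        public string RecallProjectName()
        {
            IDictionaryService dictionaryService =
                GetService(typeof(IDictionaryService)) as IDictionaryService;

            return dictionaryService == null
                ? null
                : (string)dictionaryService.GetValue("ProjectName");
        }
    }
}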

I'd never come across this GetService method before; when I hit F1 I found out that it's a method inherited from System.ComponentModel.Component. Now, I thought I sort of understood the whole IComponent / IContainer thing: it's a pattern used by the IDE to allow you to drag and drop classes (components) onto surfaces (other classes), letting us graphically compose applications. This is classic RAD stuff and it's really easy to use: you just implement IComponent, your class appears in the Visual Studio toolbox, and if you double-click on the class in the solution explorer it displays a design surface onto which you can drag any other class that implements IComponent. I must admit I hadn't really thought about it much, or used that functionality other than as it's provided out of the box by the forms designer. I'm more of a POCO guy myself, and I couldn't really see what benefits implementing classes as components provided other than that neat drag-drop composition stuff.

After Googling for GetService I found this excellent post, Lightweight Containers and Plugin Architectures, by Daniel Cazzulino, where he compares the IContainer pattern with the lightweight container architectures described by Martin Fowler in this post. According to Mr Cazzulino the IContainer pattern provides the .net world with a very nicely designed component architecture that can be used for composing and sharing services at runtime, and that's where the GetService method comes in. I won't repeat all of Daniel's excellent article, which is definitely worth reading, especially the class model which describes the relationship between IComponent, IContainer, ISite, IServiceContainer and IServiceProvider. Instead, here's a little practical example of how a dumb server can host components, some of which provide services to the other components, all wired together at runtime using the IContainer pattern. The core trick is to extend the built-in Container to provide an instance of ServiceContainer:

using System;
using System.ComponentModel;
using System.ComponentModel.Design;

namespace ComponentPlay
{
    public class ServiceProvidingContainer : Container
    {
        IServiceContainer _serviceContainer;

        public IServiceContainer ServiceContainer
        {
            get { return _serviceContainer; }
        }

        public ServiceProvidingContainer()
        {
            _serviceContainer = new ServiceContainer();
            _serviceContainer.AddService(typeof(IServiceContainer), _serviceContainer);
        }

        protected override object GetService(Type service)
        {
            return _serviceContainer.GetService(service);
        }
    }
}

Then you can provide a server with your custom Container:

using System;
using System.ComponentModel;
using System.ComponentModel.Design;

namespace ComponentPlay
{
    public class Server
    {
        IContainer _components = new ServiceProvidingContainer();

        public IContainer Components
        {
            get { return _components; }
        }
    }
}

Then I've defined a simple service interface:

using System;
using System.Collections.Generic;
using System.Text;

namespace ComponentPlay
{
    public interface IMessageProvider
    {
        string Message { get; }
    }
}

And here's a component that implements IMessageProvider. The important thing here is to note how it overrides the Site property of Component to add itself to the service container, thus registering itself as an available service to other components (this is straight out of Daniel's post):

using System;
using System.ComponentModel;
using System.ComponentModel.Design;

namespace ComponentPlay
{
    public class MessageProvider : Component, IMessageProvider
    {
        string _message;

        public MessageProvider(string message)
        {
            _message = message;
        }

        public string Message
        {
            get
            {
                return _message;
            }
        }

        public override ISite Site
        {
            get
            {
                return base.Site;
            }
            set
            {
                base.Site = value;
                
                // publish this instance as a service
                IServiceContainer serviceContainer = (IServiceContainer)GetService(typeof(IServiceContainer));
                if(serviceContainer != null)
                {
                    serviceContainer.AddService(typeof(IMessageProvider), this);
                }
            }
        }
    }
}

Here's a component that can use an IMessageProvider. Note that you can't guarantee that the service will be available:

using System;
using System.ComponentModel;

namespace ComponentPlay
{
    public class Client : Component
    {
        public void SayHello()
        {
            IMessageProvider mp = (IMessageProvider)GetService(typeof(IMessageProvider));
            if(mp == null)
                Console.WriteLine("IMessageProvider is null");
            else
                Console.WriteLine(mp.Message);
        }
    }
}

And here's a little test program showing it all put together:

using System;
using System.Collections.Generic;
using System.Text;

namespace ComponentPlay
{
    class Program
    {
        static void Main(string[] args)
        {
            Server server = new Server();
            Client client = new Client();
            MessageProvider mp = new MessageProvider("Hello from the message provider");

            server.Components.Add(client);
            server.Components.Add(mp);

            client.SayHello();

            Console.ReadLine();
        }
    }
}

The cool thing about this pattern is that the server is simply a container and needs to know nothing about the components and services it hosts. Services can register themselves without coupling to either the server framework or to any other components, and components can discover services without any intervention from the server framework. It's really nice to see yet another useful pattern that comes out of the box with .net; I'm just surprised that it's not more widely advertised. It's really poorly documented, with no real explanation in the MSDN library (or am I just missing something?). I'd love to have found a good article explaining the design choices behind this pattern and some implementation examples.

Friday, September 22, 2006

Just what do you do all day?

I've just read an excellent post by Peter Hallam, who works on the C# compiler at Microsoft. He looks at the amount of time professional developers spend writing new code, modifying existing code and understanding existing code. How much time do you think you spend on each of these activities? I wouldn't have guessed the correct figures, which are:

Writing new code:             5%

Modifying existing code:    25%

Understanding code:         70%

I instinctively think that I spend most of my time writing new code, but that's really not true. I'm currently working on building a guidance automation package, an entirely new product, but I've probably been spending the vast majority of my time reading the documentation and reading the code in the Service Factory GAT to understand the GAT framework. And this is on a new product. I once spent two and a half years of my life maintaining a huge buggy spaghetti code base; during that time I probably spent 99% of my time trying to understand existing code and the remaining 1% making small changes.

These figures strongly reinforce my belief that the number one imperative for professional software developers is writing easy-to-understand, maintainable code, even if it doesn't perform as well and even if it takes longer to write. Think about it: if I spend five times as long writing a well-factored, domain-model-based application versus someone who uses all the Visual Studio tools to hack together something with datasets, no tiers and business logic spread throughout, I only have to make a small dent in the understanding-code time to dramatically improve my productivity and the productivity of those poor unfortunate people who have to maintain my code long after I'm gone (from the project that is, not, like, dead; I really hope I outlive my software creations :)

I'm going to store Peter Hallam's blog in my top blog posts list so that next time someone argues that refactoring, layering or writing a proper domain model is taking too much time I'll have some great figures to back me up.

Thursday, September 21, 2006

How to examine code and write a class with EnvDTE

Further to my experiments with the Guidance Automation Toolkit, I've been playing with generating code from my custom guidance package. Looking at the Service Factory GAT that's been released by the Patterns and Practices group, they use three different techniques for code generation: T4 templates, EnvDTE and the CodeDom. If they use all three, I wondered which one I should be using. I've previously used the CodeDom in other projects and, although it's very powerful (you build up the syntactic structure of the code and can then render it as C#, VB or whatever), it is really long-winded. T4 templates are at the opposite end of the spectrum, a bit like ASP for code generation: you simply write a template of the code you want to generate and put code between <# #> marks for the template engine to run. The problem at the moment is that they're really new and the tools aren't there yet; there's no intellisense or code colouring for them and debugging isn't easy either.
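
To show what I mean by long-winded, here's a rough sketch (nothing to do with the GAT, just a made-up example) of what it takes to get the CodeDom to emit a trivial class with a single read-only property:

using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using Microsoft.CSharp;

namespace CodeDomDemo
{
    class Program
    {
        static void Main()
        {
            // Build the syntax tree for:
            // namespace Demo { public class Greeter { public string Message { get { return "Hello"; } } } }
            CodeCompileUnit unit = new CodeCompileUnit();
            CodeNamespace ns = new CodeNamespace("Demo");
            unit.Namespaces.Add(ns);

            CodeTypeDeclaration greeter = new CodeTypeDeclaration("Greeter");
            greeter.IsClass = true;
            ns.Types.Add(greeter);

            CodeMemberProperty message = new CodeMemberProperty();
            message.Name = "Message";
            message.Type = new CodeTypeReference(typeof(string));
            message.Attributes = MemberAttributes.Public | MemberAttributes.Final;
            message.HasGet = true;
            message.GetStatements.Add(
                new CodeMethodReturnStatement(new CodePrimitiveExpression("Hello")));
            greeter.Members.Add(message);

            // Render the tree as C#; swap in a VBCodeProvider and the same tree becomes VB.
            using (StringWriter writer = new StringWriter())
            {
                new CSharpCodeProvider().GenerateCodeFromCompileUnit(
                    unit, writer, new CodeGeneratorOptions());
                Console.WriteLine(writer.ToString());
            }
        }
    }
}

That's a lot of ceremony for half a dozen lines of generated output.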

So I decided to have a look at the EnvDTE Visual Studio automation class library for code generation. A lot of the GAT stuff seems to be built around it, so it's a natural fit for code generation duties. Unfortunately the documentation isn't that great, and this little demo of how to navigate a code file and write a class took much longer than it should have. But here it is. It gets the current Visual Studio environment and enumerates through all the projects and project items. It then examines itself, outputting all the code elements, and finally writes a new class inside its own namespace. If you try this out, make sure you name the file it's in 'HowToUseCodeModelSpike.cs'.

using System;
using NUnit.Framework;
using EnvDTE;
using EnvDTE80;

namespace Mike.Tests
{
    [TestFixture]
    public class DteSpike
    {
        [Test]
        public void HowToUseCodeModelSpike()
        {
            // get the DTE reference...
            DTE2 dte2 = (EnvDTE80.DTE2)System.Runtime.InteropServices.Marshal.GetActiveObject("VisualStudio.DTE.8.0");

            // get the solution
            Solution solution = dte2.Solution;
            Console.WriteLine(solution.FullName);

            // get all the projects
            foreach(Project project in solution.Projects)
            {
                Console.WriteLine("\t{0}", project.FullName);

                // get all the items in each project
                foreach(ProjectItem item in project.ProjectItems)
                {
                    Console.WriteLine("\t\t{0}", item.Name);

                    // find this file and examine it
                    if(item.Name == "HowToUseCodeModelSpike.cs")
                    {
                        ExamineItem(item);
                    }
                }
            }
        }

        // examine an item
        private void ExamineItem(ProjectItem item)
        {
            FileCodeModel2 model = (FileCodeModel2)item.FileCodeModel;
            foreach(CodeElement codeElement in model.CodeElements)
            {
                ExamineCodeElement(codeElement, 3);
            }
        }

        // recursively examine code elements
        private void ExamineCodeElement(CodeElement codeElement, int tabs)
        {
            tabs++;
            try
            {
                Console.WriteLine(new string('\t', tabs) + "{0} {1}", 
                    codeElement.Name, codeElement.Kind.ToString());

                // if this is a namespace, add a class to it.
                if(codeElement.Kind == vsCMElement.vsCMElementNamespace)
                {
                    AddClassToNamespace((CodeNamespace)codeElement);
                }

                foreach(CodeElement childElement in codeElement.Children)
                {
                    ExamineCodeElement(childElement, tabs);
                }
            }
            catch
            {
                Console.WriteLine(new string('\t', tabs) + "codeElement without name: {0}", codeElement.Kind.ToString());
            }
        }

        // add a class to the given namespace
        private void AddClassToNamespace(CodeNamespace ns)
        {
            // add a class
            CodeClass2 chess = (CodeClass2)ns.AddClass("Chess", -1, null, null, vsCMAccess.vsCMAccessPublic);
            
            // add a function with a parameter and a comment
            CodeFunction2 move = (CodeFunction2)chess.AddFunction("Move", vsCMFunction.vsCMFunctionFunction, "int", -1, vsCMAccess.vsCMAccessPublic, null);
            move.AddParameter("IsOK", "bool", -1);
            move.Comment = "This is the move function";

            // add some text to the body of the function
            EditPoint2 editPoint = (EditPoint2)move.GetStartPoint(vsCMPart.vsCMPartBody).CreateEditPoint();
            editPoint.Indent(null, 0);
            editPoint.Insert("int a = 1;");
            editPoint.InsertNewLine(1);
            editPoint.Indent(null, 3);
            editPoint.Insert("int b = 3;");
            editPoint.InsertNewLine(2);
            editPoint.Indent(null, 3);
            editPoint.Insert("return a + b; //");
        }
    }
}

Thursday, September 14, 2006

Are you a nerd?

I tried this 'are you a nerd' test today. My result was 'Mid Level Nerd'. I was probably saved by the fact that I'm a family man and my wife kinda provides me with a social life :)

Wednesday, September 13, 2006

How to do database source control and builds

Most business applications are fundamentally database applications. A business application's database is as much a part of the application as its C# source code. Unfortunately, in many development shops versioning and controlling the database is done very differently from the in memory source code. For example:

  • Often a single 'development' database is shared between the developers. This means that if a developer makes a schema change he runs the risk of breaking his colleagues' environments, at least until the next build. Often this leads to a fear of changing the database schema.
  • The database is often versioned independently of the in memory source code. This makes it hard to deploy a specific build because often the database backups and the in memory source are not synchronised.
  • The database is often not source controlled, or if it is, it is done as a single build mega-script. This makes it impossible to package linked database and in memory source changes as a single 'feature' or 'fix', and it makes it hard or impossible to roll back a single failing feature. Also, because the mega-script is often created by choosing the 'script database' feature of the DBMS, it is checked into source control under the build manager's login, or whatever login the build scripts are running under. This means that it is impossible to track database changes. I can't go to a single table's create script in SourceSafe, look for the last time it was checked in, find out who made the change and then read the comments to find out why it was changed. The best I can manage is to look through every single version of a huge script file looking for a change, but then there's no way of knowing who made it or why it was made.

How should you manage your database development then? Well the best thing is to treat it as much as you can just like all your other source code:

  • Every developer should develop against their own database instance.
  • A database backup should be stored with the executables from the build. That database backup should be a backup of a database built from the sql scripts retrieved from source control at the same label as all the other source.
  • The database should be maintained as object scripts. Each table, stored procedure, view and function should have its own script. To make any change in their local database a developer should check out the object they want to change, apply that to their database, unit test the change and then check in the change as a package with any other source for that feature or fix. The developer should label the package and comment it with a link to a feature or defect number.
  • Before a developer starts work on the next feature or fix he should get the source from the last build label (which, with continuous integration should also be the latest version) and restore his local database from the same labelled backup.

In order to do this you need tools that allow you to do the following:

  • Build a database from object-level source files. Most DBMSs will choke if you try to just run the scripts in without first working out a build order by examining the object references. The tool must automate this for you; it's far too onerous to try to do it manually.
  • Be able to upgrade a database to a new schema version by calculating 'alter' scripts.

Here's where I plug a product that a couple of guys I know and worked with at NTL have developed: DbGhost. It allows you to adopt all the good practices that I've listed above. I've got it adopted on several projects now, and they still haven't given me any commission or offered me a lucrative consultancy contract :(

Tuesday, September 12, 2006

Windows Live Writer

I've been using Windows Live Writer to write this post and the last one. In fact the last post was something I'd already written for our internal wiki and I was able to cut and paste it from there straight into the WLW window. It worked! It's a really cool tool and makes blogging a lot easier.

How to create a guidance package

After creating my first test guidance package using the Guidance Automation Toolkit (GAT), I've cobbled together some bullet-point steps on how to do it. This is really rough at the moment, but I'll be adding to it as my knowledge grows. It took a while because there's nothing like a walkthrough to follow, and I didn't like using the meta guidance package because I wanted to understand how it all hung together first.

Creating the initial solution

  • Create a new solution with a class library project.
  • Add References:
EnvDTE
EnvDTE80
Microsoft.Practices.Common
Microsoft.Practices.ComponentModel
Microsoft.Practices.RecipeFramework
Microsoft.Practices.RecipeFramework.Common
Microsoft.Practices.RecipeFramework.Library
Microsoft.Practices.RecipeFramework.VisualStudio
Microsoft.Practices.WizardFramework
System
System.Windows.Forms
  • Tools->Guidance Package Manager->Enable/Disable Packages->Choose Guidance Package Development
  • Create a new class library project, name it Installer
  • Add References
Microsoft.Practices.RecipeFramework
Microsoft.Practices.RecipeFramework.VisualStudio
Microsoft.VisualStudio.TemplateWizardInterface
System
System.Configuration.Install
  • Set a project dependency of your guidance package project on the installer project
  • Create a new class called 'InstallerClass' in the installer project.
using Microsoft.Practices.RecipeFramework; 
namespace TestGuidancePackageInstaller
{
    /// <summary>
    /// Installer class for the guidance package
    /// </summary>
    [System.ComponentModel.ToolboxItem(false)]
    public class InstallerClass : ManifestInstaller
    {
    }
}
  • Create the guidance package xml configuration document, MyGuidancePackageName.xml, see documentation for details
<?xml version="1.0" encoding="utf-8" ?>
<GuidancePackage xmlns="http://schemas.microsoft.com/pag/gax-core"
Name="GuidancePackageName"
Caption="My Guidance Package"
Description="A test guidance package"
BindingRecipe="BindingRecipe"
Guid="fdd8f06f-6d6d-4228-96db-f842076764af"
SchemaVersion="1.0">
<Overview Url="Docs\Overview.htm"/>
<Recipes>
<Recipe Name="BindingRecipe">
<Types>
<TypeAlias Name="RefCreator" Type="Microsoft.Practices.RecipeFramework.Library.Actions.CreateUnboundReferenceAction, Microsoft.Practices.RecipeFramework.Library"/>
</Types>
<Caption>Creates unbound references to the guidance package</Caption>
</Recipe>
</Recipes>
</GuidancePackage>
  • Set the properties of MyGuidancePackageName.xml to BuildAction="Content", Copy to Output Directory="Copy if newer"
  • Build the solution
  • On the solution context menu select 'Register Guidance Package'
  • Open a new instance of VS, create a new project, go to Tools->Guidance Package Manager->Enable/Disable Packages
  • You should see your package

Create the Binding Recipe

  • Add a new Recipe element under Recipes
  • Set its name to 'BindingRecipe'
  • Add attribute BindingRecipe="BindingRecipe" to the GuidancePackage root element
  • Add types:
<Types>
<TypeAlias Name="RefCreator" Type="Microsoft.Practices.RecipeFramework.Library.Actions.CreateUnboundReferenceAction, Microsoft.Practices.RecipeFramework.Library"/>
</Types>
  • For each recipe you want to reference, add Actions:
<Action Name="<name of action>" Type="RefCreator" AssetName="<name of recipe to bind>" ReferenceType="<unboundRecipeReference class>" />

Create a recipe

  • Add a new Recipe element under Recipes
  • Set attributes Name="its name" Bound="false"
  • Add Caption
  • Add HostData; this adds the recipe to the Solution, Project or Item context menus
<HostData>
<Icon ID="<icon number>"/>
<CommandBar Name="Project"/>
</HostData>
  • Add Arguments to specify the arguments that this recipe requires
  • Add GatheringServiceData to define the wizard that gets the arguments
  • Add Actions to specify what the recipe should do.

How to create and execute a T4 template

  • Create a folder 'Templates' in the guidance package project
  • Create a folder 'Text' in the 'Templates' folder
  • Add a template file with the extension .t4
  • Set the properties of the .t4 file: 'Build Action = Content', 'Copy to output directory = Copy if newer'
  • Write your .t4 template (see documentation on this)
  • Create a new Recipe as above
  • Create Arguments for all the properties of the .t4 template
  • Add an argument for the TargetFileName that adds .cs to the class name argument
<Argument Name="TargetFileName">
<ValueProvider Type="Microsoft.Practices.RecipeFramework.Library.ValueProviders.ExpressionEvaluatorValueProvider, Microsoft.Practices.RecipeFramework.Library" 
Expression="$(ClassName).cs">
<MonitorArgument Name="ClassName"/>
</ValueProvider>
</Argument>
  • Add an argument for the currently selected project
<Argument 
Name="CurrentProject" 
Type="EnvDTE.Project, EnvDTE, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
<ValueProvider Type="Microsoft.Practices.RecipeFramework.Library.ValueProviders.FirstSelectedProject, Microsoft.Practices.RecipeFramework.Library" />            
</Argument>
  • Add an action to execute the template:
<Action Name="<action name>" 
Type="Microsoft.Practices.RecipeFramework.VisualStudio.Library.Templates.TextTemplateAction, Microsoft.Practices.RecipeFramework.VisualStudio.Library"
Template="Text<name of template>.t4">
<Input Name="<name of template property>" RecipeArgument="<recipe argument name>"/>
<!-- ...as many Input elements as you have template properties -->
<Output Name="Content"/>
</Action>
  • Add an action to write a file to the currently selected project:
<Action Name="<actio name>" Type="Microsoft.Practices.RecipeFramework.Library.Actions.AddItemFromStringAction, Microsoft.Practices.RecipeFramework.Library" Open="true">
<Input Name="Content" ActionOutput="GenerateHelloClassAction.Content" />
<Input Name="TargetFileName" RecipeArgument="TargetFileName" />
<Input Name="Project" RecipeArgument="CurrentProject" />
</Action>

How to use solution and project templates to create a new solution structure from the New->Project menu in VS

  • Add a project folder, 'Templates', to the Guidance Package project.
  • Add a sub folder to 'Templates' called 'Solutions'.
  • Add a file called Solution.vstemplate to the 'Solutions' folder
  • Add an icon named Solution.ico to the 'Solutions' folder
  • Write the Solution.vstemplate. This is a 'multi-project' vstemplate with additions for GAT, example below:
<VSTemplate
Version="2.0"
Type="ProjectGroup"
xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
<TemplateData>
<Name>Test Guidance Package</Name>
<Description>A guidance package created to learn how to create guidance packages</Description>
<ProjectType>CSharp</ProjectType>
<Icon>Solution.ico</Icon>
<CreateNewFolder>false</CreateNewFolder>
<DefaultName>GatTest</DefaultName>
<ProvideDefaultName>true</ProvideDefaultName>
</TemplateData>
<TemplateContent>
<ProjectCollection>
<ProjectTemplateLink ProjectName="$ProjectName$">Projects\Domain\Domain.vstemplate</ProjectTemplateLink>
</ProjectCollection>
</TemplateContent>
<WizardExtension>
<Assembly>Microsoft.Practices.RecipeFramework.VisualStudio, Version=1.0.60429.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a</Assembly>
<FullClassName>Microsoft.Practices.RecipeFramework.VisualStudio.Templates.UnfoldTemplate</FullClassName>
</WizardExtension>
<WizardData>
<Template xmlns="http://schemas.microsoft.com/pag/gax-template"
SchemaVersion="1.0"
Recipe="CreateSolution">
<References>
</References>
</Template>
</WizardData>
</VSTemplate>
  • Note the Recipe="CreateSolution" attribute of the Template element under WizardData. This should point to a recipe defined in the MyGuidancePackageName.xml file. That recipe is executed when the template is unfolded (i.e. when the new solution is created), so you can use it to gather information from the user and execute any actions needed to build the solution items.
  • Under the 'Solutions' folder, create a folder called 'Projects'
  • Under the 'Projects' folder, create a folder for each project. Give it the project name
  • Add a ProjectName.vstemplate file to the project folder. Here's an example:
<VSTemplate
Version="2.0"
Type="Project"
xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
<TemplateData>
<Name>Domain model</Name>
<Description>A domain model for the application</Description>
<Icon>Domain.ico</Icon>
<ProjectType>CSharp</ProjectType>
<CreateNewFolder>false</CreateNewFolder>
<DefaultName>Domain</DefaultName>
<ProvideDefaultName>true</ProvideDefaultName>
</TemplateData>
<TemplateContent>
<Project File="Domain.csproj" ReplaceParameters="true">
<ProjectItem ReplaceParameters="true">Properties\AssemblyInfo.cs</ProjectItem>
</Project>
</TemplateContent>
<WizardExtension>
<Assembly>Microsoft.Practices.RecipeFramework.VisualStudio, Version=1.0.60429.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a</Assembly>
<FullClassName>Microsoft.Practices.RecipeFramework.VisualStudio.Templates.UnfoldTemplate</FullClassName>
</WizardExtension>
<WizardData>
<Template xmlns="http://schemas.microsoft.com/pag/gax-template"
SchemaVersion="1.0">
<References>
</References>
</Template>
</WizardData>
</VSTemplate>
  • Add a ProjectName.csproj file. This is a standard project template. Here's an example, but you can take any existing .csproj file as a template (just insert the appropriate $variables$ in the right places) 

<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProductVersion>8.0.30703</ProductVersion>
<SchemaVersion>2.0</SchemaVersion>
<ProjectGuid>$guid1$</ProjectGuid>
<OutputType>Library</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>$safeprojectname$</RootNamespace>
<AssemblyName>$safeprojectname$</AssemblyName>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
<DebugSymbols>true</DebugSymbols>
<DebugType>full</DebugType>
<Optimize>false</Optimize>
<OutputPath>bin\Debug\</OutputPath>
<DefineConstants>DEBUG;TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<DebugType>pdbonly</DebugType>
<Optimize>true</Optimize>
<OutputPath>bin\Release\</OutputPath>
<DefineConstants>TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
</PropertyGroup>
<ItemGroup>
<Reference Include="System"/>
<Reference Include="System.Data"/>
<Reference Include="System.Xml"/>
</ItemGroup>
<ItemGroup>
<Compile Include="Properties\AssemblyInfo.cs" />
</ItemGroup>
<Import Project="$(MSBuildBinPath)\Microsoft.CSHARP.Targets" />
</Project>

  • Add a ProjectName.ico icon file
  • Set the properties of all the items added above to Build Action = "Content", Copy to output directory = "Copy if newer"
  • Add a new folder 'Properties' under the project folder
  • Add an AssemblyInfo.cs file under the Properties folder
  • Insert the appropriate $variables$ to replace the values that visual studio automatically provides. Here's an example:
using System.Reflection;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;

// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("$projectname$")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("$registeredorganization$")]
[assembly: AssemblyProduct("$projectname$")]
[assembly: AssemblyCopyright("Copyright © $registeredorganization$ $year$")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]

// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]

// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("$guid1$")]

// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]

  • Set the properties of the AssemblyInfo.cs file to Build Action="Content", Copy to output directory = "Copy if newer"
  • Build the solution and Register the guidance package
  • Open a new instance of Visual Studio, select File->New->Project, Your Guidance Automation Package should now appear under 'Guidance Packages'.

Adding documentation to your guidance package

  • Create a new solution folder called 'Docs'.
  • Add a new HTML page called 'Overview.htm'
  • Set the properties of the Overview.htm file to Build Action="Content", Copy to output directory = "Copy if newer"
  • Add the following <Overview> element to your MyGuidancePackageName.xml file under the document element:
<Overview Url="Docs\Overview.htm"/>
  • When you choose your guidance package, the Overview.htm page will display in the guidance navigator window 

Thursday, September 07, 2006

Frustrating Visual Studio Templates

VS 2005 has a new template feature that allows you to easily create item and project templates that you can add to the 'new item' and 'new project' dialogue boxes. It's pretty cool: you just select an item or a project from an existing solution and click File->Export Template, and a wizard pops up that allows you to name the template and optionally deploy it. Scott Guthrie has a nice post about it on his excellent blog (a required read for .net developers). As well as projects and items there's also a multi-project template that you're supposed to be able to use to produce template solutions. I've been playing with this for the last couple of days in order to create a web service solution template for our team and it's pretty disappointing. Not much thought seems to have gone into it, and there are two things in particular that make it useless for me.

First, there's no way of customising the names of the projects in the solution. In the new solution dialogue you can choose the name of your solution, and most .net projects I've worked on also use this as a solution namespace. Usually you name the projects in the solution prefixed with the solution name, which plays well with the default namespace behaviour in Visual Studio. For example, say I call my solution 'Mike.MegaThing'. I'd then name my projects things like 'Mike.MegaThing.Domain', 'Mike.MegaThing.Service', 'Mike.MegaThing.Dal'. Now every time I create a new class in 'Mike.MegaThing.Domain', the namespace is automatically created as 'Mike.MegaThing.Domain'. Perfect. Unfortunately the multi-project template has no way of allowing you to change the names of its projects. Although template items, like classes, can be tokenised and expanded with a range of default parameters like $safeprojectname$, this doesn't apply to the .vstemplate files themselves. My solution template has to have fixed project names like 'Service' or 'Domain' with similar default namespaces, which is crap because I want 'Mike.MegaThing.Dal' to be in a different namespace to 'Mike.LittleThing.Dal'.

The other thing which makes the multi-project template useless is the weird way it creates the folder structure for your created solution. Normally, if you add a project to a solution it creates a sub folder under your solution folder with the project name and puts the .csproj file in there. The multi-project template doesn't do this; instead it creates a sub folder under the solution folder with the name of the solution and then creates the project folders inside that. There seems to be no way of altering this behaviour.

So, thumbs down to Visual Studio templates for creating template solutions, although if you just want simple project or item templates they're perfectly adequate. I'm now going to be digging into the Guidance and Automation Toolkit (GAT), which looks much closer to what I need. More soon.

Software Factories, Domain Specific Languages and related stuff

I've recently had to get up to speed on some new buzzwords: 'Domain Specific Language' (DSL), 'Software Factories' and the 'Guidance and Automation Toolkit' (GAT). What do they all mean, and how do they relate to each other? I found it quite hard to find out. Two of them, 'Software Factories' and the GAT, are Microsoft specific; the other (DSL) is a generic software term, but is also being used for a specific Microsoft product. Martin Fowler explains it here extremely well. Before I get down to the terms themselves, first a little background...

There's a constant trend in software development for common tasks to be automated; that's what computers are for. Of course the automation is a very tricky thing to get right and takes a while to evolve. First assemblers automated the conversion of human-readable codes to ones and zeros, then compilers automated common tasks in a way that allowed code to be written at a higher level of abstraction. At the same time operating systems, and now frameworks, gradually handled more and more housekeeping tasks such as opening and closing files, network communication and printing stuff. There were a few false dawns, usually when the level of abstraction was raised too quickly without understanding all the implications; the CASE tools of the '80s and '90s are a good example. They never really succeeded because they couldn't bridge the gap between code generation and customisation.

But now we're on the cusp of a new breakthrough in the level of abstraction we use to build common application types. The nirvana of the business developer is to be able to concentrate on capturing the business entities, relationships and rules and then to press a magic button that creates the application according to current best practices. We're getting closer and closer to this these days. The .net framework provides a huge level of abstraction for common tasks, the Enterprise Application Blocks make it even easier, and it's common to use either code generators such as CodeSmith, or object/relational mappers such as NHibernate, to automate data access. If you know all the tools out there and how to use them you can almost reach that nirvana. You can imagine a kind of 'automation of automation' where you might choose, say, an 'Enterprise smart client application' package that would know how to use all these frameworks and code generators to give you such a magic button. That, in short, is what Software Factories are: it's Microsoft's word for a collection of code generators and frameworks, as well as a mechanism to capture the developer's intentions, that allows the automatic generation of all the non-business-specific code.

So the top-level abstraction in this set of concepts is the Software Factory. At the practical, actually-available-today-in-beta level are the GAT and the DSL Tools. I was quite confused about how these played together when I first started looking at them. They seem to do similar things, but which one should you use when? In fact they are products of two separate teams at Microsoft who weren't really aware of what the other was doing until they'd progressed some way down their separate roads. The GAT was developed by the Patterns and Practices group, the people who make the Enterprise Application Blocks and produce advice on application architecture. It basically builds on Visual Studio templates, adding a code templating system similar to CodeSmith and some nice extension points that allow you to create wizards and actions. They've already released a number of software factories based on the GAT, such as the 'Service Factory'. The GAT is the easy-to-use, every-software-house-should-have-one way of building software factories.

The DSL Tools come as part of the Visual Studio SDK, produced by the Visual Studio team. The VS SDK is aimed at people who want to produce extensions for Visual Studio and is a bit more hard core than the GAT. I haven't looked at the DSL Tools in anywhere near the same depth, but I understand that they're essentially the diagramming engine used by the class designer and various other Visual Studio tools, linked to a code generation engine. Apparently both the Visual Studio team and the Patterns and Practices team were working on their own code templating engines, and it was only when someone noticed the duplicated effort that the two teams were brought together. The result was the T4 engine that's used in both tools.

These two toolsets represent quite an exciting development by Microsoft and, as I said above, there's an expectation that we're on the cusp of a significant step change in software development. You only have to read about some of the huge productivity benefits of things like Ruby on Rails to appreciate how easy life can be with the right tools.