Saturday, May 01, 2010

Web Deployment Tool (MSDeploy) : Build Package including extra files or excluding specific files

If you are using Visual Studio 2010 then you may already be aware that the Web Deployment Tool (aka MSDeploy) is integrated into Visual Studio. I've already posted a few blog entries about this tool. Two of the common questions that I get when discussing it with people are

  1. How do I exclude files from being placed in the package?
  2. How do I add other files to the created package?

I will address these two questions here. First we will look at the easier one, how to exclude files, but we will go over a bit of background first.

Web Publishing Pipeline

With Visual Studio 2010 a new concept has been introduced which is known as the Web Publishing Pipeline. In a nutshell this is a process which will take your web application, build it, and eventually create a package that you can use to deploy your application. This process is fully captured in MSBuild; VS 2010 ships many targets and tasks to support it. Since it's captured in MSBuild format, you can customize and extend it to your heart's desire. So what we need to do is hook into this process to perform the customizations that we need. The process is captured in the following files.

%program files%\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets
%program files%\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets

The Microsoft.WebApplication.targets file is imported by the web application's project file, and that file in turn imports the Microsoft.Web.Publishing.targets file.

Excluding files from being packaged

If you open the project file of a web application created with VS 2010, towards the bottom of it you will find the following line.

<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />

BTW you can open the project file inside of VS. Right click on the project and pick Unload Project. Then right click on the unloaded project and select Edit Project.

This statement will include all the targets and tasks that we need. Most of our customizations should go after that import; if you are not sure, put it after! So if you have files to exclude there is an item name, ExcludeFromPackageFiles, that can be used to do so. For example let's say that you have a file named Sample.Debug.xml which is included in your web application but you want that file to be excluded from the created packages. You can place the snippet below after that import statement.

<ItemGroup>
  <ExcludeFromPackageFiles Include="Sample.Debug.xml">
    <FromTarget>Project</FromTarget>
  </ExcludeFromPackageFiles>
</ItemGroup>

By populating this item the files will automatically be excluded. Note the usage of the FromTarget metadata here. I will not get into its details here, but you should know to always specify that value.

Including extra files into the package

Including extra files into the package is a bit harder but still no biggie if you are comfortable with MSBuild, and if you are not then read this. In order to do this we need to hook into the part of the process that collects the files for packaging. The target we need to extend is called CopyAllFilesToSingleFolder. Its packaging-related dependency property, CopyAllFilesToSingleFolderForPackageDependsOn (the property used in the snippet below), is what we can tap into to inject our own target. So we will create a target named CustomCollectFiles and inject it into the process. We achieve this with the following (remember, it goes after the import statement).

<PropertyGroup>
  <CopyAllFilesToSingleFolderForPackageDependsOn>
    CustomCollectFiles;
    $(CopyAllFilesToSingleFolderForPackageDependsOn);
  </CopyAllFilesToSingleFolderForPackageDependsOn>
</PropertyGroup>

This will add our target to the process; now we need to define the target itself. Let's assume that you have a folder named Extra Files that sits one level above your web project, and you want to include all of those files. Here is the CustomCollectFiles target, which we discuss afterwards.

<Target Name="CustomCollectFiles">
  <ItemGroup>
    <_CustomFiles Include="..\Extra Files\**\*" />

    <FilesForPackagingFromProject  Include="%(_CustomFiles.Identity)">
      <DestinationRelativePath>Extra Files\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>

Here what I did was create the item _CustomFiles and in the Include attribute told it to pick up all the files in that folder and any folder underneath it. Then I use this item to populate the FilesForPackagingFromProject item. This is the item that MSDeploy actually uses to add extra files. Also notice that I declared the DestinationRelativePath metadata value. This determines the relative path at which the file will be placed in the package. I used the expression Extra Files\%(RecursiveDir)%(Filename)%(Extension) here, which says to place each file in the same relative location in the package as it has under the Extra Files folder.

Admittedly this could be easier, but it's not too bad, and it's pretty flexible.

Sayed Ibrahim Hashimi

msbuild | MSBuild 4.0 | MSDeploy | Visual Studio | Visual Studio 2010 | Web Deployment Tool | Web Publishing Pipeline Saturday, May 01, 2010 4:09:16 AM (GMT Daylight Time, UTC+01:00)
Friday, April 30, 2010

MSBuild 4.0: New command line switches

If you are using MSBuild 4.0 then you may be interested to know that there are a couple of new switches that you can pass to msbuild.exe when you kick off a build. The new switches, /preprocess (/pp) and /detailedsummary (/ds), are more a convenience than a necessity.

/preprocess (/pp)

Since you can import other MSBuild files using the Import element, locating where a target, property or item is being defined can sometimes lead to a search that takes you through several files. It can be even more confusing if more than one file defines the property or target that you are interested in, because you may have thought that you found the right target but actually found one that was overridden by another file. Now with MSBuild 4.0 you don't have to search through all of those files; you can use the /preprocess switch. Here is the snippet from msbuild.exe /? describing it.

/preprocess[:file] 
 Creates a single, aggregated project file by inlining all the files that would be imported during a build, with their boundaries marked. This can be useful for figuring out what files are being imported and from where, and what they will contribute to the build. By default the output is written to the console window. If the path to an output file is provided that will be used instead.
 (Short form: /pp)
 Example:
   /pp:out.txt

When you use this the full logical project file is dumped to the console, or optionally to a file, and it includes references to where the elements are defined. For example I created the following very simple project files.

import-01.proj

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <Target Name="TargetOne">
        <Message Text="From import-02.proj - TargetOne"/>
    </Target>
</Project>

import-02.proj

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <Target Name="TargetTwo">
        <Message Text="From import-02.proj - TargetTwo"/>
    </Target>
</Project>

master.proj

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="TargetOne;TargetTwo" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

    <Target Name="TargetOne">
        <Message Text="From master.proj - TargetOne"/>
    </Target>

    <Target Name="TargetTwo">
        <Message Text="From master.proj - TargetTwo"/>
    </Target>

    <Import Project="import-01.proj"/>
    <Import Project="import-02.proj"/>
</Project>

After executing the command msbuild.exe master.proj /pp:out.xml the following was written to the out.xml file.

<?xml version="1.0" encoding="utf-8"?>
<!--
============================================================================================================================================
C:\temp\MSBuild\import\master.proj
============================================================================================================================================
-->
<Project ToolsVersion="4.0" DefaultTargets="TargetOne;TargetTwo" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="TargetOne">
    <Message Text="From master.proj - TargetOne" />
  </Target>
  <Target Name="TargetTwo">
    <Message Text="From master.proj - TargetTwo" />
  </Target>
  <!--
============================================================================================================================================
  <Import Project="import-01.proj">

C:\temp\MSBuild\import\import-01.proj
============================================================================================================================================
-->
  <Target Name="TargetOne">
    <Message Text="From import-02.proj - TargetOne" />
  </Target>
  <!--
============================================================================================================================================
  </Import>

C:\temp\MSBuild\import\master.proj
============================================================================================================================================
-->
  <!--
============================================================================================================================================
  <Import Project="import-02.proj">

C:\temp\MSBuild\import\import-02.proj
============================================================================================================================================
-->
  <Target Name="TargetTwo">
    <Message Text="From import-02.proj - TargetTwo" />
  </Target>
  <!--
============================================================================================================================================
  </Import>

C:\temp\MSBuild\import\master.proj
============================================================================================================================================
-->
</Project>

As you can see, with /pp it is very easy to see exactly what is defined and in which file.

/detailedsummary (/ds)

Another new feature of MSBuild 4.0 is the /detailedsummary (/ds) command line switch. When you use this switch you will be shown a detailed summary (haha) of the build execution. This summary includes the amount of time spent building each project file as well as the node utilization. I just performed a build with the command msbuild RuleStack.Engine.sln /m /ds and the summary is shown below.

============================== Build Hierarchy (IDs represent configurations) =====================================================
 Id                  : Exclusive Time   Total Time   Path (Targets)
 -----------------------------------------------------------------------------------------------------------------------------------
 0                   : 0.020s           1.211s       C:\...\RuleStack.Engine.sln ()
 | 1                 : 0.667s           0.667s       C:\...\RuleStack.Engine.Common\RuleStack.Engine.Common.csproj ()
 | 3                 : 0.255s           0.718s       C:\...\Unittest\RuleStack.Engine.Tests\RuleStack.Engine.Tests.csproj ()
 | | 6               : 0.000s           0.000s       C:\...\ObjectBinder\RuleStack.ObjectBinder\RuleStack.ObjectBinder .csproj ()
 | | 5               : 0.000s           0.000s       C:\...\RuleStack.Data\RuleStack.Data.csproj ()
 | | 1               : 0.000s           0.000s       C:\...\RuleStack.Engine.Common\RuleStack.Engine.Common.csproj ()
 | | 2               : 0.000s           0.000s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( )
 | | 8               : 0.292s           0.460s       C:\...\RuleStack.Engine.Admin.Web\RuleStack.Engine.Admin.Web.csproj ()
 | | | 24            : 0.000s           0.000s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( GetNativeManifest)
 | | | 5             : 0.000s           0.000s       C:\...\RuleStack.Data\RuleStack.Data.csproj ()
 | | . 2             : 0.000s           0.000s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( )
 | . 36              : 0.003s           0.003s       C:\...\RuleStack.Engine.Admin.Web\RuleStack.Engine.Admin.Web.csproj (GetNativeManifest)
 | 2                 : 0.319s           0.390s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( )
 | | 6               : 0.000s           0.000s       C:\...\ObjectBinder\RuleStack.ObjectBinder\RuleStack.ObjectBinder.csproj ()
 | | 5               : 0.000s           0.000s       C:\...\RuleStack.Data\RuleStack.Data.csproj ()
 | | 17              : 0.002s           0.002s       C:\...\RuleStack.Data\RuleStack.Data.csproj (GetNativeManifest)
 | . 21              : 0.001s           0.001s       C:\...\RuleStack.Data\RuleStack.Data.csproj (GetCopyToOutputDirectoryItems)
 | 4                 : 0.382s           0.567s       C:\...\RuleStack.Services\RuleStack.Services.csproj ()
 | | 5               : 0.000s           0.000s       C:\...\RuleStack.Data\RuleStack.Data.csproj ()
 | | 2               : 0.000s           0.000s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( )
 | | 24              : 0.002s           0.002s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( GetNativeManifest)
 | . 29              : 0.001s           0.001s       C:\...\RuleStack.Engine.Backend\RuleStack.Engine.Backend.csproj ( GetCopyToOutputDirectoryItems)
 | 7                 : 0.333s           0.337s       C:\...\ObjectBinder\Test_RuleStack.ObjectBinder\Test_RuleStack.ObjectBinder.csproj ()
 | | 6               : 0.000s           0.000s       C:\...\ObjectBinder\RuleStack.ObjectBinder\RuleStack.ObjectBinder.csproj ()
 | | 13              : 0.001s           0.001s       C:\...\ObjectBinder\RuleStack.ObjectBinder\RuleStack.ObjectBinder.csproj (GetNativeManifest)
 | . 19              : 0.001s           0.001s       C:\...\ObjectBinder\RuleStack.ObjectBinder\RuleStack.ObjectBinder.csproj (GetCopyToOutputDirectoryItems)
 | 6                 : 0.210s           0.210s       C:\...\ObjectBinder\RuleStack.ObjectBinder\RuleStack.ObjectBinder.csproj ()
 | 5                 : 0.277s           0.277s       C:\...\RuleStack.Data\RuleStack.Data.csproj ()
 | . 12              : 0.000s           0.000s       C:\...\RuleStack.Engine.Common\RuleStack.Engine.Common.csproj (GetNativeManifest)
 . 43                : 0.002s           0.002s       C:\...\RuleStack.Engine.Admin.Web\RuleStack.Engine.Admin.Web.csproj.metaproj ()

 ============================== Node Utilization (IDs represent configurations) ====================================================
 Timestamp:            1       2       3       4       5       6       7       8        Duration   Cumulative
 -----------------------------------------------------------------------------------------------------------------------------------
 634081842447519669:   0       x       x       x       x       x       x       x        0.018s     0.018s
 634081842447699679:   1       x       x       x       x       x       x       x        0.461s     0.479s #########
 634081842452309943:   |       6       7       5       3       4       2       x        0.130s     0.609s ##
 634081842453610018:   |       |       |       |       8       |       |       x        0.086s     0.695s #
 634081842454470067:   |       |       |       |       |       |       |       x        0.001s     0.696s
 634081842454480067:   x       |       |       |       |       |       |       x        0.001s     0.697s
 634081842454490068:   x       |       x       |       |       x       |       x        0.001s     0.698s
 634081842454500068:   x       |       x       |       |       x       x       x        0.001s     0.699s
 634081842454510069:   x       x       7       |       |       x       x       x        0.002s     0.701s
 634081842454530070:   12      x       |       |       |       x       x       x        0.002s     0.703s
 634081842454550071:   |       13      x       |       |       x       x       x        0.001s     0.704s
 634081842454560072:   |       x       7       |       |       x       x       x        0.008s     0.712s
 634081842454640076:   |       x       |       |       x       x       x       x        0.054s     0.766s #
 634081842455180107:   |       x       |       x       x       x       2       x        0.003s     0.769s
 634081842455210109:   |       x       |       17      x       x       x       x        0.002s     0.771s
 634081842455230110:   |       x       |       x       x       x       2       x        0.036s     0.807s
 634081842455590131:   |       19      x       x       x       x       |       x        0.001s     0.808s
 634081842455600131:   |       x       7       x       x       x       |       x        0.018s     0.826s
 634081842455780142:   |       x       x       x       x       x       |       x        0.036s     0.862s
 634081842456140162:   |       x       x       21      x       x       x       x        0.001s     0.863s
 634081842456150163:   |       x       x       x       x       x       2       x        0.016s     0.879s
 634081842456310172:   |       x       x       x       8       4       x       x        0.003s     0.882s
 634081842456340174:   |       x       x       x       |       x       24      x        0.001s     0.883s
 634081842456350174:   |       x       x       x       x       x       |       x        0.001s     0.884s
 634081842456360175:   |       x       x       x       8       4       x       x        0.148s     1.032s ##
 634081842457840259:   |       x       x       x       |       x       29      x        0.001s     1.033s
 634081842457850260:   |       x       x       x       |       4       x       x        0.023s     1.056s
 634081842458080273:   |       x       x       x       |       x       x       x        0.013s     1.069s
 634081842458210281:   |       x       x       x       3       x       x       x        0.004s     1.073s
 634081842458250283:   |       x       x       x       36      x       x       x        0.003s     1.076s
 634081842458280285:   |       x       x       x       3       x       x       x        0.131s     1.207s ##
 634081842459590360:   0       x       x       x       x       x       x       x        0.001s     1.208s
 634081842459600360:   43      x       x       x       x       x       x       x        0.002s     1.210s
 634081842459620361:   0       x       x       x       x       x       x       x        0.001s     1.211s
 -----------------------------------------------------------------------------------------------------------------------------------
 Utilization:          57.8    30.3    46.9    39.6    76.5    53.6    45.4    .0       Average Utilization: 43.8

In the snippet above, note that I shortened the paths to the files to reduce the width of the output. Also, the machine that I'm currently using has 8 cores so it shows 8 nodes; on your machine you may have a different number of columns in the node utilization table.

Sayed Ibrahim Hashimi

msbuild | MSBuild 4.0 Friday, April 30, 2010 1:38:03 AM (GMT Daylight Time, UTC+01:00)
Monday, April 26, 2010

Config transformations outside of web app builds

If you are using Visual Studio 2010 then you may already be familiar with the Web.config transformations that are now available. What you might not know is that you can use that same technology to transform config files outside of the build process. You will need Visual Studio 2010 installed on the machine where you perform these transformations. It is very easy to perform these transformations as well. Let's say that we start with the app.config file shown below.

<configuration>
    <connectionStrings>
        <clear/>
        <add name="Default" connectionString="Data Source=localhost;Initial Catalog=Sample01;Integrated Security=True;" />
    </connectionStrings>
    
    <appSettings>
        <add key="contactEmail" value="contact@demo.example.com"/>
        <add key="siteUrl" value="http://demo.example.com"/>
    </appSettings>
    
</configuration>

Then we create another file, transform.xml, which contains our transformations. That file is shown below.

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
    <connectionStrings>
        <clear/>
        <add name="Default" connectionString="Data Source=NOT-localhost;Initial Catalog=Sample01;Integrated Security=True;" 
             xdt:Locator="Match(name)" xdt:Transform="Replace"/>
    </connectionStrings>

    <appSettings>
        <add key="contactEmail" value="contact@example.com" xdt:Locator="Match(key)" xdt:Transform="Replace"/>
        <add key="siteUrl" value="http://example.com" xdt:Locator="Match(key)" xdt:Transform="Replace"/>
    </appSettings>

</configuration>

Then we can easily execute the transformations by using MSBuild. So I created a file named trans.proj and it is shown below.

<Project ToolsVersion="4.0" DefaultTargets="Demo" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <UsingTask TaskName="TransformXml"
             AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll"/>

    <Target Name="Demo">
        <TransformXml Source="app.config"
                      Transform="Transform.xml"
                      Destination="app.prod.config"/>
    </Target>
</Project>

This MSBuild file uses the TransformXml task which ships with Visual Studio 2010. We specify the source file, the transform file and the destination. Pretty straightforward.

In order to execute this I open a Visual Studio 2010 command prompt, browse to the directory containing the files, and enter the following command

msbuild trans.proj /t:Demo

Once you do this then you will find the file app.prod.config with the following contents.

<configuration>
    <connectionStrings>
        <clear/>
        <add name="Default" connectionString="Data Source=NOT-localhost;Initial Catalog=Sample01;Integrated Security=True;"/>
    </connectionStrings>
    
    <appSettings>
        <add key="contactEmail" value="contact@example.com"/>
        <add key="siteUrl" value="http://example.com"/>
    </appSettings>
    
</configuration>

Sayed Ibrahim Hashimi

Config-Transformation | msbuild | MSBuild 4.0 | MSDeploy | Visual Studio | Visual Studio 2010 Monday, April 26, 2010 5:22:06 AM (GMT Daylight Time, UTC+01:00)
Tuesday, April 20, 2010

ASP.NET MVC a better TagBuilder

The other day I was working on a site which required a pager, so I searched around a bit and found a pager by Gunnar Peipman that looked promising. I found a few others but decided against them; some of them loaded all of the data into memory and then paged from there, and others just flat out didn't work! In any case I had a good experience with Gunnar's. I wanted to take what Gunnar had and create a view helper using my custom view helpers. Along the way I found myself writing some code using the TagBuilder class that just didn't sit well with me. Take a look at the snippet below.
TagBuilder startTag = new TagBuilder("a");
startTag.Attributes.Add("href", string.Format("{0}/{1}", this.UrlPrefix, 1));
startTag.SetInnerText("<<");
startTag.Attributes.Add("title", "first page");
httpResponse.Write(startTag.ToString(TagRenderMode.Normal));

TagBuilder previous = new TagBuilder("a");
previous.Attributes.Add("href", string.Format("{0}/{1}", this.UrlPrefix, this.CurrentPage - 1));
previous.SetInnerText("<");
previous.Attributes.Add("title", "previous page");
httpResponse.Write(previous.ToString(TagRenderMode.Normal));
I didn't like the fact that I had to make a bunch of calls to the tag builder to build the HTML for me; it was just uglier than what I wanted. So I decided to create a new tag builder which places a fluent interface on top of it (ok, maybe it's just method chaining). The end result was the FluentTagBuilder class. I couldn't extend TagBuilder because I wanted to change the return types, so instead I created the class to contain one and just pass the calls through to it. What I did was to declare all the same properties and methods that TagBuilder has, but change the ones that returned void to return the object itself. So for example I created methods like these.
public FluentTagBuilder AddCssClass(string value)
{
    this.TagBuilder.AddCssClass(value);
    return this;
}

public FluentTagBuilder SetInnerHtml(string innerHtml)
{
    this.TagBuilder.InnerHtml = innerHtml;
    return this;
}
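
The converted snippet further down also uses a constructor and an AddAttribute method that are not shown above. A minimal sketch of how they might look, assuming the class simply wraps a TagBuilder exposed through a TagBuilder property (the actual code is in the linked FluentTagBuilder.cs, so treat this as an illustration only), is:

public FluentTagBuilder(string tagName)
{
    // wrap a regular TagBuilder and pass every call through to it
    this.TagBuilder = new TagBuilder(tagName);
}

public TagBuilder TagBuilder { get; private set; }

public FluentTagBuilder AddAttribute(string key, string value)
{
    // hypothetical pass-through; the underlying TagBuilder exposes MergeAttribute
    this.TagBuilder.MergeAttribute(key, value);
    return this;
}

public FluentTagBuilder SetInnerText(string innerText)
{
    this.TagBuilder.SetInnerText(innerText);
    return this;
}

public string ToString(TagRenderMode renderMode)
{
    return this.TagBuilder.ToString(renderMode);
}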
With this I can chain different method calls together and create code which looks better. If you've used jQuery then you are used to this. With this in place I was able to convert the snippet above into the following.
FluentTagBuilder startTag =
    new FluentTagBuilder("a")
    .AddAttribute("href", string.Format("{0}/{1}", this.UrlPrefix, 1))
    .SetInnerText("<<")
    .AddAttribute("title", "first page");

httpResponse.Write(startTag.ToString(TagRenderMode.Normal));

FluentTagBuilder previous =
    new FluentTagBuilder("a")
    .AddAttribute("href", string.Format("{0}/{1}", this.UrlPrefix, this.CurrentPage - 1))
    .SetInnerText("<")
    .AddAttribute("title", "previous page");

httpResponse.Write(previous.ToString(TagRenderMode.Normal));
To me this is a lot easier to read and to create. If you agree you can grab the class and include it in your projects. Links to the full source files are below.
  1. FluentTagBuilder.cs
  2. Pager.cs
Sayed Ibrahim Hashimi
ASP.NET MVC | Fluent interface | View helpers Tuesday, April 20, 2010 5:10:14 AM (GMT Daylight Time, UTC+01:00)
Friday, April 09, 2010

Entity Framework: Externalizing filters and dynamic includes

Today a friend of mine asked me to explain (via email) a few things to another developer regarding some things I have done with the Entity Framework (EF). I figured since I already typed it up I might as well share it here too. So here it is.

I have had two issues with using EF4 for which I have found creative and effective workarounds.

Externalizing filters

The first issue, which also exists when using other frameworks, is that you end up with an explosion of methods (and usually a lot of duplicated code). For instance let's say you have a table to store users. This table has an Id field, an Email field and, let's say, an OpenIdUrl field. So you end up with the following methods.

User GetUserById(long id)

User GetUserByEmail(string email)

User GetUserByOpenIdUrl(string openIdUrl)

Usually you'll find that each of these methods has the exact same implementation except for a different where clause. There is a bigger problem in the fact that you cannot anticipate everything that callers may want to filter on. Because of this, and the dynamic nature of IQueryable, I have employed a technique using expression trees to allow the caller to declare the exact filter to be used via a lambda expression. For callers it's very slick. For example I could create the following method

public User GetUser(Expression<Func<User, bool>> filter)
{
    if (filter == null)
    {
        filter = Const<User>.LinqExpression.LinqExpressionFuncAlwaysTrue;
    }

    User foundUser = null;
    using (InlineTasksContext dbContext = new InlineTasksContext())
    {
        dbContext.Connection.Open();
        foundUser = dbContext.Users.Where(filter).SingleIfExists();
    }

    return foundUser;
}

In this case I'm allowing the caller to pass in a filter that is applied to the query. Here is how I could implement those three methods if I chose to.

public User GetUserById(long id)
{
    return this.GetUser(user => user.Id == id);
}
public User GetUserByEmail(string email)
{
    if (email == null) { throw new System.ArgumentNullException("email"); }

    return GetUser(user => user.Email == email);
}
public User GetUserByOpenIdUrl(string openIdUrl)
{
    if (string.IsNullOrEmpty(openIdUrl)) { throw new ArgumentNullException("openIdUrl"); }

    return GetUser(user => user.OpenIdUrl == openIdUrl);
}
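
Because the filter is just an expression, callers are not limited to those three wrappers; any ad-hoc condition can be passed straight in. A hypothetical call site (the repository variable name is only for illustration) might be:

// any condition expressible as an expression tree works without adding a new method
User user = repository.GetUser(u => u.Email == "someone@example.com" && u.OpenIdUrl != null);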

Note that in the first method I am using some other helpers I created, specifically the Const<User>.LinqExpression.LinqExpressionFuncAlwaysTrue expression and the SingleIfExists extension method. I use the AlwaysTrue expression so that I don't have to write an if and maintain two select statements. The other (SingleIfExists) is used to just return null instead of blowing up if no elements exist in the sequence.

Since they are not the purpose of this email I've just put them at the bottom so that you can read them if you are interested.

As Keith mentioned to me a while back, expression trees are not serializable, so you still end up with this problem if you are using a service based solution. But you could try the Expression Tree Serialization sample to take care of this for you; I've never tried it, but it's on my TODO list.

.Include

My next issue is that you are forced to declare which navigation properties (i.e. table links) should be included in the query. This is done using the .Include method. So back to our GetUser example: how can a framework know which tables' values should be retrieved? The end result is that the framework creator just includes a bunch of tables, and 90% of the time the result is just wasted resources because the other data is never touched. It is better to provide a default which only includes the most commonly used tables and another method that allows callers to specify what they want. Here is a sample GetTasksByUserId method

public IList<Task> GetTasksByUserId(long userId, IEnumerable<string> includeList)
{
    using (InlineTasksContext dbContext = new InlineTasksContext())
    {
        var result = (from t in dbContext.Tasks
                        .Include(includeList)
                        .Where(t => t.User.Id == userId)
                        select t)
                        .ToList();
        return result;
    }
}
In this example I'm using an extension method for .Include which is defined as follows.
public static ObjectQuery<T> Include<T>(this ObjectQuery<T> query, IEnumerable<string> includes)
{
    if (query == null) { throw new System.ArgumentNullException("query"); }

    if (includes != null)
    {
        foreach (string include in includes)
        {
            // Include returns a new ObjectQuery, so capture the result or the call has no effect
            query = query.Include(include);
        }
    }
    return query;
}
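
Callers can then decide exactly which navigation properties get loaded. A hypothetical call (the navigation property names and the repository variable are only examples) might look like:

// eagerly load only the navigation properties this caller actually needs
IList<Task> tasks = repository.GetTasksByUserId(1, new[] { "User", "Tags" });

// or pass null to skip eager loading entirely
IList<Task> tasksWithoutIncludes = repository.GetTasksByUserId(1, null);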

So the idea is to create a framework which is both robust and easy to use, but at the same time not overly presumptuous about what its users want. I think a lot more can be done with EF and its dynamic nature.

Does that cover what you were intending Keith? Any questions/comments please send them my way.

Here are those methods I mentioned above.

public static class LinqExtensions
{
    public static T SingleIfExists<T>(this IQueryable<T> query)
    {
        if (query == null) { throw new System.ArgumentNullException("query"); }

        T result = default(T);
        if (query.Count() > 0)
        {
            result = query.Single();
        }

        return result;
    }
}
//----------------------------------------------------------------------------------------------------
public class Const<T>
{
    public static class Predicate
    {
        public static Predicate<T> AlwaysTrue
        {
            get
            {
                return x => true;
            }
        }

        public static Predicate<T> AlwaysFalse
        {
            get
            {
                return x => false;
            }
        }
    }

    public class LinqExpression
    {
        public static Expression<Func<T, bool>> LinqExpressionFuncAlwaysTrue
        {
            get
            {
                return x => true;
            }
        }
    }
}
Sayed Ibrahim Hashimi
Entity | Entity Framework | LINQ Friday, April 09, 2010 2:49:32 AM (GMT Daylight Time, UTC+01:00)
Friday, April 02, 2010

ASP.NET MVC Route + Ajax + jQuery

The story around routing in ASP.NET MVC is pretty good. When a resource is requested, the action that it routes to is determined and the parameters to the action are initialized. Conversely, when you build a link you specify the argument values and it creates the url for you. So if your routes happen to change your application shouldn't be bothered by that. One area which can be problematic is making ajax requests. What I've done to minimize the impact of route changes on Ajax requests is to have the script make a call to determine the correct route. In the most trivial case this is extremely easy: you call an action which has a known route, passing into it the name of the action and controller, and it returns the url for the route. This works really well, but the problem is when your routes have parameters; how do you handle those?

First let’s take a look at the Javascript and then the definition of the action method which it calls.

// wire up the event for the enter button
$("#searchText").keypress(function (event) {
    if (event.keyCode == 13) {
        // grab the text that is inside the box
        var text = $("#searchText").val();

        var routeData = {
            controllerName: 'Search',
            actionName: 'Search',
            paramValues: "{ criteria: '" + text + "' }"
        };

        $.ajax({
            url: "/Home/MapRoute",
            data: routeData,
            success: function (data) {
                window.location.href = data;
            },
            error: function (data) {
                alert('there was a failure on the internets');
            }
        });

    }
});

Here I am building a Javascript object named routeData. This is declared in JSON. One thing to pay attention to is the fact that the value for paramValues is a string containing JSON. This is passed to the controller action.

I’m using jQuery to make an Ajax request to /Home/MapRoute. This maps to the method shown below.

public JsonResult MapRoute(string actionName, string controllerName, string paramValues)
{
    JavaScriptSerializer jss = new JavaScriptSerializer();

    Dictionary<string, string> parameters = jss.Deserialize<Dictionary<string, string>>(paramValues);

    RouteValueDictionary rd = new RouteValueDictionary();
    foreach (string key in parameters.Keys)
    {
        rd.Add(key, parameters[key]);
    }

    UrlHelper urlHelper = new UrlHelper(this.Request.RequestContext);
    string url = urlHelper.Action(actionName, controllerName, rd);

    return Json(url, JsonRequestBehavior.AllowGet);
}

Here I'm using the JavaScriptSerializer to convert the JSON string into a dictionary with string keys and values. I use that dictionary to create a RouteValueDictionary which is passed, along with the other parameters, into the UrlHelper to generate the url. When you return the Json result you must specify JsonRequestBehavior.AllowGet, otherwise a 500 internal server error will be returned. I think this is new with ASP.NET MVC 2.

When the action method returns, you can use that url. The drawback to this approach is that you make an extra request to determine the url, but you can be sure those urls are correct. Also, you could cache the results with Output Caching since the routes won't change.
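
One way to do that caching, shown here only as a fragment to illustrate attribute placement (the duration value is arbitrary), is to decorate the action with the OutputCache attribute:

// VaryByParam = "*" keeps a separate cached url per actionName/controllerName/paramValues combination
[OutputCache(Duration = 3600, VaryByParam = "*")]
public JsonResult MapRoute(string actionName, string controllerName, string paramValues)
{
    // body identical to the MapRoute implementation shown above
    return null; // placeholder only; use the real body
}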

Sayed Ibrahim Hashimi

ASP.NET MVC | Javascript | jQuery | routing Friday, April 02, 2010 6:24:30 AM (GMT Daylight Time, UTC+01:00)
Wednesday, March 31, 2010

jQuery: Creating a plugin to place hints in text boxes

Have you seen those text boxes on the web that have a hint contained inside them and wondered how you could implement that? It's pretty easy with jQuery, and there already exist some plugins that you can use, for example here is one. When I set out to do this I didn't even look to see what was out there because I wanted to write a jQuery plugin, but the solution that I came up with is not that different from that one.

Here is how I wanted the plugin to behave

  1. Add a specified hint to the text box, if the input element was not focused and empty
  2. When the text box was focused, the hint should disappear
  3. When a form is submitted, all hints should be removed beforehand to ensure that they are not incorrectly submitted

First what I did was to create a file named jquery.sedotech.inputWithHint.js. You should name your plugins using this naming convention

jquery.customString.pluginName.js

Here is the source for the plugin

(function ($) {
    var hintClassName = 'inputWithTextHint';

    $.fn.addHint = function (hint) {
        var filteredSet = this.filter('input:text, textarea');
        filteredSet.each(function () {
            // In here 'this' refers to an individual element
            doAddHint($(this), hint);
        });

        // Find all forms and update the pre post to remove the hint
        $('form input:submit').click(function () {
            $('input:text, textarea').each(function () {
                if ($(this).hasClass(hintClassName) && $(this).attr('hint')
                && $(this).val() == $(this).attr('hint')) {
                    $(this).val('');
                }
            });
        });
    }

    function doAddHint(target, hint) {
        // Only add hint if the target is empty
        if (target.val() == '') {
            addHintToInput(target, hint);

            target.focus(function () {
                // If the target has the hint class on it then a hint must be showing
                //  when the hint is showing, put the cursor at the beginning
                if ($(this).hasClass(hintClassName)) {
                    // remove the hint
                    $(this).val('');
                    // remove class
                    $(this).removeClass(hintClassName);
                }
            });

            target.blur(function () {
                // If no text then add hint class back
                if ($(this).val() == '') {
                    addHintToInput(target, hint);
                }
            });
        }
    }

    function addHintToInput(target, hint) {
        target.val(hint);
        target.addClass(hintClassName);
        // add attribute to the target to store hint
        target.attr('hint', hint);
    }
})(jQuery);
Some things to take note of: when you are creating a plugin you should use the pattern
(function ($) {
    // plugin code here
})(jQuery);

Take note of the ($) as the parameter and (jQuery) at the end. What is happening here is that you are defining an anonymous function that declares a parameter named $, and then invoking that function, passing in the jQuery object. You do this because when you are authoring plugins the $ alias is not guaranteed to be available; you would have to use the full jQuery alias instead, but that's just way too cumbersome. If you use this pattern you are ensured that the $ alias is available inside your plugin and won't conflict with other Javascript libraries.

Beyond this, all I'm doing is filtering the list of items which are passed in, with the expression var filteredSet = this.filter('input:text, textarea'), to make sure that the plugin doesn't touch elements it is not meant to modify. After I add the hint, by calling doAddHint on each element in the filtered set, I make sure that the forms on the page are not submitted with those hints.


Sayed Ibrahim Hashimi

Javascript | jQuery | jQuery-plugin Wednesday, March 31, 2010 5:24:07 AM (GMT Daylight Time, UTC+01:00)
Friday, March 26, 2010

MOQ: Comparing hard coded mocks to moq mocks

I'm working on an application right now which uses some mock classes that were "hand written" instead of created with a mock framework. I haven't had much experience with mock frameworks, but I've wanted to learn more about them. There are many mock frameworks out there; I chose moq because the test cases for ASP.NET MVC use moq. What I wanted to do was replace the mock class that I had created with a moq mock. If you have ever created mock classes before then you probably know that they are pretty annoying to maintain. What I mean is that if you add a method/property to the interface which the mock class implements then you have to update the mock just to get it to build. This is annoying and tends to make people stop using the mock, which then ends up in ignored or commented-out test cases. To avoid all of this, most mock frameworks, including moq, just require you to specify behavior for the methods/properties which you are ready to exercise in your test cases. Now I'd like to show you the mock class that I had. The interface that it mocks is used to abstract the data store.

internal class MockModelContext :IModelContext
{
    public User CreateUser(User user)
    {
        // if id is not set, set it and just return it
        if (user.Id <= 0)
        {
            user.Id = 1;
        }

        return user;
    }

    public User GetUserByEmail(string email)
    {
        if (string.IsNullOrEmpty(email)) { throw new ArgumentNullException("email"); }

        User user = CreateDummyUser();
        user.Email = email;

        return user;
    }

    public User GetUser(Expression<Func<User, bool>> filter)
    {
        return CreateDummyUser();
    }

    public IList<Task> GetRecentTasks()
    {
        IList<Task> tasks = new List<Task>()
        {
            new Task()
            {
                CreatedDate = new DateTime(2010,1,1,1,1,1),
                CreatorId = 1,
                //Description ="Description 01 here",
                Headline="Headline 01 here",
                Id = 1,
                LastEditedDate= new DateTime(2010,1,1,1,1,1),
                LastEditOwnerId=1,
                Name = "Name here",
                NumViews = 3,
                Script = @"<ScriptHere>script</ScriptHere>",
                // TODO: Tags
                // TODO: TaskComments
                User = new User()
                {
                    Email ="one@hotmail.com",
                    FirstName="First",
                    Id=2,
                    LastName="Last",
                    MiddleName="Middle",
                }
            },
            new Task()
            {
                CreatedDate = new DateTime(2010,1,1,1,1,1),
                CreatorId = 1,
                //Description ="Description 02 here",
                Headline="Headline 02 here",
                Id = 1,
                LastEditedDate= new DateTime(2010,1,1,1,1,1),
                LastEditOwnerId=1,
                Name = "Name here",
                NumViews = 3,
                Script = @"<ScriptHere>script2</ScriptHere>",
                // TODO: Tags
                // TODO: TaskComments
                User = new User()
                {
                    Email ="one@hotmail.com",
                    FirstName="First",
                    Id=2,
                    LastName="Last",
                    MiddleName="Middle",
                }
            }
        };

        return tasks;
    }
    protected internal User CreateDummyUser()
    {
        User user = new User()
        {
            Email = "email",
            FirstName = "First",
            LastName = "Last",
            MiddleName = "Middle"
        };

        return user;
    }
    public User GetUserByOpenIdUrl(string openIdUrl)
    {
        throw new NotImplementedException();
    }
    public IList<Task> GetRecentTasks(IEnumerable<string> includeList)
    {
        throw new NotImplementedException();
    }
    public Task AddTask(Task task)
    {
        throw new NotImplementedException();
    }
    public User GetUserById(long id)
    {
        User user = this.CreateDummyUser();
        user.Id = id;

        return user;
    }
    public User SaveUser(User user)
    {
        throw new NotImplementedException();
    }
    public Task GetTaskById(long id)
    {
        throw new NotImplementedException();
    }
}

Take notice of the methods which just throw a NotImplementedException; these are methods that I just added to the interface and haven't yet written test cases for. (Yeah, I know I'm not following true TDD, but I never claimed to be either.) Now you can compare that to the methods below, which use moq to create the mock.

private Mock<IModelContext> CreateMockModelContext()
{
    var context = new Mock<IModelContext>();

    context.Setup(c => c.CreateUser(It.IsAny<User>()))
        .Returns<User>(user =>
        {
            if (user.Id <= 0)
            {
                user.Id = 1;
            }
            return user;
        });

    context.Setup(c => c.GetUserByEmail(It.IsAny<string>()))
        .Returns<string>(email =>
            {
                if (email == null) { throw new ArgumentNullException("email"); }

                User user = this.CreateDummyUser();
                user.Email = email;

                return user;
            }); ;

    context.Setup(c => c.GetUserById(It.IsAny<long>()))
        .Returns<long>(id => 
        {
            User user = new User();
            user.Id = id;
            return user;
        });

    context.Setup(c => c.GetRecentTasks())
        .Returns(() =>
        {
            IList<Task> tasks = new List<Task>()
            {
                new Task()
                {
                    CreatedDate = new DateTime(2010,1,1,1,1,1),
                    CreatorId = 1,
                    //Description ="Description 01 here",
                    Headline="Headline 01 here",
                    Id = 1,
                    LastEditedDate= new DateTime(2010,1,1,1,1,1),
                    LastEditOwnerId=1,
                    Name = "Name here",
                    NumViews = 3,
                    Script = @"<ScriptHere>script</ScriptHere>",
                    // TODO: Tags
                    // TODO: TaskComments
                    User = new User()
                    {
                        Email ="one@hotmail.com",
                        FirstName="First",
                        Id=2,
                        LastName="Last",
                        MiddleName="Middle",
                    }
                },
                new Task()
                {
                    CreatedDate = new DateTime(2010,1,1,1,1,1),
                    CreatorId = 1,
                    //Description ="Description 02 here",
                    Headline="Headline 02 here",
                    Id = 1,
                    LastEditedDate= new DateTime(2010,1,1,1,1,1),
                    LastEditOwnerId=1,
                    Name = "Name here",
                    NumViews = 3,
                    Script = @"<ScriptHere>script2</ScriptHere>",
                    // TODO: Tags
                    // TODO: TaskComments
                    User = new User()
                    {
                        Email ="one@hotmail.com",
                        FirstName="First",
                        Id=2,
                        LastName="Last",
                        MiddleName="Middle",
                    }
                }

            };
            return tasks;
        });

    return context;
}
protected internal User CreateDummyUser()
{
    User user = new User()
    {
        Email = "email",
        FirstName = "First",
        LastName = "Last",
        MiddleName = "Middle"
    };

    return user;
}

Since I'm just mocking methods, all I'm really doing here is using the moq Setup method (formerly known as Expect) and the Returns method to implement the behavior that I needed. The key thing to note is that if you need to access the parameter(s) passed into the method, then you have to use Returns and pass in a lambda expression that contains the behavior. In that lambda you can name the parameter anything you want, but I would suggest you give it the same name that the method you are mocking uses. This makes it much easier to understand what you are actually doing.
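
As a quick usage sketch (hypothetical test code, assuming MSTest-style asserts): once the setups are in place you hand mock.Object to the code under test, and you can optionally verify the interaction afterwards.

// use the configured mock; .Object exposes the mocked IModelContext
Mock<IModelContext> mockContext = CreateMockModelContext();
User user = mockContext.Object.GetUserByEmail("one@hotmail.com");

// the setup above copies the email onto the dummy user
Assert.AreEqual("one@hotmail.com", user.Email);

// optionally verify the method was called exactly once
mockContext.Verify(c => c.GetUserByEmail("one@hotmail.com"), Times.Once());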

Sayed Ibrahim Hashimi

mocks | moq | unit testing Friday, March 26, 2010 5:10:46 AM (GMT Standard Time, UTC+00:00)
Friday, March 19, 2010

Replacing solution files with MSBuild Files

I recently answered a question on stackoverflow.com, Replace .sln with MSBuild and wrap contained projects into targets, and wanted to share that content here as well. I'll expand on it a bit more here.

A couple of common questions that I'm frequently asked include

  1. Should I use solution files for public builds?
  2. How can I replace a solution file with an MSBuild file?

My answer to 1 is that I never use .sln files for any public build. I always create my own build file and use that. My take on solution files is that they are a developer convenience for use inside Visual Studio only. If you disagree, you are not alone.

Replacing a solution file with an MSBuild file is pretty easy: you just create an MSBuild file to build your projects, which you do using the MSBuild task. Now if you want to create a reusable .targets file (or set of .targets files) then this is a bit more involved, but it makes things easier on you in the long haul.

I wrote about 20 pages on creating reusable .targets files in my book, but I'll get you started with the basics here. I believe that the key to creating reusable build scripts (i.e. .targets files) comes down to three elements:

The idea is that you want to place all of your targets into separate files, and these files will then be imported by the files which drive the build process. The driver files are the ones which contain the data. Since you import the .targets files you get all the targets as if they had been defined inline. There is a silent contract between the .proj and .targets files; this contract is defined by the properties and items which both use, and it is what needs to be validated.

The idea here is not new. This pattern is followed by .csproj files (and other projects generated by Visual Studio). If you take a look at your .csproj file you will not find a single target, just properties and items. Then towards the bottom of the file it imports Microsoft.CSharp.targets (this may differ depending on project type). That project file (along with others that it imports) contains all the targets which actually perform the build.

So it's laid out with MyProduct.proj importing SharedBuild.targets, where MyProduct.proj might look like:

<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- This uses a .targets file to off load performing the build -->
  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)'=='' ">Release</Configuration>
    <OutputPath Condition=" '$(OutputPath)'=='' ">$(MSBuildProjectDirectory)\BuildArtifacts\bin\</OutputPath>
  </PropertyGroup>

  <ItemGroup>
    <Projects Include="$(MSBuildProjectDirectory)\..\ClassLibrary1\ClassLibrary1.csproj"/>
    <Projects Include="$(MSBuildProjectDirectory)\..\ClassLibrary2\ClassLibrary2.csproj"/>
    <Projects Include="$(MSBuildProjectDirectory)\..\ClassLibrary3\ClassLibrary3.csproj"/>
    <Projects Include="$(MSBuildProjectDirectory)\..\WindowsFormsApplication1\WindowsFormsApplication1.csproj"/>
  </ItemGroup>

  <Import Project="SharedBuild.targets"/>
</Project>

And SharedBuild.targets might look like:

<Project  DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- This represents a re-usable build file -->
  <Target Name="SharedBuild_Validate">
    <!-- See http://sedodream.com/2009/06/30/ElementsOfReusableMSBuildScriptsValidation.aspx for more info
         about this validation pattern
    -->
    <ItemGroup>
      <_RequiredProperties Include ="Configuration">
          <Value>$(Configuration)</Value>
      </_RequiredProperties>    
      <_RequiredProperties Include ="OutputPath">
          <Value>$(OutputPath)</Value>
      </_RequiredProperties>
      
      <_RequiredItems Include="Projects">
        <RequiredValue>%(Projects.Identity)</RequiredValue>
        <RequiredFilePath>%(Projects.Identity)</RequiredFilePath>
      </_RequiredItems>
    </ItemGroup>

    <!-- Raise an error if any value in _RequiredProperties is missing -->
    <Error Condition="'%(_RequiredProperties.Value)'==''"
           Text="Missing required property [%(_RequiredProperties.Identity)]"/>

    <!-- Raise an error if any value in _RequiredItems is empty -->
    <Error Condition="'%(_RequiredItems.RequiredValue)'==''"
           Text="Missing required item value [%(_RequiredItems.Identity)]" />

    <!-- Validate any file/directory that should exist -->
    <Error Condition="'%(_RequiredItems.RequiredFilePath)' != '' and !Exists('%(_RequiredItems.RequiredFilePath)')"
           Text="Unable to find expeceted path [%(_RequiredItems.RequiredFilePath)] on item [%(_RequiredItems.Identity)]" />
  </Target>

  <PropertyGroup>
    <BuildDependsOn>
      SharedBuild_Validate;
      BeforeBuild;
      CoreBuild;
      AfterBuild;
    </BuildDependsOn>
  </PropertyGroup>
  <Target Name="Build" DependsOnTargets="$(BuildDependsOn)"/>
  <Target Name="BeforeBuild"/>
  <Target Name="AfterBuild"/>
  <Target Name="CoreBuild">
    <!-- Make sure output folder exists -->
    <PropertyGroup>
      <_FullOutputPath>$(OutputPath)$(Configuration)\</_FullOutputPath>
    </PropertyGroup>
    <MakeDir Directories="$(_FullOutputPath)"/>
    <MSBuild Projects="@(Projects)"
             BuildInParallel="true"
             Properties="OutputPath=$(_FullOutputPath)"/>
  </Target>
</Project>

Don't look too much at the SharedBuild_Validate target yet. I put that there for completeness but don't focus on it. You can find more info on that at my blog at http://sedodream.com/2009/06/30/ElementsOfReusableMSBuildScriptsValidation.aspx.

The important parts to notice are the extensibility points. Even though this is a very basic file, it has all the components of a reusable .targets file. You can customize its behavior by passing in different properties and items to build. You can extend its behavior by overriding a target (BeforeBuild, AfterBuild or even CoreBuild), and you can inject your own targets into the build with:

<Project ...>
   ...
  <Import Project="SharedBuild.targets"/>
  <PropertyGroup>
    <BuildDependsOn>
      $(BuildDependsOn);
      CustomAfterBuild
    </BuildDependsOn>
  </PropertyGroup>
  <Target Name="CustomAfterBuild">
    <!-- Insert stuff here -->
  </Target>
</Project>

So those are the basics of how to create reusable build scripts, which can help you easily create .proj files to replace your .sln files. I also recently answered a related question, Make a target run once at the Solution level in MSBuild.

The samples for this post were created with Visual Studio 2010 RC. You can download them at http://sedotech.com/Content/files/ReplaceSlnFile.zip

Sayed Ibrahim Hashimi

msbuild | Visual | Visual Studio | Visual Studio 2010 Friday, March 19, 2010 4:16:06 AM (GMT Standard Time, UTC+00:00)
Thursday, March 11, 2010

Web Deployment Tool (MSDeploy) Custom Provider Take 1

Disclaimer: Take what you read here with a grain of salt, I’m not an expert at providers … yet :)

I've known for quite a while that the Web Deployment Tool supports custom providers, but I've never really looked at what it takes to actually write one. Tonight I wanted to write a simple provider to just sync a file from one place to another, to see what is involved in creating a provider. In this post I describe how I created it. First you have to have the Web Deployment Tool installed; I've got the RTM version installed, but they recently delivered version 1.1 and either should work. First things first, you need to create a class library project in Visual Studio. For this example I used Visual Studio 2010 RC, because it's the only version of Visual Studio I have installed on this machine. If you are using Visual Studio 2010, make sure that you specify to build for .NET 3.5 because MSDeploy won't pick up any providers written in .NET 4.0. To specify that your project should build for .NET 3.5, go to Project->Properties and on the Application tab pick the Target Framework to be .NET Framework 3.5. See the image below for clarification.

(Image: project properties showing the Target Framework set to .NET Framework 3.5)

You will need to reference the two assemblies Microsoft.Web.Deployment.dll and Microsoft.Web.Delegation.dll. You can find both in the %Program Files%\IIS\Microsoft Web Deploy folder.

After this you need to create the class which is the provider. I called mine CustomFileProvider because it will only sync a single file. The class should extend the DeploymentObjectProvider class. There are a couple of abstract members that you must implement; those are described below.

CreateKeyAttributeData

From what I can see this method is used to indicate how the “key attribute” is used. For instance, when you use the contentPath provider you would use a statement like msdeploy -verb:sync -source:contentPath=C:\one\pathToSync -dest:… So we can see that the value C:\one\pathToSync is passed to the provider without a name; this is the key attribute value. This method for my provider looks like the following.

public override DeploymentObjectAttributeData CreateKeyAttributeData()
{
    DeploymentObjectAttributeData attributeData = new DeploymentObjectAttributeData(
        CustomFileProvider.KeyAttributeName,
        this.FilePath,
        DeploymentObjectAttributeKind.CaseInsensitiveCompare);

    return attributeData;
}

In this case CustomFileProvider.KeyAttributeName is a const whose value is "path", and the attribute's value is provided by the FilePath property. The other item that you have to override is the Name property.

Name

This property returns the name of the provider. In all the samples that I have seen (which is not very many) this name always agrees with the name returned by the custom provider factory; more on that in a bit. Following their example, I had mine return the value customFile, which my factory also returns.
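
The override itself is a one-liner; mine looks roughly like this:

public override string Name
{
    // Must match the name the provider factory reports so MSDeploy can tie the two together.
    get { return "customFile"; }
}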

Outside of these two items there are some other methods that you need to know about; those are covered below.

GetAttributes

The GetAttributes method is kinda interesting. This method will be called on both the source and the destination, and you need to understand which context it's being called in and act accordingly. You can determine whether you are executing on the source or the dest by using the BaseContext.IsDestinationObject property. For this provider, if you are on the source you want to ensure that the specified file exists; if it doesn't, raise a DeploymentFatalException, which will stop the sync. If you are on the destination you could perform some checks to see whether the file is up to date or not. For a simple provider you can force a sync to occur by raising a DeploymentException; raising this exception causes the Add method to be called, which is exactly what we want. Here is my version of the GetAttributes method.

public override void GetAttributes(DeploymentAddAttributeContext addContext)
{
    if (this.BaseContext.IsDestinationObject)
    {
        // We are on the destination. Throwing here ensures that the file gets synced,
        // because the Add method will be called for us.

        // Since I'm always throwing, Add will always be called; we could instead check whether
        // the file is up to date and, if so, skip this exception.
        throw new DeploymentException();
    }
    else
    {
        // We are acting on the source object here, make sure that the file exists on disk
        if (!File.Exists(this.FilePath))
        {
            string message = string.Format("File <{0}> does not exist", this.FilePath);
            throw new DeploymentFatalException(message);
        }
    }

    base.GetAttributes(addContext);
}

For the most part, the only thing left for this simple provider is to override the Add method. I will show the method first and then discuss its contents. Here it is.

public override void Add(DeploymentObject source, bool whatIf)
{
    // This is called on the Destination so this.FilePath is the dest path not source path
    if (!whatIf && File.Exists(source.ProviderContext.Path))
    {
        // We can let MSDeploy do the actual sync for us using an existing provider
        DeploymentProviderOptions sourceProviderOptions = new DeploymentProviderOptions(DeploymentWellKnownProvider.FilePath);
        sourceProviderOptions.Path = source.ProviderContext.Path;

        using (DeploymentObject sourceObject = DeploymentManager.CreateObject(sourceProviderOptions, new DeploymentBaseOptions()))
        {
            DeploymentProviderOptions destProviderOptions = new DeploymentProviderOptions(DeploymentWellKnownProvider.FilePath);
            destProviderOptions.Path = this.FilePath;

            // Make the call to perform an actual sync
            sourceObject.SyncTo(destProviderOptions, new DeploymentBaseOptions(), new DeploymentSyncOptions());
        }
    }
}

First I check to make sure that we are not doing a whatif run (i.e. a run where we don’t want to physically perform the action) and that the source file exists. Take note of the fact that I’m explicitly using source.ProviderContext.Path to get the source path. This provider has a property, FilePath, which contains the path, but it could be either the source path or the dest path depending on which end you are executing on; source.ProviderContext.Path will always point to the source value. After that you can see that I’m leveraging an existing provider, the FilePath provider, to do the actual sync for me. So all the dirty work is its job! If you are writing a provider, make sure to re-use any existing providers that you can, because the code for this part looks like it can get nasty. I’ll leave that for another post.

After I prepare the source options I create an instance of the DeploymentObject class, prepare the FilePath provider options for the destination, and call SyncTo on the object; this is where the physical sync occurs. That is basically it for the provider itself. Now we need to create a provider factory class, which is the guy who knows how to create our provider for us.

Fortunately, creating a custom provider factory is even easier than creating the custom provider itself. I called mine CustomFileProviderFactory and the entire class is shown below.

[DeploymentProviderFactory]
public class CustomFileProviderFactory : DeploymentProviderFactory
{
    protected override DeploymentObjectProvider Create(DeploymentProviderContext providerContext, DeploymentBaseContext baseContext)
    {
        return new CustomFileProvider(providerContext, baseContext);
    }

    public override string Description
    {
        get { return @"Custom provider to copy a file"; }
    }

    public override string ExamplePath
    {
        get { return @"c:\somefile.txt"; }
    }

    public override string FriendlyName
    {
        get { return "customFile"; }
    }
    public override string Name
    {
        get { return "customFile"; }
    }
}

A few things to make note of: your class should extend the DeploymentProviderFactory class, and it should have the DeploymentProviderFactory attribute applied to it. Besides that there are two properties, FriendlyName and Name; once again, in all the samples I have seen they are always the same and always equal to the Name property on the provider itself. I followed suit and copied them. I’m still trying to figure out what each of these actually does, but for now I’m OK with leaving them the same. So that is basically it.
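
One possible tweak (my own suggestion, not something the sample does): since the provider's Name and the factory's Name and FriendlyName all have to agree, you could define the string once so the three can't drift apart. A sketch:

// Hypothetical helper, not part of the original sample.
internal static class ProviderNames
{
    public const string CustomFile = "customFile";
}

The provider's Name property and the factory's Name and FriendlyName properties would then all return ProviderNames.CustomFile.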

In order to have MSDeploy use the provider you have to create a folder named Extensibility under the %Program Files%\IIS\Microsoft Web Deploy folder (if it doesn’t already exist) and copy the assembly into that folder. Then you are good to go. Here is a snippet showing my custom provider in action!

C:\temp\MSDeploy>msdeploy -verb:sync -source:customFile=C:\temp\MSDeploy\Source\source.txt -dest:customFile=C:\temp
\MSDeploy\Dest\one.txt -verbose
Verbose: Performing synchronization pass #1.
Info: Adding MSDeploy.customFile (MSDeploy.customFile).
Info: Adding customFile (C:\temp\MSDeploy\Dest\one.txt).
Verbose: The dependency check 'DependencyCheckInUse' found no issues.
Verbose: The synchronization completed in 1 pass(es).
Total changes: 2 (2 added, 0 deleted, 0 updated, 0 parameters changed, 0 bytes copied)

This was a pretty basic provider, but you have to start somewhere. I will post more about custom providers as I find out more.

You can download the entire source at http://sedotech.com/Resources#CustomProviders under the Custom Providers heading of the MSDeploy section.

Sayed Ibrahim Hashimi

Custom Provider | MSDeploy | Web Deployment Tool Thursday, March 11, 2010 6:04:47 AM (GMT Standard Time, UTC+00:00)