Friday, February 20, 2015

Visual Studio Web Packages: fixing the path in the generated web package

If you’ve ever created an MSDeploy web package using Visual Studio you may have noticed that the generated package contains the folder structure of the location where the application was packaged. Since MSDeploy parameters are used when installing the package, in most cases the structure of the web package doesn’t matter. In some cases, though, this causes problems and a flatter package structure would be desirable.

Today on Twitter @ashic contacted me asking, essentially, “How can I update the package folder structure without modifying the project?” It’s possible, but I’ll first explain how you can easily solve this problem in a couple of other ways and then move on to his actual question.

Option 1: Add the PackageWeb NuGet package

The easiest way to fix this problem is to add the PackageWeb NuGet package to your project (that’s a package that I’ve authored; you can see the sources here). This will update the path to a flat structure and add a .ps1 file when creating the package. You can ignore the .ps1 file if you like.
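If you want to go this route, installing it from the Package Manager Console is one line (assuming the package id on nuget.org is PackageWeb):

Install-Package PackageWeb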

Option 2: Add package.wpp.targets to your project

When a web project is built, any file in the same folder as the .csproj/.vbproj file that matches the pattern *.wpp.targets is imported into the build. To fix the issue, create a file named package.wpp.targets next to the project file and drop the following contents into it. Note: you can find the latest version of this in this gist.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <PackagePath Condition=" '$(PackagePath)'=='' ">website</PackagePath>
    <EnableAddReplaceToUpdatePacakgePath Condition=" '$(EnableAddReplaceToUpdatePacakgePath)'=='' ">true</EnableAddReplaceToUpdatePacakgePath>
    <PackageDependsOn>$(PackageDependsOn);AddReplaceRuleForAppPath;</PackageDependsOn>
  </PropertyGroup>

  <Target Name="AddReplaceRuleForAppPath" Condition=" '$(EnableAddReplaceToUpdatePacakgePath)'=='true' ">
    <PropertyGroup>
      <_PkgPathFull Condition=" '$(WPPAllFilesInSingleFolder)'!='' ">$([System.IO.Path]::GetFullPath($(WPPAllFilesInSingleFolder)))</_PkgPathFull>
      <_PkgPathFull Condition=" '$(_PkgPathFull)' == '' ">$([System.IO.Path]::GetFullPath($(_PackageTempDir)))</_PkgPathFull>
    </PropertyGroup>

    <!-- escape the text into a regex -->
    <EscapeTextForRegularExpressions Text="$(_PkgPathFull)">
      <Output TaskParameter="Result" PropertyName="_PkgPathRegex" />
    </EscapeTextForRegularExpressions>

    <!-- add a replace rule so the temp packaging path is replaced with $(PackagePath) -->
    <ItemGroup>
      <MsDeployReplaceRules Include="replaceFullPath">
        <Match>$(_PkgPathRegex)</Match>
        <Replace>$(PackagePath)</Replace>
      </MsDeployReplaceRules>
    </ItemGroup>
  </Target>
</Project>

Then when you build this project the package path will consist of Content\website, with all content files under that.

Now on to his question: “How can I update the package path for a project without modifying the project/project files when building from the command line?”

Now that we have the MSBuild .targets file to do the work for us, the only thing we need to figure out is how to add it to the build process when calling msbuild.exe myproject.csproj /t:Package. It’s pretty easy actually. You can take the package.wpp.targets file and drop it in a well known location (let’s say c:\msbuild\package.targets for this example). Then when you build your project you can pass a property to get that file imported. The command is below.

msbuild myproject.csproj /t:Package /p:CustomAfterMicrosoftCommonTargets=c:\msbuild\package.targets  

Microsoft.Common.targets (which is imported by most project types) contains a property, CustomAfterMicrosoftCommonTargets, which defaults to a file under Program Files and is imported if it exists. You can override that value via an MSBuild property, which is what we are doing here. Note: if you already have a common targets file in the default shared location then this obviously will not work for you; you’d have to add another conditional import with a new property for that case.


Sayed Ibrahim Hashimi

Friday, February 20, 2015 6:04:28 PM (GMT Standard Time, UTC+00:00)
Saturday, January 03, 2015

How to get MSBuild log files in Markdown format

Have you ever wanted to view MSBuild log files in Markdown format? If you are using psbuild then you’ll now get this feature for free when you upgrade your install. psbuild is a PowerShell wrapper for msbuild.exe. You can learn more about it on the GitHub page or in my previous blog post on it.


When using psbuild to build a project, the basic command will look like the following.

Invoke-MSBuild myprojectorsln.csproj

After that, to get the log you’ll invoke Open-PSBuildLog.

Open-PSBuildLog will open the log file from the last build executed by that instance of psbuild. Open-PSBuildLog has a single parameter, format, which determines which log file is opened.

Detailed is the default format, so to get your log file in Markdown format execute the following.

Open-PSBuildLog markdown

This will open the .md file in the default editor. Below is a screenshot of a sample log file.



If you have any comments reach out to me on twitter or open an issue in psbuild to discuss further.


Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | psbuild Saturday, January 03, 2015 12:52:55 AM (GMT Standard Time, UTC+00:00)
Wednesday, December 03, 2014

I closed my BBVACompass account because they believe that “Hi123” is a strong password

OK, so this is off topic, but it’s so important that I had to blog about it. Let me give you some background. Last week I was traveling with my family on a road trip to Canada. Usually I’m super paranoid and never connect to any open wireless network (I pay for and carry my own mi-fi device for exactly this reason). Since we were in Canada I didn’t want to get hit with so many charges, so I chanced it on a few networks. I made sure to connect to VPN as soon as I could, but there was still some time during which I was not completely protected. At one point I thought that my Gmail account had been hacked (further investigation thankfully proved this to be false). So I connected to a known network, applied VPN, double checked that my IP was routed through VPN and started changing my critical passwords. One of those was an account with BBVACompass bank. I initially set up this account a while back, and evidently I wasn’t as concerned with online security then as I am now.

When I got to the password reset screen here is the tooltip indicating the password requirements.


In developer terms that’s 4-12 characters, alphanumeric only. You cannot use any special characters or spaces. I’m not a security expert so I reached out to @TroyHunt (founder of https://haveibeenpwned.com/ and a recognized security expert) to see what his thoughts were on this. Here is his response.


Later in the conversation BBVACompass chimed in stating


So I looked at the link http://www.bbvacompass.com/customer-service/online-banking/siteid.jsp to see if there was some other way to authenticate which was more secure. From what I understood from that link they have a service called “Site ID” which consists of the following.

  1. You enter three security questions/answers
  2. When logging in on a new machine you are prompted with the security questions/answers and asked whether the machine should be “trusted”
  3. When logging in using a trusted machine the password is never submitted over the wire

The page that was linked to didn’t include any indication that this was “dual factor authentication” as the @BBVACompass twitter account tried to pass it off as. I let them know that this is not two/dual factor auth. Even with “Site ID”, if you log in on a compromised machine all of the security questions/answers and the password can be stolen, and an attacker can then log in without me ever being notified. That defeats the purpose of two factor auth. With true two factor auth, if I sign into Google on a compromised machine an attacker may get my password, but when they try to sign in later they’ll also have to get access to my phone’s text messages. That is true two factor auth, not security questions.

Another security expert @RobHale77 also chimed in later with the comments below.


What’s a strong password?

I decided to do a bit more investigation into how BBVACompass represents password strength on their change password page. I guessed that the password strength field was being populated with JavaScript, so I opened the site in my browser and used the in-browser dev tools to look at the code. Here is the getPasswordStrength function; the comments were added by me.

// This function will return an integer in the range of 0-100.
// The max that I've seen here is 90
function getPasswordStrength(H){
    var D=(H.length);
    if (D<4) { D=0; }
    var F=H.replace(/[0-9]/g,"");
    var G=(H.length-F.length);
    var A=H.replace(/\W/g,"");
    var C=(H.length-A.length);
    var B=H.replace(/[A-Z]/g,"");
    var I=(H.length-B.length);
    var E=((D*10)-20)+(G*10)+(C*15)+(I*10);
    return E;
}

Then this is converted to weak/medium/strong with the following js function, once again comments were added by me.

$.fn.passwordStrength = function( options ){
    return this.each(function(){
        var that = this;that.opts = {};
        that.opts = $.extend({}, $.fn.passwordStrength.defaults, options);
        that.div = $(that.opts.targetDiv);
        that.defaultClass = that.div.attr('class');
        // opts.classes is declared elsewhere as = Array('weak','medium','strong') so length is 3
        that.percents = (that.opts.classes.length) ? 100 / that.opts.classes.length : 100;
        // the handler below runs each time the password field changes
        v = $(this).keyup(function(){
            if( typeof el == "undefined" )
                this.el = $(this);
            var s = getPasswordStrength (this.value);
            var p = this.percents;
            var t = Math.floor( s / p );
            // from what I can tell 's' will max out at 90 so this if statement will always be skipped
            if( 100 <= s )
                t = this.opts.classes.length - 1;
            // t now determines the index for weak/medium/strong
            // weak:     s <= 30
            // medium:   s 40-60 (inclusive)
            // strong:   s >= 70 (maximum I've seen is 90)
            this.div
                .addClass( this.defaultClass )
                .addClass( this.opts.classes[ t ] );
        });
    });
};

Since the code is minified it is somewhat difficult to follow. What I found is that any password scoring 70 or higher is reported as strong, and the formula above hands out 10 points per character (minus 20), plus 10 per digit, 15 per special character and 10 per uppercase letter, so a five character password with an uppercase letter and a few digits already qualifies.
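To sanity check that reading of the minified code, here is a quick PowerShell transcription of the scoring logic (my own re-implementation, not BBVACompass code); the comments map the variables back to the single-letter names in the JavaScript above.

function Get-PasswordStrengthScore([string]$password){
    $len = $password.Length                                  # D in the JS
    if($len -lt 4){ $len = 0 }
    $digits  = [regex]::Matches($password, '[0-9]').Count    # G in the JS
    $symbols = [regex]::Matches($password, '\W').Count       # C in the JS
    $uppers  = [regex]::Matches($password, '[A-Z]').Count    # I in the JS
    return (($len * 10) - 20) + ($digits * 10) + ($symbols * 15) + ($uppers * 10)
}

# 'Hi123' scores (5*10 - 20) + (3*10) + (1*10) = 70, and floor(70 / 33.3) = 2 maps to 'strong'
Get-PasswordStrengthScore 'Hi123'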

So I decided to try “Hi123” to see if I was right. Sure enough BBVACompass told me that the selected password Hi123 is a Strong password!

This is beyond insane. It contains a dictionary word and a sequence of three numbers (likely the most common sequence at that). BBVACompass, this is misleading at best. This is nowhere near strong; you are lying to your customers about the security of their passwords. Here are some passwords and how BBVACompass represents their strength. If you have an account you can verify this by going to the change password screen under Online Banking Profile.




Why on earth is “hlzzeseiyg” weak and “111Aa” strong?! Clearly this has been poorly implemented and is misleading. Fix it now.

What I would like to see as a consumer

My top recommendation for BBVACompass is to get a security expert/team involved to redo their online security, but if you cannot afford that then at least follow what’s below.

I’m not a security expert but here is what I recommended to BBVACompass as a consumer.

  1. Support for very strong passwords, i.e. those that are >= 20 characters and allow special characters
  2. Support for true two factor auth, like password/text or password/call
  3. (stretch goal) Support for viewing an audit log of devices that have recently accessed my account

I currently get all of the above features from Google.

Be more transparent about weak passwords

Now that I’ve seen the guts of their getPasswordStrength function I’d like to see BBVACompass implement a better function for reporting password strength, one that takes into account dictionary words and common patterns. As stated, I’m not a security expert, but after a quick search I found http://www.sitepoint.com/5-bootstrap-password-strength-metercomplexity-demos/ which includes pointers to live demos of jquery.pwstrength.bootstrap (http://jsfiddle.net/jquery4u/mmXV5/) and StrongPass.js (http://jsfiddle.net/dimitar/n8Dza/). Below are the results for the same “Hi123” password from both.



As you can see, if BBVACompass had used readily available open source tools to verify password strength we wouldn’t be having this conversation. Both reported the password as unacceptable.


As consumers we must hold our online service providers (especially banks) accountable for online security. For the tech savvy bunch, it’s your responsibility to educate your non-tech friends/family about online security and strong passwords.

As a bank, BBVACompass needs to hold its development team accountable for providing customers with secure access to their accounts online, as well as honest indications of password strength. You’re being dishonest, which means I cannot trust you.


Previous pleas ignored by BBVACompass

I did a search on Twitter for “@BBVACompass password” and discovered that this has been brought up multiple times by customers. The earliest one I found was from November 2013! Tweets below.





BBVACompass, your customers have spoken and we are demanding better online security. Now is the time to act. I’ve already closed my account and I’ll be advising all friends/family with a BBVACompass account to do the same. With the recent security breaches at Sony/Target/etc. you need to start taking online security more seriously. This blog post and the twitter comments may end up costing you a few accounts, but if your customers experience widespread hacking it will be much more severe. Fix this before it is too late; this should be your top development priority IMO.

My Promise to BBVACompass

BBVACompass, if you support passwords >= 20 characters with special characters within 90 days, I will re-open my account with the same funds with which I closed it the next time I’m in Florida.


Note: please post comments at http://www.reddit.com/r/technology/comments/2o4uat/i_closed_my_bbvacompass_account_because_they/.

Sayed Ibrahim Hashimi @SayedIHashimi

Wednesday, December 03, 2014 7:03:57 AM (GMT Standard Time, UTC+00:00)
Monday, August 11, 2014

SlowCheetah is going into maintenance mode

For the latest info here and for discussion please visit https://github.com/sayedihashimi/slow-cheetah/issues/158.


I first developed SlowCheetah around VS2010 SP2 with the idea that I could at some point transition this to the appropriate team(s) at Microsoft. Unfortunately I haven't been able to do that, and the existence of this extension has actually worked against that goal.

I'm really happy that SlowCheetah has gotten the attention and success that it has, but now it's time for me to move on.

No support for SlowCheetah in VS "14"

I am not planning to update SlowCheetah for Visual Studio "14". If you would like to see support for transforms in VS "14" I suggest you vote, and comment, on the uservoice item at http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/2043217-support-web-config-style-transforms-on-any-file-in.

No new features

I will not be adding any new features to SlowCheetah myself. If anyone wants to add features, I will try to help guide them.

No fixes for regressions

If there are any scenarios that work in VS2013 RTM that do not work in future versions of Visual Studio, I will not be fixing them.

I hope you all understand my situation here. I have spent countless hours working on SlowCheetah and there is very little ROI for me, so I need to move on to focus on other OSS projects that I'm involved in.

Thanks for all the love. I still love SlowCheetah too, and I'm sad to see there won't be support for transforms in VS "14".



Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | SlowCheetah | Visual Studio Monday, August 11, 2014 6:44:36 PM (GMT Daylight Time, UTC+01:00)
Tuesday, July 22, 2014

Stop checking-in binaries, instead create self-bootstrapping scripts

A few weeks ago Mads Kristensen and I created a couple of site extensions for Azure Web Sites: the Azure Image Optimizer and the Azure Minifier. These extensions can be used to automatically optimize all images on a site and minify all .js/.css files, respectively. They are shipped as NuGet packages on nuget.org as well as site extensions on siteextensions.net.

After creating those utilities we also updated the image optimizer to support being called on the command line via a .exe. We have not yet had a chance to update the minifier to be callable directly, but we have an open issue on it. If you can help that would be great.

The exe for the image optimizer that can be used from the command line can be found in the NuGet package as well. You can also download it from here, but to get the latest version nuget.org is the way to go.

After releasing that exe I wanted an easy way to use it on a variety of machines, and to make it simple for others to try it out. What I ended up with is what I’m calling a “self-bootstrapping script” which you can find at optimize-images.ps1. Below you’ll see the entire contents of the script.

[cmdletbinding()]
param(
    $folderToOptimize = ($pwd),

    $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

    $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
)

# If nuget.exe is not in the tools folder then it will be downloaded there.
function Get-Nuget(){
    [cmdletbinding()]
    param(
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        $nugetDestPath = Join-Path -Path $toolsDir -ChildPath nuget.exe

        if(!(Test-Path $nugetDestPath)){
            'Downloading nuget.exe' | Write-Verbose
            (New-Object System.Net.WebClient).DownloadFile($nugetDownloadUrl, $nugetDestPath)

            # double check that it was written to disk
            if(!(Test-Path $nugetDestPath)){
                throw 'unable to download nuget'
            }
        }

        # return the path of the file
        $nugetDestPath
    }
}

# If the image optimizer is not in the tools folder then it will be downloaded there.
function GetImageOptimizer(){
    [cmdletbinding()]
    param(
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        if(!(Test-Path $toolsDir)){
            New-Item $toolsDir -ItemType Directory | Out-Null
        }

        $imgOptimizer = (Get-ChildItem -Path $toolsDir -Include 'ImageCompressor.Job.exe' -Recurse)

        if(!$imgOptimizer){
            'Downloading image optimizer to the tools folder' | Write-Verbose
            # nuget install AzureImageOptimizer -Prerelease -OutputDirectory C:\temp\nuget\out\
            $cmdArgs = @('install','AzureImageOptimizer','-Prerelease','-OutputDirectory',(Resolve-Path $toolsDir).ToString())

            'Calling nuget to install image optimizer with the following args. [{0}]' -f ($cmdArgs -join ' ') | Write-Verbose
            &(Get-Nuget -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl) $cmdArgs | Out-Null
        }

        $imgOptimizer = Get-ChildItem -Path $toolsDir -Include 'ImageCompressor.Job.exe' -Recurse | select -first 1
        if(!$imgOptimizer){ throw 'Image optimizer not found' }

        # return the path to ImageCompressor.Job.exe
        $imgOptimizer.FullName
    }
}

function OptimizeImages(){
    [cmdletbinding()]
    param(
        $folder,

        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        [string]$imgOptExe = (GetImageOptimizer -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl)

        [string]$folderToOptimize = (Resolve-Path $folder)

        'Starting image optimizer on folder [{0}]' -f $folder | Write-Host
        # .\.tools\AzureImageOptimizer.0.0.10-beta\tools\ImageCompressor.Job.exe --folder M:\temp\images\opt\to-optimize
        $cmdArgs = @('--folder', $folderToOptimize)

        'Calling img optimizer with the following args [{0} {1}]' -f $imgOptExe, ($cmdArgs -join ' ') | Write-Host
        &$imgOptExe $cmdArgs

        'Images optimized' | Write-Host
    }
}

OptimizeImages -folder $folderToOptimize -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl

The script is set up so that you call functions like Get-Nuget and GetImageOptimizer to get the path to the .exe to call. If the .exe is not in the expected location, under %localappdata% by default, it will be downloaded and the path will then be returned. In the case of this script I use nuget.org as my primary distribution mechanism, so the script will first download nuget.exe and then use that to get the actual binaries. With this approach you can avoid checking in binaries and still have scripts which are pretty concise.
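Using the script then comes down to a single call. For example, to optimize everything under an images folder (the path below is just an example; -Verbose surfaces the Write-Verbose messages because of the [cmdletbinding()] attribute):

.\optimize-images.ps1 -folderToOptimize 'C:\projects\MySite\images' -Verbose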

After creating optimize-images.ps1 I thought it would be really useful to have a similar script to execute XDT transforms on XML files, so I created transform-xml.ps1. That script first downloads nuget.exe and then uses it to download the NuGet packages which are required to invoke XDT transforms.

A self-bootstrapping script doesn’t need to be a PowerShell script; you can apply the same technique in any scripting language. I’ve recently created an MSBuild script, inspired by Get-Nuget above, which can be used in a similar way. You can find that script in a gist here. It’s below as well.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="GetNuget">
  <!--
  This is an MSBuild snippet that can be used to download nuget.exe to the path
  in the NuGetExePath property.
   1. Import this file or copy and paste this into your build script
   2. Call the GetNuGet target before you use nuget.exe from $(NuGetExePath)
  -->
  <PropertyGroup>
    <NuGetExePath Condition=" '$(NuGetExePath)'=='' ">$(localappdata)\LigerShark\AzureJobs\tools\nuget.exe</NuGetExePath>
    <NuGetDownloadUrl Condition=" '$(NuGetDownloadUrl)'=='' ">http://nuget.org/nuget.exe</NuGetDownloadUrl>
  </PropertyGroup>

  <Target Name="GetNuget" Condition="!Exists('$(NuGetExePath)')">
    <Message Text="Downloading nuget from [$(NuGetDownloadUrl)] to [$(NuGetExePath)]" Importance="high"/>
    <ItemGroup>
      <_nugetexeitem Include="$(NuGetExePath)" />
    </ItemGroup>
    <!-- make sure the directory exists before downloading the file into it -->
    <MakeDir Directories="@(_nugetexeitem->'%(RootDir)%(Directory)')"/>
    <DownloadFile
        Address="$(NuGetDownloadUrl)"
        FileName="$(NuGetExePath)" />
  </Target>

  <PropertyGroup Condition=" '$(ls-msbuildtasks-path)'=='' ">
    <ls-msbuildtasks-path Condition=" !Exists('$(ls-msbuildtasks-path)')">$(MSBuildFrameworkToolsPath)\Microsoft.Build.Tasks.v4.0.dll</ls-msbuildtasks-path>
    <ls-msbuildtasks-path Condition=" !Exists('$(ls-msbuildtasks-path)')">$(windir)\Microsoft.NET\Framework\v4.0.30319\Microsoft.Build.Tasks.v4.0.dll</ls-msbuildtasks-path>
  </PropertyGroup>

  <!-- inline task used to download a file, see http://stackoverflow.com/a/12739168/105999 -->
  <UsingTask TaskName="DownloadFile" TaskFactory="CodeTaskFactory" AssemblyFile="$(ls-msbuildtasks-path)">
    <ParameterGroup>
      <Address ParameterType="System.String" Required="true"/>
      <FileName ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
      <Reference Include="System" />
      <Code Type="Fragment" Language="cs">
        new System.Net.WebClient().DownloadFile(Address, FileName);
      </Code>
    </Task>
  </UsingTask>
</Project>

This script has a single target, GetNuGet, which you can call to download nuget.exe to the expected location. After that you can use the path to nuget.exe from the NuGetExePath property. I’ve already removed nuget.exe from the SideWaffle and AzureJobs repositories using this technique. It’s a great way to avoid checking in nuget.exe.
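If you want to try the target on its own, save the snippet to a file (the file name below is just an example) and invoke it from a developer command prompt:

msbuild .\getnuget.proj /t:GetNuget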


Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | powershell | scripting Tuesday, July 22, 2014 7:36:11 AM (GMT Daylight Time, UTC+01:00)
Saturday, July 19, 2014

Introducing PSBuild – an improved interface for msbuild.exe in PowerShell

For the past few months I’ve been working on a project I’m calling PSBuild. It’s an open source project on GitHub which makes the experience of calling MSBuild from PowerShell better. Getting started with PSBuild is really easy; you can install it with one line.

(new-object Net.WebClient).DownloadString("https://raw.github.com/ligershark/psbuild/master/src/GetPSBuild.ps1") | iex

You can find this info on the project home page as well.


When you install PSBuild one of the functions that you get is Invoke-MSBuild. When you call Invoke-MSBuild it will end up calling msbuild.exe. Some of the advantages of using Invoke-MSBuild over calling msbuild.exe directly are covered below.

Calling Invoke-MSBuild

The most basic usage of Invoke-MSBuild looks like the following.

Invoke-MSBuild .\App.csproj

This will build the project using the default targets. The call to msbuild.exe on my computer is below.

C:\Program Files (x86)\MSBuild\12.0\bin\amd64\msbuild.exe 

From the call to msbuild.exe you can see that /m is passed, as well as a couple of file loggers writing to %localappdata%. We will get to the logs later.

To get a sense for how you can use Invoke-MSBuild take a look at the examples below.

# VisualStudioVersion and Configuration MSBuild properties have easy to use parameters
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -configuration Release

# How to pass properties
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -properties @{'DeployOnBuild'='true';'PublishProfile'='toazure';'Password'='mypwd-really'}

# How to build a single target
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -targets Clean

# How to build multiple targets
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -targets @('Clean','Build')


Getting log files

When you call Invoke-MSBuild on a project or solution the output will look something like the following.


Notice the line at the end. You can access your last log using the Open-PSBuildLog command.

Open-PSBuildLog

This will open the detailed log of the previous build in the program that’s associated with the .log extension. You can also use the Get-PSBuildLastLogs function to view the paths of both log files written. If you want to open the folder where the log files are written you can execute start (Get-PSBuildLogDirectory).

Helper functions

There are a couple of things that I constantly have to look up when I’m authoring MSBuild files: reserved property names and escape characters. PSBuild has a helper function for each of these: Get-MSBuildEscapeCharacters and Get-MSBuildReservedProperties. In the screenshot below you’ll see the result of executing each of these.
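If you want to try them yourself, just call them from a PowerShell session where psbuild has been imported:

Get-MSBuildReservedProperties
Get-MSBuildEscapeCharacters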



Default Properties

The Invoke-MSBuild cmdlet has a parameter, -defaultProperties. You can pass in a hashtable just like with the -properties parameter. These properties are applied as environment variables before the call to msbuild.exe and then reverted afterwards. The effect is that the value will be used if no other value for that property is specified anywhere else in MSBuild.
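For example, to default Configuration to Release while still letting any explicitly specified value win, you could do something like the following (the property name/value here are just an illustration):

Invoke-MSBuild .\App.csproj -defaultProperties @{'Configuration'='Release'}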


There is so much more to PSBuild; this is just the tip of the iceberg. Keep an eye on the project page for more info. I’d love your help on this project, so please consider contributing at https://github.com/ligershark/psbuild.


Sayed Ibrahim Hashimi | http://msbuildbook.com/ | @SayedIHashimi

msbuild | psbuild Saturday, July 19, 2014 6:20:43 AM (GMT Daylight Time, UTC+01:00)
Friday, May 09, 2014

How to download a site using msdeploy.exe

Recently a customer asked me whether it’s possible to download a site using msdeploy.exe. It is pretty easy. I’ll demonstrate this with Microsoft Azure Web Sites, but you can use the same approach with any hosting provider that supports Web Deploy (aka MSDeploy).

To perform a sync with msdeploy.exe the structure of the command that we need to execute is as follows.

msdeploy.exe -verb:sync -source:<fill-in-details> -dest:<fill-in-details>

For the source parameter we will use the remote Azure Web Site, and for the dest parameter we will write to a folder on the local file system. You can get the Web Deploy publish settings for the Azure Web Site by clicking on the Download the publish profile link on the Configure page.


This will download an XML file that has all the publish settings you’ll need. For example below is a publish settings file for my demo site.

  <publishProfile profileName="sayeddemo2 - Web Deploy" publishMethod="MSDeploy" 
                  userPWD="*** removed ***" 
                  SQLServerDBConnectionString="" mySQLDBConnectionString="" hostingProviderForumLink="" 
  <publishProfile profileName="sayeddemo2 - FTP" publishMethod="FTP" 
                  publishUrl="ftp://waws-prod-bay-001.ftp.azurewebsites.windows.net/site/wwwroot" ftpPassiveMode="True" 
                  userName="sayeddemo2\$sayeddemo2" userPWD="*** removed ***" 
                  destinationAppUrl="http://sayeddemo2.azurewebsites.net" SQLServerDBConnectionString="" 
                  mySQLDBConnectionString="" hostingProviderForumLink="" 

The publish settings file provided by Azure Web Sites has two profiles: an MSDeploy profile and an FTP profile. We can ignore the FTP profile and just use the MSDeploy one. The relevant settings from that profile are the publish URL, the site name (msdeploySite) and the user name/password.

We will use the contentPath MSDeploy provider to download the files. On the source parameter we will need to include the relevant details of the remote machine. The full command to execute is below. I’ll break it down a bit after the snippet.

"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" 
    -dest:contentPath=c:\temp\pubfromazure -disablerule:BackupRule

The important parts of the command above are how the remote settings are passed to the source provider. On the dest side I’ve provided the location where the files should be downloaded to.


Sayed Ibrahim Hashimi | http://msbuildbook.com/ | @SayedIHashimi

Friday, May 09, 2014 12:02:02 AM (GMT Daylight Time, UTC+01:00)
Friday, October 11, 2013

SideWaffle: How to create your own VS template pack

If you haven’t heard, I’m working on a project with Mads Kristensen called SideWaffle. SideWaffle is a Visual Studio extension which contains many different Visual Studio Item and Project Templates. This is a community effort and it is all open source at https://github.com/ligershark/side-waffle. You can create your own Item Templates and Project Templates and send a pull request for them to be included in the main repo. Check out the video below for more info on SideWaffle.

SideWaffle intro video

Item Templates are used by VS developers to create files using the Add New Item dialog. SideWaffle already contains a number of Item Templates such as: Angular Controller, robots.txt, SignalR Hub and Client, etc. For more info on how to create Item Templates with SideWaffle watch the 4 minute video below.


Project Templates are the items that show up in the Add New Project dialog. They provide a starting point for developers creating new projects. SideWaffle already has a few project templates as well, such as a Google Chrome Extension. You can learn more about how to create Project Templates in this video.


Now that we’ve gotten the intro out of the way, let’s explore how you can create your own SideWaffle.

How to create your own SideWaffle

The idea behind SideWaffle is that we have a shared VS extension for popular VS Item and Project Templates. Instead of contributing to the main SideWaffle project you may be interested in creating your own distribution that does not include the standard templates. For example, I’ve heard from both the Orchard and Umbraco communities that they are interested in creating template packs for their users. It wouldn’t make much sense to include those templates in the main SideWaffle project. Instead it would be best to create a separate distribution for each: OrchardWaffle and UmbracoWaffle.

So how can you do this? It’s pretty easy actually. SideWaffle is built on top of a NuGet package, TemplateBuilder, which is also open source at https://github.com/ligershark/template-builder. All the core functionality of SideWaffle is contained in that NuGet package. To create your own Waffle pack you start with a VSIX project and add the TemplateBuilder NuGet package to it, as shown below.
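From the Package Manager Console that is a single command (assuming the package id on nuget.org is TemplateBuilder):

Install-Package TemplateBuilder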

After you add the TemplateBuilder NuGet package a few things happen:

  1. The build process of the project is modified to support building Item and Project templates
  2. Your .vsixmanifest file is updated with two new Asset tags
  3. An ItemTemplates folder is created with a sample Item Template

From here on you can build the project, and after installing the generated .vsix users can easily create instances of your item or project templates.

You can add additional Item Templates, as well as Project Templates, to your project. That’s pretty much all there is to getting started with your own Waffle pack.

Let me know if you have any issues or comments.

Happy Waffleing!


Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

extensibility | SideWaffle | Visual Studio | Visual Studio 2012 | VSIX Friday, October 11, 2013 5:11:26 PM (GMT Daylight Time, UTC+01:00)
Saturday, September 21, 2013

How to extend the web publish process without modifying project contents

When automating web publishing for Visual Studio projects, in many cases your first step will be to create a publish profile for the project in VS. There are many cases in which you cannot, or would rather not, do this. In this post I’ll show you how you can take an existing project and use an MSBuild file to drive the publish process. Along the way I’ll also show how you can extend the publish process without modifying either the project or any of its contents.

Before we get too far into this, if you are not familiar with how to publish your VS web projects from the command line you can read our docs at http://www.asp.net/mvc/tutorials/deployment/visual-studio-web-deployment/command-line-deployment.

When you publish a project from the command line using a publish profile you typically use the syntax below.

msbuild .\MyProject.csproj /p:VisualStudioVersion=11.0 /p:DeployOnBuild=true /p:PublishProfile=<profile-name-or-path> 

In this snippet we are passing in a handful of properties. VisualStudioVersion dictates which version of MSBuild targets are used during the build. See http://sedodream.com/2012/08/19/VisualStudioProjectCompatabilityAndVisualStudioVersion.aspx for more details on that. DeployOnBuild=true injects the publish process at the end of build. PublishProfile can either be the name of a publish profile which the project contains or it can be the full path to a publish profile. We will use PublishProfile with that second option, the full path.

So we need to pass in the full path to a publish profile, which typically is a .pubxml file. A publish profile is just an MSBuild file. When you pass in PublishProfile and DeployOnBuild=true, the publish profile is imported into the build/publish process and supplies the publish properties needed to perform the publish.

Let’s see how that works. I have a sample project, MySite, which does not have any publish profiles created for it. I have created a publish profile, ToFileSys.pubxml, in another folder that will be used though. The contents of that file are below.


<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

This publish profile will publish to a local folder. I created the file in VS with a different project, copied it to the folder that I needed, and removed the properties which are only used for the inside-VS experience. We can publish the MySite project using this profile with the command below.

msbuild MySite.csproj /p:VisualStudioVersion=11.0 /p:DeployOnBuild=true /p:PublishProfile=<full-path-to>\ToFileSys.pubxml

When you execute this, the file specified in PublishProfile will be imported into the build process.

Taking it up a level


Now let’s see how we can take this to the next level by having a single script that will be used to publish more than one project using this technique.

In the sample files (you can find links for them at the end of the post) I have a solution with two web projects, MySite and MyOtherSite. Neither of these projects has any publish profiles created. I have created a script which will build/publish these projects, which you can find at build\publish.proj in the samples. The contents of the file are shown below.


<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildProjects">
  <!--
  This file is used in two ways.
    1. Drive the build and publish process
    2. It is used by the publish process during the build of MySite to configure/extend publish
        Note: 1. Is kicked off by the user on the cmd line/build server. 2. Is invoked by this script itself.
              This file is injected into the publish process via the PublishProfile property.
  -->
  <PropertyGroup>
    <VisualStudioVersion Condition=" '$(VisualStudioVersion)'=='' ">11.0</VisualStudioVersion>
    <Configuration Condition=" '$(Configuration)'=='' ">Release</Configuration>
    <!-- Location for build output of the project -->
    <OutputRoot Condition=" '$(OutputRoot)'=='' ">$(MSBuildThisFileDirectory)..\BuildOutput\</OutputRoot>
    <!-- Root for the publish output -->
    <PublishFolder Condition=" '$(PublishFolder)'==''">C:\temp\Publish\Output\</PublishFolder>
  </PropertyGroup>

  <ItemGroup>
    <ProjectsToBuild Include="$(MSBuildThisFileDirectory)..\MySite\MySite.csproj">
      <AdditionalProperties>
        VisualStudioVersion=$(VisualStudioVersion);Configuration=$(Configuration);OutputPath=$(OutputRoot);
        WebPublishMethod=FileSystem;publishUrl=$(PublishFolder)MySite\;DeployOnBuild=true;DeployTarget=WebPublish;
        PublishProfile=$(MSBuildThisFileFullPath)
      </AdditionalProperties>
    </ProjectsToBuild>
    <ProjectsToBuild Include="$(MSBuildThisFileDirectory)..\MyOtherSite\MyOtherSite.csproj">
      <AdditionalProperties>
        VisualStudioVersion=$(VisualStudioVersion);Configuration=$(Configuration);OutputPath=$(OutputRoot);
        WebPublishMethod=FileSystem;publishUrl=$(PublishFolder)MyOtherSite\;DeployOnBuild=true;DeployTarget=WebPublish;
        PublishProfile=$(MSBuildThisFileFullPath)
      </AdditionalProperties>
    </ProjectsToBuild>
  </ItemGroup>

  <Target Name="BuildProjects">
    <MSBuild Projects="@(ProjectsToBuild)" />
  </Target>

  <!-- ***************************************************************************************
 The targets below will be called during the publish process.
 These targets are injected into the publish process for each web project.
 These targets will not have access to any new values for properties/items from
 the targets above this.
 *************************************************************************************** -->

  <Target Name="AfterWebPublish" AfterTargets="WebPublish">
    <Message Text="Inside AfterWebPublish" Importance="high"/>
  </Target>
</Project>

This file is pretty simple. It declares some properties which will be used for the build/publish process, and then declares the projects to be built in an item list named ProjectsToBuild. When declaring ProjectsToBuild I use the AdditionalProperties metadata to specify MSBuild properties to be used during the build of each project. Let’s take a closer look at those properties.


I’ll explain all the properties now. VisualStudioVersion, Configuration and OutputPath are all used for the build process. The other properties are related to publishing. If you want to publish to the file system those properties (WebPublishMethod, publishUrl, DeployOnBuild, and DeployTarget) must be set. The most important property there is PublishProfile.

PublishProfile is set to $(MSBuildThisFileFullPath) which is the full path to publish.proj. This will instruct the build process of that project to import publish.proj when its build/publish process is started. It’s important to note that a “new instance” of the file will be imported. What that means is that the imported version of publish.proj won’t have access to any dynamic properties/items created in publish.proj.

The reason why PublishProfile is specified there is so that we can extend the publish process from within publish.proj itself. publish.proj has a target, AfterWebPublish, which will be executed after each project is published. Let’s see how this works.

We can execute the publish process with the command below.

msbuild .\build\publish.proj /p:VisualStudioVersion=11.0

After executing this command the tail end of the result is shown in the image below.


In the image above you can see that the MyOtherSite project is published to the location specified in publish.proj and that the AfterWebPublish target is executed as well.


In this post we’ve seen how we can use an MSBuild file as a publish profile, and how to extend the publish process using that same file.

You can download the samples at https://dl.dropboxusercontent.com/u/40134810/blog/publish-injection.zip. You can find the latest version in my publish-samples repository at publish-injection.

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | Visual Studio | Visual Studio 2012 | web | Web Publishing Pipeline Saturday, September 21, 2013 7:57:03 PM (GMT Daylight Time, UTC+01:00)
Saturday, June 08, 2013

Introducing VsixCompress–a NuGet package to compress your Visual Studio Package

A few weeks ago I blogged about a Visual Studio extension, Farticus, which I’m working on with Mads Kristensen. In that post I described how the default compression of a .vsix (the artifact that is created for a Visual Studio Package) is not as small as it could be. It’s better to get a fully compressed VSIX because when users install the component the download time can be significantly reduced. In that post I described how you could use the Zip task from the MSBuild Extension Pack to have a fully compressed .vsix file. I will now show you how I’ve simplified this.

VsixCompress

Since my previous post I’ve created a NuGet package, VsixCompress, which simplifies this greatly. If you have an existing Visual Studio package and want a fully compressed .vsix file then all you need to do is install the VsixCompress package.
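From the Package Manager Console that looks like this (assuming the package id on nuget.org is VsixCompress):

Install-Package VsixCompress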



After you install this package the following happens.

  1. NuGet package is downloaded and installed to the packages folder
  2. The project is edited to include an import for a new .targets file
  3. The build process is extended to compress the .vsix file automatically

After installing this package, when you build, the generated .vsix is much smaller than before. In the default case, where you create a new C# VS Package, the generated .vsix is 17 KB. After adding VsixCompress the resulting .vsix is only 9 KB; that’s almost half the size. That’s all you need to know for local development. If you have a build server set up then there are a couple of additional steps. Let’s go over those now.

Build Server Support

I have blogged before about the issues of shipping build updates in NuGet packages. To briefly summarize: when leveraging NuGet Package Restore you have to be careful if any of those NuGet packages contain build updates. When using Package Restore, the NuGet packages which contain the imported .targets file(s) are restored after the build starts. What this means is that the .targets files will never be imported (or an old copy is imported, in the case where the file exists from a previous build). The only way to work around this is to restore the packages before the .sln/.csproj files themselves are built. You can read the full details at http://sedodream.com/2012/12/24/SlowCheetahBuildServerSupportUpdated.aspx. I have a NuGet package, PackageRestore, which can help here. Take a look at my previous post How to simplify shipping build updates in a NuGet package. Now that we’ve discussed all the details that you need, let’s discuss my plans going forward.
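In practice that usually means an explicit restore step in your build script before the solution itself is built. A minimal sketch in PowerShell, with example paths for nuget.exe and the solution, looks like this:

# restore packages first so any imported .targets files exist before the build starts
& .\.nuget\nuget.exe restore .\MySolution.sln
& msbuild .\MySolution.sln /p:Configuration=Release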

Plans going forward

I’m hoping to add more features over the course of the next few weeks.

FYI VsixCompress is open source so you can take a look at the code, or better yet contribute at https://github.com/sayedihashimi/vsix-compress.

Please let me know what you think of this!

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

extensibility | Visual Studio | VSIX Saturday, June 08, 2013 1:42:57 AM (GMT Daylight Time, UTC+01:00)