Wednesday, December 03, 2014

I closed my BBVACompass account because they believe that “Hi123” is a strong password

OK, so this is off topic, but it's so important that I had to blog about it. Let me give you some background. Last week I was traveling with my family on a road trip to Canada. Usually I'm super paranoid and never connect to any open wireless network (I pay for and carry my own mi-fi device for exactly this reason). Since we were in Canada I didn't want to get hit with roaming charges, so I chanced it on a few networks. I made sure to connect to VPN as soon as I could, but there was still some time when I was not completely protected. At one point I thought that my gmail account had been hacked (thankfully, further investigation proved this to be false). So I connected to a known network, connected to VPN, double-checked that my IP was routed through the VPN, and started changing my critical passwords. One of those was an account with BBVACompass bank. I initially set up this account a while back, and evidently I wasn't as concerned with online security then as I am now.

When I got to the password reset screen, this is the tooltip indicating the password requirements.

[Image: password requirements tooltip]

In developer terms, that's 4-12 characters, alphanumeric only. You cannot use any special characters or spaces. I'm not a security expert, so I reached out to @TroyHunt (founder of https://haveibeenpwned.com/ and a recognized security expert) to see what his thoughts were on this. Here is his response.

[Image: Troy Hunt's reply on Twitter]

Later in the conversation BBVACompass chimed in stating

[Image: @BBVACompass's reply on Twitter]

So I looked at the link http://www.bbvacompass.com/customer-service/online-banking/siteid.jsp to see if there was some other way to authenticate which was more secure. From what I understood from that link they have a service called “Site ID” which consists of the following.

  1. You enter three security questions/answers
  2. When logging in on a new machine you are prompted for your security questions/answers and asked whether the machine should be “trusted”
  3. When logging in using a trusted machine the password is never submitted over the wire

The page that was linked to didn't include any indication that this was “dual factor authentication”, as the @BBVACompass twitter account tried to pass it off to me. I let them know that this is not two/dual factor auth. Even with “Site ID”, if I log in on a compromised machine all of my security questions/answers and my password can be stolen, and an attacker can effectively log in without me ever being notified. That defeats the purpose of two factor auth. With two factor auth, if I sign into google on a compromised machine an attacker will get my password, but when they try to sign in later they'll have to get access to my phone's text messages as well. That is true two factor auth, not security questions.

Another security expert @RobHale77 also chimed in later with the comments below.

[Image: @RobHale77's comments on Twitter]

What’s a strong password?

I decided to do a bit more investigation into how BBVACompass represents password strength on their change password page. I guessed that the password strength field was being populated with JavaScript, so I opened the site in my browser and used the in-browser dev tools to look at the code. Here is the getPasswordStrength function; the comments were added by me.

// This function will return an integer in the range of 0-100.
// The max that I've seen here is 90
function getPasswordStrength(H){
    var D=(H.length);
    if (D<4) { D=0; }
    if(D>5){
        D=5;
    }
    var F=H.replace(/[0-9]/g,"");
    var G=(H.length-F.length);
    if(G>3){G=3;}
    var A=H.replace(/\W/g,"");
    var C=(H.length-A.length);
    if(C>3){C=3;}
    var B=H.replace(/[A-Z]/g,"");
    var I=(H.length-B.length);
    if(I>3){I=3;}
    var E=((D*10)-20)+(G*10)+(C*15)+(I*10);
    if(E<0){E=0;}
    if(E>100){E=100;}
    return E;
}

Then this is converted to weak/medium/strong with the following JS function; once again the comments were added by me.

$.fn.passwordStrength = function( options ){
    return this.each(function(){
        var that = this;that.opts = {};
        that.opts = $.extend({}, $.fn.passwordStrength.defaults, options);
 
        that.div = $(that.opts.targetDiv);
        that.defaultClass = that.div.attr('class');
 
        // opts.classes is declared elsewhere as = Array('weak','medium','strong') so length is 3
        that.percents = (that.opts.classes.length) ? 100 / that.opts.classes.length : 100;
 
         v = $(this)
        .keyup(function(){
            if( typeof el == "undefined" )
                this.el = $(this);
            var s = getPasswordStrength (this.value);
            var p = this.percents;
            var t = Math.floor( s / p );
            
            // from what I can tell 's' will max out at 90 so this if statement will always be skipped
            if( 100 <= s )
                t = this.opts.classes.length - 1;
            
            // t now determines the index for weak/medium/strong
            // weak:     s <= 30
            // medium:    s 40-60 (inclusive)
            // strong:    s >= 70 (maximum I've seen is 90)
            
            this.div
                .removeAttr('class')
                .addClass( this.defaultClass )
                .addClass( this.opts.classes[ t ] );
 
        });
    });
};

Since the code is minified it is somewhat difficult to follow. What I found was that, with special characters already ruled out by the input restrictions, a password is reported as strong once it scores 70 or more, which breaks down as follows.

  1. Length is worth at most 30 points, and any password of 5 or more characters gets the full 30 (anything beyond 5 characters adds nothing)
  2. Each digit is worth 10 points, up to a maximum of 30
  3. Each uppercase letter is worth 10 points, up to a maximum of 30

So I decided to try “Hi123” to see if I was right. Sure enough BBVACompass told me that the selected password Hi123 is a Strong password!

[Image: BBVACompass reporting "Hi123" as a Strong password]

This is beyond insane. It contains a word and a sequence of 3 numbers (likely the most common sequence at that). BBVACompass, this is misleading at best. This is nowhere near strong; you are lying to your customers about the security of their passwords. Here are some passwords and how BBVACompass represents their strength. If you have an account you can verify this by going to the change password screen under Online Banking Profile.

[Images: example passwords reported as weak, medium, and strong]

Why on earth is "hlzzeseiyg" weak while "111Aa" is strong?! Clearly this has been poorly implemented and is misleading. Fix it now.
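To double-check that, here is a quick PowerShell port of the getPasswordStrength logic above; it reproduces exactly those results. This is just a sketch of mine (the function name is made up) and it assumes the minified JavaScript above is transcribed faithfully.

# PowerShell port of the getPasswordStrength function shown earlier
function Get-BbvaPasswordScore([string]$password){
    # length score: 0 if shorter than 4 characters, capped at 5
    $lenScore = $password.Length
    if($lenScore -lt 4){ $lenScore = 0 }
    if($lenScore -gt 5){ $lenScore = 5 }

    # digits, special characters and uppercase letters, each capped at 3
    $digits  = [regex]::Matches($password,'[0-9]').Count
    if($digits -gt 3){ $digits = 3 }
    $special = [regex]::Matches($password,'\W').Count
    if($special -gt 3){ $special = 3 }
    $upper   = [regex]::Matches($password,'[A-Z]').Count
    if($upper -gt 3){ $upper = 3 }

    # same weighted sum as the JS, clamped to 0-100
    $score = (($lenScore * 10) - 20) + ($digits * 10) + ($special * 15) + ($upper * 10)
    if($score -lt 0){ $score = 0 }
    if($score -gt 100){ $score = 100 }
    $score
}

Get-BbvaPasswordScore 'hlzzeseiyg'   # 30, which maps to weak
Get-BbvaPasswordScore '111Aa'        # 70, which maps to strong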

What I would like to see as a consumer

My top recommendation for BBVACompass is to get a security expert/team involved to redo your online security, but if you cannot afford that then follow what’s below.

I’m not a security expert but here is what I recommended to BBVACompass as a consumer.

  1. Support for very strong passwords: those that are >= 20 characters, with special characters allowed
  2. Support for true two factor auth like password/text or password/call
  3. (stretch goal) Support to view an audit log of devices that have recently accessed my account

I am getting all of the above features from google currently.

Be more transparent about weak passwords

Now that I've seen the guts of their getPasswordStrength function I'd like to see BBVACompass implement a better function for reporting password strength, one that takes into account dictionary words and common patterns. As stated, I'm not a security expert, but after a quick search I found http://www.sitepoint.com/5-bootstrap-password-strength-metercomplexity-demos/ which includes pointers to live demos of jquery.pwstrength.bootstrap (http://jsfiddle.net/jquery4u/mmXV5/) and StrongPass.js (http://jsfiddle.net/dimitar/n8Dza/). Below are the results for the same "Hi123" password from both.

[Image: jquery.pwstrength.bootstrap result for "Hi123"]

[Image: StrongPass.js result for "Hi123"]

As you can see if BBVACompass had used readily available Open Source tools to verify password strength we wouldn’t be having this conversation. Both reported the password as being unacceptable.

Accountability

As consumers we must hold our online service providers (especially banks) accountable for online security. For the tech-savvy bunch, it's your responsibility to educate your non-tech friends/family about online security and strong passwords.

As a bank, BBVACompass needs to hold its development team accountable for providing customers with secure access to their accounts online, as well as honest indications of password strength. You're being dishonest, which means I cannot trust you.

 

Previous pleas ignored by BBVACompass

I did a search on twitter for @BBVACompass password and discovered that this has been brought up multiple times by customers. The first one I found occurred in November 2013! Tweets below.

[Image: tweet] https://twitter.com/gumnos/status/406880230670749696

[Image: tweet] https://twitter.com/JasonSamfield/status/429448195119136768

[Image: tweet] https://twitter.com/BenLake5/status/477262736208822273

BBVACompass, your customers have spoken and we are demanding better online security. Now is the time to act. I've already closed my account and I'll be advising all friends/family with a BBVACompass account to do the same. With the recent security breaches of Sony/Target/etc. you need to start taking online security more seriously. This blog post and the twitter comments may end up with a few accounts closing, but if your customers experience widespread hacking then it will be much more severe. Fix this before it is too late; this should be your top development priority IMO.

My Promise to BBVACompass

BBVACompass, if you support passwords >= 20 characters with special characters within 90 days, I will re-open my account, with the same funds it had when I closed it, the next time I'm in Florida.

 

Note: please post comments at http://www.reddit.com/r/technology/comments/2o4uat/i_closed_my_bbvacompass_account_because_they/.

Sayed Ibrahim Hashimi @SayedIHashimi

Wednesday, December 03, 2014 7:03:57 AM (GMT Standard Time, UTC+00:00)
Monday, August 11, 2014

SlowCheetah is going into maintenance mode

For the latest info here and for discussion please visit https://github.com/sayedihashimi/slow-cheetah/issues/158.

 

I first developed SlowCheetah around VS2010 SP2 with the idea that I could at some point transition this to the appropriate team(s) at Microsoft. Unfortunately I haven't been able to do that, and the existence of this extension has actually worked against that goal.

I'm really happy that SlowCheetah has gotten the attention and success that it has, but now it's time for me to move on.

No support for SlowCheetah in VS "14"

I am not planning to update SlowCheetah for Visual Studio "14". If you would like to see support for transforms in VS "14" I suggest you vote, and comment, on the uservoice item at http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/2043217-support-web-config-style-transforms-on-any-file-in.

No new features

I will not be adding any features to SlowCheetah myself. If anyone wants to add features, I will try to help guide anyone who is interested.

No fixes for regressions

If there are any scenarios that work in VS2013 RTM that do not work in future versions of Visual Studio then I will not be fixing them.

I hope you all understand my situation here. I have spent countless hours working on SlowCheetah and there is very little ROI for me, so I need to move on to focus on other OSS projects that I'm involved in.

Thanks for all the love. I still love SlowCheetah too, and I'm sad to see there won't be support for transforms in VS "14".

 

Thanks,

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | SlowCheetah | Visual Studio | Monday, August 11, 2014 6:44:36 PM (GMT Daylight Time, UTC+01:00)
Tuesday, July 22, 2014

Stop checking-in binaries, instead create self-bootstrapping scripts

A few weeks ago Mads Kristensen and I created a few site extensions for Azure Web Sites: the Azure Image Optimizer and the Azure Minifier. These extensions can be used to automatically optimize all images on a site and minify all .js/.css files, respectively. They are shipped as NuGet packages on nuget.org as well as site extensions on siteextensions.net.

After creating those utilities we also updated the image optimizer to support being called on the command line via a .exe. We have not yet had a chance to update the minifier to be callable directly, but we have an open issue on it. If you can help, that would be great.

The exe for the image optimizer that can be used from the command line can be found in the NuGet package as well. You can also download it from here, but to get the latest version nuget.org is the way to go.
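If you just want the .exe on its own, you can pull the package down with nuget.exe and grab ImageCompressor.Job.exe from it (this is the same nuget.exe call the script below makes; the output directory here is just an example):

nuget.exe install AzureImageOptimizer -Prerelease -OutputDirectory .\tools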

After releasing that exe I wanted an easy way to use it on a variety of machines, and to make it simple for others to try it out. What I ended up with is what I’m calling a “self-bootstrapping script” which you can find at optimize-images.ps1. Below you’ll see the entire contents of the script.

[cmdletbinding()]
param(
    $folderToOptimize = ($pwd),

    $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

    $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
)

<#
.SYNOPSIS
    If nuget.exe is not in the tools folder then it will be downloaded there.
#>
function Get-Nuget(){
    [cmdletbinding()]
    param(
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        $nugetDestPath = Join-Path -Path $toolsDir -ChildPath nuget.exe
        
        if(!(Test-Path $nugetDestPath)){
            'Downloading nuget.exe' | Write-Verbose
            (New-Object System.Net.WebClient).DownloadFile($nugetDownloadUrl, $nugetDestPath)

            # double check that it was written to disk
            if(!(Test-Path $nugetDestPath)){
                throw 'unable to download nuget'
            }
        }

        # return the path of the file
        $nugetDestPath
    }
}

<#
.SYNOPSIS
    If the image optimizer is not in the tools folder then it will be downloaded there.
#>
function GetImageOptimizer(){
    [cmdletbinding()]
    param(
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),
        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        
        if(!(Test-Path $toolsDir)){
            New-Item $toolsDir -ItemType Directory | Out-Null
        }

        $imgOptimizer = (Get-ChildItem -Path $toolsDir -Include 'ImageCompressor.Job.exe' -Recurse)

        if(!$imgOptimizer){
            'Downloading image optimizer to the .tools folder' | Write-Verbose
            # nuget install AzureImageOptimizer -Prerelease -OutputDirectory C:\temp\nuget\out\
            $cmdArgs = @('install','AzureImageOptimizer','-Prerelease','-OutputDirectory',(Resolve-Path $toolsDir).ToString())

            'Calling nuget to install image optimzer with the following args. [{0}]' -f ($cmdArgs -join ' ') | Write-Verbose
            &(Get-Nuget -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl) $cmdArgs | Out-Null
        }

        $imgOptimizer = Get-ChildItem -Path $toolsDir -Include 'ImageCompressor.Job.exe' -Recurse | select -first 1
        if(!$imgOptimizer){ throw 'Image optimizer not found' }       

        $imgOptimizer
    }
}

function OptimizeImages(){
    [cmdletbinding()]
    param(
        [Parameter(Mandatory=$true,Position=0)]
        $folder,
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),
        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{        
        [string]$imgOptExe = (GetImageOptimizer -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl)

        [string]$folderToOptimize = (Resolve-path $folder)

        'Starting image optimizer on folder [{0}]' -f $folder | Write-Host
        # .\.tools\AzureImageOptimizer.0.0.10-beta\tools\ImageCompressor.Job.exe --folder M:\temp\images\opt\to-optimize
        $cmdArgs = @('--folder', $folderToOptimize)

        'Calling img optimizer with the following args [{0} {1}]' -f $imgOptExe, ($cmdArgs -join ' ') | Write-Host
        &$imgOptExe $cmdArgs

        'Images optimized' | Write-Host
    }
}

OptimizeImages -folder $folderToOptimize -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl

The script is set up so that you call functions like Get-Nuget and GetImageOptimizer to get the path to the .exe to call. If the .exe is not in the expected location, under %localappdata% by default, it will be downloaded and then the path will be returned. In the case of this script I use nuget.org as my primary distribution mechanism, so the script will first download nuget.exe and then use that to get the actual binaries. With this approach you can avoid checking in binaries and still have scripts which are pretty concise.
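To run it you just invoke the script, optionally pointing it at a folder (the parameter names come from the param block at the top of the script; the path here is made up):

# optimize every image under the current directory
.\optimize-images.ps1

# or point it at a specific folder
.\optimize-images.ps1 -folderToOptimize 'C:\projects\mysite\images'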

After creating optimize-images.ps1 I thought it would be really useful to have a similar script to execute XDT transforms on xml files. So I created transform-xml.ps1. That script first downloads nuget.exe and then uses that to download the nuget packages which are required to invoke XDT transforms.

A self-bootstrapping script doesn’t need to be a PowerShell script, you can apply the same techniques to any scripting language. I’ve recently created an MSBuild script, inspired by Get-Nuget above, which can be used in a similar way. You can find that script in a gist here. It’s below as well.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="GetNuget">

  <!--
  This is an MSBuild snippet that can be used to download nuget to the path
  in the NuGetExePath property.
  
  Usage:
   1. Import this file or copy and paste this into your build script
   2. Call the GetNuGet target before you use nuget.exe from $(NuGetExePath)
  -->
  
  <PropertyGroup>
    <NuGetExePath Condition=" '$(NuGetExePath)'=='' ">$(localappdata)\LigerShark\AzureJobs\tools\nuget.exe</NuGetExePath>
    <NuGetDownloadUrl Condition=" '$(NuGetDownloadUrl)'=='' ">http://nuget.org/nuget.exe</NuGetDownloadUrl>
  </PropertyGroup>
  
  <Target Name="GetNuget" Condition="!Exists('$(NuGetExePath)')">
    <Message Text="Downloading nuget from [$(NuGetDownloadUrl)] to [$(NuGetExePath)]" Importance="high"/>
    <ItemGroup>
      <_nugetexeitem Include="$(NuGetExePath)" />
    </ItemGroup>
    <MakeDir Directories="@(_nugetexeitem->'%(RootDir)%(Directory)')"/>
    <DownloadFile
        Address="$(NuGetDownloadUrl)"
        FileName="$(NuGetExePath)" />    
  </Target>

  <PropertyGroup Condition=" '$(ls-msbuildtasks-path)'=='' ">
    <ls-msbuildtasks-path>$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll</ls-msbuildtasks-path>
    <ls-msbuildtasks-path Condition=" !Exists('$(ls-msbuildtasks-path)')">$(MSBuildFrameworkToolsPath)\Microsoft.Build.Tasks.v4.0.dll</ls-msbuildtasks-path>
    <ls-msbuildtasks-path Condition=" !Exists('$(ls-msbuildtasks-path)')">$(windir)\Microsoft.NET\Framework\v4.0.30319\Microsoft.Build.Tasks.v4.0.dll</ls-msbuildtasks-path>
  </PropertyGroup>
  
  <UsingTask TaskName="DownloadFile" TaskFactory="CodeTaskFactory" AssemblyFile="$(ls-msbuildtasks-path)">
    <!-- http://stackoverflow.com/a/12739168/105999 -->
    <ParameterGroup>
      <Address ParameterType="System.String" Required="true"/>
      <FileName ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
      <Reference Include="System" />
      <Code Type="Fragment" Language="cs">
        <![CDATA[
            new System.Net.WebClient().DownloadFile(Address, FileName);
        ]]>
      </Code>
    </Task>
  </UsingTask>
  
</Project>

This script has a single target, GetNuGet, which you can call to download nuget.exe to the expected location. After that you can use the path to nuget.exe from the NuGetExePath property. I've already removed nuget.exe from the SideWaffle and AzureJobs repositories using this technique. It's a great way to avoid checking in nuget.exe.
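If you want to try the target on its own before wiring it into a build script, you can invoke it directly. This assumes you saved the snippet as getnuget.targets and that msbuild.exe is on your path:

msbuild.exe .\getnuget.targets /t:GetNuget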

 

Thanks,
Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | powershell | scripting | Tuesday, July 22, 2014 7:36:11 AM (GMT Daylight Time, UTC+01:00)
Saturday, July 19, 2014

Introducing PSBuild – an improved interface for msbuild.exe in PowerShell

For the past few months I’ve been working on a project I’m calling PSBuild. It’s an open source project on GitHub which makes the experience of calling MSBuild from PowerShell better. Getting started with PSBuild is really easy. You can install it in one line.

(new-object Net.WebClient).DownloadString("https://raw.github.com/ligershark/psbuild/master/src/GetPSBuild.ps1") | iex

You can find this info on the project home page as well.

 

When you install PSBuild one of the functions that you get is Invoke-MSBuild. When you call Invoke-MSBuild it will end up calling msbuild.exe. Some of the advantages of using Invoke-MSBuild are covered in the sections below.

Calling Invoke-MSBuild

The most basic usage of Invoke-MSBuild is shown below.

Invoke-MSBuild .\App.csproj

This will build the project using the default targets. The call to msbuild.exe on my computer is below.

C:\Program Files (x86)\MSBuild\12.0\bin\amd64\msbuild.exe 
    .\App.csproj 
    /m 
    /clp:v=m
    /flp1:v=d;logfile=C:\Users\Sayed\AppData\Local\PSBuild\logs\App.csproj-log\msbuild.detailed.log 
    /flp2:v=diag;logfile=C:\Users\Sayed\AppData\Local\PSBuild\logs\App.csproj-log\msbuild.diagnostic.log

From the call to msbuild.exe you can see that /m is passed, as well as a couple of file loggers writing to %localappdata%. We will get to the logs later.

To get a sense for how you can use Invoke-MSBuild take a look at the examples below.

# VisualStudioVersion and Configuration MSBuild properties have easy to use parameters
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -configuration Release

# How to pass properties
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -properties @{'DeployOnBuild'='true';'PublishProfile'='toazure';'Password'='mypwd-really'}

# How to build a single target
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -targets Clean

# How to build multiple targets
Invoke-MSBuild .\App.csproj -visualStudioVersion 12.0 -targets @('Clean','Build')

 

Getting log files

When you call Invoke-MSBuild on a project or solution the output will look something like the following.

[Image: Invoke-MSBuild console output]

Notice the line at the end. You can open your last log using the command below.

Open-PSBuildLog

This will open the detailed log of the previous build in the program that’s associated with the .log extension. You can also use the Get-PSBuildLastLogs function to view the path for both log files written. If you want to view the folder where the log files are written to you can execute start (Get-PSBuildLogDirectory).
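Put together, the log-related helpers look like this:

Open-PSBuildLog                    # open the detailed log of the last build
Get-PSBuildLastLogs                # show the paths of both log files that were written
start (Get-PSBuildLogDirectory)    # open the folder the logs are written to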

Helper functions

There are a couple of things that I constantly have to look up when I'm authoring MSBuild files: reserved property names and escape characters. PSBuild has a helper function for each of these: Get-MSBuildEscapeCharacters and Get-MSBuildReservedProperties. In the screenshot below you'll see the result of executing each of these.

[Image: output of Get-MSBuildEscapeCharacters and Get-MSBuildReservedProperties]

 

Default Properties

The Invoke-MSBuild cmdlet has a -defaultProperties parameter. You can pass in a hashtable just like the -properties parameter. These properties are applied as environment variables before the call to msbuild.exe and then reverted afterwards. The effect is that you can have a property value which will be used if no other value for that property is specified in MSBuild.
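For example, the call below (a sketch following the same pattern as the -properties examples above) makes Release the fallback value for Configuration; it only applies if nothing else sets that property.

Invoke-MSBuild .\App.csproj -defaultProperties @{'Configuration'='Release'}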

 

There is so much more to PSBuild. This is just the tip of the iceberg. Keep an eye on the project page for more info. I’d love your help on this project. Please consider contributing to the project https://github.com/ligershark/psbuild.

 

Thanks,
Sayed Ibrahim Hashimi | http://msbuildbook.com/ | @SayedIHashimi

msbuild | psbuild | Saturday, July 19, 2014 6:20:43 AM (GMT Daylight Time, UTC+01:00)
Friday, May 09, 2014

How to download a site using msdeploy.exe

Recently a customer asked if it's possible to download a site using msdeploy.exe. It is, and it's pretty easy. I'll demonstrate this with Microsoft Azure Web Sites, but you can use this with any hosting provider that supports Web Deploy (aka MSDeploy).

To perform a sync with msdeploy.exe the structure of the command that we need to execute is as follows.

msdeploy.exe -verb:sync -source:<fill-in-details> -dest:<fill-in-details>

For the source property we will use the remote Azure Web Site, and for the dest property we will write to a folder on the local file system. You can get the Web Deploy publishing settings in the Azure Web Site by clicking on the Download the publish profile link in the Configure page.

[Image: Download the publish profile link on the Configure page]

This will download an XML file that has all the publish settings you’ll need. For example below is a publish settings file for my demo site.

<publishData>
  <publishProfile profileName="sayeddemo2 - Web Deploy" publishMethod="MSDeploy" 
                  publishUrl="waws-prod-bay-001.publish.azurewebsites.windows.net:443" 
                  msdeploySite="sayeddemo2" 
                  userName="$sayeddemo2" 
                  userPWD="*** removed ***" 
                  destinationAppUrl="http://sayeddemo2.azurewebsites.net" 
                  SQLServerDBConnectionString="" mySQLDBConnectionString="" hostingProviderForumLink="" 
                  controlPanelLink="http://windows.azure.com">
    <databases/>
  </publishProfile>
  <publishProfile profileName="sayeddemo2 - FTP" publishMethod="FTP" 
                  publishUrl="ftp://waws-prod-bay-001.ftp.azurewebsites.windows.net/site/wwwroot" ftpPassiveMode="True" 
                  userName="sayeddemo2\$sayeddemo2" userPWD="*** removed ***" 
                  destinationAppUrl="http://sayeddemo2.azurewebsites.net" SQLServerDBConnectionString="" 
                  mySQLDBConnectionString="" hostingProviderForumLink="" 
                  controlPanelLink="http://windows.azure.com">
    <databases/>
  </publishProfile>
</publishData>

The publish settings file provided by Azure Web Sites has two profiles: an MSDeploy profile and an FTP profile. We can ignore the FTP profile and just use the MSDeploy one. The relevant settings from that profile are the following values.

  1. publishUrl: the Web Deploy endpoint (waws-prod-bay-001.publish.azurewebsites.windows.net:443 in the profile above)
  2. msdeploySite: the name of the site (sayeddemo2)
  3. userName and userPWD: the publishing credentials

We will use the contentPath MSDeploy provider to download the files. On the source parameter we will need to include the relevant details of the remote machine. The full command to execute is below. I’ll break it down a bit after the snippet.

"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" 
    -verb:sync 
    -source:contentPath=sayeddemo2,
        ComputerName="https://waws-prod-bay-001.publish.azurewebsites.windows.net/msdeploy.axd?site=sayeddemo2",
        UserName=$sayeddemo2,Password=***removed***,AuthType='Basic'  
    -dest:contentPath=c:\temp\pubfromazure -disablerule:BackupRule

The important parts of the command above are how the remote settings are passed to the source provider. On the dest side I’ve provided the location where the files should be downloaded to.

 

Sayed Ibrahim Hashimi | http://msbuildbook.com/ | @SayedIHashimi

Friday, May 09, 2014 12:02:02 AM (GMT Daylight Time, UTC+01:00)
Friday, October 11, 2013

SideWaffle: How to create your own VS template pack

If you haven’t heard I’m working on a project with Mads Kristensen called SideWaffle. SideWaffle is a Visual Studio Extension which contains many different Visual Studio Item and Project Templates. This is a community effort and all open source at https://github.com/ligershark/template-builder. You can create your own Item Templates and Project Templates and send a Pull Request for them to be included in the main repo. Check out the video below for more info on SideWaffle.

SideWaffle intro video

Item Templates are used by VS developers to create files using the Add New Item dialog. SideWaffle already contains a number of Item Templates such as Angular Controller, robots.txt, SignalR Hub and Client, etc. For more info on how to create Item Templates with SideWaffle watch the 4 minute video below.

 

Project Templates are the items that show up in the Add New Project dialog. They provide a starting point for developers creating new projects. SideWaffle already has a few project templates as well, such as a Google Chrome Extension. You can learn more about how to create Project Templates in this video.

 

Now that we’ve gotten the intro out of the way, let’s explore how you can create your own SideWaffle.

How to create your own SideWaffle

The idea behind SideWaffle is that we will have a shared VS extension for popular VS Item and Project Templates. Instead of contributing to the main SideWaffle project you may be interested in creating your own distribution that does not have the standard templates. For example, I've heard from both the Orchard and Umbraco communities that they are interested in creating template packs of their own. It wouldn't make much sense to include those templates in the main SideWaffle project. Instead it would be best to create a separate distribution for each; OrchardWaffle and UmbracoWaffle.

So how can you do this? It's pretty easy actually. SideWaffle is built on top of a NuGet package, TemplateBuilder, which is also open source at https://github.com/ligershark/template-builder. All the core functionality of SideWaffle is contained in that NuGet package. To create your own SideWaffle follow these steps:

  1. Create a new Visual Studio extension (VSIX) project
  2. Add the TemplateBuilder NuGet package to the project (see the command below)
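From the Package Manager Console the install is a one-liner (this assumes the package id TemplateBuilder as published on nuget.org; add -Pre if only a prerelease version is available):

Install-Package TemplateBuilder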

After you add the TemplateBuilder NuGet package a few things happen:

  1. The build process of the project is modified to support building Item and Project templates
  2. Your .vsixmanifest file is updated with two new Asset tags
  3. An ItemTemplates folder is created with a sample Item Template

From here you can build the project, and after installing the generated .vsix users can easily create instances of your item or project templates.

You can add additional Item Templates, as well as create Project Templates in your project. That’s pretty much all there is to getting started with your own Waffle pack.

Let me know if you have any issues or comments.

Happy Waffleing!

 

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

extensibility | SideWaffle | Visual Studio | Visual Studio 2012 | VSIX | Friday, October 11, 2013 5:11:26 PM (GMT Daylight Time, UTC+01:00)
Saturday, September 21, 2013

How to extend the web publish process without modifying project contents

When automating web publishing for Visual Studio projects, in many cases your first step will be to create a publish profile for the project in VS. There are many cases in which you cannot, or would rather not, do this. In this post I'll show you how you can take an existing project and use an MSBuild file to drive the publish process. Along the way I'll also show how you can extend the publish process without modifying either the project or any of its contents.

Before we get too far into this, if you are not familiar with how to publish your VS web projects from the command line you can read our docs at http://www.asp.net/mvc/tutorials/deployment/visual-studio-web-deployment/command-line-deployment.

When you publish a project from the command line using a publish profile you typically use the syntax below.

msbuild .\MyProject.csproj /p:VisualStudioVersion=11.0 /p:DeployOnBuild=true /p:PublishProfile=<profile-name-or-path> 

In this snippet we are passing in a handful of properties. VisualStudioVersion dictates which version of MSBuild targets are used during the build. See http://sedodream.com/2012/08/19/VisualStudioProjectCompatabilityAndVisualStudioVersion.aspx for more details on that. DeployOnBuild=true injects the publish process at the end of build. PublishProfile can either be the name of a publish profile which the project contains or it can be the full path to a publish profile. We will use PublishProfile with that second option, the full path.

So we need to pass in the full path to a publish profile, which typically is a .pubxml file. A publish profile is just an MSBuild file. When you pass in PublishProfile and DeployOnBuild=true, then the publish profile is Imported into the build/publish process. It will supply the publish properties needed to perform the publish.

Let’s see how that works. I have a sample project, MySite, which does not have any publish profiles created for it. I have created a publish profile, ToFileSys.pubxml, in another folder that will be used though. The contents of that file are below.

ToFileSys.pubxml

<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>FileSystem</WebPublishMethod>
    <ExcludeApp_Data>False</ExcludeApp_Data>
    <publishUrl>C:\temp\Publish\01</publishUrl>
    <DeleteExistingFiles>False</DeleteExistingFiles>
  </PropertyGroup>
</Project>

This publish profile will publish to a local folder. I just created this file in VS with a different project, copied it to the folder that I needed, and removed the properties which are only used for the inside-VS experience. We can publish the MySite project using this profile with the command below.

msbuild MySite.csproj 
            /p:VisualStudioVersion=11.0 
            /p:DeployOnBuild=true 
            /p:PublishProfile=C:\data\my-code\publish-samples\publish-injection\build\ToFileSys.pubxml 

When you execute this, the file specified in PublishProfile will be imported into the build process.

Taking it up a level


Now let’s see how we can take this to the next level by having a single script that will be used to publish more than one project using this technique.

In the sample files (you can find links for them at the end of the post) I have a solution with two web projects, MySite and MyOtherSite. Neither of these projects has any publish profiles created. I have created a script which will build/publish these projects, which you can find at build\publish.proj in the samples. The contents of the file are shown below.

publish.proj

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildProjects">
  <!-- 
  This file is used in two ways.
    1. Drive the build and publish process
    2. It is used by the publish process during the build of MySite to configure/extend publish
        Note: 1. Is kicked off by the use on the cmd line/build server. 2. Is invoked by this script itself.
              This file is injected into the publish process via the PublishProfile property.
  -->  
  <PropertyGroup>
    <VisualStudioVersion Condition=" '$(VisualStudioVersion)'=='' ">11.0</VisualStudioVersion>
    <Configuration Condition=" '$(Configuration)'=='' ">Release</Configuration>
    <!-- Location for build output of the project -->
    <OutputRoot Condition=" '$(OutputRoot)'=='' ">$(MSBuildThisFileDirectory)..\BuildOutput\</OutputRoot>
    <!-- Root for the publish output -->
    <PublishFolder Condition=" '$(PublishFolder)'==''">C:\temp\Publish\Output\</PublishFolder>    
  </PropertyGroup>
  
  <ItemGroup>
    <ProjectsToBuild Include="$(MSBuildThisFileDirectory)..\MySite\MySite.csproj">
      <AdditionalProperties>
        VisualStudioVersion=$(VisualStudioVersion);
        Configuration=$(Configuration);
        OutputPath=$(OutputRoot);
        WebPublishMethod=FileSystem;
        publishUrl=$(PublishFolder)MySite\;
        DeployOnBuild=true;
        DeployTarget=WebPublish;
        PublishProfile=$(MSBuildThisFileFullPath)
      </AdditionalProperties>
    </ProjectsToBuild>

    <ProjectsToBuild Include="$(MSBuildThisFileDirectory)..\MyOtherSite\MyOtherSite.csproj">
      <AdditionalProperties>
        VisualStudioVersion=$(VisualStudioVersion);
        Configuration=$(Configuration);
        OutputPath=$(OutputRoot);
        WebPublishMethod=FileSystem;
        publishUrl=$(PublishFolder)MyOtherSite\;
        DeployOnBuild=true;
        DeployTarget=WebPublish;
        PublishProfile=$(MSBuildThisFileFullPath)
      </AdditionalProperties>
    </ProjectsToBuild>
  </ItemGroup>
  
  <Target Name="BuildProjects">
    <MSBuild Projects="@(ProjectsToBuild)" />
  </Target>

  <!-- ***************************************************************************************
 The targets below will be called during the publish process.
 These targets are injected into the publish process for each web project.
 These targets will not have access to any new values for properties/items from
 the targets above this.
 *************************************************************************************** -->

  <Target Name="AfterWebPublish" AfterTargets="WebPublish">
    <Message Text="Inside AfterWebPublish" Importance="high"/>
  </Target>
  
</Project>

This file is pretty simple: it declares some properties which will be used for the build/publish process. Then it declares the projects to be built with an item list named ProjectsToBuild. When declaring ProjectsToBuild I use the AdditionalProperties metadata to specify MSBuild properties to be used during the build process for each project. Let's take a closer look at those properties.

<AdditionalProperties>
  VisualStudioVersion=$(VisualStudioVersion);
  Configuration=$(Configuration);
  OutputPath=$(OutputRoot);
  WebPublishMethod=FileSystem;
  publishUrl=$(PublishFolder)MySite\;
  DeployOnBuild=true;
  DeployTarget=WebPublish;
  PublishProfile=$(MSBuildThisFileFullPath)
</AdditionalProperties>

I’ll explain all the properties now. VisualStudioVersion, Configuration and OutputPath are all used for the build process. The other properties are related to publishing. If you want to publish from the file system those properties (WebPublishMethod, publishUrl, DeployOnBuild, and DeployTarget) must be set. The most important property there is PublishProfile.

PublishProfile is set to $(MSBuildThisFileFullPath) which is the full path to publish.proj. This will instruct the build process of that project to import publish.proj when its build/publish process is started. It’s important to note that a “new instance” of the file will be imported. What that means is that the imported version of publish.proj won’t have access to any dynamic properties/items created in publish.proj.

The reason why PublishProfile is specified there is so that we can extend the publish process from within publish.proj itself. publish.proj has a target, AfterWebPublish, which will be executed after each project is published. Let's see how this works.

We can execute the publish process with the command below.

msbuild .\build\publish.proj /p:VisualStudioVersion=11.0

After executing this command the tail end of the result is shown in the image below.

[Image: tail end of the msbuild.exe output]

In the image above you can see that the MyOtherSite project is being published to the location specified in publish.proj, and that the AfterWebPublish target is executed as well.

 

In this post we’ve seen how we can use an MSBuild file as a publish profile, and how to extend the publish process using that same file.

You can download the samples at https://dl.dropboxusercontent.com/u/40134810/blog/publish-injection.zip. You can find the latest version in my publish-samples repository at publish-injection.

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | Visual Studio | Visual Studio 2012 | web | Web Publishing Pipeline | Saturday, September 21, 2013 7:57:03 PM (GMT Daylight Time, UTC+01:00)
Saturday, June 08, 2013

Introducing VsixCompress–a NuGet package to compress your Visual Studio Package

A few weeks ago I blogged about a Visual Studio extension, Farticus, which I’m working on with Mads Kristensen. In that post I described how the default compression of a .vsix (the artifact that is created for a Visual Studio Package) is not as small as it could be. It’s better to get a fully compressed VSIX because when users install the component the download time can be significantly reduced. In that post I described how you could use the Zip task from the MSBuild Extension Pack to have a fully compressed .vsix file. I will now show you how I’ve simplified this.

VsixCompress

Since my previous post I've created a NuGet package, VsixCompress, which simplifies this greatly. If you have an existing Visual Studio package and want a fully compressed .vsix file then all you need to do is install the VsixCompress package.

[Image: installing the VsixCompress NuGet package]
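If you prefer the Package Manager Console, the install is a one-liner (this assumes the package id VsixCompress as published on nuget.org):

Install-Package VsixCompress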

 

After you install this package the following happens.

  1. NuGet package is downloaded and installed to the packages folder
  2. The project is edited to include an import for a new .targets file
  3. The build process is extended to compress the .vsix file automatically

After installing this package, once you build, the generated .vsix is much smaller than before. In the default case where you create a new C# VS Package the created .vsix is 17kb. After adding VsixCompress the resulting .vsix is only 9kb. That's almost half the size. That's all you need to know for local development. If you have a build server setup then there are a couple of additional steps. Let's go over those now.

Build Server Support

I have blogged before about issues of shipping build updates in NuGet. To briefly summarize, when leveraging NuGet Package Restore you have to be careful if any of those NuGet packages have build updates. When using Package Restore the NuGet packages which contain the imported .targets file(s) are restored after the build starts. What this means is that the .targets files will never be imported (or an old copy is imported in the case the file exists from a previous build). The only way to work around this is to restore the packages before the .sln/.csproj file themselves are built. You can read the full details at http://sedodream.com/2012/12/24/SlowCheetahBuildServerSupportUpdated.aspx. I have a NuGet package, PackageRestore, which can help here. Take a look at my previous post How to simplify shipping build updates in a NuGet package. Now that we’ve discussed all the details that you need let’s discuss what my plans are going forward with this.

Plans going forward

I'm hoping to add more features over the course of the next few weeks.

FYI VsixCompress is open source so you can take a look at the code, or better yet contribute at https://github.com/sayedihashimi/vsix-compress.

Please let me know what you think of this!

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

extensibility | Visual Studio | VSIX | Saturday, June 08, 2013 1:42:57 AM (GMT Daylight Time, UTC+01:00)
Thursday, June 06, 2013

How to simplify shipping build updates in a NuGet package

Have you ever wanted to ship build updates in a NuGet package? If so then you may have noticed that there are issues if you need to import a .targets file which is contained in a NuGet package. To be brief, the issue is that by the time the .targets file is being restored with NuGet it’s already too late because the build has already started. So we need an easy way to invoke NuGet package restore for a given project before the build for the solution/project starts. In this entry you will see how we can use a generated build script to help with this.

A few months back I discovered this issue and blogged about it at SlowCheetah build server support updated. The solution that I implemented was to create a new build script, packageRestore.proj, which is placed in the root of the project when the SlowCheetah NuGet package is installed. If you want to invoke the package restore from a build server then you can just add packageRestore.proj to the list of items to build before the .sln/.csproj file which you are intending to build. I’ve now refactored this into its own NuGet package, PackageRestore.

 

Here is how you can try it out.

  1. Create a new project
  2. Add a few NuGet packages to the project
  3. Enable NuGet package restore
  4. Add the PackageRestore package (the command you can use is Install-Package PackageRestore -pre)
  5. Close solution
  6. Delete packages folder
  7. Execute the command msbuild.exe packageRestore.proj

After executing the command you will see that the packages folder along with all its content will be restored. For CI servers when you define your build script all you do is build packageRestore.proj before you build the target .sln/.csproj file.
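In other words, a CI build definition boils down to two calls in this order (the solution name here is a made-up placeholder):

msbuild.exe .\packageRestore.proj
msbuild.exe .\MySolution.sln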

If you are shipping build updates through NuGet and want to be able to enable package restore in a similar way to SlowCheetah, then when you create your NuGet package add PackageRestore as a dependency.

 

This is an open source project so you can view the source, and help contribute, at https://github.com/sayedihashimi/package-restore.

 

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | nuget | PackageRestore | Thursday, June 06, 2013 6:57:37 AM (GMT Daylight Time, UTC+01:00)
Wednesday, June 05, 2013

How to publish a VS web project with a .publishSettings file

The easiest way to publish a Visual Studio web project from the command line is to follow the steps below.

  1. Open VS
  2. Right click on the Web project and select Publish
  3. Import your .publishSettings file (or manually configure the settings)
  4. Save the profile (i.e. .pubxml file)
  5. From the command line, publish by passing in the PublishProfile property

For more details on this you can see ASP.NET Command Line Deployment. This is pretty simple and very easy, but it does require that you manually create the .pubxml file. In some cases you’d just like to download the .publishsettings file from your hosting provider and use that from the command line. This post will show you how to do this.

In order to achieve this goal we will need to extend the build/publish process. There are two simple ways to do this: 1. place a .wpp.targets file in the same directory as the web project, or 2. pass an additional property indicating the location of the .wpp.targets file. I'll first go over the technique where you place the file directly inside the project's directory. After that I'll show you how to use the file from a well known location.

One way to do this is to create a .wpp.targets file. This .wpp.targets file will be imported into the build/publish process automatically. This .targets file will enable us to pass in PublishSettingsFile as an MSBuild property. It will then read the .publishsettings file and output the properties needed to execute the publish process.

.wpp.targets in the project directory

Let's take a look at the .targets file and then we will discuss its contents. Below you will find the contents of the full file.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> 
  
  <!--
  When using this file you must supply /p:PublishSettingsFile as a parameter and /p:DeployOnBuild=true
  -->  
  <PropertyGroup Condition=" Exists('$(PublishSettingsFile)')">
    <!-- These must be declared outside of a Target because they impact the Import Project flow -->
    <WebPublishMethod>MSDeploy</WebPublishMethod>
    <DeployTarget>WebPublish</DeployTarget>
  
    <PipelineDependsOn>
      GetPublishPropertiesFromPublishSettings;
      $(PipelineDependsOn);
    </PipelineDependsOn>
  </PropertyGroup>

  <Target Name="GetPublishPropertiesFromPublishSettings" BeforeTargets="Build" Condition=" Exists('$(PublishSettingsFile)')">
    <PropertyGroup>
      <_BaseQuery>/publishData/publishProfile[@publishMethod='MSDeploy'][1]/</_BaseQuery>
      <!-- This value is not in the .publishSettings file and needs to be specified, it can be overridden with a cmd line parameter -->
      <!-- If you are using the Remote Agent then specify this as RemoteAgent -->
      <MSDeployPublishMethod>WMSVC</MSDeployPublishMethod>
    </PropertyGroup>

    <ItemGroup>
      <_MSDeployXPath Include="WebPublishMethod">
        <Query>$(_BaseQuery)@publishMethod</Query>
      </_MSDeployXPath>

      <_MSDeployXPath Include="MSDeployServiceURL">
        <Query>$(_BaseQuery)@publishUrl</Query>
      </_MSDeployXPath>

      <_MSDeployXPath Include="SiteUrlToLaunchAfterPublish">
        <Query>$(_BaseQuery)@destinationAppUrl</Query>
      </_MSDeployXPath>

      <_MSDeployXPath Include="DeployIisAppPath">
        <Query>$(_BaseQuery)@msdeploySite</Query>
      </_MSDeployXPath>

      <_MSDeployXPath Include="UserName">
        <Query>$(_BaseQuery)@userName</Query>
      </_MSDeployXPath>

      <_MSDeployXPath Include="Password">
        <Query>$(_BaseQuery)@userPWD</Query>
      </_MSDeployXPath>
    </ItemGroup>

    <XmlPeek XmlInputPath="$(PublishSettingsFile)"
             Query="%(_MSDeployXPath.Query)"
             Condition=" Exists('$(PublishSettingsFile)')">
      <Output TaskParameter="Result" PropertyName="%(_MSDeployXPath.Identity)"/>
    </XmlPeek>
  </Target>
</Project>

You can place this file in the root of your project (next to the .csproj/.vbproj file) with the name {ProjectName}.wpp.targets.

This .targets file is pretty simple. It defines a couple of properties and a single target, GetPublishPropertiesFromPublishSettings. In order to publish your project from the command line you would execute the command below.

msbuild.exe MyProject.csproj /p:VisualStudioVersion=11.0 /p:DeployOnBuild=true /p:PublishSettingsFile=<path-to-.publishsettings>

Here is some info on the properties that are being passed in.

The VisualStudioVersion property indicates that we are using the VS 2012 targets. More info on this at http://sedodream.com/2012/08/19/VisualStudioProjectCompatabilityAndVisualStudioVersion.aspx.

DeployOnBuild, when true it indicates that we want to publish the project. This is the same property that you would normally pass in.

PublishSettingsFile, this is a new property which the .targets file recognizes. It should be set to the path of the .publishSettings file.

 

The properties at the top of the .targets file (WebPublishMethod and DeployTarget) indicate what type of publish operation is happening. The default values for those are MSDeploy and WebPublish respectively. You shouldn’t need to change those, but if you do you can pass them in on the command line.

The GetPublishPropertiesFromPublishSettings target uses the XmlPeek task to read the .publishsettings file. It then emits the properties required for publishing.

OK, this is great and all, but it still requires that an additional file (the .wpp.targets file) be placed in the directory of the project. It is possible to avoid this as well. Let's move on to that.

.wpp.targets from a well known location

If you’d like to avoid having to place the .wpp.targets file in the directory of the project you can easily do this. Place the file in a well known location and then execute the same msbuild.exe call adding an additional property. See the full command below.

msbuild.exe MyProject.csproj /p:VisualStudioVersion=11.0 /p:DeployOnBuild=true /p:PublishSettingsFile=<path-to-.publishsettings> /p:WebPublishPipelineCustomizeTargetFile=<path-to.targets-file>

Once you do this you no longer need to create the Publish Profile in VS if you want to publish from the command line with a .publishsettings file.

 

FYI you can find the complete sample at https://github.com/sayedihashimi/publish-samples/tree/master/PubWithPublishSettings.

 

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

msbuild | web | Web Publishing Pipeline | Wednesday, June 05, 2013 6:01:34 PM (GMT Daylight Time, UTC+01:00)