If you’ve ever created an MSDeploy web package using Visual Studio you may have noticed that the generated package contains the folder structure of the location where the application was packaged. Since MSDeploy parameters are used when installing the package, in most cases the structure inside the web package doesn’t matter. In some cases, though, it causes problems, and a flatter package structure would be preferable.

Today on twitter @ashic contacted me asking, essentially, “How can I update the package folder structure without modifying the project?” It is possible, but first I’ll explain a couple of easy ways to solve the problem and then move on to his actual question.

Option 1: Add the PackageWeb NuGet package

The easiest way to fix this problem is to add the PackageWeb NuGet package to your project (that’s a package I’ve authored; you can see the sources here). This will update the package to use a flat structure and will also add a .ps1 file when the package is created. You can ignore the .ps1 file if you like.
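If you use the NuGet Package Manager Console in Visual Studio the install is a one-liner (this assumes the NuGet package id is PackageWeb).

Install-Package PackageWeb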

Option 2: Add package.wpp.targets to your project

When you build a web project, any file in the same folder as the .csproj/.vbproj file that matches the pattern *.wpp.targets is automatically imported. To fix the issue, create a file named package.wpp.targets next to the project file with the following contents. Note: you can find the latest version of this in this gist.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <PropertyGroup>
    <PackagePath Condition=" '$(PackagePath)'=='' ">website</PackagePath>
    <EnableAddReplaceToUpdatePackagePath Condition=" '$(EnableAddReplaceToUpdatePackagePath)'=='' ">true</EnableAddReplaceToUpdatePackagePath>

    <PackageDependsOn>
      $(PackageDependsOn);
      AddReplaceRuleForAppPath;
    </PackageDependsOn>
  </PropertyGroup>

  <Target Name="AddReplaceRuleForAppPath" Condition=" '$(EnableAddReplaceToUpdatePackagePath)'=='true' ">
    <PropertyGroup>
      <_PkgPathFull Condition=" '$(WPPAllFilesInSingleFolder)'!='' ">$([System.IO.Path]::GetFullPath($(WPPAllFilesInSingleFolder)))</_PkgPathFull>
      <_PkgPathFull Condition=" '$(_PkgPathFull)' == '' ">$([System.IO.Path]::GetFullPath($(_PackageTempDir)))</_PkgPathFull>
    </PropertyGroup>

    <!-- escape the path so it can be used as a regular expression -->
    <EscapeTextForRegularExpressions Text="$(_PkgPathFull)">
      <Output TaskParameter="Result" PropertyName="_PkgPathRegex" />
    </EscapeTextForRegularExpressions>

    <!-- add a replace rule so the temp packaging path is swapped for $(PackagePath) -->
    <ItemGroup>
      <MsDeployReplaceRules Include="replaceFullPath">
        <Match>$(_PkgPathRegex)</Match>
        <Replace>$(PackagePath)</Replace>
      </MsDeployReplaceRules>
    </ItemGroup>
  </Target>
</Project>

Then when you build this project the package path will consist of Content\website, and all content files will be under that.

Now on to his question: “How can I update the package path for a project without modifying the project/project files when building from the command line?”

Now that we have the MSBuild .targets file to do the work for us, the only thing left to figure out is how to add that .targets file into the build process when calling msbuild.exe myproject.csproj /t:Package. It’s pretty easy actually. You can take the package.wpp.targets file and drop it in a well-known location (let’s say c:\msbuild\package.targets for this example). Then when you build your project you pass a property to get that file imported. The command is below.

msbuild myproject.csproj /t:Package /p:CustomAfterMicrosoftCommonTargets=c:\msbuild\package.targets  

Microsoft.Common.targets (which is imported by most project types) contains a property, CustomAfterMicrosoftCommonTargets, which defaults to a location under Program Files. You can override that value via an MSBuild property on the command line, which is what we are doing here. Note: if you already have a custom targets file in the default shared location then this obviously will not work for you; you’d have to add another conditional import, driven by a new property, for that case. A sketch of that is below.
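For example (the property name MyPackageTargets is made up for illustration), you could add an import like the following to your existing shared custom targets file and then pass the new property when building.

<!-- added to the existing shared custom targets file; MyPackageTargets is a hypothetical property name -->
<Import Project="$(MyPackageTargets)"
        Condition=" '$(MyPackageTargets)' != '' and Exists('$(MyPackageTargets)') " />

msbuild myproject.csproj /t:Package /p:MyPackageTargets=c:\msbuild\package.targets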

Thanks,

Sayed Ibrahim Hashimi




Have you ever wanted to view MSBuild log files in Markdown format? If you’re using psbuild, you now get this feature for free when you upgrade your install. psbuild is a PowerShell wrapper for msbuild.exe; you can learn more about it on the GitHub page or in my previous blog post on it.

 

When using psbuild to build a project, the basic command looks like the following.

Invoke-MSBuild myprojectorsln.csproj

After that, to view the log you’ll invoke

Open-PSBuildLog

Open-PSBuildLog will open the log file from the last build executed by that instance of psbuild. Open-PSBuildLog has a single parameter, format, which defines which log file to open. The parameter takes the following values.

  • detailed
  • diagnostic
  • markdown

detailed is the default format, so to get your log file in Markdown format execute.

Open-PSBuildLog -format markdown

This will open the .md file in the default editor. Below is a screenshot of a sample log file.

[screenshot: sample Markdown log file opened in the default editor]

 

If you have any comments reach out to me on twitter or open an issue in psbuild to discuss further.

 

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi




OK, so this is off topic, but it’s so important that I had to blog about it. Let me give you some background. Last week I was traveling with my family on a road trip to Canada. Usually I’m super paranoid and never connect to any open wireless network (I pay for and carry my own MiFi device for exactly this reason). Since we were in Canada I didn’t want to get hit with roaming charges, so I chanced it on a few open networks. I made sure to connect to VPN as soon as I could, but there was still some time when I was not completely protected. At one point I thought my Gmail account had been hacked (further investigation thankfully proved this to be false). So I connected to a known network, enabled VPN, double checked that my IP was routed through the VPN, and started changing my critical passwords. One of those was an account with BBVACompass bank. I initially set up this account a while back, and evidently I wasn’t as concerned with online security then as I am now.

When I got to the password reset screen here is the tooltip indicating the password requirements.

[screenshot: BBVACompass password requirements tooltip]

In developer terms that’s 4-12 characters, alphanumeric only. You cannot use any special characters or spaces. I’m not a security expert, so I reached out to @TroyHunt (founder of https://haveibeenpwned.com/ and a recognized security expert) to see what his thoughts were on this. Here is his response.

[screenshot: @TroyHunt's reply]

Later in the conversation BBVACompass chimed in, stating the following.

[screenshot: @BBVACompass's reply]

So I looked at the link http://www.bbvacompass.com/customer-service/online-banking/siteid.jsp to see if there was some other, more secure way to authenticate. From what I understood from that link, they have a service called “Site ID” which consists of the following.

  1. You enter three security questions/answers
  2. When logging in on a new machine you are prompted for the security questions/answers and asked whether the machine should be “trusted”
  3. When logging in from a trusted machine the password is never submitted over the wire

The page that was linked to didn’t include any indication that this was “dual factor authentication,” as the @BBVACompass twitter account tried to pass it off as. I let them know that this is not two/dual factor auth. Even with “Site ID”, if you log in on a compromised machine all of the security questions/answers and the password can be stolen, and an attacker can then log in without me ever being notified. That defeats the purpose of two factor auth. With two factor auth, if I sign into Google on a compromised machine you will get my password, but when you try to sign in later you’ll have to get access to my phone’s text messages as well. That is true two factor auth, not security questions.

Another security expert @RobHale77 also chimed in later with the comments below.

[screenshot: @RobHale77's comments]

What’s a strong password?

I decided to do a bit more investigation around how BBVACompass represents password strength on their change password page. I guessed that the password strength field was being populated with JavaScript, so I opened the site in my browser and used the in-browser dev tools to look at the code. Here is the getPasswordStrength function; the comments were added by me.

// This function will return an integer in the range of 0-100.
// The max that I've seen here is 90
function getPasswordStrength(H){
    // D = length-based score: 0 if shorter than 4 chars, otherwise capped at 5
    var D=(H.length);
    if (D<4) { D=0; }
    if(D>5){
        D=5;
    }
    // G = number of digits, capped at 3
    var F=H.replace(/[0-9]/g,"");
    var G=(H.length-F.length);
    if(G>3){G=3;}
    // C = number of non-word (special) characters, capped at 3
    var A=H.replace(/\W/g,"");
    var C=(H.length-A.length);
    if(C>3){C=3;}
    // I = number of upper case letters, capped at 3
    var B=H.replace(/[A-Z]/g,"");
    var I=(H.length-B.length);
    if(I>3){I=3;}
    // score = (length score * 10) - 20, plus 10 per digit, 15 per special char, 10 per upper case letter
    var E=((D*10)-20)+(G*10)+(C*15)+(I*10);
    if(E<0){E=0;}
    if(E>100){E=100;}
    return E;
}

Then this value is converted to weak/medium/strong with the following JavaScript function; once again the comments were added by me.

$.fn.passwordStrength = function( options ){
    return this.each(function(){
        var that = this;that.opts = {};
        that.opts = $.extend({}, $.fn.passwordStrength.defaults, options);
 
        that.div = $(that.opts.targetDiv);
        that.defaultClass = that.div.attr('class');
 
        // opts.classes is declared elsewhere as = Array('weak','medium','strong') so length is 3
        that.percents = (that.opts.classes.length) ? 100 / that.opts.classes.length : 100;
 
         v = $(this)
        .keyup(function(){
            if( typeof el == "undefined" )
                this.el = $(this);
            var s = getPasswordStrength (this.value);
            var p = this.percents;
            var t = Math.floor( s / p );
            
            // from what I can tell 's' will max out at 90 so this if statement will always be skipped
            if( 100 <= s )
                t = this.opts.classes.length - 1;
            
            // t now determines the index for weak/medium/strong
            // weak:     s <= 30
            // medium:    s 40-60 (inclusive)
            // strong:    s >= 70 (maximum I've seen is 90)
            
            this.div
                .removeAttr('class')
                .addClass( this.defaultClass )
                .addClass( this.opts.classes[ t ] );
 
        });
    });
};

Since the code is minified it’s somewhat difficult to follow. What I found was that a strong password consisted of the following.

  • 3 numbers
  • 1 upper case letter
  • 1 lower case letter

So I decided to try “Hi123” to see if I was right. Working through the function above: the five-character length scores (5*10)-20 = 30, the three digits add 30, and the single upper case letter adds 10, for a total of 70, which lands in the “strong” bucket. Sure enough, BBVACompass told me that the selected password Hi123 is a Strong password!

This is beyond insane. It contains a word and a sequence of 3 numbers (likely the most common sequence at that). BBVACompass, this is misleading at best. This is nowhere near strong; you are lying to your customers about the security of their passwords. Here are some passwords and how BBVACompass represents their strength. If you have an account you can verify this by going to the change password screen under Online Banking Profile.

weak

  • aaaa
  • bbbb
  • swrxwuppzx
  • hlzzeseiyg

medium

  • 1234
  • a123
  • HLZzESeiYG
  • sWrXwUppZX

strong

  • Hi123
  • 123Ab
  • 123Food
  • 111Hi
  • 111Aa

Why on earth is “hlzzeseiyg” weak while “111Aa” is strong?! Clearly this has been poorly implemented and is misleading; fix it now.

What I would like to see as a consumer

My top recommendation for BBVACompass is to get a security expert/team involved to redo your online security, but if you cannot afford that then follow what’s below.

I’m not a security expert but here is what I recommended to BBVACompass as a consumer.

  1. Support for very strong passwords: those that are >= 20 characters and allow special characters
  2. Support for true two factor auth like password/text or password/call
  3. (stretch goal) Support to view an audit log of devices that have recently accessed my account

I am getting all of the above features from google currently.

Be more transparent about weak passwords

Now that I’ve seen the guts of their getPasswordStrength function, I’d like to see BBVACompass implement a better function for reporting password strength, one that takes into account dictionary words and common patterns. As stated, I’m not a security expert, but after a quick search I found http://www.sitepoint.com/5-bootstrap-password-strength-metercomplexity-demos/ which includes pointers to live demos of jquery.pwstrength.bootstrap (http://jsfiddle.net/jquery4u/mmXV5/) and StrongPass.js (http://jsfiddle.net/dimitar/n8Dza/). Below are the results for the same “Hi123” password from both.

[screenshots: results for “Hi123” from the two demos]

As you can see if BBVACompass had used readily available Open Source tools to verify password strength we wouldn’t be having this conversation. Both reported the password as being unacceptable.

Accountability

As consumers we must hold our online service providers (especially banks) accountable for online security. For the tech-savvy bunch, it’s your responsibility to educate your non-tech friends/family about online security and strong passwords.

As a bank, BBVACompass needs to hold its development team accountable for providing customers with secure online access to their accounts as well as honest indications of password strength. You’re being dishonest, which means I cannot trust you.

 

Previous pleas ignored by BBVACompass

I did a search on twitter for @BBVACompass password and discovered that this has been brought up multiple times by customers. The earliest one I found was from November 2013! Tweets below.

https://twitter.com/gumnos/status/406880230670749696

https://twitter.com/JasonSamfield/status/429448195119136768

https://twitter.com/BenLake5/status/477262736208822273

BBVACompass, your customers have spoken and we are demanding better online security. Now is the time to act. I’ve already closed my account and I’ll be advising all friends/family with a BBVACompass account to do the same. With the recent security breaches at Sony/Target/etc. you need to start taking online security more seriously. This blog post and the twitter comments may end up costing you a few account closures, but if your customers experience widespread hacking then the damage will be much more severe. Fix this before it is too late; this should be your top development priority IMO.

My Promise to BBVACompass

BBVACompass, if you add support for passwords >= 20 characters with special characters within 90 days, I will re-open my account with the same funds with which I closed it the next time I’m in Florida.

 

Note: please post comments at http://www.reddit.com/r/technology/comments/2o4uat/i_closed_my_bbvacompass_account_because_they/.

Sayed Ibrahim Hashimi @SayedIHashimi




For the latest info here and for discussion please visit https://github.com/sayedihashimi/slow-cheetah/issues/158.

 

I first developed SlowCheetah around VS2010 SP2 with the idea that I could at some point transition this to the appropriate team(s) at Microsoft. Unfortunately I haven't been able to do that, and the existence of this extension has actually worked against that goal.

I'm really happy that SlowCheetah has gotten the attention and success that it has, but now it's time for me to move on.

No support for SlowCheetah in VS "14"

I am not planning to update SlowCheetah for Visual Studio "14". If you would like to see support for transforms in VS "14" I suggest you vote and comment on the uservoice item at http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/2043217-support-web-config-style-transforms-on-any-file-in.

No new features

I will not be adding any features to SlowCheetah myself. If anyone wants to add features I will try to help guide them.

No fixes for regressions

If there are any scenarios that work in VS2013 RTM but do not work in a future version of Visual Studio, I will not be fixing them.

I hope you all understand my situation here. I have spent countless hours working on SlowCheetah and there is very little ROI for me, so I need to move on to focus on the other OSS projects that I’m involved in.

Thanks for all the love. I still love SlowCheetah too, and I’m sad to see there won’t be support for transforms in VS "14".

 

Thanks,

Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi




A few weeks ago Mads Kristensen and I created a couple of site extensions for Azure Web Sites: the Azure Image Optimizer and the Azure Minifier. These extensions can be used to automatically optimize all images on a site and minify all .js/.css files, respectively. They are shipped as NuGet packages on nuget.org as well as site extensions on siteextensions.net.

After creating those utilities we also updated the image optimizer to support being called on the command line via an .exe. We have not yet had a chance to update the minifier to be callable directly, but we have an open issue for it. If you can help, that would be great.

The exe for the image optimizer that can be used from the command line can be found in the NuGet package as well. You can also download it from here, but to get the latest version nuget.org is the way to go.

After releasing that exe I wanted an easy way to use it on a variety of machines, and to make it simple for others to try it out. What I ended up with is what I’m calling a “self-bootstrapping script” which you can find at optimize-images.ps1. Below you’ll see the entire contents of the script.

[cmdletbinding()]
param(
    $folderToOptimize = ($pwd),

    $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

    $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
)

<#
.SYNOPSIS
    If nuget.exe is not found in the tools
    folder then it will be downloaded there.
#>
function Get-Nuget(){
    [cmdletbinding()]
    param(
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),

        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        $nugetDestPath = Join-Path -Path $toolsDir -ChildPath nuget.exe
        
        if(!(Test-Path $nugetDestPath)){
            'Downloading nuget.exe' | Write-Verbose
            (New-Object System.Net.WebClient).DownloadFile($nugetDownloadUrl, $nugetDestPath)

            # double check that it was written to disk
            if(!(Test-Path $nugetDestPath)){
                throw 'unable to download nuget'
            }
        }

        # return the path of the file
        $nugetDestPath
    }
}

<#
.SYNOPSIS
    If the image optimizer is not found in the tools
    folder then it will be downloaded there.
#>
function GetImageOptimizer(){
    [cmdletbinding()]
    param(
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),
        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{
        
        if(!(Test-Path $toolsDir)){
            New-Item $toolsDir -ItemType Directory | Out-Null
        }

        $imgOptimizer = (Get-ChildItem -Path $toolsDir -Include 'ImageCompressor.Job.exe' -Recurse)

        if(!$imgOptimizer){
            'Downloading image optimizer to the .tools folder' | Write-Verbose
            # nuget install AzureImageOptimizer -Prerelease -OutputDirectory C:\temp\nuget\out\
            $cmdArgs = @('install','AzureImageOptimizer','-Prerelease','-OutputDirectory',(Resolve-Path $toolsDir).ToString())

            'Calling nuget to install image optimizer with the following args. [{0}]' -f ($cmdArgs -join ' ') | Write-Verbose
            &(Get-Nuget -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl) $cmdArgs | Out-Null
        }

        $imgOptimizer = Get-ChildItem -Path $toolsDir -Include 'ImageCompressor.Job.exe' -Recurse | select -first 1
        if(!$imgOptimizer){ throw 'Image optimizer not found' }       

        $imgOptimizer
    }
}

function OptimizeImages(){
    [cmdletbinding()]
    param(
        [Parameter(Mandatory=$true,Position=0)]
        $folder,
        $toolsDir = ("$env:LOCALAPPDATA\LigerShark\tools\"),
        $nugetDownloadUrl = 'http://nuget.org/nuget.exe'
    )
    process{        
        [string]$imgOptExe = (GetImageOptimizer -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl)

        [string]$folderToOptimize = (Resolve-path $folder)

        'Starting image optimizer on folder [{0}]' -f $folder | Write-Host
        # .\.tools\AzureImageOptimizer.0.0.10-beta\tools\ImageCompressor.Job.exe --folder M:\temp\images\opt\to-optimize
        $cmdArgs = @('--folder', $folderToOptimize)

        'Calling img optimizer with the following args [{0} {1}]' -f $imgOptExe, ($cmdArgs -join ' ') | Write-Host
        &$imgOptExe $cmdArgs

        'Images optimized' | Write-Host
    }
}

OptimizeImages -folder $folderToOptimize -toolsDir $toolsDir -nugetDownloadUrl $nugetDownloadUrl

The script is set up so that you call functions like Get-Nuget and GetImageOptimizer to get the path to the .exe to invoke. If the .exe is not in the expected location (under %localappdata% by default) it will be downloaded, and then the path will be returned. In the case of this script I use nuget.org as my primary distribution mechanism, so the script first downloads nuget.exe and then uses that to get the actual binaries. With this approach you can avoid checking in binaries and still have scripts which are pretty concise.
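As a usage sketch, assuming you’ve saved the script locally as optimize-images.ps1 (the folder path below is just a placeholder), a typical invocation looks like the following; -Verbose surfaces the download and install messages because the functions use [cmdletbinding()].

.\optimize-images.ps1 -folderToOptimize 'C:\temp\mysite\images' -Verbose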

After creating optimize-images.ps1 I thought it would be really useful to have a similar script to execute XDT transforms on xml files. So I created transform-xml.ps1. That script first downloads nuget.exe and then uses that to download the nuget packages which are required to invoke XDT transforms.
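I won’t walk through transform-xml.ps1 here, but purely as an illustrative sketch (the parameter names below are hypothetical; check the script itself for the real ones) an invocation would look something like the following.

# parameter names are illustrative only
.\transform-xml.ps1 -sourceFile .\web.config -transformFile .\web.release.config -destFile .\web.transformed.config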

A self-bootstrapping script doesn’t need to be a PowerShell script; you can apply the same technique to any scripting language. I’ve recently created an MSBuild script, inspired by Get-Nuget above, which can be used in a similar way. You can find that script in a gist here. It’s below as well.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- where nuget.exe will be downloaded to, and where to download it from -->
  <PropertyGroup>
    <NuGetExePath Condition=" '$(NuGetExePath)'=='' ">$(localappdata)\LigerShark\AzureJobs\tools\nuget.exe</NuGetExePath>
    <NuGetDownloadUrl Condition=" '$(NuGetDownloadUrl)'=='' ">http://nuget.org/nuget.exe</NuGetDownloadUrl>
  </PropertyGroup>

  <Target Name="GetNuGet" Condition="!Exists('$(NuGetExePath)')">
    <ItemGroup>
      <_nugetexeitem Include="$(NuGetExePath)" />
    </ItemGroup>

    <!-- make sure the destination folder exists before downloading -->
    <MakeDir Directories="%(_nugetexeitem.RootDir)%(_nugetexeitem.Directory)" />

    <DownloadNuGet DownloadUrl="$(NuGetDownloadUrl)" DestinationPath="$(NuGetExePath)" />
  </Target>

  <!-- pick the assembly that contains the code task factory -->
  <PropertyGroup>
    <TaskFactoryDll Condition="Exists('$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll')">$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll</TaskFactoryDll>
    <TaskFactoryDll Condition=" '$(TaskFactoryDll)'=='' and Exists('$(MSBuildFrameworkToolsPath)\Microsoft.Build.Tasks.v4.0.dll')">$(MSBuildFrameworkToolsPath)\Microsoft.Build.Tasks.v4.0.dll</TaskFactoryDll>
    <TaskFactoryDll Condition=" '$(TaskFactoryDll)'=='' ">$(windir)\Microsoft.NET\Framework\v4.0.30319\Microsoft.Build.Tasks.v4.0.dll</TaskFactoryDll>
  </PropertyGroup>

  <!-- inline task that downloads nuget.exe -->
  <UsingTask TaskName="DownloadNuGet" TaskFactory="CodeTaskFactory" AssemblyFile="$(TaskFactoryDll)">
    <ParameterGroup>
      <DownloadUrl ParameterType="System.String" Required="true" />
      <DestinationPath ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
      <Code Type="Fragment" Language="cs">
        <![CDATA[
        new System.Net.WebClient().DownloadFile(DownloadUrl, DestinationPath);
        ]]>
      </Code>
    </Task>
  </UsingTask>
</Project>
This script has a single target, GetNuGet, which you can call to download nuget.exe to the expected location. After that you can use the path to nuget.exe from the NuGetExePath property. I’ve already removed nuget.exe from the SideWaffle and AzureJobs repositories using this technique. It’s a great way to avoid checking in nuget.exe. A usage sketch is below.
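Here is a minimal usage sketch, assuming the snippet above is saved as getnuget.targets next to your build script (the file name, the RestorePackages target name, and the solution name are illustrative only).

<!-- import the GetNuGet snippet; getnuget.targets is a hypothetical file name -->
<Import Project="$(MSBuildThisFileDirectory)getnuget.targets" />

<!-- any target that needs nuget.exe can depend on GetNuGet and then use $(NuGetExePath) -->
<Target Name="RestorePackages" DependsOnTargets="GetNuGet">
  <Exec Command="&quot;$(NuGetExePath)&quot; restore MySolution.sln" />
</Target>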

 

Thanks,
Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi



