Showing posts with label Visual Studio. Show all posts

Thursday, October 19, 2017

Xamarin.Forms with Caliburn.Micro walk-through

As of this morning, I posted a new version of my Xamarin.Forms with Caliburn.Micro Starter Kit on the Visual Studio Marketplace.

This video provides a quick walk-through of using the template.


Monday, October 16, 2017

Jump start your next project with Xamarin.Forms Caliburn.Micro Starter Kit

Hey all! I’ve bundled my walkthrough of setting up a Xamarin.Forms project to use Caliburn.Micro for Android, iOS and UWP into a Visual Studio Project Template and made it available in the Visual Studio Extensions Gallery.

You can download it directly here, or from within Visual Studio: Tools –> Extensions and Updates –> Online.

Update 10/18/2017:

  • I had to republish the package as a “Tool” because it includes a few code snippets. The VS Gallery doesn’t allow you to change the classification of the VSIX, so I had to republish under a new identifier. You’ll need to uninstall and reinstall the new template.


As a multi-project template, it’s very straightforward to use: simply choose File –> New and select “Xamarin.Forms with Caliburn.Micro”.

[Screenshot: New Project dialog]

This will create a project with the following structure:

[Screenshot: generated solution structure]

When run on the platform of your choosing, it looks like the screenshot below. This is right where we left off from my walk-through earlier this year and is a great starting point for prototyping or building your next app.

[Screenshot: the running app]

Known Issues

  • Windows does not automatically create a signing key and identity for your app. Be sure to edit the UWP manifest and associate with your signing identity.

The source code for the starter kit can be found here. Let me know what you think!


Tuesday, October 10, 2017

Bundle your Visual Studio Solution as a Multi-Project Template

Earlier this year I provided a walkthrough of setting up a Xamarin.Forms project that leveraged Caliburn.Micro for Android, iOS and UWP. I had big plans for extracting the contents of that walkthrough and providing it as a NuGet package. Plans changed, however, and I’ve decided to package the entire solution as a Multi-Project Template and provide it as an add-on to Visual Studio (VSIX). This post provides a walk-through of how to create multi-project templates.

Wait, why not NuGet?

First off, as an aside, let’s go back and look at what I wanted to do. I wanted to provide a starter-kit of files that would jump start your efforts and allow you to modify my provided files as you see fit. As a NuGet package, I could deliver these files to any project simply by adding these loose code files to the content folder of the NuGet package. Two things are really awesome about this: the code files can be treated as source code transforms by changing their extension to *.pp, and through platform targeting I could deliver different content files per platform (Xamarin.iOS10, Xamarin.Android10, uap10.0, etc). With this approach, you would simply create a new Xamarin.Forms project and then add the NuGet package to all projects. Bam. Easy.
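To make that concrete, here’s a hypothetical nuspec fragment for the approach I had in mind (the package id and file names are made up for illustration). The *.pp content files are source transforms: NuGet replaces tokens like $rootnamespace$ when the package is installed, and the platform-specific content folders deliver different files to each project type:

```xml
<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">
  <metadata>
    <id>XamarinForms.CaliburnMicro.StarterKit</id>
    <version>1.0.0</version>
    <authors>example</authors>
    <description>Hypothetical starter-kit package.</description>
  </metadata>
  <files>
    <!-- *.pp files are source transforms: $rootnamespace$ is replaced on install -->
    <file src="content\App.xaml.cs.pp" target="content\portable-net45+win8\App.xaml.cs.pp" />
    <file src="content\MainActivity.cs.pp" target="content\MonoAndroid10\MainActivity.cs.pp" />
    <file src="content\AppDelegate.cs.pp" target="content\Xamarin.iOS10\AppDelegate.cs.pp" />
    <file src="content\App.xaml.cs.pp" target="content\uap10.0\App.xaml.cs.pp" />
  </files>
</package>
```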

But there are a few problems with this approach:

  • Existing files. My NuGet package would certainly be replacing existing files in your solution. I’d want to overwrite key parts of the initial template (App.xaml, AppDelegate, Activity, etc) and in some cases delete files (MainPage.xaml). Technically, I can overcome these side-effects by modifying the project through a NuGet install script (install.ps1). However, you would be prompted during the install about the replacements and if you clicked ‘No’ when prompted to replace these files… my template wouldn’t work.
  • Delivering Updates. This is the funny thing about this approach -- it is really intended as a one-time deal. You would add the starter files to your project and then begin to modify and extend to your heart’s content. However, as the package author, no doubt I would find an issue or improvement for the package and publish it. If you were to update the package, it would repeat its initialization process and nuke your customizations. I would prefer not to see you when you’re angry.
  • Not guaranteed. Lastly, you could add the NuGet package to only one of your projects, or to a library that isn’t intended to host a Xamarin.Forms app.

Above all else, the NuGet documentation clearly states that these files should be treated as immutable and are not intended to be modified by the consuming project. And since the best place to add the package is immediately after you create the project using a Visual Studio Template, why not just make a Template?

Creating a Multi-Project Template

While Multi-Project Templates have been around for a while, their tooling has improved considerably over the last few releases of Visual Studio. Although there isn’t a feature to export an entire solution as a multi-project template, conceptually it works the same way as creating a single project template and then tweaking it slightly.

There are two ways to create a Project Template. The first and easiest is simply to select Project –> Export Template. The wizard that appears prompts you for a project and places your template in the My Exported Templates folder.

The second approach requires you to install the Visual Studio SDK, which can be found as an option in the initial installer. When you have the SDK installed, you can create a Project Template as an item in your solution. This project includes the necessary vstemplate files and produces the packaged template every time you build.


Effectively, a Project Template is just a zip file with a .vstemplate file in it. A Multi-Project Template has a single .vstemplate that points to templates in subfolders. Here’s how I created mine:

1. Create a Project Template project

Using the Visual Studio SDK, I added a Project Template project to my solution and modified the VSTemplate file with the appropriate details:

<VSTemplate Version="2.0.0" Type="ProjectGroup"
    xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
  <TemplateData>
    <Name>Xamarin.Forms with Caliburn.Micro</Name>
    <Description>Xamarin.Forms project with PCL library.</Description>
    <ProjectType>CSharp</ProjectType>
    <Icon>_icon.ico</Icon>
    <DefaultName>App</DefaultName>
    <ProvideDefaultName>true</ProvideDefaultName>
    <CreateNewFolder>true</CreateNewFolder>
    <RequiredFrameworkVersion>2.0</RequiredFrameworkVersion>
    <SortOrder>1000</SortOrder>
    <TemplateID>Your ID HERE</TemplateID>
  </TemplateData>
  <TemplateContent/>
</VSTemplate>


2. Export Projects and Add to the Project Template project

Next, simply export all the projects in your solution that you want to include in your template. The Project –> Export Template dialog looks like this:

[Screenshot: Export Template dialog]

Once you’ve exported the projects as templates, take each of the zip files and extract them into a subfolder of your Template Project. Then, in Visual Studio, include these extracted subfolders as part of the project. Note that Visual Studio will assign a default action for each file, so code files will be set to Compile, images will be set to EmbeddedResource, etc. You’ll have to go through each of these files and change the default action to Content, copy if newer. It’s a pain, and I found it easier to unload the project and manually edit the csproj file directly.
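After unloading the project, the edit amounts to changing the item types in the csproj. A sketch of what that looks like (the file names below are illustrative):

```xml
<ItemGroup>
  <!-- Change the default Compile/EmbeddedResource actions to Content, copy if newer -->
  <Content Include="XF\MyTemplate.vstemplate">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
  <Content Include="XF\App.xaml.cs">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```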

3. Configure the Template to include the embedded Projects

Now that we have the embedded projects included in the output, we need to modify the template to point to these embedded templates. Visual Studio has a set of reserved keywords that can be used in the vstemplate and code transforms; $safeprojectname$ is a reserved keyword that represents the name of the current project. My vstemplate names the referenced templates after the name that was provided by the user:

<VSTemplate Version="2.0.0" Type="ProjectGroup"
    xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
  <TemplateData>
    ...
  </TemplateData>
  <TemplateContent>
    <ProjectCollection>
      <ProjectTemplateLink ProjectName="$safeprojectname$" CopyParameters="true">XF\MyTemplate.vstemplate</ProjectTemplateLink>
      <ProjectTemplateLink ProjectName="$safeprojectname$.Android" CopyParameters="true">XF.Android\MyTemplate.vstemplate</ProjectTemplateLink>
      <ProjectTemplateLink ProjectName="$safeprojectname$.UWP" CopyParameters="true">XF.UWP\MyTemplate.vstemplate</ProjectTemplateLink>
      <ProjectTemplateLink ProjectName="$safeprojectname$.iOS" CopyParameters="true">XF.iOS\MyTemplate.vstemplate</ProjectTemplateLink>
    </ProjectCollection>
  </TemplateContent>  
</VSTemplate>

If the ProjectName is omitted, it will use the name within the embedded template.

4. Fix Project References

To ensure the project compiles, we must fix the project references to the PCL library in the iOS, Android and UWP projects. Here we leverage an interesting feature of Multi-Project templates – Visual Studio provides special reserved keywords for accessing properties of the root template project. In this case, we can reference the safeprojectname of the root project using the $ext_safeprojectname$ reserved keyword. And because project references use a GUID to refer to the referenced project, we can provide the PCL project with a GUID that will be known to all the child projects – in this case, we can use $ext_guid1$.

The <ProjectGuid> element in the PCL Project must be configured to use the shared GUID:

<PropertyGroup>
  <MinimumVisualStudioVersion>11.0</MinimumVisualStudioVersion>
  <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
  <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
  <ProjectGuid>{$ext_guid1$}</ProjectGuid>
  <OutputType>Library</OutputType>
  <AppDesignerFolder>Properties</AppDesignerFolder>
  <RootNamespace>$safeprojectname$</RootNamespace>
  <AssemblyName>$safeprojectname$</AssemblyName>
  <FileAlignment>512</FileAlignment>
  <TargetFrameworkVersion>v4.5</TargetFrameworkVersion>
  <TargetFrameworkProfile>Profile259</TargetFrameworkProfile>
  <ProjectTypeGuids>{786C830F-07A1-408B-BD7F-6EE04809D6DB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
  <NuGetPackageImportStamp>
  </NuGetPackageImportStamp>
</PropertyGroup>

In the projects that reference the PCL, the path to the project, the project GUID and the name must also be modified:

<ItemGroup>
  <ProjectReference Include="..\$ext_safeprojectname$\$ext_safeprojectname$.csproj">
    <Project>{$ext_guid1$}</Project>
    <Name>$ext_projectname$</Name>
  </ProjectReference>
</ItemGroup>

5. Fix-ups

Lastly, there will be some other fix-ups you will need to apply. These are things like original project names that appear in manifest files, etc. The templating engine can make changes to any type of file, but you may need to verify that these files have the ReplaceParameters attribute set to True in the .vstemplate file.
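Inside each embedded template’s vstemplate, the files that need token substitution look something like this sketch (file names are illustrative; ReplaceParameters must be true for $safeprojectname$ and friends to be substituted):

```xml
<TemplateContent>
  <Project TargetFileName="MyProject.csproj" File="MyProject.csproj" ReplaceParameters="true">
    <!-- Text files that contain template parameters need ReplaceParameters="true" -->
    <ProjectItem ReplaceParameters="true" TargetFileName="App.xaml">App.xaml</ProjectItem>
    <!-- Binary files should leave it false -->
    <ProjectItem ReplaceParameters="false" TargetFileName="icon.png">icon.png</ProjectItem>
  </Project>
</TemplateContent>
```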

Build and Deploy!

With this in place, you can simply compile the Project Template and copy the zip to the ProjectTemplates folder. Optionally, you can add a VSIX project to the solution to bundle your Project Template as an installer that you can distribute to users via the Visual Studio Extensions Gallery.

Happy coding!


Monday, July 08, 2013

DeploymentItems in Visual Studio 2012

A frequent concern with writing unit tests with MSTest is how to include additional files and test data for a test run. This process has changed between Visual Studio 2010 and 2012 and it’s become a source of confusion.

Background

With Visual Studio 2010 and earlier, every time you ran your tests Visual Studio would copy all files related to the test to a test run folder and execute them from this location. For local development this feature allows you to compare results between test runs, but the feature is also intended to support deploying the tests to remote machines for execution.

If your tests depend on additional files such as external configuration files or 3rd party dependencies that aren’t directly referenced by the tests, you would need to enable Deployment in your testsettings and then either specify the deployment items in the testsettings file or mark each test with a DeploymentItemAttribute.
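As a quick sketch (the file and class names here are made up), marking a test with the attribute looks like this:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ImportTests
{
    [TestMethod]
    [DeploymentItem("TestData.xml")] // copied to the test run folder before execution
    public void Import_FindsDeployedFile()
    {
        // Because the item was deployed, a relative path resolves at run time
        Assert.IsTrue(System.IO.File.Exists("TestData.xml"));
    }
}
```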

What’s changed in Visual Studio 2012?

Visual Studio 2012 has a number of changes related to the test engine that impact deployment. The most visible change is that Visual Studio 2012 no longer automatically adds the testsettings file to your solution when you add a Test project. The testsettings file can be added to your project manually, but it’s generally recommended that you don’t: it exists for backward compatibility, and not all features within Visual Studio 2012 are backward compatible. Microsoft Fakes, for example, are not.

The biggest change related to deployment is that Visual Studio 2012 tests run directly out of the output folder by default. This adds a significant speed boost for the tests but it also means that if your tests are dependent on files that are already part of the build output, you won’t need to enable deployment at all.

Another interesting change is that if you include a DeploymentItemAttribute in your tests, Deployment will be automatically enabled and your tests will run out of the deployment folder.

More information can be found here.

Monday, March 05, 2012

How I organize my Local TFS Workspaces

It happens several times on most projects. Developer one, let’s call him Andy, adds a third-party library into the source control repository that isn’t referenced anywhere in the Visual Studio solution file that the team uses. Andy also modifies a few files that depend on this new library and checks his changes in. Developer two, let’s call him Eric, gets the latest from source control by right-clicking the top of the Solution in the Solution Explorer and selecting “Get Latest (recursive)” from the context menu. Although Andy’s local workspace and the build server work fine, Eric believes he has the latest but his code won’t compile.

It’s a frustrating problem with an easy fix: just get the latest copy of the source and rebuild the solution. You can get the latest from the Source Control Explorer in Visual Studio, or open a Visual Studio Command-prompt and issue this command at the root of your solution:

tf.exe get /recursive

I’ve worked to remedy this problem with my teams in several ways, including special buttons you can add to your IDE to make getting to the Source Control Explorer window faster. But when pair-programming on someone else’s machine, my buttons aren’t always available, so I drop down to the command-line as my preferred choice. However, this sometimes has mixed results: if the command-line can’t figure out which workspace you’re in, it will sometimes get the latest from all local workspaces.

I don’t have this problem, because I structure my workspaces a little differently. Here’s how I do it.

Multiple Workspaces per Client

This step is optional, but I think it’s worth mentioning. Rather than use a single workspace for all clients, I create one or more workspaces that reflect the client that I’m writing code for. To keep this information visible, I name the workspace after the client instead of the computer name.

[Screenshot: Add Workspace dialog]

I separate my TFS-Workspaces by client for a few reasons:

  • Some of my clients have their own repository which requires me to create a workspace for their server.
  • When I finish work with a client, I can safely delete an entire workspace without concern of breaking server-mappings for other clients.

Having multiple workspaces for the same client allows me to check out the same branch more than once. This allows me to:

  • Use an older copy of the source to reproduce a defect, validate unit tests or to run code analysis
  • Work on multiple defects in isolation from one another
  • Try out a refactoring in isolation from current development
  • Review a co-worker’s shelve-set

The practice of having multiple workspaces may not be required for all projects, but it’s a good habit to form.

Client Workspaces separated using Folders

As stated above, I create multiple workspaces for each client. In order to keep those workspaces organized, I keep them separated in their own folder using a simple naming convention (A,B,C, etc). This makes it simple to remove an entire workspace when no longer needed.

Building upon the folder structure that I outlined in my last post (Using Windows 7 Libraries to Organize your code), my folder structure for a client looks like this:

Client    Workspace Name    Folder Location
Client1   Client1-A         C:\Projects\Infusion\Code\Client1\A
Client1   Client1-B         C:\Projects\Infusion\Code\Client1\B
Client2   Client2-A         C:\Projects\Infusion\Code\Client2\A

A, B, C is a simple naming convention, and it doesn’t need to get any fancier. I’ve worked on projects with some long folder names, but I haven’t yet exceeded the 260-character path limit with MSBuild.

Putting it all together

With the above in place, I can check out separate copies of the same branch into different folders: Client1\A\trunk, Client1\B\trunk, etc. Opening a command-prompt at the root of my solution and executing:

tf.exe get /recursive

…gets me just the updated code for that branch. I especially love this approach because I can get latest before I open the solution file, which is immensely helpful because I don’t have to wait for Visual Studio to reload projects if they’ve changed.

Code happy.

Monday, January 23, 2012

Execute Batch from Visual Studio

Since as long as I can remember, I’ve kept a command-line window open while I worked. It’s a warm fuzzy feeling of how computers used to work. I tend to structure commonly used tasks as msbuild or nant scripts, and then add handy batch files that pass the appropriate parameters to the script.

Unfortunately, most of my team-mates don’t live in the command-line, so running a batch file breaks their traditional flow.

Here’s a short tip on how to execute batch files without having to leave the comfort of the IDE. You’ve probably seen this tip before, but as always, I often use my blog as a digital memory. If it helps you, great.

Visual Studio supports the ability to associate tools and alternate editors for different files. Adding support for batch files is simply a matter of opening the context-menu for a file and choosing “Open With..”. Unfortunately, there’s no mechanism to supply parameters to your program, so adding support for a Command Prompt requires a small subtle hack that passes our parameters to the program we want.

To Associate Batch files to a Custom Command

First, we need to create a simple batch file that passes the arguments that Visual Studio provides onto our batch file.

  1. If you haven't already, add your batch file to your solution. These are best treated as Solution items that aren’t part of your compilation process.
  2. Open notepad and save the following script as C:\ExecuteBatch.cmd
@cmd /c %1

Once this is in place,

  1. Associate the batch file in Visual Studio to the command line by right-clicking on the batch file and choose "Open With...".
  2. In the dialog that appears, provide a name and associate it to the ExecuteBatch.cmd.

image

Optional: You can make the Command the default program for this extension by selecting your custom command and clicking the “Set as Default” button. If you edit the file frequently you might want to skip this step; the trade-off is that you’ll have to right-click the file and choose “Open With…” any time you want to run your custom command.

Gotchas & Caveats

Just a few closing points:

  • If you set your custom command as the default, note that there is no confirmation if you accidentally double-click the batch file. If your script is potentially destructive or long-running, you might want to add a prompt at the beginning of the batch file before running.
  • The ExecuteBatch.cmd provided above will close the window immediately after the batch terminates. If you want to review the output of the script before the window closes, you might want to add a pause to the end of the script.
  • Lastly, when you add a new script to the solution, Visual Studio will perform the default action on the file as it is added. If you don’t want to run the script at that point, you might want to temporarily assign a different editor (such as the Source Code Editor) before you proceed.
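A variant of ExecuteBatch.cmd that addresses the first two caveats might look like this (just a sketch; adjust to taste):

```bat
@echo off
rem Prompt before running, in case of an accidental double-click
echo About to run: %1
pause
cmd /c %1
rem Keep the window open so the output can be reviewed
pause
```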

Happy Coding.


Monday, November 28, 2011

Fixing Parallel Test Execution in Visual Studio 2010

As the number of tests in my project grows, so does the length of my continuous integration build. Fortunately, the new parallel test execution in Visual Studio 2010 allows us to trim down the amount of time consumed by our unit tests. If your unit tests meet the criteria for thread-safety, you can configure them to run in parallel simply by adding the following to your test run configuration:

<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="Local" id="5082845d-c149-4ade-a9f5-5ff568d7ae62" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Description>These are default test settings for a local test run.</Description>
  <Deployment enabled="false" />
  <Execution parallelTestCount="0">
    <TestTypeSpecific />
    <AgentRule name="Execution Agents">
    </AgentRule>
  </Execution>
</TestSettings>

The ideal setting of “0” implies that the test runner will automatically figure out the number of concurrent tests to execute based on the number of processors on the local machine. Based on this, a single-core CPU will run 1 test simultaneously, a dual-core CPU can run 2 and a quad-core CPU can run 4. Technically, a quad-core hyper-threaded machine has 8 logical processors, but when parallelTestCount is set to zero the test run on that machine fails instantly:

Test run is aborting on '<machine-name>', number of hung tests exceeds maximum allowable '5'.

So what gives?

Well, digging through the disassembled source code for the test runner, we learn that the number of tests that can execute simultaneously interferes with the maximum number of tests that can hang before the entire test run is considered to be in a failed state. Unfortunately, the maximum number of tests that can hang has been hardcoded to 5. Effectively, when the 6th test begins to execute, the test runner believes that the other 5 executing tests are in a failed state, so it aborts everything. Maybe the team writing this feature picked “5” as an arbitrary number, or legitimately believed there wouldn’t be more than 4 CPUs before the product shipped, or simply didn’t make the connection between the setting and the possible hardware. I do sympathize with the mistake: the developers wanted the number to be low because a higher number could add several minutes to a build if the tests were actually in a non-responsive state.

The Connect issue lists this feature as fixed, although there are no posted workarounds and there’s a lot of feedback that the feature doesn’t work on high-end machines even with the latest service pack. It is fixed, but no one knows about it.

Simply add the following to your registry (you will likely have to create the key) and configure the maximum value based on your CPU. I’m showing the default value of 5, but I figure the number of CPUs + 1 is probably right.

Windows 32 bit:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\Agent]
"MaximumAllowedHangs"="5"
Windows 64 bit:
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\Agent]
"MaximumAllowedHangs"="5" 

Note: although you can set the parallelTestCount setting to anything you want, overall performance is constrained by the raw computing power of the CPU, so running more than 1 test per CPU creates contention between threads, which degrades performance. That said, I sometimes set parallelTestCount to 4 on my dual-core CPU to check for possible concurrency issues in the code or tests.

Epilogue

So what’s with the Connect issue? Having worked on enterprise software, my guess is this: the defect was logged and subsequently fixed, the instructions were given to the tester and verified, but these instructions never tracked forward with the release notes or correlated back to the Connect issue. Ultimately there’s probably a small handful of people at Microsoft who actually know this registry setting exists, fewer who understand why, and those that do either work on a different team or no longer work for the company. Software is hard: one small fissure and the whole thing seems to fall apart.

Something within the process is clearly missing. However, as a software craftsman and TDD advocate I’m less concerned that the process didn’t capture the workaround as I am that the code randomly pulls settings from the registry – this is a magic string hack that’s destined to get lost in the weeds. Why isn’t this number calculated based on the number of processors? Or better, why not make MaximumAllowedHangs configurable from the test settings file so that it can be unit tested without tampering with the environment? How much more effort would it really take, assuming both solutions would need proper documentation and tests?

Hope this helps.

Tuesday, July 12, 2011

Visual Studio Regular Expressions for Find & Replace

Visual Studio has had support for regular expressions in Find & Replace for several versions, but I've only really used it for simple searches. I recently had a problem where I needed to introduce a set of changes to a very large object model. It occurred to me that this could be greatly simplified with some pattern matching, but I was genuinely surprised to learn that Visual Studio has its own brand of regular expressions.

After spending some time learning the new syntax I had a really simple expression to modify all of my property setters:

Original:

public string PropertyName
{
    get { return _propertyName; }
    set
    {
        _propertyName = value;
        RaisePropertyChanged("PropertyName");
    }
}

Goal:

public string PropertyName
{
    get { return _propertyName; }
    set
    {
        if ( value == _propertyName )
             return;            
        _propertyName = value;
        RaisePropertyChanged("PropertyName");
    }
}

Here’s a quick capture and breakdown of the pattern I used.

[Screenshot: Find and Replace dialog with the pattern]

Find:

^{:Wh*}<{_:a+} = value;
  • ^ = beginning of line
  • { = start of capture group #1
  • :Wh = Any whitespace character
  • * = zero or more occurrences
  • } = end of capture group #1
  • < = beginning of word
  • { = start of capture group #2
  • _ = the text must start with an underscore
  • :a = any alphanumeric character
  • + = one or more occurrences
  • } = end of capture group #2
  • “ = value;” = exact text match

Replace:

\1if (\2 == value)\n\1\treturn;\n\1\2 = value;

The Replace pattern is fairly straightforward, where “\1” and “\2” represent capture groups 1 and 2. Since capture group #1 represents the leading whitespace, I’m using it in the replace pattern to keep the original padding and to base new lines on that point. For example, “\n\1\t” introduces a newline, the original whitespace and then a tab.
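For readers more at home with standard regular expressions, here’s a rough Python equivalent of the same find & replace. The VS pattern `^{:Wh*}<{_:a+} = value;` maps approximately to `^([ \t]*)(_\w+) = value;`, and the sample input below is a trimmed-down stand-in for the property setter:

```python
import re

# Stand-in for the body of a property setter
src = (
    '        _propertyName = value;\n'
    '        RaisePropertyChanged("PropertyName");'
)

# ^([ \t]*) captures leading whitespace, (_\w+) captures the backing field
pattern = re.compile(r'^([ \t]*)(_\w+) = value;', re.MULTILINE)

def add_guard(match):
    ws, field = match.group(1), match.group(2)
    # Re-use the captured indentation to keep the original padding
    return (f'{ws}if ({field} == value)\n'
            f'{ws}\treturn;\n'
            f'{ws}{field} = value;')

result = pattern.sub(add_guard, src)
print(result)
```

The callback plays the role of the `\1`/`\2` replacement string, which keeps the escaping readable.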

It seems insane that Microsoft implemented their own regular expression engine, but there are some interesting things in there, such as being able to match on quoted text.

I know this ain’t much, but hopefully it will inspire you to write some nifty expressions.  Cheers.


Monday, June 27, 2011

Visual Studio Code Analysis Settings

I've just spent the last week or so working with Code Analysis for my Visual Studio 2010 projects. I still haven't figured out whether I'm using Visual Studio's built-in Code Analysis tool or FxCop 10.0, because the two tools share a lot of the same engine and rulesets. At any rate, one of the pain points I had to work through was ensuring that the configuration settings between my Visual Studio configurations were the same. Strangely enough, I had more confidence in my changes when manually editing the csproj files in Notepad++ than from Visual Studio -- it seemed like each configuration had slightly different settings, and being able to see the msbuild settings made more sense.

During this process, I found it quite frustrating that none of the Code Analysis settings configurable from the csproj schema were documented online. However, the IntelliSense for the schema is quite good. The schema is located at: <Program Files>\Microsoft Visual Studio 10.0\Xml\Schemas\1033\MSBuild\Microsoft.Build.CommonTypes.xsd

Many of the settings generated by Visual Studio are simply the default values and can be discarded.
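For instance, a trimmed-down configuration might only need a couple of properties per build configuration. A minimal sketch (the ruleset name below is one of the stock rulesets that ships with Visual Studio 2010):

```xml
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>
```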

For my reference, here's the documentation, lifted from the schema:

Project Element Description
CodeAnalysisAdditionalOptions Additional options to pass to the Code Analysis command line tool.
CodeAnalysisApplyLogFileXsl Indicates whether to apply the XSL style sheet specified in $(CodeAnalysisLogFileXsl) to the Code Analysis report. This report is specified in $(CodeAnalysisLogFile). The default is false.
CodeAnalysisConsoleXsl Path to the XSL style sheet that will be applied to the Code Analysis console output. The default is an empty string (''), which causes Code Analysis to use its default console output.
CodeAnalysisCulture Culture to use for Code Analysis spelling rules, for example, 'en-US' or 'en-AU'. The default is the current user interface language for Windows.
CodeAnalysisDependentAssemblyPaths Additional reference assembly paths to pass to the Code Analysis command line tool. A fully qualified path to a directory containing reference assemblies to be used by Code Analysis.
CodeAnalysisDictionary Code Analysis custom dictionaries.  Semicolon-separated list of Code Analysis custom dictionaries. Wildcards are allowed.
CodeAnalysisFailOnMissingRules Indicates whether Code Analysis should fail if a rule or rule set is missing. The default is false.
CodeAnalysisForceOutput Indicates whether Code Analysis generates a report file, even when there are no active warnings or errors. The default is true.
CodeAnalysisGenerateSuccessFile Indicates whether Code Analysis generates a '$(CodeAnalysisInputAssembly).lastcodeanalysissucceeded' file in the output folder when no build-breaking errors occur. The default is true.
CodeAnalysisIgnoreBuiltInRules Indicates whether Code Analysis will ignore the default rule directories when searching for rules. The default is false.
CodeAnalysisIgnoreBuiltInRuleSets Indicates whether Code Analysis will ignore the default rule set directories when searching for rule sets. The default is false.
CodeAnalysisIgnoreInvalidTargets Indicates whether Code Analysis should silently fail when analyzing invalid assemblies, such as those without managed code. The default is true.
CodeAnalysisIgnoreGeneratedCode Indicates whether Code Analysis should suppress results from generated code, such as designer files. The default is true.
CodeAnalysisImport Code Analysis projects (*.fxcop) or reports to import. Semicolon-separated list of Code Analysis projects (*.fxcop) or reports to import. Wildcards are allowed.
CodeAnalysisInputAssembly Path to the assembly to be analyzed by Code Analysis. The default is '$(OutDir)$(TargetName)$(TargetExt)'.
CodeAnalysisLogFile Path to the output file for the Code Analysis report. The default is '$(CodeAnalysisInputAssembly).CodeAnalysisLog.xml'.
CodeAnalysisLogFileXsl Path to the XSL style sheet to reference in the Code Analysis output report. This report is specified in $(CodeAnalysisLogFile). The default is an empty string ('').
CodeAnalysisModuleSuppressionsFile Name of the file, without the path, where Code Analysis project-level suppressions are stored. The default is 'GlobalSuppressions$(DefaultLanguageSourceExtension)'.
CodeAnalysisOverrideRuleVisibilities Indicates whether to run all overridable Code Analysis rules against all targets. This will cause specific rules, such as those within the Design and Naming categories, to run against both public and internal APIs, instead of only public APIs. The default is false.
CodeAnalysisOutputToConsole Indicates whether to output Code Analysis warnings and errors to the console. The default is false.
CodeAnalysisVerbose Indicates whether to output verbose Code Analysis diagnostic info to the console. The default is false.
CodeAnalysisPath Path to the Code Analysis installation folder. The default is '$(VSINSTALLDIR)\Team Tools\Static Analysis Tools\FxCop'.
CodeAnalysisPlatformPath Path to the .NET Framework folder that contains platform assemblies, such as mscorlib.dll and System.dll. The default is an empty string ('').
CodeAnalysisProject Path to the Code Analysis project (*.fxcop) to load. The default is an empty string ('').
CodeAnalysisQuiet Indicates whether to suppress all Code Analysis console output other than errors and warnings. This applies when $(CodeAnalysisOutputToConsole) is true. The default is false.
CodeAnalysisRuleAssemblies Semicolon-separated list of paths either to Code Analysis rule assemblies or to folders that contain Code Analysis rule assemblies. The paths are in the form '[+|-][!][file|folder]', where '+' enables all rules in rule assembly, '-' disables all rules in rule assembly, and '!' causes all rules in rule assembly to be treated as errors. For example '+D:\Projects\Rules\NamingRules.dll;+!D:\Projects\Rules\SecurityRules.dll'. The default is '$(CodeAnalysisPath)\Rules'.
CodeAnalysisRuleDirectories Semicolon-separated list of directories in which to search for rules when resolving a rule set. The default is '$(CodeAnalysisPath)\Rules' unless the CodeAnalysisIgnoreBuiltInRules property is set to true.
CodeAnalysisRules Semicolon-separated list of Code Analysis rules. The rules are in the form '[+|-][!]Category#CheckId', where '+' enables the rule, '-' disables the rule, and '!' causes the rule to be treated as an error. For example, '-Microsoft.Naming#CA1700;+!Microsoft.Naming#CA1701'. The default is an empty string ('') which enables all rules.
CodeAnalysisRuleSet A .ruleset file which contains a list of rules to run during analysis. The string can be a full path, a path relative to the project file, or a file name. If a file name is specified, the CodeAnalysisRuleSetDirectories property will be searched to find the file. The default is an empty string ('').
CodeAnalysisRuleSetDirectories Semicolon-separated list of directories in which to search for rule sets. The default is '$(VSINSTALLDIR)\Team Tools\Static Analysis Tools\Rule Sets' unless the CodeAnalysisIgnoreBuiltInRuleSets property is set to true.
CodeAnalysisSaveMessagesToReport Comma-separated list of the type ('Active', 'Excluded', or 'Absent') of warnings and errors to save to the output report file. The default is 'Active'.
CodeAnalysisSearchGlobalAssemblyCache Indicates whether Code Analysis should search the Global Assembly Cache (GAC) for missing references that are encountered during analysis. The default is true.
CodeAnalysisSummary Indicates whether to output a Code Analysis summary to the console after analysis. The default is false.
CodeAnalysisTimeout The time, in seconds, that Code Analysis should wait for analysis of a single item to complete before it aborts analysis. Specify 0 to cause Code Analysis to wait indefinitely. The default is 120.
CodeAnalysisTreatWarningsAsErrors Indicates whether to treat all Code Analysis warnings as errors. The default is false.
CodeAnalysisUpdateProject Indicates whether to update the Code Analysis project (*.fxcop) specified in $(CodeAnalysisProject). This applies when there are changes during analysis. The default is false.
CodeAnalysisUseTypeNameInSuppression Indicates whether to include the name of the rule when Code Analysis emits a suppression. The default is true.
RunCodeAnalysis Indicates whether to run Code Analysis during the build.
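As a sketch of how a handful of these properties fit together, a project can opt in to Code Analysis with a PropertyGroup in the csproj. The property names below come from the table above; the specific values are only illustrative:

```xml
<!-- Illustrative values only; the property names are from the table above. -->
<PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <CodeAnalysisTreatWarningsAsErrors>true</CodeAnalysisTreatWarningsAsErrors>
  <!-- Disable one naming rule and promote another to an error -->
  <CodeAnalysisRules>-Microsoft.Naming#CA1700;+!Microsoft.Naming#CA1701</CodeAnalysisRules>
  <CodeAnalysisLogFile>$(OutDir)$(TargetName).CodeAnalysisLog.xml</CodeAnalysisLogFile>
</PropertyGroup>
```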

As this has been a fun week dealing with FxCop, expect to see a few more posts related to the above settings.  Stay right here and click refresh endlessly, or subscribe to the rss feed.

Happy coding.

submit to reddit

Tuesday, May 31, 2011

The Tests are Broken, Now What?

Despite our best intentions to write durable tests, it seems inevitable that tests will break at some point. In some cases, breaking a test can be seen as a very positive thing: we've introduced a side-effect that has broken existing functionality, and our tests have helped prevent a bug. However, in some cases, tests break because of external factors -- maybe the tests weren't isolated enough, maybe a developer went rogue and forgot to run the tests, or some other poorly justified or lousy excuse. While identifying what went wrong in your development process is important, the immediate concern is that these broken tests represent a failed build and developer productivity is at risk -- what should you do with the broken tests?

If the developer looking at the tests knows a bit about the code, they should have enough information in the tests to help sort things out. The developer should drop what they're doing, slow down, research the code and the tests, and fix the broken tests. If caught early, it's very manageable. Unfortunately, this isn't the common case -- the developer introducing the breaking changes doesn't understand the tests and is ultimately left with one of two approaches:

  1. Comment out or remove the test
  2. Mark the test as Ignored

Neither of these approaches "fix" anything. The first approach, to comment the test out, is equivalent to pissing on the requirements. The test supposedly represents a piece of functional behaviour of the application; the fact that it's broken suggests a feature is also broken. The only reason to comment out a test is if that functionality has ceased to exist or the test was invalid in the first place. (Take a look at my checklist for unit test code reviews to help qualify invalid tests.)

The second alternative, marking the tests as Ignored, can be dangerous, depending on whether you're able to track why the test has been disabled and monitor when it enters this state. This is one part developer process, one part testing framework, one part reporting.

Within NUnit, the Ignore attribute fits the bill nicely. The attribute accepts a string message which is used to report why the test has been disabled. It should be standard developer process to only allow use of Ignore if a message is provided. (Come to think of it, I should put in a feature request so that it is required.) When NUnit runs, the Ignored tests are not executed, but the number of ignored tests is included in the XML output, meaning that your build server can track ignored tests over time. You can also bypass the Ignore attribute within the NUnit user interface by simply selecting the ignored test and running it manually.

[Ignore("The xml for this test isn't serializing correctly. Please fix!")]
[Test]
public void SuperImportantTest()
{
    // snip
}

Within MSTest however, Ignore is essentially the same as commenting out the tests.  There's no ability to record why the test has been excluded (developers must leave comments), and when MSTest runs the test suite the Ignored tests are simply excluded from the test run.  Looking at the MSTest output, the TRX file does not define a status for ignored, meaning that it can't be tracked in your build reports.

<ResultSummary outcome="Completed">
  <Counters 
    total="1" 
    executed="1" 
    passed="1" 
    error="0" 
    failed="0" 
    timeout="0" 
    aborted="0" 
    inconclusive="0" 
    passedButRunAborted="0" 
    notRunnable="0" 
    notExecuted="0" 
    disconnected="0" 
    warning="0" 
    completed="0" 
    inProgress="0" 
    pending="0" />
</ResultSummary>

Though I'm not a big fan of this, I'll spare you the complaining and move on to working solutions.

The best approach within MSTest isn't to disable the test but to indicate that it isn't reliable and needs attention. In addition, we need these tests to stand out without breaking the build. This is exactly what Inconclusive is for.

[TestMethod] 
public void SuperImportantTest() 
{ 
    Assert.Inconclusive("The xml for this test isn't serializing correctly. Please fix!"); 
    
    // snip 
}

The key to using Assert.Inconclusive is to put the statement as the first code-expression in your test. When MSTest encounters the Assert.Inconclusive statement, execution of the test is immediately halted, and the test is recorded as "Inconclusive". The TRX reports this status separately, and the message appears in the xml output.

Next Steps

With the tests marked as problematic, it’s possible to check-in the tests and unblock developers from a failed build while you and the original test author figure out how to fix the problem. It’s a temporary patch and is not intended for all breaking tests, just small fires that happen during critical developer crunches.  Once the fire has been put out and the tests corrected, you really should go back and investigate why the test broke in the first place:

  • Should the developer’s changes have broken this test?  If not, could the code/test have been written differently to be more durable?
  • What could the developer have done differently? Is this a one time occurrence or a disconnect in process?

That’s all for now.

submit to reddit

Friday, March 04, 2011

Add a Custom Toolbar for Source Control

My current project uses TFS and I spend a lot of time in and out of source control, switching between workspaces to manage defects. I found myself needing quick access to my workspaces and getting really frustrated with how clunky this operation is within Visual Studio. There are two ways you can open source control:

  1. The Source Control item in the Team System tool window.  I don’t always have the Team Explorer tool window open and when I open it, it takes a few seconds to get details from the server.
  2. View –> Other Windows –> Source Control Explorer.  Useful, but there’s too much mouse movement and clicking to be accessible.

So, rather than creating a custom keyboard shortcut that I would forget, I added a toolbar that is always in plain sight. It's so convenient that I take it for granted, and when I pair with others they comment on it. So for their convenience (and yours), here's how it's done.

Add a new Toolbar

From the Menubar, select “Tools –> Customize”.  It’ll pop up this dialog. 

Click on “New” and give your toolbar a name.

Toolbar_Customize_AddToolbar

Add the Commands

Switch to the Commands tab and select the name of your toolbar in the Toolbar dropdown.

Toolbar_Customize_Empty

Click on the “Add Command” button and select the following commands:

  • View : TfsSourceControlExplorer
  • View : TfsPendingChanges

Toolbar_Customize_AddCommand

Now style the buttons accordingly using the “Modify Selection” button.  I’ve set mine to use the Default styling, which is just the button icon.

Enjoy

CustomToolbar_result

Friday, December 11, 2009

Visual Studio Keyboard Katas - II

Hopefully, if you read the last kata and have been trying it out, you may have found yourself needing to use your mouse less for common activities such as opening files and reviewing build status.  This kata builds upon those previous techniques, adding seven more handy shortcuts and a pattern to practice them.

Granted, the last kata was a bit of a white belt pattern: obvious and almost comical, but essential.  In Tae Kwon Do, the yellow belt patterns introduce forward and backward motion, so it seems logical that the next kata introduces rapidly navigating forward and backward through code.

Today’s Shortcut Lesson

Our mnemonic for this set of shortcuts is centered around two keys in the upper-right area of the keyboard: F12 and Minus (-).  The basic combinations for these keys can be modified by using the SHIFT key.

Also note, I’ve introduced another Tool window (CTRL + Window).  The Find Symbols Results is also displayed when you do a Quick Symbol search, which may help explain the “Q”.

F12 Go to Definition
SHIFT + F12 Find all References
CTRL + MINUS Navigate Backward
SHIFT + CTRL + MINUS Navigate Forward
CTRL + W, Q Find Symbols Results Window
CTRL + LEFT ARROW Move to previous word
CTRL + RIGHT ARROW Move to next word

And as an extra Keeno bonus, an 8th shortcut:

CTRL + ALT + DOWN ARROW Show MDI File List

Keyboard Kata

Practice this kata any time you need to identify how a class is used.

  1. Open the Solution Explorer. (CTRL+W, S)
  2. Navigate to a file. (Arrow Keys / Enter)
  3. Select a property or variable (Arrow keys)
  4. Navigate to the Definition for this item (F12)
  5. Find all References of this Type (CTRL+LEFT to move the cursor from the definition to the type, then SHIFT+F12 for references)
  6. Open one of the references (Arrow Keys / Enter)
  7. Open the next reference (CTRL+W,Q / Arrow Keys / Enter)
  8. Open the nth reference (CTRL+W,Q / Arrow Keys / Enter)
  9. Navigate to the original starting point (CTRL + MINUS)
  10. Navigate to the 2nd reference (SHIFT + CTRL + MINUS)
  11. Navigate to any window (CTRL + ALT + DOWN / Arrow Keys / Enter)

submit to reddit

Monday, December 07, 2009

Visual Studio Keyboard Katas

I’ve never spent much time learning keyboard shortcuts for Visual Studio – they’ve always seemed hard to remember with multiple key combinations, some commands have multiple shortcut bindings, and some keystrokes simply aren’t intuitive.  Recently, however, I’ve met a few IDE Ninjas who have opened my eyes on the productivity gains to be had.

The problem with learning keyboard shortcuts is that they can become a self-reinforcing negative loop.  If the secret to learning keyboard shortcuts is using them during your day-to-day activities, the act of stopping work to look up an awkward keystroke interrupts your flow, lowers your productivity, and ultimately results in lost time.  Lost time and distractions put pressure on us to stay focused and complete our work, which further discourages us from stopping to learn new techniques, including those that would ultimately speed us up.  Oh, the irony.

To break out that loop, we need to:

  • learn a few shortcuts by associating them with some mnemonics; and then
  • learn a few exercises that we can inject into daily coding flow

As an homage to the Code Katas cropping up on the internets, this is my first attempt at a Keyboard Kata. 

The concept of the “kata” is taken from martial arts, where a series of movements are combined into a pattern.  Patterns are ancient, handed down from master to student over generations, and are a big part of martial art exams.  They often represent a visualization of defending yourself from multiple attackers, with a focus on technique, form, and strength.  The point is that you repeat them over and over until you master them and they become instinctive muscle memory.  Having done many years of Tae Kwon Do, many years ago, I still remember most of my patterns to this date.  Repetition is a powerful thing.

A note about my Visual Studio environment:  I’m using the default Visual Studio C# keyboard scheme in Visual Studio 2008.  I’ve unpinned all of my commonly docked windows so that they auto-hide when not in use.  Unpinning your tool windows not only gives you more screen real estate, but it encourages you to use keyboard sequences to open them.

Today’s Shortcut Lesson

In order to help your retention for each lesson, I’m going to limit what you need to remember to seven simple shortcuts.  Read through the shortcuts, try out the kata, and include it in your daily routine -- memorize them and let them become muscle memory.  I hope to post a bunch of Katas over the next few weeks.

Tip: You’ll get even better retention if you say the shortcuts out loud as you do them.  You’ll feel (and sound) like a complete dork, but it works.

Tool Windows (CTRL+W, …)

Visual Studio’s keyboard scheme does have some reason behind its madness, where related functionality is grouped under similar shortcuts.  The majority of the tool windows are grouped under CTRL+W.  If it helps, think CTRL+WINDOW.

Here are a few of the shortcuts for Tool Windows:

CTRL+W, S Solution Explorer
CTRL+W, P Properties
CTRL+W, O Output Window
CTRL+W, E Errors
CTRL+W, C Class View


Note that the ESC key will put focus in the currently opened document and auto-hide the current tool window.

Build Shortcuts

F6 or CTRL+SHIFT+B Build Solution
SHIFT+F6 Build Project

Opening a Solution Kata

So here is the kata.  Try this pattern every morning after you open a solution file.

  1. Open the Solution Explorer.
  2. Navigate to a file
  3. View its properties
  4. Build the current Project
  5. Build the Solution
  6. Review the Output
  7. Check for build Errors

Extra credit:

  1. Open a file by navigating to it in the solution explorer
  2. Open a file to a specific method in the Class View
  3. View properties of a currently opened file.

submit to reddit

Wednesday, December 02, 2009

NUnit for Visual Studio Addin

I recently stumbled upon this great addin for Visual Studio that uses the Visual Studio Test Adapter pattern to include NUnit tests within Visual Studio as MS Tests.  They appear in the Test List Editor and execute just like MS Tests, including those handy Run and Debug keyboard shortcuts I described in my last post.

Since they operate as MS Tests, the project requires some additional metadata in the csproj file in order for Visual Studio to recognize the project as a Test library.  My last post has the details.
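For reference, the extra metadata is the ProjectTypeGuids element. The GUIDs below are the standard identifiers for a Test project and a C# project; treat the exact snippet as a sketch and check it against your own project file:

```xml
<!-- Marks a C# project as a Test project so Visual Studio loads the test tooling -->
<PropertyGroup>
  <ProjectTypeGuids>{3AC096D0-A1C2-E12C-1390-A8335801FDAB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
</PropertyGroup>
```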

Curious to see how far the addin could act as a stand-in for NUnit, I fired up Visual Studio, a few beers, and the NUnit attribute documentation to put it through its paces.  I’ve compiled my findings in the table below.

In all fairness, there are a lot of attributes in NUnit, some of these you probably didn’t know existed.

NUnit Attribute Supported Comments
Category No Sadly, the addin does not register a new column definition for Category.  Though this feature is not tied to any functional behavior, it would be greatly welcomed to improve upon Visual Studio’s Test Lists.
Combinatorial Yes  
Culture / SetCulture No Tests that would normally be excluded by NUnit fail.
Datapoint / Theory No Test names do not match NUnit runtime.  All Datapoints produce result Not Runnable in the Test Results
Description No Value does not appear in the Test List Editor
Explicit No Explicit Tests are executed and appear as Enabled in the Test List Editor
ExpectedException Yes  
Ignore Partial Ignored tests are excluded from the Test List Editor, so they are ignored, but they do not appear as Enabled = False.
MaxTime / Timeout Partial Functions properly though supplied setting does not appear in the Timeout column in the Test List Editor
Platform No Tests are executed regardless of the specified platform.
Property - Custom properties do not appear in the output of the TRX file, which is where I’m assuming they would appear.  Not entirely sure if the schema would support custom properties however.
Random No Tests are generated, though the names contain only numbers.  Executing these tests produce the result Not Runnable in the Test Results.
Range No Tests are generated, though the names contain only numbers.  Executing these tests produce the result Not Runnable in the Test Results.
Repeat Yes  
RequiredAddin - Not tested.
RequiresMTA / RequiresSTA / RequiresThread Yes  
Sequential Yes  
Setup / Teardown Yes  
SetupFixture No Setup methods for a given namespace do not execute.
Suite - Not tested (requires command-line switch)
TestFixtureSetup / TestFixtureTeardown Yes  
Test Yes Of course!
TestCase Yes Tested different argument types (int, string), TestName, ExpectedException.
TestCaseSource Yes  

There are quite a few No’s in this list, but the major players (Test, Setup/Teardown, TestFixtureSetup/Teardown) are functional.  I’m actually pleased that NUnit 2.5.2 features such as parameterized tests (TestCase, TestCaseSource) and Combinatorial / Sequential / Values are in place, as well as former addin features that were bundled into the framework (MaxTime / Repeat).

With respect to the malformed test names and non-runnable tests for the Theory / Range / Random attributes, hopefully this is a small issue that can be resolved.  The cosmetic issues with Ignore / Description / Category don’t pose any major concerns, though fixing them would be a large win in terms of full compatibility with the MS Test user interface and features.

I’ve never used the SetupFixture or culture attributes, so I’m not losing much sleep over these.

However, the main issue for me is that Explicit tests are always executed.  I’ve worked on many projects where a handful of tests either brought down the build server or couldn’t be run with other tests.  Rather than solve the problem, developers tagged the tests as Explicit – they work, but you’d better have a good reason to be running them.

Hats off to the NUnitForVS team.

submit to reddit

Friday, July 11, 2008

Automate Visual Studio from external tools

While cleaning up a code monster, a colleague and I were looking for ways to dynamically rebuild all of our web-services as part of a build script or utility, as we have dozens of them and they change somewhat frequently.  In the end, we decided that we didn't necessarily need support for modifying them within the IDE and could just generate them using the WSDL tool.

However, while I was researching the problem I stumbled upon an easy way to drive Visual Studio without having to write an addin or macro; useful for one-off utilities and hare-brained schemes.

Here's some ugly code, just to give you a sense for it.

You'll need references to:

  • EnvDTE - 8.0.0.0
  • VSLangProj - 7.0.3300.0
  • VSLangProj80 - 8.0.0.0
namespace AutomateVisualStudio
{
    using System;
    using EnvDTE;
    using VSLangProj80;

    public class Utility
    {
        public static void Main()
        {
            string projectPath = @"C:\Demo\Empty.csproj";

            // Launch a hidden instance of Visual Studio 2005 via COM automation
            Type type = Type.GetTypeFromProgID("VisualStudio.DTE.8.0");
            DTE dte = (DTE) Activator.CreateInstance(type);
            dte.MainWindow.Visible = false;

            // Projects can't be manipulated directly; load the project into a throwaway solution
            dte.Solution.Create(@"C:\Temp\", "tmp.sln");
            Project project = dte.Solution.AddFromFile(projectPath, true);

            // Cast to the VS2005-specific project type to get at web references
            VSProject2 projectV8 = (VSProject2) project.Object;
            if (projectV8.WebReferencesFolder == null)
            {
                projectV8.CreateWebReferencesFolder();
            }

            ProjectItem item = projectV8.AddWebReference("http://localhost/services/DemoWS?WSDL");
            item.Name = "DemoWS";

            project.Save(projectPath);
            dte.Quit();
        }
    }
}

Note that Visual Studio doesn't allow you to manipulate projects directly; you must load your project into a solution.  If you don't want to mess with your existing solution file, you can create a temporary solution and add your existing project to it.  And if you don't want to clutter up your disk with temporary solution files, just don't call the Save method on the Solution object.

If you had to build a Visual Studio utility, what would you build?

submit to reddit

Friday, December 21, 2007

Debugging WizardExtensions for Visual Studio Templates

As per my previous post, this exercise would probably be much easier if I used the Guidance Automation Toolkit, but in the spirit of Twelve Days of Code, I promised to boldly venture into areas I normally don't go. I decided that I wanted to try out a WizardExtension so that I could compare the experience with the Guidance Automation Toolkit. So I created a new project and added the following references:

  • EnvDTE
  • EnvDTE80
  • Microsoft.VisualStudio.TemplateWizardInterface
  • System.Windows.Forms

The Visual Studio Template documentation says you need to sign your assembly and install it into the GAC, but that's crazy. Rather than jumping through hoops, I found a handy forum post describing how Visual Studio follows the standard assembly probing sequence, so the assembly just needs to be in a place that the devenv.exe process can find it. Signing and installing into the GAC is simply a security measure. I didn't want to dump my custom assembly in with Visual Studio's assemblies (where I would forget about it) so I created a custom folder in %program files%\Microsoft Visual Studio 8\Common7\IDE and added it to the probing path in devenv.exe.config.

To enable debugging for my custom wizard-extension, I use two Visual Studio instances: one for the wizard-extension, the other for testing the template. Here are the steps involved:

  • Add the assembly and class name to your ProjectGroup.vstemplate file:
<WizardExtension>
  <Assembly>Experiments.TemplateWizard</Assembly>
  <FullClassName>Experiments.TemplateWizard.CustomizeProjectNameWizard</FullClassName>
</WizardExtension>
  • Zip up the updated template and copy it into the appropriate Visual Studio Templates folder.
  • Compile the wizard-extension assembly and copy it and its pdb to a path where visual studio can find it
  • Launch a new instance of Visual Studio
  • Switch back to the other visual studio instance, attach to the "devenv" process (the one that says it's at the start page) and set your break-points
  • Switch back to the new instance of Visual Studio and start the template that contains your wizard extension
  • debugging goodness!!
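For reference, the probing entry mentioned above lives under the runtime section of devenv.exe.config. This is a sketch, assuming the custom folder is named CustomWizards:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- CustomWizards is a hypothetical folder name, relative to devenv.exe -->
      <probing privatePath="CustomWizards" />
    </assemblyBinding>
  </runtime>
</configuration>
```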

Well, at least I saved myself the effort of signing, etc. This exercise showed that very little is actually done at the ProjectGroup level of a Multi-Project template: the RunStarted is called, followed by ProjectFinishedGenerating method. The biggest disappointment is that the project parameter in the ProjectFinishedGenerating is null. This is probably because the item being created is a Solution, not a project.

The last ditch (seriously, ditch!) is to cast the automationObject passed into RunStarted to _DTE, and the work through COM interop to manage the Solution. That sounds romantic.
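That last-ditch approach would look something like the sketch below. The class name is hypothetical and the cast assumes the template is running inside Visual Studio, but the IWizard interface members are as the template wizard framework defines them:

```csharp
using System.Collections.Generic;
using EnvDTE;
using Microsoft.VisualStudio.TemplateWizard;

public class CustomizeProjectNameWizard : IWizard
{
    public void RunStarted(object automationObject,
        Dictionary<string, string> replacementsDictionary,
        WizardRunKind runKind, object[] customParams)
    {
        // The automation object is the DTE instance hosting the template
        DTE dte = (DTE)automationObject;

        // From here you can walk dte.Solution to rename or re-arrange projects
        string solutionName = dte.Solution.FullName;
    }

    // Empty implementations for the remaining IWizard members
    public void ProjectFinishedGenerating(Project project) { }
    public void ProjectItemFinishedGenerating(ProjectItem projectItem) { }
    public void BeforeOpeningFile(ProjectItem projectItem) { }
    public void RunFinished() { }
    public bool ShouldAddProjectItem(string filePath) { return true; }
}
```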

submit to reddit

Thursday, December 20, 2007

Bundling Visual Studio templates for distribution

Microsoft's done a fairly good job with the packaging of Visual Studio templates. Simply:

  1. Create an xml file that adheres to the Visual Studio Content Installer Schema Reference
  2. Rename the xml with a "vscontent" extension
  3. Place the vscontent file and your template zip into another zip file
  4. Rename that zip file with a "VSI" extension.
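A minimal vscontent file might look like the sketch below; the file name, display name and description are placeholders:

```xml
<VSContent xmlns="http://schemas.microsoft.com/developer/vscontent/2005">
  <Content>
    <!-- FileName points at the template zip bundled alongside this file -->
    <FileName>MyTemplate.zip</FileName>
    <DisplayName>My Project Template</DisplayName>
    <Description>A starter project structure.</Description>
    <FileContentType>VSTemplate</FileContentType>
    <ContentVersion>1.0</ContentVersion>
  </Content>
</VSContent>
```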

Now, when you double click the VSI file it runs a wizard that installs your template into the appropriate Visual Studio Template folder.

submit to reddit

Wednesday, December 19, 2007

Visual Studio 2005 Multi-Project Templates - a waste of time?

As part of my twelve-days-of-code, I'm tackling a set of small projects geared towards simple project automation. I've discovered in recent projects that although the road is always paved with good intentions, other tasks, emergencies and distractions always prevent you from accomplishing what seem to be the most minor tasks. So when starting out on a new task, we always cut corners with the intention of returning to these trivial tasks whenever we find the time, or when they become justified in our client's eyes. However, if we started out with these things done for us, no one would question their existence or worry about a catch-up penalty; we would just accept these things as best-practice.

Visual Studio Project templates are interesting, though my first encounters with them suggest they miss the mark. For my projects, I find the effort isn't about creating the project, it's about creating the overall solution: project libraries, web sites, test harnesses, references to third-party libraries and tools, build-scripts, etc. Visual Studio supports the concept of "Multi-Project Templates", which are limited (see below), but I suspect that the Guidance Automation Extensions might fill in the gaps.

Visual Studio supports two types of templates within the IDE, and a third type which must be stitched together using XML. The first type is the "Item Template", which covers single files that can be included in any project. I'm focusing more on Project templates and Multi-Project templates.

Within Visual Studio, the concept of a project template is extremely easy: you simply create the project the way you like and then choose the "Export Templates..." option from the File menu. The project and its contents are published as a ZIP file in "My Documents\Visual Studio 2005\My Exported Templates". A big plus on the template structure is that all the files support parameterization, which means you can decorate the exported files with keywords that are dynamically replaced when the template is created by the user. The export wizard takes care of most of the keyword substitution for you, such that root namespaces in all files will match the name of the user's solution. With this in mind, a Project Template is "sanitized" and waiting for your client to adopt your structure with their name.
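For example, an exported class file has its namespace replaced with a built-in template parameter. A sketch of what an exported file looks like ($safeprojectname$ is one of the standard parameters; the class name is just an example):

```csharp
// $safeprojectname$ is substituted with the user's project name at creation time
namespace $safeprojectname$
{
    public class ExampleService
    {
    }
}
```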

Multi-Project Templates stitch multiple Project Templates together by using a master xml-based template file. These templates can't be created using the IDE, but you can create a solution and then export each project out as a Project Template, then following this handy MSDN article and the Template Schema reference, you can quickly parcel together a master template.
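A master template is a vstemplate of Type ProjectGroup whose ProjectTemplateLink elements point at the individual exported templates. A sketch, with illustrative names and paths:

```xml
<VSTemplate Version="2.0.0" Type="ProjectGroup"
            xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
  <TemplateData>
    <Name>My Solution Template</Name>
    <Description>Core library, web site and test harness.</Description>
    <ProjectType>CSharp</ProjectType>
  </TemplateData>
  <TemplateContent>
    <ProjectCollection>
      <!-- Each link references a regular single-project template -->
      <ProjectTemplateLink ProjectName="MyApp.Core">Core\MyTemplate.vstemplate</ProjectTemplateLink>
      <ProjectTemplateLink ProjectName="MyApp.Tests">Tests\MyTemplate.vstemplate</ProjectTemplateLink>
    </ProjectCollection>
  </TemplateContent>
</VSTemplate>
```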

However, there are a few really nasty limitations with multi-project templates. The biggest issue is that the project names cannot be parameterized, so the template adopts the names that are defined in your configuration file. As a result, the only thing you can really customize is the name of the solution. I was completely baffled by this: I thought I must be doing something wrong. However, after a few minutes of googling, others had come to the exact same conclusion.

Fortunately, the template system supports a Wizard framework, which would allow you to write some code to dynamically modify the solution. Unfortunately, the code for this would have to be strong-named and installed in the GAC. I'm tempted to wade into this, but I fear that I might be better off looking at the Guidance Automation Toolkit.

submit to reddit

Monday, December 17, 2007

Visual Studio Templates - Export Template not available

So I started into my foray of the Twelve Days of Code, focusing on a "Software Automation" theme. First stop: Visual Studio Project templates. I've played with these templates before, and because Visual Studio makes them considerably easy to create, this first stop should have been an easy one. However, I hit an interesting problem with Visual Studio that slowed me down.

The steps to create a template are straightforward: you create the item and then use the "Export Template" wizard in the "File" menu. However, the "Export Template..." option did not appear in the File menu.

I recently got a new laptop, and had to reinstall everything from scratch. At first I thought it was because IT only installed Visual Studio Professional instead of the Enterprise version. But there have been some other peculiarities, for example, the "Debug" tool bar was missing crucial items like "Step In", and "Step Out".

The culprit was that I had installed SQL Server 2005 after Visual Studio. Because they share the same shell (sounds a lot like the Composite UI Application Block), SQL Server had changed the default settings for Visual Studio.

To fix:

  1. Select "Import and Export Settings..." from the Tools menu.
  2. Choose the option to "Reset all Settings."
  3. Choose the option to save your current settings, just in case.
  4. Pick an option that doesn't include "SQL Server". I chose the "Visual C# Development Settings"
  5. Finish

Perhaps the software-automation tip here is to configure your settings with defaults for your team, export them out, and share the vssettings file (XML) with the team.