Bamboo is my build server of choice because I find it simple to set up, and it has great integration with the rest of the Atlassian stack, such as our JIRA and Bitbucket Server instances.

Bamboo has had native support for MSBuild-based projects for ages, but with dotnet build being the new sexiness, I wanted to get my CI workflow up and running for my .NET Core applications.

Now there are quite a few challenges to face when setting up a CI build:

1. Versioning
2. Building
3. Testing
4. Deployment

I decided to tackle these things in stages.

### Versioning

In the pre-project.json world, if you wanted to version a set of projects within the same solution together, you could achieve that either by generating something like an AssemblyInfo.cs file at build time, or by using a SharedAssemblyInfo.cs link approach whereby you manually set the numbers for the entire solution.

Currently, in the project.json world this isn't possible because the [Assembly*Version] attributes are generated by dotnet-build. You might be able to add these manually yourself, but I haven't experimented with that.

So let's look at an example. Here is one library, MyLibrary.Abstractions:

{
  "version": "1.0.0-*",
  "dependencies": { },
  "frameworks": {
    "netstandard1.6": { }
  }
}


And here's my implementation library, MyLibrary:

{
  "version": "1.0.0-*",
  "dependencies": {
    "MyLibrary.Abstractions": "1.0.0-*"
  },
  "frameworks": {
    "netstandard1.6": { }
  }
}


Right off the bat, I can see a couple of issues: the version number is fixed into the version key, and the dependency version of the abstractions library is also fixed.

The version string 1.0.0-* is significant in that it pins an exact Major.minor.patch number, but allows matching on any pre-release string. What this means is that when a build occurs, dotnet-build generates a pre-release string (because of the presence of -*), which is a timestamp. This aligns the version numbers of both the MyLibrary and MyLibrary.Abstractions packages. Prior to RC2, you could simply do this:

"dependencies": {
  "MyLibrary.Abstractions": ""
}


This is no longer possible, so if I need to version both components together, I need to do something different. First, I need to tackle that version key and have it carry the same value in both project.json files.

#### Setting the version before project.json

I don't really believe in the term "calculating a version", because that implies some sort of formulaic approach to determining the version components.

Unless you had some awesome code-analysis tool that could compare your codebase before and after a commit to determine the type of changes and how they affect your public API, the choice of Major.minor.patch has to rely solely on the developer, because only they know the intent of their change. To this end, I decided to take an approach similar to GitVersion's GitVersion.yaml file, where I express the part of the version number I care about (Major.minor.patch) myself and generate the pre-release string from the branch/commit information. I also needed to be able to surface this information in Bamboo itself so I can attribute it to future deployments.

For this, I define a simple text file, build.props:

versionMajor=1
versionMinor=0
versionRev=0


This file would be committed to source control so it can be shared with other devs (to align versions) and the CI server.

Next, I use my branch information to determine the pre-release string (if any), so for instance:

• If the branch is not release, I will generate a pre-release string.
• If we are building locally, the pre-release string is simply -<branch-name>, e.g. -master, or -feature-A
• If we are building on the CI server, which drops packages into a NuGet feed, we include the total commit count as -<branch-name>-<commit-count>. I can't take advantage of the +build metadata yet because our deployment server (Octopus Deploy) targets an older version of NuGet. I use the commit count and not the build number because multiple builds of the same commit produce the same output, so they should carry the same version number.
• If the branch is release, I will not generate a pre-release string.

This means I can generate version numbers such as 1.0.0 (release branch), 1.0.0-master-1 (master branch on CI server), 1.0.0-feature-A (feature/A branch on a local machine).

I wrap up the logic for this version number generation into a Powershell script named version.ps1. This script generates the version number and writes it out to a local file named version.props. This version information is then stamped into each project.json file.

version=1.0.0
semanticVersion=1.0.0-master
prerelease=master
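To make those branch rules concrete, the logic can be sketched in a few lines. This is an illustrative Python sketch only; the real implementation lives in the version.ps1 Powershell script, and the function name here is hypothetical:

```python
def generate_version(major, minor, rev, branch, local=True, commit_count=None):
    """Combine the build.props numbers with branch information to
    produce a semantic version, following the branch rules above."""
    version = "{0}.{1}.{2}".format(major, minor, rev)

    if branch == "release":
        # Release builds carry no pre-release string.
        return version

    # Branch names such as feature/A become feature-A.
    prerelease = branch.replace("/", "-")

    if not local and commit_count is not None:
        # CI builds append the commit count, e.g. master-1.
        prerelease = "{0}-{1}".format(prerelease, commit_count)

    return "{0}-{1}".format(version, prerelease)
```

This reproduces the examples above: a release branch gives 1.0.0, master on the CI server with one commit gives 1.0.0-master-1, and feature/A locally gives 1.0.0-feature-A.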


#### Handling dependency versioning

We still haven't solved how we update the dependency versions in project.json for projects in the same solution. The truth is, we don't. Right at the start, we simply change the dependency version from a string to an object (a great tip from Andrew Lock):

"dependencies": {
  "MyLibrary.Abstractions": { "target": "project" }
}


This allows the version resolution to match any version. It's not a perfect approach; in fact, the compiler explicitly warns about a version mismatch, but as these are projects in the same solution being versioned together, that is a warning I am happy to put up with. You wouldn't apply this to dependencies outside of the current solution; really, these are project-to-project references only.

### Building

Now, building my solution could be as easy as dotnet build **/project.json, but the build process is a bit more involved: we have to stamp our version information in (detailed above), as well as run the test and pack commands to prepare our outputs. Enter Cake.

I've been following Cake for a while because I've honestly struggled with other build systems, such as Fake, PSake, etc. I'm a C# developer and Cake for me is a breeze because it presents a DSL that you write in C#, my language of choice! Cake is also extensible, so that was my point of entry for handling my version stamping. I first define a task named Version:

Task("Version")
    .Does(() =>
{
    if (Bamboo.IsRunningOnBamboo)
    {
        // MA - We are running a CI build - so we need to make sure we execute the script with -local $false
        StartPowershellFile("./version.ps1", args =>
        {
            args.Append("local", "$false");
            args.Append("branch", EnvironmentVariable("bamboo_planRepository_branchName"));
        });
    }
    else
    {
        StartPowershellFile("./version.ps1", args => args.Append("local", "$true"));
    }

    // MA - Read the generated version information back in
    // (version, semanticVersion and prerelease are script-level variables).
    var lines = System.IO.File.ReadAllLines("./version.props");

    foreach (string line in lines)
    {
        if (line.StartsWith("version"))
        {
            version = line.Substring("version=".Length).Trim();
        }
        else if (line.StartsWith("semanticVersion"))
        {
            semanticVersion = line.Substring("semanticVersion=".Length).Trim();
        }
        else if (line.StartsWith("prerelease"))
        {
            prerelease = line.Substring("prerelease=".Length).Trim();
        }
    }

    Console.WriteLine("Version: {0}", version);
    Console.WriteLine("SemanticVersion: {0}", semanticVersion);
    Console.WriteLine("PreRelease: {0}", prerelease);

    DotNetCoreVersion(new DotNetCoreVersionSettings
    {
        Files = GetFiles("**/project.json"),
        Version = semanticVersion
    });
});


The last method call is the key part: once I've executed my versioning script, I read the version number and use a custom Cake extension I've built, DotNetCoreVersion, to load each target project.json as a JObject, set the version key, and write them back out again.

Now I can perform my build using another task Build:

Task("Build")
    .Does(() =>
{
    // MA - Build the libraries
    DotNetCoreBuild("./src/**/project.json", new DotNetCoreBuildSettings
    {
        Configuration = configuration
    });

    // MA - Build the test libraries
    DotNetCoreBuild("./tests/**/project.json", new DotNetCoreBuildSettings
    {
        Configuration = configuration
    });
});


Cake has built-in methods for building .NET Core applications, so that made it a lot easier! On the Bamboo side of things, Cake is bootstrapped by another Powershell script, build.ps1, so thanks to Bamboo's native Powershell script integration, we simply execute our build script.

### Testing

Although there is now support for both NUnit and MSTest, the best test library for .NET Core apps is currently Xunit, and that's purely a side-effect of the Microsoft team favouring Xunit during development. We have a problem here: Bamboo doesn't understand Xunit test result XML. Luckily, there exists an XSLT for transforming from Xunit to NUnit, which Bamboo does understand.

We wrap this up in our Cake build script:

Task("Test")
    .WithCriteria(() => HasArgument("test"))
    .Does(() =>
{
    var tests = GetFiles("./tests/**/project.json");
    foreach (var test in tests)
    {
        string projectFolder = System.IO.Path.GetDirectoryName(test.FullPath);
        string projectName = projectFolder.Substring(projectFolder.LastIndexOf('\\') + 1);
        string resultsFile = "./test-results/" + projectName + ".xml";

        DotNetCoreTest(test.FullPath, new DotNetCoreTestSettings
        {
            ArgumentCustomization = args => args.Append("-xml " + resultsFile)
        });

        // MA - Transform the result XML into NUnit-compatible XML for the build server.
        XmlTransform("./tools/NUnitXml.xslt", resultsFile, "./test-results/NUnit." + projectName + ".xml");
    }
});


Now that we are outputting NUnit test results XML, we can read that information in during a Bamboo build plan and surface the test results in the interface. This also means that builds can now fail because of test failures, which is what we want.

### Deployments

Bamboo does have a built-in deployment mechanism, and for our internal libraries we utilise this to push our packages into one of two NuGet feeds:

• If it is a stable build from our release branch, it goes into the stable NuGet feed. These do not deploy automatically, but they can easily be deployed with the push of a button (continuous delivery).
• If it is a build from our master branch, it is automatically pushed to our volatile NuGet feed (continuous deployment).

We use ProGet by Inedo, as it is a superbly stable, multi-feed package host which is easy to set up and quick. By deploying our packages to these feeds, everything stays internal to our development environment and we can quickly start using our updated packages in our other projects. If we need to, we can quickly spin up a project-specific feed, or perhaps a branch-specific feed, and deploy different versions of our code for different clients/scenarios.

One of the last steps of the build script, is to pack everything together:

Task("Pack")
    .WithCriteria(() => HasArgument("pack"))
    .Does(() =>
{
    var projects = GetFiles("./src/**/project.json");
    foreach (var project in projects)
    {
        // MA - Pack the libraries
        DotNetCorePack(project.FullPath, new DotNetCorePackSettings
        {
            Configuration = configuration,
            OutputDirectory = "./artifacts/"
        });
    }
});


The dotnet-pack tool generates our NuGet packages for us, both the binaries and the symbols. ProGet can host both of these, so we just ship them all to the ProGet API and it handles the rest for us. This deployment step is handled as a Bamboo deployment project. For each module in our framework, we have two deployment plans: the first is the Volatile plan, which uses continuous deployment to drop new packages into our volatile feed; the second is our Stable plan, which (when manually triggered) deploys to our stable feed.

We need to make sure the version information is carried through to the deployment plan, so to tackle that, in the source Bamboo build plan we read in the contents of our generated version.props file:

The "Inject Bamboo variables" task allows us to read files in <key>=<value> format and append them as Bamboo variables. In this instance, we read in the version number and add it as the bamboo.props.semanticVersion variable. The variables need to be made available to the result, otherwise we can't use them later.
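The version.props format is the same trivial <key>=<value> layout our Cake Version task parses; as an illustrative sketch of that parsing (Python, with a hypothetical parse_props name, not part of the actual build):

```python
def parse_props(text):
    """Parse <key>=<value> lines (e.g. version.props) into a dict,
    ignoring any lines without an equals sign."""
    props = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props
```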

Configuring the release version:

And that's pretty much it! Obviously, this is an approach that works well for me; it may not suit your needs, but luckily there are many ways of achieving the same thing. This will all likely need to change anyway, as the Microsoft team are busily migrating back to MSBuild, which means we may be able to use more familiar methods of generating AssemblyInfo.cs files again.

The source files for the different components are available as a Gist: https://gist.github.com/Antaris/8ad52a96e0f2d9f682d1cd6342c44936

Let me know what you think.

The new ASP.NET vNext platform (ASP.NET 5) takes advantage of the new Roslyn compiler infrastructure. Roslyn is a managed-code implementation of the .NET compiler, but in reality it is so much more than that. Roslyn is the compiler-as-a-service, realising a set of services that have long been locked away. With this new technology, we have a new set of APIs which allow us to understand a lot more of the code we are writing.

In ASP.NET vNext, the whole approach to the project system has changed, creating a leaner project system built on the new Roslyn compiler services. The team have enabled some new scenarios with this approach, and one quite exciting scenario is meta-programming: that is, developing programs that understand other programs, or in our instance, writing code to understand our own code and update/modify our projects at compile time. Meta-programming with your projects can be achieved using the new ICompileModule (github) interface, which the Roslyn compiler can discover at compile time and utilise both before and after your compilation:

public interface ICompileModule
{
    void BeforeCompile(BeforeCompileContext context);

    void AfterCompile(AfterCompileContext context);
}


The interesting thing about how ICompileModule is used is that it is included as part of your own target assembly, and can act on the code within that assembly itself.

##### Example project

Let's look at a project structure:

/project.json
/compiler/preprocess/ImplementGreeterCompileModule.cs
/Greeter.cs


This is a very much simplified project, and what we are going to get it to do is implement the body of a method on our Greeter class:

public class Greeter
{
    public string GetMessage()
    {
        // Implement this method.
    }
}


So first things first, we need to make sure we have a reference to the required assemblies, so edit the project.json file and add the following:

{
  "version": "1.0.0-*",
  "dependencies": {
    "Microsoft.CodeAnalysis.CSharp": "1.0.0-*",
    "Microsoft.Framework.Runtime.Roslyn.Abstractions": "1.0.0-*"
  },

  "frameworks": {
    "dnx451": {
      "frameworkAssemblies": {
        "System.Runtime": "4.0.10.0",
        "System.Text.Encoding": "4.0.0.0"
      }
    },
    "dnxcore50": {
      "dependencies": {
        "System.ComponentModel": "4.0.0-*",
        "System.IO": "4.0.10-*",
        "System.Reflection": "4.0.10-*",
        "System.Runtime": "4.0.10-*",
        "System.Runtime.Extensions": "4.0.10-*"
      }
    }
  }
}


I'm not going to talk about the frameworks section and how it works, as there is already a great deal written about this concept.

The Microsoft.CodeAnalysis.CSharp package brings in the Roslyn APIs for working with C# code, and Microsoft.Framework.Runtime.Roslyn.Abstractions is a contract assembly for bridging between your code and the Roslyn compiler when using the DNX. The DNX implementation with Roslyn is what gives us the ability to use these techniques; you can check out the implementation here.

### How does ICompileModule work?

One of the interesting bits about how this all works is that when DNX is building your projects, it actually goes through a couple of stages, and these are very broad descriptions:

1. Discover all applicable source files
2. Convert to SyntaxTree instances
3. Discover all references
4. Create a compilation

Now at this stage, the RoslynCompiler will go ahead and discover any ICompileModules, and if they exist, will create a !preprocess assembly with the same references as your project. It performs the same steps as above (1-4), but with just the code in compiler/preprocess/..., compiles it, and loads the assembly. The next step is to create an instance of the BeforeCompileContext class, which gives us information on the current main project compilation, its references, and syntax trees. When the preprocess assembly types are found, they are instantiated and the BeforeCompile method is executed. At this stage, it sort of feels like Inception.

### Implementing our Compile Module

So, now we have some understanding of how compile modules work, let's use one to implement some code. We start off with our basic implementation:

using System.Diagnostics;
using System.Linq;

using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.Framework.Runtime.Roslyn;

using T = Microsoft.CodeAnalysis.CSharp.CSharpSyntaxTree;
using F = Microsoft.CodeAnalysis.CSharp.SyntaxFactory;
using K = Microsoft.CodeAnalysis.CSharp.SyntaxKind;

public class ImplementGreeterCompileModule : ICompileModule
{
    public void AfterCompile(AfterCompileContext context)
    {
        // NoOp
    }

    public void BeforeCompile(BeforeCompileContext context)
    {

    }
}


When working with syntax trees, I prefer to use shorthand namespace import aliases, so F is SyntaxFactory, T is the CSharpSyntaxTree type, and K is the SyntaxKind type. This shorthand lets me write more terse code which is still quite readable.

So first things first, what do we need to do? Well, we need to find our Greeter class within our compilation, and we can use the context.Compilation.SyntaxTrees collection for this. We're first going to do a little digging to find it:

// Get our Greeter class.
var syntaxMatch = context.Compilation.SyntaxTrees
    .Select(s => new
    {
        Tree = s,
        Root = s.GetRoot(),
        Class = s.GetRoot().DescendantNodes()
            .OfType<ClassDeclarationSyntax>()
            .Where(cs => cs.Identifier.ValueText == "Greeter")
            .SingleOrDefault()
    })
    .Where(a => a.Class != null)
    .Single();


We keep a reference to the tree itself, its root and the matching class declaration, as we'll need these later on. Next up, let's find our GetMessage method within our class:

var tree = syntaxMatch.Tree;
var root = syntaxMatch.Root;
var classSyntax = syntaxMatch.Class;

// Get the method declaration.
var methodSyntax = classSyntax.Members
    .OfType<MethodDeclarationSyntax>()
    .Where(ms => ms.Identifier.ValueText == "GetMessage")
    .Single();


Of course, we're ignoring things like overloads here; these are things you would need to consider in production code, but as an example this is very naive. Now we have our method, we need to implement its body. What I want to create is a simple return "Hello World!"; statement. Now you could shortcut this by using SyntaxFactory.ParseStatement("return \"Hello World!\";");, but let's try building it from scratch:

// Let's implement the body.
var returnStatement = F.ReturnStatement(
    F.LiteralExpression(
        K.StringLiteralExpression,
        F.Literal(@"""Hello World!""")));


So here we are creating a return statement using the SyntaxFactory type. The body of the statement is implemented as the return keyword + a string literal for "Hello World!".

Next up, we need to start updating the syntax tree. The thing to remember at this point is that compilations, syntax trees, and their nodes are immutable. That is, they are read-only structures, so we can't simply add things to an existing tree; we need to create new trees and replace nodes. So with the next couple of lines, let's do that.

// Get the body block
var bodyBlock = methodSyntax.Body;

// Create a new body block, with our new statement.
var newBodyBlock = F.Block(new StatementSyntax[] { returnStatement });

// Get the revised root
var newRoot = (CompilationUnitSyntax)root.ReplaceNode(bodyBlock, newBodyBlock);

// Create a new syntax tree.
var newTree = T.Create(newRoot);


We're doing a couple of things here. We've first obtained the body block of the GetMessage method declaration. Next, we create a new block with our returnStatement. We then need to go back to the root node and tell it to replace the bodyBlock node with the newBodyBlock node. It does this, but returns us a new root node. The original root node is left unchanged, so to finish off, we have to create the new syntax tree from this revised root node.

Lastly, we'll replace the current syntax tree with our new one:

// Replace the compilation.
context.Compilation = context.Compilation.ReplaceSyntaxTree(tree, newTree);


If you build now, even though the Greeter.GetMessage method does not currently have an implementation, it will build fine, because we've now dynamically implemented it using our ImplementGreeterCompileModule.

So our complete implementation looks like this:

using System.Diagnostics;
using System.Linq;

using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.Framework.Runtime.Roslyn;

using T = Microsoft.CodeAnalysis.CSharp.CSharpSyntaxTree;
using F = Microsoft.CodeAnalysis.CSharp.SyntaxFactory;
using K = Microsoft.CodeAnalysis.CSharp.SyntaxKind;

public class ImplementGreeterCompileModule : ICompileModule
{
    public void AfterCompile(AfterCompileContext context)
    {
        // NoOp
    }

    public void BeforeCompile(BeforeCompileContext context)
    {
        // Uncomment this to step through the module at compile time:
        //Debugger.Launch();

        // Get our Greeter class.
        var syntaxMatch = context.Compilation.SyntaxTrees
            .Select(s => new
            {
                Tree = s,
                Root = s.GetRoot(),
                Class = s.GetRoot().DescendantNodes()
                    .OfType<ClassDeclarationSyntax>()
                    .Where(cs => cs.Identifier.ValueText == "Greeter")
                    .SingleOrDefault()
            })
            .Where(a => a.Class != null)
            .Single();

        var tree = syntaxMatch.Tree;
        var root = syntaxMatch.Root;
        var classSyntax = syntaxMatch.Class;

        // Get the method declaration.
        var methodSyntax = classSyntax.Members
            .OfType<MethodDeclarationSyntax>()
            .Where(ms => ms.Identifier.ValueText == "GetMessage")
            .Single();

        // Let's implement the body.
        var returnStatement = F.ReturnStatement(
            F.LiteralExpression(
                K.StringLiteralExpression,
                F.Literal(@"""Hello World!""")));

        // Get the body block
        var bodyBlock = methodSyntax.Body;

        // Create a new body block, with our new statement.
        var newBodyBlock = F.Block(new StatementSyntax[] { returnStatement });

        // Get the revised root
        var newRoot = (CompilationUnitSyntax)root.ReplaceNode(bodyBlock, newBodyBlock);

        // Create a new syntax tree.
        var newTree = T.Create(newRoot);

        // Replace the compilation.
        context.Compilation = context.Compilation.ReplaceSyntaxTree(tree, newTree);
    }
}


I've added a Debugger.Launch() step, which can be useful if you want to step through the compilation process. Simply hit CTRL+SHIFT+B to kick off a build, and attach to the IDE instance.

### Where do we go from here?

There are a myriad of possibilities with this new technology, and one area I am very much interested in is component modularity. In future posts, I'll show you how you can use a compile module to discover modules in your code and/or NuGet packages and generate dependency registrations at compile time.

I've added the code as both a Gist and pushed it to a public repo on GitHub.