
Quote of the day #4 (for Saturday)

“You can’t turn back the clock, but you can wind it up again”


Quote of the day #3

“Our greatest glory is not in never falling, but in rising every time we fall”


Changing Jobs

So I am currently serving out my notice period at one job, and will shortly be starting another. A colleague here has a neat book full of excellent quotes, and I can think of no better way to count down my remaining days than to share a quote a day until I leave. So look out for them.


Cake Build Tool

I don’t know exactly when or where I first came across the Cake build tool, but at the time I made a mental note to look at it in more detail (as I am not a massive fan of MSBuild). That time came and went, and I did nothing about it. Then Cake came across my radar again, so this time I decided to dig into it a bit more.

 

So what is this Cake build tool?

Cake is a build tool that utilizes Roslyn, the .NET compiler-as-a-service. What this means is that you can write your build scripts using the familiar C# syntax that you already know and love.
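
To give a flavour of that, here is a tiny, made-up build.cake fragment (not the example we will work through below) just to show how ordinary C# and Cake's task DSL mix together:

// A made-up fragment purely to show that a Cake script is just C#
var projects = new[] { "Example", "Example.Tests" };

Task("Hello")
    .Does(() =>
{
    foreach(var project in projects)
    {
        // Information(...) is Cake's built-in logging alias
        Information("Hello from " + project);
    }
});

RunTarget("Hello");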

 

Getting started

The best way to get started is to clone the example repo : https://github.com/cake-build/example

The repo contains a simple C# class library and a test project, all within a single solution.

 

image

As you can see, this project is very simple. What we would like to do with it is the following:

  • Clean the solution
  • Restore NuGet packages
  • Build the solution
  • Run the tests
  • And also have the ability to push out a NuGet package (.nupkg file)

Most of this is already available within the example repo: https://github.com/cake-build/example, with the exception of pushing out a NuGet package at the end.

 

What bits do you need to run a Cake build?

So what do you need to provide to run a Cake build? You just need these 2 files:

  • build.ps1 (the bootstrapper, which doesn’t change – grab it from the example repo above)
  • build.cake (your specific build script, which should contain the targets/tasks you need for your build)

 

The .cake file

As build.ps1 is a standard thing I won’t worry about that, but let’s now turn our attention to the build.cake file, which for this post looks like this:

 

// Pull in the NUnit console runner so that the NUnit3 alias below has a runner to use
#tool nuget:?package=NUnit.ConsoleRunner&version=3.4.0


//////////////////////////////////////////////////////////////////////
// ARGUMENTS
//////////////////////////////////////////////////////////////////////

var target = Argument("target", "Default");
var configuration = Argument("configuration", "Release");

//////////////////////////////////////////////////////////////////////
// PREPARATION
//////////////////////////////////////////////////////////////////////

// Define directories.
var buildDir = Directory("./src/Example/bin") + Directory(configuration);

//////////////////////////////////////////////////////////////////////
// TASKS
//////////////////////////////////////////////////////////////////////

Task("Clean")
    .Does(() =>
{
    CleanDirectory(buildDir);
});

Task("Restore-NuGet-Packages")
    .IsDependentOn("Clean")
    .Does(() =>
{
    NuGetRestore("./src/Example.sln");
});

Task("Build")
    .IsDependentOn("Restore-NuGet-Packages")
    .Does(() =>
{
    if(IsRunningOnWindows())
    {
      // Use MSBuild
      MSBuild("./src/Example.sln", settings =>
        settings.SetConfiguration(configuration));
    }
    else
    {
      // Use XBuild
      XBuild("./src/Example.sln", settings =>
        settings.SetConfiguration(configuration));
    }
});

Task("Run-Unit-Tests")
    .IsDependentOn("Build")
    .Does(() =>
{
    NUnit3("./src/**/bin/" + configuration + "/*.Tests.dll", new NUnit3Settings {
        NoResults = true
        });
});


// Packaging: the built .nupkg will be placed in the ./artifacts folder
var nugetPackageDir = Directory("./artifacts");
var nuGetPackSettings = new NuGetPackSettings
{
  OutputDirectory = nugetPackageDir
};

Task("Package")
  .Does(() => NuGetPack("./src/Example/Example.nuspec", nuGetPackSettings));


//////////////////////////////////////////////////////////////////////
// TASK TARGETS
//////////////////////////////////////////////////////////////////////

Task("Default")
    .IsDependentOn("Run-Unit-Tests");

//////////////////////////////////////////////////////////////////////
// EXECUTION
//////////////////////////////////////////////////////////////////////

RunTarget(target);

 

 

There are a couple of concepts worth calling out here:

 

  • We have some top-level arguments/variables
  • We can use the nice C# features we already know
  • We have Tasks just like other build systems. We can make one task depend on another using .IsDependentOn("SomeTask")
  • There is a wide range of built-in aliases we can use, for example the ones below (see the small sketch after this list). These are all prebuilt items in the Cake DSL that we can make use of. There are loads of them; the full list is available here: https://cakebuild.net/dsl/
    • CleanDirectory
    • NUnit3
    • NuGetPack
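
To give a feel for how these aliases compose, here is a small hypothetical task (it is not part of the example repo, and the paths are made up) that stages the built assemblies and zips them using a few more of the standard aliases:

// Hypothetical task using a few more built-in aliases; the paths are made up
Task("Zip-Binaries")
    .IsDependentOn("Build")
    .Does(() =>
{
    var stagingDir = Directory("./staging");

    // Empty (or create) the staging folder
    CleanDirectory(stagingDir);

    // Copy the built assemblies into the staging folder
    CopyFiles("./src/**/bin/" + configuration + "/*.dll", stagingDir);

    // Zip the staged files into a single archive
    Zip(stagingDir, "./binaries.zip");
});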

 

Have a look at the DSL web site; there are quite a few cool things you can use.

 

image

 

Running the build

So with the build.cake and build.ps1 (bootstrapper) files in place, we would like to run the build. Here is how we do that:

 

1. Open a PowerShell window as Administrator
2. Issue this command in PowerShell: Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
3. Change to the directory containing the .cake file, and issue this command: .\build.ps1
4. You should see some build output, which eventually completes
5. You should also see a tools folder, which the bootstrapper uses to download Cake and the tools the script declares (such as the NUnit console runner)

 

This is the tail end of the build I just ran above

 

image

 

And this is the sort of thing that we should see in the tools folder that the cake build created

 

image

 

Deploying a NuGet

So I stated that I also wanted to be able to push out a NuGet package as a .nupkg file. To do this I need to create the following .nuspec file for the Example project:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>Example</id>
    <version>1.0.0</version>
    <title>Cake Example</title>
    <authors>Sacha Barber</authors>
    <owners>Sacha Barber</owners>
    <licenseUrl>http://github.com/sachabarber</licenseUrl>
    <projectUrl>http://github.com/sachabarber</projectUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Simple Cake Build Tool Example</description>
    <releaseNotes>1st and only release</releaseNotes>
    <copyright>Copyright 2018</copyright>
    <tags>C# Cake</tags>
  </metadata>
  <files>
    <file src="bin\Release\Example.dll" target="lib\net45" />
  </files>
</package>
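
As an aside, much of this metadata can also be supplied (or overridden) from the script itself via NuGetPackSettings, which is handy if you want to stamp something like a CI build number into the version rather than hard-coding it in the .nuspec. This is just a sketch of that idea (the BUILD_NUMBER environment variable is an assumption), not what the example repo does:

// Sketch: override selected nuspec values from the cake script itself
// BUILD_NUMBER is a hypothetical CI environment variable
var packSettings = new NuGetPackSettings
{
    Version         = "1.0." + (EnvironmentVariable("BUILD_NUMBER") ?? "0"),
    ReleaseNotes    = new [] { "Built by Cake" },
    OutputDirectory = Directory("./artifacts")
};

NuGetPack("./src/Example/Example.nuspec", packSettings);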

 

So with that in place we can also try the “Package” task that our build.cake file contains, like this:

 

1. Open a PowerShell window as Administrator
2. Issue this command in PowerShell: Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
3. Issue this command in PowerShell: .\build.ps1 -Target Package

 

After running that we should see an artifacts folder with the following artifact in it:

 

image
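
If you wanted to go one step further and actually publish that .nupkg to a feed, Cake also has a NuGetPush alias. Here is a rough sketch of what such a task might look like; it is not part of the example repo, and the feed URL, package path and API-key environment variable are all placeholders:

// Hypothetical publish task; the feed URL, package path and API key are placeholders
Task("Publish")
  .IsDependentOn("Package")
  .Does(() =>
{
    NuGetPush("./artifacts/Example.1.0.0.nupkg", new NuGetPushSettings
    {
        Source = "https://www.nuget.org/api/v2/package",
        ApiKey = EnvironmentVariable("NUGET_API_KEY")
    });
});

You would then run it in the same way as the Package target above, i.e. .\build.ps1 -Target Publish.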

 

Conclusion

I was pretty happy with this: I went from not using Cake at all to carrying out ALL my requirements in 1 hour, on a train ride with limited WiFi. It just seems to work, and I imagine it would be a good fit for working with something like https://about.gitlab.com/

 

I think I will be looking to use this little build tool a lot more.


MADCAP IDEA PART 4 : PROTOTYPING THE SCREENS

 

Last Time

 

Last time we looked at bringing in the Play Framework (the Scala-based MVC web framework) and making the front end work with Play. This time we will look at the initial prototypes of the screens.

This is my best guess of what they may look like right now, based on my initial requirements, but as with all things once you get into the guts of it, changes will occur.

 

 

Preamble

Just as a reminder, this is part of my ongoing set of posts, which I talk about here: https://sachabarbs.wordpress.com/2017/05/01/madcap-idea/. We will be building up to a point where we have a full app using lots of different technologies, such as:

  • WebPack
  • React.js
  • React Router
  • TypeScript
  • Babel.js
  • Akka
  • Scala
  • Play (Scala Http Stack)
  • MySql
  • SBT
  • Kafka
  • Kafka Streams

 

Mockup tool of choice

I am a big fan of the Balsamiq mockup tool (https://balsamiq.com/). It comes as a stand-alone installed version or as a plugin for JIRA.

Balsamiq provides the following (I am just listing the features I used; there are many more):

 

  • Drag and drop from a wide range of forms, containers and controls
  • Set content for controls (usually using some fancy design-time behavior)
  • Set navigation links
  • Set properties like IsSelected, IsEnabled etc.

 

This is what the Windows-installed Balsamiq desktop version looks like; see how you have many categories of items to choose from:

 

image

 

And here is what I mean by the clever design-time support. This is a data grid that I have double-clicked on, where the text in design mode describes the rendered result of the control:

 

image

 

It really is a very nice tool. Anyway, on with the initial screen designs.

 

Navbar

image

 

This will be a simple react-router / react-bootstrap based navigation bar. There is nothing much more to say about that.

 

 

Login

image

 

This will be a login form which will be validated and submitted to a Play Framework controller for further validation. The Play controller would look up the user details in a MySQL database, and if an entry is found the user is considered logged on. I am keeping it simple here: no OAuth, no JWT, just a simple lookup.

 

Passenger Register

image

 

If the user is a passenger, the information they need to enter to register is different from that required of a driver. As such, there is a specific passenger registration form, which will be validated and sent to a Play controller endpoint for storage in MySQL.

 

Driver Register

image

 

If the user is a driver, we need more information about their vehicle. As such, there is a specific driver registration form, which will be validated and sent to a Play controller endpoint for storage in MySQL.

 

 

Create Job

image

 

Only a passenger will be able to create new jobs. Since I am doing all this work on a single laptop which is ALWAYS in a single location, I am having to SIMULATE the geo-coordinates of a job by accepting the current user’s input for their current position. The passenger/driver users will provide this geo information by clicking on a Google map. The geo-coordinate update will either travel through Kafka Streams –> Akka –> Comet, or may just use Akka –> Comet. I have not fully decided on this part yet.

 

There may only EVER be 1 active job, so if a logged-in passenger tries to create a 2nd job this should cause an error:

 

image

 

View Job

Both passengers and drivers may view an active job. Drivers may “bid for a job” by clicking on the map, providing the job is not already paired with a driver. A driver’s symbol will be a car, and as before the driver will update their geo-coordinates by clicking on the map. As before, the geo-coordinate update will either travel through Kafka Streams –> Akka –> Comet, or may just use Akka –> Comet.

 

image

 

A passenger may inspect a driver’s details and choose to accept the driver, at which point the passenger’s job becomes assigned to the chosen driver.

 

image

 

Drivers that are not allocated to the job will be removed from the map, and only geo updates from the paired passenger/driver will be reflected on the map. 

 

Passenger Completion

 

image

Once a job has been completed (by clicking the “Complete” button) the passenger will be able to rank the driver. This will store the ranking for the driver. This could be stored directly in MySQL, but I want to play with Kafka Streams a bit more, so we use a Kafka publisher –> Kafka Streams –> KTable arrangement to store the state, and then use Kafka Streams interactive queries to get the data out again.

 

 

Driver Completion

image

 

The driver is also able to complete the job from their end (using the “complete” button), and is able to rank the passenger. This will work as described above.

 

View Ranking

image

 

Depending on which way I go with the ranking storage, this will either be a direct MySQL query or a Kafka Streams interactive query over a KTable.

 

Conclusion

This is perhaps the simplest of all the posts in this series, but it is an important one. Next time we will try and statically implement these screens, and the associated routing that goes with them.