C#

Bogus : Simple fake data tool

This is a VERY short post about a new tool that just popped up on my radar. I am quite into good tests, and with testing usually comes data, and it can be quite a bind to create reasonable fake data. Whether you use factories, your own JabberWocky data creator, or a tool such as AutoFixture, the ask is the same: create me some fake data.

Anyway, a tool I had not used before popped up which does the same thing as AutoFixture but looks a bit easier to use. The tool is Bogus.

From the README.md this is what you would do



public enum Gender
{
    Male,
    Female
}

//Set the randomizer seed if you wish to generate repeatable data sets.
Randomizer.Seed = new Random(8675309);

var fruit = new[] { "apple", "banana", "orange", "strawberry", "kiwi" };

var orderIds = 0;
var testOrders = new Faker<Order>()
    //Ensure all properties have rules. By default, StrictMode is false
    //Set a global policy by using Faker.DefaultStrictMode
    .StrictMode(true)
    //OrderId is deterministic
    .RuleFor(o => o.OrderId, f => orderIds++)
    //Pick some fruit from a basket
    .RuleFor(o => o.Item, f => f.PickRandom(fruit))
    //A random quantity from 1 to 10
    .RuleFor(o => o.Quantity, f => f.Random.Number(1, 10));


var userIds = 0;
var testUsers = new Faker<User>()
    //Optional: Call for objects that have complex initialization
    .CustomInstantiator(f => new User(userIds++, f.Random.Replace("###-##-####")))

    //Use an enum outside scope.
    .RuleFor(u => u.Gender, f => f.PickRandom<Gender>())

    //Basic rules using built-in generators
    .RuleFor(u => u.FirstName, (f, u) => f.Name.FirstName(u.Gender))
    .RuleFor(u => u.LastName, (f, u) => f.Name.LastName(u.Gender))
    .RuleFor(u => u.Avatar, f => f.Internet.Avatar())
    .RuleFor(u => u.UserName, (f, u) => f.Internet.UserName(u.FirstName, u.LastName))
    .RuleFor(u => u.Email, (f, u) => f.Internet.Email(u.FirstName, u.LastName))
    .RuleFor(u => u.SomethingUnique, f => $"Value {f.UniqueIndex}")
    .RuleFor(u => u.SomeGuid, f => Guid.NewGuid())

    //Use a method outside scope.
    .RuleFor(u => u.CartId, f => Guid.NewGuid())
    //Compound property with context, use the first/last name properties
    .RuleFor(u => u.FullName, (f, u) => u.FirstName + " " + u.LastName)
    //And composability of a complex collection.
    .RuleFor(u => u.Orders, f => testOrders.Generate(3).ToList())
    //Optional: After all rules are applied finish with the following action
    .FinishWith((f, u) =>
        {
            Console.WriteLine("User Created! Id={0}", u.Id);
        });

var user = testUsers.Generate();
Console.WriteLine(user.DumpAsJson());



Which would give you this JSON output

/* OUTPUT:
User Created! Id=0
 *
{
  "Id": 0,
  "FirstName": "Audrey",
  "LastName": "Spencer",
  "FullName": "Audrey Spencer",
  "UserName": "Audrey_Spencer72",
  "Email": "Audrey82@gmail.com",
  "Avatar": "https://s3.amazonaws.com/uifaces/faces/twitter/itstotallyamy/128.jpg",
  "CartId": "863f9462-5b88-471f-b833-991d68db8c93",
  "SSN": "923-88-4231",
  "Gender": 0,
  "Orders": [
    {
      "OrderId": 0,
      "Item": "orange",
      "Quantity": 8
    },
    {
      "OrderId": 1,
      "Item": "banana",
      "Quantity": 2
    },
    {
      "OrderId": 2,
      "Item": "kiwi",
      "Quantity": 9
    }
  ]
} */
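As an aside, besides the rule-based Faker<T> shown above, Bogus also has a plain Faker facade that is handy for one-off values. This is just a minimal sketch based on my reading of the README; the values shown in the comments are illustrative only.

using System;
using Bogus;

class FacadeExample
{
    static void Main()
    {
        //the facade gives you the same data sets without defining rules first
        var faker = new Faker("en");

        var name  = faker.Name.FullName();       // e.g. "Audrey Spencer"
        var email = faker.Internet.Email();      // e.g. "Audrey82@gmail.com"
        var item  = faker.PickRandom(new[] { "apple", "banana", "orange" });
        var qty   = faker.Random.Number(1, 10);

        Console.WriteLine($"{name} ({email}) ordered {qty} x {item}");
    }
}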

Nice tool, I will be giving this a go.

Even more tools

So after posting this, an old friend from back in my WPF days got hold of me and pointed me at his own tool, which is also definitely worth a look (he's a smart man and I very much like his previous work, so definitely try to have a look if you can).

His library is called Testify, which you can grab from here: http://www.digitaltapestry.net/testify/

Here is what he said to me out of band (he didn't want to add it to this post himself, so as not to appear biased, which you probably would, but hey ho). Anyway, he contacted me offline, I've had a look, and IMHO the more information the better.

Anyway this is what he says of his own library

….does what Bogus does, and more, at http://www.digitaltapestry.net/testify/. My AnonymousData is similar to Bogus and AutoFixture. The focus is on Unit testing, intending to reduce coupling between the tests and the production code, as well as to call out what’s important to the test and what is just necessary to make it work.

Testify also includes other concepts, like Contract Verifiers, Fluent and Extendible Assertions, Compound Assertions and Classification

C#

Update scheduled Quartz.Net job by monitoring App.Config

 

Introduction

So I was back in .NET land the other day at work, where I had to schedule some code to run periodically.

The business also needed this schedule to be adjustable, so that they could ramp it up when things were busier and wind it down when they were not.

This adjusting of the schedule time would be done via a setting in the App.Config, where the App.Config is monitored for changes. If there is a change, we would use the new schedule value from the App.Config to run the job. Ideally the app must not go down to accommodate this change of job schedule time.

There are some good job/scheduling libraries out there, but for this I just wanted to use something lightweight, so I went with Quartz.Net.

It's easy to set up and use, has a fairly nice API, and supports IOC and CRON schedules. In short, it fits the bill.
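Just to set the scene, here is roughly what a bare-bones Quartz.Net job and trigger look like. This is a minimal sketch in the same synchronous 2.x style API used later in this post; HelloJob and the CRON expression are purely illustrative.

using System;
using Quartz;
using Quartz.Impl;

//a hypothetical job, purely for illustration
public class HelloJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        Console.WriteLine("Hello from Quartz.Net at {0}", DateTime.Now);
    }
}

public static class QuartzQuickStart
{
    public static void Run()
    {
        //grab the default scheduler and start it
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<HelloJob>()
            .WithIdentity("helloJob")
            .Build();

        //a CRON schedule : fire at the top of every minute
        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("helloTrigger")
            .WithCronSchedule("0 * * * * ?")
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}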

In a nutshell, this post will simply talk about how you can adjust the schedule of an already-scheduled job. There are also some special caveats that I personally had to deal with in my requirements, which may or may not be an issue for you.

 

Some Weird Issues That I Needed To Cope With

So let me just talk about some of the issues that I had to deal with

The guts of the job code that I run on my schedule is actually writing to Azure Blob Storage and then to Azure SQL DW tables, and as such has several writes to several components, one after another.

So the current job run MUST be allowed to complete in FULL (or fail via exception handling, that's OK too). It would not be acceptable to just stop the Quartz job while there is work in flight.

I guess some folk may be thinking of some sort of transaction here, that must either commit or rollback. Unfortunately that doesn’t work with Azure Blob Storage uploads.

So I had to think of another plan.

So here is what I came up with: I would use a threading primitive, namely an AutoResetEvent, that would control when the Quartz.Net job could be changed to use a new schedule.

If a change in the App.Config is seen, then we know that we "should" be using a new schedule time; however, the scheduled job MAY have work in flight. So we need to wait for that work to complete (or fail) before we can think about swapping the Quartz.Net scheduler time.

So that is what I went for. There are a few other things to be aware of, such as needing threading primitives that work with async/await code. Luckily Stephen Toub from the TPL team has done that for us: asyncautoresetevent

There is also the well-known fact that the FileSystemWatcher class fires events twice: http://lmgtfy.com/?q=filesystemwatcher+firing+twice

So as we go through the code you will see how I dealt with those

The Code

Ok so now that we have talked about the problem, lets go through the code.

There are several NuGet packages I am using to make my life easier: Topshelf, Autofac, Quartz.Net, Rx and SimpleConfig among them.

So let's start with the entry point, which for me is the simple Program class shown below.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Security.Principal;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Autofac;
using SachaBarber.QuartzJobUpdate.Services;
using Topshelf;


namespace SachaBarber.QuartzJobUpdate
{
    static class Program
    {
        private static ILogger _log = null;


        [STAThread]
        public static void Main()
        {
            try
            {
                var container = ContainerOperations.Container;
                _log = container.Resolve<ILogger>();
                _log.Log("Starting");

                AppDomain.CurrentDomain.UnhandledException += AppDomainUnhandledException;
                TaskScheduler.UnobservedTaskException += TaskSchedulerUnobservedTaskException;
                Thread.CurrentPrincipal = new WindowsPrincipal(WindowsIdentity.GetCurrent());
                
                HostFactory.Run(c =>                                 
                {
                    c.Service<SomeWindowsService>(s =>                        
                    {
                        s.ConstructUsing(() => container.Resolve<SomeWindowsService>());
                        s.WhenStarted(tc => tc.Start());             
                        s.WhenStopped(tc => tc.Stop());               
                    });
                    c.RunAsLocalSystem();                            

                    c.SetDescription("Uploads Calc Payouts/Summary data into Azure blob storage for RiskStore DW ingestion");       
                    c.SetDisplayName("SachaBarber.QuartzJobUpdate");                       
                    c.SetServiceName("SachaBarber.QuartzJobUpdate");                      
                });
            }
            catch (Exception ex)
            {
                _log.Log(ex.Message);
            }
            finally
            {
                _log.Log("Closing");
            }
        }
      

        private static void AppDomainUnhandledException(object sender, UnhandledExceptionEventArgs e)
        {
            ProcessUnhandledException((Exception)e.ExceptionObject);
        }

        private static void TaskSchedulerUnobservedTaskException(object sender, UnobservedTaskExceptionEventArgs e)
        {
            ProcessUnhandledException(e.Exception);
            e.SetObserved();
        }

        private static void ProcessUnhandledException(Exception ex)
        {
            if (ex is TargetInvocationException)
            {
                ProcessUnhandledException(ex.InnerException);
                return;
            }
            _log.Log("Error");
        }
    }
}

All this does is host the actual Windows service class for me using Topshelf, where the actual service class looks like this:

using System;
using System.Configuration;
using System.IO;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Xml.Linq;
using Autofac;
using SachaBarber.QuartzJobUpdate.Async;
using SachaBarber.QuartzJobUpdate.Configuration;
using SachaBarber.QuartzJobUpdate.Jobs;
using SachaBarber.QuartzJobUpdate.Services;
//logging
using Quartz;

namespace SachaBarber.QuartzJobUpdate
{
    public class SomeWindowsService
    {
        private readonly ILogger _log;
        private readonly ISchedulingAssistanceService _schedulingAssistanceService;
        private readonly IRxSchedulerService _rxSchedulerService;
        private readonly IObservableFileSystemWatcher _observableFileSystemWatcher;
        private IScheduler _quartzScheduler;
        private readonly AsyncLock _lock = new AsyncLock();
        private readonly SerialDisposable _configWatcherDisposable = new SerialDisposable();
        private static readonly JobKey _someScheduledJobKey = new JobKey("SomeScheduledJobKey");
        private static readonly TriggerKey _someScheduledJobTriggerKey = new TriggerKey("SomeScheduledJobTriggerKey");

        public SomeWindowsService(
            ILogger log,
            ISchedulingAssistanceService schedulingAssistanceService, 
            IRxSchedulerService rxSchedulerService,
            IObservableFileSystemWatcher observableFileSystemWatcher)
        {
            _log = log;
            _schedulingAssistanceService = schedulingAssistanceService;
            _rxSchedulerService = rxSchedulerService;
            _observableFileSystemWatcher = observableFileSystemWatcher;
        }

        public void Start()
        {
            try
            {
                var ass = typeof (SomeWindowsService).Assembly;
                var configFile = $"{ass.Location}.config"; 
                CreateConfigWatcher(new FileInfo(configFile));


                _log.Log("Starting SomeWindowsService");

                _quartzScheduler = ContainerOperations.Container.Resolve<IScheduler>();
                _quartzScheduler.JobFactory = new AutofacJobFactory(ContainerOperations.Container);
                _quartzScheduler.Start();

                //create the Job
                CreateScheduledJob();
            }
            catch (JobExecutionException jeex)
            {
                _log.Log(jeex.Message);
            }
            catch (SchedulerConfigException scex)
            {
                _log.Log(scex.Message);
            }
            catch (SchedulerException sex)
            {
                _log.Log(sex.Message);
            }

        }

        public void Stop()
        {
            _log.Log("Stopping SomeWindowsService");
            _quartzScheduler?.Shutdown();
            _configWatcherDisposable.Dispose();
            _observableFileSystemWatcher.Dispose();
        }


        private void CreateConfigWatcher(FileInfo configFileInfo)
        {
            FileSystemWatcher watcher = new FileSystemWatcher();
            watcher.Path = configFileInfo.DirectoryName;
            watcher.NotifyFilter = 
                NotifyFilters.LastAccess | 
                NotifyFilters.LastWrite | 
                NotifyFilters.FileName | 
                NotifyFilters.DirectoryName;
            watcher.Filter = configFileInfo.Name;
            _observableFileSystemWatcher.SetFile(watcher);
            //FSW is notorious for firing twice see here : 
            //http://stackoverflow.com/questions/1764809/filesystemwatcher-changed-event-is-raised-twice
            //so lets use Rx to Throttle it a bit
            _configWatcherDisposable.Disposable = _observableFileSystemWatcher.Changed.SubscribeOn(
                _rxSchedulerService.TaskPool).Throttle(TimeSpan.FromMilliseconds(500)).Subscribe(
                    async x =>
                    {
                        //at this point the config has changed, start a critical section
                        using (var releaser = await _lock.LockAsync())
                        {
                            //tell current scheduled job that we need to read new config, and wait for it
                            //to signal us that we may continue
                            _log.Log($"Config file {configFileInfo.Name} has changed, attempting to read new config data");
                            _schedulingAssistanceService.RequiresNewSchedulerSetup = true;
                            _schedulingAssistanceService.SchedulerRestartGate.WaitAsync().GetAwaiter().GetResult();
                            //recreate the AzureBlobConfiguration, and recreate the scheduler using new settings
                            ConfigurationManager.RefreshSection("schedulingConfiguration");
                            var newSchedulingConfiguration = SimpleConfig.Configuration.Load<SchedulingConfiguration>();
                            _log.Log($"SchedulingConfiguration section is now : {newSchedulingConfiguration}");
                            ContainerOperations.ReInitialiseSchedulingConfiguration(newSchedulingConfiguration);
                            ReScheduleJob();
                        }
                    },
                    ex =>
                    {
                        _log.Log($"Error encountered attempting to read new config data from config file {configFileInfo.Name}");
                    });
        }

        private void CreateScheduledJob(IJobDetail existingJobDetail = null)
        {
            var azureBlobConfiguration = ContainerOperations.Container.Resolve<SchedulingConfiguration>();
            IJobDetail job = JobBuilder.Create<SomeQuartzJob>()
                    .WithIdentity(_someScheduledJobKey)
                    .Build();

            ITrigger trigger = TriggerBuilder.Create()
                .WithIdentity(_someScheduledJobTriggerKey)
                .WithSimpleSchedule(x => x
                    .RepeatForever()
                    .WithIntervalInSeconds(azureBlobConfiguration.ScheduleTimerInMins)
                )
                .StartAt(DateTimeOffset.Now.AddSeconds(azureBlobConfiguration.ScheduleTimerInMins))
                .Build();

            _quartzScheduler.ScheduleJob(job, trigger);
        }

        private void ReScheduleJob()
        {
            if (_quartzScheduler != null)
            {
                _quartzScheduler.DeleteJob(_someScheduledJobKey);
                CreateScheduledJob();
            }
        }
    }


}

There is a fair bit going on here, so let's list some of the work this code does:

  • It creates the initial Quartz.Net job and schedules it using the values from a custom config section, which are read into an object
  • It watches the config file for changes (we will go through that in a moment) and will wait on the AsyncAutoResetEvent to be signalled, at which point it will recreate the Quartz.Net job

So let's have a look at some of the small helper parts.

This is a simple Rx-based file system watcher. The reason Rx is good here is that you can throttle the events (see this post: FileSystemWatcher raises 2 events).

using System;
using System.IO;
using System.Reactive.Linq;

namespace SachaBarber.QuartzJobUpdate.Services
{
    public class ObservableFileSystemWatcher : IObservableFileSystemWatcher
    {
        private FileSystemWatcher _watcher;

        public void SetFile(FileSystemWatcher watcher)
        {
            _watcher = watcher;

            Changed = Observable
                .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>
                (h => _watcher.Changed += h, h => _watcher.Changed -= h)
                .Select(x => x.EventArgs);

            Renamed = Observable
                .FromEventPattern<RenamedEventHandler, RenamedEventArgs>
                (h => _watcher.Renamed += h, h => _watcher.Renamed -= h)
                .Select(x => x.EventArgs);

            Deleted = Observable
                .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>
                (h => _watcher.Deleted += h, h => _watcher.Deleted -= h)
                .Select(x => x.EventArgs);

            Errors = Observable
                .FromEventPattern<ErrorEventHandler, ErrorEventArgs>
                (h => _watcher.Error += h, h => _watcher.Error -= h)
                .Select(x => x.EventArgs);

            Created = Observable
                .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>
                (h => _watcher.Created += h, h => _watcher.Created -= h)
                .Select(x => x.EventArgs);

            All = Changed.Merge(Renamed).Merge(Deleted).Merge(Created);
            _watcher.EnableRaisingEvents = true;
        }

        public void Dispose()
        {
            _watcher.EnableRaisingEvents = false;
            _watcher.Dispose();
        }

        public IObservable<FileSystemEventArgs> Changed { get; private set; }
        public IObservable<RenamedEventArgs> Renamed { get; private set; }
        public IObservable<FileSystemEventArgs> Deleted { get; private set; }
        public IObservable<ErrorEventArgs> Errors { get; private set; }
        public IObservable<FileSystemEventArgs> Created { get; private set; }
        public IObservable<FileSystemEventArgs> All { get; private set; }
    }
}

And this is a small utility class that will contain the results of the custom config section that may be read using SimpleConfig

namespace SachaBarber.QuartzJobUpdate.Configuration
{
    public class SchedulingConfiguration
    {
        public int ScheduleTimerInMins { get; set; }

        public override string ToString()
        {
            return $"ScheduleTimerInMins: {ScheduleTimerInMins}";
        }
    }
}

Which you read from the App.Config like this

 var newSchedulingConfiguration = SimpleConfig.Configuration.Load<SchedulingConfiguration>();
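For completeness, this is roughly what the matching App.Config section might look like. This is an assumption on my part, following the usual SimpleConfig convention of a section handled by SimpleConfig.Section with attributes matching the properties of the class; the section name lines up with the "schedulingConfiguration" string that gets refreshed further down, and the value 5 is just an example.

<configuration>
  <configSections>
    <!-- SimpleConfig maps this section onto the SchedulingConfiguration class -->
    <section name="schedulingConfiguration" type="SimpleConfig.Section, SimpleConfig" />
  </configSections>

  <schedulingConfiguration scheduleTimerInMins="5" />
</configuration>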

And this is the async/await-compatible AutoResetEvent that I took from Stephen Toub's blog:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace SachaBarber.QuartzJobUpdate.Async
{
    public class AsyncAutoResetEvent
    {
        private static readonly Task Completed = Task.FromResult(true);
        private readonly Queue<TaskCompletionSource<bool>> _waits = new Queue<TaskCompletionSource<bool>>();
        private bool _signaled;

        public Task WaitAsync()
        {
            lock (_waits)
            {
                if (_signaled)
                {
                    _signaled = false;
                    return Completed;
                }
                else
                {
                    var tcs = new TaskCompletionSource<bool>();
                    _waits.Enqueue(tcs);
                    return tcs.Task;
                }
            }
        }

        public void Set()
        {
            TaskCompletionSource<bool> toRelease = null;
            lock (_waits)
            {
                if (_waits.Count > 0)
                    toRelease = _waits.Dequeue();
                else if (!_signaled)
                    _signaled = true;
            }
            toRelease?.SetResult(true);
        }
    }
}

So the last part of the puzzle is: how does the AsyncAutoResetEvent get signalled?

Well, as we said above, we need to wait for any in-progress work to complete first. The way I tackled that was that within the job code that gets run on every Quartz.Net scheduler tick, we check whether we have been requested to swap out the current schedule time; if so, we signal the waiting code via the (shared) AsyncAutoResetEvent, otherwise we just carry on and do the regular job work.

The way we share the AsyncAutoResetEvent between the waiting code and the job code (which signals it) is via a singleton registration in an IOC container. I am using Autofac, which I set up like this, but you could use your own singleton or IOC container of choice.

The trick is to make sure that both classes that need to access the AsyncAutoResetEvent use a single instance.

using System;
using System.Reflection;
using Autofac;
using SachaBarber.QuartzJobUpdate.Configuration;
using SachaBarber.QuartzJobUpdate.Services;
using Quartz;
using Quartz.Impl;

namespace SachaBarber.QuartzJobUpdate
{
    public class ContainerOperations
    {
        private static Lazy<IContainer> _containerSingleton = 
            new Lazy<IContainer>(CreateContainer);

        public static IContainer Container => _containerSingleton.Value;

        public static void ReInitialiseSchedulingConfiguration(
            SchedulingConfiguration newSchedulingConfiguration)
        {
            var currentSchedulingConfiguration = 
                Container.Resolve<SchedulingConfiguration>();
            currentSchedulingConfiguration.ScheduleTimerInMins = 
                newSchedulingConfiguration.ScheduleTimerInMins;
        }
        

        private static IContainer CreateContainer()
        {
            var builder = new ContainerBuilder();
            builder.RegisterType<ObservableFileSystemWatcher>()
                .As<IObservableFileSystemWatcher>().ExternallyOwned();
            builder.RegisterType<RxSchedulerService>()
                .As<IRxSchedulerService>().ExternallyOwned();
            builder.RegisterType<Logger>().As<ILogger>().ExternallyOwned();
            builder.RegisterType<SomeWindowsService>();
            builder.RegisterInstance(new SchedulingAssistanceService())
                .As<ISchedulingAssistanceService>();
            builder.RegisterInstance(
                SimpleConfig.Configuration.Load<SchedulingConfiguration>());

            // Quartz/jobs
            builder.Register(c => new StdSchedulerFactory().GetScheduler())
                .As<Quartz.IScheduler>();
            builder.RegisterAssemblyTypes(Assembly.GetExecutingAssembly())
                .Where(x => typeof(IJob).IsAssignableFrom(x));
            return builder.Build();
        }

        
    }
}

Where the shared instance in my case is this class

using SachaBarber.QuartzJobUpdate.Async;

namespace SachaBarber.QuartzJobUpdate.Services
{
    public class SchedulingAssistanceService : ISchedulingAssistanceService
    {
        public SchedulingAssistanceService()
        {
            SchedulerRestartGate = new AsyncAutoResetEvent();
            RequiresNewSchedulerSetup = false;
        }    

        public AsyncAutoResetEvent SchedulerRestartGate { get; }
        public bool RequiresNewSchedulerSetup { get; set; }
    }
}

Here is the actual job code that checks whether a change in the App.Config has been detected, in which case it signals the waiting code that it may continue.

using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Quartz;

namespace SachaBarber.QuartzJobUpdate.Services
{
    public class SomeQuartzJob : IJob
    {
        private readonly ILogger _log;
        private readonly ISchedulingAssistanceService _schedulingAssistanceService;

        public SomeQuartzJob(
            ILogger log, 
            ISchedulingAssistanceService schedulingAssistanceService)
        {
            _log = log;
            _schedulingAssistanceService = schedulingAssistanceService;
        }


        public void Execute(IJobExecutionContext context)
        {
            try
            {
                ExecuteAsync(context).GetAwaiter().GetResult();
            }
            catch (JobExecutionException jeex)
            {
                _log.Log(jeex.Message);
                throw;
            }
            catch (SchedulerConfigException scex)
            {
                _log.Log(scex.Message);
                throw;
            }
            catch (SchedulerException sex)
            {
                _log.Log(sex.Message);
                throw;
            }
            catch (ArgumentNullException anex)
            {
                _log.Log(anex.Message);
                throw;
            }
            catch (OperationCanceledException ocex)
            {
                _log.Log(ocex.Message);
                throw;
            }
            catch (IOException ioex)
            {
                _log.Log(ioex.Message);
                throw;
            }
        }


        /// <summary>
        /// This is called every time the Quartz.net scheduler CRON time ticks
        /// </summary>
        public async Task ExecuteAsync(IJobExecutionContext context)
        {
            await Task.Run(async () =>
            {
                if (_schedulingAssistanceService.RequiresNewSchedulerSetup)
                {
                    //signal the waiting scheduler restart code that it can now restart the scheduler
                    _schedulingAssistanceService.RequiresNewSchedulerSetup = false;
                    _log.Log("Job has been asked to stop, to allow job reschedule due to change in config");
                    _schedulingAssistanceService.SchedulerRestartGate.Set();
                }
                else
                {
                    await Task.Delay(1000);
                    _log.Log("Doing the uninterruptible work now");
                }
            });
        }
    }
}

So when the AsyncAutoResetEvent is signalled the waiting code (inside the subscribe code of the Rx file system watcher inside the SomeWindowsService.cs code) will proceed to swap out the Quartz.Net scheduler time.

It can do this safely because we know there is NO work in flight: the job has told the waiting code to proceed, which it only does when there is no work in flight.

This swapping over of the scheduler time to use the newly read App.Config values is also protected by an AsyncLock class (again taken from Stephen Toub):

using System;
using System.Threading;
using System.Threading.Tasks;

namespace SachaBarber.QuartzJobUpdate.Async
{
    /// <summary>
    /// See http://blogs.msdn.com/b/pfxteam/archive/2012/02/12/10266988.aspx
    /// from the fabulous Stephen Toub
    /// </summary>    
    public class AsyncLock
    {
        private readonly AsyncSemaphore m_semaphore;
        private readonly Task<Releaser> m_releaser;

        public AsyncLock()
        {
            m_semaphore = new AsyncSemaphore(1);
            m_releaser = Task.FromResult(new Releaser(this));
        }

        public Task<Releaser> LockAsync()
        {
            var wait = m_semaphore.WaitAsync();
            return wait.IsCompleted ?
                m_releaser :
                wait.ContinueWith((_, state) => new Releaser((AsyncLock)state),
                    this, CancellationToken.None,
                    TaskContinuationOptions.ExecuteSynchronously, TaskScheduler.Default);
        }

        public struct Releaser : IDisposable
        {
            private readonly AsyncLock m_toRelease;

            internal Releaser(AsyncLock toRelease) { m_toRelease = toRelease; }

            public void Dispose()
            {
                if (m_toRelease != null)
                    m_toRelease.m_semaphore.Release();
            }
        }
    }
}

Where this relies on AsyncSemaphore

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace SachaBarber.QuartzJobUpdate.Async
{
    /// <summary>
    /// See http://blogs.msdn.com/b/pfxteam/archive/2012/02/12/10266983.aspx
    /// from the fabulous Stephen Toub
    /// </summary>
    public class AsyncSemaphore
    {
        private static readonly Task s_completed = Task.FromResult(true);
        private readonly Queue<TaskCompletionSource<bool>> _mWaiters = new Queue<TaskCompletionSource<bool>>();
        private int _mCurrentCount;

        public AsyncSemaphore(int initialCount)
        {
            if (initialCount < 0) throw new ArgumentOutOfRangeException("initialCount");
            _mCurrentCount = initialCount;
        }

        public Task WaitAsync()
        {
            lock (_mWaiters)
            {
                if (_mCurrentCount > 0)
                {
                    --_mCurrentCount;
                    return s_completed;
                }
                else
                {
                    var waiter = new TaskCompletionSource<bool>();
                    _mWaiters.Enqueue(waiter);
                    return waiter.Task;
                }
            }
        }


        public void Release()
        {
            TaskCompletionSource<bool> toRelease = null;
            lock (_mWaiters)
            {
                if (_mWaiters.Count > 0)
                    toRelease = _mWaiters.Dequeue();
                else
                    ++_mCurrentCount;
            }
            if (toRelease != null)
                toRelease.SetResult(true);
        }


    }
}

Just for completeness this is how you get an App.Config section to refresh at runtime

 ConfigurationManager.RefreshSection("schedulingConfiguration");

Anyway, this works fine for me. I now have a reactive app that responds to changes in the App.Config without the need to restart, and it does so while allowing in-flight work to be completed.

Hope it helps someone out there

 

Where Is The Code?

The code can be found here: https://github.com/sachabarber/SachaBarber.QuartzJobUpdate

C#

Out With Raven Embedded, In With LiteDB

I recently started working on (and have now finished) a small internal web site using the following things:

  • WebApi2
  • OAuth
  • JWT support
  • OWIN
  • AutoFac
  • Raven DB Embedded
  • Aurelia.io for front end

I have to say it worked out great; it was a pleasure to work on, all the way through.

I quite like Raven embedded for this type of app. It's completely standalone, and does just what I need from it.

So I got to the end of the project, and I was pretty sure I had checked that we had licenses for everything I was using. Turns out we didn't have one for RavenDB.

Mmm. This app was really a tool to help us internally, so we did not want to spend that much on it.

Shame as I like Raven. I started to look around for another tool that could fit the bill.

This was my shopping list

  • Had to be .NET
  • Had to support document storage
  • Had to have LINQ support
  • Had to support the same set of features that I was using in Raven Embedded (essentially CRUD + indexes)
  • Had to be free
  • Had to be embeddable as a single DLL

It did not take me long to stumble upon LiteDB.

This ticked all my boxes and more. I decided to try it out in a little console app, and was extremely happy. I did not do any performance testing, as that is not such a concern for the app that I was building, but from an API point of view it proved to be very easy to replace the Raven Embedded code I had written so far.

I was happy.

Just thought I would show you all a little bit of its usage right here

 

Installation

This is done via NuGet. The package is called “LiteDB”

CRUD

Assuming we have this entity

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace LiteDBDemo
{
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string[] Phones { get; set; }
        public bool IsActive { get; set; }

        public override string ToString()
        {
            return string.Format("Id : {0}, Name : {1}, Phones : {2}, IsActive : {3}",
                Id,
                Name,
                Phones.Aggregate((x, y) => string.Format("{0}{1}", x, y)),
                IsActive);
        }
    }
}

Here is how you might use LiteDB to perform CRUD operations. See how it has the concept of collections; this is kind of like MongoDB, if you have used that.

using LiteDB;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace LiteDBDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            // Open database (or create if not exists)
            using (var db = new LiteDatabase(@"MyData.db"))
            {
                //clean out entire collection, will drop documents, indexes everything
                db.GetCollection<Customer>("customers").Drop();


                Create(db);
                Read(db);
                Update(db);
                Delete(db);
            }
            Console.ReadLine();
        }

        private static void Create(LiteDatabase db)
        {
            Console.WriteLine("\r\nCREATE\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");

            // Create your new customer instance
            var customer = new Customer
            {
                Name = "John Doe",
                Phones = new string[] { "8000-0000", "9000-0000" },
                IsActive = true
            };

            // Insert new customer document (Id will be auto-incremented)
            customers.Insert(customer);
            Console.WriteLine("Inserted customer");
        }

        private static void Read(LiteDatabase db)
        {
            Console.WriteLine("\r\nREAD\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");

            // Index document using a document property
            customers.EnsureIndex(x => x.Name);

            // Use Linq to query documents
            var firstCustomer = customers.Find(x => x.Name.StartsWith("Jo")).FirstOrDefault();
            Console.WriteLine(firstCustomer);

        }

        private static void Update(LiteDatabase db)
        {
            Console.WriteLine("\r\nUPDATE\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");
            // Use Linq to query documents
            var johnDoe = customers.Find(x => x.Name == "John Doe").First();
            Console.WriteLine("Before update");
            Console.WriteLine(johnDoe);

            johnDoe.Name = "John Doe MODIFIED";
            customers.Update(johnDoe);

            var johnDoe2 = customers.Find(x => x.Name == "John Doe MODIFIED").First();
            Console.WriteLine("Read updated");
            customers.Update(johnDoe2);
            Console.WriteLine(johnDoe2);

        }

        private static void Delete(LiteDatabase db)
        {
            Console.WriteLine("\r\nDELETE\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");
            // Use Linq to query documents
            customers.Delete(x => x.Name == "John Doe MODIFIED");
            Console.WriteLine("Deleting Name = 'John Doe MODIFIED'");

            var johnDoe = customers.Find(x => x.Name == "John Doe MODIFIED").FirstOrDefault();
            Console.WriteLine("Looking for Name = 'John Doe MODIFIED'");
            Console.WriteLine(johnDoe == null ? "It's GONE" : johnDoe.ToString());


        }

    }
}

You can learn more about this over at the LiteDB website

http://www.litedb.org/

 

Overall I was very, very happy with LiteDB. I particularly like the fact that it was free, and it did pretty much exactly the same as RavenDB Embedded (sometimes things were even easier to do).

I would use this library again for sure; I found it spot on, to be honest.

Like a nice gin and tonic on a summer's day.

 

C#

Entity Framework 7 in-memory provider test

A while ago I wrote an article, http://www.codeproject.com/Articles/875165/To-Repository-Or-NOT, which talked about how to test repository classes both using Entity Framework and not using Entity Framework.

It has been a while, and I am just about to start a small project for one of the admin staff at work, to aid her in her day-to-day activities.

As always there will be a database involved.

I will likely be using OWIN and/or MVC5, with Aurelia.IO for the client side.

I am not sure about the DB yet, so I decided to try out the in-memory support in the yet-to-be-released Entity Framework 7.

Grabbing The Nuget Package

So let's have a look. The first thing you will need to do is grab the NuGet package, which for me was as easy as using the NuGet package manager window in Visual Studio 2015.


The package name is "EntityFramework.InMemory"; this will bring in the other bits and pieces you need.

NOTE: This is a pre-release NuGet package, so you will need to include prerelease packages.

 

The Model

So now that I have the correct packages in place, it's just a question of crafting some model classes. I am using the following two:

Person

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    public class Person
    {
        public Person()
        {
            Qualifications = new List<Qualification>();
        }

        public int Id { get; set; }

        public string FirstName { get; set; }

        public string LastName { get; set; }

        public ICollection<Qualification> Qualifications { get; set; }

        public override string ToString()
        {
            string qualifications = Qualifications.Any() ?
                    Qualifications.Select(x => x.Description)
                        .Aggregate((x, y) => string.Format("{0} {1}", x, y)) :
                    string.Empty;

            return string.Format("Id : {0}, FirstName : {1}, LastName : {2}, \r\nQualifications : {3}\r\n",
                        Id, FirstName, LastName, qualifications);
        }
    }
}

Qualification

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    public class Qualification
    {
       
        public int Id { get; set; }

        public string Description { get; set; }

        public override string ToString()
        {
            return string.Format("Id : {0}, Description : {1}",
                        Id, Description);
        }

    }
}

 

Custom DbContext

Nothing more to it than that. So now let's look at creating a DbContext which has our stuff in it. For me this again is very simple; I just do this:

using Microsoft.Data.Entity;
using Microsoft.Data.Entity.Infrastructure;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    public class ClassDbContext : DbContext
    {
        public ClassDbContext(DbContextOptions options)
            : base(options)
        {
        }

        public DbSet<Person> Members { get; set; }
        public DbSet<Qualification> Qualifications { get; set; }
    }
}

Writing Some Test Code Using The InMemory Provider

So now that we have all the pieces in place, let's run some code to do a few things:

  1. Seed some data
  2. Obtain a Person
  3. Add a Qualification to the Person obtained

Here is all the code to do this

using Microsoft.Data.Entity;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    class Program
    {
        static void Main(string[] args)
        {
            var optionsBuilder = new DbContextOptionsBuilder<ClassDbContext>();
            optionsBuilder.UseInMemoryDatabase();

            using (var classDbContext = new ClassDbContext(optionsBuilder.Options))
            {
                SeedData(classDbContext);

                var personId1 = GetMember(classDbContext, 1);
                Console.WriteLine("> Adding a qualication\r\n");
                personId1.Qualifications.Add(classDbContext.Qualifications.First());
                classDbContext.SaveChanges();

                personId1 = GetMember(classDbContext, 1);


                Console.ReadLine();

            }
        }

        private static Person GetMember(ClassDbContext classDbContext, int id)
        {
            var person = classDbContext.Members.FirstOrDefault(x => x.Id == id);
            Console.WriteLine(person);
            return person;
        }


        private static void SeedData(ClassDbContext classDbContext)
        {
            classDbContext.Members.Add(new Person()
                {
                    Id = 1,
                    FirstName = "Sacha",
                    LastName = "Barber"
                });
            classDbContext.Members.Add(new Person()
                {
                    Id = 2,
                    FirstName = "Sarah",
                    LastName = "Barber"
                });

            classDbContext.Qualifications.Add(new Qualification()
                {
                    Id = 1,
                    Description = "Bsc Hons : Computer Science"
                });
            classDbContext.Qualifications.Add(new Qualification()
                {
                    Id = 2,
                    Description = "Msc : Computer Science"
                });
            classDbContext.Qualifications.Add(new Qualification()
                {
                    Id = 3,
                    Description = "Bsc Hons : Naturapathic medicine"
                });

            classDbContext.SaveChanges();
        }
    }
}

And these are the results: the Person is printed to the console before and after the change, the second time showing the newly added qualification.


Closing Note

I'm quite happy with how easy this was, and I think I would definitely try this out for real.
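To give a flavour of where I would take this next, here is a rough sketch of what a unit test against the in-memory provider might look like. I am assuming xUnit here and reusing the ClassDbContext/Person classes from above; treat it as a sketch rather than production code.

using System.Linq;
using Microsoft.Data.Entity;
using Xunit;

namespace EF7_InMemoryProviderTest.Tests
{
    public class ClassDbContextTests
    {
        [Fact]
        public void Can_add_and_read_back_a_person()
        {
            //use the in-memory provider so the test never touches a real database
            var optionsBuilder = new DbContextOptionsBuilder<ClassDbContext>();
            optionsBuilder.UseInMemoryDatabase();

            using (var classDbContext = new ClassDbContext(optionsBuilder.Options))
            {
                classDbContext.Members.Add(new Person
                {
                    Id = 42,
                    FirstName = "Test",
                    LastName = "Person"
                });
                classDbContext.SaveChanges();

                var person = classDbContext.Members.Single(x => x.Id == 42);
                Assert.Equal("Test", person.FirstName);
            }
        }
    }
}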

If you want to play along, I have a demo project (for Visual Studio 2015) here:

https://github.com/sachabarber/EF7_InMemoryTest

C#, Scala

.NET -> Scala Interop Using RabbitMQ

I have taken on a new job and we are using a lot of .NET, but we are also using lots of Scala. My work colleague and I were talking about how to get different Actor frameworks (well Akka in Java and Akka.NET in .NET) to talk to each other. This does not seem possible right now, as the exact wire protocol would need to be decided and implemented.

So we thought, OK, how about each side sticks to its actor framework of choice, and we send messages from .NET to Scala via some sort of messaging solution, with the messages serialized as JSON.

So I decided to take a stab at that. I have used a few messaging solutions in my time, and I decided to go for RabbitMQ as that is something I know and like (still need to learn Kafka; one day, Kafka, one day).
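To give a flavour of the .NET side, here is roughly what publishing a JSON message could look like. This is a minimal sketch using the classic RabbitMQ.Client API and Json.NET; the "orders" queue name and the OrderPlaced message type are made up for illustration, and the Scala side would simply consume from the same queue and deserialize the JSON.

using System.Text;
using Newtonsoft.Json;
using RabbitMQ.Client;

public class OrderPlaced
{
    public int OrderId { get; set; }
    public string Item { get; set; }
}

public static class Publisher
{
    public static void PublishOrder()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            //declare the queue the Scala consumer will read from
            channel.QueueDeclare(queue: "orders", durable: true, exclusive: false,
                                 autoDelete: false, arguments: null);

            //messages go over the wire as plain UTF8 JSON
            var message = new OrderPlaced { OrderId = 1, Item = "apple" };
            var body = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(message));

            channel.BasicPublish(exchange: "", routingKey: "orders",
                                 basicProperties: null, body: body);
        }
    }
}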

If this sounds like it could be of interest to you, I wrote this up here:

http://www.codeproject.com/Articles/1037532/NET-Scala-Interop-Using-RabbitMQ

 

C#

App.Config Transforms Outside Of Web Project

This is a weird post in some ways, as it is new for me but certainly VERY old for others. I imagine web developers have known for years how to use the Web.Config XSLT transforms MSBUILD task. If you have not heard of this, quite simply it allows you to have a single Web.Config file and a number of other config files where ONLY the transformations are declared, such that when the XSLT transforms MSBUILD task runs, it takes the source Web.config file along with a transformation .config file and produces a new .config file which you would use as part of your deployment process.

 

I have myself known about this for years too; I have even known about the Microsoft MSBUILD team's Slow Cheetah project, which allows you to use this same technique outside of web projects. Thing is, what I have always done is have a bunch of .config files (so one XXX.LIVE.config, one XXXX.QA.config) that I would rename and deploy with some clever scripts.

 

I recently had to do a bit of work on a project that made use of the Web.Config XSLT transforms MSBUILD task, and I could clearly see in the MSBUILD file that this was just an ordinary MSBUILD task. So I thought this must be easy enough to use standalone. Turns out it is; you DO NOT really need to use Slow Cheetah at all. You just need to know where the TransformXml MSBUILD task lives and how to use it.

 

The rest of this post will talk you through how to do this.

 

Suppose you have this App.Config you wish to transform

 

We will concentrate on just a few areas here; those areas are the ones that are going to change between environments:

 

<?xml version="1.0" encoding="utf-8"?>
<configuration>
 
  <configSections>
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
    <section name="shareSettings" type="SimpleConfig.Section, SimpleConfig" />
  </configSections>
 
  <shareSettings
      productName="Shipping"
      ftpPath="D:\ShippingRoutes">
  </shareSettings>
 
  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" 
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <extensions>
      <add assembly="Gelf4NLog.Target"/>
    </extensions>
    <targets async="true">
      <target name="graylog"
          xsi:type="graylog"
          hostip="dev-logging"
          hostport="12200"
          Facility="CoolService">
        <parameter name="exception" layout="${exception:format=tostring}" optional="true" />
        <parameter name="processname" layout="${processname}" />
        <parameter name="logger" layout="${logger}" />
        <parameter name="treadid" layout="${threadid}" />
      </target>
      <target name="file" xsi:type="File"
              layout="${longdate} | ${level} | ${message}${onexception:${newline}EXCEPTION\:${exception:format=tostring,StackTrace}}"
              fileName="c:/temp/CoolService-${shortdate}.log" />
    </targets>
    <rules>
      <logger name="NHibernate.*" minlevel="Off" writeTo="graylog" final="true" />
      <logger name="NHibernate.*" minlevel="Error" writeTo="file" final="true" />
      <logger name="*" minlevel="Off" writeTo="graylog" />
      <logger name="*" minlevel="trace" writeTo="file" />
    </rules>
  </nlog>
 
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" />
  </startup>
 
  <system.serviceModel>
    <diagnostics performanceCounters="All" />
 
    <bindings>
      <netTcpBinding>
        <binding name="tcpBinding" 
		maxReceivedMessageSize="2147483647" 
		closeTimeout="00:59:00" 
		openTimeout="00:59:00" 
		receiveTimeout="00:59:00" 
		sendTimeout="00:59:00">
          <security mode="None" />
          <readerQuotas maxStringContentLength="8192" 
			maxArrayLength="20971520" />
        </binding>
      </netTcpBinding>
    </bindings>
 
 
    <client>
      <!-- CoolService -->
      <endpoint name="coolServiceEndpoint" 
	        address="net.tcp://localhost:63006/CoolService" 
		binding="netTcpBinding"
                bindingConfiguration="tcpBinding" 
                contract="Services.ICoolService" />
    </client>
  </system.serviceModel>
 
  <system.diagnostics>
    <sources>
      <source 	name="System.ServiceModel" 
		switchValue="All" 
		propagateActivity="true">
        <listeners>
          <add name="traceListener" 
		type="System.Diagnostics.XmlWriterTraceListener" 
		initializeData="c:\temp\CoolService.svclog"/>
        </listeners>
      </source>
    </sources>
  </system.diagnostics>
 
 
</configuration>

 

  • Custom config section (NOTE I am using SimpleConfig to do that, which is awesome)
  • NLog logging settings
  • WCF client section
  • Diagnostics WCF section

 

So Now Show Me The Transformations

Now this post will not (and is not meant to) teach you all about the Web.Config XSLT transforms MSBUILD task, but rather show you an example. So on with the example: suppose we want to create a LIVE config file where we change the following:

 

  • Custom config section (NOTE I am using SimpleConfig to do that, which is awesome) (CHANGE ATTRIBUTES)
  • NLog logging settings (CHANGE Logger/Target)
  • WCF client section (CHANGE ADDRESS)
  • Diagnostics WCF section (REMOVE IT)

 

Here is how we could do that (say it's called "CoolService.LIVE.config"):

 

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <shareSettings xdt:Transform="SetAttributes" 
		xdt:Locator="Match(productName)"  
		productName="Shipping"
      		ftpPath="\\shipping\ShippingRoutes" />
                 
  <nlog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 		
        xmlns="http://www.nlog-project.org/schemas/NLog.xsd">
      <targets>
      	<target xdt:Transform="SetAttributes" 
		xdt:Locator="Match(name)" 
		name="graylog" 
		hostip="app-logging" />
                                               
      	<target xdt:Transform="SetAttributes" 
		xdt:Locator="Match(name)" 
		name="file" 
		fileName="D:/logs/CoolService-${shortdate}.log" />
     </targets>
     <rules>
     	<logger xdt:Transform="SetAttributes" 
		xdt:Locator="Match(writeTo)" 
		minlevel="trace" 
		writeTo="graylog"/>
     </rules>
  </nlog>
 
  <system.serviceModel>
    <client>
      <endpoint xdt:Transform="SetAttributes" 
		xdt:Locator="Match(name)"
		name="coolServiceEndpoint" 		
	        address="net.tcp://appCoolService:63006/CoolService"  />
    </client>
  </system.serviceModel>
 
  <system.diagnostics xdt:Transform="Remove" />

</configuration>

 

 

So How Do We Apply These Transforms

To actually apply these transforms, we can easily craft a simple MSBUILD project file, such as the following (say it's called "Transforms.proj"):

 

<Project ToolsVersion="4.0" 
	DefaultTargets="Release" 
	xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <UsingTask 
	TaskName="TransformXml" 
	AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v12.0\Web\Microsoft.Web.Publishing.Tasks.dll"/>
 
    <ItemGroup>
        <Config Include="LIVE"><Environment>LIVE</Environment></Config>
        <Config Include="QA"><Environment>QA</Environment></Config>       
    </ItemGroup>
 
    <Target Name="Release">
        <MakeDir Directories="CoolService\Configuration\%(Config.Environment)"/>
 
        <TransformXml Source="App.config"
                     Transform="CoolService.%(Config.Identity).config"
                     Destination="CoolService\Configuration\%(Config.Environment)\CoolService.exe.config"/>
    </Target>
</Project>

 

Here $(MSBuildExtensionsPath) will likely be something like "C:\Program Files (x86)\MSBuild\". So once we have an MSBUILD file like this in place, it is just a simple matter of running MSBUILD, something like:

 

MSBUILD Transforms.proj

 

Which will result in the following being produced:

 

<?xml version="1.0" encoding="utf-8"?>
<configuration>
 
  <configSections>
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
    <section name="shareSettings" type="SimpleConfig.Section, SimpleConfig" />
  </configSections>
 
  <shareSettings
      productName="Shipping"
      ftpPath="\\shipping\ShippingRoutes">
  </shareSettings>
 
  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" 
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <extensions>
      <add assembly="Gelf4NLog.Target"/>
    </extensions>
    <targets async="true">
      <target name="graylog"
          xsi:type="graylog"
          hostip="app-logging"
          hostport="12200"
          Facility="CoolService">
        <parameter name="exception" layout="${exception:format=tostring}" optional="true" />
        <parameter name="processname" layout="${processname}" />
        <parameter name="logger" layout="${logger}" />
        <parameter name="treadid" layout="${threadid}" />
      </target>
      <target name="file" xsi:type="File"
              layout="${longdate} | ${level} | ${message}${onexception:${newline}EXCEPTION\:${exception:format=tostring,StackTrace}}"
              fileName="D:/logs/CoolService-${shortdate}.log" />
    </targets>
    <rules>
      <logger name="NHibernate.*" minlevel="trace" writeTo="graylog" final="true" />
      <logger name="NHibernate.*" minlevel="Error" writeTo="file" final="true" />
      <logger name="*" minlevel="trace" writeTo="graylog" />
      <logger name="*" minlevel="trace" writeTo="file" />
    </rules>
  </nlog>
 
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" />
  </startup>
 
  <system.serviceModel>
    <diagnostics performanceCounters="All" />
 
    <bindings>
      <netTcpBinding>
        <binding name="tcpBinding" 
		maxReceivedMessageSize="2147483647" 
		closeTimeout="00:59:00" 
		openTimeout="00:59:00" 
		receiveTimeout="00:59:00" 
		sendTimeout="00:59:00">
          <security mode="None" />
          <readerQuotas maxStringContentLength="8192" 
			maxArrayLength="20971520" />
        </binding>
      </netTcpBinding>
    </bindings>
 
 
    <client>
      <!-- CoolService -->
      <endpoint name="coolServiceEndpoint" 
	        address="net.tcp://appCoolService:63006/CoolService" 
		binding="netTcpBinding"
                bindingConfiguration="tcpBinding" 
                contract="Services.ICoolService" />
    </client>
  </system.serviceModel>
 
 
</configuration>
C#, Distributed Systems

A Look At Akka .NET

A while back I wrote an Actor model for NetMQ (the .NET port of ZeroMQ), which is now part of the live codebase. I was happy with this.

 

I do like the idea of Actor Models, where you spin up and talk to an actor, rather than worrying about locks/semaphores etc.

 

It just gels with me rather well. To this end I have been experimenting with Akka.NET, which is a pretty complete port of the original Akka. It is a lot of fun, and a really nice way to write distributed multithreaded code if you ask me.
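Just to give a sense of the programming model, here is a tiny sketch of an Akka.NET actor. The names are made up and this is nothing more than a hello-world; the full article linked below goes into proper detail.

using System;
using Akka.Actor;

//a plain message class
public class Greet
{
    public Greet(string who) { Who = who; }
    public string Who { get; private set; }
}

//an actor that reacts to Greet messages, no locks or semaphores in sight
public class GreetingActor : ReceiveActor
{
    public GreetingActor()
    {
        Receive<Greet>(greet => Console.WriteLine("Hello {0}", greet.Who));
    }
}

public static class AkkaQuickStart
{
    public static void Run()
    {
        //one ActorSystem per application, actors are created from it
        var system = ActorSystem.Create("MySystem");
        var greeter = system.ActorOf<GreetingActor>("greeter");

        //fire-and-forget message send
        greeter.Tell(new Greet("World"));
    }
}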

 

To this end I have written a small introductory article on Akka.NET. If you like the sound of this, you can read the full article over at CodeProject:

 

http://www.codeproject.com/Articles/1007161/A-Look-saAt-Akka-NET

 

Enjoy