
Update scheduled Quartz.Net job by monitoring App.Config

 

Introduction

So I was back in .NET land the other day at work, where I had to schedule some code to run periodically.

The business also needed this schedule to be adjustable, so that it could be ramped up when things were busier and wound down when they were not.

The schedule would be adjusted via a setting in the App.Config, which is monitored for changes. When a change is detected, the new schedule value from the App.Config is used to run the job. Ideally the app should not have to go down to accommodate this change of job schedule time.

There are some good job/scheduling libraries out there, but for this I just wanted something lightweight, so I went with Quartz.net.

It's easy to set up and use, has a fairly nice API, and supports IoC and CRON schedules. In short, it fits the bill.

In a nutshell, this post will simply talk about how you can adjust the schedule of an already scheduled job. There are also some special caveats that I personally had to deal with in my requirements, which may or may not be an issue for you.

 

Some Weird Issues That I Needed To Cope With

So let me just talk about some of the issues that I had to deal with.

The guts of the job code that I run on my schedule actually writes to Azure Blob Storage and then to Azure SQL DW tables. As such it performs several writes to several components, one after another.

So the current job run MUST be allowed to complete in FULL (or fail via exception handling, that's OK too). It would not be acceptable to just stop the Quartz job while there is work in flight.

I guess some folk may be thinking of some sort of transaction here, that must either commit or roll back. Unfortunately that doesn't work with Azure Blob Storage uploads.

So I had to think of another plan.

So here is what I came up with: I would use a threading primitive, namely an AutoResetEvent, to control when the Quartz.net job could be changed to use a new schedule.

If a change in the App.Config was seen, then we know that we "should" be using a new schedule time; however the scheduled job MAY have work in flight. So we need to wait for that work to complete (or fail) before we can think about swapping the Quartz.net scheduler time.

So that is what I went for. There are a few other things to be aware of, such as needing threading primitives that work with async/await code. Luckily Stephen Toub from the TPL team has done that for us: AsyncAutoResetEvent.

There is also the well-known fact that the FileSystemWatcher class fires events twice: http://lmgtfy.com/?q=filesystemwatcher+firing+twice

So as we go through the code you will see how I dealt with those.

The Code

Ok, so now that we have talked about the problem, let's go through the code.

There are several NuGet packages I am using to make my life easier.

So let's start with the entry point, which for me is the simple Program class shown below:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Security.Principal;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Autofac;
using SachaBarber.QuartzJobUpdate.Services;
using Topshelf;


namespace SachaBarber.QuartzJobUpdate
{
    static class Program
    {
        private static ILogger _log = null;


        [STAThread]
        public static void Main()
        {
            try
            {
                var container = ContainerOperations.Container;
                _log = container.Resolve<ILogger>();
                _log.Log("Starting");

                AppDomain.CurrentDomain.UnhandledException += AppDomainUnhandledException;
                TaskScheduler.UnobservedTaskException += TaskSchedulerUnobservedTaskException;
                Thread.CurrentPrincipal = new WindowsPrincipal(WindowsIdentity.GetCurrent());
                
                HostFactory.Run(c =>                                 
                {
                    c.Service<SomeWindowsService>(s =>                        
                    {
                        s.ConstructUsing(() => container.Resolve<SomeWindowsService>());
                        s.WhenStarted(tc => tc.Start());             
                        s.WhenStopped(tc => tc.Stop());               
                    });
                    c.RunAsLocalSystem();                            

                    c.SetDescription("Uploads Calc Payouts/Summary data into Azure blob storage for RiskStore DW ingestion");       
                    c.SetDisplayName("SachaBarber.QuartzJobUpdate");                       
                    c.SetServiceName("SachaBarber.QuartzJobUpdate");                      
                });
            }
            catch (Exception ex)
            {
                _log.Log(ex.Message);
            }
            finally
            {
                _log.Log("Closing");
            }
        }
      

        private static void AppDomainUnhandledException(object sender, UnhandledExceptionEventArgs e)
        {
            ProcessUnhandledException((Exception)e.ExceptionObject);
        }

        private static void TaskSchedulerUnobservedTaskException(object sender, UnobservedTaskExceptionEventArgs e)
        {
            ProcessUnhandledException(e.Exception);
            e.SetObserved();
        }

        private static void ProcessUnhandledException(Exception ex)
        {
            if (ex is TargetInvocationException)
            {
                ProcessUnhandledException(ex.InnerException);
                return;
            }
            _log.Log("Error");
        }
    }
}

All this does is host the actual Windows service class for me using Topshelf, where the actual service class looks like this:

using System;
using System.Configuration;
using System.IO;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using System.Xml.Linq;
using Autofac;
using SachaBarber.QuartzJobUpdate.Async;
using SachaBarber.QuartzJobUpdate.Configuration;
using SachaBarber.QuartzJobUpdate.Jobs;
using SachaBarber.QuartzJobUpdate.Services;
//logging
using Quartz;

namespace SachaBarber.QuartzJobUpdate
{
    public class SomeWindowsService
    {
        private readonly ILogger _log;
        private readonly ISchedulingAssistanceService _schedulingAssistanceService;
        private readonly IRxSchedulerService _rxSchedulerService;
        private readonly IObservableFileSystemWatcher _observableFileSystemWatcher;
        private IScheduler _quartzScheduler;
        private readonly AsyncLock _lock = new AsyncLock();
        private readonly SerialDisposable _configWatcherDisposable = new SerialDisposable();
        private static readonly JobKey _someScheduledJobKey = new JobKey("SomeScheduledJobKey");
        private static readonly TriggerKey _someScheduledJobTriggerKey = new TriggerKey("SomeScheduledJobTriggerKey");

        public SomeWindowsService(
            ILogger log,
            ISchedulingAssistanceService schedulingAssistanceService, 
            IRxSchedulerService rxSchedulerService,
            IObservableFileSystemWatcher observableFileSystemWatcher)
        {
            _log = log;
            _schedulingAssistanceService = schedulingAssistanceService;
            _rxSchedulerService = rxSchedulerService;
            _observableFileSystemWatcher = observableFileSystemWatcher;
        }

        public void Start()
        {
            try
            {
                var ass = typeof (SomeWindowsService).Assembly;
                var configFile = $"{ass.Location}.config"; 
                CreateConfigWatcher(new FileInfo(configFile));


                _log.Log("Starting SomeWindowsService");

                _quartzScheduler = ContainerOperations.Container.Resolve<IScheduler>();
                _quartzScheduler.JobFactory = new AutofacJobFactory(ContainerOperations.Container);
                _quartzScheduler.Start();

                //create the Job
                CreateScheduledJob();
            }
            catch (JobExecutionException jeex)
            {
                _log.Log(jeex.Message);
            }
            catch (SchedulerConfigException scex)
            {
                _log.Log(scex.Message);
            }
            catch (SchedulerException sex)
            {
                _log.Log(sex.Message);
            }

        }

        public void Stop()
        {
            _log.Log("Stopping SomeWindowsService");
            _quartzScheduler?.Shutdown();
            _configWatcherDisposable.Dispose();
            _observableFileSystemWatcher.Dispose();
        }


        private void CreateConfigWatcher(FileInfo configFileInfo)
        {
            FileSystemWatcher watcher = new FileSystemWatcher();
            watcher.Path = configFileInfo.DirectoryName;
            watcher.NotifyFilter = 
                NotifyFilters.LastAccess | 
                NotifyFilters.LastWrite | 
                NotifyFilters.FileName | 
                NotifyFilters.DirectoryName;
            watcher.Filter = configFileInfo.Name;
            _observableFileSystemWatcher.SetFile(watcher);
            //FSW is notorious for firing twice see here : 
            //http://stackoverflow.com/questions/1764809/filesystemwatcher-changed-event-is-raised-twice
            //so lets use Rx to Throttle it a bit
            _configWatcherDisposable.Disposable = _observableFileSystemWatcher.Changed.SubscribeOn(
                _rxSchedulerService.TaskPool).Throttle(TimeSpan.FromMilliseconds(500)).Subscribe(
                    async x =>
                    {
                        //at this point the config has changed, start a critical section
                        using (var releaser = await _lock.LockAsync())
                        {
                            //tell current scheduled job that we need to read new config, and wait for it
                            //to signal us that we may continue
                            _log.Log($"Config file {configFileInfo.Name} has changed, attempting to read new config data");
                            _schedulingAssistanceService.RequiresNewSchedulerSetup = true;
                            await _schedulingAssistanceService.SchedulerRestartGate.WaitAsync();
                            //recreate the AzureBlobConfiguration, and recreate the scheduler using new settings
                            ConfigurationManager.RefreshSection("schedulingConfiguration");
                            var newSchedulingConfiguration = SimpleConfig.Configuration.Load<SchedulingConfiguration>();
                            _log.Log($"SchedulingConfiguration section is now : {newSchedulingConfiguration}");
                            ContainerOperations.ReInitialiseSchedulingConfiguration(newSchedulingConfiguration);
                            ReScheduleJob();
                        }
                    },
                    ex =>
                    {
                        _log.Log($"Error encountered attempting to read new config data from config file {configFileInfo.Name}");
                    });
        }

        private void CreateScheduledJob(IJobDetail existingJobDetail = null)
        {
            var azureBlobConfiguration = ContainerOperations.Container.Resolve<SchedulingConfiguration>();
            IJobDetail job = JobBuilder.Create<SomeQuartzJob>()
                    .WithIdentity(_someScheduledJobKey)
                    .Build();

            ITrigger trigger = TriggerBuilder.Create()
                .WithIdentity(_someScheduledJobTriggerKey)
                .WithSimpleSchedule(x => x
                    .RepeatForever()
                    //NOTE: the config value is named in minutes but is passed to
                    //WithIntervalInSeconds/AddSeconds below; use WithIntervalInMinutes
                    //if you want true minute-based intervals
                    .WithIntervalInSeconds(azureBlobConfiguration.ScheduleTimerInMins)
                )
                .StartAt(DateTimeOffset.Now.AddSeconds(azureBlobConfiguration.ScheduleTimerInMins))
                .Build();

            _quartzScheduler.ScheduleJob(job, trigger);
        }

        private void ReScheduleJob()
        {
            if (_quartzScheduler != null)
            {
                _quartzScheduler.DeleteJob(_someScheduledJobKey);
                CreateScheduledJob();
            }
        }
    }


}

There is a fair bit going on here, so let's list some of the work this code does:

  • It creates the initial Quartz.Net job and schedules it using the values from a custom config section, which are read into an object
  • It watches the config file for changes (we will go through that in a moment) and will wait on the AsyncAutoResetEvent to be signalled, at which point it will recreate the Quartz.net job

So let's have a look at some of the small helper parts.

This is a simple Rx based file system watcher. The reason Rx is good here is that you can Throttle the events (see this post: FileSystemWatcher raises 2 events).

using System;
using System.IO;
using System.Reactive.Linq;

namespace SachaBarber.QuartzJobUpdate.Services
{
    public class ObservableFileSystemWatcher : IObservableFileSystemWatcher
    {
        private FileSystemWatcher _watcher;

        public void SetFile(FileSystemWatcher watcher)
        {
            _watcher = watcher;

            Changed = Observable
                .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>
                (h => _watcher.Changed += h, h => _watcher.Changed -= h)
                .Select(x => x.EventArgs);

            Renamed = Observable
                .FromEventPattern<RenamedEventHandler, RenamedEventArgs>
                (h => _watcher.Renamed += h, h => _watcher.Renamed -= h)
                .Select(x => x.EventArgs);

            Deleted = Observable
                .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>
                (h => _watcher.Deleted += h, h => _watcher.Deleted -= h)
                .Select(x => x.EventArgs);

            Errors = Observable
                .FromEventPattern<ErrorEventHandler, ErrorEventArgs>
                (h => _watcher.Error += h, h => _watcher.Error -= h)
                .Select(x => x.EventArgs);

            Created = Observable
                .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>
                (h => _watcher.Created += h, h => _watcher.Created -= h)
                .Select(x => x.EventArgs);

            All = Changed.Merge(Renamed).Merge(Deleted).Merge(Created);
            _watcher.EnableRaisingEvents = true;
        }

        public void Dispose()
        {
            _watcher.EnableRaisingEvents = false;
            _watcher.Dispose();
        }

        public IObservable<FileSystemEventArgs> Changed { get; private set; }
        public IObservable<RenamedEventArgs> Renamed { get; private set; }
        public IObservable<FileSystemEventArgs> Deleted { get; private set; }
        public IObservable<ErrorEventArgs> Errors { get; private set; }
        public IObservable<FileSystemEventArgs> Created { get; private set; }
        public IObservable<FileSystemEventArgs> All { get; private set; }
    }
}

And this is a small utility class that will contain the results of the custom config section, which may be read using SimpleConfig:

namespace SachaBarber.QuartzJobUpdate.Configuration
{
    public class SchedulingConfiguration
    {
        public int ScheduleTimerInMins { get; set; }

        public override string ToString()
        {
            return $"ScheduleTimerInMins: {ScheduleTimerInMins}";
        }
    }
}

Which you read from the App.Config like this

 var newSchedulingConfiguration = SimpleConfig.Configuration.Load<SchedulingConfiguration>();
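For completeness, the matching App.Config section might look something like the sketch below. This is an assumption on my part: the `scheduleTimerInMins` attribute name and the SimpleConfig section-handler type are illustrative only, so verify them against the SimpleConfig package documentation.

```xml
<configuration>
  <configSections>
    <!-- handler type is assumed here; check the SimpleConfig docs for the exact name -->
    <section name="schedulingConfiguration" type="SimpleConfig.Section, SimpleConfig" />
  </configSections>

  <!-- the attribute maps onto SchedulingConfiguration.ScheduleTimerInMins -->
  <schedulingConfiguration scheduleTimerInMins="5" />
</configuration>
```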

And this is the async/await compatible AutoResetEvent that I took from Stephen Toub's blog:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace SachaBarber.QuartzJobUpdate.Async
{
    public class AsyncAutoResetEvent
    {
        private static readonly Task Completed = Task.FromResult(true);
        private readonly Queue<TaskCompletionSource<bool>> _waits = new Queue<TaskCompletionSource<bool>>();
        private bool _signaled;

        public Task WaitAsync()
        {
            lock (_waits)
            {
                if (_signaled)
                {
                    _signaled = false;
                    return Completed;
                }
                else
                {
                    var tcs = new TaskCompletionSource<bool>();
                    _waits.Enqueue(tcs);
                    return tcs.Task;
                }
            }
        }

        public void Set()
        {
            TaskCompletionSource<bool> toRelease = null;
            lock (_waits)
            {
                if (_waits.Count > 0)
                    toRelease = _waits.Dequeue();
                else if (!_signaled)
                    _signaled = true;
            }
            toRelease?.SetResult(true);
        }
    }
}

So the last part of the puzzle is: how does the AsyncAutoResetEvent get signalled?

Well, as we said above, we need to wait for any in-flight work to complete first. The way I tackled that was that within the job code that runs on every Quartz.Net scheduler tick, we simply check whether we have been asked to swap out the current schedule time. If so, we signal the waiting code via the (shared) AsyncAutoResetEvent; otherwise we just carry on and do the regular job work.

The way that we get the AsyncAutoResetEvent that is used by the waiting code and also the job code (to signal it) is via a singleton registration in an IoC container. I am using Autofac, which I set up like this, but you could have your own singleton, or an IoC container of your choice.

The trick is to make sure that both classes that need to access the AsyncAutoResetEvent use a single instance.

using System;
using System.Reflection;
using Autofac;
using SachaBarber.QuartzJobUpdate.Configuration;
using SachaBarber.QuartzJobUpdate.Services;
using Quartz;
using Quartz.Impl;

namespace SachaBarber.QuartzJobUpdate
{
    public class ContainerOperations
    {
        private static Lazy<IContainer> _containerSingleton = 
            new Lazy<IContainer>(CreateContainer);

        public static IContainer Container => _containerSingleton.Value;

        public static void ReInitialiseSchedulingConfiguration(
            SchedulingConfiguration newSchedulingConfiguration)
        {
            var currentSchedulingConfiguration = 
                Container.Resolve<SchedulingConfiguration>();
            currentSchedulingConfiguration.ScheduleTimerInMins = 
                newSchedulingConfiguration.ScheduleTimerInMins;
        }
        

        private static IContainer CreateContainer()
        {
            var builder = new ContainerBuilder();
            builder.RegisterType<ObservableFileSystemWatcher>()
                .As<IObservableFileSystemWatcher>().ExternallyOwned();
            builder.RegisterType<RxSchedulerService>()
                .As<IRxSchedulerService>().ExternallyOwned();
            builder.RegisterType<Logger>().As<ILogger>().ExternallyOwned();
            builder.RegisterType<SomeWindowsService>();
            builder.RegisterInstance(new SchedulingAssistanceService())
                .As<ISchedulingAssistanceService>();
            builder.RegisterInstance(
                SimpleConfig.Configuration.Load<SchedulingConfiguration>());

            // Quartz/jobs
            builder.Register(c => new StdSchedulerFactory().GetScheduler())
                .As<Quartz.IScheduler>();
            builder.RegisterAssemblyTypes(Assembly.GetExecutingAssembly())
                .Where(x => typeof(IJob).IsAssignableFrom(x));
            return builder.Build();
        }

        
    }
}

Where the shared instance in my case is this class

using SachaBarber.QuartzJobUpdate.Async;

namespace SachaBarber.QuartzJobUpdate.Services
{
    public class SchedulingAssistanceService : ISchedulingAssistanceService
    {
        public SchedulingAssistanceService()
        {
            SchedulerRestartGate = new AsyncAutoResetEvent();
            RequiresNewSchedulerSetup = false;
        }    

        public AsyncAutoResetEvent SchedulerRestartGate { get; }
        public bool RequiresNewSchedulerSetup { get; set; }
    }
}

Here is the actual job code that will check to see if a change in the App.Config has been detected, which would require it to signal the waiting code that it may continue:

using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Quartz;

namespace SachaBarber.QuartzJobUpdate.Services
{
    public class SomeQuartzJob : IJob
    {
        private readonly ILogger _log;
        private readonly ISchedulingAssistanceService _schedulingAssistanceService;

        public SomeQuartzJob(
            ILogger log, 
            ISchedulingAssistanceService schedulingAssistanceService)
        {
            _log = log;
            _schedulingAssistanceService = schedulingAssistanceService;
        }


        public void Execute(IJobExecutionContext context)
        {
            try
            {
                ExecuteAsync(context).GetAwaiter().GetResult();
            }
            catch (JobExecutionException jeex)
            {
                _log.Log(jeex.Message);
                throw;
            }
            catch (SchedulerConfigException scex)
            {
                _log.Log(scex.Message);
                throw;
            }
            catch (SchedulerException sex)
            {
                _log.Log(sex.Message);
                throw;
            }
            catch (ArgumentNullException anex)
            {
                _log.Log(anex.Message);
                throw;
            }
            catch (OperationCanceledException ocex)
            {
                _log.Log(ocex.Message);
                throw;
            }
            catch (IOException ioex)
            {
                _log.Log(ioex.Message);
                throw;
            }
        }


        /// <summary>
        /// This is called every time the Quartz.net scheduler CRON time ticks
        /// </summary>
        public async Task ExecuteAsync(IJobExecutionContext context)
        {
            await Task.Run(async () =>
            {
                if (_schedulingAssistanceService.RequiresNewSchedulerSetup)
                {
                    //signal the waiting scheduler restart code that it can now restart the scheduler
                    _schedulingAssistanceService.RequiresNewSchedulerSetup = false;
                    _log.Log("Job has been asked to stop, to allow job reschedule due to change in config");
                    _schedulingAssistanceService.SchedulerRestartGate.Set();
                }
                else
                {
                    await Task.Delay(1000);
                    _log.Log("Doing the uninterruptible work now");
                }
            });
        }
    }
}

So when the AsyncAutoResetEvent is signalled the waiting code (inside the subscribe code of the Rx file system watcher inside the SomeWindowsService.cs code) will proceed to swap out the Quartz.Net scheduler time.

It can do this safely as we know there is NO work in flight: the job has told the waiting code to proceed, which it only does when no work is in flight.

This swapping over of the scheduler time to use the newly read App.Config values is also protected by an AsyncLock class (again taken from Stephen Toub):

using System;
using System.Threading;
using System.Threading.Tasks;

namespace SachaBarber.QuartzJobUpdate.Async
{
    /// <summary>
    /// See http://blogs.msdn.com/b/pfxteam/archive/2012/02/12/10266988.aspx
    /// from the fabulous Stephen Toub
    /// </summary>    
    public class AsyncLock
    {
        private readonly AsyncSemaphore m_semaphore;
        private readonly Task<Releaser> m_releaser;

        public AsyncLock()
        {
            m_semaphore = new AsyncSemaphore(1);
            m_releaser = Task.FromResult(new Releaser(this));
        }

        public Task<Releaser> LockAsync()
        {
            var wait = m_semaphore.WaitAsync();
            return wait.IsCompleted ?
                m_releaser :
                wait.ContinueWith((_, state) => new Releaser((AsyncLock)state),
                    this, CancellationToken.None,
                    TaskContinuationOptions.ExecuteSynchronously, TaskScheduler.Default);
        }

        public struct Releaser : IDisposable
        {
            private readonly AsyncLock m_toRelease;

            internal Releaser(AsyncLock toRelease) { m_toRelease = toRelease; }

            public void Dispose()
            {
                if (m_toRelease != null)
                    m_toRelease.m_semaphore.Release();
            }
        }
    }
}

Where this relies on AsyncSemaphore

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace SachaBarber.QuartzJobUpdate.Async
{
    /// <summary>
    /// See http://blogs.msdn.com/b/pfxteam/archive/2012/02/12/10266983.aspx
    /// from the fabulous Stephen Toub
    /// </summary>
    public class AsyncSemaphore
    {
        private static readonly Task s_completed = Task.FromResult(true);
        private readonly Queue<TaskCompletionSource<bool>> _mWaiters = new Queue<TaskCompletionSource<bool>>();
        private int _mCurrentCount;

        public AsyncSemaphore(int initialCount)
        {
            if (initialCount < 0) throw new ArgumentOutOfRangeException("initialCount");
            _mCurrentCount = initialCount;
        }

        public Task WaitAsync()
        {
            lock (_mWaiters)
            {
                if (_mCurrentCount > 0)
                {
                    --_mCurrentCount;
                    return s_completed;
                }
                else
                {
                    var waiter = new TaskCompletionSource<bool>();
                    _mWaiters.Enqueue(waiter);
                    return waiter.Task;
                }
            }
        }


        public void Release()
        {
            TaskCompletionSource<bool> toRelease = null;
            lock (_mWaiters)
            {
                if (_mWaiters.Count > 0)
                    toRelease = _mWaiters.Dequeue();
                else
                    ++_mCurrentCount;
            }
            if (toRelease != null)
                toRelease.SetResult(true);
        }


    }
}

Just for completeness, this is how you get an App.Config section to refresh at runtime:

 ConfigurationManager.RefreshSection("schedulingConfiguration");

Anyway, this works fine for me. I now have a reactive app that responds to changes in the App.Config without needing a restart, and it does so while allowing in-flight work to complete.

Hope it helps someone out there

 

Where Is The Code?

The code can be found here: https://github.com/sachabarber/SachaBarber.QuartzJobUpdate


Out with RavenDB Embedded, in with LiteDB

I recently started (and have now finished) writing a small internal web site using the following things:

  • WebApi2
  • OAuth
  • JWT support
  • OWIN
  • AutoFac
  • Raven DB Embedded
  • Aurelia.io for front end

I have to say it worked out great; it was a pleasure to work on, all the way through.

I quite like Raven embedded for this type of app. It's completely standalone, and does just what I need from it.

So I got to the end of the project, and I was pretty sure I had checked that we had licenses for everything I was using. Turns out we didn't have one for RavenDB.

Mmm. This app was really a tool to help us internally, so we did not want to spend that much on it.

Shame as I like Raven. I started to look around for another tool that could fit the bill.

This was my shopping list

  • Had to be .NET
  • Had to support document storage
  • Had to have LINQ support
  • Had to support same set of features that I was using as Raven Embedded (CRUD + indexes essentially)
  • Had to be free
  • Had to be embedded as single Dll

It did not take me long to stumble upon LiteDB.

This ticked all my boxes and more. I decided to try it out in a little Console app to test it, and was extremely happy. I did not do any performance testing, as that is not such a concern for the app that I was building, but from an API point of view, it would prove to be very easy to replace the Raven Embedded code I had written so far.

I was happy.

Just thought I would show you all a little bit of its usage right here

 

Installation

This is done via NuGet. The package is called “LiteDB”

CRUD

Assuming we have this entity

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace LiteDBDemo
{
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string[] Phones { get; set; }
        public bool IsActive { get; set; }

        public override string ToString()
        {
            return string.Format("Id : {0}, Name : {1}, Phones : {2}, IsActive : {3}",
                Id,
                Name,
                Phones.Aggregate((x, y) => string.Format("{0}{1}", x, y)),
                IsActive);
        }
    }
}

Here is how you might use LiteDB to perform CRUD operations. See how it has the concept of collections; this is kind of like MongoDB, if you have used that.

using LiteDB;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace LiteDBDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            // Open database (or create if it does not exist)
            using (var db = new LiteDatabase(@"MyData.db"))
            {
                //clean out entire collection, will drop documents, indexes everything
                db.GetCollection<Customer>("customers").Drop();


                Create(db);
                Read(db);
                Update(db);
                Delete(db);
            }
            Console.ReadLine();
        }

        private static void Create(LiteDatabase db)
        {
            Console.WriteLine("\r\nCREATE\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");

            // Create your new customer instance
            var customer = new Customer
            {
                Name = "John Doe",
                Phones = new string[] { "8000-0000", "9000-0000" },
                IsActive = true
            };

            // Insert new customer document (Id will be auto-incremented)
            customers.Insert(customer);
            Console.WriteLine("Inserted customer");
        }

        private static void Read(LiteDatabase db)
        {
            Console.WriteLine("\r\nREAD\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");

            // Index document using a document property
            customers.EnsureIndex(x => x.Name);

            // Use Linq to query documents
            var firstCustomer = customers.Find(x => x.Name.StartsWith("Jo")).FirstOrDefault();
            Console.WriteLine(firstCustomer);

        }

        private static void Update(LiteDatabase db)
        {
            Console.WriteLine("\r\nUPDATE\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");
            // Use Linq to query documents
            var johnDoe = customers.Find(x => x.Name == "John Doe").First();
            Console.WriteLine("Before update");
            Console.WriteLine(johnDoe);

            johnDoe.Name = "John Doe MODIFIED";
            customers.Update(johnDoe);

            var johnDoe2 = customers.Find(x => x.Name == "John Doe MODIFIED").First();
            Console.WriteLine("Read updated");
            Console.WriteLine(johnDoe2);

        }

        private static void Delete(LiteDatabase db)
        {
            Console.WriteLine("\r\nDELETE\r\n");

            // Get customer collection
            var customers = db.GetCollection<Customer>("customers");
            // Use Linq to query documents
            customers.Delete(x => x.Name == "John Doe MODIFIED");
            Console.WriteLine("Deleting Name = 'John Doe MODIFIED'");

            var johnDoe = customers.Find(x => x.Name == "John Doe MODIFIED").FirstOrDefault();
            Console.WriteLine("Looking for Name = 'John Doe MODIFIED'");
            Console.WriteLine(johnDoe == null ? "It's GONE" : johnDoe.ToString());


        }

    }
}

You can learn more about this over at the LiteDB website

http://www.litedb.org/

 

Overall I was very happy with LiteDB. I particularly like the fact that it was free, and it did pretty much exactly the same job as RavenDB Embedded (sometimes it was easier to work with too).

I would use this library again for sure, I found it spot on to be honest.

Like a nice Gin and Tonic on a summer's day.

 

Entity framework 7 in memory provider test

A while ago I wrote an article, http://www.codeproject.com/Articles/875165/To-Repository-Or-NOT, which talked about how to test repository classes both with and without Entity Framework.

It has been a while and I am just about to start a small project for one of the Admin staff at work, to aid her in her day to day activities.

As always there will be a database involved.

I will likely be using OWIN and/or MVC5, with Aurelia.IO for the client side.

I am not sure about the database yet, so I decided to try out the in-memory support in the yet-to-be-released Entity Framework 7.

Grabbing The Nuget Package

So let's have a look. The first thing you will need to do is grab the NuGet package, which for me was as easy as using the NuGet package window in Visual Studio 2015.

[Screenshot: NuGet Package Manager window in Visual Studio 2015]

The package name is “EntityFramework.InMemory”; this will bring in the other bits and pieces you need.

NOTE: This is a pre-release NuGet package, so you will need to include prerelease packages.

 

The Model

So now that I have the correct packages in place it's just a question of crafting some model classes. I am using the following two:

Person

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    public class Person
    {
        public Person()
        {
            Qualifications = new List<Qualification>();
        }

        public int Id { get; set; }

        public string FirstName { get; set; }

        public string LastName { get; set; }

        public ICollection<Qualification> Qualifications { get; set; }

        public override string ToString()
        {
            string qualifications = Qualifications.Any() ?
                    Qualifications.Select(x => x.Description)
                        .Aggregate((x, y) => string.Format("{0} {1}", x, y)) :
                    string.Empty;

            return string.Format("Id : {0}, FirstName : {1}, LastName : {2}, \r\nQualifications : {3}\r\n",
                        Id, FirstName, LastName, qualifications);
        }
    }
}

Qualification

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    public class Qualification
    {
       
        public int Id { get; set; }

        public string Description { get; set; }

        public override string ToString()
        {
            return string.Format("Id : {0}, Description : {1}",
                        Id, Description);
        }

    }
}

 

Custom DbContext

Nothing more to it than that. So now let's look at creating a DbContext which has our stuff in it. For me this again is very simple, I just do this:

using Microsoft.Data.Entity;
using Microsoft.Data.Entity.Infrastructure;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    public class ClassDbContext : DbContext
    {
        public ClassDbContext(DbContextOptions options)
            : base(options)
        {
        }

        public DbSet<Person> Members { get; set; }
        public DbSet<Qualification> Qualifications { get; set; }
    }
}

Writing Some Test Code Using The InMemory Provider

So now that we have all the pieces in place, let's run some code to do a few things:

  1. Seed some data
  2. Obtain a Person
  3. Add a Qualification to the Person obtained

Here is all the code to do this

using Microsoft.Data.Entity;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace EF7_InMemoryProviderTest
{
    class Program
    {
        static void Main(string[] args)
        {
            var optionsBuilder = new DbContextOptionsBuilder<ClassDbContext>();
            optionsBuilder.UseInMemoryDatabase();

            using (var classDbContext = new ClassDbContext(optionsBuilder.Options))
            {
                SeedData(classDbContext);

                var personId1 = GetMember(classDbContext, 1);
                Console.WriteLine("> Adding a qualification\r\n");
                personId1.Qualifications.Add(classDbContext.Qualifications.First());
                classDbContext.SaveChanges();

                personId1 = GetMember(classDbContext, 1);


                Console.ReadLine();

            }
        }

        private static Person GetMember(ClassDbContext classDbContext, int id)
        {
            var person = classDbContext.Members.FirstOrDefault(x => x.Id == id);
            Console.WriteLine(person);
            return person;
        }


        private static void SeedData(ClassDbContext classDbContext)
        {
            classDbContext.Members.Add(new Person()
                {
                    Id = 1,
                    FirstName = "Sacha",
                    LastName = "Barber"
                });
            classDbContext.Members.Add(new Person()
                {
                    Id = 2,
                    FirstName = "Sarah",
                    LastName = "Barber"
                });

            classDbContext.Qualifications.Add(new Qualification()
                {
                    Id = 1,
                    Description = "Bsc Hons : Computer Science"
                });
            classDbContext.Qualifications.Add(new Qualification()
                {
                    Id = 2,
                    Description = "Msc : Computer Science"
                });
            classDbContext.Qualifications.Add(new Qualification()
                {
                    Id = 3,
                    Description = "Bsc Hons : Naturapathic medicine"
                });

            classDbContext.SaveChanges();
        }
    }
}

And here are the results:

[Screenshot: console output showing the person before and after the qualification was added]

Closing Note

I am quite happy with how easy this was, and I think I would definitely try this out for real.

If you want to play along, I have a demo project (for Visual Studio 2015) here:

https://github.com/sachabarber/EF7_InMemoryTest

.NET -> Scala Interop Using RabbitMQ

I have taken on a new job where we are using a lot of .NET, but we are also using lots of Scala. My work colleague and I were talking about how to get different actor frameworks (well, Akka on the JVM and Akka.NET in .NET) to talk to each other. This does not seem possible right now, as the exact wire protocol would need to be decided and implemented.

So we thought: OK, how about each side sticks to its actor framework of choice, and we send messages from .NET to Scala via some sort of messaging solution, with the messages serialized as JSON.

So I decided to take a stab at that. I have used a few messaging solutions in my time, and I decided to go for RabbitMQ, as that is something I know and like (I still need to learn Kafka; one day, Kafka, one day).
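To make the idea concrete, here is a minimal sketch of what the .NET publishing side could look like. The message shape, queue name, and host are purely illustrative assumptions, and the actual publish (which needs a running broker plus the RabbitMQ.Client package) is shown in comments; only the JSON construction runs as-is:

```csharp
using System;
using System.Text;

class JsonInteropDemo
{
    static void Main()
    {
        // Hand-rolled JSON purely for illustration; in a real app you
        // would use a proper serializer such as Json.NET
        string json = "{\"sender\":\"dotnet\",\"payload\":\"hello scala\"}";
        byte[] body = Encoding.UTF8.GetBytes(json);
        Console.WriteLine(body.Length);

        // Publishing with the RabbitMQ.Client package would look roughly like:
        // var factory = new ConnectionFactory { HostName = "localhost" };
        // using (var connection = factory.CreateConnection())
        // using (var channel = connection.CreateModel())
        // {
        //     channel.QueueDeclare("interop", durable: false, exclusive: false,
        //         autoDelete: false, arguments: null);
        //     channel.BasicPublish(exchange: "", routingKey: "interop",
        //         basicProperties: null, body: body);
        // }
    }
}
```

The Scala consumer would then deserialize the same JSON into its own case classes, which is what keeps the two actor frameworks decoupled.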

If this sounds like it could be of interest to you, I wrote this up here:

http://www.codeproject.com/Articles/1037532/NET-Scala-Interop-Using-RabbitMQ

 

App.Config Transforms Outside Of Web Project

This is a weird post in some ways, as it is new for me but certainly VERY old for others. I imagine web developers have known for years how to use the Web.config XSLT transforms MSBuild task. If you have not heard of it: quite simply, it allows you to have a single Web.config file plus a number of other config files in which ONLY the transformations are declared. When the XSLT transforms MSBuild task runs, it takes the source Web.config file along with a transformation .config file and produces a new .config file, which you would use as part of your deployment process.

 

I have myself known about this for years too; I have even known about the Microsoft MSBuild team's Slow Cheetah project, which allows you to use this same technique outside of web projects. Thing is, what I have always done is have a bunch of .config files (one XXX.LIVE.config, one XXX.QA.config, and so on) that I would rename and deploy with some clever scripts.

 

I recently had to do a bit of work on a project that made use of the Web.config XSLT transforms MSBuild task, and I could clearly see in the MSBuild file that this was just an ordinary MSBuild task. So I thought it must be easy enough to use standalone. Turns out it is; you DO NOT really need to use Slow Cheetah at all. You just need to know where the Web.config XSLT transforms MSBuild task lives and how to use it.

 

The rest of this post will talk you through how to do this.

 

Suppose you have this App.Config you wish to transform

 

We will concentrate on just a few areas here, those area are the ones that are going to change between environments:

 

<?xml version="1.0" encoding="utf-8"?>
<configuration>
 
  <configSections>
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
    <section name="shareSettings" type="SimpleConfig.Section, SimpleConfig" />
  </configSections>
 
  <shareSettings
      productName="Shipping"
      ftpPath="D:\ShippingRoutes">
  </shareSettings>
 
  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" 
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <extensions>
      <add assembly="Gelf4NLog.Target"/>
    </extensions>
    <targets async="true">
      <target name="graylog"
          xsi:type="graylog"
          hostip="dev-logging"
          hostport="12200"
          Facility="CoolService">
        <parameter name="exception" layout="${exception:format=tostring}" optional="true" />
        <parameter name="processname" layout="${processname}" />
        <parameter name="logger" layout="${logger}" />
        <parameter name="treadid" layout="${threadid}" />
      </target>
      <target name="file" xsi:type="File"
              layout="${longdate} | ${level} | ${message}${onexception:${newline}EXCEPTION\:${exception:format=tostring,StackTrace}}"
              fileName="c:/temp/CoolService-${shortdate}.log" />
    </targets>
    <rules>
      <logger name="NHibernate.*" minlevel="Off" writeTo="graylog" final="true" />
      <logger name="NHibernate.*" minlevel="Error" writeTo="file" final="true" />
      <logger name="*" minlevel="Off" writeTo="graylog" />
      <logger name="*" minlevel="trace" writeTo="file" />
    </rules>
  </nlog>
 
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" />
  </startup>
 
  <system.serviceModel>
    <diagnostics performanceCounters="All" />
 
    <bindings>
      <netTcpBinding>
        <binding name="tcpBinding" 
		maxReceivedMessageSize="2147483647" 
		closeTimeout="00:59:00" 
		openTimeout="00:59:00" 
		receiveTimeout="00:59:00" 
		sendTimeout="00:59:00">
          <security mode="None" />
          <readerQuotas maxStringContentLength="8192" 
			maxArrayLength="20971520" />
        </binding>
      </netTcpBinding>
    </bindings>
 
 
    <client>
      <!-- CoolService -->
      <endpoint name="coolServiceEndpoint" 
	        address="net.tcp://localhost:63006/CoolService" 
		binding="netTcpBinding"
                bindingConfiguration="tcpBinding" 
                contract="Services.ICoolService" />
    </client>
  </system.serviceModel>
 
  <system.diagnostics>
    <sources>
      <source 	name="System.ServiceModel" 
		switchValue="All" 
		propagateActivity="true">
        <listeners>
          <add name="traceListener" 
		type="System.Diagnostics.XmlWriterTraceListener" 
		initializeData="c:\temp\CoolService.svclog"/>
        </listeners>
      </source>
    </sources>
  </system.diagnostics>
 
 
</configuration>

 

  • Custom config section (NOTE I am using SimpleConfig to do that, which is awesome)
  • NLog logging settings
  • WCF client section
  • Diagnostics WCF section

 

So Now Show Me The Transformations

Now this post will not (and is not meant to) teach you all about the Web.config XSLT transforms MSBuild task, but rather shall show you an example. So on with the example: suppose we want to create a LIVE config file where we change the following:

 

  • Custom config section (NOTE I am using SimpleConfig to do that, which is awesome) (CHANGE ATTRIBUTES)
  • NLog logging settings (CHANGE Logger/Target)
  • WCF client section (CHANGE ADDRESS)
  • Diagnostics WCF section (REMOVE IT)

 

Here is how we could do that (say it's called “CoolService.LIVE.config”):

 

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <shareSettings xdt:Transform="SetAttributes" 
		xdt:Locator="Match(productName)"  
		productName="Shipping"
      		ftpPath="\\shipping\ShippingRoutes" />
                 
  <nlog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 		
        xmlns="http://www.nlog-project.org/schemas/NLog.xsd">
      <targets>
      	<target xdt:Transform="SetAttributes" 
		xdt:Locator="Match(name)" 
		name="graylog" 
		hostip="app-logging" />
                                               
      	<target xdt:Transform="SetAttributes" 
		xdt:Locator="Match(name)" 
		name="file" 
		fileName="D:/logs/CoolService-${shortdate}.log" />
     </targets>
     <rules>
     	<logger xdt:Transform="SetAttributes" 
		xdt:Locator="Match(writeTo)" 
		minlevel="trace" 
		writeTo="graylog"/>
     </rules>
  </nlog>
 
  <system.serviceModel>
    <client>
      <endpoint xdt:Transform="SetAttributes" 
		xdt:Locator="Match(name)"
		name="coolServiceEndpoint" 		
	        address="net.tcp://appCoolService:63006/CoolService"  />
    </client>
  </system.serviceModel>
 
  <system.diagnostics xdt:Transform="Remove" />

</configuration>

 

 

So How Do We Apply These Transforms

To actually apply these transforms, we can easily craft a simple MSBuild project file, such as (say it's called “Transforms.proj”):

 

<Project ToolsVersion="4.0" 
	DefaultTargets="Release" 
	xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <UsingTask 
	TaskName="TransformXml" 
	AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v12.0\Web\Microsoft.Web.Publishing.Tasks.dll"/>
 
    <ItemGroup>
        <Config Include="LIVE"><Environment>LIVE</Environment></Config>
        <Config Include="QA"><Environment>QA</Environment></Config>       
    </ItemGroup>
 
    <Target Name="Release">
        <MakeDir Directories="CoolService\Configuration\%(Config.Environment)"/>
 
        <TransformXml Source="App.config"
                     Transform="CoolService.%(Config.Identity).config"
                     Destination="CoolService\Configuration\%(Config.Environment)\CoolService.exe.config"/>
    </Target>
</Project>

 

Where $(MSBuildExtensionsPath) will likely be something like “C:\Program Files (x86)\MSBuild\”. So once we have an MSBuild file like this in place, it is just a simple matter of running MSBuild, something like:

 

MSBUILD Transforms.proj

 

Which will result in the following being produced:

 

<?xml version="1.0" encoding="utf-8"?>
<configuration>
 
  <configSections>
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
    <section name="shareSettings" type="SimpleConfig.Section, SimpleConfig" />
  </configSections>
 
  <shareSettings
      productName="Shipping"
      ftpPath="\\shipping\ShippingRoutes">
  </shareSettings>
 
  <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" 
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <extensions>
      <add assembly="Gelf4NLog.Target"/>
    </extensions>
    <targets async="true">
      <target name="graylog"
          xsi:type="graylog"
          hostip="app-logging"
          hostport="12200"
          Facility="CoolService">
        <parameter name="exception" layout="${exception:format=tostring}" optional="true" />
        <parameter name="processname" layout="${processname}" />
        <parameter name="logger" layout="${logger}" />
        <parameter name="treadid" layout="${threadid}" />
      </target>
      <target name="file" xsi:type="File"
              layout="${longdate} | ${level} | ${message}${onexception:${newline}EXCEPTION\:${exception:format=tostring,StackTrace}}"
              fileName="D:/logs/CoolService-${shortdate}.log" />
    </targets>
    <rules>
      <logger name="NHibernate.*" minlevel="trace" writeTo="graylog" final="true" />
      <logger name="NHibernate.*" minlevel="Error" writeTo="file" final="true" />
      <logger name="*" minlevel="trace" writeTo="graylog" />
      <logger name="*" minlevel="trace" writeTo="file" />
    </rules>
  </nlog>
 
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" />
  </startup>
 
  <system.serviceModel>
    <diagnostics performanceCounters="All" />
 
    <bindings>
      <netTcpBinding>
        <binding name="tcpBinding" 
		maxReceivedMessageSize="2147483647" 
		closeTimeout="00:59:00" 
		openTimeout="00:59:00" 
		receiveTimeout="00:59:00" 
		sendTimeout="00:59:00">
          <security mode="None" />
          <readerQuotas maxStringContentLength="8192" 
			maxArrayLength="20971520" />
        </binding>
      </netTcpBinding>
    </bindings>
 
 
    <client>
      <!-- CoolService -->
      <endpoint name="coolServiceEndpoint" 
	        address="net.tcp://appCoolService:63006/CoolService" 
		binding="netTcpBinding"
                bindingConfiguration="tcpBinding" 
                contract="Services.ICoolService" />
    </client>
  </system.serviceModel>
 
 
</configuration>

A Look At Akka .NET

A while back I wrote an Actor model for NetMQ (the .NET port of ZeroMQ), which is now part of the live codebase; I was happy with this.

 

I do like the idea of Actor Models, where you spin up and talk to an actor, rather than worry about locks/semaphores etc etc.

 

It just gels with me rather well. To this end I have been experimenting with Akka.NET which is a pretty complete port of the original Akka, it is a lot of fun, and a really nice way to write distributed multithreaded code if you ask me.

 

To this end I have written a small introductory article on Akka.NET. If you like the sound of this, you can read the full article over at CodeProject:

 

http://www.codeproject.com/Articles/1007161/A-Look-saAt-Akka-NET

 

Enjoy

Getting LineNumber(s) in your XLINQ

At work the other day I had to do some work with some Xml fragments, which I decided to do using XLinq.

 

Where I wanted to validate a certain fragment, and also get line numbers out of the fragment when it was deemed invalid. Say I had this XML

<?xml version="1.0" encoding="utf-8"?>
<Clients>
  <Client>
    <FirstName>Travis</FirstName>
    <LastName>Bickle</LastName>
  </Client>
  <Client>
    <FirstName>Francis</FirstName>
    <LastName>Bacon</LastName>
  </Client>
</Clients>

XLinq actually supports line numbers by way of the IXmlLineInfo interface.

So say you had some code like this, which grabbed an XText node and wanted to use its line number:

XText travis = (from x in xml.DescendantNodes().OfType<XText>()
                where x.Value == "Travis"
                select x).Single();

var lineInfo = (IXmlLineInfo)travis;
Console.WriteLine("{0} appears on line {1}", travis, lineInfo.LineNumber);

What I was finding though was that my line numbers were always coming out with 0 reported as the LineNumber. Turns out there is an easy win for this; it is to do with how I was initially loading the XDocument. I was doing this:

var xml = XDocument.Load(file);

Which is bad, and will not load the line numbers. You need to do this instead

var xml = XDocument.Load(file, LoadOptions.SetLineInfo);
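To see the difference end to end, here is a small self-contained program (the XML is just an inline version of the Clients example above, and XDocument.Parse behaves the same way as XDocument.Load with respect to LoadOptions):

```csharp
using System;
using System.Linq;
using System.Xml;
using System.Xml.Linq;

class LineInfoDemo
{
    static void Main()
    {
        // Inline version of the Clients XML from above
        string xmlText =
            "<Clients>\n" +
            "  <Client>\n" +
            "    <FirstName>Travis</FirstName>\n" +
            "  </Client>\n" +
            "</Clients>";

        // Without LoadOptions.SetLineInfo every node would report 0
        var xml = XDocument.Parse(xmlText, LoadOptions.SetLineInfo);

        XText travis = xml.DescendantNodes().OfType<XText>()
            .Single(x => x.Value == "Travis");

        var lineInfo = (IXmlLineInfo)travis;
        Console.WriteLine(lineInfo.HasLineInfo()); // True
        Console.WriteLine(lineInfo.LineNumber);    // 3
    }
}
```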

For a much better write-up on all of this, check out this old post by Charlie Calvert; it's much better than my post, and I wish I had found it first:

http://blogs.msdn.com/b/charlie/archive/2008/09/26/linq-farm-linq-to-xml-and-line-numbers.aspx

Xml Schemas From Code / XML validation against Schema File And More

I don’t know about you lot, but I work with XML files a fair bit, though I don’t have to mess around with XSD (XML schema) files that often. And it seems like every time I do, I forget what I did last time. To this end I thought I would write this up somewhere, so I can refer back to it.

 

So what will we cover here?

I will be covering these things:

  1. Create an XML file from C# objects
  2. Create an XSD from an XML file using C#
  3. Validate an XML file against an XSD schema file using C#

 

1. Create an XML file from C# objects

So let's say we have created the following objects in C# that we wish to serialize to XML.

public class OrderList
{
    public OrderList()
    {
        Orders = new List<Order>();
    }

    public List<Order> Orders { get; set; }
}

public class Order
{

    public OrderSummary OrderSummary { get; set; }
    public Customer Customer { get; set; }
}


public class Address
{
    public string AddressLine1 { get; set; }
    public string AddressLine2 { get; set; }
    public string AddressLine3 { get; set; }
    public string City { get; set; }
    public string County { get; set; }
    public string PostCode { get; set; }
}

public class Customer
{
    public string Title { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
    public string Phone { get; set; }

}


public class OrderSummary
{
    public OrderSummary()
    {
        OrderLines = new List<OrderLine>();
    }

    public List<OrderLine> OrderLines { get; set; }
    public Address DeliveryAddress { get; set; }
    public DateTime DeliveryDate { get; set; }

}

public class OrderLine
{
    public decimal ItemQuanity { get; set; }
    public string ItemName { get; set; }
}

 

How do we then take that and save it to XML? Turns out this is very easy; we can just use some code like this:

 

public static void CreateXmlFile(string filename)
{
    Address add = new Address()
    {
        AddressLine1 = "AddressLine1",
        AddressLine2 = "AddressLine2",
        AddressLine3 = "AddressLine3",
        City = "City",
        County = "County",
        PostCode = "PostCode"
    };

    Customer cust = new Customer()
    {
        Email = "Email",
        FirstName = "John",
        LastName = "Barnes",
        Phone = "13311",
        Title = "Mr"
    };


    OrderList orders = new OrderList();


    var orderSummary = new OrderSummary()
    {
        DeliveryAddress = add,
        DeliveryDate = DateTime.Now,
        OrderLines = new List<OrderLine>()
        {
            new OrderLine() {ItemQuanity = 150, ItemName = "TestItem1" },
            new OrderLine() {ItemQuanity = 250, ItemName = "TestItem2" },
            new OrderLine() {ItemQuanity = 4, ItemName = "TestItem3" },
        },
    };


    //order1
    Order order1 = new Order();
    order1.Customer = cust;
    order1.OrderSummary = orderSummary;
    orders.Orders.Add(order1);


    //order2
    Order order2 = new Order();
    order2.Customer = cust;
    order2.OrderSummary = orderSummary;
    orders.Orders.Add(order2);

    XmlSerializer xmlSerializer = new XmlSerializer(typeof(OrderList));

    using (FileStream stream = File.OpenWrite(filename))
    {
        xmlSerializer.Serialize(stream, orders);
    }
}

 

That is enough to write an XML file to disk that matches your C# objects. Cool so far. Let’s continue.

 

2. Create an XSD from an XML file using C#

Now there are many ways to do this. Here are some choices

  • Double-click a valid XML file in Visual Studio and use the XML menu to create a schema
  • Use the xsd.exe command line tool, invoked something like: xsd.exe SomeXmlFile.xml
  • Use C# to programmatically write out an XSD file that matches some object definition

 

In the past I would have used the xsd.exe command line tool to do this. But a strange thing happens when I take the output of that and try to include it in Visual Studio: Visual Studio tries to create a strongly typed DataSet out of the XSD file. I don’t want this; I want it to remain as an XSD schema file. I think there is a way to stop this happening by altering the file produced by xsd.exe, but there are two other ways. For this I have chosen to use a programmatic approach, which is as follows:

 

public static void CreateSchemaFromXml(string fileName)
{

    //CREATE SCHEMA FROM XML

    XmlSerializer xmlSerializer = new XmlSerializer(typeof(OrderList));

    XmlSchemas schemas = new XmlSchemas();
    XmlSchemaExporter exporter = new XmlSchemaExporter(schemas);

    XmlTypeMapping mapping = new XmlReflectionImporter()
        .ImportTypeMapping(typeof(OrderList));
    exporter.ExportTypeMapping(mapping);
    var schemasData = TrimSchema(schemas);

    using (FileStream stream = File.OpenWrite(fileName))
    {
        schemasData.First().Write(stream);
    }
}

private static List<XmlSchema> TrimSchema(XmlSchemas schemas)
{
    List<XmlSchema> schemasData = new List<XmlSchema>(
        schemas.Where(s => s.TargetNamespace != "http://www.w3.org/2001/XMLSchema" &&
        s.TargetNamespace != "http://microsoft.com/wsdl/types/"));

    return schemasData;
}

 

This will produce a valid XSD file on disk that you may then fiddle with, say by adding more restrictions. So now all we need to do is carry out some validation of XML file(s) against this XSD file.

 

3. Validate an XML file against an XSD schema file using C#

 

This is easily achieved using some test code. But before we look at that, this is what the project looks like in Visual Studio

 

[Screenshot: Visual Studio project structure showing the Good and BadAgainstSchema XML folders plus the XSD file]

 

So you can see that there is a folder with good files and another with bad files that I wish to test against the XSD file. Here is the complete test case code:

 

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;
using System.Text;
using System.Threading.Tasks;
using System.Xml.Linq;
using System.Xml.Schema;
using NUnit.Framework;

namespace XmlTests
{
    [TestFixture]
    public class StaticXmlFileTests
    {

        
        //Baddies
        [TestCase(@"\Xml\BadAgainstSchema\OrdersBADExampleFileNoOrderSummary.xml", false)]
        [TestCase(@"\Xml\BadAgainstSchema\OrdersBADExampleFile_AddressLineTooLong.xml", false)]


        //Goodies
        [TestCase(@"\Xml\Good\OrdersGOODExampleFile_FullFeatureSet.xml", true)]
        [TestCase(@"\Xml\Good\OrdersGOODExampleFile_MultipleOrderLines.xml", true)]
        [TestCase(@"\Xml\Good\OrdersGOODExampleFile_MultipleOrders.xml", true)]
        [TestCase(@"\Xml\Good\OrdersGOODExampleFile_SingleOrder.xml", true)]
        [TestCase(@"\Xml\Good\OrdersGOODExampleFile_SingleOrderLine.xml", true)]
        public void TestFileProducesExpectedSchemaValidationResult(string filename, bool exepectedValidationResult)
        {


            var xmlFile = ObtainFullFilePath(filename);
            var xsdFile = ObtainFullFilePath(@"\Xml\OrdersExampleFile.xsd");

            //VALIDATE XML AGAINST SCHEMA C#
            var xdoc = XDocument.Load(xmlFile);
            var schemas = new XmlSchemaSet();
            using (FileStream stream = File.OpenRead(xsdFile))
            {
                schemas.Add(XmlSchema.Read(stream, (s, e) =>
                {
                    var x = e.Message;
                }));
            }

            bool isvalid = true;
            StringBuilder sb = new StringBuilder();
            try
            {
                xdoc.Validate(schemas, (s, e) => 
                    {
                        isvalid = false;
                        sb.AppendLine(string.Format("Line : {0}, Message : {1} ", 
                            e.Exception.LineNumber, e.Exception.Message));
                    });
            }
            catch (XmlSchemaValidationException)
            {
                isvalid = false;
            }

            var validationErrors = sb.ToString();
            Assert.AreEqual(exepectedValidationResult, isvalid);
            if (exepectedValidationResult)
            {
                Assert.AreEqual(string.Empty, validationErrors);
            }
            else
            {
                Assert.AreNotEqual(string.Empty, validationErrors);
            }

        }



        private string ObtainFullFilePath(string fileName)
        {
            var path = TestContext.CurrentContext.TestDirectory;
            return string.Format("{0}{1}", path, fileName);
        }
    }
}

 

And here is a screen shot of all the tests working as expected:

 

[Screenshot: NUnit test runner showing all test cases passing]

 

 

You can find a small project for this on github:  https://github.com/sachabarber/XmlTests

Attached VM Behaviours

I have been using XAML-based tech a long time now, and over the years I have honed my skills with it. Over the past year or so I have also been working on a very large XAML project, and have had time to try out some ideas that I picked up working with some very clever hombres indeed. The idea is to use Rx and a kind of micro-controller, along with ViewModels and child IOC containers, to allow you to build truly large, well-maintained XAML apps.

 

I have taken some time to document this in an article which you can read about over at Codeproject:

 

http://www.codeproject.com/Articles/885009/Attached-VM-Behaviours

 

Enjoy

Bulk Insert Into SQL From C#

The other day at work I had a task that required me to do a bulk insert of data into a SQL Server database table. I have obviously come across (and used in the past) the bcp.exe command line utility, which is all well and good when you want to run scripts.

This time however I wanted to do the bulk insert programmatically, using some standard .NET code. As I say, this is not something I have had to do in code before. So I set out to find out how to do this, and after a few minutes of Googling found the answer I was looking for, which is the SqlBulkCopy class.

This class has been available in .NET since v2.0. I guess if you don’t need these things they sometimes slip you by, which is the case here, for me anyway!

The main methods that you would use in this class are the WriteToServer(..) overloads, of which there are a few that make use of DataTable/DataRow[] and IDataReader:

  • WriteToServer(DataRow[])
  • WriteToServer(DataTable)
  • WriteToServer(IDataReader)
  • WriteToServer(DataTable, DataRowState)
  • WriteToServerAsync(DataRow[])
  • WriteToServerAsync(DataTable)
  • WriteToServerAsync(IDataReader)
  • WriteToServerAsync(DataRow[], CancellationToken)
  • WriteToServerAsync(DataTable, DataRowState)
  • WriteToServerAsync(DataTable, CancellationToken)
  • WriteToServerAsync(IDataReader, CancellationToken)

 

You generally want to make use of the overloads above that take an IDataReader, because a data reader is a forward-only, read-only stream; it does not hold all the data in memory, and is thus much faster than the DataTable and DataRow[] approaches.
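For the simpler DataTable-based overloads, the usage looks roughly as follows. This is a hedged sketch: the connection string, destination table name, and columns are made-up assumptions, so the actual WriteToServer call (which needs a real SQL Server) is shown in comments, while the DataTable construction runs as-is:

```csharp
using System;
using System.Data;

class BulkCopyDemo
{
    static void Main()
    {
        // Build an in-memory DataTable whose columns mirror the target table
        var table = new DataTable("Orders");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("ItemName", typeof(string));

        for (int i = 1; i <= 3; i++)
        {
            table.Rows.Add(i, "TestItem" + i);
        }

        Console.WriteLine(table.Rows.Count);

        // With a real database to hand, the bulk copy itself would look
        // roughly like this (connection string and table name are made up):
        // using (var connection = new SqlConnection("Server=.;Database=Orders;Trusted_Connection=True;"))
        // using (var bulkCopy = new SqlBulkCopy(connection))
        // {
        //     connection.Open();
        //     bulkCopy.DestinationTableName = "dbo.Orders";
        //     bulkCopy.WriteToServer(table);
        // }
    }
}
```

This is fine for small batches, but as noted above, for large volumes an IDataReader-based feed avoids materialising everything in memory first.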

The scenario I was trying to deal with was how to do bulk inserts, and I came across this very good post by Mike Goatly, which goes into a lot of detail.

And there was also this one over at Codeproject by AzamSharp

Azam demonstrates how to use SqlBulkCopy to do a bulk copy, so if that is what you are after check out his article. My scenario was that I wanted to do a bulk insert, and luckily this is exactly what Mike Goatly writes about in his post, which I listed above.

 

Bulk Insert

The trick to doing a bulk insert using SqlBulkCopy is that we need to create a custom IDataReader. This would be a cinch if we could do something like ObjectDataReader&lt;SomeObject&gt; and use that to feed WriteToServer() with a set of objects.

Unfortunately this doesn’t exist, so you’re going to have to implement your own.

public interface IDataReader : IDisposable, IDataRecord
{
   int Depth { get; }
   bool IsClosed { get; }
   int RecordsAffected { get; }
   void Close();
   DataTable GetSchemaTable();
   bool NextResult();
   bool Read();
}

Mike Goatly gives us a working implementation of this, which is as follows:

namespace SqlBulkCopyExample
{
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Linq;
    using System.Linq.Expressions;
    using System.Reflection;

    public class ObjectDataReader<TData> : IDataReader
    {
        /// <summary>
        /// The enumerator for the IEnumerable{TData} passed to the constructor for 
        /// this instance.
        /// </summary>
        private IEnumerator<TData> dataEnumerator;

        /// <summary>
        /// The lookup of accessor functions for the properties on the TData type.
        /// </summary>
        private Func<TData, object>[] accessors;

        /// <summary>
        /// The lookup of property names against their ordinal positions.
        /// </summary>
        private Dictionary<string, int> ordinalLookup;

        /// <summary>
        /// Initializes a new instance of the <see cref="ObjectDataReader&lt;TData&gt;"/> class.
        /// </summary>
        /// <param name="data">The data this instance should enumerate through.</param>
        public ObjectDataReader(IEnumerable<TData> data)
        {
            this.dataEnumerator = data.GetEnumerator();

            // Get all the readable properties for the class and
            // compile an expression capable of reading it
            var propertyAccessors = typeof(TData)
                .GetProperties(BindingFlags.Instance | BindingFlags.Public)
                .Where(p => p.CanRead)
                .Select((p, i) => new
                    {
                        Index = i,
                        Property = p,
                        Accessor = CreatePropertyAccessor(p)
                    })
                .ToArray();

            this.accessors = propertyAccessors.Select(p => p.Accessor).ToArray();
            this.ordinalLookup = propertyAccessors.ToDictionary(
                p => p.Property.Name,
                p => p.Index,
                StringComparer.OrdinalIgnoreCase);
        }

        /// <summary>
        /// Creates a property accessor for the given property information.
        /// </summary>
        /// <param name="p">The property information to generate the accessor for.</param>
        /// <returns>The generated accessor function.</returns>
        private Func<TData, object> CreatePropertyAccessor(PropertyInfo p)
        {
            // Define the parameter that will be passed - will be the current object
            var parameter = Expression.Parameter(typeof(TData), "input");

            // Define an expression to get the value from the property
            var propertyAccess = Expression.Property(parameter, p.GetGetMethod());

            // Make sure the result of the get method is cast as an object
            var castAsObject = Expression.TypeAs(propertyAccess, typeof(object));

            // Create a lambda expression for the property access and compile it
            var lamda = Expression.Lambda<Func<TData, object>>(castAsObject, parameter);
            return lamda.Compile();
        }

        #region IDataReader Members

        public void Close()
        {
            this.Dispose();
        }

        public int Depth
        {
            get { return 1; }
        }

        public DataTable GetSchemaTable()
        {
            return null;
        }

        public bool IsClosed
        {
            get { return this.dataEnumerator == null; }
        }

        public bool NextResult()
        {
            return false;
        }

        public bool Read()
        {
            if (this.dataEnumerator == null)
            {
                throw new ObjectDisposedException("ObjectDataReader");
            }

            return this.dataEnumerator.MoveNext();
        }

        public int RecordsAffected
        {
            get { return -1; }
        }

        #endregion

        #region IDisposable Members

        public void Dispose()
        {
            this.Dispose(true);
            GC.SuppressFinalize(this);
        }

        protected void Dispose(bool disposing)
        {
            if (disposing)
            {
                if (this.dataEnumerator != null)
                {
                    this.dataEnumerator.Dispose();
                    this.dataEnumerator = null;
                }
            }
        }

        #endregion

        #region IDataRecord Members

        public int FieldCount
        {
            get { return this.accessors.Length; }
        }

        public bool GetBoolean(int i)
        {
            throw new NotImplementedException();
        }

        public byte GetByte(int i)
        {
            throw new NotImplementedException();
        }

        public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferoffset, int length)
        {
            throw new NotImplementedException();
        }

        public char GetChar(int i)
        {
            throw new NotImplementedException();
        }

        public long GetChars(int i, long fieldoffset, char[] buffer, int bufferoffset, int length)
        {
            throw new NotImplementedException();
        }

        public IDataReader GetData(int i)
        {
            throw new NotImplementedException();
        }

        public string GetDataTypeName(int i)
        {
            throw new NotImplementedException();
        }

        public DateTime GetDateTime(int i)
        {
            throw new NotImplementedException();
        }

        public decimal GetDecimal(int i)
        {
            throw new NotImplementedException();
        }

        public double GetDouble(int i)
        {
            throw new NotImplementedException();
        }

        public Type GetFieldType(int i)
        {
            throw new NotImplementedException();
        }

        public float GetFloat(int i)
        {
            throw new NotImplementedException();
        }

        public Guid GetGuid(int i)
        {
            throw new NotImplementedException();
        }

        public short GetInt16(int i)
        {
            throw new NotImplementedException();
        }

        public int GetInt32(int i)
        {
            throw new NotImplementedException();
        }

        public long GetInt64(int i)
        {
            throw new NotImplementedException();
        }

        public string GetName(int i)
        {
            throw new NotImplementedException();
        }

        public int GetOrdinal(string name)
        {
            int ordinal;
            if (!this.ordinalLookup.TryGetValue(name, out ordinal))
            {
                throw new InvalidOperationException("Unknown parameter name " + name);
            }

            return ordinal;
        }

        public string GetString(int i)
        {
            throw new NotImplementedException();
        }

        public object GetValue(int i)
        {
            if (this.dataEnumerator == null)
            {
                throw new ObjectDisposedException("ObjectDataReader");
            }

            return this.accessors[i](this.dataEnumerator.Current);
        }

        public int GetValues(object[] values)
        {
            throw new NotImplementedException();
        }

        public bool IsDBNull(int i)
        {
            throw new NotImplementedException();
        }

        public object this[string name]
        {
            get { throw new NotImplementedException(); }
        }

        public object this[int i]
        {
            get { throw new NotImplementedException(); }
        }

        #endregion
    }
}
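The clever part of the reader above is CreatePropertyAccessor, which compiles an expression tree once per property, so that every subsequent GetValue call invokes a cheap compiled delegate rather than slow PropertyInfo.GetValue reflection. As a standalone sketch of that same idea (the Person type here is a made-up example, not from Mike's project):

```csharp
using System;
using System.Linq.Expressions;
using System.Reflection;

public static class AccessorSketch
{
    public class Person
    {
        public string Name { get; set; }
    }

    // Compile "input => (object)input.SomeProperty" once; the resulting
    // delegate is far cheaper to invoke per row than reflection would be
    public static Func<T, object> CreateAccessor<T>(PropertyInfo p)
    {
        var parameter = Expression.Parameter(typeof(T), "input");
        var propertyAccess = Expression.Property(parameter, p.GetGetMethod());
        var castAsObject = Expression.TypeAs(propertyAccess, typeof(object));
        return Expression.Lambda<Func<T, object>>(castAsObject, parameter).Compile();
    }
}
```

So when SqlBulkCopy pulls ten thousand rows through the reader, the reflection cost is paid once per property up front, not once per cell.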

With this very useful code that Mike provides all we need to do is something like this to bulk insert using a IDataReader using the SqlBulkCopy class:

namespace SqlBulkCopyExample
{
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Diagnostics;
    using System.Linq;
    using SqlBulkCopyExample.Properties;

    class Program
    {
        static void Main(string[] args)
        {
            var people = CreateSamplePeople(10000);

            using (var connection = new SqlConnection(
		"Server=.;Database=MostWanted;Integrated Security=SSPI"))
            {
                connection.Open();
                InsertDataUsingSqlBulkCopy(people, connection);
            }
        }

       

        private static void InsertDataUsingSqlBulkCopy(
		IEnumerable<Person> people, SqlConnection connection)
        {
            var bulkCopy = new SqlBulkCopy(connection);
            bulkCopy.DestinationTableName = "Person";
            bulkCopy.ColumnMappings.Add("Name", "Name");
            bulkCopy.ColumnMappings.Add("DateOfBirth", "DateOfBirth");

            using (var dataReader = new ObjectDataReader<Person>(people))
            {
                bulkCopy.WriteToServer(dataReader);
            }
        }

       
        private static IEnumerable<Person> CreateSamplePeople(int count)
        {
            return Enumerable.Range(0, count)
                .Select(i => new Person
                    {
                        Name = "Person" + i,
                        DateOfBirth = new DateTime(
				1950 + (i % 50), 
				((i * 3) % 12) + 1, 
				((i * 7) % 29) + 1)
                    });
        }
    }
}
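For completeness, the demo above assumes a Person class and a matching Person table in the MostWanted database; a minimal sketch of both is below (the exact schema in Mike's demo project may differ, this is just what the two column mappings imply):

```csharp
using System;

// Matches the two column mappings used in InsertDataUsingSqlBulkCopy
public class Person
{
    public string Name { get; set; }
    public DateTime DateOfBirth { get; set; }
}

/* A matching SQL Server table might look like:

   CREATE TABLE Person
   (
       Id INT IDENTITY(1,1) PRIMARY KEY,
       Name NVARCHAR(100) NOT NULL,
       DateOfBirth DATETIME NOT NULL
   );
*/
```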

I grabbed the bulk of this code from Mike's original post, where he does a much more thorough job of explaining things, and has a nice little demo project where you can compare standard one-by-one inserts against this approach; the difference is huge.

Happy days. Thanks Mike, you certainly made my day a lot easier.