CQRS Demo

For a while now I have found myself becoming interested in CQRS, and I am fortunate enough to work with a practitioner of CQRS. As such it seemed like a good time to try and learn a bit more about this pattern.

I have created a small demo app that is a fully asynchronous CQRS example.

If this sounds like it may be of interest to you, you can read more about it over at CodeProject: CQRS : A Cross Examination Of How It Works

Git protocol errors when using Bower package manager

I have just got back from a month-long holiday (which was great). Anyway, back to work now….. sigh

So the other day I was trying to get Yeoman to scaffold a new angular.js app for me, which worked fine. I then wanted to use the Bower package manager to download a package, and whoever created the package hosted it on Git. Bower can deal with this just fine. But if, like me, your network is locked down with all sorts of firewall/proxy rules, you may not be able to use the git:// protocol.

Luckily this is an easy fix: all you need to do is issue the following command to have git add a configuration rule that rewrites git:// URLs to https:

git config --global url."https://".insteadOf git://

What Changes Did This Command Make?

Take a look at your global configuration using:

git config --list

You’ll see the following line in the output:

url.https://.insteadof=git://

You can see how this looks on disk by taking a peek at ~/.gitconfig, where you should now see that the following two lines have been added:

[url "https://"]
    insteadOf = git://

And that is all there is to it, everything just worked after that.

Azure Cloud Service : Inter role communications

I have been doing a bit more with Azure of late, and one of the things I wanted to try out was inter-role communication between cloud service roles. You can obviously use the Azure Service Bus, where topic-based subscriptions may make some sense, but I felt there might be a lighter-weight approach, and set out to explore this.

I was not alone in this thinking, and found some good stuff out there, which I have reworked into the article below:

 

http://www.codeproject.com/Articles/888469/Azure-Cloud-Service-Inter-role-communications

 

I hope some of you find it useful
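
As a quick taster of the sort of lighter-weight approach I mean: each role instance can expose an internal endpoint in its service definition, which the other instances can then discover and talk to directly. The sketch below is exactly that, a sketch; the role name, the endpoint name and the raw TCP send are assumptions of mine for illustration, not necessarily what the article itself does.

// Minimal sketch: find the internal endpoints of the other instances of a role
// and push a message to each of them over raw TCP.
// "WorkerRole" and "InterRoleComms" are assumed names; they must match what is
// declared in your ServiceDefinition.csdef.
using System.Net.Sockets;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class InterRoleCommsSketch
{
    public static void SendToPeers(byte[] message)
    {
        foreach (var instance in RoleEnvironment.Roles["WorkerRole"].Instances)
        {
            // Don't send to ourselves
            if (instance.Id == RoleEnvironment.CurrentRoleInstance.Id)
                continue;

            var endpoint = instance.InstanceEndpoints["InterRoleComms"].IPEndpoint;

            using (var client = new TcpClient())
            {
                client.Connect(endpoint.Address, endpoint.Port);
                client.GetStream().Write(message, 0, message.Length);
            }
        }
    }
}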

Attached VM Behaviours

I have been using XAML-based tech for a long time now, and over the years I have honed my skills with it. Over the past year or so I have also been working on a very large XAML project, and have had time to try out some ideas that I picked up working with some very clever hombres indeed. The idea is to use Rx and a kind of micro-controller, along with ViewModels and child IoC containers, to allow you to build truly big, yet well-maintained, XAML apps.

 

I have taken some time to document this in an article, which you can read over at CodeProject:

 

http://www.codeproject.com/Articles/885009/Attached-VM-Behaviours

 

Enjoy

Triggers/Rowcount And NHibernate

 

The Short Story

If you have seen this error whilst using NHibernate with triggers, this may be the right post for you:

“Batch update returned unexpected row count from update; actual row count: 2; expected: 1”.

 

Long Story

So today I had to create some audit tables. OK, some of you will surely go “hey, why didn’t you just use CQRS for that?” Yes, yes, I know that by using CQRS and event sourcing I would indeed get a full audit trail by way of the stored events (in fact I will be writing about that soon), but for now let’s forget about that and just stick to the current situation, which is:

 

  1. I have a table that I want to provide auditing for
  2. I am using SQL Server
  3. I am using Fluent NHibernate

 

Now there are numerous ways you might perform auditing in SQL Server. Later versions of SQL Server come with built-in audit functionality, or you could use a general, all-purpose audit table which records very generic information such as:

 

  • The table name
  • The action performed (insert, update, delete)
  • The old value
  • The new value
  • Some narrative
  • Some date information

 

This is fine, but for my requirements what I wanted was a full copy of the row from the original table, plus some extra columns such as:

 

  • AuditId (primary key, auto-generated by the DB (Identity))
  • OperationType: I for insert, D for delete, U for update

 

So let’s have a look at this using some code examples.

 

SQL Server Table(s)

Let’s say I have the following table called “Deposit” in SQL Server.

 

Deposit Table (source for Audit)

 

CREATE TABLE [dbo].[Deposit](
      [Id] [int] IDENTITY(1,1) NOT NULL,
      ….
      ….
      ….
      [Version] [timestamp] NOT NULL,
CONSTRAINT [PK_Deposit] PRIMARY KEY CLUSTERED
(
      [Id] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

 

And I also have the following Audit table

Deposit_Audit table

 

CREATE TABLE [dbo].[Deposit_Audit](
      [AuditId] [int] IDENTITY(1,1) NOT NULL,
      [OperationType] [nvarchar](1) NOT NULL,
      [Id] [int] NOT NULL,
      ….
      ….
      ….
      ….
CONSTRAINT [PK_Deposit_Audit] PRIMARY KEY CLUSTERED
(
      [AuditId] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

 

The Fluent NHibernate Entities

So let’s say I have a simple class called “Deposit” which looks like this. Note that for Fluent NHibernate/NHibernate to do their magic, properties MUST be virtual.

 

using System;
namespace Entities
{
    public class Deposit : Entity<int>
    {
        //NOTE : Do not remove this, as NHibernate needs default ctor to allow proxying
        public Deposit() { }
        public virtual int Id { get; set; }
        ….
        ….
        ….
        ….
        public virtual byte[] Version { get; set; }
    }
}

 

The Fluent NHibernate Mapping Files

I then have a Fluent NHibernate mapping file that looks like this:

 

namespace Entities
{
    public class DepositMap : ClassMap<Deposit>
    {
        public DepositMap()
        {
            this.Schema("[dbo]");
            OptimisticLock.Version();
            this.Table("[Deposit]");
            this.Id(x => x.Id).Column("Id");
            ….
            ….
            ….
            Version(x => x.Version).Generated.Always();
        }
    }
}
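
For completeness, the mapping then gets wired into a session factory in the usual Fluent NHibernate way. Something roughly like the following (the connection string here is an assumption for illustration only; your configuration will differ):

using Entities;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build()
    {
        return Fluently.Configure()
            // Assumed connection string, pointing at the same database the trigger below uses
            .Database(MsSqlConfiguration.MsSql2008
                .ConnectionString("Server=.;Database=Options;Integrated Security=SSPI"))
            // Pick up DepositMap (and any other ClassMaps) from the Entities assembly
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<DepositMap>())
            .BuildSessionFactory();
    }
}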

 

 

The Trigger To Do The Audit

Then I have this trigger in place in SQL Server to actually do the inserting of the data into the audit table:

 

CREATE TRIGGER [dbo].[triDeposit_Audit]
   ON [dbo].[Deposit]
   AFTER INSERT, DELETE, UPDATE
AS

BEGIN

      DECLARE @rc AS INT;
      DECLARE @Id INT
      DECLARE @OperationType NVARCHAR(1)
      DECLARE @rcDel AS INT;
      DECLARE @rcInserted AS INT;

      SET @rc = (SELECT COUNT(*) FROM inserted);
      IF @rc = 0 RETURN;

      DECLARE @keycol AS INT;

      SELECT

            @keycol = ISNULL(inserted.Id, deleted.Id),
            @OperationType  =
            CASE

                              WHEN inserted.Id IS NULL THEN 'D' -- we don't use this for now
                              WHEN deleted.Id IS NULL THEN 'I'
                              ELSE 'U'
            END
            FROM inserted
            FULL OUTER JOIN deleted
            ON inserted.Id = deleted.Id

     

      -- single row
      IF @rc = 1

            BEGIN

                  INSERT INTO [Options].[dbo].[Deposit_Audit]

                  (    [OperationType]
                        ,[Id]
            		…..
            		…..
            		…..
            		…..
                  )             

                  SELECT     
                        @OperationType         
                	….
                	….
                	….
                	….
                  FROM [Options].[dbo].[Deposit] WHERE Id = @keycol
            END 
     

      -- multiple rows
      ELSE
            BEGIN
                  SELECT DISTINCT * INTO #I FROM inserted;

                  BEGIN

                        INSERT INTO [Options].[dbo].[Deposit_Audit]
                        (
                           [OperationType]
                          ,[Id]
                            ……
                            ……
                            ……
                        )

                        SELECT     

                           'U'
                           ,[Id]
                            ……
                            ……
                            ……
                        FROM #I
                  END
            END  
END
GO

 

 

All good so far. I then used my handy NHibernate ISession / Repository to do an INSERT, and then I got something that I was not expecting:

 

In NHibernate I got this error: “Batch update returned unexpected row count from update; actual row count: 2; expected: 1”.

 

Yikes.
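
For reference, the insert itself was nothing fancy; conceptually the repository boils down to something like this (a minimal sketch of my own, with the session factory and transaction handling assumed):

using Entities;
using NHibernate;

public class DepositRepository
{
    private readonly ISessionFactory sessionFactory;

    public DepositRepository(ISessionFactory sessionFactory)
    {
        this.sessionFactory = sessionFactory;
    }

    public void Save(Deposit deposit)
    {
        using (var session = sessionFactory.OpenSession())
        using (var transaction = session.BeginTransaction())
        {
            session.Save(deposit);

            // The commit flushes the INSERT, and NHibernate checks the rows-affected
            // count it gets back. With the trigger in place it saw 2 rows instead of
            // the expected 1, hence the exception above.
            transaction.Commit();
        }
    }
}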

 

It turns out there is a simple fix for this, which to my mind was not obvious (even though I seem to recall seeing it before and promptly forgetting about it): you just need to include the following line at the start of your SQL Server trigger:

 

SET NOCOUNT ON

 

By doing that you are telling SQL Server that the statements running inside the trigger should not affect the row count reported back to the caller. After adding this one line to my trigger, everything worked as expected.

VS2013 Setup Project

I am fortunate enough to have been working on a big app for a while now, which means we have our deployment pretty well sorted out, using a number of different tools/scripts.

Then the other day I was asked to help out another team, who needed a small app written to allow them to analyse some data. The team in question are not devs, so they really need an installer.

As I say, I have not had to create an installer for quite a while now. I was obviously aware of the lack of a “Setup Project” in VS2012/VS2013, but since I have not had a need for one lately I was like “meh”.

Until today.

Luckily help is at hand: there is a Visual Studio 2013 extension that brings back the ability to create installer projects in VS2013. God knows why Microsoft took that out in the first place; all that would have done is pushed people towards other tools.

Anyway, for those that want to use the old (probably familiar) VS Installer-style project, you can grab it from here:

https://visualstudiogallery.msdn.microsoft.com/9abe329c-9bba-44a1-be59-0fbf6151054d

Hopefully some of you will find that useful

 

Bulk Insert Into SQL From C#

The other day at work I had a task that required me to do a bulk insert of data into a SQL Server database table. I have obviously come across (and used in the past) the bcp.exe command line utility, which is all well and good when you want to run scripts and so on.

This time however I wanted to do the bulk insert programmatically, using some standard .NET code. As I say, this is not something I have had to do in code before, so I set out to find out how, and after a few minutes of Googling found the answer I was looking for, which is the SqlBulkCopy class.

This class has been available in .NET since v2.0; I guess if you don’t need these things they sometimes slip you by, which was the case here, for me anyway!

The main methods that you would use in this class are the WriteToServer(..) overloads, of which there are a few that make use of DataTable/DataRow[] and IDataReader:

  • WriteToServer(DataRow[])
  • WriteToServer(DataTable)
  • WriteToServer(IDataReader)
  • WriteToServer(DataTable, DataRowState)
  • WriteToServerAsync(DataRow[])
  • WriteToServerAsync(DataTable)
  • WriteToServerAsync(IDataReader)
  • WriteToServerAsync(DataRow[], CancellationToken)
  • WriteToServerAsync(DataTable, DataRowState)
  • WriteToServerAsync(DataTable, CancellationToken)
  • WriteToServerAsync(IDataReader, CancellationToken)

 

You generally want to make use of the overloads above that take an IDataReader. This is because a data reader is a forward-only, read-only stream; it does not hold all of the data in memory, and is thus much faster than using DataTable or DataRow[].

The scenario I was trying to deal with was how to do bulk inserts, and I came across this very good post by Mike Goatly, which goes into a lot of detail.

And there was also this one over at Codeproject by AzamSharp

Azam demonstrates how to use SqlBulkCopy to do a bulk copy, so if that is what you are after, check out his article. My scenario was that I wanted to do a bulk insert; luckily this is exactly what Mike Goatly writes about in the post I mentioned above.

 

Bulk Insert

The trick to using SqlBulkCopy to do a bulk insert is that we need to create a custom IDataReader. This would be a cinch if we could do something like ObjectDataReader&lt;SomeObject&gt; and use that to feed WriteToServer() with a set of objects.

Unfortunately this doesn’t exist, so you’re going to have to implement your own.

public interface IDataReader : IDisposable, IDataRecord
{
   int Depth { get; }
   bool IsClosed { get; }
   int RecordsAffected { get; }
   void Close();
   DataTable GetSchemaTable();
   bool NextResult();
   bool Read();
}

Mike Goatly gives us a working implementation of this, which is as follows:

namespace SqlBulkCopyExample
{
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Linq;
    using System.Linq.Expressions;
    using System.Reflection;

    public class ObjectDataReader<TData> : IDataReader
    {
        /// <summary>
        /// The enumerator for the IEnumerable{TData} passed to the constructor for 
        /// this instance.
        /// </summary>
        private IEnumerator<TData> dataEnumerator;

        /// <summary>
        /// The lookup of accessor functions for the properties on the TData type.
        /// </summary>
        private Func<TData, object>[] accessors;

        /// <summary>
        /// The lookup of property names against their ordinal positions.
        /// </summary>
        private Dictionary<string, int> ordinalLookup;

        /// <summary>
        /// Initializes a new instance of the <see cref="ObjectDataReader&lt;TData&gt;"/> class.
        /// </summary>
        /// <param name="data">The data this instance should enumerate through.</param>
        public ObjectDataReader(IEnumerable<TData> data)
        {
            this.dataEnumerator = data.GetEnumerator();

            // Get all the readable properties for the class and
            // compile an expression capable of reading it
            var propertyAccessors = typeof(TData)
                .GetProperties(BindingFlags.Instance | BindingFlags.Public)
                .Where(p => p.CanRead)
                .Select((p, i) => new
                    {
                        Index = i,
                        Property = p,
                        Accessor = CreatePropertyAccessor(p)
                    })
                .ToArray();

            this.accessors = propertyAccessors.Select(p => p.Accessor).ToArray();
            this.ordinalLookup = propertyAccessors.ToDictionary(
                p => p.Property.Name,
                p => p.Index,
                StringComparer.OrdinalIgnoreCase);
        }

        /// <summary>
        /// Creates a property accessor for the given property information.
        /// </summary>
        /// <param name="p">The property information to generate the accessor for.</param>
        /// <returns>The generated accessor function.</returns>
        private Func<TData, object> CreatePropertyAccessor(PropertyInfo p)
        {
            // Define the parameter that will be passed - will be the current object
            var parameter = Expression.Parameter(typeof(TData), "input");

            // Define an expression to get the value from the property
            var propertyAccess = Expression.Property(parameter, p.GetGetMethod());

            // Make sure the result of the get method is cast as an object
            var castAsObject = Expression.TypeAs(propertyAccess, typeof(object));

            // Create a lambda expression for the property access and compile it
            var lamda = Expression.Lambda<Func<TData, object>>(castAsObject, parameter);
            return lamda.Compile();
        }

        #region IDataReader Members

        public void Close()
        {
            this.Dispose();
        }

        public int Depth
        {
            get { return 1; }
        }

        public DataTable GetSchemaTable()
        {
            return null;
        }

        public bool IsClosed
        {
            get { return this.dataEnumerator == null; }
        }

        public bool NextResult()
        {
            return false;
        }

        public bool Read()
        {
            if (this.dataEnumerator == null)
            {
                throw new ObjectDisposedException("ObjectDataReader");
            }

            return this.dataEnumerator.MoveNext();
        }

        public int RecordsAffected
        {
            get { return -1; }
        }

        #endregion

        #region IDisposable Members

        public void Dispose()
        {
            this.Dispose(true);
            GC.SuppressFinalize(this);
        }

        protected void Dispose(bool disposing)
        {
            if (disposing)
            {
                if (this.dataEnumerator != null)
                {
                    this.dataEnumerator.Dispose();
                    this.dataEnumerator = null;
                }
            }
        }

        #endregion

        #region IDataRecord Members

        public int FieldCount
        {
            get { return this.accessors.Length; }
        }

        public bool GetBoolean(int i)
        {
            throw new NotImplementedException();
        }

        public byte GetByte(int i)
        {
            throw new NotImplementedException();
        }

        public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferoffset, int length)
        {
            throw new NotImplementedException();
        }

        public char GetChar(int i)
        {
            throw new NotImplementedException();
        }

        public long GetChars(int i, long fieldoffset, char[] buffer, int bufferoffset, int length)
        {
            throw new NotImplementedException();
        }

        public IDataReader GetData(int i)
        {
            throw new NotImplementedException();
        }

        public string GetDataTypeName(int i)
        {
            throw new NotImplementedException();
        }

        public DateTime GetDateTime(int i)
        {
            throw new NotImplementedException();
        }

        public decimal GetDecimal(int i)
        {
            throw new NotImplementedException();
        }

        public double GetDouble(int i)
        {
            throw new NotImplementedException();
        }

        public Type GetFieldType(int i)
        {
            throw new NotImplementedException();
        }

        public float GetFloat(int i)
        {
            throw new NotImplementedException();
        }

        public Guid GetGuid(int i)
        {
            throw new NotImplementedException();
        }

        public short GetInt16(int i)
        {
            throw new NotImplementedException();
        }

        public int GetInt32(int i)
        {
            throw new NotImplementedException();
        }

        public long GetInt64(int i)
        {
            throw new NotImplementedException();
        }

        public string GetName(int i)
        {
            throw new NotImplementedException();
        }

        public int GetOrdinal(string name)
        {
            int ordinal;
            if (!this.ordinalLookup.TryGetValue(name, out ordinal))
            {
                throw new InvalidOperationException("Unknown parameter name " + name);
            }

            return ordinal;
        }

        public string GetString(int i)
        {
            throw new NotImplementedException();
        }

        public object GetValue(int i)
        {
            if (this.dataEnumerator == null)
            {
                throw new ObjectDisposedException("ObjectDataReader");
            }

            return this.accessors[i](this.dataEnumerator.Current);
        }

        public int GetValues(object[] values)
        {
            throw new NotImplementedException();
        }

        public bool IsDBNull(int i)
        {
            throw new NotImplementedException();
        }

        public object this[string name]
        {
            get { throw new NotImplementedException(); }
        }

        public object this[int i]
        {
            get { throw new NotImplementedException(); }
        }

        #endregion
    }
}

With this very useful code that Mike provides, all we need to do is something like the following to bulk insert via an IDataReader using the SqlBulkCopy class:

namespace SqlBulkCopyExample
{
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Diagnostics;
    using System.Linq;
    using SqlBulkCopyExample.Properties;

    class Program
    {
        static void Main(string[] args)
        {
            var people = CreateSamplePeople(10000);

            using (var connection = new SqlConnection(
		"Server=.;Database=MostWanted;Integrated Security=SSPI"))
            {
                connection.Open();
                InsertDataUsingSqlBulkCopy(people, connection);
            }
        }

       

        private static void InsertDataUsingSqlBulkCopy(
		IEnumerable<Person> people, SqlConnection connection)
        {
            var bulkCopy = new SqlBulkCopy(connection);
            bulkCopy.DestinationTableName = "Person";
            bulkCopy.ColumnMappings.Add("Name", "Name");
            bulkCopy.ColumnMappings.Add("DateOfBirth", "DateOfBirth");

            using (var dataReader = new ObjectDataReader<Person>(people))
            {
                bulkCopy.WriteToServer(dataReader);
            }
        }

       
        private static IEnumerable<Person> CreateSamplePeople(int count)
        {
            return Enumerable.Range(0, count)
                .Select(i => new Person
                    {
                        Name = "Person" + i,
                        DateOfBirth = new DateTime(
				1950 + (i % 50), 
				((i * 3) % 12) + 1, 
				((i * 7) % 29) + 1)
                    });
        }
    }
}
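
One thing to note: the Person class is not shown in the snippet above. It is just a plain object with the two properties that the column mappings refer to, so something like this (an assumed shape, based purely on those mappings):

namespace SqlBulkCopyExample
{
    using System;

    // Assumed shape of the entity used above: just the two properties
    // that the ColumnMappings refer to.
    public class Person
    {
        public string Name { get; set; }
        public DateTime DateOfBirth { get; set; }
    }
}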

I grabbed the bulk of this code from Mike’s original post, where he does a much more thorough job of explaining things, and has a nice little demo project that lets you compare the difference between standard one-by-one inserts and this approach; the difference is huge.

Happy days. Thanks Mike, you certainly made my day a lot easier.