Introduction
At work we have a number of microservices which we are slowly transitioning to containers, where we will likely use Kubernetes to run them. Right now our logging framework of choice is NLog, and we typically just log to a file.
Now this works fine when you are on premise or have a dedicated share set up, but you will need to do more work when you move to containers.
For example, if you just log to some path relative to the microservice, you will find that when you run it in a container the logs will be wiped out if your container dies, as you are effectively just logging into the container's own writable storage. There are of course answers to this, such as
- Volumes/mounts in Docker/K8S, where we mount a volume and log to it from inside the container
Or you could use a more radical solution such as
- Mounting a share where all your files will be written to, but as far as Docker/K8S is concerned this is really just another volume-based solution
But there is a whole other category of solution that we could consider: dedicated logging stacks, where the main players are really
- ELK (Elasticsearch, Logstash, Kibana)
- EFK (Elasticsearch, Fluentd, Kibana)
- Graylog
These logging solutions typically all use Elasticsearch, and come with various ingestors, or input adapters, that allow the log data to flow into Elasticsearch where it is indexed and made available to either Kibana (for the ELK/EFK stacks) or Graylog if you use the Graylog stack.
Both Kibana and GrayLog offer a very nice UI which allows you to build up dashboards and conduct searches over the indexed data, and they also support searching over structured tags emitted by the logger itself. NLog, for example, supports structured tag logging.
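As a quick illustration of what that means (this is not part of the demo app below, and the property names are just made-up examples), NLog message templates let you write something like this, where OrderId and CustomerName are captured as named properties rather than just being formatted into the string:

using NLog;

class StructuredLoggingExample
{
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    static void Main()
    {
        // OrderId and CustomerName become structured properties on the log event,
        // which targets such as GELF can expose as searchable fields
        Logger.Info("Processed order {OrderId} for {CustomerName}", 42, "Sacha");
    }
}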
So that is the overview. In this post I will show you how to quickly set up a .NET Core 3.1 application using NLog talking to GrayLog, and in a subsequent post I will also demonstrate the EFK stack. You might ask why not the ELK stack; well quite simply K8S has very good support for just piping the console output through Fluentd, so it's a good choice when working with K8S.
NLog
Ok so let's start by creating a dead simple .NET Core 3.1 console app and adding the following NuGet package: NLog.Gelf (I used 1.1.4).
Then you will need to add an NLog.config file; this is what mine looks like
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <add assembly="NLog.Gelf" />
  </extensions>
  <targets>
    <target name="console"
            xsi:type="Console"
            layout="${date:format=dd/MM/yyyy HH\:mm\:ss.fff} | ${level:uppercase=true} | ${message}${exception:format=ToString}" />
    <target name="Gelf"
            type="GelfHttp"
            serverUrl="http://localhost:12201/gelf"
            facility="sachas app" />
  </targets>
  <rules>
    <logger name="*" minLevel="Trace" appendTo="Gelf, console" />
  </rules>
</nlog>
The key thing to note there is that we use the special GelfHttp target and point it at the endpoint http://localhost:12201/gelf; this is how log events are sent to GrayLog.
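To give a feel for what the GelfHttp target is doing for us, the GELF HTTP input simply accepts small JSON documents POSTed to that URL. This is not the library's actual code, just a rough sketch of posting one GELF 1.1 message by hand (the host name and field values are made up):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ManualGelfPost
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Minimal GELF 1.1 payload; custom fields are prefixed with an underscore
        var json = "{ \"version\": \"1.1\", \"host\": \"example.org\", " +
                   "\"short_message\": \"Gelf message posted by hand\", \"level\": 5, \"_some_info\": \"foo\" }";

        var response = await client.PostAsync(
            "http://localhost:12201/gelf",
            new StringContent(json, Encoding.UTF8, "application/json"));

        Console.WriteLine(response.StatusCode);
    }
}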
Then finally you will need to write some code that actually produces some logging output
using System;
using System.Threading;
using NLog;
using NLog.Config;

namespace NLogGrayLogDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            var logger = LogManager.GetCurrentClassLogger();
            int counter = 0;

            // Emit a log message every 5 seconds
            while (true)
            {
                logger.Debug($"YO its nice, This one is from NLog Gelf logging, index = {counter++}");
                Thread.Sleep(5000);
            }
        }
    }
}
Ok so now that we have a simple .NET Core app, we need an instance of GrayLog to test things with.
GrayLog
GrayLog also uses Elasticsearch as its search/indexer, and it stores some data in MongoDB. As such, to have a working GrayLog instance you will need Mongo, Elasticsearch and GrayLog itself.
Luckily our friends at GrayLog have made this very simple with their support for Docker; we can pretty much follow this quickstart: https://docs.graylog.org/en/3.3/pages/installation/docker.html
One thing to note though is that if you want to actually persist the data you will need to use the Docker Compose file as shown here: https://docs.graylog.org/en/3.3/pages/installation/docker.html#persisting-data. For this demo however I will just be using simple docker run commands, and as such the data in GrayLog/Mongo/Elasticsearch will be ephemeral and will be lost should the containers be removed.
So let's get started. The first thing you will need to do is ensure you are using Linux containers (so make sure your Docker Desktop is set to Linux containers), then issue these commands
docker run --name mongo -d mongo:3

docker run --name elasticsearch -e "http.host=0.0.0.0" -e "ES_JAVA_OPTS=-Xms512m -Xmx512m" -d docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.10
This will give you a running Mongo and Elastic, but for GrayLog we need a little bit more work.
Firstly we need to create an admin password hash, which we can do using WSL (Windows Subsystem for Linux), so from a WSL command line we can issue this command
echo -n "Enter Password: " && head -1 </dev/stdin | tr -d '\n' | sha256sum | cut -d" " -f1
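If you would rather not drop into WSL, a rough C# equivalent of that pipeline (just a minimal sketch that prompts for a password and prints its SHA-256 hex digest) would be:

using System;
using System.Security.Cryptography;
using System.Text;

class PasswordHasher
{
    static void Main()
    {
        Console.Write("Enter Password: ");
        var password = Console.ReadLine() ?? string.Empty;

        // SHA-256 the UTF-8 bytes and print the lower-case hex string,
        // the same value the WSL one-liner produces
        using var sha = SHA256.Create();
        var hash = sha.ComputeHash(Encoding.UTF8.GetBytes(password));
        Console.WriteLine(BitConverter.ToString(hash).Replace("-", string.Empty).ToLowerInvariant());
    }
}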
Where you will enter "admin" (or any password you want to use), which is then SHA-256 hashed, ready to be passed to GrayLog. Grab the value and use it in the next command, where we expose the following ports
- 9000 TCP : Graylog Web UI
- 12201 HTTP : Gelf HTTP input, which you will need to set up inside your GrayLog instance after you issue the command line below
- 5555 TCP : Raw/Plaintext TCP input, which you will need to set up inside your GrayLog instance after you issue the command line below
docker run --name graylog --link mongo --link elasticsearch -p 9000:9000 -p 12201:12201 -p 5555:5555 -e GRAYLOG_HTTP_EXTERNAL_URI="http://127.0.0.1:9000/" -e GRAYLOG_ROOT_PASSWORD_SHA2=8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918 -d graylog/graylog:3.3
Where we are using the ports we listed above and the admin root password hash we generated above
Ok so you should now have a running GrayLog, so we can
- open the Graylog Web UI at http://127.0.0.1:9000/ and log in as admin with whatever password you chose above
- Go to System/Inputs menu and add 2 new inputs
- Gelf Http : 12201
- Raw/PlainText : 5555
So you now have a running GrayLog, time to test it out
Testing using Netcat
As before we can use WSL, so from a WSL command line we can issue these test commands
Plain text input
echo 'Plain Message1' | nc localhost 5555
Gelf Http input
curl -X POST -H 'Content-Type: application/json' -d '{ "version": "1.1", "host": "example.org", "short_message": "Gelf Message1", "level": 5, "_some_info": "foo" }' 'http://localhost:12201/gelf'
Now if you go back into GrayLog and conduct a search for say “Gelf” you should see the test messages
You should now also be able to run the .NET Core app, and its messages should flow straight into GrayLog
That is all for this post; as I said, in a subsequent post I will show you how to use the EFK stack too.