Case study: custom Json converter for DateTime

Kiwi.Json uses ISO-8601 to represent encoded DateTime values, which keeps the encoded data human readable, as in the following json:

{
sortableDate: "2012-03-25T16:01:26",
universallySortableDate: "2012-03-25 16:01:26Z"
}

Microsoft chose another path in their Json implementations. In short, they use an encoding based on some JavaScript string-escaping quirks plus a millisecond offset from a fixed base date.
So, the date above would appear in json as

{
 msDate: "\/Date(1332691286000)\/"
}

Not very readable, but there are certain aspects that make this and similar solutions quite sound.
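The magic number is simply the millisecond offset from the JavaScript epoch, 1970-01-01, to the date in the first example. A quick check:

```csharp
using System;

// Verify that the Microsoft-style timestamp is the number of milliseconds
// between the JavaScript epoch (1970-01-01) and the example date above.
var epoch = new DateTime(1970, 1, 1);
var date = new DateTime(2012, 3, 25, 16, 1, 26);
var ms = (long)(date - epoch).TotalMilliseconds;
Console.WriteLine(ms); // 1332691286000
```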

So, how do we handle json encoded not in ISO-8601 but in the Microsoft format? Out of the box, Kiwi.Json fails (it actually throws a parsing exception), but there is a solution:
specify a custom DateTime converter, AspNetDateTimeConverter, either globally, as in

JsonConvert.RegisterCustomConverters(new AspNetDateTimeConverter());
..
var dataWithDateTime = JsonConvert.Parse<MyTypeWithDateTime>(jsonEncoding)

or, per call to Parse() as in

var dataWithDateTime = JsonConvert.Parse<MyTypeWithDateTime>(jsonEncoding, new AspNetDateTimeConverter())

AspNetDateTimeConverter was quite easy to implement. The full code is

public class AspNetDateTimeConverter : AbstractJsonConverter
{
  private static readonly DateTime BaseJavaScriptDate = new DateTime(1970, 1, 1);

  public override ITypeBuilder CreateTypeBuilder(Type type)
  {
    return TryCreateTypeBuilder<DateTime, string>(type, ParseDate)
           ?? TryCreateTypeBuilder<DateTime?, string>(type, s => ParseDate(s));
  }

  public override ITypeWriter CreateTypeWriter(Type type)
  {
    return TryCreateWriter<DateTime>(type,
        dt => new JsonLiteralContent(string.Concat(@"""\/Date(",(dt - BaseJavaScriptDate).TotalMilliseconds,@")\/""")));
  }

  private DateTime ParseDate(string s)
  {
    return BaseJavaScriptDate.Add(TimeSpan.FromMilliseconds(long.Parse(s.Substring(6, s.IndexOf(')') - 6))));
  }
}

If we dissect this little animal, the vital organs are

private static readonly DateTime BaseJavaScriptDate = new DateTime(1970, 1, 1);

which denotes the epoch (era start) used in JavaScript, 1970-01-01.

Writing of dates is controlled by

public override ITypeWriter CreateTypeWriter(Type type)
{
  return TryCreateWriter<DateTime>(type,
    dt => new JsonLiteralContent(string.Concat(@"""\/Date(",(dt - BaseJavaScriptDate).TotalMilliseconds,@")\/""")));
}

This code simply says: “if a DateTime value comes in for serialization, I can take it, and instead of the default encoding I return a literal string formatted as \/Date(&lt;milliseconds since the epoch&gt;)\/”.
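The verbatim-string gymnastics produce exactly the escaped literal we saw earlier; the same string.Concat expression, applied to the example date:

```csharp
using System;

// The same string.Concat as in CreateTypeWriter, applied to the example date.
var epoch = new DateTime(1970, 1, 1);
var dt = new DateTime(2012, 3, 25, 16, 1, 26);
var literal = string.Concat(@"""\/Date(", (dt - epoch).TotalMilliseconds, @")\/""");
Console.WriteLine(literal); // "\/Date(1332691286000)\/"
```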

The utility method ParseDate() is

private DateTime ParseDate(string s)
{
  return BaseJavaScriptDate.Add(
    TimeSpan.FromMilliseconds(long.Parse(s.Substring(6, s.IndexOf(')') - 6))));
}

No error checking here, since we know what we are doing: just peek past “\/Date(”, fetch the milliseconds, and add them to the epoch.
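The extraction is easy to check in isolation; the same substring arithmetic recovers the example date:

```csharp
using System;

// Peek past "/Date(" (6 characters; the JSON parser has already unescaped
// "\/" to "/") and read the digits up to the closing parenthesis,
// exactly as ParseDate does.
var s = "/Date(1332691286000)/";
var ms = long.Parse(s.Substring(6, s.IndexOf(')') - 6));
var date = new DateTime(1970, 1, 1).Add(TimeSpan.FromMilliseconds(ms));
Console.WriteLine(date.ToString("s")); // 2012-03-25T16:01:26
```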

The method CreateTypeBuilder() is more enigmatic.

public override ITypeBuilder CreateTypeBuilder(Type type)
{
  return TryCreateTypeBuilder<DateTime, string>(type, ParseDate)
         ?? TryCreateTypeBuilder<DateTime?, string>(type, s => ParseDate(s));
}

What does it do? Well, it says it can handle both DateTime and Nullable&lt;DateTime&gt;, aka DateTime?, at the same time. In fact, the above pattern of
TryCreateTypeBuilder&lt;T,&gt; ?? TryCreateTypeBuilder&lt;T?,&gt; makes great sense for value types, since a custom converter can then handle their nullable counterparts as well.
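The reason both branches are needed is that DateTime and DateTime? are distinct types at runtime. A sketch of the ??-dispatch (TryFor is a hypothetical stand-in, not Kiwi.Json's actual internals):

```csharp
using System;

// Hypothetical sketch of the dispatch pattern: each TryFor<T> yields a
// parser only on an exact type match, otherwise null, so ?? tries the next.
Func<string, DateTime> parseDate = s =>
    new DateTime(1970, 1, 1).AddMilliseconds(long.Parse(s.Substring(6, s.IndexOf(')') - 6)));

var builder = TryFor<DateTime>(typeof(DateTime?), parseDate)           // null: DateTime != DateTime?
              ?? TryFor<DateTime?>(typeof(DateTime?), s => parseDate(s)); // exact match
Console.WriteLine(builder != null); // True

static Func<string, object> TryFor<T>(Type type, Func<string, T> parse)
{
    if (type != typeof(T)) return null;
    return s => (object)parse(s);
}
```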

This is a post in my series about a practical Json implementation in .NET. The actual source code is hosted on github (https://github.com/jlarsson/Kiwi.Json). A compiled version is available from nuget (http://nuget.org/packages/Kiwi.Json).

Logging with Json

Kiwi.Json can be used for great logging in situations where log4net or NLog is not an option.

Personally, such cases arise when I need the logs to be both machine and human readable.

Such cases do indeed exist. I often use separate transaction logging to keep a reliable backup of data I put into Sqlite. Logging should be fast, but more importantly, I want to be able to more or less recreate the database by just replaying the log.

Let’s reason about a logging API.

public class DatabaseOperationLog
{
  public string Operation { get; set; }
  public Dictionary<string, object> Data { get; set; }
}

public interface IDatabaseLog
{
  void Add(DatabaseOperationLog operation);
  IEnumerable<DatabaseOperationLog> GetOperations();
}

Given this interface, we can log as follows:

IDatabaseLog log = ...
log.Add(new DatabaseOperationLog{
  Operation = "update",
  Data = new Dictionary<string,object>{ {"User","John"},{"Age",35} }
});

And, we can print all the entries with

IDatabaseLog log = ...
foreach (var operation in log.GetOperations())
{
  System.Console.WriteLine("{0}: {1}", operation.Operation, JsonConvert.Write(operation.Data));
}

So far, it’s easy. The tricky part in practice is the actual layout of the log file. For the log file to be pure json, the content should be something like

[
  {Operation:"update", Data:{...}},
  {Operation:"insert", Data:{...}},
  {Operation:"delete", Data:{...}}
]

Notice how the log file content is a well-formed json array of objects. This is easy to parse and understand, but a pain to maintain, since appending a new entry must handle the following cases:

  • adding to an empty or missing log file should write out a full array with the added entries
  • adding to a log with existing entries means parsing from the end to find the closing ‘]’ of the array, then appending an array delimiter ‘,’, the new entry, and finally the array terminator ‘]’.

A simpler approach is to just blindly append, relaxing the requirement that the whole log file be well-formed json. The content of the log would then be something like

{Operation:"update", Data:{...}}
{Operation:"insert", Data:{...}}
{Operation:"delete", Data:{...}}

A piece of cake to understand and implement, but this makes reading the log quite difficult with most Json parsers. In their current builds, neither Newtonsoft.Json nor ServiceStack.Text can handle this, since the log in its entirety isn’t well-formed.
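One workaround for strict parsers is to write exactly one entry per line and split on newlines before parsing, since each line is well-formed on its own. A sketch using System.Text.Json purely for illustration (note that strict parsers also require quoted property names, unlike the relaxed samples above):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text.Json;

// One JSON object per line: the whole file is not a single well-formed
// document, but every line is, so line-by-line parsing works.
var path = Path.GetTempFileName();
File.AppendAllText(path, "{\"Operation\":\"update\"}\n");
File.AppendAllText(path, "{\"Operation\":\"insert\"}\n");
File.AppendAllText(path, "{\"Operation\":\"delete\"}\n");

var operations = File.ReadLines(path)
    .Where(line => line.Length > 0)
    .Select(line => JsonDocument.Parse(line).RootElement.GetProperty("Operation").GetString())
    .ToList();
Console.WriteLine(string.Join(",", operations)); // update,insert,delete
```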

Most json implementations have a separate lexical-analysis step before parsing, making parsing greedy. Kiwi.Json, on the other hand, will never read additional characters from the input once it has parsed a valid json fragment.

The code for reading json fragments with Kiwi.Json is as follows:

public IEnumerable<DatabaseOperationLog> GetOperations(string logFile)
{
  var reader = new JsonStringParser(File.ReadAllText(logFile));
  while (!reader.EndOfInput())
  {
    yield return JsonConvert.Parse<DatabaseOperationLog>(reader);
  }
}

Case study: custom Json converter for DataTable

Kiwi.Json is a fairly complex Json implementation for .NET. As such, it can handle serialization and deserialization to and from most common .NET data constructs. But what happens when an exotic type turns up? Let’s take a case study: System.Data.DataTable, which we initialize as below.

var dt = new DataTable();
dt.Columns.AddRange(new[] {new DataColumn("A"), new DataColumn("B"), new DataColumn("C")});
dt.Rows.Add(1, 2, 3);
dt.Rows.Add("four", "five", "six");
dt.Rows.Add(7, 8, 9);

If we try to serialize it with

var jsonText = JsonConvert.Write(dt);

it will fail miserably (in the current build), since a DataTable apparently holds A LOT of internal state, including some System.IntPtr fields, which aren’t supported out of the box by Kiwi.Json.

The class DataTableConverter discussed below fixes this problem. Use it either explicitly in every call, as in

var jsonText = JsonConvert.Write(dt, new DataTableConverter());

or, registered globally, as in

JsonConvert.RegisterCustomConverters(new DataTableConverter());
...
var jsonText = JsonConvert.Write(dt);

DataTable has more information than we would normally like to serialize (lots and lots of fields and properties that are uninteresting for our purpose). A feasible (and perhaps common) Json encoding of the table above is

{
 Columns: ["A","B","C"],
 Rows: [[1,2,3],["four", "five", "six"],[7,8,9]]
}
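That shape falls out of a simple projection over the table’s Columns and Rows collections, the same projection the converter below uses. For the example table:

```csharp
using System;
using System.Data;
using System.Linq;

// Build the example table and project it into the Columns/Rows shape above.
var dt = new DataTable();
dt.Columns.AddRange(new[] { new DataColumn("A"), new DataColumn("B"), new DataColumn("C") });
dt.Rows.Add(1, 2, 3);
dt.Rows.Add("four", "five", "six");
dt.Rows.Add(7, 8, 9);

var columns = dt.Columns.OfType<DataColumn>().Select(c => c.ColumnName).ToArray();
var rows = dt.Rows.OfType<DataRow>().Select(r => r.ItemArray).ToArray();
Console.WriteLine(string.Join(",", columns)); // A,B,C
Console.WriteLine(rows.Length);               // 3
```

One caveat: a DataColumn created without an explicit type defaults to string, so the numeric values above are actually stored (and would be serialized) as strings.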

Ok, so it’s decided: we want DataTables to have the nice look above. Kiwi.Json has the concept of converters, and we can quite easily design one for data tables. The main idea is

  • instead of serializing a DataTable, we serialize a simpler class, DataTableProxy
  • we hook up Kiwi.Json to convert to and from DataTable and DataTableProxy at the relevant stages of serialization/deserialization.

The class DataTableProxy is modelled to hold only the interesting parts of a DataTable and is defined as

public class DataTableProxy
{
    public IEnumerable<string> Columns { get; set; }
    public IEnumerable<object[]> Rows { get; set; }
}

Our custom converter, DataTableConverter, is defined as

public class DataTableConverter : AbstractJsonConverter
{
    public override ITypeBuilder CreateTypeBuilder(Type type)
    {
        // CreateTypeBuilder is called from JsonConvert each time a new, unknown type is deserialized.
        // The result should be a valid type builder for the argument type or null.
        return TryCreateBuilder<DataTable, DataTableProxy>(type, proxy => {
            // Create empty DataTable ...
            var dt = new DataTable();
            // .. and set its columns from the names in the DataTableProxy instance ...
            dt.Columns.AddRange(proxy.Columns.Select(n => new DataColumn(n)).ToArray());
            // ...and copy the rows from the DataTableProxy instance
            foreach (var row in proxy.Rows)
            {
                dt.Rows.Add(row);
            }
            return dt;
        });
    }

    public override ITypeWriter CreateTypeWriter(Type type)
    {
        // CreateTypeWriter is called from JsonConvert each time a new, unknown type is serialized.
        // The result should be a valid type writer for the argument type or null.
        return TryCreateWriter<DataTable>(type, dt => new DataTableProxy {
            Columns = dt.Columns.OfType<DataColumn>().Select(c => c.ColumnName),
            Rows = dt.Rows.OfType<DataRow>().Select(r => r.ItemArray)
        });
    }
}

DataTableConverter is actually implemented in Kiwi.Json; just refer to Kiwi.Json.Converters.DataTableConverter. There is also another converter for DataTables, Kiwi.Json.Converters.DataTableAsObjectArrayConverter, converting to and from regular json arrays of objects.

Notice the inheritance from AbstractJsonConverter, which is specifically designed to handle this kind of proxy serialization.
The method TryCreateBuilder returns an ITypeBuilder if the actual type to be deserialized is DataTable, and null otherwise. Its argument is a lambda specifying how a DataTableProxy is transformed into a DataTable. Likewise, TryCreateWriter above returns a writer that ensures any DataTable is transformed into a DataTableProxy before serialization.

Returning null for unhandled types is a convention in Kiwi.Json, which will then continue its search until a matching handler is found.

There is no requirement that converters be symmetric. It’s perfectly legal to implement only one of the methods CreateTypeWriter and CreateTypeBuilder. In particular, a common case with DataTable is to feed client-side grids on web pages, which only requires us to implement serialization via CreateTypeWriter.
