.NET Serialization Performance

Simon Coope

Recently I was asked to investigate .NET serialization performance to try to improve the response times of some of our web APIs. I really enjoy this type of task: there's the investigative part, building a test harness, collecting the results, and then diving into them before drawing conclusions.

To that end I'm going to briefly outline the steps I went through.


First of all I investigated the different JSON serializers available on the .NET platform. Then I listed the ones that fulfilled the following criteria:

  • Easy to install and maintain.
  • Currently being maintained (important if the serializer is open source).
  • Easy to integrate with a .NET project (i.e. does it have a NuGet package).
  • Minimal code required to perform serialization and deserialization.

As a result of my investigation I included the following JSON serializers in my test:

  • JIL
  • JSON Service Stack (ServiceStack.Text)
  • Data Contract (DataContractJsonSerializer)

For benchmarking purposes I also included the serializer we're currently using (JSON.NET) and one of the fastest serializers available in .NET: the Protobuf serializer (protobuf-net), a binary serializer based on Google's Protocol Buffers format.
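To make the comparison concrete, here's a minimal serialize/deserialize round trip using the built-in DataContractJsonSerializer (one of the serializers under test, and the only one that needs no extra package). The trimmed-down Product type here is just a stand-in for the full test model:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Json;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Demo
{
    // Serializes to JSON bytes in memory, then deserializes back.
    public static Product RoundTrip(Product input)
    {
        var serializer = new DataContractJsonSerializer(typeof(Product));
        using var stream = new MemoryStream();
        serializer.WriteObject(stream, input);         // serialize
        stream.Position = 0;
        return (Product)serializer.ReadObject(stream); // deserialize
    }

    public static void Main()
    {
        var result = RoundTrip(new Product { Id = 7, Name = "Widget" });
        Console.WriteLine($"{result.Id} {result.Name}");
    }
}
```

The other serializers in the test expose broadly similar one- or two-line APIs, which is what made requirement four (minimal code) achievable.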

Building the Test Harness

The intention of the test harness was to be something I could run many times to collate the results of each test, so it had to be quick. I also had the following requirements:

  1. The test harness should be quick to develop.
  2. It should be very quick to add additional tests for other serializers.
  3. The test should cover differing sizes of sample data (e.g. I initially started with samples of 10 records, 100 records and 1000 records).
  4. Multiple tests should be run and the average times of tests calculated for each serializer against each sample size.

To that end I developed a basic Order/Products system, as follows:

[ProtoContract]
public class Order
{
    [ProtoMember(1)]
    public int Id { get; set; }

    [ProtoMember(2)]
    public List<Product> Products { get; set; }
}

[ProtoContract]
public class Product
{
    [ProtoMember(1)]
    public int Id { get; set; }

    [ProtoMember(2)]
    public string Name { get; set; }

    [ProtoMember(3)]
    public string Description { get; set; }

    [ProtoMember(4)]
    public int Price { get; set; }
}

As you can see from the above code examples I had to adorn the class members with the appropriate attributes for the protobuf serializer.
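With those attributes in place, protobuf-net's serialization is a couple of lines via its static Serializer class. A minimal sketch (this assumes the protobuf-net NuGet package is installed, and repeats a cut-down Product so the snippet stands alone):

```csharp
using System.IO;
using ProtoBuf;

[ProtoContract]
public class Product
{
    [ProtoMember(1)] public int Id { get; set; }
    [ProtoMember(2)] public string Name { get; set; }
}

public static class ProtoDemo
{
    // Encodes the object to protobuf's compact binary wire format.
    public static byte[] Serialize(Product p)
    {
        using var ms = new MemoryStream();
        Serializer.Serialize(ms, p);
        return ms.ToArray();
    }

    public static Product Deserialize(byte[] bytes)
    {
        using var ms = new MemoryStream(bytes);
        return Serializer.Deserialize<Product>(ms);
    }
}
```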

Furthermore, I created an abstract Tester class that all concrete tester classes were derived from (e.g. JSONNetTester, JILTester, etc.). The Tester class is responsible for performing the test and storing and exposing the results for each method of serialization.

This approach meant it was easy for me to add other serializers to the test. It also meant that I could consistently test each serializer (see the link to the GitHub project for more information).
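The shape of that base class is roughly as follows. This is a simplified sketch, not the actual GitHub code: the member names are my own, but it shows the idea of concrete testers supplying only Serialize/Deserialize while the base class handles the Stopwatch timing and averaging over repeated runs (requirement four above):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

public abstract class Tester<T>
{
    // Each concrete tester (JSONNetTester, JILTester, ...) implements these.
    public abstract byte[] Serialize(T value);
    public abstract T Deserialize(byte[] data);

    // Times serialization and deserialization over `runs` iterations
    // and returns the average of each in milliseconds.
    public (double SerMs, double DeserMs) Run(T sample, int runs)
    {
        var serTimes = new double[runs];
        var deserTimes = new double[runs];
        for (int i = 0; i < runs; i++)
        {
            var sw = Stopwatch.StartNew();
            var bytes = Serialize(sample);
            sw.Stop();
            serTimes[i] = sw.Elapsed.TotalMilliseconds;

            sw = Stopwatch.StartNew();
            Deserialize(bytes);
            sw.Stop();
            deserTimes[i] = sw.Elapsed.TotalMilliseconds;
        }
        return (serTimes.Average(), deserTimes.Average());
    }
}
```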

Generating Test Data

My main test data requirements concerned how I could generate different numbers of test objects for the differing sample sizes I wanted to test against. I also didn't want to have to manually create thousands of test objects! Therefore, I needed a way to dynamically generate test data.

I did a bit of investigation and found Hydrator. Hydrator allows us to generate simple test data with support for common types (e.g. first names, last names, city names, etc.). This was very easy to set up and can be seen in the Data/DataGenerator class in the sample project.
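To show the shape of the data being generated, here's a hand-rolled equivalent of what Hydrator automates. Hydrator fills properties for you based on their names and types; this loop is just an illustrative stand-in, not Hydrator's actual API:

```csharp
using System;
using System.Collections.Generic;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public int Price { get; set; }
}

public static class DataGenerator
{
    // Generates `count` Product objects with varied property values.
    public static List<Product> GenerateProducts(int count)
    {
        var rng = new Random(42); // fixed seed keeps runs repeatable
        var products = new List<Product>(count);
        for (int i = 0; i < count; i++)
        {
            products.Add(new Product
            {
                Id = i,
                Name = $"Product {i}",
                Description = $"Description for product {i}",
                Price = rng.Next(1, 1000)
            });
        }
        return products;
    }
}
```

Swapping the sample size for each test run is then just a matter of passing a different count.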

Collecting Results

As I now had a highly configurable test harness with as much test data as I wanted, I ran multiple tests to ensure consistency of results. Finally I settled on sample sizes of 100, 1,000 and 10,000 objects (the application whose performance I'm trying to improve has sample sizes between 10 and 1,000 records per request). I also recorded the size of the serialized object (in bytes) and the time taken to serialize and deserialize. The results are as follows:

Serializer          Sample Size  Ser. Size (bytes)  Ser. Time (ms)  Deser. Time (ms)
Protobuf                    100            120,797              29                 6
Protobuf                  1,000          1,247,827              22                32
Protobuf                 10,000         12,136,094             235               611
Data Contract               100            164,432              24                23
Data Contract             1,000          1,697,995              81               216
Data Contract            10,000         16,519,532             841             2,491
JSON.NET                    100            164,432              46                24
JSON.NET                  1,000          1,697,995              75                96
JSON.NET                 10,000         16,519,532             743             1,173
JSON Service Stack          100            152,092              23                24
JSON Service Stack        1,000          1,570,611              59               154
JSON Service Stack       10,000         15,279,652             610             1,765
JIL                         100            164,432              81                40
JIL                       1,000          1,697,995              45                62
JIL                      10,000         16,519,532             452               734


From the above results, these are the main conclusions I drew:

  • As expected, the Protobuf serializer was the fastest overall, though with small sample sizes its serialization time was surprisingly slower than the Data Contract and Service Stack serializers. This is something I intend to investigate further...

  • The Data Contract serializer performed well with small sample sizes, but as the sample size increased its serialization and deserialization times increased significantly.

  • The JSON.NET serializer performed well overall, but was narrowly beaten by the Service Stack serializer.

  • The JIL serializer was slow on smaller sample sizes but significantly faster on larger samples of data.

The average sample sizes of the system I've been working to optimize are between 10 and 1,000 records per request. Therefore, we opted to perform additional tests on the JSON Service Stack serializer and compare it with the JSON.NET serializer we're currently using. We're continuing to investigate this, and are performing load and stress tests on our web API.

The sample code can be downloaded from here.

If you have any comments, please feel free to let me know below...