This is the accompanying blog post for my packages slStreamUtilsProtobuf and slStreamUtilsMessagePack, a set of open-source .NET Standard 2.0+ tools that improve the speed of (de)serialization of .NET objects offered by the two popular serialization libraries Protobuf-net and MessagePack-CSharp. Both packages are available under GPL-3 on GitHub.
The performance gains will depend on the nature and size of the objects being serialized. The greatest benefits come when working with very large I/O streams of undetermined length, such as logs of a server process's activity or data-science datasets, which can grow to several hundred gigabytes. Even so, there may be significant performance benefits when dealing with (much) smaller files.
Introduction
While native .NET binary serialization has always been notoriously slow, several faster third-party protocols and .NET libraries have become available over the years.
In this article I'll focus on two of the most common tools, Protobuf-net and MessagePack-CSharp, and show how we can greatly accelerate object serialization using several different techniques, until serialization speed is no longer the bottleneck and we can take full advantage of today's very fast I/O devices.
This post assumes the reader is already familiar with the specifics and trade-offs of their chosen serialization protocol, either Protobuf or MessagePack. Both tools' repositories offer good introductions to the subject. If you're interested in only one of these tools, you can skip the paragraphs that deal explicitly with the other.
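As a reminder of the baseline these packages build on, here is a minimal sketch of plain (unaccelerated) serialization with both libraries. The `LogEntry` type and its members are purely illustrative, not part of the slStreamUtils packages; the attributes and serializer calls are the standard ones from protobuf-net and MessagePack-CSharp.

```csharp
using System.IO;
using MessagePack;   // MessagePack-CSharp NuGet package
using ProtoBuf;      // protobuf-net NuGet package

// Illustrative type, annotated for both serializers at once.
[ProtoContract]
[MessagePackObject]
public class LogEntry
{
    [ProtoMember(1)]
    [Key(0)]
    public long Timestamp { get; set; }

    [ProtoMember(2)]
    [Key(1)]
    public string Message { get; set; }
}

public static class BaselineSerialization
{
    // protobuf-net: serialize into a stream.
    public static byte[] WithProtobufNet(LogEntry entry)
    {
        using var ms = new MemoryStream();
        Serializer.Serialize(ms, entry);
        return ms.ToArray();
    }

    // MessagePack-CSharp: serialize directly to a byte array.
    public static byte[] WithMessagePack(LogEntry entry)
    {
        return MessagePackSerializer.Serialize(entry);
    }
}
```

The techniques discussed later apply on top of this kind of code, replacing the single blocking call with buffered and parallelized stream handling.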
Note: all examples used in this post can be found in the Examples folder of the project repository.
Benchmarks
BenchmarkDotNet projects are available under Benchmarks/slStreamUtilsBenchmark.
Please visit the project's page for benchmark charts and descriptions.