encoding/json: performance slower than expected

by reid.write:

STR:
1. Clone the git repository: git@github.com:tarasglek/jsonbench.git
2. Generate some sample JSON data using the instructions in the README
3. Run the Go JSON benchmark in gojson/src/jsonbench/json.go (a sketch of the measurement follows below)
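
A minimal sketch of such a throughput measurement, assuming a streaming json.Decoder over the generated file (this is not the actual code from the jsonbench repository; the file handling and the MB/s arithmetic are illustrative):

    // jsonbench_sketch.go: stream-decode a large JSON file with encoding/json
    // and report throughput. Illustrative only; not the jsonbench benchmark.
    package main

    import (
        "encoding/json"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        f, err := os.Open(os.Args[1]) // path to the generated sample JSON file
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        info, err := f.Stat()
        if err != nil {
            log.Fatal(err)
        }

        // Decode the file as a stream of JSON values into generic interface{} values.
        dec := json.NewDecoder(f)
        start := time.Now()
        for {
            var v interface{}
            if err := dec.Decode(&v); err != nil {
                break // io.EOF ends the stream; other errors are ignored here
            }
        }
        elapsed := time.Since(start)

        // MB/s computed as MiB (2^20 bytes) per second, matching the figures
        // quoted in the report.
        mbps := float64(info.Size()) / elapsed.Seconds() / (1 << 20)
        fmt.Printf("Duration: %v, %.2f MB/s\n", elapsed, mbps)
    }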

What is the expected output?
I expected to see performance roughly in line with Java (using Jackson for JSON parsing). On my test machine, the Java benchmark reports:
Processing 436525928 bytes took 5491 ms (75.82 MB/s)

What do you see instead?
Significantly slower performance, using the same input file:
Duration: 27.497043818s, 15.14 MB/s
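
For scale, both figures are consistent with MB meaning MiB (2^20 bytes): 436525928 bytes is about 416.3 MiB, so 416.3 / 5.491 s ≈ 75.8 MB/s for the Java run versus 416.3 / 27.497 s ≈ 15.1 MB/s for the Go run, roughly a 5x difference on the same input.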

Which compiler are you using (5g, 6g, 8g, gccgo)?
Not sure

Which operating system are you using?
Linux dell-ubuntu 3.2.0-45-generic #70-Ubuntu SMP Wed May 29 20:12:06 UTC 2013 x86_64
x86_64 x86_64 GNU/Linux

Which version are you using?  (run 'go version')
go version go1.1 linux/amd64

3 thoughts on “encoding/json: performance slower than expected”

  1. The primary benefit I see from having a “faster” library is that it will reduce the number of third-party JSON parsers.

    These parsers prioritize speed over simplicity, safety, compatibility, and security. It would be nice not to keep encountering the same 10-25 packages with the same micro-benchmark claims. Removing them from a codebase takes a lot of patience and is time consuming even with special-purpose tooling.

  2. Of course. I don’t have 1.11 installed yet (I know), but barely any performance work happened in the package during the 1.11 cycle, so these 1.10 vs tip numbers should still be useful.

    name           old time/op    new time/op    delta
    CodeEncoder-4    7.43ms ± 0%    5.35ms ± 1%  -28.01%  (p=0.002 n=6+6)
    CodeDecoder-4    30.8ms ± 1%    27.3ms ± 0%  -11.42%  (p=0.004 n=6+5)

    name           old speed      new speed      delta
    CodeEncoder-4   261MB/s ± 0%   363MB/s ± 1%  +38.91%  (p=0.002 n=6+6)
    CodeDecoder-4  62.9MB/s ± 1%  70.9MB/s ± 1%  +12.71%  (p=0.002 n=6+6)

    name           old alloc/op   new alloc/op   delta
    CodeEncoder-4    91.9kB ± 0%    62.1kB ± 0%  -32.38%  (p=0.002 n=6+6)
    CodeDecoder-4    2.74MB ± 0%    2.74MB ± 0%   -0.04%  (p=0.010 n=6+4)

    name           old allocs/op  new allocs/op  delta
    CodeEncoder-4      0.00           0.00          ~     (all equal)
    CodeDecoder-4     90.3k ± 0%     77.5k ± 0%  -14.09%  (p=0.002 n=6+6)
    

    Also note that lots of other changes improved performance in earlier releases, such as @kevinburke’s table lookups for the encoder, so the speedup would be much higher if we compared tip with an even older Go release.
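
    These rows come from encoding/json’s own CodeEncoder and CodeDecoder benchmarks, and the delta/p-value format indicates they were compared with benchstat (golang.org/x/perf/cmd/benchstat). For readers unfamiliar with the output: the speed (MB/s) rows appear because a benchmark calls b.SetBytes, which lets the testing package divide bytes per operation by time per operation. Below is a minimal sketch of that shape; the payload, package name, and benchmark names are made up for illustration and are not the actual encoding/json benchmark code:

        // bench_sketch_test.go: illustrative encode/decode benchmarks.
        // Not the real encoding/json CodeEncoder/CodeDecoder benchmarks,
        // which run over a large fixed corpus; this only shows the shape.
        package jsonbench

        import (
            "encoding/json"
            "testing"
        )

        type record struct {
            Name  string   `json:"name"`
            Size  int64    `json:"size"`
            Files []string `json:"files"`
        }

        var sample = record{Name: "example", Size: 12345, Files: []string{"a.go", "b.go", "c.go"}}

        func BenchmarkEncodeSketch(b *testing.B) {
            data, err := json.Marshal(&sample)
            if err != nil {
                b.Fatal(err)
            }
            b.SetBytes(int64(len(data))) // enables the MB/s ("speed") column
            b.ResetTimer()
            for i := 0; i < b.N; i++ {
                if _, err := json.Marshal(&sample); err != nil {
                    b.Fatal(err)
                }
            }
        }

        func BenchmarkDecodeSketch(b *testing.B) {
            data, err := json.Marshal(&sample)
            if err != nil {
                b.Fatal(err)
            }
            b.SetBytes(int64(len(data)))
            b.ResetTimer()
            for i := 0; i < b.N; i++ {
                var r record
                if err := json.Unmarshal(data, &r); err != nil {
                    b.Fatal(err)
                }
            }
        }

    Running such benchmarks several times under each Go version (for example, go test -bench=Code -count=6 encoding/json) and passing the two output files to benchstat produces a table like the one above, including the p-values and sample counts.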