Processing model: Streaming API

aka Incremental Processing, or Token Streams

Of the three major processing modes that Jackson supports, Streaming Processing (also known as Incremental Processing) is the most efficient way to process JSON content. It has the lowest memory and processing overhead, and can often match the performance of many binary data formats available on the Java platform (see the "Performance Comparison" link below).

This performance comes at a cost: it is not the most convenient way to process JSON content, because all content has to be read and written incrementally, one token at a time, and the application itself is responsible for keeping track of the surrounding JSON structure (such as the nesting of Objects and Arrays).

As a result, the Streaming API is most commonly used by middleware and frameworks (where the performance benefits reach a wide range of client applications, and where competition between implementations makes performance one of the measured features), and less often by applications directly.

Creating Parsers

Parsers are objects used to tokenize JSON content into a stream of tokens and associated data. This is the lowest level of read access to JSON content.

The most common way to create parsers is from external sources (Files, HTTP request streams) or buffered data (Strings, byte arrays / buffers). For this purpose, org.codehaus.jackson.JsonFactory has an extensive set of methods for constructing parsers, such as:

  JsonFactory jsonFactory = new JsonFactory(); // or, for data binding, org.codehaus.jackson.map.MappingJsonFactory
  JsonParser jp = jsonFactory.createJsonParser(file); // or URL, Stream, Reader, String, byte[]
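Once created, a parser is driven by repeatedly calling JsonParser.nextToken() until it returns null (end of input). A minimal sketch of such a loop, collecting all field names from a JSON document (the class and method names here are illustrative, not part of Jackson):

```java
import org.codehaus.jackson.JsonFactory;
import org.codehaus.jackson.JsonParser;
import org.codehaus.jackson.JsonToken;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class StreamingReadExample {
    // Stream through JSON content, collecting every field name encountered
    static List<String> fieldNames(String json) throws IOException {
        JsonFactory jsonFactory = new JsonFactory();
        JsonParser jp = jsonFactory.createJsonParser(json);
        List<String> names = new ArrayList<String>();
        JsonToken t;
        while ((t = jp.nextToken()) != null) { // null signals end of input
            if (t == JsonToken.FIELD_NAME) {
                names.add(jp.getCurrentName());
            }
        }
        jp.close();
        return names;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(fieldNames("{\"name\":\"Bob\",\"age\":13}"));
    }
}
```

Note that the application must inspect the returned JsonToken to know what kind of content the parser is positioned at; the parser itself only guarantees that the token sequence is well-formed JSON.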

Also, if you happen to have an ObjectMapper, you can use ObjectMapper.getJsonFactory() to reuse the factory it holds (since (re)using JsonFactory instances is one of the Performance Best Practices).

But you can also create parsers from alternate sources, such as buffered tokens in a org.codehaus.jackson.util.TokenBuffer (via TokenBuffer.asParser()) or an existing JsonNode tree (via JsonNode.traverse()).

Reading JSON tokens from these sources is significantly more efficient than re-parsing JSON content from its textual representation.
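For example, JsonNode.traverse() exposes an already-parsed tree as a token stream, so no JSON text has to be re-parsed. A small sketch (the class and method names are illustrative):

```java
import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.JsonParser;
import org.codehaus.jackson.map.ObjectMapper;

import java.io.IOException;

public class TreeTraverseExample {
    // Count tokens by streaming over a tree model, without re-parsing text
    static int countTokens(String json) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode root = mapper.readTree(json);
        // traverse() returns a JsonParser positioned before the first token
        JsonParser jp = root.traverse();
        int count = 0;
        while (jp.nextToken() != null) {
            ++count;
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        // {"a":1} yields START_OBJECT, FIELD_NAME, VALUE_NUMBER_INT, END_OBJECT
        System.out.println(countTokens("{\"a\":1}"));
    }
}
```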

Creating Generators

Generators are objects used to construct JSON content based on a sequence of calls that output JSON tokens. This is the lowest level of write access to JSON content.

The most common way to create generators is to pass an external destination (File, OutputStream or Writer) into which the resulting JSON content is written. For this purpose, org.codehaus.jackson.JsonFactory has an extensive set of methods for constructing generators, such as:

  JsonFactory jsonFactory = new JsonFactory(); // or, for data binding, org.codehaus.jackson.map.MappingJsonFactory
  JsonGenerator jg = jsonFactory.createJsonGenerator(file, JsonEncoding.UTF8); // or OutputStream, Writer
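Once created, a generator is driven by calling its write methods in the order the JSON tokens should appear. A minimal sketch writing one JSON Object to a Writer (the class and method names here are illustrative, not part of Jackson):

```java
import org.codehaus.jackson.JsonFactory;
import org.codehaus.jackson.JsonGenerator;

import java.io.IOException;
import java.io.StringWriter;

public class StreamingWriteExample {
    // Construct a JSON Object token by token, writing into a StringWriter
    static String writePerson(String name, int age) throws IOException {
        JsonFactory jsonFactory = new JsonFactory();
        StringWriter sw = new StringWriter();
        JsonGenerator jg = jsonFactory.createJsonGenerator(sw);
        jg.writeStartObject();              // {
        jg.writeStringField("name", name);  //   "name" : ...
        jg.writeNumberField("age", age);    //   "age" : ...
        jg.writeEndObject();                // }
        jg.close(); // flushes any buffered output into the Writer
        return sw.toString();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(writePerson("Bob", 13));
    }
}
```

Closing the generator is important: output is buffered internally, and content may be lost if close() (or flush()) is never called.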

An alternative is to use org.codehaus.jackson.util.TokenBuffer (added in Jackson 1.5): since it extends JsonGenerator, you can use it to efficiently buffer any JSON output in case it needs to be re-processed:

  TokenBuffer buffer = new TokenBuffer(objectMapper); // constructor takes an ObjectCodec
  // serialize object as JSON tokens (but don't write them out as JSON text!)
  objectMapper.writeValue(buffer, myBean);
  // read back as tree
  JsonNode root = objectMapper.readTree(buffer.asParser());
  // modify some more, write out
  // ...
  String jsonText = objectMapper.writeValueAsString(root);

(in fact, use of TokenBuffer for re-processing can be considered a performance Best Practice)

Configuring

Additional Reading



JacksonStreamingApi (last edited 2010-12-22 00:32:47 by TatuSaloranta)

Copyright ©2009 FasterXML, LLC