Line Delimited JSON

Line Delimited JSON is a standard for delimiting JSON in stream protocols (such as TCP).

Introduction

This is a minimal specification for sending and receiving JSON over a stream protocol, such as TCP.

The Line Delimited JSON framing is so simple that no specification had previously been written for this ‘obvious’ way to do it.

Example Output

(with \r\n line separators)

 {"some":"thing"}
 {"foo":17,"bar":false,"quux":true}
 {"may":{"include":"nested","objects":["and","arrays"]}}

Motivation

There is currently no standard for transporting JSON within a stream protocol (primarily plain TCP), apart from WebSocket, which is unnecessarily complex for non-browser applications.

There were numerous possibilities for JSON framing, including counted strings and non-printable delimiters (the control characters DLE STX ETX, or WebSocket's 0xFF framing bytes).

Scope

The primary use case for LDJSON is an unending stream of JSON objects, delivered at variable times over TCP, where each object needs to be processed as it arrives: a stream of stock quotes or chat messages, for example.

Philosophy / Requirements

The specification must be:

  • trivial to implement in multiple popular programming languages
  • flexible enough to handle arbitrary whitespace (pretty-printed JSON)
  • free of non-printable characters
  • netcat/telnet friendly

Functional Specification

Software that supports Line Delimited JSON

PostgreSQL

As of version 9.2, PostgreSQL has a function called row_to_json. In addition, PostgreSQL supports JSON as a field type, so it can output nested components in much the same way as MongoDB and other NoSQL databases.

   vine@ubuntu:~$ echo 'select row_to_json(article) from article;' | sudo -u postgres psql --tuples-only
    {"article_id":1,"article_name":"ding","article_desc":"bellsound","date_added":null}
    {"article_id":2,"article_name":"dong","article_desc":"bellcountersound","date_added":null}
   vine@ubuntu:~$ 

jline

An example [1] of command-line tools for manipulating JSON lines in much the same way that grep, sort and other *NIX tools manipulate CSV.

jq

sed for JSON, implemented in C and compiled to a standalone binary. [2]

pigshell

A shell-in-a-browser whose pipelines are made up of objects [3].

Sending

Each JSON object must be written to the stream followed by the carriage return and newline characters 0x0D0A. The JSON objects may contain newlines, carriage returns and any other permitted whitespace. See www.json.org for the full spec.

All serialized data must use the UTF-8 encoding.
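
As a non-normative sketch of the sending side (the host, port and sample objects below are assumptions for illustration, not part of the specification), a sender in Python could look like this:

    # Minimal LDJSON sender sketch: one UTF-8 encoded JSON object per line,
    # each line terminated with 0x0D0A.
    import json
    import socket

    def send_objects(host, port, objects):
        with socket.create_connection((host, port)) as sock:
            for obj in objects:
                # Serialize, encode as UTF-8, then append the CR LF terminator.
                sock.sendall(json.dumps(obj).encode("utf-8") + b"\r\n")

    # Assuming a listener such as `nc -l 9000` is running locally:
    send_objects("localhost", 9000, [
        {"some": "thing"},
        {"foo": 17, "bar": False, "quux": True},
    ])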

Receiving

The receiver should handle pretty-printed (multi-line) JSON.

The receiver must accept all common line endings: ‘0x0A’ (Unix), ‘0x0D’ (Mac), ‘0x0D0A’ (Windows).

Trivial Implementation

A simple implementation is to accumulate received lines. Every time a line ending is encountered, an attempt must be made to parse the accumulated lines into a JSON object.

If the parsing of the accumulated lines is successful, the accumulated lines must be discarded and the parsed object given to the application code.

If the amount of unparsed, accumulated data exceeds 16 MiB, the receiver may close the stream. Resource-constrained devices may close the stream at a lower threshold, though they must accept at least 1 KiB.
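
A sketch of this trivial receiver in Python (the callback-based interface is an assumption for illustration, not mandated by the specification):

    # Accumulate lines, attempt a parse at every line ending, hand each completed
    # object to the application, and enforce the 16 MiB ceiling described above.
    import json

    MAX_UNPARSED = 16 * 1024 * 1024  # 16 MiB

    class LDJSONReceiver:
        def __init__(self, on_object):
            self.on_object = on_object  # application callback for each parsed object
            self.lines = []             # accumulated lines not yet parsed
            self.partial = ""           # text seen since the last line ending

        def feed(self, chunk):
            # Feed decoded text from the stream; a fuller implementation would use
            # an incremental UTF-8 decoder for bytes split across network reads.
            text = self.partial + chunk
            self.partial = ""
            # splitlines() recognises the 0x0A, 0x0D and 0x0D0A endings alike.
            for piece in text.splitlines(keepends=True):
                stripped = piece.rstrip("\r\n")
                if stripped == piece:
                    # No trailing line ending yet; keep the text for the next chunk.
                    self.partial = piece
                    break
                self.lines.append(stripped)
                try:
                    # Parse everything accumulated so far, which also copes with
                    # pretty-printed JSON spread over several lines.
                    obj = json.loads("\n".join(self.lines))
                except ValueError:
                    if sum(len(line) for line in self.lines) > MAX_UNPARSED:
                        raise ConnectionError("unparsable data exceeded 16 MiB")
                    continue
                self.lines = []
                self.on_object(obj)

    receiver = LDJSONReceiver(print)
    receiver.feed('{"some":\n  "thing"}\r\n')  # prints {'some': 'thing'}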

Implementations

MIME Type and File Extensions

When using HTTP or email, the MIME type for Line Delimited JSON should be application/x-ldjson (which will hopefully later change to application/ldjson).

When saved in a file, the file extension should be .ldjson or .ldj.

Many parsers handle Line Delimited JSON,[1] and a suggested content type for "streaming JSON" is application/json; boundary=NL.
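
As an illustrative sketch only (the port and the choice of Python's built-in http.server are assumptions, not part of the specification), a static file server can be taught the provisional MIME type for .ldjson/.ldj files as follows:

    # Serve the current directory, mapping .ldjson/.ldj to application/x-ldjson.
    import http.server
    import socketserver

    class LDJSONHandler(http.server.SimpleHTTPRequestHandler):
        extensions_map = {
            **http.server.SimpleHTTPRequestHandler.extensions_map,
            ".ldjson": "application/x-ldjson",
            ".ldj": "application/x-ldjson",
        }

    with socketserver.TCPServer(("", 8000), LDJSONHandler) as httpd:
        httpd.serve_forever()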

Notes and References

  1. trephine.org. "Newline Delimited JSON". trephine.org. Retrieved 2 July 2013.
  2. chrisdew. "Choice of Transports for JSON over TCP". stackoverflow.com. Retrieved 2 July 2013.
  3. Ryan, Film Grain. "How We Built Filmgrain, Part 2 of 2". filmgrainapp.com. Retrieved 4 July 2013.
  4. "row_to_json". Retrieved 6 October 2014.