An aerospace analytics platform ingests live telemetry from thousands of edge devices. Each device currently buffers 20-30 small readings into one large top-level JSON array before posting it to an HTTP/2 endpoint. During load-testing, the consumer frequently stalls because it must parse the entire array in memory before it can emit any single reading. The team needs a new wire format that ensures every reading is a valid JSON value, allows for incremental processing to keep memory usage nearly constant, and allows captured traffic to be inspected with standard line-oriented Unix tools such as grep or tail. Which payload redesign best meets these goals while remaining inside the JSON family?
Prefix every JSON object with the ASCII Record Separator (0x1E) character and follow it with a line-feed (JSON Text Sequences).
Embed the readings in base64-encoded Protocol Buffer blobs inside a single JSON field, separated by commas.
Replace the JSON payload with newline-delimited YAML documents separated by "---" markers.
Transmit each reading as an individual JSON object terminated by a single line-feed character (newline-delimited JSON/NDJSON).
Newline-delimited JSON (often called NDJSON or JSON Lines) writes each JSON object on its own line, separated only by a line-feed character. Because each line is itself a complete JSON object, the consumer can parse one record at a time, avoiding the need to load the whole stream into memory. The simple LF delimiter also makes the stream compatible with line-oriented tools (grep, tail, awk), which treat each object as a separate line.
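As a minimal illustrative sketch (not part of the original question), a streaming consumer in Python could parse one reading per line; the field names device_id and value are hypothetical:

```python
import json
import sys

def read_ndjson(stream):
    """Yield one parsed reading per line of an NDJSON stream.

    Memory stays bounded by the largest single record rather than the
    size of the whole payload.
    """
    for line in stream:
        line = line.strip()
        if not line:            # tolerate blank keep-alive lines
            continue
        yield json.loads(line)  # each line is a complete JSON value

if __name__ == "__main__":
    # Example: pipe captured traffic through the consumer, e.g.
    #   tail -f capture.ndjson | python ndjson_consumer.py
    for reading in read_ndjson(sys.stdin):
        # "device_id" and "value" are illustrative field names only
        print(reading.get("device_id"), reading.get("value"))
```

Because the consumer only ever holds the current line, captured traffic can also be filtered with grep or followed with tail before it ever reaches the parser.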
JSON Text Sequences (application/json-seq, RFC 7464) prefix each object with the ASCII Record Separator (0x1E). The format is equally streaming-friendly, but the RS prefix is a non-printable control byte that line-oriented tools neither display nor treat as a delimiter, so captured traffic cannot be inspected cleanly with grep or tail and the line-oriented debugging requirement is not met.
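For comparison, a short sketch of the two framings; the temp field is made up purely for the example:

```python
RS = b"\x1e"  # ASCII Record Separator used by application/json-seq (RFC 7464)

reading = b'{"temp": 21.5}'

json_seq_record = RS + reading + b"\n"   # RS prefix is a non-printable control byte
ndjson_record   = reading + b"\n"        # plain LF framing, visible to grep/tail/awk

# A pattern anchored at the start of a line (e.g. grep '^{') matches the NDJSON
# record but not the json-seq record, because the RS byte precedes the brace.
print(json_seq_record)  # b'\x1e{"temp": 21.5}\n'
print(ndjson_record)    # b'{"temp": 21.5}\n'
```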
Packing base64-encoded Protobuf blobs into a single JSON field still yields one monolithic JSON value, so the consumer must parse the whole payload before emitting any reading, and the base64-encoded binary is opaque to standard text tools.
YAML is a different serialization format altogether, so switching to it would violate the requirement to remain inside the JSON family.
Therefore, newline-delimited JSON is the only option that meets all three stated requirements.