Convert NDJSON to JSON Online
Parse newline-delimited JSON (log files, streaming API exports, Kafka dumps) into a standard JSON array. One line at a time, zero configuration.
NDJSON → JSON
Free NDJSON to JSON converter. Convert newline-delimited JSON log files and streaming data exports into a standard JSON array. Handles large files, blank lines, and malformed records. Works in your browser.
How to use the NDJSON → JSON converter
Paste or upload your NDJSON data
Paste text directly into the input box, drag and drop a file onto it, or click "Upload file" to browse. Conversion starts instantly on paste — no button click required.
Configure options (optional)
Open the Options panel to customise nested-object flattening and other output settings. Use the Field Selector to pick exactly which fields appear in the output.
Copy or download your JSON
Click Copy to grab the result, or Download to save the file. Everything runs locally in your browser — no data ever leaves your device.
Frequently Asked Questions
- Is my data safe?
- Yes. Every conversion runs entirely inside your browser. No data is ever transmitted to a server. The tool works offline once loaded.
- What is the maximum file size?
- There is no hard limit. Files under 1 MB convert instantly. Files 1–10 MB show a progress indicator. Files over 10 MB trigger a warning and run in a background thread to keep the browser responsive.
- Why does my NDJSON fail to parse?
- Common causes are trailing commas, single-quoted strings, unquoted keys, or missing closing brackets. The converter auto-repairs many of these and tells you exactly what it changed.
- Can I convert multiple files at once?
- The tool handles one file at a time. For bulk conversion, consider the csvjson CLI or API.
How it works
Each line is an independent JSON object
NDJSON (also called JSONL or JSON Lines) stores one complete JSON object per line. Unlike standard JSON, there's no outer array — the newline is the separator. The converter parses each line individually and collects them into a standard JSON array.
Blank lines and BOM characters are ignored
Log pipelines often produce trailing newlines or UTF-8 BOM characters at the file start. These are stripped silently so the parse succeeds even on raw log output.
Output is a standard JSON array
The result is a properly structured JSON array that any JSON parser, API, or tool can consume. Use it with jq, load it into pandas, or feed it into any downstream converter on this site.
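The conversion described above can be sketched in a few lines of Python. This is an illustrative model of the behaviour, not the tool's actual code; the function name is made up for the example:

```python
import json

def ndjson_to_array(text: str) -> list:
    """Parse NDJSON text into a list of objects.

    Strips a leading UTF-8 BOM and skips blank lines,
    mirroring the converter's behaviour described above.
    """
    text = text.lstrip("\ufeff")           # drop a leading BOM if present
    records = []
    for line in text.splitlines():
        if not line.strip():               # ignore blank lines
            continue
        records.append(json.loads(line))   # each line is one JSON object
    return records

# A BOM, a blank line, and two records — all handled:
logs = '\ufeff{"level": "INFO"}\n\n{"level": "ERROR"}\n'
result = ndjson_to_array(logs)
```

The output list serialises directly to a standard JSON array with `json.dumps(result)`.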
Example
CloudWatch log export (NDJSON format) parsed for analysis
Input (NDJSON):

{"timestamp":"2024-01-15T10:22:01Z","level":"INFO","msg":"Request received","path":"/api/users","ms":12}
{"timestamp":"2024-01-15T10:22:02Z","level":"ERROR","msg":"DB timeout","path":"/api/orders","ms":5003}
{"timestamp":"2024-01-15T10:22:03Z","level":"INFO","msg":"Request received","path":"/api/users","ms":9}

Output (JSON array):

[
  { "timestamp": "2024-01-15T10:22:01Z", "level": "INFO", "msg": "Request received", "path": "/api/users", "ms": 12 },
  { "timestamp": "2024-01-15T10:22:02Z", "level": "ERROR", "msg": "DB timeout", "path": "/api/orders", "ms": 5003 },
  { "timestamp": "2024-01-15T10:22:03Z", "level": "INFO", "msg": "Request received", "path": "/api/users", "ms": 9 }
]

3 log lines become a JSON array. Pipe into jq: cat output.json | jq '[.[] | select(.level=="ERROR")]'
Edge cases, handled
JSONL / JSON Lines
NDJSON, JSONL, and JSON Lines are the same format with different names. All use one JSON object per line with newline separators. This converter handles all three.
Malformed lines
If a line can't be parsed as JSON, the converter stops at that line and shows the error with the line number. Common cause: a log entry was split across lines by a buffered write.
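Stop-at-first-error behaviour with line numbers can be sketched like this in Python. The function name and error format are illustrative assumptions, not the converter's internals:

```python
import json

def parse_ndjson_strict(text: str) -> list:
    """Parse NDJSON; raise a ValueError naming the first bad line."""
    records = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if not line.strip():
            continue
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError as exc:
            # Surface the 1-based line number so the user can find the record
            raise ValueError(f"line {lineno}: {exc.msg}") from exc
    return records
```

A log entry split mid-object by a buffered write shows up here as a truncated line that fails with its exact line number.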
Large log files
Each line is parsed independently, so NDJSON files are more memory-efficient to process than a single large JSON array. Files up to ~100 MB typically process in seconds in the browser.
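The memory-efficiency point can be made concrete with a streaming reader: because each line stands alone, a generator holds only one record in memory at a time. A sketch (the helper name is invented for the example):

```python
import io
import json
from typing import IO, Iterator

def iter_ndjson(stream: IO[str]) -> Iterator[dict]:
    """Yield one parsed object per line without loading the whole file."""
    for line in stream:
        if line.strip():
            yield json.loads(line)

# Only one record is resident at a time, however large the source is:
src = io.StringIO('{"ms": 12}\n{"ms": 5003}\n')
slow = [rec for rec in iter_ndjson(src) if rec["ms"] > 1000]
```

Parsing the same data as one giant JSON array would force the entire document into memory before the first record is usable.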
Frequently asked questions
What is NDJSON and why do log systems use it?
NDJSON (Newline Delimited JSON) stores one complete JSON object per line. Log systems use it because it's streamable — you can append a new log line without rewriting the entire file, and you can read a partially-written file line by line. Standard JSON arrays can't be appended to without breaking the structure.
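The appendability difference is easy to demonstrate. In this sketch (file names are arbitrary), adding an NDJSON record is one write of one line, while keeping a JSON array valid forces a full read-modify-rewrite each time:

```python
import json
import os
import tempfile

event = {"level": "INFO", "msg": "started"}
tmp = tempfile.mkdtemp()

# NDJSON: appending a record writes one new line; earlier lines are untouched.
path = os.path.join(tmp, "app.ndjson")
for _ in range(2):
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

# JSON array: the file must stay wrapped in [...], so every append means
# reading the whole array back, mutating it, and rewriting the whole file.
arr_path = os.path.join(tmp, "app.json")
for _ in range(2):
    records = []
    if os.path.exists(arr_path):
        with open(arr_path, encoding="utf-8") as f:
            records = json.load(f)          # read everything back
    records.append(event)
    with open(arr_path, "w", encoding="utf-8") as f:
        json.dump(records, f)               # rewrite everything
```

This is also why a half-written NDJSON file is still readable line by line, while a half-written array is broken JSON.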
What's the difference between NDJSON, JSONL, and JSON Lines?
Nothing — they're the same format. NDJSON is the formal name from the ndjson.github.io spec. JSONL is used by BigQuery and some Python libraries. JSON Lines is the name used by the jsonlines.org spec. All three: one JSON object per line, newline as separator.
My log file has some non-JSON lines mixed in. Will this fail?
Yes — any line that isn't valid JSON causes a parse error at that line number. Common in log files that mix structured JSON lines with plaintext error messages. Strip the non-JSON lines before converting, or use grep to filter: grep '^{' logfile.ndjson.
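If you would rather skip the plaintext lines than filter them out first, a tolerant reader can keep every line that parses and report the rest. This is a sketch of that approach, not something the converter does:

```python
import json

def parse_ndjson_lenient(text: str):
    """Keep valid JSON lines; collect skipped plaintext with line numbers."""
    records, skipped = [], []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if not line.strip():
            continue
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            skipped.append((lineno, line))  # remember what was dropped
    return records, skipped

mixed = '{"level": "INFO"}\nTraceback (most recent call last):\n{"level": "ERROR"}'
records, skipped = parse_ndjson_lenient(mixed)
```

Reporting the skipped lines instead of discarding them silently makes it obvious when a "JSON" log stream is dirtier than expected.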
Can I convert the NDJSON output to CSV?
Yes. Convert to JSON array first using this tool, then use the JSON to CSV converter. The two-step pipeline works well for log analysis: NDJSON → JSON → CSV → spreadsheet.
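The same two-step pipeline can be scripted. In this sketch (the function name is illustrative), the CSV header is the union of keys across all records, and records missing a key get an empty cell:

```python
import csv
import io
import json

def ndjson_to_csv(text: str) -> str:
    """NDJSON -> list of dicts -> CSV string."""
    rows = [json.loads(line) for line in text.splitlines() if line.strip()]
    # Union of keys across all rows, in first-seen order, becomes the header.
    fields = list(dict.fromkeys(k for row in rows for k in row))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)  # missing keys become empty cells
    return out.getvalue()
```

Nested objects would need flattening before this step, since CSV cells are scalar.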
Tools like BigQuery and Elasticsearch export in NDJSON. Can I use this to read those exports?
Yes. BigQuery's JSON export format is NDJSON (one object per line), and Kibana's saved-object export uses the .ndjson extension. Elasticsearch's _bulk API likewise uses newline-delimited JSON request bodies. Paste the export directly and the converter produces a standard JSON array.
Related Tools
All conversions run in your browser — nothing is uploaded.
Browse all 26 converters →