Convert NDJSON (Newline Delimited JSON / JSONL) to a JSON array, or a JSON array back to NDJSON. Free, private, runs entirely in your browser.
Open NDJSON Converter →

NDJSON (Newline Delimited JSON), also called JSONL (JSON Lines), is a text format where each line is a complete, valid JSON object. Unlike a regular JSON array, there is no wrapping [ ] and no commas between records — just one JSON object per line.
```json
{"id": 1, "name": "Alice", "role": "admin"}
{"id": 2, "name": "Bob", "role": "user"}
{"id": 3, "name": "Carol", "role": "user"}
```
The same three records as a JSON array:

```json
[
  {"id": 1, "name": "Alice", "role": "admin"},
  {"id": 2, "name": "Bob", "role": "user"},
  {"id": 3, "name": "Carol", "role": "user"}
]
```
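The conversion from NDJSON to a JSON array is mechanical: parse each non-empty line, collect the results into a list, and serialize the list. A minimal Python sketch (the sample data mirrors the records above):

```python
import json

ndjson_text = '\n'.join([
    '{"id": 1, "name": "Alice", "role": "admin"}',
    '{"id": 2, "name": "Bob", "role": "user"}',
    '{"id": 3, "name": "Carol", "role": "user"}',
])

# NDJSON -> JSON array: parse each non-empty line, then dump the list
records = [json.loads(line) for line in ndjson_text.splitlines() if line.strip()]
array_text = json.dumps(records, indent=2)
```

Blank lines are skipped so trailing newlines in the input do not break the conversion.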
| Use Case | Best Format | Reason |
|---|---|---|
| API responses | JSON array | Standard, easy to parse |
| Log files | NDJSON | Append one line at a time |
| BigQuery / data exports | NDJSON | Required format for bulk load |
| Elasticsearch bulk API | NDJSON | Required by the API spec |
| ML training datasets | NDJSON / JSONL | Stream line by line, memory efficient |
| Config files | JSON | Easier to read and edit |
| OpenAI fine-tuning | JSONL | Required format for training files |
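The log-file row above is where NDJSON's append-only property pays off: a writer can emit one complete record per line without ever rewriting the file. A minimal sketch (the file name `logs.ndjson` is illustrative):

```python
import json

def append_event(path, event):
    # Each call writes one complete JSON object followed by a newline,
    # so the file is valid NDJSON after every append.
    with open(path, 'a') as f:
        f.write(json.dumps(event) + '\n')

append_event('logs.ndjson', {"level": "info", "msg": "started"})
append_event('logs.ndjson', {"level": "warn", "msg": "low disk"})
```

A JSON array would need the closing `]` moved on every write; NDJSON needs no such bookkeeping, which is why log shippers and bulk APIs favor it.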
To go the other direction (JSON array to NDJSON), select "JSON Array → NDJSON" mode and paste your JSON array.
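In code, the reverse conversion is one serialized record per line. A Python sketch (variable names are illustrative):

```python
import json

array_text = '[{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]'

# JSON array -> NDJSON: serialize each element on its own line
records = json.loads(array_text)
ndjson_text = '\n'.join(json.dumps(r) for r in records)
```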
NDJSON files use the .ndjson or .jsonl extension; the format is identical either way. Google BigQuery's documentation uses .ndjson, while the JSON Lines specification uses .jsonl. Most tools accept both extensions.
Read the file line by line and parse each line as JSON:
```python
import json

with open('data.ndjson') as f:
    records = [json.loads(line) for line in f if line.strip()]
```
In Node.js:

```javascript
const fs = require('fs');

const records = fs.readFileSync('data.ndjson', 'utf8')
  .split('\n')
  .filter(line => line.trim())
  .map(line => JSON.parse(line));
```
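Both snippets above load the whole file into memory. For files too large for that, the same idea can be reworked to stream one record at a time. A Python sketch using a generator:

```python
import json

def iter_ndjson(path):
    # Yield one parsed record at a time; only a single line is
    # held in memory at once, regardless of file size.
    with open(path) as f:
        for line in f:
            if line.strip():
                yield json.loads(line)
```

This is the same streaming access pattern ML pipelines rely on when reading JSONL training sets.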
BigQuery processes data in parallel across many workers. NDJSON allows each worker to read a chunk of lines independently without needing to parse the entire file structure first. A JSON array requires knowing where the array starts and ends, which prevents efficient parallel processing of large files.
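That independence is easy to demonstrate: any contiguous run of lines is itself valid NDJSON, so workers can parse disjoint chunks with no coordination. A toy sketch (chunking by line count rather than by byte range, for simplicity):

```python
import json

lines = ['{"id": 1}', '{"id": 2}', '{"id": 3}', '{"id": 4}']

# Split into two "worker" chunks; each chunk parses independently,
# with no knowledge of the rest of the file.
chunks = [lines[:2], lines[2:]]
parsed = [[json.loads(line) for line in chunk] for chunk in chunks]

# Concatenating the per-chunk results reproduces the whole dataset.
records = [rec for chunk in parsed for rec in chunk]
```

With a JSON array, a worker handed the middle of the file cannot even find a record boundary without parsing from the opening `[`.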
Also useful: JSON Formatter | JSON to CSV | JSON to XML | JSON Diff | JSON Validator