
JSON to CSV: Complete Conversion Guide with Code Examples (2026)

📅 Updated April 2026 ⏱ 13 min read 🛠 Developer guide

Converting between JSON and CSV sounds simple until you encounter real-world data: nested objects that need to be flattened, arrays that could become multiple rows or a single stringified cell, records with inconsistent fields, or values containing commas and newlines. This guide explains the complete JSON-to-CSV conversion process — the algorithms, the edge cases, and the production-ready code — so you can handle any conversion scenario, whether you are working in a browser, a Node.js service, a Python script, or the command line.

Need to convert JSON to CSV right now?

Use the free online converter — paste your JSON array and download the CSV file instantly.

Open JSON to CSV Tool →

JSON vs CSV: When to Use Each

JSON and CSV serve different purposes and make different assumptions about data structure. Understanding the fundamental differences helps you decide which format belongs where — and what you will lose in the conversion.

Dimension       | JSON                                          | CSV
Data shape      | Hierarchical — nested objects and arrays      | Flat — rows and columns only
Types           | String, number, boolean, null, array, object  | All values are strings (type inference required)
Schema          | Self-describing (keys in each object)         | Header row defines schema
Tooling         | APIs, programming languages, NoSQL databases  | Spreadsheets, SQL imports, analytics tools
Human-readable  | Moderately — clearer for nested data          | Very — easy to read in a text editor or spreadsheet
Size            | Larger — keys repeated per record             | Smaller — keys appear only in header row

Use JSON when data is hierarchical, when types matter (particularly numbers and booleans), when the consuming system is a programming language or API, or when the schema varies between records. Use CSV when data is tabular, when the consumer is a spreadsheet or analytics tool, or when file size matters and the data is flat.

The conversion from JSON to CSV is lossy by nature: type information is collapsed to strings, nested structure must either be flattened (changing the shape) or stringified (making the cell opaque), and arrays in field values require a policy decision that can change the row count or the column set.
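A quick sketch makes the type loss concrete: a naive conversion turns every value into a string, and nothing in the CSV records what the original types were.

```javascript
// Naive one-row conversion: all type information collapses to strings
const record = { id: 1, active: true, score: null };
const csvRow = Object.values(record)
  .map(v => (v === null ? "" : String(v)))
  .join(",");
console.log(csvRow); // 1,true,

// Reading it back without type inference yields only strings
const parsed = csvRow.split(",");
console.log(typeof parsed[0]); // string (the number 1 is now "1")
```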

How JSON to CSV Conversion Works

Converting a JSON array to CSV involves three conceptual phases:

  1. Flatten: Recursively expand nested objects into dot-notation keys. {"user": {"name": "Alice"}} becomes {"user.name": "Alice"}.
  2. Discover columns: Collect every unique key across all records. Because JSON objects can have different fields, you must scan all records to build the complete column set, not just the first record.
  3. Emit rows: For each record, output values in column order. For any column absent from the current record, output an empty cell.

This algorithm handles the heterogeneous records case correctly — records with different fields all appear under the right column headers, with empty cells for missing values.

Simple Flat JSON to CSV

When every record has only scalar values (strings, numbers, booleans, null) and no nesting, the conversion is straightforward:

// Input JSON array
const json = [
  { "id": 1, "name": "Alice", "email": "alice@example.com", "age": 30 },
  { "id": 2, "name": "Bob",   "email": "bob@example.com",   "age": 25 },
  { "id": 3, "name": "Carol", "email": "carol@example.com", "age": 35 }
];

// Step 1: Collect all unique headers
const headers = [...new Set(json.flatMap(row => Object.keys(row)))];

// Step 2: Build CSV rows
function csvCell(value) {
  if (value === null || value === undefined) return "";
  const str = String(value);
  // Wrap in quotes if the value contains comma, quote, or newline
  if (str.includes(",") || str.includes('"') || str.includes("\n")) {
    return '"' + str.replace(/"/g, '""') + '"';
  }
  return str;
}

const csvRows = [
  headers.join(","),
  ...json.map(row => headers.map(h => csvCell(row[h])).join(","))
];

const csvString = csvRows.join("\n");
console.log(csvString);
// id,name,email,age
// 1,Alice,alice@example.com,30
// 2,Bob,bob@example.com,25
// 3,Carol,carol@example.com,35

Handling Nested Objects

Nested objects are the most common complication. The standard approach is dot notation flattening: recursively traverse the object and join key names with dots to produce flat keys.

// Recursive flatten function
function flattenObject(obj, prefix = "", result = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? `${prefix}.${key}` : key;

    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      flattenObject(value, fullKey, result); // recurse into nested objects
    } else {
      result[fullKey] = value; // leaf value — store it
    }
  }
  return result;
}

// Example
const user = {
  id: 1,
  name: "Alice",
  address: {
    street: "123 Main St",
    city: "Portland",
    geo: { lat: 45.52, lng: -122.68 }
  }
};

flattenObject(user);
// {
//   "id": 1,
//   "name": "Alice",
//   "address.street": "123 Main St",
//   "address.city": "Portland",
//   "address.geo.lat": 45.52,
//   "address.geo.lng": -122.68
// }

Apply the flattener to every record before collecting headers and emitting rows:

const flatRecords = json.map(record => flattenObject(record));
const headers = [...new Set(flatRecords.flatMap(r => Object.keys(r)))];
// headers: ["id", "name", "address.street", "address.city", "address.geo.lat", "address.geo.lng"]

Alternative: If you prefer not to flatten, you can serialize nested objects as JSON strings in the CSV cell: address,"{""street"":""123 Main St"",...}". This preserves all data but makes the CSV less useful for analytics. Spreadsheet tools like Excel and Google Sheets will treat the cell as a plain string.
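A minimal sketch of this stringify alternative (the csvCell helper repeats the quoting logic from the flat-JSON example so the snippet stands alone):

```javascript
// Stringify alternative: nested objects and arrays become one opaque JSON cell
function csvCell(value) {
  if (value === null || value === undefined) return "";
  const str = String(value);
  return /[",\n]/.test(str) ? '"' + str.replace(/"/g, '""') + '"' : str;
}

function stringifyNested(record) {
  const out = {};
  for (const [key, value] of Object.entries(record)) {
    out[key] = value !== null && typeof value === "object"
      ? JSON.stringify(value) // object or array serialized into the cell
      : value;
  }
  return out;
}

const row = stringifyNested({ id: 1, address: { street: "123 Main St", city: "Portland" } });
console.log(Object.values(row).map(csvCell).join(","));
// 1,"{""street"":""123 Main St"",""city"":""Portland""}"
```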

Handling Arrays in JSON

Arrays in JSON field values are the most ambiguous case in JSON-to-CSV conversion. There is no single correct answer — the right choice depends on how the data will be consumed.

Strategy 1: Stringify the Array

// Treat arrays as atomic values — serialize to JSON string
function flattenObject(obj, prefix = "", result = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (Array.isArray(value)) {
      result[fullKey] = JSON.stringify(value); // e.g. ["a","b","c"] stored as one CSV cell
    } else if (value !== null && typeof value === "object") {
      flattenObject(value, fullKey, result);
    } else {
      result[fullKey] = value;
    }
  }
  return result;
}

Strategy 2: Expand to Multiple Rows (for arrays of objects)

// Expand each element of an array field into a separate row
// Input: one order with multiple line items
const order = {
  orderId: "ORD-001",
  customer: "Alice",
  items: [
    { sku: "A100", qty: 2, price: 9.99 },
    { sku: "B200", qty: 1, price: 24.99 }
  ]
};

// Expand: produces one row per item, with order fields repeated
function expandArray(records, arrayField) {
  return records.flatMap(record => {
    const arr = record[arrayField];
    if (!Array.isArray(arr) || arr.length === 0) return [record];
    const { [arrayField]: _, ...rest } = record;
    return arr.map(item => ({ ...rest, ...flattenObject(item, arrayField) }));
  });
}

expandArray([order], "items");
// [
//   { orderId: "ORD-001", customer: "Alice", "items.sku": "A100", "items.qty": 2, "items.price": 9.99 },
//   { orderId: "ORD-001", customer: "Alice", "items.sku": "B200", "items.qty": 1, "items.price": 24.99 }
// ]

Strategy 3: Index-Based Columns

// Create columns for each array index: tags.0, tags.1, tags.2, ...
// Useful for fixed-length arrays (e.g., RGB values, coordinates)
function flattenArraysWithIndex(obj, prefix = "", result = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (Array.isArray(value)) {
      value.forEach((item, i) => {
        if (typeof item === "object" && item !== null) {
          flattenArraysWithIndex(item, `${fullKey}.${i}`, result);
        } else {
          result[`${fullKey}.${i}`] = item;
        }
      });
    } else if (value !== null && typeof value === "object") {
      flattenArraysWithIndex(value, fullKey, result);
    } else {
      result[fullKey] = value;
    }
  }
  return result;
}
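As a quick check, here is the same index-based idea in a compact standalone form, applied to a fixed-length array (scalar elements only, for brevity):

```javascript
// Compact index-based flattening for scalar arrays
function flattenIndexed(obj, prefix = "", out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (Array.isArray(value)) {
      value.forEach((item, i) => { out[`${fullKey}.${i}`] = item; }); // one column per index
    } else if (value !== null && typeof value === "object") {
      flattenIndexed(value, fullKey, out); // recurse into nested objects
    } else {
      out[fullKey] = value;
    }
  }
  return out;
}

console.log(flattenIndexed({ name: "red", rgb: [255, 0, 0] }));
// { name: 'red', 'rgb.0': 255, 'rgb.1': 0, 'rgb.2': 0 }
```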

Missing and Null Fields

Real-world JSON arrays often have records with different sets of fields — for example, an API response where some users have an optional phone field and others do not. Your converter must handle this gracefully.

const records = [
  { id: 1, name: "Alice", email: "alice@x.com", phone: "555-1234" },
  { id: 2, name: "Bob",   email: "bob@x.com"                       }, // no phone
  { id: 3, name: "Carol",              phone: "555-5678"             }  // no email
];

// Collect ALL headers from ALL records — not just the first
const headers = [...new Set(records.flatMap(r => Object.keys(r)))];
// ["id", "name", "email", "phone"]

// When a field is missing, output empty string; when null, output empty or "null"
function cellValue(value) {
  if (value === undefined) return "";      // missing field → empty cell
  if (value === null)      return "";      // null → empty cell (common choice)
  return csvCell(value);
}

// Output:
// id,name,email,phone
// 1,Alice,alice@x.com,555-1234
// 2,Bob,bob@x.com,
// 3,Carol,,555-5678

Design decision: Whether to render JSON null as an empty cell or the literal string null depends on your consumer. Spreadsheet tools generally expect empty cells for absent data. Database imports may require the literal string NULL. Make this configurable in your converter.
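One way to implement that option, as a hypothetical makeCellFormatter helper (the name and the nullValue option are illustrative, not from any library):

```javascript
// Hypothetical helper: make the null rendering a converter option
function makeCellFormatter({ nullValue = "" } = {}) {
  return function cellValue(value) {
    if (value === undefined) return "";   // missing field: always an empty cell
    if (value === null) return nullValue; // null: configurable ("" or "NULL")
    return String(value);
  };
}

const forSheets = makeCellFormatter();                        // null renders as ""
const forDbImport = makeCellFormatter({ nullValue: "NULL" }); // null renders as "NULL"
```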

Converting JSON to CSV in JavaScript

Here is a complete, production-ready JSON-to-CSV converter in JavaScript with proper quoting, null handling, and a file download function:

function flattenObject(obj, prefix = "") {
  return Object.entries(obj).reduce((acc, [key, val]) => {
    const newKey = prefix ? `${prefix}.${key}` : key;
    if (val !== null && typeof val === "object" && !Array.isArray(val)) {
      Object.assign(acc, flattenObject(val, newKey));
    } else if (Array.isArray(val)) {
      acc[newKey] = JSON.stringify(val); // stringify arrays
    } else {
      acc[newKey] = val;
    }
    return acc;
  }, {});
}

function escapeCell(value) {
  if (value === null || value === undefined) return "";
  const str = String(value);
  if (/[",\n\r]/.test(str)) {
    return '"' + str.replace(/"/g, '""') + '"';
  }
  return str;
}

function jsonToCsv(jsonArray) {
  if (!Array.isArray(jsonArray) || jsonArray.length === 0) return "";

  const flatRows = jsonArray.map(row => flattenObject(row));

  // Collect all unique headers preserving first-seen order
  const headerSet = new Set();
  flatRows.forEach(row => Object.keys(row).forEach(k => headerSet.add(k)));
  const headers = [...headerSet];

  const lines = [
    headers.map(escapeCell).join(","),
    ...flatRows.map(row => headers.map(h => escapeCell(row[h])).join(","))
  ];

  return lines.join("\r\n"); // RFC 4180 recommends CRLF
}

// Download as file in the browser
function downloadCsv(jsonArray, filename = "data.csv") {
  const csv = jsonToCsv(jsonArray);
  const blob = new Blob(["\uFEFF" + csv], { type: "text/csv;charset=utf-8;" });
  // \uFEFF = UTF-8 BOM — tells Excel to open as UTF-8
  const url = URL.createObjectURL(blob);
  const link = document.createElement("a");
  link.href = url;
  link.download = filename;
  link.click();
  URL.revokeObjectURL(url);
}

Converting in Python

Python's standard library provides both json and csv modules. For simple flat data they are sufficient; for nested data, pandas.json_normalize() handles the flattening automatically.

Standard Library Approach

import json, csv, io

def flatten(obj, prefix="", sep="."):
    result = {}
    for k, v in obj.items():
        key = f"{prefix}{sep}{k}" if prefix else k
        if isinstance(v, dict):
            result.update(flatten(v, key, sep))
        elif isinstance(v, list):
            result[key] = json.dumps(v)
        else:
            result[key] = v
    return result

def json_to_csv(json_array):
    if not json_array:
        return ""
    flat_rows = [flatten(row) for row in json_array]

    # Collect all headers preserving order
    headers = list(dict.fromkeys(k for row in flat_rows for k in row))

    output = io.StringIO()
    writer = csv.DictWriter(
        output,
        fieldnames=headers,
        extrasaction="ignore",
        restval=""         # empty string for missing fields
    )
    writer.writeheader()
    writer.writerows(flat_rows)
    return output.getvalue()

# Usage
with open("data.json") as f:
    data = json.load(f)
csv_str = json_to_csv(data)
with open("output.csv", "w", newline="", encoding="utf-8-sig") as f:
    f.write(csv_str)  # utf-8-sig adds BOM for Excel compatibility

Pandas json_normalize()

import pandas as pd, json

with open("data.json") as f:
    data = json.load(f)

# json_normalize flattens nested dicts automatically
# record_path expands nested arrays into rows
df = pd.json_normalize(
    data,
    record_path="items",        # expand this array field into rows
    meta=["orderId", "customer"],  # include these parent fields per row
    meta_prefix="order.",
    sep="."                     # dot-notation for nested keys
)

df.to_csv("output.csv", index=False, encoding="utf-8-sig")

Command-Line JSON to CSV

For one-off conversions or shell pipelines, command-line tools are faster than writing code.

Using jq

# jq: extract specific fields as CSV (flat JSON only)
jq -r '.[] | [.id, .name, .email] | @csv' data.json

# With a header row
jq -r '["id","name","email"], (.[] | [.id, .name, .email]) | @csv' data.json

# Dynamic headers — infer from first record keys (simple flat objects)
jq -r '(.[0] | keys_unsorted) as $h | $h, (.[] | [.[$h[]]] ) | @csv' data.json

Using miller (mlr)

# miller is purpose-built for tabular data and handles nested JSON natively
# Convert JSON array to CSV
mlr --j2c cat data.json

# Flatten nested fields with dot notation
mlr --j2c flatten data.json

# Chain operations: filter, select fields, then convert
mlr --j2c filter '$age > 25' then cut -f id,name,email data.json

Using csvkit

# csvkit's in2csv handles JSON arrays directly
pip install csvkit
in2csv --format json data.json > output.csv

# For nested JSON, flatten with jq first (field names below are illustrative)
jq 'map({id, name, city: .address.city})' data.json | in2csv --format json > output.csv

CSV to JSON Conversion

Reversing the conversion — CSV back to JSON — is conceptually simpler but requires type inference because CSV treats all values as strings.

function csvToJson(csvString) {
  // Note: splitting on newlines assumes no literal newlines inside quoted fields;
  // for such data, use a full CSV parser like Papa Parse
  const lines = csvString.trim().split(/\r?\n/);
  if (lines.length < 2) return [];

  const headers = parseCsvRow(lines[0]);

  return lines.slice(1).map(line => {
    const values = parseCsvRow(line);
    return Object.fromEntries(
      headers.map((h, i) => [h, inferType(values[i] ?? "")])
    );
  });
}

function inferType(str) {
  if (str === "" || str === "null") return null;
  if (str === "true")  return true;
  if (str === "false") return false;
  if (!isNaN(str) && str.trim() !== "") return Number(str); // note: "007" becomes 7; keep numeric IDs as strings if leading zeros matter
  // Try to parse as JSON (handles stringified arrays/objects)
  if ((str.startsWith("{") && str.endsWith("}")) ||
      (str.startsWith("[") && str.endsWith("]"))) {
    try { return JSON.parse(str); } catch {}
  }
  return str;
}

function parseCsvRow(line) {
  const result = [];
  let current = "", inQuote = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuote) {
      if (ch === '"' && line[i+1] === '"') { current += '"'; i++; }
      else if (ch === '"') inQuote = false;
      else current += ch;
    } else {
      if (ch === '"') inQuote = true;
      else if (ch === ",") { result.push(current); current = ""; }
      else current += ch;
    }
  }
  result.push(current);
  return result;
}

Special Characters and Quoting

RFC 4180 defines the standard rules for CSV quoting. Understanding them prevents the most common CSV parsing bugs:

// Examples of correct CSV quoting
// Value: She said "hello"    → "She said ""hello"""
// Value: Portland, OR        → "Portland, OR"
// Value: Line 1\nLine 2      → "Line 1\nLine 2"   (literal newline inside quotes)
// Value: normal value        → normal value        (no quoting needed)
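A minimal quote function implementing exactly these rules (mirroring the escapeCell helper defined earlier in this guide):

```javascript
// RFC 4180 quoting: wrap when the value contains a comma, quote, CR, or LF
function quote(value) {
  const str = String(value);
  return /[",\n\r]/.test(str) ? '"' + str.replace(/"/g, '""') + '"' : str;
}

console.log(quote('She said "hello"')); // "She said ""hello"""
console.log(quote("Portland, OR"));     // "Portland, OR"
console.log(quote("normal value"));     // normal value
```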

Large File Handling

Loading a 500 MB JSON file into memory to convert it is neither safe nor fast. Streaming conversion processes records one at a time using constant memory.

// Node.js: streaming JSON array to CSV using stream-json library
const { createReadStream, createWriteStream } = require("fs");
const { parser } = require("stream-json");
const { streamArray } = require("stream-json/streamers/StreamArray");
const { Transform } = require("stream");

let headers = null;
let headerWritten = false;

const pipeline = createReadStream("huge.json")
  .pipe(parser())
  .pipe(streamArray())
  .pipe(new Transform({
    objectMode: true,
    transform({ value }, enc, cb) {
      const flat = flattenObject(value);

      if (!headerWritten) {
        // Caveat: in a single streaming pass, headers come from the FIRST record only.
        // If later records add fields, do a separate first pass to collect headers,
        // or require a uniform schema upstream.
        headers = Object.keys(flat);
        this.push(headers.join(",") + "\r\n");
        headerWritten = true;
      }

      const row = headers.map(h => escapeCell(flat[h])).join(",") + "\r\n";
      this.push(row);
      cb();
    }
  }))
  .pipe(createWriteStream("output.csv"));

pipeline.on("finish", () => console.log("Done"));

In Python, you can achieve the same with ijson for streaming JSON parsing combined with the csv module for output, processing one record at a time without holding the entire dataset in memory.

Common Conversion Mistakes

These errors appear frequently in production data pipelines:

  1. Deriving headers from the first record only. Later records with extra fields silently lose data; always scan every record.
  2. Joining values with commas without quoting. Any value containing a comma, double quote, or newline corrupts the row structure.
  3. Omitting the UTF-8 BOM for Excel users. Without it, non-ASCII text opens garbled in Excel on Windows.
  4. Conflating null and missing fields. Decide a rendering policy up front; database imports may need the literal NULL.
  5. Loading huge files fully into memory. Stream large JSON arrays record by record instead.

Excel and Google Sheets Integration

CSV is the universal bridge between JSON-producing APIs and spreadsheet tools. A few practical notes for real-world integration:

Opening CSV in Excel

// Include UTF-8 BOM to ensure Excel opens correctly
const csv = jsonToCsv(data);
const bom = "\uFEFF";
const blob = new Blob([bom + csv], { type: "text/csv;charset=utf-8" });

// The BOM tells Excel: "this file is UTF-8, use Unicode mode"
// Without it, Excel on Windows defaults to ANSI encoding → garbled text

Google Sheets IMPORTDATA

# Google Sheets formula to import a public JSON-as-CSV endpoint
=IMPORTDATA("https://api.example.com/users.csv")

# If your server can serve JSON-to-CSV on the fly, this auto-refreshes
# Sheets also supports IMPORTJSON via Apps Script for direct JSON import

For automating Excel file generation (rather than CSV), consider the exceljs (Node.js) or openpyxl (Python) libraries, which produce native .xlsx files with proper number types, date formatting, and multiple sheets — avoiding all the CSV encoding issues entirely.

Frequently Asked Questions

How do I convert JSON to CSV in JavaScript?
Flatten each object in your JSON array, collect all unique keys as the header row, then map each object to a CSV row by iterating those keys in order. Wrap values containing commas, double quotes, or newlines in double quotes, and escape internal double quotes by doubling them. Use the Blob and URL.createObjectURL APIs to offer the result as a file download in the browser.
What happens to nested JSON objects when converting to CSV?
CSV is a flat, two-dimensional format with no concept of nesting. When converting nested JSON, you have two choices: (1) flatten using dot notation (user.address.city becomes a column), or (2) serialize the nested object as a JSON string inside the CSV cell. Flattening is more useful for data analysis; stringifying preserves all data but the cells contain raw JSON.
Can I convert CSV back to JSON?
Yes. Parse the first CSV row as the header (column names), then parse each subsequent row, mapping cells to the corresponding header key. The main challenge is type inference — all CSV values are strings, so you need logic to convert "42" to number 42 and "true" to boolean true. Libraries like Papa Parse in JavaScript and the csv module in Python handle this automatically.
How do I handle commas inside CSV field values?
Per RFC 4180, fields containing commas, double quotes, or newlines must be enclosed in double quotes. If the field itself contains a double quote, escape it by doubling it: "" represents one literal double quote within a quoted field. For example, the value She said, "hello" becomes "She said, ""hello""" in CSV.
What is the best command-line tool for JSON to CSV conversion?
jq is the most widely available option for quick conversions. For complex transformations, miller (mlr) is purpose-built for tabular data manipulation and supports both JSON and CSV natively with a rich pipeline syntax. csvkit's in2csv tool handles many formats including JSON. All three are available via common package managers.

Related Tools & Guides

JSON to CSV Tool  |  CSV to JSON Tool  |  JSON Formatter  |  JSON Flatten Tool  |  JSON to Excel