Complete Guide to JSON to CSV Conversion

Published: April 2026 • 8 min read • Data Formats

JSON and CSV are two of the most widely used data formats in software development, data science, and business intelligence. While JSON excels at representing complex, nested data structures, CSV remains the gold standard for tabular data exchange between applications, databases, and spreadsheet tools like Excel and Google Sheets. Converting between these formats is a routine task, but it comes with subtleties that can trip up even experienced developers.

This guide covers everything you need to know about converting JSON to CSV, from basic flat structures to deeply nested objects and large-scale datasets.

Understanding JSON and CSV Formats

JSON (JavaScript Object Notation) is a lightweight, text-based data format that uses key-value pairs and ordered lists. It supports nested objects, arrays, and multiple data types including strings, numbers, booleans, and null values. JSON is the de facto standard for web APIs, configuration files, and NoSQL databases.

CSV (Comma-Separated Values) is a plain-text format where each line represents a row and columns are separated by commas. It is inherently flat — every row has the same columns, and there is no native support for nesting, arrays, or mixed data types within a column.

Key Differences at a Glance

| Feature | JSON | CSV |
| --- | --- | --- |
| Structure | Nested objects & arrays | Flat rows & columns |
| Data Types | String, number, boolean, null, object, array | All values are strings |
| Schema | Flexible per record | Uniform columns across all rows |
| Human Readability | Moderate | High |
| Best For | APIs, config, complex data | Spreadsheets, databases, analytics |

Converting Flat JSON to CSV

The simplest conversion scenario is a JSON array of flat objects where every object shares the same keys. This maps naturally to CSV rows and columns.

Python Example

Python's built-in csv and json modules handle this elegantly:

import json
import csv

json_data = '''
[
    {"name": "Alice", "age": 30, "city": "New York"},
    {"name": "Bob", "age": 25, "city": "London"},
    {"name": "Charlie", "age": 35, "city": "Tokyo"}
]
'''

records = json.loads(json_data)
with open('output.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)

JavaScript Example

In Node.js or the browser, you can achieve the same result with a few lines:

const data = [
    { name: "Alice", age: 30, city: "New York" },
    { name: "Bob", age: 25, city: "London" }
];

// Quote every cell and double any embedded quotes (RFC 4180).
// JSON.stringify is not safe here: it escapes quotes with
// backslashes, which CSV readers do not understand.
const escapeCell = v => `"${String(v ?? '').replace(/"/g, '""')}"`;

const headers = Object.keys(data[0]);
const csv = [
    headers.map(escapeCell).join(','),
    ...data.map(row => headers.map(h => escapeCell(row[h])).join(','))
].join('\n');

console.log(csv);

Handling Nested JSON Data

The real challenge in JSON to CSV conversion is nested data. CSV has no concept of nesting, so you need a strategy to flatten your JSON before converting.

Flattening with Dot Notation

The most common approach is to flatten nested keys using dot notation. For example, {"address": {"city": "Tokyo", "zip": "10001"}} becomes columns address.city and address.zip.

def flatten(obj, parent_key='', sep='.'):
    """Recursively flatten nested dicts into dot-notation keys."""
    items = []
    for k, v in obj.items():
        new_key = f"{parent_key}{sep}{k}" if parent_key else k
        if isinstance(v, dict):
            # Recurse into nested objects: {"a": {"b": 1}} -> {"a.b": 1}
            items.extend(flatten(v, new_key, sep).items())
        elif isinstance(v, list):
            # Stringify arrays so they fit in a single CSV cell
            items.append((new_key, json.dumps(v)))
        else:
            items.append((new_key, v))
    return dict(items)

Handling Arrays

Arrays within JSON objects require special handling. There are three common strategies:

1. Stringify each array into a single cell (e.g., "[1,2,3]"), as the flatten function above does with json.dumps.
2. Expand array elements into separate indexed columns (e.g., tags.0, tags.1, and so on).
3. Create one row per array element, duplicating the parent fields in each row.

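The row-expansion strategy — one output row per array element, with the parent fields duplicated — can be sketched in a few lines. The record and field names here are illustrative:

```python
# Expand each array element into its own row, duplicating the
# parent fields ("exploding" the array).
record = {"name": "Alice", "tags": ["admin", "editor"]}

rows = [
    {"name": record["name"], "tag": tag}
    for tag in record["tags"]
]
print(rows)
# [{'name': 'Alice', 'tag': 'admin'}, {'name': 'Alice', 'tag': 'editor'}]
```

Note that this multiplies row count: a record with ten tags becomes ten rows, so downstream aggregations must account for the duplicated parent data.
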
Converting Large JSON Datasets

When dealing with JSON files that are hundreds of megabytes or larger, loading the entire file into memory is not practical. You need a streaming approach.

Streaming with Python (ijson)

The ijson library parses JSON incrementally, yielding one record at a time:

import ijson
import csv

with open('large-data.json', 'rb') as f, open('output.csv', 'w', newline='') as out:
    writer = None
    for record in ijson.items(f, 'item'):
        flat = flatten(record)
        if writer is None:
            # Columns are fixed from the first record; restval fills
            # missing keys with '' and extrasaction drops unknown ones.
            writer = csv.DictWriter(out, fieldnames=flat.keys(),
                                    restval='', extrasaction='ignore')
            writer.writeheader()
        writer.writerow(flat)

Streaming with Node.js

For newline-delimited JSON (ndjson), pair the ndjson parser with the csv-stringify package; stream.pipeline propagates errors from every stage, unlike chained .pipe() calls:

const fs = require('fs');
const { pipeline } = require('stream');
const { parse } = require('ndjson');
const { stringify } = require('csv-stringify');

pipeline(
    fs.createReadStream('large-data.ndjson'),
    parse(),
    stringify({ header: true }),
    fs.createWriteStream('output.csv'),
    err => { if (err) console.error('Conversion failed:', err); }
);

Common Pitfalls and How to Avoid Them

Missing or Inconsistent Keys

Not every JSON object in an array will have the same keys. If object A has {"name", "email"} and object B has {"name", "phone"}, your CSV needs to include all three columns. Always collect the union of all keys across all records before writing the header.
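
A minimal sketch of that approach, using illustrative records:

```python
import csv
import io

records = [
    {"name": "Alice", "email": "alice@example.com"},
    {"name": "Bob", "phone": "555-0100"},
]

# Union of all keys across all records, preserving first-seen order.
fieldnames = list(dict.fromkeys(k for r in records for k in r))

buf = io.StringIO()
# restval='' writes an empty cell wherever a record lacks a key.
writer = csv.DictWriter(buf, fieldnames=fieldnames, restval='')
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
```

Here `dict.fromkeys` deduplicates while keeping insertion order, so the column order follows the order keys first appear across the dataset.
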

Special Characters and Comma Escaping

CSV cells containing commas, quotes, or newlines must be wrapped in double quotes, and internal quotes must be escaped by doubling them. Most CSV libraries handle this automatically, but if you are building CSV strings manually, use a proper writer.
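
For instance, Python's csv module applies this quoting automatically:

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
# The first cell contains a comma and embedded quotes; csv.writer
# wraps it in double quotes and doubles the inner quotes.
writer.writerow(['He said "hi", then left', 'plain'])
print(buf.getvalue())
# "He said ""hi"", then left",plain
```
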

Data Type Loss

CSV stores everything as text. The number 42 becomes the string "42", and true becomes "true". When you need to preserve types, consider adding type metadata or using a format that supports typed data.
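
A quick round-trip with Python's csv module shows the effect (note that Python renders the boolean as the string "True", not JSON's lowercase "true"):

```python
import csv
import io

# Write a typed record, then read it back: every value is now a string.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["answer", "flag"])
writer.writeheader()
writer.writerow({"answer": 42, "flag": True})

parsed = next(csv.DictReader(io.StringIO(buf.getvalue())))
print(parsed)
# {'answer': '42', 'flag': 'True'}
```
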

Using an Online JSON to CSV Converter

For quick, one-off conversions without writing code, an online converter is the fastest option. Risetop's JSON to CSV Converter handles nested flattening, large files, and special character escaping automatically. Simply paste your JSON or upload a file, and download the resulting CSV.

When to Use JSON vs CSV

Reach for JSON when you need nested structures, multiple data types, or interchange with web APIs and configuration systems. Reach for CSV when the data is tabular and destined for spreadsheets, relational databases, or analytics tools, or when non-technical users need to open it directly. If the data must round-trip without loss, keep it in JSON.

Frequently Asked Questions

Can I convert JSON arrays with mixed structures to CSV?

Yes, but you need to flatten all records first and collect the union of all keys. Records missing certain keys will have empty cells in those columns. This works fine for analysis tools but can result in very wide CSVs if the structures differ significantly.

How do I handle very large JSON files without running out of memory?

Use streaming parsers like ijson in Python or JSONStream in Node.js. These libraries read the file incrementally and yield one record at a time, so you never load the entire dataset into memory. Pair them with a streaming CSV writer for optimal performance.

What happens to null values during JSON to CSV conversion?

Most converters write an empty cell for null values. You can customize this behavior by replacing null with a placeholder string like "NULL" or "N/A" before writing to CSV, depending on what your downstream tool expects.
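
A one-line sketch of that substitution, assuming the placeholder "N/A" (swap in whatever your downstream tool expects):

```python
# Replace None with a placeholder before writing to CSV; "N/A" is
# just an example value.
record = {"name": "Alice", "email": None}
cleaned = {k: ("N/A" if v is None else v) for k, v in record.items()}
print(cleaned)
# {'name': 'Alice', 'email': 'N/A'}
```
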

How do nested arrays get converted to CSV columns?

The most common approach is to stringify the array into a single CSV cell (e.g., "[1,2,3]"). For more complex cases, you can expand array elements into separate columns or create multiple rows (one per array element) with duplicated parent data.

Is CSV lossy compared to JSON?

Yes. CSV loses data type information (everything becomes a string), does not support nested structures, and cannot represent empty arrays versus null values distinctly. If you need to round-trip data without loss, stick with JSON or use a format like Parquet.

How do I handle Unicode characters in CSV output?

Most modern CSV tools handle Unicode well, but you should ensure your output file is encoded as UTF-8. For Excel compatibility, prepend a UTF-8 BOM (byte order mark: \ufeff) so Excel detects the encoding correctly.
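
In Python, the utf-8-sig codec handles the BOM for you:

```python
import csv

# 'utf-8-sig' writes a UTF-8 BOM (\ufeff) at the start of the file,
# which Excel uses to detect the encoding.
with open('output_bom.csv', 'w', newline='', encoding='utf-8-sig') as f:
    writer = csv.writer(f)
    writer.writerow(['name', 'city'])
    writer.writerow(['José', 'São Paulo'])
```

Reading the file back with encoding='utf-8-sig' strips the BOM again, so the round-trip is transparent.
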

Can I convert JSON to CSV and back without data loss?

Not reliably. The round-trip will lose type information and nested structure. A number like 3.14 becomes the string "3.14" in CSV, and may or may not be parsed back as a number depending on the JSON parser's type inference.

What delimiter should I use instead of commas?

Use tabs (\t) if your data naturally contains commas (common in addresses, descriptions, or monetary values). Tab-separated values (TSV) are widely supported by spreadsheet applications. Semicolons are also common in European locales where commas serve as decimal separators.
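
A short sketch of tab-delimited output with Python's csv module:

```python
import csv
import io

buf = io.StringIO()
# With a tab delimiter, commas in the data need no quoting at all.
writer = csv.writer(buf, delimiter='\t')
writer.writerow(['name', 'address'])
writer.writerow(['Alice', '12 Main St, Springfield'])
print(buf.getvalue())
```
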

Try Our JSON to CSV Converter →