JSON: The Complete Developer's Reference Guide
Table of Contents
- 1. What is JSON?
- 2. JSON Syntax Rules and Data Types
- 3. JSON vs XML vs YAML vs TOML
- 4. Parsing JSON in 7 Languages
- 5. JSON Schema Validation
- 6. JSON Path Expressions
- 7. Common JSON Mistakes and Fixes
- 8. JSON APIs and REST Conventions
- 9. JSON Performance Tips
- 10. JSON Lines (JSONL) Format
- 11. JSON-LD for SEO and Structured Data
- 12. Tools for Working with JSON
- Frequently Asked Questions
Try Our JSON Tools
Paste any JSON into our JSON Formatter to beautify, validate, and explore it instantly — or use our JSON Validator for detailed error messages.
1. What is JSON?
JSON stands for JavaScript Object Notation. It is a lightweight, text-based data interchange format that is easy for humans to read and write, and easy for machines to parse and generate. JSON is language-independent but uses conventions that are familiar to programmers of the C-family of languages, including C, C++, C#, Java, JavaScript, Perl, Python, and many others.
Despite having "JavaScript" in its name, JSON is not tied to JavaScript. It is a universal format supported natively by virtually every programming language in use today. When you make an API call, store data in MongoDB, or configure a TypeScript project with tsconfig.json, you are working with JSON.
A Brief History of JSON
JSON was first specified by Douglas Crockford in the early 2000s. Crockford recognized that JavaScript object literals provided a convenient and readable notation for structuring data, and he formalized this into a standalone data format. The key milestones in JSON's history include:
- 2001 — Douglas Crockford began promoting JSON as an alternative to XML for data exchange.
- 2002 — The JSON.org website was launched, providing the first formal specification.
- 2006 — JSON was formally specified in RFC 4627 by the IETF.
- 2013 — The ECMA-404 standard was published, defining JSON as an international standard.
- 2014 — RFC 7159 superseded RFC 4627, clarifying edge cases and tightening the specification.
- 2017 — RFC 8259 became the current and definitive JSON standard (STD 90).
Why JSON Won
Before JSON, XML was the dominant format for data interchange. JSON rapidly overtook XML for most use cases because of several advantages:
- Simplicity — JSON has only six data types: four primitives (string, number, boolean, null) plus two structures (objects and arrays). The entire specification fits on a single page.
- Readability — JSON is significantly more concise than XML. No closing tags, no attributes-versus-elements confusion.
- Native JavaScript support — JSON maps directly to JavaScript objects, making it the natural choice for web APIs.
- Universal support — Every major programming language has built-in or standard library support for JSON.
- Smaller payload size — The same data in JSON is typically 30-50% smaller than its XML equivalent.
Today, JSON is the default format for REST APIs, GraphQL responses, NoSQL databases like MongoDB and CouchDB, configuration files (package.json, tsconfig.json, .eslintrc.json), and inter-service communication in microservice architectures.
2. JSON Syntax Rules and Data Types
JSON syntax is derived from JavaScript object notation, but it is stricter. Mastering these rules will save you from the most common JSON errors.
The Eight Fundamental Rules
Rule 1: Data is in key/value pairs. A key must be a double-quoted string followed by a colon, followed by a value:
"name": "Alice"
Rule 2: Data is separated by commas. Multiple key/value pairs within an object, or values within an array, are separated by commas:
{"name": "Alice", "age": 30, "city": "Berlin"}
Rule 3: Curly braces hold objects. A JSON object is an unordered collection of key/value pairs enclosed in {}.
Rule 4: Square brackets hold arrays. A JSON array is an ordered list of values enclosed in [].
Rule 5: All keys must be double-quoted strings. Unlike JavaScript, unquoted keys and single-quoted keys are invalid:
// INVALID JSON
{name: "Alice"}
{'name': "Alice"}
// VALID JSON
{"name": "Alice"}
Rule 6: Strings must use double quotes. Single quotes are not valid for string values.
Rule 7: No comments allowed. Standard JSON does not support // or /* */ comments. For config files that need comments, use JSONC, JSON5, YAML, or TOML.
Rule 8: No trailing commas. A comma after the last element in an object or array is a syntax error:
// INVALID - trailing comma
{"a": 1, "b": 2,}
// VALID
{"a": 1, "b": 2}
The Six JSON Data Types
JSON supports exactly six data types. Understanding each one is essential for working with JSON correctly.
String
A sequence of zero or more Unicode characters, wrapped in double quotes. Strings support escape sequences:
{
  "name": "Alice Johnson",
  "greeting": "Hello, \"World\"!",
  "path": "C:\\Users\\alice",
  "newline": "Line 1\nLine 2",
  "tab": "Col1\tCol2",
  "unicode": "Caf\u00e9"
}
Supported escape sequences: \", \\, \/, \b, \f, \n, \r, \t, and \uXXXX (a UTF-16 code unit; characters outside the Basic Multilingual Plane are written as surrogate pairs, e.g. \ud83d\ude00).
Number
Integers or floating-point values. Can be negative and use exponential notation. JSON does not distinguish between integer and float:
{
  "integer": 42,
  "negative": -17,
  "float": 3.14159,
  "exponent": 2.998e8,
  "negativeExponent": 6.674e-11
}
Important: JSON does not support NaN, Infinity, -Infinity, hexadecimal (0xFF), octal (0o77), or leading zeros (007).
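You can see this restriction in practice with Python's built-in json module, which (somewhat dangerously) emits JavaScript-style literals for these values unless you opt into strict output:

```python
import json

# By default Python emits non-standard literals that strict JSON
# parsers must reject.
print(json.dumps({"value": float("nan")}))   # {"value": NaN}  <- not valid JSON
print(json.dumps({"value": float("inf")}))   # {"value": Infinity}

# Pass allow_nan=False to enforce interoperable output instead.
try:
    json.dumps({"value": float("nan")}, allow_nan=False)
except ValueError as e:
    print("rejected:", e)
```

If your data can contain NaN or infinities, convert them to null (or a string) before serializing.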
Boolean
Either true or false (lowercase only, without quotes):
{"isActive": true, "isDeleted": false}
Null
Represents an empty or unknown value. Must be lowercase null without quotes:
{"middleName": null, "deletedAt": null}
Object
An unordered collection of key/value pairs enclosed in curly braces. Keys must be unique strings:
{
  "user": {
    "id": 1,
    "name": "Alice",
    "email": "alice@example.com"
  }
}
Array
An ordered collection of values enclosed in square brackets. Values do not need to be the same type:
{
  "tags": ["javascript", "python", "go"],
  "matrix": [[1, 2], [3, 4]],
  "mixed": [42, "hello", true, null, {"key": "value"}]
}
Explore JSON Structure Visually
Paste any JSON document into our JSON Viewer for a collapsible tree view with type highlighting, or use the JSON Path Finder to click on any value and get its exact path.
3. JSON vs XML vs YAML vs TOML
JSON, XML, YAML, and TOML are the four most common data serialization formats. Each has its strengths. Here is a detailed comparison with the same data represented in all four formats.
The Same Data in Four Formats
JSON:
{
  "server": {
    "host": "localhost",
    "port": 8080,
    "debug": true,
    "allowed_origins": ["https://example.com", "https://app.example.com"]
  }
}
XML:
<?xml version="1.0" encoding="UTF-8"?>
<server>
  <host>localhost</host>
  <port>8080</port>
  <debug>true</debug>
  <allowed_origins>
    <origin>https://example.com</origin>
    <origin>https://app.example.com</origin>
  </allowed_origins>
</server>
YAML:
server:
  host: localhost
  port: 8080
  debug: true
  allowed_origins:
    - https://example.com
    - https://app.example.com
TOML:
[server]
host = "localhost"
port = 8080
debug = true
allowed_origins = ["https://example.com", "https://app.example.com"]
Comparison Table
| Feature | JSON | XML | YAML | TOML |
|---|---|---|---|---|
| Readability | Good | Verbose | Excellent | Excellent |
| File size | Small | Large | Smallest | Small |
| Comments | Not supported | Supported | Supported (#) | Supported (#) |
| Data types | String, Number, Boolean, Null, Object, Array | Everything is text | String, Number, Boolean, Null, Object, Array, Date | String, Integer, Float, Boolean, DateTime, Array, Table |
| Parsing speed | Fast | Slow | Medium | Fast |
| Schema support | JSON Schema | XSD, DTD, RelaxNG | Limited | Limited |
| Whitespace sensitive | No | No | Yes (indentation) | No |
| Nesting depth | Unlimited | Unlimited | Unlimited | Awkward beyond 2-3 levels |
| Primary use case | APIs, data exchange | Documents, SOAP, legacy | Config files, DevOps | Config files (Cargo.toml, pyproject.toml) |
When to Use Each Format
- Use JSON for REST APIs, browser-server communication, NoSQL databases, and anywhere you need fast parsing with broad language support.
- Use XML for document-centric data, SOAP web services, RSS/Atom feeds, SVG graphics, and situations requiring attributes alongside content.
- Use YAML for configuration files (Docker Compose, Kubernetes, GitHub Actions, CI/CD pipelines), Ansible playbooks, and any file humans edit frequently.
- Use TOML for application configuration (Rust's Cargo.toml, Python's pyproject.toml, Hugo's config), especially when the structure is relatively flat.
Convert Between Formats
Use our JSON to YAML Converter or XML to JSON Converter for instant, accurate conversions. Read more in our Data Formats Comparison Guide.
4. Parsing JSON in 7 Languages
Every major programming language has built-in or standard library support for JSON. Here is how to parse and generate JSON in seven popular languages.
JavaScript
JavaScript has built-in support through the global JSON object:
// Parse JSON string to JavaScript object
const jsonString = '{"name": "Alice", "age": 30, "active": true}';
const user = JSON.parse(jsonString);
console.log(user.name); // "Alice"

// Convert JavaScript object to JSON string
const data = { name: "Bob", roles: ["admin", "editor"] };
const compact = JSON.stringify(data);
const pretty = JSON.stringify(data, null, 2);

// Reviver function: transform values during parsing
const json = '{"name": "Alice", "joinDate": "2026-01-15T10:00:00Z"}';
const parsed = JSON.parse(json, (key, value) => {
  if (key === "joinDate") return new Date(value);
  return value;
});

// Replacer function: filter or transform during stringification
const safe = JSON.stringify(user, (key, value) => {
  if (key === "password") return undefined; // omit sensitive fields
  return value;
}, 2);

// Fetch API with JSON
const response = await fetch("https://api.example.com/users/1");
const userData = await response.json();

const created = await fetch("https://api.example.com/users", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ name: "Charlie", email: "charlie@example.com" })
});
const result = await created.json();
Learn more about JSON.stringify() and JSON.parse() edge cases in our JSON.stringify and JSON.parse Deep Dive.
Python
Python includes a built-in json module. No installation required:
import json

# Parse JSON string
data = json.loads('{"name": "Alice", "age": 30}')
print(data["name"])  # Alice

# Generate JSON string
user = {"name": "Alice", "age": 30, "active": True, "manager": None}
compact = json.dumps(user)
pretty = json.dumps(user, indent=2, sort_keys=True)
# Python True -> JSON true, None -> null

# Read and write JSON files
with open("data.json", "w") as f:
    json.dump(user, f, indent=2)

with open("data.json", "r") as f:
    loaded = json.load(f)

# Custom serialization for datetime, set, etc.
from datetime import datetime

class CustomEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return obj.isoformat()
        if isinstance(obj, set):
            return list(obj)
        return super().default(obj)

event = {"name": "deploy", "timestamp": datetime.now(), "tags": {"prod", "v2"}}
print(json.dumps(event, cls=CustomEncoder, indent=2))
For deeper Python JSON patterns, see our JSON Parsing Guide.
Go
Go's encoding/json package uses struct tags for mapping:
package main

import (
    "encoding/json"
    "fmt"
)

type User struct {
    Name   string   `json:"name"`
    Age    int      `json:"age"`
    Active bool     `json:"active"`
    Roles  []string `json:"roles,omitempty"`
}

func main() {
    // Parse JSON into struct
    jsonStr := `{"name": "Alice", "age": 30, "active": true, "roles": ["admin"]}`
    var user User
    json.Unmarshal([]byte(jsonStr), &user)
    fmt.Println(user.Name) // Alice

    // Generate JSON from struct
    newUser := User{Name: "Bob", Age: 25, Active: true}
    output, _ := json.MarshalIndent(newUser, "", "  ")
    fmt.Println(string(output))

    // Parse unknown structure with map[string]interface{}
    var raw map[string]interface{}
    json.Unmarshal([]byte(jsonStr), &raw)
    fmt.Println(raw["name"]) // Alice

    // Streaming with json.Decoder (for large files/HTTP bodies)
    // decoder := json.NewDecoder(reader)
    // decoder.Decode(&user)
}
Generate Go Structs Automatically
Paste any JSON API response into our JSON to Go Converter and get perfectly tagged Go structs in one click.
Java
Java uses libraries like Jackson (most popular), Gson, or the newer jakarta.json API:
// Using Jackson (most popular Java JSON library)
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.util.Map;

public class JsonExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Parse JSON to object
        String json = "{\"name\": \"Alice\", \"age\": 30}";
        User user = mapper.readValue(json, User.class);
        System.out.println(user.getName()); // Alice

        // Generate JSON from object
        User newUser = new User("Bob", 25);
        String output = mapper.writerWithDefaultPrettyPrinter()
                .writeValueAsString(newUser);

        // Parse to generic Map
        Map<String, Object> map = mapper.readValue(json,
                new TypeReference<Map<String, Object>>() {});

        // Read from file
        User fromFile = mapper.readValue(new File("data.json"), User.class);

        // Write to file
        mapper.writeValue(new File("output.json"), newUser);
    }
}

// POJO class
class User {
    private String name;
    private int age;

    // Constructors, getters, setters...

    @JsonProperty("name")
    public String getName() { return name; }
}
C#
C# uses System.Text.Json (built-in since .NET Core 3.0) or the popular Newtonsoft.Json:
using System.Text.Json;

// Parse JSON to object
string json = "{\"name\": \"Alice\", \"age\": 30, \"active\": true}";
User user = JsonSerializer.Deserialize<User>(json);
Console.WriteLine(user.Name); // Alice

// Generate JSON from object (positional records use constructor syntax)
var newUser = new User("Bob", 25, true);
string output = JsonSerializer.Serialize(newUser, new JsonSerializerOptions {
    WriteIndented = true,
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase
});

// Parse to dynamic object (JsonDocument)
using JsonDocument doc = JsonDocument.Parse(json);
JsonElement root = doc.RootElement;
string name = root.GetProperty("name").GetString();
int age = root.GetProperty("age").GetInt32();

// Newtonsoft.Json (popular alternative)
// using Newtonsoft.Json;
// var user = JsonConvert.DeserializeObject<User>(json);
// string output = JsonConvert.SerializeObject(user, Formatting.Indented);

// C# record class
public record User(string Name, int Age, bool Active);
Generate C# Classes from JSON
Use our JSON to C# Converter to generate properly typed C# classes and records from any JSON payload.
Ruby
Ruby includes JSON support in its standard library:
require 'json'
# Parse JSON string
json_string = '{"name": "Alice", "age": 30, "active": true}'
data = JSON.parse(json_string)
puts data["name"] # Alice
# Parse with symbol keys
data = JSON.parse(json_string, symbolize_names: true)
puts data[:name] # Alice
# Generate JSON
user = { name: "Bob", age: 25, roles: ["admin", "editor"] }
compact = JSON.generate(user)
pretty = JSON.pretty_generate(user)
# Read and write JSON files
File.write("data.json", JSON.pretty_generate(user))
loaded = JSON.parse(File.read("data.json"))
# Custom serialization with to_json
class User
  attr_accessor :name, :age

  def to_json(*args)
    { name: @name, age: @age }.to_json(*args)
  end
end
PHP
PHP has built-in json_encode() and json_decode() functions:
<?php
// Parse JSON string
$json = '{"name": "Alice", "age": 30, "active": true}';
$data = json_decode($json); // Returns stdClass object
echo $data->name; // Alice
$data = json_decode($json, true); // Returns associative array
echo $data["name"]; // Alice
// Generate JSON
$user = ["name" => "Bob", "age" => 25, "roles" => ["admin"]];
$compact = json_encode($user);
$pretty = json_encode($user, JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE);
// Error handling
$invalid = json_decode("not json");
if (json_last_error() !== JSON_ERROR_NONE) {
    echo "JSON error: " . json_last_error_msg();
}

// PHP 7.3+ throws exceptions
try {
    $data = json_decode($json, true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    echo "Parse error: " . $e->getMessage();
}
// Read and write JSON files
file_put_contents("data.json", json_encode($user, JSON_PRETTY_PRINT));
$loaded = json_decode(file_get_contents("data.json"), true);
?>
Generate Types for Any Language
Our conversion tools generate types from JSON for multiple languages: TypeScript, Go, C#, Python, and Dart.
5. JSON Schema Validation
JSON Schema is a vocabulary for annotating and validating JSON documents. It describes the structure, types, and constraints that a JSON document must satisfy. Think of it as a type system and contract for your JSON data.
Why Use JSON Schema?
- Validation — Ensure incoming data matches the expected format before processing it.
- Documentation — The schema serves as machine-readable documentation of your data format.
- Code generation — Generate types, classes, and forms from schema definitions.
- API contracts — Define what request and response bodies should look like (used by OpenAPI/Swagger).
A Complete JSON Schema Example
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "User",
  "description": "A user account in the system",
  "type": "object",
  "required": ["name", "email", "age"],
  "properties": {
    "name": {
      "type": "string",
      "minLength": 1,
      "maxLength": 100,
      "description": "The user's full name"
    },
    "email": {
      "type": "string",
      "format": "email",
      "description": "The user's email address"
    },
    "age": {
      "type": "integer",
      "minimum": 0,
      "maximum": 150,
      "description": "The user's age in years"
    },
    "role": {
      "type": "string",
      "enum": ["admin", "editor", "viewer"],
      "default": "viewer"
    },
    "tags": {
      "type": "array",
      "items": { "type": "string" },
      "uniqueItems": true,
      "maxItems": 10
    },
    "address": {
      "type": "object",
      "properties": {
        "street": { "type": "string" },
        "city": { "type": "string" },
        "zip": { "type": "string", "pattern": "^[0-9]{5}$" }
      },
      "required": ["street", "city"]
    }
  },
  "additionalProperties": false
}
Common Schema Keywords
- type — Expected data type: "string", "number", "integer", "boolean", "object", "array", "null".
- required — Array of property names that must be present.
- properties — Defines schemas for each property of an object.
- items — Defines the schema for items in an array.
- enum — An array of allowed values.
- minimum / maximum — Numeric range constraints.
- minLength / maxLength — String length constraints.
- pattern — Regular expression for string validation.
- format — Semantic validation: "email", "uri", "date-time", "ipv4", "uuid".
- additionalProperties — Whether extra properties are allowed.
- $ref — Reference to another schema (composition and reuse).
- oneOf / anyOf / allOf — Combine multiple schemas with logical operators.
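To illustrate the composition keywords, here is a small hypothetical schema that reuses an address definition via $ref and accepts two alternative shapes for a contact field via oneOf:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$defs": {
    "address": {
      "type": "object",
      "properties": {
        "street": { "type": "string" },
        "city": { "type": "string" }
      },
      "required": ["street", "city"]
    }
  },
  "type": "object",
  "properties": {
    "home": { "$ref": "#/$defs/address" },
    "work": { "$ref": "#/$defs/address" },
    "contact": {
      "oneOf": [
        { "type": "string", "format": "email" },
        { "type": "object", "properties": { "phone": { "type": "string" } } }
      ]
    }
  }
}
```

Defining address once under $defs keeps the two usages in sync; oneOf requires that exactly one of the alternative schemas matches.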
Validating JSON Against a Schema
JavaScript (Ajv — the fastest JSON Schema validator):
import Ajv from "ajv";
import addFormats from "ajv-formats";
const ajv = new Ajv();
addFormats(ajv); // adds "email", "uri", "date-time", etc.
const validate = ajv.compile(schema);
const valid = validate(data);
if (!valid) {
  console.log(validate.errors);
  // [{ instancePath: '/age', message: 'must be integer' }]
}
Python (jsonschema):
from jsonschema import validate, ValidationError

schema = {
    "type": "object",
    "required": ["name", "email"],
    "properties": {
        "name": {"type": "string", "minLength": 1},
        "email": {"type": "string", "format": "email"}
    }
}

# Note: "format" is annotation-only unless you pass a format_checker;
# the minLength violation below fails validation either way.
try:
    validate(instance={"name": "", "email": "invalid"}, schema=schema)
except ValidationError as e:
    print(f"Validation error: {e.message}")
Validate JSON Schema Online
Test your schemas instantly with our JSON Schema Validator — it supports JSON Schema Draft 2020-12 and gives clear, actionable error messages.
6. JSON Path Expressions
JSONPath is a query language for JSON, similar to XPath for XML. It lets you extract specific values from complex JSON structures using path expressions. JSONPath was formalized in RFC 9535 (published February 2024).
JSONPath Syntax
A JSONPath expression always starts with $, which represents the root of the JSON document:
// Given this JSON:
{
  "store": {
    "books": [
      {"title": "The Great Gatsby", "price": 8.99, "author": "Fitzgerald"},
      {"title": "1984", "price": 6.99, "author": "Orwell"},
      {"title": "Dune", "price": 12.99, "author": "Herbert"},
      {"title": "Neuromancer", "price": 9.99, "author": "Gibson"}
    ],
    "location": "Main Street"
  }
}
Common JSONPath Expressions
| Expression | Description | Result |
|---|---|---|
| $.store.location | Dot notation for property access | "Main Street" |
| $.store.books[0] | Array index (zero-based) | First book object |
| $.store.books[-1] | Negative index (last element) | Last book object |
| $.store.books[0:2] | Array slice (index 0 and 1) | First two books |
| $.store.books[*].title | Wildcard — all titles | All book titles |
| $..author | Recursive descent — all authors at any depth | All author values |
| $.store.books[?@.price < 10] | Filter expression | Books cheaper than $10 |
| $.store.books[?@.author == 'Orwell'] | Filter by value equality | Books by Orwell |
Using JSONPath in Code
JavaScript (jsonpath-plus):
import { JSONPath } from "jsonpath-plus";
const data = { store: { books: [/* ... */] } };
// Get all titles
const titles = JSONPath({ path: "$.store.books[*].title", json: data });
// Filter books under $10
const cheap = JSONPath({ path: "$.store.books[?(@.price < 10)]", json: data });
Python (jsonpath-ng):
from jsonpath_ng.ext import parse
data = {"store": {"books": [...]}}
# Get all titles
expr = parse("$.store.books[*].title")
titles = [match.value for match in expr.find(data)]
# Get books by specific author
expr = parse("$.store.books[?author = 'Orwell']")
orwell_books = [match.value for match in expr.find(data)]
Command line (jq):
# jq uses its own syntax (not JSONPath, but similar power)
# Get all titles
jq '.store.books[].title' data.json
# Filter books under $10
jq '.store.books[] | select(.price < 10)' data.json
# Get unique authors
jq '[.store.books[].author] | unique' data.json
Find JSON Paths Visually
Our JSON Path Finder lets you click on any value in a JSON tree and instantly get the JSONPath expression, dot notation, and bracket notation for that value.
7. Common JSON Mistakes and Fixes
Even experienced developers run into JSON errors. Here are the most common mistakes with clear explanations and fixes.
Mistake 1: Trailing Commas
// BROKEN - trailing comma after last property
{
  "name": "Alice",
  "age": 30,
}

// FIX: Remove the trailing comma
{
  "name": "Alice",
  "age": 30
}
This is the single most common JSON error. JavaScript allows trailing commas, but JSON does not.
Mistake 2: Single Quotes
// BROKEN - single quotes are not valid
{'name': 'Alice'}
// FIX: Use double quotes for all keys and string values
{"name": "Alice"}
Mistake 3: Unquoted Keys
// BROKEN - valid JavaScript, but invalid JSON
{name: "Alice", age: 30}
// FIX: Quote all keys with double quotes
{"name": "Alice", "age": 30}
Mistake 4: Comments
// BROKEN - JSON does not support comments
{
  // user's name
  "name": "Alice" /* admin user */
}

// FIX: Remove all comments
{
  "name": "Alice"
}
If you need comments in config files, use JSONC (VS Code convention), JSON5, YAML, or TOML instead.
Mistake 5: Unescaped Special Characters
// BROKEN - \n is interpreted as newline, \f as form feed
{"path": "C:\new\folder"}
// FIX: Escape backslashes
{"path": "C:\\new\\folder"}
Mistake 6: Using undefined
// BROKEN - undefined is not a JSON value
{"status": undefined}
// FIX: Use null instead
{"status": null}
Mistake 7: Missing Brackets
// BROKEN - missing closing bracket
{
  "users": [
    {"name": "Alice"},
    {"name": "Bob"}
}

// FIX: Match every opening bracket
{
  "users": [
    {"name": "Alice"},
    {"name": "Bob"}
  ]
}
Mistake 8: Duplicate Keys
// PROBLEMATIC - behavior is undefined per the JSON spec
{
  "name": "Alice",
  "age": 30,
  "name": "Bob"
}

// FIX: Each key must be unique within its object
{
  "firstName": "Alice",
  "lastName": "Bob",
  "age": 30
}
Most parsers will silently keep the last value, but this is not guaranteed. Use our JSON Validator to detect duplicate keys.
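Python's built-in parser, for instance, follows the common last-value-wins behavior. If you would rather fail loudly, the object_pairs_hook parameter sees every key/value pair before duplicates collapse into a dict. A minimal sketch:

```python
import json

# Default behavior: the last value silently wins.
data = json.loads('{"name": "Alice", "age": 30, "name": "Bob"}')
print(data)  # {'name': 'Bob', 'age': 30}

# object_pairs_hook receives the raw pairs, so duplicates are detectable.
def reject_duplicates(pairs):
    keys = [k for k, _ in pairs]
    if len(keys) != len(set(keys)):
        raise ValueError(f"duplicate keys: {sorted(set(k for k in keys if keys.count(k) > 1))}")
    return dict(pairs)

try:
    json.loads('{"name": "Alice", "name": "Bob"}', object_pairs_hook=reject_duplicates)
except ValueError as e:
    print("rejected:", e)
```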
Mistake 9: Number Precision Loss
// PROBLEMATIC in JavaScript - exceeds Number.MAX_SAFE_INTEGER
{"id": 9007199254740993}
// After JSON.parse(): id becomes 9007199254740992 (silently rounded!)
// FIX: Use string IDs for large numbers
{"id": "9007199254740993"}
// Or use the TC39 "JSON.parse source text access" reviver context
// (available in recent V8-based engines) to read the raw digits:
// JSON.parse(text, (key, val, { source }) => {
//   if (key === "id") return BigInt(source);
//   return val;
// });
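Languages with arbitrary-precision integers do not share this problem, which is why a payload can round-trip cleanly through a Python service yet lose precision the moment a browser parses it:

```python
import json

# Python ints are arbitrary precision, so the value survives a round trip.
data = json.loads('{"id": 9007199254740993}')
print(data["id"])        # 9007199254740993 (exact)
print(json.dumps(data))  # {"id": 9007199254740993}

# The hazard is the JavaScript consumer: 2**53 + 1 cannot be represented
# as an IEEE 754 double, which is all a JS number is.
assert data["id"] == 2**53 + 1
```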
Debug JSON Errors Instantly
Paste broken JSON into our JSON Formatter — it pinpoints the exact line and character where the error occurs. To compare two JSON documents, use our JSON Diff tool.
8. JSON APIs and REST Conventions
JSON is the standard data format for REST APIs. Following established conventions makes your APIs consistent and easier to consume.
Standard Content-Type Headers
// Request
Content-Type: application/json
Accept: application/json
// Response
Content-Type: application/json; charset=utf-8
Always set Content-Type: application/json when sending JSON. The MIME type application/json was registered in RFC 4627 and is the only correct type for JSON. Do not use text/json or text/plain.
Response Envelope Pattern
Wrap responses in a consistent envelope so clients know what to expect:
// Success response
{
  "status": "success",
  "data": {
    "id": 42,
    "name": "Alice",
    "email": "alice@example.com"
  }
}

// List response with pagination
{
  "status": "success",
  "data": [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bob"}
  ],
  "meta": {
    "page": 1,
    "perPage": 20,
    "totalPages": 5,
    "totalItems": 93
  }
}

// Error response
{
  "status": "error",
  "error": {
    "code": "VALIDATION_FAILED",
    "message": "Request validation failed",
    "details": [
      {"field": "email", "message": "must be a valid email address"},
      {"field": "age", "message": "must be a positive integer"}
    ]
  }
}
Naming Conventions
Choose one convention and use it consistently across your entire API:
// camelCase (most common for JavaScript/TypeScript APIs)
{"firstName": "Alice", "lastName": "Johnson", "createdAt": "2026-02-11T10:30:00Z"}
// snake_case (common for Python/Ruby APIs, used by GitHub, Stripe, Twitter)
{"first_name": "Alice", "last_name": "Johnson", "created_at": "2026-02-11T10:30:00Z"}
Whichever you choose, never mix conventions within the same API.
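When a snake_case backend must serve camelCase clients, teams often convert keys once at the API boundary. A minimal, illustrative recursive converter (the function names here are ours, not a standard API):

```python
def snake_to_camel(key: str) -> str:
    """Convert snake_case to camelCase: "first_name" -> "firstName"."""
    head, *rest = key.split("_")
    return head + "".join(part.capitalize() for part in rest)

def camelize(value):
    """Recursively rewrite dict keys; lists and scalars pass through."""
    if isinstance(value, dict):
        return {snake_to_camel(k): camelize(v) for k, v in value.items()}
    if isinstance(value, list):
        return [camelize(v) for v in value]
    return value

print(camelize({"first_name": "Alice", "tags": [{"created_at": "2026-02-11"}]}))
# {'firstName': 'Alice', 'tags': [{'createdAt': '2026-02-11'}]}
```

Doing the conversion in one shared middleware layer, rather than per endpoint, is what keeps a large API from drifting into mixed conventions.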
Date and Time Formatting
Always use ISO 8601 format with timezone information:
// GOOD: ISO 8601 with timezone
{"createdAt": "2026-02-11T14:30:00Z"}
{"date": "2026-02-11"}
// BAD: Ambiguous or non-standard formats
{"createdAt": "02/11/2026"} // US? International?
{"createdAt": "11 Feb 2026"} // Not machine-friendly
{"createdAt": 1739284200} // Unix timestamp - not human-readable
HTTP Status Codes with JSON
200 OK - Successful GET, PUT, PATCH, or DELETE
201 Created - Successful POST that created a resource
204 No Content - Successful DELETE with no response body
400 Bad Request - Invalid JSON or validation failure
401 Unauthorized - Missing or invalid authentication
403 Forbidden - Authenticated but not authorized
404 Not Found - Resource does not exist
409 Conflict - Duplicate or conflicting state
422 Unprocessable Content - Valid JSON but semantic errors
429 Too Many Requests - Rate limit exceeded
500 Internal Server Error - Unexpected server failure
Null vs Absent Fields
Decide on a convention and document it. Both approaches are valid:
// Convention 1: Include null values (explicit)
{"name": "Alice", "middleName": null, "phone": null}
// Convention 2: Omit empty fields (compact, saves bandwidth)
{"name": "Alice"}
// For PATCH operations, distinguish between:
// - Field absent: don't change it
// - Field present with null: clear the value
// - Field present with value: update to new value
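The three PATCH cases reduce to checking key presence rather than value truthiness. A hypothetical handler sketch (the field names and apply_patch helper are illustrative, not a standard API):

```python
def apply_patch(resource: dict, patch: dict, allowed: set) -> dict:
    """Apply a JSON merge-style patch, distinguishing absent from null."""
    updated = dict(resource)
    for field in allowed:
        if field not in patch:
            continue                       # absent -> don't change it
        if patch[field] is None:
            updated[field] = None          # explicit null -> clear the value
        else:
            updated[field] = patch[field]  # present -> update to new value
    return updated

user = {"name": "Alice", "middleName": "J", "phone": "555-0100"}
patch = {"middleName": None, "phone": "555-0199"}  # "name" is absent
print(apply_patch(user, patch, {"name", "middleName", "phone"}))
# {'name': 'Alice', 'middleName': None, 'phone': '555-0199'}
```

This is essentially the semantics of JSON Merge Patch (RFC 7386), minus its rule of deleting nulled keys outright.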
HATEOAS and Links
Include links to related resources for discoverability:
{
  "id": 42,
  "name": "Alice",
  "links": {
    "self": "/api/users/42",
    "posts": "/api/users/42/posts",
    "avatar": "/api/users/42/avatar"
  }
}
9. JSON Performance Tips
JSON is fast by default, but when you are dealing with large payloads, high-throughput APIs, or resource-constrained environments, these optimization techniques matter.
Enable Compression
JSON compresses exceptionally well because of its repetitive structure. Always enable gzip or brotli compression for JSON API responses:
# Typical compression results:
# Original JSON: 10 MB
# Gzipped: 1.2 MB (88% reduction)
# Brotli: 0.9 MB (91% reduction)
# Express.js
const compression = require("compression");
app.use(compression());
# Nginx
gzip on;
gzip_types application/json;
gzip_min_length 256;
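The figures above are illustrative; actual ratios depend on the payload. You can measure them directly with Python's standard library:

```python
import gzip
import json

# Repetitive structure (the same keys on every record) is exactly
# what DEFLATE-style compression exploits.
records = [{"id": i, "name": f"user{i}", "active": True} for i in range(5000)]
raw = json.dumps(records).encode("utf-8")
compressed = gzip.compress(raw)

ratio = 100 * (1 - len(compressed) / len(raw))
print(f"raw: {len(raw):,} bytes, gzipped: {len(compressed):,} bytes "
      f"({ratio:.0f}% reduction)")
```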
Use Streaming Parsers for Large Files
Standard parsers load the entire document into memory. For large files, use streaming parsers that process tokens incrementally:
Python (ijson — iterative JSON parser):
import ijson

# Process a 10GB JSON file without loading it all into memory
with open("huge_dataset.json", "rb") as f:
    for item in ijson.items(f, "records.item"):
        process(item)  # Each item is parsed individually

# Memory usage stays constant regardless of file size
Node.js (stream-json):
const { parser } = require("stream-json");
const { streamArray } = require("stream-json/streamers/StreamArray");
const fs = require("fs");
const pipeline = fs.createReadStream("huge_dataset.json")
  .pipe(parser())
  .pipe(streamArray());

pipeline.on("data", ({ value }) => {
  process(value); // Handle each array element individually
});
Go (json.Decoder for streaming):
file, _ := os.Open("huge_dataset.json")
defer file.Close()

decoder := json.NewDecoder(file)

// Read opening bracket
decoder.Token() // [

for decoder.More() {
    var item Record
    decoder.Decode(&item)
    process(item)
}

// Read closing bracket
decoder.Token() // ]
Minimize Payload Size
- Shorten key names in high-volume APIs (e.g., "n" instead of "name") — but only when bandwidth is a real bottleneck.
- Omit null values instead of including them explicitly.
- Use pagination to limit response sizes (?page=1&limit=50).
- Support field selection: GET /api/users?fields=name,email
- Minify JSON in production — remove whitespace with JSON.stringify(data) (no spacing parameter).
Use Faster Parsers
Drop-in replacements that are significantly faster than built-in parsers:
- Python: orjson (3-10x faster than json), ujson (2-5x faster)
- Node.js: simdjson (uses SIMD CPU instructions), fast-json-stringify (2-5x faster for serialization)
- Java: Jackson with the afterburner module, or dsl-json
- Go: jsoniter or sonic (5-10x faster than encoding/json)
# Python: orjson (fastest JSON library for Python)
import orjson
data = orjson.loads(json_bytes) # Parse (accepts bytes)
output = orjson.dumps(data, option=orjson.OPT_INDENT_2) # Serialize
Consider Binary Alternatives for Internal Services
For microservice-to-microservice communication where human readability is not needed, binary formats like Protocol Buffers, MessagePack, CBOR, or Avro offer better performance. Use JSON at your API boundary, binary formats internally.
Sort and Organize JSON
Use our JSON Sorter to alphabetize keys for consistent output and smaller diffs, or convert large JSON datasets to CSV with our JSON to CSV Converter.
10. JSON Lines (JSONL) Format
JSON Lines (also called JSONL, Newline-Delimited JSON, or NDJSON) is a format where each line is a separate, valid JSON value. It solves many of the problems with using standard JSON for large datasets.
What JSON Lines Looks Like
{"id": 1, "name": "Alice", "score": 95, "active": true}
{"id": 2, "name": "Bob", "score": 87, "active": true}
{"id": 3, "name": "Charlie", "score": 92, "active": false}
{"id": 4, "name": "Diana", "score": 78, "active": true}
Each line is a complete, valid JSON object. Lines are separated by newline characters (\n). There is no wrapping array, no commas between records, and no enclosing brackets.
Why Use JSONL Instead of Regular JSON?
- Streaming — Process records one at a time without parsing the entire file. You can start processing before the file is fully downloaded.
- Append-friendly — Add new records by simply appending a line. With regular JSON arrays, you would need to parse the entire file, add the record, and rewrite it.
- Memory efficient — Process gigabyte files with constant memory usage by reading one line at a time.
- Unix-friendly — Works perfectly with head, tail, wc -l, grep, sort, and other standard Unix tools.
- Parallel processing — Split the file at newline boundaries for parallel processing across multiple cores or machines.
- Fault tolerant — If a write fails partway through, you lose at most one record. With a regular JSON array, the entire file could become corrupt.
Working with JSONL
Python:
import json
# Read JSONL
records = []
with open("data.jsonl") as f:
for line in f:
records.append(json.loads(line))
# Write JSONL
with open("output.jsonl", "w") as f:
for record in records:
f.write(json.dumps(record) + "\n")
# Append to existing JSONL file
with open("data.jsonl", "a") as f:
f.write(json.dumps({"id": 5, "name": "Eve"}) + "\n")
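The read loop above accumulates every record in a list. For files too large to fit in memory, a generator keeps memory usage constant — a minimal sketch:

```python
import json

def iter_jsonl(path):
    """Yield parsed records one at a time, keeping memory usage constant."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # tolerate blank lines
                yield json.loads(line)

# Usage: filter a huge file without ever loading it all at once
# high_scores = [r for r in iter_jsonl("data.jsonl") if r["score"] > 90]
```

Because records are yielded lazily, only one parsed record lives in memory at a time, no matter how large the file is.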
Node.js:
const fs = require("fs");
const readline = require("readline");
async function readJSONL(filePath) {
const records = [];
const rl = readline.createInterface({
input: fs.createReadStream(filePath),
crlfDelay: Infinity
});
for await (const line of rl) {
if (line.trim()) {
records.push(JSON.parse(line));
}
}
return records;
}
// Write JSONL
function writeJSONL(filePath, records) {
const content = records.map(r => JSON.stringify(r)).join("\n") + "\n";
fs.writeFileSync(filePath, content);
}
Command line:
# Count records
wc -l data.jsonl
# View first 5 records
head -5 data.jsonl | jq .
# Filter records with jq
cat data.jsonl | jq -c 'select(.score > 90)'
# Extract a single field
cat data.jsonl | jq -r '.name'
# Convert JSONL to regular JSON array
jq -s '.' data.jsonl > data.json
# Convert JSON array to JSONL
jq -c '.[]' data.json > data.jsonl
Where JSONL is Used
- OpenAI API — Training data for fine-tuning models must be in JSONL format.
- Elasticsearch — The bulk API uses NDJSON format for batch indexing.
- Structured logging — Each log line is a JSON object, easily ingested by tools like Datadog, Splunk, and the ELK stack.
- Data pipelines — Apache Spark, Pandas, and DuckDB all support JSONL natively.
- Web scraping — Scrapy exports results in JSONL format by default.
11. JSON-LD for SEO and Structured Data
JSON-LD (JSON for Linking Data) is a method of encoding structured data using JSON syntax. It is Google's recommended format for adding structured data to web pages, powering rich results in search engine results pages (SERPs).
What is JSON-LD?
JSON-LD extends regular JSON with a few special keywords that give meaning to data:
- @context — Defines the vocabulary being used (usually https://schema.org).
- @type — Specifies the type of entity being described (e.g., Person, Product, Article).
- @id — A unique identifier for the entity.
JSON-LD is embedded in HTML pages using a <script type="application/ld+json"> tag. Search engines read this structured data to understand the content and display enhanced search results.
Common JSON-LD Types for SEO
Article / BlogPosting:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "BlogPosting",
"headline": "JSON: The Complete Developer's Reference Guide",
"author": {
"@type": "Organization",
"name": "DevToolbox"
},
"datePublished": "2026-02-11",
"dateModified": "2026-02-11",
"image": "https://example.com/og/json-guide.png",
"publisher": {
"@type": "Organization",
"name": "DevToolbox",
"logo": {
"@type": "ImageObject",
"url": "https://example.com/logo.png"
}
}
}
</script>
FAQPage (triggers FAQ rich results in Google):
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "FAQPage",
"mainEntity": [
{
"@type": "Question",
"name": "What is JSON?",
"acceptedAnswer": {
"@type": "Answer",
"text": "JSON (JavaScript Object Notation) is a lightweight data format..."
}
},
{
"@type": "Question",
"name": "Is JSON better than XML?",
"acceptedAnswer": {
"@type": "Answer",
"text": "For most modern use cases like REST APIs, JSON is preferred..."
}
}
]
}
</script>
Product (triggers product rich results):
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "Product",
"name": "Wireless Headphones",
"description": "Premium noise-cancelling headphones",
"brand": { "@type": "Brand", "name": "SoundMax" },
"offers": {
"@type": "Offer",
"price": "79.99",
"priceCurrency": "USD",
"availability": "https://schema.org/InStock"
},
"aggregateRating": {
"@type": "AggregateRating",
"ratingValue": "4.5",
"reviewCount": "127"
}
}
</script>
BreadcrumbList (triggers breadcrumb navigation in SERPs):
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "BreadcrumbList",
"itemListElement": [
{"@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/"},
{"@type": "ListItem", "position": 2, "name": "Blog", "item": "https://example.com/blog"},
{"@type": "ListItem", "position": 3, "name": "JSON Guide"}
]
}
</script>
Other Common Types
- HowTo — Step-by-step instructions (rich result with steps).
- SoftwareApplication — App listings with ratings and pricing.
- Organization — Company info, logo, and social profiles (knowledge panel).
- LocalBusiness — Business hours, address, phone (local SEO).
- Recipe — Cooking time, ingredients, nutritional info.
- Event — Date, location, ticket availability.
- VideoObject — Video thumbnails and metadata in search results.
JSON-LD Best Practices
- Place JSON-LD in the <head> or <body> of your HTML — Google reads both.
- Use Google's Rich Results Test to validate your structured data.
- Do not mark up content that is not visible on the page — Google penalizes hidden structured data.
- Keep structured data in sync with visible page content.
- Use multiple JSON-LD blocks on the same page for different entity types.
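Since JSON-LD is plain JSON, a common pattern is to generate the blocks from your page data rather than hand-writing them. A minimal Python sketch (the helper name and the sample question are placeholders, not a real library API):

```python
import json

def faq_jsonld(questions):
    """Render an FAQPage JSON-LD <script> block from (question, answer) pairs."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in questions
        ],
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

tag = faq_jsonld([("What is JSON?", "A lightweight data interchange format.")])
```

In production, also escape any `</` sequence inside string values so a malicious answer cannot close the script tag early.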
12. Tools for Working with JSON
Having the right tools makes working with JSON faster and less error-prone. Here is a comprehensive roundup organized by category.
Online JSON Tools (DevToolbox)
DevToolbox provides a complete suite of free, browser-based JSON tools. All processing happens client-side — your data never leaves your browser.
JSON Type Generators
Generate typed data structures from JSON for any programming language.
Command-Line Tools
- jq — The Swiss Army knife for JSON. Filter, transform, and query JSON from the command line. Essential for scripting and data pipelines.
- fx — Interactive JSON viewer for the terminal with JavaScript expression support.
- gron — Converts JSON to discrete assignments, making it greppable.
- jless — A command-line JSON viewer with vim keybindings.
- dasel — Select, put, and delete data from JSON, YAML, TOML, and XML using a unified syntax.
# jq essentials
jq '.' data.json # Pretty-print
jq '.users[0].name' data.json # Get a value
jq '.users[] | {name, age}' data.json # Project fields
jq '.users | length' data.json # Count items
jq '.users[] | select(.age > 30)' data.json # Filter
jq '[.users[].country] | unique' data.json # Unique values
jq -r '.users[].name' data.json # Raw output (no quotes)
IDE and Editor Support
- VS Code — Built-in JSON formatting, validation, schema support, and IntelliSense for known schemas (package.json, tsconfig.json). Supports JSONC for config files with comments.
- JetBrains IDEs — JSON schema validation, structural search, and refactoring support.
- Vim/Neovim — Use :%!jq . for formatting or :%!python -m json.tool.
Frequently Asked Questions
What is the difference between JSON and a JavaScript object?
JSON is a text-based data interchange format with strict syntax rules: all keys must be double-quoted strings, no trailing commas, no comments, no undefined values, and no functions. A JavaScript object is a runtime data structure that allows unquoted keys, single-quoted strings, trailing commas, methods, computed properties, and Symbol keys. JSON is a subset of JavaScript object literal notation. You convert between them using JSON.parse() and JSON.stringify().
Why does JSON not allow comments or trailing commas?
Douglas Crockford deliberately excluded comments from the JSON specification because he observed that people were using comments to hold parsing directives, which would have destroyed interoperability between different implementations. Trailing commas were excluded to keep the grammar as simple as possible and avoid ambiguity. If you need comments in configuration files, consider JSON5, JSONC (used by VS Code and TypeScript), YAML, or TOML instead.
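You can see this strictness directly: Python's json module, like every spec-compliant parser, rejects trailing commas, comments, and single quotes:

```python
import json

invalid_documents = [
    '{"a": 1,}',             # trailing comma
    '{"a": 1}  // comment',  # comment after the value
    "{'a': 1}",              # single-quoted key
]

for doc in invalid_documents:
    try:
        json.loads(doc)
    except json.JSONDecodeError as err:
        print(f"rejected {doc!r}: {err.msg}")
```

Each document raises json.JSONDecodeError, which is exactly the interoperability guarantee the strict grammar buys you.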
How do I handle large JSON files that don't fit in memory?
For large JSON files, use streaming parsers instead of loading the entire file into memory. In Python, use the ijson library. In Node.js, use stream-json or JSONStream. In Go, use json.Decoder. Alternatively, convert your data to JSON Lines (JSONL) format where each line is a separate JSON object, enabling line-by-line processing with constant memory usage. For command-line work, jq processes large JSON files efficiently. Also enable gzip or brotli compression, which typically reduces JSON size by 70-90%.
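The compression point is easy to verify with the standard library. Repetitive JSON — the same key names in every record — compresses very well, though the exact ratio depends on your data:

```python
import gzip
import json

# Repetitive records: every object repeats the same key names
records = [{"id": i, "name": f"user{i}", "active": i % 2 == 0} for i in range(1000)]
raw = json.dumps(records).encode("utf-8")
compressed = gzip.compress(raw)

print(f"{len(raw)} bytes raw, {len(compressed)} gzipped "
      f"({100 * (1 - len(compressed) / len(raw)):.0f}% smaller)")
```

On the wire this is usually free to enable: most HTTP servers and clients negotiate gzip or brotli automatically via the Accept-Encoding header.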
Conclusion
JSON is a foundational technology in modern software development. Its simplicity — just six data types and two structures — is its greatest strength. Whether you are building REST APIs, configuring applications, storing data in document databases, exchanging messages between microservices, or adding structured data to web pages for SEO, JSON is almost certainly part of your workflow.
The key takeaways from this guide:
- Syntax is strict — double-quoted keys, no trailing commas, no comments, no single quotes.
- Six data types — string, number, boolean, null, object, array. Know how they map to your language.
- Use JSON Schema to validate data at system boundaries.
- JSONPath provides XPath-like querying for extracting values from complex structures.
- Follow REST conventions — consistent naming, ISO 8601 dates, proper status codes, response envelopes.
- For large files, use streaming parsers or JSON Lines format.
- JSON-LD adds semantic meaning for search engines — use it for SEO structured data.
- Always validate incoming JSON before processing it.
With the fundamentals covered in this guide and the right tools at your disposal, you can work with JSON confidently in any context, in any programming language, at any scale.
Start Working with JSON
Put this knowledge into practice with our complete suite of JSON tools. Format, validate, convert, compare, and generate typed code — all free, all client-side, no installation required.