JSON Blog & Tutorials

Guides, tips and best practices for working with JSON.

NEW — INTERACTIVE

JSON Playground & Tutorial for Beginners

Published: March 14, 2026 | 20 min interactive

Learn JSON from zero with 7 hands-on lessons — live sandbox, fix-the-bug exercises, quizzes, and a cheat sheet. No setup, no signup.

Start the Playground →

Complete Guide to JSON Validation: Best Practices & Common Errors

Published: February 25, 2026 | 8 min read

Learn how to properly validate JSON data, understand common syntax errors, and follow best practices for JSON structure. This comprehensive guide covers everything from basic validation to advanced schema validation.

Why JSON Validation Matters

JSON (JavaScript Object Notation) has become the standard for data interchange between web services. Invalid JSON can break your entire application, cause API failures, and lead to security vulnerabilities. Proper validation ensures data integrity and prevents runtime errors.

Common JSON Errors

  • Missing Commas: Forgetting commas between key-value pairs or array elements
  • Trailing Commas: Extra commas after the last element (not allowed in strict JSON)
  • Single Quotes: Using single quotes instead of double quotes for strings
  • Unquoted Keys: Object keys must always be quoted in JSON
  • Invalid Values: Using undefined, NaN, or functions (not valid JSON)
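
Each of these mistakes makes JSON.parse throw a SyntaxError, which is a quick way to see them in practice (the sample strings below are ours):

```javascript
// Four common syntax errors, each rejected by JSON.parse.
const samples = [
  '{"a": 1 "b": 2}',   // missing comma
  '{"a": 1,}',         // trailing comma
  "{'a': 1}",          // single quotes
  '{a: 1}',            // unquoted key
];

for (const text of samples) {
  try {
    JSON.parse(text);
    console.log('valid:', text);
  } catch (err) {
    console.log('invalid:', text, '->', err.message);
  }
}
```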

Best Practices

  1. Always validate JSON before processing
  2. Use proper indentation for readability
  3. Implement schema validation for complex structures
  4. Handle validation errors gracefully
  5. Use automated validation in CI/CD pipelines

Tools for JSON Validation

Use online validators like JSON Web Tools for quick validation. For production environments, implement server-side validation using libraries like Ajv (JavaScript), Jackson (Java), or Marshmallow (Python).

Try JSON Validator →

JSON to CSV Conversion: When, Why, and How

Published: February 25, 2026 | 6 min read

Discover when to convert JSON to CSV, the benefits of each format, and step-by-step instructions for seamless conversion. Perfect for data analysis and Excel integration.

When to Use CSV vs JSON

Use CSV when:

  • Working with Excel or Google Sheets
  • Dealing with tabular data (rows and columns)
  • Need smaller file sizes
  • Sharing data with non-technical users

Use JSON when:

  • Working with nested or hierarchical data
  • Building APIs and web services
  • Need to preserve data types
  • Working with complex object structures

Conversion Process

Converting JSON to CSV involves flattening nested structures and mapping object properties to columns. Arrays are converted to rows, and nested objects require special handling.
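
A minimal sketch of the mapping step for an array of flat objects (function and field names here are our own; nested objects would first need flattening):

```javascript
// Convert an array of flat objects to CSV. The column set is the
// union of all keys; missing values become empty cells; every value
// is quoted with embedded quotes doubled.
function jsonToCsv(rows) {
  const columns = [...new Set(rows.flatMap(r => Object.keys(r)))];
  const escape = v => `"${String(v ?? '').replace(/"/g, '""')}"`;
  const header = columns.map(escape).join(',');
  const lines = rows.map(r => columns.map(c => escape(r[c])).join(','));
  return [header, ...lines].join('\n');
}

const csv = jsonToCsv([
  { name: 'Alice', city: 'Paris' },
  { name: 'Bob', age: 41 },
]);
// First line: "name","city","age"
```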

Tips for Better Conversions

  1. Ensure consistent object structure across array items
  2. Handle nested objects by flattening or creating separate tables
  3. Use meaningful column headers
  4. Escape special characters properly
Try JSON to CSV Converter →

Understanding JWT: Complete Guide to JSON Web Tokens

Published: February 25, 2026 | 10 min read

Master JSON Web Tokens (JWT) for authentication and authorization. Learn about JWT structure, security best practices, and common use cases in modern web applications.

What is JWT?

JSON Web Token (JWT) is an open standard (RFC 7519) for securely transmitting information between parties as a JSON object. JWTs are commonly used for authentication and information exchange in web applications.

JWT Structure

A JWT consists of three parts separated by dots (.):

  1. Header: Algorithm and token type
  2. Payload: Claims (user data, permissions)
  3. Signature: Verification signature
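
Decoding the first two parts is just base64url-to-JSON, as this Node sketch shows with a hand-crafted, unsigned token (not a real credential). Decoding proves nothing about authenticity; only verifying the signature does:

```javascript
// Decode one dot-separated JWT part from base64url to a JSON object.
function decodeJwtPart(part) {
  const b64 = part.replace(/-/g, '+').replace(/_/g, '/');
  return JSON.parse(Buffer.from(b64, 'base64').toString('utf8'));
}

// Hand-crafted example token: header.payload.signature
const headerB64 = Buffer.from('{"alg":"HS256","typ":"JWT"}').toString('base64url');
const payloadB64 = Buffer.from('{"sub":"user-123"}').toString('base64url');
const token = `${headerB64}.${payloadB64}.fake-signature`;

const [h, p] = token.split('.');
console.log(decodeJwtPart(h).alg);   // "HS256"
console.log(decodeJwtPart(p).sub);   // "user-123"
```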

Security Best Practices

  • Always use HTTPS to transmit JWTs
  • Set appropriate expiration times
  • Never store sensitive data in the payload (it's only Base64-encoded, not encrypted)
  • Validate tokens on the server side
  • Use strong secret keys for signing
  • Implement token refresh mechanisms

Common Use Cases

JWTs are perfect for stateless authentication, single sign-on (SSO), and API authorization. They're widely used in microservices architectures and mobile applications.

Try JWT Decoder →

JSON Schema Validation: Ensure Data Quality at Scale

Published: February 25, 2026 | 7 min read

Learn how JSON Schema helps validate data structure, enforce business rules, and maintain data quality across your applications and APIs.

Why Use JSON Schema?

JSON Schema provides a contract for your JSON data. It defines the expected structure, data types, required fields, and validation rules. This is crucial for API development, data migration, and maintaining data consistency.

Key Benefits

  • Automatic documentation of data structures
  • Client and server-side validation
  • Clear API contracts
  • Type safety for dynamic languages
  • Reduced errors and debugging time

Example Schema

A simple JSON Schema for user data:

{
  "type": "object",
  "required": ["name", "email"],
  "properties": {
    "name": { "type": "string" },
    "email": { "type": "string", "format": "email" },
    "age": { "type": "number", "minimum": 0 }
  }
}
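
To illustrate what a validator does with this schema, here is a toy checker supporting only required, type, and minimum (our own sketch; real libraries such as Ajv implement the full specification, including format):

```javascript
// The schema above, as a JS object:
const userSchema = {
  type: 'object',
  required: ['name', 'email'],
  properties: {
    name: { type: 'string' },
    email: { type: 'string', format: 'email' },
    age: { type: 'number', minimum: 0 },
  },
};

// Toy validator: checks required fields, scalar types, and minimum.
function validate(schema, data) {
  const errors = [];
  for (const field of schema.required ?? []) {
    if (!(field in data)) errors.push(`missing required field: ${field}`);
  }
  for (const [key, rules] of Object.entries(schema.properties ?? {})) {
    if (!(key in data)) continue;
    if (rules.type && typeof data[key] !== rules.type) {
      errors.push(`${key}: expected ${rules.type}`);
    }
    if (rules.minimum !== undefined && data[key] < rules.minimum) {
      errors.push(`${key}: below minimum ${rules.minimum}`);
    }
  }
  return errors;
}

validate(userSchema, { name: 'Ann', age: -1 });
// ['missing required field: email', 'age: below minimum 0']
```
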
Try Schema Validator →

JSON Performance Optimization: Tips for Large Datasets

Published: February 25, 2026 | 9 min read

Optimize JSON processing for large datasets. Learn about streaming, compression, and efficient parsing techniques to improve application performance.

Performance Challenges

Large JSON files can cause memory issues, slow parsing, and poor application performance. Understanding how to handle big data is essential for scalable applications.

Optimization Techniques

  1. Minify JSON: Remove whitespace to reduce file size by 20-40%
  2. Use Streaming Parsers: Process data incrementally instead of loading entire file
  3. Compression: Enable gzip compression on server (70-90% size reduction)
  4. Pagination: Break large datasets into smaller chunks
  5. Selective Loading: Fetch only needed fields

When to Use Alternatives

For extremely large datasets (>100MB), consider alternatives like Protocol Buffers, MessagePack, or database queries instead of JSON files.

Try JSON Minifier →

JSON to TypeScript: Complete Guide to Generating Type-Safe Interfaces

Published: February 28, 2026 | 7 min read

Learn how to convert JSON objects into TypeScript interfaces for type-safe development. Understand the benefits of static typing and automate interface generation.

Why Convert JSON to TypeScript?

TypeScript provides compile-time type checking that catches errors before runtime. Converting JSON to TypeScript interfaces ensures your data structures are properly typed throughout your application.

Basic Conversion Example

Input JSON:

{"name": "John", "age": 30, "isActive": true}

Generated TypeScript:

interface MyInterface {
  name: string;
  age: number;
  isActive: boolean;
}

Handling Complex Types

  • Arrays: Automatically detected and typed (e.g., string[], number[])
  • Nested Objects: Creates nested interfaces or uses Record types
  • Nullable Values: Handles null with union types (string | null)
  • Mixed Arrays: Uses union types for arrays with multiple types

Best Practices

  1. Use representative sample data for accurate type inference
  2. Review and refine generated interfaces
  3. Add optional (?) modifiers where appropriate
  4. Consider using utility types like Partial, Required, Pick
  5. Document complex types with JSDoc comments
Try JSON to TypeScript →

REST API Testing with JSON: A Complete Guide for Developers

Published: February 28, 2026 | 10 min read

Master REST API testing using JSON. Learn about HTTP methods, request/response formats, authentication, and debugging common API issues.

Understanding REST APIs

REST (Representational State Transfer) APIs use JSON as their primary data format for both requests and responses. Understanding HTTP methods and status codes is crucial for effective API testing.

HTTP Methods Explained

  • GET: Retrieve data (no request body needed)
  • POST: Create new resources (JSON body required)
  • PUT: Update entire resources (JSON body required)
  • PATCH: Partially update resources (JSON body required)
  • DELETE: Remove resources (usually no body)
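
The body rules above can be captured in a tiny helper that builds fetch() options (the helper name and shape are our own sketch):

```javascript
// Build fetch() options: JSON body and Content-Type header for
// methods that take a body, nothing extra for GET/DELETE.
function buildRequest(method, body) {
  const options = { method, headers: {} };
  if (body !== undefined && !['GET', 'DELETE'].includes(method)) {
    options.headers['Content-Type'] = 'application/json';
    options.body = JSON.stringify(body);
  }
  return options;
}

// Usage: fetch('/users', buildRequest('POST', { name: 'Ann' }));
const req = buildRequest('POST', { name: 'Ann' });
// req.body === '{"name":"Ann"}'
```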

Common Status Codes

  • 200 OK: Request successful
  • 201 Created: Resource successfully created
  • 400 Bad Request: Invalid JSON or parameters
  • 401 Unauthorized: Authentication required
  • 404 Not Found: Resource doesn't exist
  • 500 Server Error: Internal server problem

Testing Best Practices

  1. Always validate JSON structure before sending
  2. Include proper Content-Type headers (application/json)
  3. Test error scenarios, not just success cases
  4. Monitor response times and optimize slow endpoints
  5. Use tools to save and reuse common requests
Try JSON Validator →

JSON Security: Best Practices to Prevent Vulnerabilities

Published: February 28, 2026 | 8 min read

Protect your applications from JSON-based security vulnerabilities. Learn about injection attacks, data validation, and secure JSON parsing practices.

Common JSON Security Risks

  • JSON Injection: Unvalidated input concatenated into JSON strings can alter the document's structure or, if parsed with eval(), execute code
  • XXE Attacks: XML parsed during XML-to-JSON conversion can expose system files if external entities are enabled
  • DoS Attacks: Extremely large or nested JSON can crash servers
  • Data Exposure: Sensitive data in JSON responses
  • CSRF via JSON: Cross-site request forgery using JSON payloads

Security Best Practices

  1. Validate Input: Always validate JSON structure and content
  2. Sanitize Data: Remove or escape dangerous characters
  3. Limit Depth: Restrict nesting levels to prevent DoS
  4. Size Limits: Set maximum JSON payload size
  5. Use HTTPS: Encrypt JSON data in transit
  6. Proper Headers: Set Content-Type correctly
  7. Schema Validation: Use JSON Schema to enforce structure

Secure Parsing

Never use eval() or Function() to parse JSON. Always use JSON.parse() which safely parses without executing code. Wrap parsing in try-catch blocks to handle malformed JSON gracefully.
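
A defensive parsing sketch combining these points (the function name and the 1 MB cap are our own example choices, not a standard):

```javascript
// Size-capped, try/catch-wrapped parsing. Returns a result object
// instead of throwing, so callers always handle the failure path.
function safeParse(text, maxBytes = 1_000_000) {
  if (typeof text !== 'string' || text.length > maxBytes) {
    return { ok: false, error: 'payload too large or not a string' };
  }
  try {
    return { ok: true, value: JSON.parse(text) };
  } catch (err) {
    return { ok: false, error: err.message };
  }
}

safeParse('{"a":1}');     // { ok: true, value: { a: 1 } }
safeParse('{bad json');   // { ok: false, error: "..." }
```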

Validate JSON Securely →

JSON Mock Data Generation: Testing Made Easy

Published: February 28, 2026 | 6 min read

Generate realistic test data for your applications. Learn when and how to use mock JSON data for development, testing, and prototyping.

Why Use Mock Data?

Mock data allows you to develop and test applications before backend APIs are ready. It helps identify UI issues, test edge cases, and demonstrate features to stakeholders.

Types of Mock Data

  • User Profiles: Names, emails, addresses, demographics
  • Products: Inventory items, prices, descriptions
  • Orders: Transactions, payments, statuses
  • Blog Posts: Articles, comments, metadata
  • Custom Schemas: Application-specific data structures
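
A minimal user-profile generator along these lines (all names and the example.com domain are made up; real projects often reach for a dedicated mock-data library instead):

```javascript
// Generate deterministic mock user profiles by cycling through
// fixed name pools.
function mockUsers(count) {
  const first = ['Ada', 'Grace', 'Alan', 'Edsger'];
  const last = ['Lovelace', 'Hopper', 'Turing', 'Dijkstra'];
  return Array.from({ length: count }, (_, i) => ({
    id: i + 1,
    name: `${first[i % first.length]} ${last[i % last.length]}`,
    email: `user${i + 1}@example.com`,
    isActive: i % 2 === 0,
  }));
}

const users = mockUsers(3);
// users[0].email === 'user1@example.com'
```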

Best Practices for Mock Data

  1. Match production data structure exactly
  2. Use realistic values (not "test1", "test2")
  3. Include edge cases (null values, empty arrays, special characters)
  4. Generate enough data to test pagination and performance
  5. Keep mock data separate from production code
  6. Version control your mock data files

When to Use Mock Data

  • Frontend development before backend is ready
  • Unit testing and integration testing
  • Demo and prototype presentations
  • Load testing and performance optimization
  • Offline development environments
Generate Mock Data →

JSON Path Query: Navigate Complex JSON Like a Pro

Published: February 28, 2026 | 7 min read

Master JSONPath to extract specific data from complex JSON structures. Learn the syntax, operators, and advanced techniques for efficient data querying.

What is JSONPath?

JSONPath is a query language for JSON, similar to XPath for XML. It allows you to select and extract specific data from JSON documents using expressions.

Basic Syntax

  • $: Root object
  • .key: Access object property
  • [index]: Access array element
  • [*]: All elements in array
  • ..: Recursive descent (search all levels)
  • @: Current node (in filters)

Common Query Examples

$.users[0].name          // First user's name
$.users[*].email         // All user emails
$.store.book[?(@.price < 10)]  // Books under $10
$..author                // All authors at any level
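
The first two queries can be served by a toy evaluator covering only `$`, `.key`, `[index]`, and `[*]` (our own sketch; real JSONPath libraries also handle filters and recursive descent):

```javascript
// Evaluate a tiny JSONPath subset by tokenizing the path and walking
// the object; "*" fans out over all values of the current node.
function jsonPath(obj, path) {
  const tokens = path.match(/[^.[\]$]+/g) ?? [];
  let nodes = [obj];
  for (const t of tokens) {
    nodes = nodes.flatMap(n =>
      t === '*' ? Object.values(n) : n?.[t] !== undefined ? [n[t]] : []
    );
  }
  return nodes;
}

const data = {
  users: [
    { name: 'Ann', email: 'a@x.io' },
    { name: 'Bo', email: 'b@x.io' },
  ],
};
jsonPath(data, '$.users[0].name');   // ['Ann']
jsonPath(data, '$.users[*].email');  // ['a@x.io', 'b@x.io']
```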

Advanced Techniques

  1. Use filters to select items based on conditions
  2. Combine wildcards with property access
  3. Leverage recursive descent for unknown structures
  4. Test queries incrementally to debug complex paths

Use Cases

  • Extract specific fields from API responses
  • Transform JSON data structures
  • Search configuration files
  • Filter and sort JSON arrays
  • Data migration and transformation scripts
Try JSONPath Query →

JSON vs XML: When to Use Each Format

Published: February 28, 2026 | 9 min read

Compare JSON and XML to make informed decisions about data formats. Understand the strengths, weaknesses, and ideal use cases for each.

JSON Advantages

  • Lightweight: 20-30% smaller than equivalent XML
  • Faster Parsing: Native JavaScript support
  • Readable: Simpler syntax, easier to write
  • Better for Arrays: Native array support
  • Modern APIs: Standard for REST APIs

XML Advantages

  • Metadata: Attributes provide additional context
  • Namespaces: Avoid naming conflicts
  • Schema Validation: More robust with XSD
  • Comments: Built-in comment support
  • Document-centric: Better for complex documents

When to Choose JSON

  • Web APIs and microservices
  • Configuration files (package.json, tsconfig.json)
  • Data interchange between JavaScript applications
  • Mobile app communication
  • Real-time data streaming

When to Choose XML

  • Complex document structures (DocX, SVG)
  • Legacy enterprise systems integration
  • SOAP web services
  • Publishing workflows (RSS, ATOM)
  • When metadata and attributes are crucial

Conversion Between Formats

Both formats can be converted to each other, but some information may be lost. JSON to XML loses type information, while XML to JSON loses attribute context.

Convert XML to JSON →

JSON Merge Strategies: Combining Objects Effectively

Published: February 28, 2026 | 6 min read

Learn different strategies for merging JSON objects. Understand shallow vs deep merge, handling conflicts, and preserving data integrity.

Shallow vs Deep Merge

Shallow Merge: Only merges top-level properties. Nested objects are replaced entirely.
Deep Merge: Recursively merges nested objects, preserving data at all levels.
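
A sketch of a recursive deep merge where the second object wins on conflicts and arrays are replaced rather than combined (one of several possible policies):

```javascript
// Recursively merge b into a copy of a. Plain objects merge level by
// level; everything else (scalars, arrays, null) is overwritten by b.
function deepMerge(a, b) {
  const isObj = v => v !== null && typeof v === 'object' && !Array.isArray(v);
  const out = { ...a };
  for (const [key, value] of Object.entries(b)) {
    out[key] = isObj(out[key]) && isObj(value) ? deepMerge(out[key], value) : value;
  }
  return out;
}

const merged = deepMerge(
  { ui: { theme: 'dark', lang: 'en' }, tags: ['a'] },
  { ui: { lang: 'es' }, tags: ['b'] }
);
// { ui: { theme: 'dark', lang: 'es' }, tags: ['b'] }
```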

Merge Strategies

  • Override: Second object completely replaces first
  • Preserve: Keep first object's values on conflicts
  • Combine: Merge arrays instead of replacing
  • Custom: Apply business logic to resolve conflicts

Common Use Cases

  1. Merging configuration files (defaults + user settings)
  2. Combining API responses from multiple sources
  3. User preferences with system defaults
  4. Feature flags and environment variables
  5. Data aggregation from multiple services

Best Practices

  • Choose the right merge strategy for your use case
  • Document how conflicts are resolved
  • Test edge cases (null values, empty objects, arrays)
  • Consider using libraries for complex merges
  • Validate merged data structure

Performance Considerations

Deep merging can be expensive for large objects. Consider performance impacts and optimize by limiting merge depth or using shallow merge when appropriate.

Try JSON Merge →

JSON Flatten & Unflatten: Simplifying Nested Structures

Published: February 28, 2026 | 5 min read

Transform nested JSON into flat key-value pairs and back. Learn when flattening is useful and how to preserve data structure during transformation.

What is JSON Flattening?

Flattening converts nested JSON objects into a flat structure with dot-notation keys. For example: {"user": {"name": "John"}} becomes {"user.name": "John"}.
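
Both directions can be sketched in a few lines (arrays and keys that themselves contain dots are deliberately not handled here):

```javascript
// Flatten nested objects into dot-notation keys.
function flatten(obj, prefix = '') {
  return Object.entries(obj).reduce((acc, [k, v]) => {
    const key = prefix ? `${prefix}.${k}` : k;
    if (v !== null && typeof v === 'object' && !Array.isArray(v)) {
      Object.assign(acc, flatten(v, key));
    } else {
      acc[key] = v;
    }
    return acc;
  }, {});
}

// Rebuild nested objects from dot-notation keys.
function unflatten(flat) {
  const out = {};
  for (const [path, value] of Object.entries(flat)) {
    const keys = path.split('.');
    let node = out;
    for (const k of keys.slice(0, -1)) node = node[k] ??= {};
    node[keys[keys.length - 1]] = value;
  }
  return out;
}

flatten({ user: { name: 'John' } });   // { 'user.name': 'John' }
unflatten({ 'user.name': 'John' });    // { user: { name: 'John' } }
```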

Benefits of Flattening

  • CSV Export: Easier conversion to tabular format
  • Database Storage: Simpler schema for key-value stores
  • Search & Filter: Easier to query flat structures
  • Form Handling: Maps well to HTML form inputs
  • Logging: Simplified log entry structure

When to Flatten

  • Converting JSON to CSV/Excel
  • Storing in key-value databases (Redis, DynamoDB)
  • Query parameters in URLs
  • Form data serialization
  • Configuration management

When NOT to Flatten

  • Arrays with complex objects (loses structure)
  • When nested relationships are important
  • API responses (consumers expect nested format)
  • When working with JSON Schema validation

Unflattening

Unflattening reverses the process, reconstructing nested objects from dot-notation keys. This is useful when converting CSV back to JSON or retrieving data from flat storage.

Try Flatten/Unflatten →

JWT Tokens Explained: Understanding JSON Web Tokens

Published: February 28, 2026 | 10 min read

Deep dive into JSON Web Tokens (JWT). Learn structure, claims, signing algorithms, and security best practices for authentication.

What is a JWT?

JWT (JSON Web Token) is a compact, URL-safe token format for securely transmitting information between parties as a JSON object. It's commonly used for authentication and information exchange.

JWT Structure

A JWT consists of three parts separated by dots (.):

  • Header: Token type and signing algorithm
  • Payload: Claims (user data, expiration, etc.)
  • Signature: Cryptographic signature for verification

Common Claims

  • iss (issuer): Who created the token
  • sub (subject): Who the token is about (user ID)
  • aud (audience): Who should accept the token
  • exp (expiration): When the token expires
  • iat (issued at): When the token was created
  • Custom claims: Your application-specific data
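
Checking the exp claim can be sketched in Node as below, with a hand-crafted token whose payload expired in 2020 (not a real credential). This is only one step of validation; the signature must still be verified with a real JWT library:

```javascript
// Decode the payload part and compare exp (seconds since epoch)
// against the current time.
function isExpired(token, now = Date.now()) {
  const payloadB64 = token.split('.')[1];
  const payload = JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf8'));
  return payload.exp !== undefined && payload.exp * 1000 < now;
}

// exp 1577836800 is 2020-01-01T00:00:00Z, long past.
const payloadB64 = Buffer.from('{"sub":"u1","exp":1577836800}').toString('base64url');
const staleToken = `header.${payloadB64}.sig`;
isExpired(staleToken);  // true
```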

Security Best Practices

  1. Use HTTPS: Always transmit JWTs over encrypted connections
  2. Short Expiration: Keep exp times short (15-30 minutes)
  3. Strong Secrets: Use long, random signing keys
  4. Validate Always: Verify signature on every request
  5. Don't Store Secrets: Never put sensitive data in payload
  6. Refresh Tokens: Use refresh tokens for long sessions

Common Mistakes

  • Storing JWTs in localStorage (vulnerable to XSS)
  • Not validating the signature
  • Using weak signing algorithms (HS256 with weak secrets)
  • Putting sensitive data in the payload
  • Not checking expiration times
Decode JWT →

JSON Performance Optimization: Making Your APIs Faster

Published: February 28, 2026 | 8 min read

Optimize JSON for better performance. Learn compression techniques, parsing strategies, and best practices for high-traffic APIs.

Why JSON Performance Matters

Large JSON payloads increase bandwidth usage, parsing time, and memory consumption. Optimizing JSON can significantly improve application performance and reduce costs.

Size Reduction Techniques

  1. Minification: Remove whitespace (20-40% reduction)
  2. Compression: Enable gzip/brotli (70-90% reduction)
  3. Short Property Names: Use abbreviated keys
  4. Remove Nulls: Omit null values when possible
  5. Pagination: Split large datasets into chunks

Parsing Optimization

  • Streaming Parsers: Process data incrementally
  • Lazy Loading: Parse only needed sections
  • Worker Threads: Parse in background threads
  • Caching: Cache parsed results
  • Schema Validation: Validate once, trust thereafter

API Design for Performance

  • Field filtering: Let clients request specific fields
  • Pagination: Limit results per request
  • ETags: Enable caching with conditional requests
  • Sparse fieldsets: Return only populated fields
  • Batch endpoints: Combine multiple requests

Monitoring & Benchmarking

Measure JSON payload sizes, parsing times, and network transfer speeds. Use tools like Chrome DevTools, Lighthouse, and APM solutions to identify bottlenecks.

When to Use Alternatives

For extreme performance requirements, consider binary formats like Protocol Buffers, MessagePack, or CBOR which can be 3-10x smaller and faster than JSON.

Optimize JSON Now →

JSON Schema: Complete Guide to Data Validation

Published: February 28, 2026 | 11 min read

Master JSON Schema to validate data structures, enforce business rules, and document your APIs. Learn schema design, validation, and best practices.

What is JSON Schema?

JSON Schema is a vocabulary that allows you to annotate and validate JSON documents. It describes your data format and provides complete structural validation.

Basic Schema Example

{
  "type": "object",
  "properties": {
    "name": {"type": "string"},
    "age": {"type": "number", "minimum": 0}
  },
  "required": ["name"]
}

Schema Keywords

  • type: Data type (string, number, boolean, object, array, null)
  • properties: Object property definitions
  • required: Mandatory fields
  • minimum/maximum: Number range validation
  • minLength/maxLength: String length validation
  • pattern: Regex validation for strings
  • enum: Allowed values list
  • format: String formats (email, uri, date-time)

Advanced Features

  1. $ref: Reuse schema definitions
  2. allOf/anyOf/oneOf: Combine schemas
  3. not: Exclude certain structures
  4. dependencies: Conditional requirements
  5. additionalProperties: Control extra fields

Use Cases

  • API request/response validation
  • Configuration file validation
  • Database schema documentation
  • Form validation
  • Data migration verification

Best Practices

  1. Start with required fields only, add constraints incrementally
  2. Use descriptive titles and descriptions for documentation
  3. Set additionalProperties: false for strict validation
  4. Use $ref to avoid duplication
  5. Version your schemas
  6. Test schemas with edge cases
Validate with Schema →

Building REST APIs with JSON: Design Principles & Best Practices

Published: February 28, 2026 | 12 min read

Design robust REST APIs using JSON. Learn about resource modeling, HTTP methods, status codes, versioning, and API documentation.

REST API Design Principles

  • Resource-Based: Design around resources, not actions
  • Stateless: Each request contains all necessary information
  • Cacheable: Responses indicate if they can be cached
  • Uniform Interface: Consistent patterns across endpoints
  • Layered System: The client can't tell whether it's connected to the end server or an intermediary

Resource Naming

Good:
GET /users          - Get all users
GET /users/123      - Get specific user
POST /users         - Create user
PUT /users/123      - Update user
DELETE /users/123   - Delete user

Bad:
GET /getUsers
POST /createUser
GET /user/delete/123

HTTP Methods & Their Use

  • GET: Retrieve resources (safe, idempotent, cacheable)
  • POST: Create resources (not idempotent)
  • PUT: Update entire resource (idempotent)
  • PATCH: Partial update (idempotent)
  • DELETE: Remove resource (idempotent)

Response Structure Best Practices

{
  "data": {...},           // Actual data
  "meta": {                // Metadata
    "page": 1,
    "total": 100
  },
  "links": {               // HATEOAS
    "self": "/users?page=1",
    "next": "/users?page=2"
  }
}

Error Handling

{
  "error": {
    "code": "VALIDATION_ERROR",
    "message": "Invalid email format",
    "field": "email",
    "timestamp": "2026-02-28T10:00:00Z"
  }
}

Versioning Strategies

  1. URL Path: /v1/users, /v2/users
  2. Header: Accept: application/vnd.api.v1+json
  3. Query Param: /users?version=1

Security Best Practices

  • Always use HTTPS
  • Implement rate limiting
  • Use JWT or OAuth for authentication
  • Validate all inputs
  • Don't expose sensitive data in URLs
  • Use CORS properly
Try Webhook Tester →

JSON in GraphQL: Converting Between REST and GraphQL

Published: February 28, 2026 | 9 min read

Understand the relationship between JSON and GraphQL. Learn when to use each, how to convert, and best practices for modern APIs.

GraphQL vs REST

REST: Multiple endpoints, fixed response structure, over-fetching common.
GraphQL: Single endpoint, client specifies exact data needed, precise fetching.

GraphQL Benefits

  • No Over-fetching: Get exactly what you need
  • No Under-fetching: Get all needed data in one request
  • Strongly Typed: Schema defines exact structure
  • Self-Documenting: Schema serves as documentation
  • Versioning: Add fields without breaking clients

JSON to GraphQL Schema

JSON Data:

{"id": 1, "name": "John", "email": "john@example.com"}

GraphQL Type:

type User {
  id: Int!
  name: String!
  email: String!
}
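
A toy generator for this conversion might map JSON scalars to GraphQL scalars like so (Int for whole numbers, Float otherwise, String, Boolean); nested objects and nullability are left out of this sketch:

```javascript
// Emit a GraphQL object type from a flat JSON sample, marking every
// field non-null since the sample had a value for it.
function toGraphQLType(name, obj) {
  const scalar = v =>
    typeof v === 'boolean' ? 'Boolean'
    : typeof v === 'number' ? (Number.isInteger(v) ? 'Int' : 'Float')
    : 'String';
  const fields = Object.entries(obj)
    .map(([k, v]) => `  ${k}: ${scalar(v)}!`)
    .join('\n');
  return `type ${name} {\n${fields}\n}`;
}

console.log(toGraphQLType('User', { id: 1, name: 'John', email: 'john@example.com' }));
// type User {
//   id: Int!
//   name: String!
//   email: String!
// }
```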

GraphQL Query Example

query {
  user(id: 1) {
    name
    email
  }
}

Response:
{
  "data": {
    "user": {
      "name": "John",
      "email": "john@example.com"
    }
  }
}

When to Use GraphQL

  • Mobile apps (minimize bandwidth)
  • Complex, nested data requirements
  • Rapid frontend development
  • Multiple client types (web, mobile, IoT)
  • Real-time features with subscriptions

When to Stick with REST

  • Simple CRUD operations
  • Caching is critical
  • File uploads (simpler in REST)
  • Team unfamiliar with GraphQL
  • Legacy system integration

Migration Strategy

  1. Start with GraphQL wrapper around REST APIs
  2. Migrate high-traffic endpoints first
  3. Use schema stitching for gradual migration
  4. Maintain REST for backwards compatibility
  5. Educate team on GraphQL patterns
Convert to GraphQL →

JSON Compression Techniques: Reduce File Size by 90%

Published: February 28, 2026 | 8 min read

Learn advanced JSON compression techniques including minification, gzip, brotli, and alternative formats. Dramatically reduce payload sizes.

Compression Levels

Example 1KB JSON file:

  • Original: 1,000 bytes (100%)
  • Minified: 700 bytes (30% reduction)
  • Gzip: 200 bytes (80% reduction)
  • Brotli: 150 bytes (85% reduction)
  • MessagePack: 450 bytes (55% reduction, binary)

1. Minification

Remove all whitespace, newlines, and unnecessary characters:

Before (formatted):
{
  "name": "John",
  "age": 30
}

After (minified):
{"name":"John","age":30}
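
In JavaScript this round-trip is a one-liner, since JSON.stringify emits no whitespace by default:

```javascript
// Minify by re-serializing without an indent argument.
const formatted = `{
  "name": "John",
  "age": 30
}`;
const minified = JSON.stringify(JSON.parse(formatted));
console.log(minified);  // {"name":"John","age":30}
```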

2. Property Name Shortening

Before:
{"firstName":"John","lastName":"Doe"}

After:
{"fn":"John","ln":"Doe"}

3. Gzip/Brotli Compression

Server-side compression (enable in web server config):

Nginx:
gzip on;
gzip_types application/json;
brotli on;
brotli_types application/json;

4. Remove Null Values

Before:
{"name":"John","age":null,"city":null}

After:
{"name":"John"}
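
JSON.stringify's replacer can drop nulls during serialization, since returning undefined omits the property:

```javascript
// Strip null-valued properties while serializing.
const record = { name: 'John', age: null, city: null };
const compact = JSON.stringify(record, (key, value) =>
  value === null ? undefined : value
);
console.log(compact);  // {"name":"John"}
```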

5. Binary Formats

  • MessagePack: Binary JSON, 30-50% smaller
  • Protocol Buffers: 60-70% smaller, requires schema
  • CBOR: Similar to MessagePack
  • Avro: Compact, schema-based

Best Practices

  1. Always minify production JSON
  2. Enable gzip/brotli on server
  3. Use CDN with compression
  4. Implement pagination for large datasets
  5. Cache compressed responses
  6. Consider binary formats for high-traffic APIs
  7. Monitor payload sizes

Compression Decision Tree

  • <10KB: Minification only
  • 10-100KB: Minification + gzip
  • 100KB-1MB: Minification + brotli + pagination
  • >1MB: Consider binary format or streaming
Minify JSON →

JSON in Modern JavaScript: ES6+ Features & Best Practices

Published: February 28, 2026 | 10 min read

Master modern JavaScript techniques for working with JSON. Learn destructuring, spread operators, async/await, and functional programming with JSON.

1. Destructuring JSON

const json = {name: "John", age: 30, city: "NYC"};

// Old way
const name = json.name;
const age = json.age;

// Modern way
const {name, age} = json;

// With renaming
const {name: userName, age: userAge} = json;

// Nested destructuring
const {address: {city, zip}} = user;

2. Spread Operator

// Merge objects
const defaults = {theme: "dark", lang: "en"};
const userSettings = {lang: "es"};
const settings = {...defaults, ...userSettings};
// Result: {theme: "dark", lang: "es"}

// Clone objects
const original = {a: 1, b: 2};
const clone = {...original};

// Add properties
const extended = {...original, c: 3};

3. Optional Chaining

// Old way
const city = user && user.address && user.address.city;

// Modern way
const city = user?.address?.city;

// With arrays
const firstFriend = user?.friends?.[0]?.name;

4. Nullish Coalescing

// Old way
const value = json.value !== null && json.value !== undefined
  ? json.value : "default";

// Modern way
const value = json.value ?? "default";

// vs OR operator
const port = config.port || 3000;  // 0 becomes 3000 (bad)
const port = config.port ?? 3000;  // 0 stays 0 (good)

5. Async JSON Operations

// Fetch JSON
async function getUser(id) {
  const response = await fetch(`/api/users/${id}`);
  return await response.json();
}

// Multiple requests in parallel
const [users, posts] = await Promise.all([
  fetch('/api/users').then(r => r.json()),
  fetch('/api/posts').then(r => r.json())
]);

// Error handling
try {
  const data = await fetch(url).then(r => r.json());
} catch (error) {
  console.error('JSON parse failed:', error);
}

6. Array Methods for JSON

const users = [{name: "John", age: 30}, {name: "Jane", age: 25}];

// Filter
const adults = users.filter(u => u.age >= 18);

// Map
const names = users.map(u => u.name);

// Find
const john = users.find(u => u.name === "John");

// Reduce
const totalAge = users.reduce((sum, u) => sum + u.age, 0);

// Some/Every
const hasAdults = users.some(u => u.age >= 18);
const allAdults = users.every(u => u.age >= 18);

7. JSON.parse with Reviver

// Convert date strings to Date objects
const json = '{"name":"John","birthDate":"1990-01-01"}';
const user = JSON.parse(json, (key, value) => {
  if (key === 'birthDate') return new Date(value);
  return value;
});

// Result: user.birthDate is a Date object

8. JSON.stringify with Replacer

// Remove sensitive fields
const user = {name: "John", password: "secret", age: 30};
const safe = JSON.stringify(user, (key, value) => {
  if (key === 'password') return undefined;
  return value;
});

// Format with spacing
JSON.stringify(data, null, 2);  // 2-space indent

Modern Patterns

  1. Use const for JSON objects (prevents reassignment, though not mutation)
  2. Prefer destructuring over property access
  3. Use spread for merging, not Object.assign()
  4. Use optional chaining to avoid null checks
  5. Use async/await for cleaner async code
  6. Use array methods instead of loops
Validate Your JSON →