Detect duplicate keys in JSON objects before they cause bugs.
This tool runs entirely in your browser — no data is ever sent to a server. Free to use, no account required.
Duplicate keys in a JSON object create ambiguity — different parsers resolve them differently, leading to subtle and hard-to-debug data issues.
Different JSON parsers handle duplicates differently: most keep the last value, some keep the first, and strict parsers throw an error. This inconsistency means the same JSON file can produce different data in different languages or libraries.
When a parser silently takes only the last duplicate value, all earlier values are lost without any error or warning. This is especially dangerous in automated pipelines where the data is not manually inspected.
After detecting duplicates, you have several options depending on whether you need to preserve both values or just pick one.
The most common fix — decide which duplicate to keep and remove the others. Keep first preserves the original value; keep last preserves the most recent override.
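As a concrete sketch of both strategies (Python, using the standard library's json.loads with an object_pairs_hook, which receives every raw key/value pair before the object is built; the sample data is illustrative):
import json

def keep_first(pairs):
    result = {}
    for key, value in pairs:
        if key not in result:        # first occurrence wins
            result[key] = value
    return result

def keep_last(pairs):
    return dict(pairs)               # dict() lets the last occurrence win

raw = '{"name": "Alice", "age": 30, "name": "Bob"}'
json.loads(raw, object_pairs_hook=keep_first)   # {'name': 'Alice', 'age': 30}
json.loads(raw, object_pairs_hook=keep_last)    # {'name': 'Bob', 'age': 30}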
If both values are needed, rename the keys to make them unique — for example, name and name_2. This avoids data loss while resolving the ambiguity.
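One possible implementation of the renaming strategy, again as a Python sketch using object_pairs_hook (the _2, _3 suffix scheme is just one convention, and it assumes the suffixed names are not already in use):
import json
from collections import Counter

def rename_duplicates(pairs):
    counts = Counter()
    result = {}
    for key, value in pairs:
        counts[key] += 1
        # first occurrence keeps its name; repeats get a numeric suffix
        new_key = key if counts[key] == 1 else f"{key}_{counts[key]}"
        result[new_key] = value
    return result

raw = '{"name": "Alice", "age": 30, "name": "Bob"}'
json.loads(raw, object_pairs_hook=rename_duplicates)
# {'name': 'Alice', 'age': 30, 'name_2': 'Bob'}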
The following JSON contains a duplicate name key. While it may look intentional, the behavior when parsing it is entirely determined by the parser you use — not the JSON file itself.
{
  "user": {
    "name": "Alice",
    "age": 30,
    "name": "Bob"
  }
}
// JavaScript
JSON.parse(input).user.name // → "Bob" (last value wins)
# Python
json.loads(input)["user"]["name"]  # → "Bob" (last value wins)
// Java (Jackson)
mapper.readTree(input).get("user").get("name").asText() // → "Bob"
// Go encoding/json
var v map[string]interface{}
json.Unmarshal([]byte(input), &v) // → "Bob" (last value wins)
# Ruby JSON
JSON.parse(input)["user"]["name"]  # → "Bob" (last value wins)
// PHP
json_decode($input, true)["user"]["name"] // → "Bob" (last value wins)
The silent overwrite means "Alice" is permanently lost with no warning or error. If the JSON was generated by a bug in your serializer, you would never know from the parsed output.
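To make that concrete, here is a deliberately buggy hand-rolled serializer (a hypothetical Python sketch, not any real library): it emits the same field twice, and the parsed result gives no hint that the first value ever existed:
import json

# Bug: the field list contains "name" twice, and nothing checks for that.
fields = [("name", "Alice"), ("age", 30), ("name", "Bob")]
buggy_json = "{" + ", ".join(f'"{k}": {json.dumps(v)}' for k, v in fields) + "}"

print(buggy_json)              # {"name": "Alice", "age": 30, "name": "Bob"}
print(json.loads(buggy_json))  # {'name': 'Bob', 'age': 30}  (no warning, no error)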
| Language / Library | Behavior with duplicates | Error thrown? |
|---|---|---|
| JavaScript JSON.parse | Keeps last value | No |
| Python json.loads | Keeps last value | No |
| Python with object_pairs_hook | Exposes all pairs — you decide | No |
| Java Jackson (default) | Keeps last value | No |
| Java Jackson (STRICT_DUPLICATE_DETECTION) | Rejects the document | Yes — JsonParseException |
| Go encoding/json | Keeps last value | No |
| PHP json_decode | Keeps last value | No |
| Ruby JSON gem | Keeps last value | No |
| .NET System.Text.Json | Keeps last value | No |
| jq | Keeps last value | No |
| Ajv (JSON Schema validator) | Validates already-parsed data, so duplicates are collapsed upstream (last value via JSON.parse) | No |
| JSON5 | Keeps last value | No |
The pattern is clear: virtually every mainstream parser silently keeps the last value with no warning. This means a duplicate key bug can survive undetected in production for months until a specific code path exposes the wrong value.
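If you would rather have parsing fail loudly, the same hook mechanism the Python row above refers to can emulate Jackson's STRICT_DUPLICATE_DETECTION. This is a sketch, not a built-in json flag:
import json

def reject_duplicates(pairs):
    seen = set()
    for key, _ in pairs:
        if key in seen:
            raise ValueError(f"Duplicate key: {key!r}")
        seen.add(key)
    return dict(pairs)

json.loads('{"name": "Alice", "name": "Bob"}', object_pairs_hook=reject_duplicates)
# ValueError: Duplicate key: 'name'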
If you prefer the command line or need to automate duplicate detection in a CI/CD pipeline, here are two quick approaches:
import json

def find_duplicates(pairs):
    keys = [k for k, v in pairs]
    seen = set()
    for key in keys:
        if key in seen:
            print(f"Duplicate key found: {key}")
        seen.add(key)
    return dict(pairs)

with open("data.json") as f:
    json.load(f, object_pairs_hook=find_duplicates)
// Note: JSON.parse resolves duplicate keys before any reviver runs, so a
// reviver never sees them. This sketch scans the raw text instead, tracking
// the keys seen in each open object (it assumes otherwise valid JSON).
function findDuplicateKeys(jsonString) {
  const duplicates = [];
  const stack = [];                 // one Set of seen keys per open object
  // Match every string literal (optionally followed by ':') or a brace.
  const re = /("(?:[^"\\]|\\.)*")(\s*:)?|([{}])/g;
  let match;
  while ((match = re.exec(jsonString)) !== null) {
    if (match[3] === '{') stack.push(new Set());
    else if (match[3] === '}') stack.pop();
    else if (match[2] && stack.length > 0) {  // a string followed by ':' is a key
      const key = match[1].slice(1, -1);
      const seen = stack[stack.length - 1];
      if (seen.has(key)) duplicates.push(key);
      seen.add(key);
    }
  }
  return duplicates;
}

findDuplicateKeys('{"name":"Alice","name":"Bob"}');
// → ["name"]
The JSON spec (RFC 8259) says object keys "SHOULD be unique." It does not say they must be — so most parsers accept duplicate keys without throwing an error, but they handle them differently. The result is silent data loss that is extremely difficult to debug.
// Same JSON, different parsers:
const json = '{"name": "Alice", "name": "Bob"}';
// JavaScript JSON.parse: keeps LAST value
JSON.parse(json).name // "Bob"
// Python json.loads: keeps LAST value
// {"name": "Bob"}
// Java Jackson (default): keeps LAST value
// {"name": "Bob"}  (throws only if STRICT_DUPLICATE_DETECTION is enabled)
// Go encoding/json: keeps LAST value
// {"name": "Bob"}
// Result: silent data loss with no error!
Three approaches to resolve duplicate keys, ordered from least to most automated.
Use this tool to identify all duplicate keys and their values, then decide which value to keep based on business logic.
function deduplicateJSON(jsonStr) {
  // JSON.parse already collapses duplicates (last value wins, per JS behaviour);
  // re-stringifying just emits the cleaned JSON
  return JSON.stringify(JSON.parse(jsonStr));
}
import json

# json.loads already deduplicates (keeps the last value)
deduped = json.loads(json_string)