Need help with your JSON?

Try our JSON Formatter tool to automatically identify and fix syntax errors in your JSON.

Duplicate Keys in JSON: Detection and Resolution Strategies

Duplicate keys in JSON are a quiet source of bugs because many parsers accept them, but they do not preserve them in a consistent or reversible way. A payload like {"role":"user","role":"admin"} may look harmless in a text editor, yet by the time application code receives it, one of those values is usually gone.

For most searchers, the practical answer is simple: in JavaScript, JSON.parse() typically keeps the last occurrence of a duplicate name, and JSON.stringify() cannot bring the earlier one back. If you need to detect duplicates or preserve every occurrence, you must inspect the raw JSON text before you turn it into a normal object.

Quick Answer

  • JSON object names are supposed to be unique. When they are not, behavior becomes non-interoperable.
  • In JavaScript, JSON.parse() collapses duplicates into one property, usually keeping the last value.
  • JSON.stringify() only serializes the object it receives. If duplicates were already lost during parsing, stringify cannot retain them.
  • If duplicate keys matter to your workflow, use a parser that exposes tokens, member order, or an AST instead of returning only a plain object.

What Current Standards Say

Current standards are consistent on this point. RFC 8259 says that names within an object SHOULD be unique and warns that software receiving objects with duplicate names behaves in unpredictable, non-interoperable ways. The stricter I-JSON profile (RFC 7493) goes further: objects MUST NOT have members with duplicate names. If you are handling API payloads, configuration files, feature flags, auth claims, or anything security-sensitive, rejecting duplicates is the safest policy.

What JavaScript Actually Does

JavaScript is where most confusion comes from. Developers often search for answers after seeing one value in the source text and another value at runtime. That happens because the duplicate is resolved during parsing, not during formatting.

Parse and Stringify Example

const text = '{"role":"user","role":"admin"}';

const parsed = JSON.parse(text);
console.log(parsed); // { role: "admin" }
console.log(parsed.role); // "admin"

console.log(JSON.stringify(parsed)); // {"role":"admin"}

This is the key point behind searches like "json.stringify retaining duplicate keys": stringify is not dropping anything here. The earlier duplicate was already discarded when the JSON text became a normal object.

Which JavaScript Tools Can Preserve Duplicates?

Tool or API | Can it preserve duplicates? | What actually happens
JSON.parse() | No | Creates one property; the last duplicate usually wins.
JSON.parse() reviver | No | Runs after parsing, so overwritten duplicates are already gone.
JSON.stringify() | No | Serializes the current object state only.
Token, event, or AST parser | Yes | Can expose every member in source order so you can reject or reconcile duplicates yourself.

One extra JavaScript edge case is worth knowing. MDN notes that JSON.parse() treats "__proto__" as a normal property name in JSON text, while duplicate __proto__ entries in a JavaScript object literal are a syntax error. JSON text and object literal syntax look similar, but they are not identical.
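The difference is easy to see in practice. A minimal sketch (`Object.hasOwn` requires ES2022, i.e. Node 16.9+):

```javascript
// In JSON text, "__proto__" is an ordinary member name. Duplicates collapse
// like any other key, and the resulting object's prototype is never touched.
const parsed = JSON.parse('{"__proto__": {"a": 1}, "__proto__": {"b": 2}}');

console.log(Object.hasOwn(parsed, "__proto__")); // true: a plain own property
console.log(Object.getPrototypeOf(parsed) === Object.prototype); // true

// An object literal with two __proto__ members, by contrast, never runs:
// const bad = { "__proto__": {}, "__proto__": {} }; // SyntaxError
```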

Important Warning

Do not build logic around unspecified "last wins" behavior unless you fully control every parser and consumer in the pipeline. Even if all of your current tools behave the same way, a future parser, validator, gateway, or language binding may not.

Common JavaScript Questions

Can JSON.stringify retain duplicate keys?

No. JSON.stringify() serializes an in-memory JavaScript value. Plain objects cannot store two properties with the same name at the same level, so by the time you call stringify, duplicate keys have already been reduced to one value.
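A minimal demonstration of why: assigning to the same property name twice mutates one slot rather than creating a second entry.

```javascript
const claims = {};
claims["role"] = "user";
claims["role"] = "admin"; // overwrites the same property slot

console.log(Object.keys(claims)); // ["role"]
console.log(claims.role); // "admin"
console.log(JSON.stringify(claims)); // {"role":"admin"}
```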

Can a reviver function detect the duplicates for me?

No. The reviver runs after JSON.parse() has already built the object tree. That makes revivers useful for transforming values, but not for recovering duplicate members that were overwritten during parse.
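You can verify this directly: a reviver attached to a payload with duplicate names only ever sees the surviving value.

```javascript
const observed = [];

JSON.parse('{"role":"user","role":"admin"}', (key, value) => {
  if (key === "role") {
    observed.push(value); // called once per surviving property, not per source member
  }
  return value;
});

console.log(observed); // ["admin"]: the earlier "user" was never visible
```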

Can JSON Schema catch duplicate keys?

Usually not by itself. A schema validator typically receives parsed data, not the raw token stream. If your parser already collapsed duplicate names into one surviving property, the schema only sees that final object. Treat duplicate-key detection as a parsing concern first, then run schema validation on the clean result.

Why Duplicate Keys Cause Real Problems

Common Failure Modes

Problem | Why it matters
Silent data loss | One value overwrites another with no obvious error in many runtimes.
Validation gaps | Downstream validators may only see the surviving key, not the duplicate source text.
Configuration ambiguity | Humans reading the file may expect one value, while runtime code uses another.
Security mistakes | Permission, routing, or policy fields can be overridden in ways reviewers miss.

Example: Security-Relevant Override

{
  "user": {
    "role": "viewer",
    "permissions": ["read"],
    "role": "admin"
  }
}

A reviewer may read the first role and miss the second one. A parser that keeps the last value will interpret this object very differently from what the file appears to say at a glance.
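Feeding that payload through standard parsing shows the reviewer-visible value silently losing (behavior as observed in V8 and other mainstream engines):

```javascript
const payload = `{
  "user": {
    "role": "viewer",
    "permissions": ["read"],
    "role": "admin"
  }
}`;

const { user } = JSON.parse(payload);
console.log(user.role); // "admin", not the "viewer" a reviewer saw first
console.log(user.permissions); // ["read"]
```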

How to Detect Duplicates Reliably

Reliable detection has one rule: check duplicates before you collapse the JSON into a plain object. In practice, that means one of these approaches:

  1. Reject at the boundary. Run a duplicate-aware parser or linter when a request, file, or import first enters your system.
  2. Preserve tokens or AST nodes. If you need exact locations or all duplicate values, use a parser that returns member order and source positions instead of only a plain object.
  3. Fail fast in CI. Validate JSON fixtures, config files, and generated payloads before they are checked in or deployed.

Minimal JavaScript Scanner Example

function findDuplicateKeys(json) {
  let index = 0;
  let line = 1;
  let column = 1;
  const duplicates = [];

  function peek() {
    return json[index];
  }

  function advance() {
    const char = json[index++];

    if (char === "\n") {
      line += 1;
      column = 1;
    } else {
      column += 1;
    }

    return char;
  }

  function fail(message) {
    throw new Error(message + " at " + line + ":" + column);
  }

  function skipWhitespace() {
    while (peek() === " " || peek() === "\n" || peek() === "\r" || peek() === "\t") {
      advance();
    }
  }

  function readString() {
    if (advance() !== '"') {
      fail("Expected string");
    }

    let value = "";

    while (index < json.length) {
      const char = advance();

      if (char === '"') {
        return value;
      }

      if (char === "\\") {
        const escape = advance();

        if (escape === "u") {
          let hex = "";
          for (let i = 0; i < 4; i += 1) {
            const digit = advance();
            if (!/[0-9a-fA-F]/.test(digit)) {
              fail("Invalid unicode escape");
            }
            hex += digit;
          }
          // Decode the escape so keys like "\u0041" and "A" compare as equal.
          value += String.fromCharCode(parseInt(hex, 16));
          continue;
        }

        const escapes = { '"': '"', "\\": "\\", "/": "/", b: "\b", f: "\f", n: "\n", r: "\r", t: "\t" };

        if (!(escape in escapes)) {
          fail("Invalid escape sequence");
        }

        // Decode simple escapes so escaped and unescaped forms of a key match.
        value += escapes[escape];
        continue;
      }

      value += char;
    }

    fail("Unterminated string");
  }

  function isDigit(char) {
    // Returns false for undefined at end of input, unlike raw comparisons.
    return char >= "0" && char <= "9";
  }

  function readNumber() {
    if (peek() === "-") {
      advance();
    }

    if (peek() === "0") {
      advance();
    } else {
      if (!isDigit(peek())) {
        fail("Invalid number");
      }

      while (isDigit(peek())) {
        advance();
      }
    }

    if (peek() === ".") {
      advance();

      if (!isDigit(peek())) {
        fail("Invalid fraction");
      }

      while (isDigit(peek())) {
        advance();
      }
    }

    if (peek() === "e" || peek() === "E") {
      advance();

      if (peek() === "+" || peek() === "-") {
        advance();
      }

      if (!isDigit(peek())) {
        fail("Invalid exponent");
      }

      while (isDigit(peek())) {
        advance();
      }
    }
  }

  function readLiteral(literal) {
    for (const expected of literal) {
      if (advance() !== expected) {
        fail("Unexpected token");
      }
    }
  }

  function readArray(path) {
    advance();
    skipWhitespace();

    if (peek() === "]") {
      advance();
      return;
    }

    let itemIndex = 0;

    while (true) {
      readValue(path + "[" + itemIndex + "]");
      itemIndex += 1;
      skipWhitespace();

      const separator = advance();

      if (separator === "]") {
        return;
      }

      if (separator !== ",") {
        fail("Expected ',' or ']'");
      }

      skipWhitespace();
    }
  }

  function readObject(path) {
    advance();
    skipWhitespace();

    const seen = new Map();

    if (peek() === "}") {
      advance();
      return;
    }

    while (true) {
      const keyLine = line;
      const keyColumn = column;
      const key = readString();
      const objectPath = path || "$";

      if (seen.has(key)) {
        duplicates.push({
          path: objectPath,
          key,
          first: seen.get(key),
          duplicate: { line: keyLine, column: keyColumn },
        });
      } else {
        seen.set(key, { line: keyLine, column: keyColumn });
      }

      skipWhitespace();

      if (advance() !== ":") {
        fail("Expected ':'");
      }

      skipWhitespace();

      const childPath = objectPath === "$" ? "$." + key : objectPath + "." + key;
      readValue(childPath);
      skipWhitespace();

      const separator = advance();

      if (separator === "}") {
        return;
      }

      if (separator !== ",") {
        fail("Expected ',' or '}'");
      }

      skipWhitespace();
    }
  }

  function readValue(path) {
    skipWhitespace();

    if (peek() === "{") {
      readObject(path);
      return;
    }

    if (peek() === "[") {
      readArray(path);
      return;
    }

    if (peek() === '"') {
      readString();
      return;
    }

    if (peek() === "-" || (peek() >= "0" && peek() <= "9")) {
      readNumber();
      return;
    }

    if (json.startsWith("true", index)) {
      readLiteral("true");
      return;
    }

    if (json.startsWith("false", index)) {
      readLiteral("false");
      return;
    }

    if (json.startsWith("null", index)) {
      readLiteral("null");
      return;
    }

    fail("Unexpected token");
  }

  readValue("$");
  skipWhitespace();

  if (index !== json.length) {
    fail("Unexpected trailing content");
  }

  return duplicates;
}

This is a minimal illustration of the right approach: inspect the raw JSON text before calling JSON.parse(). For production code, prefer a well-tested parser library over maintaining your own JSON parser.

Resolution Strategies That Actually Work

Once you have detected duplicates, the right fix depends on whether the duplicate is invalid input, legacy data, or a real modeling mistake.

Choose a Policy Deliberately

Strategy | Best for | Tradeoff
Reject the payload | APIs, configs, auth, billing, infrastructure | Safest option, but callers must fix the source data.
Keep the last value explicitly | Controlled migrations where later values are authoritative | Easy, but still discards information.
Keep the first value explicitly | Legacy importers that document first-win semantics | Still discards information and may surprise downstream systems.
Convert repeated values to an array | Cases where every value is meaningful | Requires a data-model change, but preserves information cleanly.
Rename the fields upstream | Bad source models such as address lines or repeated labels | Best long-term fix when the duplicate was really a schema problem.

Portable Alternatives to Duplicate Keys

// Ambiguous and non-portable
{
  "tag": "important",
  "tag": "urgent",
  "tag": "review"
}

// Better if one field should hold multiple values
{
  "tags": ["important", "urgent", "review"]
}

// Better if order and repeated names matter
[
  ["tag", "important"],
  ["tag", "urgent"],
  ["tag", "review"]
]

If you truly need repeated names, an array of pairs is the portable JSON representation. Plain objects are the wrong data structure for that job.
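The array-of-pairs form is also easy to fold back into a usable structure when consumers need it. A minimal sketch that groups repeated names while preserving order:

```javascript
const pairs = [
  ["tag", "important"],
  ["tag", "urgent"],
  ["tag", "review"],
];

// Group values by name, keeping source order within each group.
const grouped = new Map();

for (const [name, value] of pairs) {
  if (!grouped.has(name)) {
    grouped.set(name, []);
  }
  grouped.get(name).push(value);
}

console.log(grouped.get("tag")); // ["important", "urgent", "review"]
```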

When a Custom JavaScript Library Makes Sense

You only need a custom JavaScript library if the built-in behavior is too destructive for your use case. In practice, that means one of two things: you must preserve every duplicate occurrence, or you must report the exact duplicate locations back to a user or calling system.

If you evaluate a library for this problem, look for these capabilities: it preserves member order, exposes all object entries instead of collapsing them into a plain object, reports line and column information, and lets you choose a clear duplicate policy such as reject, keep first, keep last, or merge. If a library only returns ordinary objects, it cannot retain duplicate keys no matter what its README says.

Bottom Line

Duplicate keys are not something to normalize away casually. They are a signal that the JSON source is ambiguous, and normal JavaScript parsing will usually hide that ambiguity by keeping only one value. Detect duplicates before parsing, reject them by default for important data, and remodel the data as arrays or distinct fields when you need to keep every value.
