Protecting Against Prototype Pollution in JSON Parsing
Short answer: untrusted JSON does not normally pollute prototypes by itself. The real risk starts after parsing, when code copies, merges, or writes attacker-controlled keys into normal JavaScript objects.
That distinction matters because many developers test a payload with JSON.parse(), see nothing happen, and assume the whole issue is overblown. In practice, the dangerous sink is usually a later step such as Object.assign(), an unsafe deep merge, or a dynamic path setter.
Quick Answer
- `JSON.parse()` treats `"__proto__"` as data.
- The bug appears when parsed data is merged into ordinary objects or used as dynamic property paths.
- Block `__proto__`, `constructor`, and `prototype` before recursive assignment.
- Prefer schema validation, own-key iteration, object spread for shallow copies, and null-prototype objects for dictionary-like data.
What `JSON.parse()` Actually Does
When JSON.parse() encounters keys such as "__proto__", "constructor", or "prototype", it creates normal own properties on the returned object. It does not walk the prototype chain and it does not modify Object.prototype.
Safe Parse Example
```javascript
const payload = JSON.parse(
  '{"theme":"dark","__proto__":{"isAdmin":true}}'
);

console.log(Object.prototype.hasOwnProperty.call(payload, "__proto__")); // true
console.log(payload.theme); // "dark"
console.log(({}).isAdmin); // undefined
```
If nothing bad happens during this test, that is expected. The payload becomes dangerous only when later code handles it unsafely.
Where Prototype Pollution Actually Starts
The issue appears when an application takes parsed JSON and feeds it into code that assigns properties recursively or trusts user-controlled keys. Common sinks include:
- Shallow merges with `Object.assign()` or similar helper functions.
- Custom deep merge or clone utilities.
- Path-based setters such as writing to `obj[key1][key2][key3]`.
- `for...in` loops that copy properties without own-property checks.
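A minimal sketch of the last sink (the `naiveCopy` helper name is illustrative): a `for...in` copy loop happily writes the parsed `"__proto__"` key through the target's legacy setter.

```javascript
// VULNERABLE PATTERN (illustrative): for...in copy without key filtering
function naiveCopy(source) {
  const dest = {};
  for (const key in source) {
    dest[key] = source[key]; // assigning "__proto__" triggers the legacy setter on dest
  }
  return dest;
}

const parsedPayload = JSON.parse('{"mode":"cors","__proto__":{"isAdmin":true}}');
const copied = naiveCopy(parsedPayload);

console.log(copied.mode); // "cors"
console.log(copied.isAdmin); // true (inherited from the replaced prototype)
console.log(({}).isAdmin); // undefined (Object.prototype itself is untouched)
```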
Unsafe Shallow Merge
```javascript
const payload = JSON.parse('{"__proto__":{"isAdmin":true}}');
const options = Object.assign({ mode: "cors" }, payload);

console.log(options.mode); // "cors"
console.log(options.isAdmin); // true
console.log(Object.prototype.hasOwnProperty.call(options, "__proto__")); // false
```
In this case the target object's prototype is changed, even though Object.prototype is not. That may still be enough to bypass checks that rely on inherited defaults or option flags.
Unsafe Dynamic Path Assignment
```javascript
// VULNERABLE PATTERN
function setPath(target, path, value) {
  let current = target;
  for (let i = 0; i < path.length - 1; i += 1) {
    const key = path[i];
    if (current[key] == null) {
      current[key] = {};
    }
    current = current[key];
  }
  current[path[path.length - 1]] = value;
}

const result = {};
setPath(result, ["constructor", "prototype", "polluted"], "PWNED");

console.log(({}).polluted); // "PWNED"
delete Object.prototype.polluted; // cleanup
```
Parsed JSON often supplies the path segments or nested keys that make this kind of bug reachable.
Practical Defenses That Work
1. Validate Shape at the Boundary
Validate incoming JSON before it reaches merge logic. The important goal is to reject unexpected keys and types, not just to coerce values. For configuration-like payloads, allow only known properties and nested structures.
Schema Mindset
```javascript
const allowedTopLevelKeys = new Set(["theme", "language", "features"]);

for (const key of Object.keys(parsedInput)) {
  if (!allowedTopLevelKeys.has(key)) {
    throw new Error(`Unexpected key: ${key}`);
  }
}
```
JSON Schema, Ajv, Zod, and similar validators are useful here when configured to reject unknown fields.
2. Explicitly Block Dangerous Keys
If your code must walk arbitrary nested objects, reject __proto__, constructor, and prototype anywhere in the structure before assigning into a target object.
Recursive Sanitizer
```javascript
const blockedKeys = new Set(["__proto__", "constructor", "prototype"]);

function sanitize(value) {
  if (Array.isArray(value)) {
    return value.map(sanitize);
  }
  if (!value || typeof value !== "object") {
    return value;
  }
  const clean = Object.create(null);
  for (const key of Object.keys(value)) {
    if (blockedKeys.has(key)) {
      continue;
    }
    clean[key] = sanitize(value[key]);
  }
  return clean;
}
```
Returning a null-prototype object makes the sanitized result safer for dictionary-style usage.
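The same blocklist can harden the path setter shown earlier. This is a sketch (the `setPathSafe` name is illustrative), assuming paths arrive as arrays of string segments:

```javascript
const unsafeKeys = new Set(["__proto__", "constructor", "prototype"]);

// Hardened variant of setPath: rejects dangerous segments outright and
// creates null-prototype intermediate nodes instead of plain objects.
function setPathSafe(target, path, value) {
  let current = target;
  for (let i = 0; i < path.length - 1; i += 1) {
    const key = path[i];
    if (unsafeKeys.has(key)) {
      throw new Error(`Refusing to write through key: ${key}`);
    }
    if (!Object.prototype.hasOwnProperty.call(current, key) ||
        typeof current[key] !== "object" || current[key] === null) {
      current[key] = Object.create(null);
    }
    current = current[key];
  }
  const last = path[path.length - 1];
  if (unsafeKeys.has(last)) {
    throw new Error(`Refusing to write through key: ${last}`);
  }
  current[last] = value;
}

const settings = {};
setPathSafe(settings, ["features", "darkMode"], true);
console.log(settings.features.darkMode); // true
```

With this guard in place, the `["constructor", "prototype", "polluted"]` payload from the vulnerable example throws instead of reaching Object.prototype.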
3. Prefer Safe Copy Patterns
For shallow copies of untrusted objects, object spread is usually safer than Object.assign() because it defines an own property instead of triggering the legacy __proto__ setter on the target object.
Safer Shallow Copy
```javascript
const payload = JSON.parse('{"__proto__":{"test":"value"},"mode":"cors"}');

const safeCopy = { ...payload };
const safeDict = Object.assign(Object.create(null), payload);

console.log(Object.prototype.hasOwnProperty.call(safeCopy, "__proto__")); // true
console.log(safeCopy.mode); // "cors"
console.log(safeDict.__proto__); // { test: "value" } -- an own data property, not the prototype
```
4. Iterate Over Own Keys Only
Avoid copying untrusted objects with for...in unless you also guard with Object.prototype.hasOwnProperty.call(). Prefer Object.keys(), Object.entries(), or a validator that gives you an already-trusted shape.
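For example, a copy loop built on Object.entries() only ever sees own keys, and explicitly skipping "__proto__" keeps the assignment from reaching the legacy setter (a minimal sketch):

```javascript
const parsed = JSON.parse('{"theme":"dark","__proto__":{"isAdmin":true}}');

const copy = {};
for (const [key, value] of Object.entries(parsed)) {
  if (key === "__proto__") {
    continue; // never assign through the legacy setter
  }
  copy[key] = value;
}

console.log(copy.theme); // "dark"
console.log(copy.isAdmin); // undefined
```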
5. Use Better Containers for Dictionary Data
If you are storing arbitrary user-defined keys, a plain object is often the wrong container. A Map or a null-prototype object created with Object.create(null) avoids the default prototype baggage of normal objects.
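A quick sketch of both containers, assuming a user-supplied role mapping as input:

```javascript
const input = JSON.parse('{"alice":"admin","__proto__":{"isAdmin":true}}');

// A Map treats every key, including "__proto__", as plain data.
const roles = new Map(Object.entries(input));
console.log(roles.get("alice")); // "admin"
console.log(roles.get("__proto__")); // { isAdmin: true } -- just a stored value

// A null-prototype object has no inherited __proto__ setter to trigger.
const dict = Object.create(null);
for (const [key, value] of Object.entries(input)) {
  dict[key] = value; // "__proto__" becomes an ordinary own property
}
console.log(dict.alice); // "admin"
console.log(({}).isAdmin); // undefined -- nothing was polluted
```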
6. Add Runtime Hardening in Node.js
In Node.js, the --disable-proto=delete and --disable-proto=throw CLI modes can reduce exposure to the legacy Object.prototype.__proto__ accessor. This is useful hardening, but it is not a full substitute for input validation because constructor.prototype style attacks can still exist in buggy code.
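When the CLI flags are unavailable, a comparable partial mitigation is to delete the legacy accessor at startup. This is a sketch of application-level hardening, not a replacement for the flags or for input validation:

```javascript
// Approximates --disable-proto=delete from inside the application.
// Like the flag, this does NOT stop constructor.prototype-style attacks.
delete Object.prototype.__proto__;

const obj = {};
obj.__proto__ = { isAdmin: true }; // now just creates an own data property

console.log(obj.isAdmin); // undefined -- the prototype chain is unchanged
console.log(Object.prototype.hasOwnProperty.call(obj, "__proto__")); // true
```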
Why Reproductions Often Confuse People
Prototype pollution bugs are easy to misunderstand because different sinks have different effects:
- `JSON.parse()` alone usually looks harmless.
- `Object.assign()` can mutate the prototype of one target object.
- Recursive path setters and unsafe deep merges can escalate into changes on `Object.prototype`.
- The visible impact may show up later, when authorization, templating, fetch options, or config defaults read inherited properties.
Key Takeaways
- `JSON.parse()` is usually not the vulnerable step.
- The dangerous part is what your code does with the parsed object afterward.
- Block dangerous keys, validate schema, and avoid unsafe merge or path-writing helpers.
- Prefer object spread, null-prototype objects, and own-key iteration when handling untrusted data.
- In Node.js, `--disable-proto` can add useful defense-in-depth.
If your application accepts uploaded settings, API request bodies, imported JSON files, or feature-flag payloads, treat prototype pollution as a post-parse handling problem. Tight validation and careful object manipulation matter more than the parser itself.