Validating JSON Files in CI/CD Workflows
If a broken JSON file can block deploys, break application startup, or corrupt config at runtime, validate it before the expensive parts of your pipeline run. In practice that means two layers: a fast syntax check for every tracked .json file, and schema validation for files whose structure matters.
For most teams, the most portable CI command is python3 -m json.tool. If your runner already has jq, jq . file.json > /dev/null is equally good for syntax checks. Use JSON Schema only where you need to enforce required fields, value ranges, enums, or disallow unknown keys.
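Both checks communicate through exit codes, which is all CI needs. A quick local sanity check of that behavior (the file names here are throwaway examples):

```shell
# python3 -m json.tool exits 0 on valid JSON and nonzero on a parse error,
# so it can gate a CI job directly.
printf '{"ok": true}' > good.json
printf '{"ok": true,}' > bad.json    # trailing comma: invalid strict JSON

python3 -m json.tool good.json > /dev/null && echo "good.json parses"
python3 -m json.tool bad.json > /dev/null 2>&1 || echo "bad.json rejected"
```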
Quickest Reliable Checks
- Portable syntax validation with Python: This works on many CI images without installing extra packages.

python3 -m json.tool config/app.json > /dev/null

Python 3.14 also supports python -m json, but json.tool is the safer choice across older runners and container images.

- Fast syntax validation with jq: Good when your job already installs or bundles jq.

jq . config/app.json > /dev/null

Avoid jq -e . for pure syntax checks. It can fail on valid JSON whose top-level value is false or null.

- Schema validation with Ajv: Use this when validity depends on required keys, types, or allowed values, not just parseability.

npx ajv-cli validate \
  --spec=draft2020 \
  -s ci/config.schema.json \
  -d ci/config.json \
  --errors=text
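The jq caveat is easy to demonstrate (assuming jq is installed; flag.json is an illustrative name):

```shell
# -e maps a top-level false/null result to exit code 1, even though the
# input is perfectly valid JSON. Plain `jq .` only fails on parse errors.
printf 'false' > flag.json

jq . flag.json > /dev/null; echo "jq exit: $?"        # 0: the file parses
jq -e . flag.json > /dev/null; echo "jq -e exit: $?"  # 1: -e reacts to the value
```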
A Good Default for Repositories
The simplest reliable pattern is to validate tracked JSON files, not every file a recursive find happens to discover in build output, dependencies, or caches. That makes CI quieter and avoids false failures from generated directories like node_modules or .next.
set -euo pipefail

git ls-files -z -- '*.json' |
while IFS= read -r -d '' file; do
  case "$file" in
    package-lock.json|*/package-lock.json)
      continue
      ;;
  esac
  echo "Validating $file"
  python3 -m json.tool "$file" >/dev/null
done

Skip generated files only if they are not part of the contract you care about. If your deployment actually depends on a tracked generated JSON file, validate it too.
Example: GitHub Actions
For GitHub Actions, put JSON validation near the top of the job so bad config fails fast. The example below uses actions/checkout@v6 and relies on the Python already available on GitHub-hosted Ubuntu runners.
name: validate-json

on:
  pull_request:
  push:
    branches: [main]

jobs:
  validate-json:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - name: Validate tracked JSON files
        run: |
          set -euo pipefail
          git ls-files -z -- '*.json' |
          while IFS= read -r -d '' file; do
            case "$file" in
              package-lock.json|*/package-lock.json)
                continue
                ;;
            esac
            echo "Validating $file"
            python3 -m json.tool "$file" >/dev/null
          done

If the repository already installs Node later in the workflow, add a separate schema-validation step with npx ajv-cli validate for the files that need stronger guarantees than syntax alone.
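Such a step might look like this sketch (the step name is illustrative, the schema and data paths match the Ajv example in this article, and a Node setup step is assumed to run earlier in the job):

```yaml
      - name: Validate config against schema
        run: |
          npx ajv-cli validate \
            --spec=draft2020 \
            -s ci/config.schema.json \
            -d ci/config.json \
            --errors=text
```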
Example: GitLab CI
In GitLab CI, a lightweight Python image is enough for syntax checks, and rules: changes lets you skip the job when JSON files are untouched.
stages:
  - validate

validate_json:
  stage: validate
  image: python:3-slim
  script:
    - |
      set -e
      find . \
        -path './.git' -prune -o \
        -path './node_modules' -prune -o \
        -path './dist' -prune -o \
        -name '*.json' ! -name 'package-lock.json' -print |
      while IFS= read -r file; do
        echo "Validating $file"
        python -m json.tool "$file" >/dev/null
      done
  rules:
    - changes:
        - "**/*.json"
        - ".gitlab-ci.yml"

One GitLab caveat: rules: changes can still evaluate to true for new branches and some non-push pipelines. If you need stricter behavior there, switch to rules: changes: compare_to.
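With compare_to, the rules block might become something like this sketch (assuming your default branch is main and a GitLab version recent enough to support rules:changes:compare_to):

```yaml
validate_json:
  rules:
    - changes:
        paths:
          - "**/*.json"
          - ".gitlab-ci.yml"
        compare_to: "refs/heads/main"
```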
When Syntax Checks Are Not Enough
Syntax validation answers only one question: "Can a parser read this file?" It does not tell you whether the file has the right keys, acceptable values, or safe defaults. For CI configuration, application settings, API fixtures, and seed data, add a JSON Schema and fail the build when the data stops matching the contract.
Example schema:
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "required": ["environment", "timeoutSeconds"],
  "properties": {
    "environment": {
      "enum": ["dev", "staging", "prod"]
    },
    "timeoutSeconds": {
      "type": "integer",
      "minimum": 1
    }
  },
  "additionalProperties": false
}

Validation command:
npx ajv-cli validate \
  --spec=draft2020 \
  -s ci/config.schema.json \
  -d ci/config.json \
  --errors=text

If you keep schemas in the repo, validate the schema files too. A bad schema can make every downstream data check meaningless.
Validating Only Changed JSON Files
Large monorepos sometimes validate only changed files to keep feedback fast. That is reasonable for syntax checks, but full-repo validation is still safer when one JSON file can affect many services or environments.
git diff --name-only --diff-filter=ACMRT "$BASE_SHA" "$HEAD_SHA" -- '*.json' |
while IFS= read -r file; do
  [ -n "$file" ] || continue
  python3 -m json.tool "$file" >/dev/null
done

Set BASE_SHA and HEAD_SHA from your CI platform, or compute them in a previous step. The exact variables differ between GitHub Actions, GitLab CI, Jenkins, and other systems.
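As a local sketch, the range can be derived with git merge-base (assuming the base branch is available as main; CI jobs usually compare against origin/<default-branch> instead):

```shell
# Find the fork point between the base branch and the current HEAD, then
# validate only JSON files added or modified in that range.
BASE_SHA="$(git merge-base main HEAD)"
HEAD_SHA="$(git rev-parse HEAD)"

git diff --name-only --diff-filter=ACMRT "$BASE_SHA" "$HEAD_SHA" -- '*.json' |
while IFS= read -r file; do
  [ -n "$file" ] || continue
  echo "Validating $file"
  python3 -m json.tool "$file" >/dev/null
done
```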
Common Failure Modes
- Comments or trailing commas: Many tools call these files JSON, but standard validators do not. If the file is really JSONC, parse it with a JSONC-aware tool instead of strict JSON tooling.
- Generated files causing noise: Lockfiles and build artifacts can dominate failures. Skip them deliberately, not accidentally.
- Schema drift: A syntax-valid file can still break production if required properties were renamed or removed. That is the gap schema validation closes.
- Using formatting as a proxy for validation: A formatter helps humans read the file, but CI should rely on exit codes from a parser or schema validator.
Conclusion
The best CI/CD setup for JSON is usually simple: parse every tracked JSON file early, validate critical files against schemas, and scope the checks so generated noise does not hide real failures. The payoff is faster feedback and fewer config bugs escaping into deploys.