Need help with your JSON?
Try our JSON Formatter tool to automatically identify and fix syntax errors in your JSON.
Ansible Playbook JSON Configuration Strategies
The short version: keep your playbook data as normal YAML for as long as possible, and only convert to JSON at the boundary where another system actually needs a JSON string or file. In modern Ansible, most JSON work falls into four buckets: sending API payloads, loading external configuration files, parsing command output, and querying nested results.
That distinction matters because Ansible already handles some JSON cases for you. Current Ansible docs note that ansible.builtin.from_json is mainly for contexts where automatic conversion did not happen, while ansible.builtin.uri can send JSON bodies directly and places JSON API responses in a json key when the server returns application/json. A practical playbook is usually simpler when you lean on those behaviors instead of serializing and parsing everything manually.
Choose the Right Strategy
- Static configuration in the repo: write it as native YAML variables, then render JSON at the last moment with to_json.
- REST APIs and webhooks: prefer ansible.builtin.uri with body_format: json instead of building raw JSON strings by hand.
- Large external config files: use ansible.builtin.include_vars to load .json files, especially when you want namespacing or conditional loading.
- CLI output that is a JSON string: parse it with ansible.builtin.from_json.
- Deep filtering across nested arrays: use community.general.json_query when plain Jinja filters stop being readable.
Keep Data Native Until the JSON Boundary
Ansible playbooks are YAML-first. That is a strength, not a limitation. YAML is easier to review in Git, safer to edit, and already maps cleanly to the dictionaries and lists that JSON expects. For repo-managed configuration, define structured variables in YAML and serialize only when a file, API, or command needs actual JSON text.
Example: author in YAML, emit deterministic JSON
---
- name: Render an application config as JSON
  hosts: app
  gather_facts: false
  vars:
    app_config:
      service:
        name: catalog
        enabled: true
        listen_port: 8443
        features:
          audit: true
          cache: true
        allowed_origins:
          - https://app.example.com
          - https://admin.example.com
        owner_name: "Zoë"
  tasks:
    - name: Write config.json
      ansible.builtin.copy:
        dest: /etc/catalog/config.json
        mode: "0644"
        content: "{{ app_config | to_json(indent=2, sort_keys=True, ensure_ascii=False) }}"
This pattern avoids hand-maintaining a long JSON blob inside YAML. It also gives you stable output for diffs with sort_keys=True and keeps non-ASCII text readable with ensure_ascii=False.
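Because the to_json filter passes keyword arguments like indent, sort_keys, and ensure_ascii through to Python's json.dumps, you can preview the exact text a task will write. A small standalone sketch, using a trimmed version of the app_config above:

```python
import json

# Trimmed version of the playbook's app_config variable
app_config = {
    "service": {
        "name": "catalog",
        "owner_name": "Zoë",
        "enabled": True,
    }
}

# sort_keys=True yields a stable key order, so repeated runs diff cleanly;
# ensure_ascii=False keeps "Zoë" readable instead of escaping it as "Zo\u00eb"
rendered = json.dumps(app_config, indent=2, sort_keys=True, ensure_ascii=False)
print(rendered)
```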
Send JSON to APIs with uri, Not Manual String Building
A common mistake is turning a dictionary into a JSON string too early and then passing that string around the playbook. For HTTP APIs, current Ansible guidance is simpler: hand a normal data structure to ansible.builtin.uri and set body_format: json. The module handles the JSON encoding, and if the response is JSON it exposes parsed content in result.json.
Example: post a JSON payload and use the parsed response
---
- name: Create a remote resource
  hosts: localhost
  gather_facts: false
  vars:
    payload:
      name: "edge-cache"
      enabled: true
      limits:
        requests_per_minute: 1000
  tasks:
    - name: POST JSON to the API
      ansible.builtin.uri:
        url: https://api.example.com/v1/services
        method: POST
        body_format: json
        body: "{{ payload }}"
        headers:
          Authorization: "Bearer {{ api_token }}"
        status_code: [200, 201]
      register: api_result

    - name: Show the returned id
      ansible.builtin.debug:
        msg: "Created service id {{ api_result.json.id }}"
Only serialize manually when an API explicitly requires a raw JSON string or when you must sign the exact bytes being sent. Otherwise, using body_format: json is clearer and less error-prone.
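When you genuinely need the raw string, a minimal sketch looks like the following. The URL and payload variable are placeholders, and because body_format is left at its raw default, the Content-Type header must be set by hand:

```yaml
- name: POST a pre-serialized JSON string (only when the exact bytes matter)
  ansible.builtin.uri:
    url: https://api.example.com/v1/services
    method: POST
    headers:
      Content-Type: application/json
    body: "{{ payload | to_json }}"
  register: raw_result
```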
Load External JSON Files with include_vars
Both vars_files and include_vars can load JSON, but they are not equally useful for real projects. vars_files dumps keys straight into the play scope and is best for simple, static files. ansible.builtin.include_vars is usually the better JSON configuration strategy because you can load conditionally, namespace the result with name, or load a directory of files.
Example: keep imported JSON under one variable
---
- name: Load a JSON settings file
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Load vars/app.json into app_settings
      ansible.builtin.include_vars:
        file: vars/app.json
        name: app_settings

    - name: Read one field from the imported document
      ansible.builtin.debug:
        msg: "Log level is {{ app_settings.logging.level }}"
The current module docs also note that directory loads happen in alphabetical order, which is useful if you split JSON fragments into numbered files such as 10-base.json and 20-prod.json.
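A sketch of that directory pattern, assuming the fragments live under a hypothetical vars/fragments/ path; the extensions list restricts the load to JSON files:

```yaml
- name: Load all JSON fragments in alphabetical order
  ansible.builtin.include_vars:
    dir: vars/fragments
    extensions:
      - json
    name: merged_settings
```

With files named 10-base.json and 20-prod.json, keys from the later file override the earlier one, which gives you a simple layering scheme without extra tooling.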
Use from_json Only When the Input Is Still a String
ansible.builtin.from_json is still important, but mostly for output coming from shell commands, external tools, templated files, or odd modules that return JSON text in a string field. If you already have a normal Ansible dictionary or list, parsing it again is unnecessary and sometimes harmful.
Example: parse command stdout once, then work with structured data
---
- name: Parse JSON emitted by a CLI tool
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Export current state as JSON
      ansible.builtin.command: mytool export --format json
      register: raw_export
      changed_when: false

    - name: Convert stdout into a normal Ansible variable
      ansible.builtin.set_fact:
        export_doc: "{{ raw_export.stdout | from_json }}"

    - name: Use the parsed object
      ansible.builtin.debug:
        msg: "First project is {{ export_doc.projects[0].name }}"
Good rule: if the value came from stdout, a template, or a text file, you probably need from_json. If it came from include_vars, uri.json, or normal playbook variables, you probably do not.
Query Nested JSON Without Making the Playbook Opaque
For straightforward selections, standard Jinja filters are often easier to read than a JMESPath expression. Reach for selectattr, map, items2dict, and dict2items first. Use community.general.json_query when the data is deeply nested or the selection logic is naturally expressed as a JMESPath query.
Example: simple Jinja filtering vs. JMESPath
---
- name: Extract data from nested results
  hosts: localhost
  gather_facts: false
  vars:
    services:
      - name: api
        state: enabled
        port: 8443
      - name: worker
        state: disabled
        port: 9000
      - name: ui
        state: enabled
        port: 8080
  tasks:
    - name: Prefer plain Jinja when the intent is obvious
      ansible.builtin.debug:
        msg: "{{ services | selectattr('state', 'equalto', 'enabled') | map(attribute='port') | list }}"

    - name: Use JMESPath when the query gets more complex
      ansible.builtin.debug:
        msg: "{{ services | community.general.json_query('[?state==`enabled`].name') }}"
One current caveat matters here: json_query lives in community.general, not in ansible-core, and the controller needs the jmespath Python dependency installed (ansible-galaxy collection install community.general and pip install jmespath). If that dependency is missing, the filter fails before your playbook logic runs.
Common Mistakes and Safer Defaults
- Do not double-encode API payloads: avoid combining body_format: json with a pre-serialized body: "{{ payload | to_json }}". That often sends a JSON string instead of the object the API expects.
- Prefer namespaced imports: loading raw JSON keys into top-level play scope makes collisions more likely. include_vars with name keeps the source obvious.
- Keep booleans and nulls as data, not strings: write YAML values like true, false, and null unless the target system literally expects the quoted text.
- Inspect return shapes before parsing: a quick debug: var=result tells you whether a module returned a string, a dictionary, or a nested json field.
- Fail early for missing keys: when a JSON document is required, assert the fields you need before writing config files or calling later tasks.
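The last point can be sketched with ansible.builtin.assert; the app_settings variable and its logging.level field are carried over from the include_vars example earlier:

```yaml
- name: Ensure required settings exist before using them
  ansible.builtin.assert:
    that:
      - app_settings.logging is defined
      - app_settings.logging.level is defined
    fail_msg: "vars/app.json is missing the logging.level setting"
```

Failing here, with a message naming the missing field, is far easier to debug than a templating error three tasks later.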
Conclusion
The best Ansible JSON configuration strategy is usually the least magical one: keep your data as structured YAML, let Ansible modules do automatic JSON handling where they can, and reserve explicit parsing or serialization for the edges of the workflow. That keeps playbooks easier to read and avoids the two biggest sources of bugs here: stringly-typed payloads and double conversion.
If you remember only one rule, make it this one: JSON is usually an output or integration format in Ansible, not the format you should optimize your whole playbook around.