Jenkins Pipeline JSON Configuration Techniques

Jenkins Pipelines, whether written in Declarative or Scripted Groovy syntax, are powerful tools for automating build, test, and deployment workflows. While the pipeline definition itself is written in Groovy, there are many scenarios where you need to work with configuration or data stored in JSON format. This article explores common techniques for incorporating and manipulating JSON data within your Jenkins Pipelines.

It's important to clarify that you don't write the *entire pipeline definition* in JSON. Instead, you use JSON as a format for data or configuration that your Groovy pipeline script then processes and acts upon.

Why Use JSON in Jenkins Pipelines?

JSON is a ubiquitous data interchange format, widely used for:

  • API Payloads: Interacting with REST APIs to fetch data or send configuration.
  • Configuration Files: Reading application or job-specific settings from a structured file.
  • Pipeline Parameters: Accepting complex input parameters to a job as a JSON string.
  • Shared Library Data: Defining configurable behaviors for reusable pipeline code.
  • Logging and Reporting: Generating structured output for downstream processing.

Core Techniques for Handling JSON

Using Groovy's Built-in JSON Support

The most common and recommended way to handle JSON directly within your Groovy pipeline script (both Declarative script blocks and Scripted pipelines) is using the built-in libraries available in Jenkins, primarily from the groovy.json package. The key classes are JsonSlurper for parsing JSON strings into Groovy/Java objects (Maps, Lists, primitives) and JsonOutput for converting Groovy/Java objects into JSON strings.

Example: Parsing JSON String

pipeline {
    agent any
    stages {
        stage('Parse JSON') {
            steps {
                script {
                    def jsonString = '''
                    {
                      "name": "Jenkins Job",
                      "version": 1.5,
                      "enabled": true,
                      "tags": ["build", "deploy"],
                      "config": {
                        "timeoutSec": 300
                      }
                    }
                    '''

                    // Use JsonSlurper to parse the string
                    def slurper = new groovy.json.JsonSlurper()
                    def jsonObject = slurper.parseText(jsonString)

                    // Access parsed data
                    echo "Job Name: {jsonObject.name}"
                    echo "First Tag: {jsonObject.tags[0]}"
                    echo "Timeout: {jsonObject.config.timeoutSec} seconds"

                    // Check types (use instanceof: calling .class on a Map looks up
                    // a 'class' key instead of returning the object's class)
                    echo "Parsed data is a Map: ${jsonObject instanceof Map}"
                    echo "Tags is a List: ${jsonObject.tags instanceof List}"
                }
            }
        }
    }
}

Example: Generating JSON String

pipeline {
    agent any
    stages {
        stage('Generate JSON') {
            steps {
                script {
                    def data = [
                        status: "success",
                        timestamp: System.currentTimeMillis(),
                        result: [
                            buildNumber: env.BUILD_NUMBER,
                            jobName: env.JOB_NAME
                        ]
                    ]

                    // Use JsonOutput to generate a JSON string
                    def jsonString = groovy.json.JsonOutput.toJson(data)
                    def prettyJsonString = groovy.json.JsonOutput.prettyPrint(jsonString)

                    echo "Generated JSON:"
                    echo prettyJsonString

                    // This JSON string can then be sent to an API or saved to a file
                    // Example (conceptual):
                    // sh "curl -X POST -H 'Content-Type: application/json' -d '{jsonString}' http://your-api.com/report"
                }
            }
        }
    }
}

Reading JSON from Files

Often, configuration data is stored in a JSON file within your source code repository. You can read these files and parse their content within your pipeline.

Example: Reading and Parsing a JSON File

Assumes you have a file named config.json in your workspace.

pipeline {
    agent any
    stages {
        stage('Read JSON File') {
            steps {
                script {
                    // Ensure you have a config.json file in your workspace
                    // e.g., echo '{"database": {"host": "localhost", "port": 5432}, "api_key": "abcdef123"}' > config.json

                    def configFile = 'config.json'

                    if (fileExists(configFile)) {
                        def jsonText = readFile(configFile)
                        def slurper = new groovy.json.JsonSlurper()
                        def config = slurper.parseText(jsonText)

                        echo "Database Host: {config.database.host}"
                        // Note: Be careful with sensitive data like api_key!
                        // Store secrets in Jenkins Credentials, not in files committed to Git.
                        echo "API Key (caution!): {config.api_key}"
                    } else {
                        error "Configuration file {configFile} not found!"
                    }
                }
            }
        }
    }
}

Security Note: Never store sensitive information like API keys, passwords, or database credentials directly in JSON files committed to source control. Use Jenkins Credentials Provider to securely manage secrets and inject them into your pipeline environment variables or script context at runtime.
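
For example, here is a minimal sketch of injecting a secret at runtime with the Credentials Binding plugin (the credential ID my-api-key and the reporting URL are placeholders for illustration):

pipeline {
    agent any
    stages {
        stage('Use Secret') {
            steps {
                // Assumes a "Secret text" credential with ID 'my-api-key' exists in Jenkins
                withCredentials([string(credentialsId: 'my-api-key', variable: 'API_KEY')]) {
                    // Single-quoted shell string: the shell expands $API_KEY (Groovy does not),
                    // and the value is masked in the build log
                    sh 'curl -s -H "Authorization: Bearer $API_KEY" https://your-api.example.com/report'
                }
            }
        }
    }
}

Keeping the secret out of Groovy string interpolation avoids both accidental log exposure and the "insecure interpolation" warning Jenkins prints for interpolated credentials.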

Using JSON for Pipeline Parameters

For complex job inputs, defining a single String parameter and requiring users to input a JSON string is a common pattern. You can then parse this string inside the pipeline.

Example: JSON Input Parameter

pipeline {
    agent any
    parameters {
        string(
            name: 'JOB_CONFIG_JSON',
            defaultValue: '{"environment": "dev", "featureFlags": {"newUI": false, "betaMode": true}}',
            description: 'JSON configuration for the job execution'
        )
    }
    stages {
        stage('Process Parameters') {
            steps {
                script {
                    def configJsonString = params.JOB_CONFIG_JSON
                    def slurper = new groovy.json.JsonSlurper()

                    try {
                        def jobConfig = slurper.parseText(configJsonString)

                        echo "Running in environment: {jobConfig.environment}"
                        echo "New UI flag is: {jobConfig.featureFlags.newUI}"

                        if (jobConfig.featureFlags.betaMode) {
                            echo "Beta mode is enabled."
                            // Add logic for beta mode
                        } else {
                            echo "Beta mode is disabled."
                        }

                    } catch (Exception e) {
                        error "Failed to parse JOB_CONFIG_JSON parameter: {e.getMessage()}"
                    }
                }
            }
        }
    }
}

Using Shell Steps with JSON Tools (jq, Python, etc.)

For more complex JSON transformations, validation, or querying, leveraging command-line tools like jq or scripting languages like Python within sh or bat steps can be highly effective. These tools are purpose-built for querying and transforming JSON, express complex filters more concisely than equivalent Groovy code, and run on the agent rather than consuming resources on the controller.

Example: Using jq to Extract Data

Requires jq to be installed on the agent.

pipeline {
    agent any
    stages {
        stage('Process with jq') {
            steps {
                script {
                    def jsonString = '''
                    {
                      "users": [
                        { "name": "Alice", "role": "admin" },
                        { "name": "Bob", "role": "editor" },
                        { "name": "Charlie", "role": "viewer" }
                      ]
                    }
                    '''

                    // Use jq to extract names of users with role 'admin'
                    // Pass the JSON string to jq's standard input
                    def adminName = sh(
                        script: """echo '${jsonString}' | jq -r '.users[] | select(.role == "admin") | .name'""",
                        returnStdout: true
                    ).trim()

                    echo "Admin user found: ${adminName}"
                }
            }
        }
    }
}

Example: Using Python for Complex Logic

Requires Python to be installed on the agent.

pipeline {
    agent any
    stages {
        stage('Process with Python') {
            steps {
                script {
                    def jsonString = '''
                    {
                      "items": [
                        { "id": 1, "value": 10 },
                        { "id": 2, "value": 25 },
                        { "id": 3, "value": 15 }
                      ]
                    }
                    '''

                    // Use Python to calculate the sum of values > 10
                    def pythonScript = '''
import json
import sys

data = json.loads(sys.stdin.read())
total_sum = sum(item['value'] for item in data['items'] if item['value'] > 10)
print(total_sum)
'''

                    // Write the script to a file to avoid shell-quoting issues,
                    // then pipe the JSON string to it via standard input
                    writeFile file: 'sum_values.py', text: pythonScript
                    def sumResult = sh(
                        script: """echo '${jsonString}' | python sum_values.py""",
                        returnStdout: true
                    ).trim()

                    echo "Sum of values > 10: ${sumResult}"
                }
            }
        }
    }
}

Best Practices

  • Handle Errors: Always wrap JSON parsing logic in try-catch blocks to gracefully handle malformed JSON input.
  • Validate JSON: If the JSON source is external or user-provided, consider adding validation steps (e.g., checking for required keys or data types) before proceeding; a short sketch follows this list.
  • Use Credentials for Secrets: As mentioned, sensitive data in JSON should be avoided in source control. Use Jenkins Credentials.
  • Keep it Readable: Use JsonOutput.prettyPrint() when logging JSON for debugging.
  • Choose the Right Tool: Use Groovy's JsonSlurper/JsonOutput for simple parsing/generation. Use sh with tools like jq or scripting languages for complex queries, transformations, or validation.
  • Size Matters: Be mindful of parsing very large JSON files directly in Groovy, since pipeline Groovy executes on the Jenkins controller and the parsed objects are held in its memory. For massive files, stream processing or external tools running on the agent are usually more efficient.
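
As a minimal sketch of such validation, reusing the JOB_CONFIG_JSON parameter from the earlier example (the required keys shown are only an illustration):

script {
    def config
    try {
        config = new groovy.json.JsonSlurper().parseText(params.JOB_CONFIG_JSON)
    } catch (Exception e) {
        error "JOB_CONFIG_JSON is not valid JSON: ${e.getMessage()}"
    }

    // Check required keys and basic types before acting on the data
    def missing = ['environment', 'featureFlags'].findAll { !config.containsKey(it) }
    if (missing) {
        error "JOB_CONFIG_JSON is missing required keys: ${missing.join(', ')}"
    }
    if (!(config.featureFlags instanceof Map)) {
        error "featureFlags must be a JSON object"
    }
    echo "Configuration validated for environment: ${config.environment}"
}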

Conclusion

While Jenkins Pipeline syntax is Groovy-based, integrating JSON data is a frequent requirement for interacting with external services, managing configurations, or handling complex parameters. By leveraging Groovy's built-in JSON libraries and external command-line tools, you can effectively incorporate JSON data into your Jenkins workflows, making your pipelines more flexible and powerful. Understanding these techniques allows you to seamlessly connect your automation with the wider ecosystem of tools and services that rely on JSON.
