How to parse an XML response in Ansible?

I'm running the panos_op ansible module and struggling to parse the output.
ok: [localhost] => {
    "result": {
        "changed": true,
        "failed": false,
        "msg": "Done",
        "stdout": "{\"response\": {\"#status\": \"success\", \"result\": \"no\"}}",
        "stdout_lines": [
            "{\"response\": {\"#status\": \"success\", \"result\": \"no\"}}"
        ],
        "stdout_xml": "<response status=\"success\"><result>no</result></response>"
    }
}
This is as close as I can get to assigning the value for "result".
ok: [localhost] => {
    "result.stdout": {
        "response": {
            "#status": "success",
            "result": "no"
        }
    }
}
My goal is to set up a conditional retry loop for the Ansible task.
tasks:
  - name: Checking for pending changes
    panos_op:
      ip_address: '{{ host }}'
      password: '{{ operator_pw }}'
      username: '{{ operator_user}}'
      cmd: 'check pending-changes'
    register: result
    until: result.stdout.result = no
    retries: 10
    delay: 5
    tags: check
How can I make this work?
UPDATE: I've tried it another way, but now I have a new issue trying to deal with a literal "<" char.
tasks:
  - name: Checking for pending changes
    panos_op:
      ip_address: '{{ host }}'
      password: '{{ operator_pw }}'
      username: '{{ operator_user}}'
      cmd: 'check pending-changes'
    register: result
  - fail:
      msg: The Firewall has pending changes to commit.
    when: '"<result>no"' not in result.stdout_xml
ERROR:
did not find expected key
Any help at all would be very appreciated.

As I just mentioned in another answer, since Ansible 2.4, there's an xml module.
Playbook
---
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Get result from xml.
      xml:
        xmlstring: "<response status=\"success\"><result>no</result></response>"
        content: "text"
        xpath: "/response/result"
Output
PLAY [localhost] ***************************************************************

TASK [Get result from xml.] ****************************************************
ok: [localhost] => changed=false
  actions:
    namespaces: {}
    state: present
    xpath: /response/result
  count: 1
  matches:
  - result: 'no'
  msg: 1
  xmlstring: |-
    <?xml version='1.0' encoding='UTF-8'?>
    <response status="success"><result>no</result></response>

PLAY RECAP *********************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0
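To close the loop on the original question: result.stdout in the panos_op output is already a JSON string, so the retry condition can also be written with the from_json filter instead of parsing XML at all. A minimal sketch, untested against a real PAN-OS device and reusing the variable names from the question:

tasks:
  - name: Checking for pending changes
    panos_op:
      ip_address: '{{ host }}'
      password: '{{ operator_pw }}'
      username: '{{ operator_user }}'
      cmd: 'check pending-changes'
    register: result
    until: (result.stdout | from_json).response.result == "no"
    retries: 10
    delay: 5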

Related

Deleting multiple files and folders using Ansible

I need to delete files and folders using an Ansible playbook. I pass the file/folder paths as a variable to the playbook from a Groovy script.
The variables are in a properties file named delete.properties. I store the file/folder paths separately in a variable so I can change the paths as needed in the future.
delete.properties:
delete_files=/home/new-user/myfolder/dltfolder1 /home/new-user/myfolder/dltfolder2 /home/new-user/myfolder/dltfolder3
Groovy script:
stage("Read variable"){
steps{
script{
def propertifile = readFile(properti file path)
deleteParams = new Properties()
deleteParams.load(new StringReader(propertifile))
}
}
}
stage("Delete files/folders"){
steps{
script{
sh script: """cd ansible code path && \
export ANSIBLE_HOST_KEY_CHECKING=False && \
ansible-playbook delete.yml \
--extra-vars"dete_files=${deleteParams.delete_files}" --user user"""
}
}
}
Ansible playbook:
---
- name: delete files
  hosts: localhost
  tasks:
    - name: delete files
      file:
        path: "{{ delete_files }}"
        state: absent
As a result of the code above, only the first path in the delete_files variable (/home/new-user/myfolder/dltfolder1) gets deleted.
I need the other file/folder paths in the delete_files variable to be deleted too.
Put the path of the properties file into the extra vars instead. For example,
sh script: """cd ansible code path && \
export ANSIBLE_HOST_KEY_CHECKING=False && \
ansible-playbook delete.yml \
--extra-vars "dete_files=/tmp/delete.properties" --user user"""
Then, given the tree
shell> tree /tmp/test
/tmp/test
├── f1
├── f2
└── f3
, the file
shell> cat /tmp/delete.properties
delete_files=/tmp/test/f1 /tmp/test/f2 /tmp/test/f3
, and the playbook
shell> cat delete.yml
- hosts: localhost
  vars:
    delete_files: "{{ lookup('ini',
                             'delete_files',
                             file=dete_files,
                             type='properties') }}"
  tasks:
    - debug:
        var: delete_files
    - name: delete files
      file:
        path: "{{ item }}"
        state: absent
      loop: "{{ delete_files.split() }}"
gives, running in --check --diff mode
shell> ansible-playbook delete.yml --extra-vars "dete_files=/tmp/delete.properties" -CD
PLAY [localhost] *****************************************************************************
TASK [debug] *********************************************************************************
ok: [localhost] =>
delete_files: /tmp/test/f1 /tmp/test/f2 /tmp/test/f3
TASK [delete files] **************************************************************************
--- before
+++ after
@@ -1,5 +1,2 @@
 path: /tmp/test/f1
-path_content:
-  directories: []
-  files: []
-state: directory
+state: absent

changed: [localhost] => (item=/tmp/test/f1)
--- before
+++ after
@@ -1,5 +1,2 @@
 path: /tmp/test/f2
-path_content:
-  directories: []
-  files: []
-state: directory
+state: absent

changed: [localhost] => (item=/tmp/test/f2)
--- before
+++ after
@@ -1,5 +1,2 @@
 path: /tmp/test/f3
-path_content:
-  directories: []
-  files: []
-state: directory
+state: absent

changed: [localhost] => (item=/tmp/test/f3)
PLAY RECAP ***********************************************************************************
localhost: ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
One solution would be to parse the properties file inside the Ansible playbook with an ini lookup, if you are indeed acting on localhost, as shown in your playbook:
- hosts: localhost
  gather_facts: no
  tasks:
    - file:
        path: "{{ item }}"
        state: absent
      loop: >-
        {{
          lookup(
            'ini',
            'delete_files type=properties file=delete.properties'
          ).split()
        }}
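Another option, if the Groovy side can be changed, is to hand Ansible a proper list instead of a space-separated string; --extra-vars also accepts JSON, so no splitting is needed in the playbook. A sketch under that assumption, using the example paths from above:

ansible-playbook delete.yml \
  --extra-vars '{"delete_files": ["/tmp/test/f1", "/tmp/test/f2", "/tmp/test/f3"]}'

and the playbook then loops over the list directly:

- hosts: localhost
  gather_facts: no
  tasks:
    - name: delete files
      file:
        path: "{{ item }}"
        state: absent
      loop: "{{ delete_files }}"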

How to escape JSON in Ansible playbook

I have the following YAML Ansible playbook file which I intend to use to capture some information from my docker containers:
---
# Syntax check: /usr/bin/ansible-playbook --syntax-check --inventory data/config/host_inventory.yaml data/config/ansible/docker_containers.yaml
- hosts: hosts
  gather_facts: no
  tasks:
    - name: Docker ps output - identify running containers
      shell: "/usr/bin/docker ps --format '{\"ID\":\"{{ .ID }}\", \"Image\": \"{{ .Image }}\", \"Names\":\"{{ .Names }}\"}'"
      register: docker_ps_output
    - name: Show content of docker_ps_output
      debug:
        msg: docker_ps_output.stdout_lines
But escaping is not working, Ansible gives me the middle finger when I try to run the playbook:
PLAY [hosts] ***********************************************************************************************************************************************************
TASK [Docker ps output - identify running containers] **********************************************************************************************************************************************
fatal: [myhost.com]: FAILED! => {"msg": "template error while templating string: unexpected '.'. String: /usr/bin/docker ps --format ''{\"ID\":\"{{ .ID }}\", \"Image\": \"{{ .Image }}\", \"Names\":\"{{ .Names }}\"}''"}
to retry, use: --limit #/tmp/docker_containers.retry
PLAY RECAP *****************************************************************************************************************************************************************************************
myhost.com : ok=0 changed=0 unreachable=0 failed=1
The original command I'm trying to run:
/usr/bin/docker ps --format '{"ID":"{{ .ID }}", "Image": "{{ .Image }}", "Names":"{{ .Names }}"}'
I would suggest using a block scalar. Your problem is that {{ .ID }} etc. is processed by Ansible's Jinja templating engine when it should not be. Probably the most readable way around this is:
---
# Syntax check: /usr/bin/ansible-playbook --syntax-check --inventory data/config/host_inventory.yaml data/config/ansible/docker_containers.yaml
- hosts: hosts
  gather_facts: no
  tasks:
    - name: Docker ps output - identify running containers
      shell: !unsafe >-
        /usr/bin/docker ps --format
        '{"ID":"{{ .ID }}", "Image": "{{ .Image }}", "Names":"{{ .Names }}"}'
      register: docker_ps_output
    - name: Show content of docker_ps_output
      debug:
        var: docker_ps_output.stdout_lines
>- starts a folded block scalar, in which you do not need to escape anything and newlines are folded into spaces. The !unsafe tag prevents the value from being processed by Jinja.
If you want to avoid the templating, you can instead wrap the double curly braces in Jinja expressions that output them literally:
{{ thmthng }}
should look like:
{{ '{{' }} thmthng {{ '}}' }}
Your playbook:
---
- hosts: hosts
  gather_facts: no
  tasks:
    - name: Docker ps output - identify running containers
      shell: "docker ps -a --format '{\"ID\": \"{{ '{{' }} .ID {{ '}}' }}\", \"Image\": \"{{ '{{' }} .Image {{ '}}' }}\", \"Names\" : \"{{ '{{' }} .Names {{ '}}' }}}\"'"
      register: docker_ps_output
    - name: Show content of docker_ps_output
      debug:
        var: docker_ps_output.stdout_lines
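Whichever escaping approach is used, each entry in docker_ps_output.stdout_lines ends up as a JSON string, so it can be turned into structured Ansible data with the from_json filter. A small follow-on sketch (the task names and the containers fact are my own, not part of the answers above):

- name: Parse docker ps output into a list of dicts
  set_fact:
    containers: "{{ docker_ps_output.stdout_lines | map('from_json') | list }}"
- name: Show container names
  debug:
    msg: "{{ containers | map(attribute='Names') | list }}"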

Reading multiple values from an env file with ansible and storing them as facts

I have the following code which reads values from an environment (.env) file and stores them as facts:
- name: Read values from environment
  shell: "source {{ env_path }}; echo $DB_PASSWORD"
  register: output
  args:
    executable: /bin/bash
  changed_when: false
- name: Store read password
  set_fact:
    db_password: "{{ output.stdout }}"
  when:
    - db_password is undefined
  changed_when: false
- name: Read values from environment
  shell: "source {{ env_path }}; echo $DB_USER"
  register: output
  args:
    executable: /bin/bash
  changed_when: false
- name: Store read user
  set_fact:
    db_user: "{{ output.stdout }}"
  when:
    - db_user is undefined
  changed_when: false
- name: Read values from environment
  shell: "source {{ env_path }}; echo $DB_NAME"
  register: output
  args:
    executable: /bin/bash
  changed_when: false
- name: Store read db_name
  set_fact:
    db_name: "{{ output.stdout }}"
  when:
    - db_name is undefined
  changed_when: false
- name: Container environment loaded; the following facts are now available for use by ansible
  debug: "var={{ item }}"
  with_items:
    - db_name
    - db_user
    - db_password
It's quite bulky and unwieldy. I would like to write something like this instead, but I can't figure out how:
vars:
  values:
    - db_name
    - db_password
    - db_user
tasks:
  - name: Read values from environment
    shell: "source {{ env_path }}; echo {{ item|upper }}"
    register: output
    with_items: values
    args:
      executable: /bin/bash
    changed_when: false
  - name: Store read value
    set_fact:
      "{{ item.0 }}": "{{ item.1.stdout }}"
    when:
      - item.0 is undefined
    with_together:
      - values
      - output.results
    changed_when: false
Instead, I get this output:
ok: [default] => (item=values) => {"changed": false, "cmd": "source /var/www/mydomain.org/.env; echo VALUES", "delta": "0:00:00.002240", "end": "2017-02-15 15:25:15.338673", "item": "values", "rc": 0, "start": "2017-02-15 15:25:15.336433", "stderr": "", "stdout": "VALUES", "stdout_lines": ["VALUES"], "warnings": []}
TASK [sql-base : Store read password] ******************************************
skipping: [default] => (item=[u'values', u'output.results']) => {"changed": false, "item": ["values", "output.results"], "skip_reason": "Conditional check failed", "skipped": true}
Even more ideal of course would be if there is an ansible module I have overlooked that allows me to load values from an environment file.
Generally I would either put my variables into the inventory files themselves, or convert them to YAML format and use the include_vars module (you might be able to run a sed script to convert your environment file to YAML on the fly). Before using the code below, make sure you really are required to use those environment files and cannot easily use some other mechanism, such as:
a YAML file with the include_vars module
putting the config inside the inventory (no modules required)
putting the config into an Ansible vault file (no modules required, though you need to store the decryption key somewhere)
some other secure storage mechanism, like HashiCorp's Vault (with a plugin like https://github.com/jhaals/ansible-vault)
Back to your code: it is actually almost correct. You were missing a dollar sign in the read task and some curly braces in the condition:
test.env
DB_NAME=abcd
DB_PASSWORD=defg
DB_USER=fghi
Note: make sure that this file adheres to sh standards, meaning there are no spaces around the = sign. Something like DB_NAME = abcd will fail.
play.yml
- hosts: all
  connection: local
  vars:
    env_path: 'test.env'
    values:
      - db_name
      - db_password
      - db_user
  tasks:
    - name: Read values from environment
      shell: "source {{ env_path }}; echo ${{ item|upper }}"
      register: output
      with_items: "{{ values }}"
      args:
        executable: /bin/bash
      changed_when: false
    - name: Store read value
      set_fact:
        "{{ item.0 }}": "{{ item.1.stdout }}"
      when: '{{ item.0 }} is undefined'
      with_together:
        - "{{ values }}"
        - "{{ output.results }}"
      changed_when: false
    - name: Debug
      debug:
        msg: "NAME: {{db_name}} PASS: {{db_password}} USER: {{db_user}}"
Running with ansible-playbook -i 127.0.0.1, play.yml:
TASK [Debug] *******************************************************************
ok: [127.0.0.1] => {
"msg": "NAME: abcd PASS: defg USER: fghi"
}
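As for a module-like way to read the .env file without shelling out: assuming the file is plain KEY=VALUE pairs, the ini lookup with type=properties (the same lookup used in the delete-files answer above) can read it directly. A sketch of that idea, not tested against the original environment file:

- hosts: localhost
  gather_facts: false
  vars:
    env_path: test.env
    db_name: "{{ lookup('ini', 'DB_NAME', file=env_path, type='properties') }}"
    db_password: "{{ lookup('ini', 'DB_PASSWORD', file=env_path, type='properties') }}"
    db_user: "{{ lookup('ini', 'DB_USER', file=env_path, type='properties') }}"
  tasks:
    - debug:
        msg: "NAME: {{ db_name }} PASS: {{ db_password }} USER: {{ db_user }}"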

Why does Ansible keep recreating docker containers with state "started"?

I have a docker container managed by Ansible. Every time I start the container with Ansible it is recreated instead of just started.
Here are the Ansible commands I use to stop/start the container:
ansible-playbook <playbook> -i <inventory> --extra-vars "state=stopped"
ansible-playbook <playbook> -i <inventory> --extra-vars "state=started"
Here's the Ansible task I use to manage the container. The only thing that changes between the "stop" and "start" commands is {{ state }}.
- docker:
    name: "{{ postgres_container_name }}"
    image: "{{ postgres_image_name }}"
    state: "{{ state }}"
    ports:
      - "{{ postgres_host_port }}:{{ postgres_guest_port }}"
    env:
      POSTGRES_USER: "{{ postgres_user }}"
      POSTGRES_PASSWORD: "{{ postgres_password }}"
      POSTGRES_DB: "{{ postgres_db }}"
When I start, stop and start the container I get the following verbose output from Ansible command:
changed: [127.0.0.1] => {"ansible_facts": {"docker_containers": [{"Id": "ab1c0f6cc30de33aba31ce93671267783ba08a1294df40556870e66e8bf77b6d", "Warnings": null}]}, "changed": true, "containers": [{"Id": "ab1c0f6cc30de33aba31ce93671267783ba08a1294df40556870e66e8bf77b6d", "Warnings": null}], "msg": "removed 1 container, started 1 container, created 1 container.", "reload_reasons": null, "summary": {"created": 1, "killed": 0, "pulled": 0, "removed": 1, "restarted": 0, "started": 1, "stopped": 0}}
It states that the container changed, was removed, created and started.
Could you tell me why Ansible sees my container as changed and recreates it instead of just starting it?
Ansible's docker module will first remove any stopped container with the same name when you use it with a state of started.
The module docs don't really make this clear, but there is a comment explaining it in the source code, in the started function.
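If recreation is not acceptable, one workaround (my suggestion, not part of the original answer) is the newer docker_container module, which starts an existing container in place for state: started and only rebuilds it when the configuration has changed or recreate: yes is set. A rough equivalent of the task above:

- docker_container:
    name: "{{ postgres_container_name }}"
    image: "{{ postgres_image_name }}"
    state: "{{ state }}"
    ports:
      - "{{ postgres_host_port }}:{{ postgres_guest_port }}"
    env:
      POSTGRES_USER: "{{ postgres_user }}"
      POSTGRES_PASSWORD: "{{ postgres_password }}"
      POSTGRES_DB: "{{ postgres_db }}"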

Ansible - Include environment variables from external YML

I'm attempting to store all my environment variables in a file called variables.yml that looks like so:
---
doo: "external"
Then I have a playbook like so:
---
- hosts: localhost
  tasks:
    - name: "i can totally echo"
      environment:
        include: variables.yml
        ugh: 'internal'
      shell: echo "$doo vs $ugh"
      register: results
    - debug: msg="{{ results.stdout }}"
The result of the echo is ' vs internal'.
How can I change this so that the result is 'external vs internal'. Many thanks!
Assuming the external variable file called variables.ext is structured as follows
---
EXTERNAL:
  DOO: "external"
then, according to Setting the remote environment and Load variables from files, dynamically within a task, a small test could look like
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - name: Load environment variables
      include_vars:
        file: variables.ext
    - name: Echo variables
      shell:
        cmd: 'echo "${DOO} vs ${UGH}"'
      environment:
        DOO: "{{ EXTERNAL.DOO }}"
        UGH: "internal"
      register: result
    - name: Show result
      debug:
        msg: "{{ result.stdout }}"
resulting in the output
TASK [Show result] ********
ok: [localhost] =>
msg: external vs internal
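For completeness, the same pattern also works with the flat variables.yml from the question (doo: "external") if the nested EXTERNAL key is not wanted; this variation is mine, not part of the answer above:

- hosts: localhost
  gather_facts: false
  tasks:
    - name: Load variables from the flat file
      include_vars:
        file: variables.yml
    - name: Echo variables
      shell: echo "$DOO vs $UGH"
      environment:
        DOO: "{{ doo }}"
        UGH: "internal"
      register: results
    - debug:
        msg: "{{ results.stdout }}"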
