Getting just the volume ID in Ansible - parsing

This has been answered before here on Stack Overflow:
ansible get aws ebs volume id which already exist
Get volume id from newly created ebs volume using ansible
For the life of me, I have been trying ec2_vol.volume_id and some other JMESPath query bits but am not getting the right output. I just want the volume ID, nothing more.
---
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: get associated vols
      ec2_vol:
        instance: i-xxxxxxxxxxxxx
        state: list
        profile: default
        region: us-east-1
      register: ec2_vol
    - debug:
        msg: "{{ ec2_vol.volume_id }}"
This also doesn't work:
---
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: get associated vols
      ec2_vol:
        instance: i-xxxxxxxxxxxxxx
        state: list
        profile: default
        region: us-east-1
      register: ec2_vol
    - debug: msg="{{ item.volume_id }}"
      with_items: ec2_vol.results
Tested on Ansible 2.2 and 2.3.

Taking bits from the prior answers, you will need to use JMESPath-style filtering to get what you want out of the output.
Here is the answer:
---
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: get associated vols
      ec2_vol:
        instance: i-xxxxxxxxxxxxxx
        state: list
        profile: default
        region: us-east-1
      register: ec2_vol
    - debug: msg="{{ ec2_vol.volumes | map(attribute='id') | list }}"
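If you prefer an explicit JMESPath expression, a sketch using the json_query filter (which requires the jmespath Python library and assumes the same registered ec2_vol variable) would be:

    - debug: msg="{{ ec2_vol.volumes | json_query('[].id') }}"

Both forms return the same list of volume IDs; the map/list version just avoids the extra dependency.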

Related

Ansible known_hosts module ssh key propagation question

I'm trying to craft a playbook that will update known_hosts for a machine/user, however I'm getting an error I can't make sense of.
---
- name: Keys
  hosts: adminslaves
  gather_facts: false
  no_log: false
  remote_user: test
  #pre_tasks:
  #  - setup:
  #      gather_subset:
  #        - '!all'
  tasks:
    - name: Scan for SSH host keys.
      shell: ssh-keyscan myhost.mydomain.com 2>/dev/null
      changed_when: False
      register: ssh_scan
    # - name: show vars
    #   debug:
    #     msg: "{{ ssh_scan.stdout_lines }}"
    - name: Update known_hosts.
      known_hosts:
        key: "{{ item }}"
        name: "{{ ansible_host }}"
        state: present
      with_items: "{{ ssh_scan.stdout_lines }}"
My error is: "msg": "Host parameter does not match hashed host field in supplied key"
I think the variable has the right information (at least it does when I debug it).
My end goal is a playbook that will add ssh keys of a list of hosts to a list of hosts for Jenkins auth.
Appreciate any help.
The problem is that the output of ssh-keyscan myhost.mydomain.com 2>/dev/null usually contains more than one key, so you need to process it.
Someone with the same error message raised an issue, but there again the problem was with the SSH key format. I understood it better by checking the code used by the known_hosts module.
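For reference, ssh-keyscan typically prints one line per key type, so a single host yields several entries (key material elided here):

myhost.mydomain.com ssh-rsa AAAA...
myhost.mydomain.com ecdsa-sha2-nistp256 AAAA...
myhost.mydomain.com ssh-ed25519 AAAA...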
Here is the code I use:
- name: Populate known_hosts
  hosts: spectrum_scale
  tags: set_known_hosts
  become: true
  tasks:
    - name: Scan for SSH keys
      ansible.builtin.shell:
        cmd: "ssh-keyscan {{ hostvars[spectrum_scale].ansible_fqdn }}
              {{ hostvars[spectrum_scale].ansible_hostname }}
              {{ hostvars[spectrum_scale].ansible_default_ipv4.address }}
              2>/dev/null"
      loop: "{{ groups['spectrum_scale'] }}"
      loop_control:
        loop_var: spectrum_scale
      register: ssh_scan

    - name: Set stdout_lines array for ssh_scan
      set_fact:
        ssout: []

    - name: Fill ssout
      set_fact:
        ssout: "{{ ssout + ss_r.stdout_lines }}"
      loop: "{{ ssh_scan.results }}"
      loop_control:
        loop_var: ss_r
      when: ss_r.stdout_lines is defined

    - name: Add client ssh keys to known_hosts
      ansible.builtin.known_hosts:
        name: "{{ hk.split()[0] }}"
        key: "{{ hk }}"
        state: present
      loop: "{{ ssout }}"
      loop_control:
        loop_var: hk
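Applied back to the original single-host playbook, a minimal sketch of the same idea (deriving the name from each scanned line so it always matches the key, and assuming ssh_scan is registered as in the question) would be:

    - name: Update known_hosts.
      known_hosts:
        name: "{{ item.split()[0] }}"
        key: "{{ item }}"
        state: present
      with_items: "{{ ssh_scan.stdout_lines }}"

This avoids the mismatch between ansible_host and the hostname actually embedded in each scanned key line.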

hashi_vault doesn't work through Web Application Firewall

I want to retrieve a Vault secret with Ansible using the hashi_vault module, which doesn't seem to work through a WAF.
The hashi_vault module works when the Vault server is mapped to the root URL (https://address/) in the WAF, but when we use a custom path (https://address/vault) the Ansible playbook returns this error: "Invalid Hashicorp Vault Token Specified for hashi_vault lookup"
- hosts: localhost
  tasks:
    - name: set variables
      set_fact:
        vault_token: !vault |
          $ANSIBLE_VAULT;1.1;AES256
          SOMETOKEN
    - name: get Configuration token from Vault
      set_fact:
        vault_result: "{{ lookup('hashi_vault', 'secret=secret/data/all/default/token token={{ vault_token }} url=https://address validate_certs=False' ) }}"
    - name: parse result
      set_fact:
        TOKEN: "{{ vault_result.data.token }}"
      register: TOKEN
    - name: show result
      debug:
        msg: "{{ TOKEN }}"
I'd love to find a way to keep my custom URL (https://address/vault) and get my secret!
It's not related to your WAF; it's just a variable issue. You can't nest a Jinja2 expression ({{ vault_token }}) inside another template string, so the literal text is passed as the token instead of its value.
Try something like this:
- name: get Configuration token from Vault
  set_fact:
    vault_result: "{{ lookup('hashi_vault', 'secret=secret/data/all/default/token token=' + vault_token + ' url=https://address validate_certs=False' ) }}"
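Once the nesting is fixed, the custom path should also be usable by pointing url at the proxied address. A sketch, assuming the WAF forwards https://address/vault to the Vault API:

- name: get Configuration token from Vault
  set_fact:
    vault_result: "{{ lookup('hashi_vault', 'secret=secret/data/all/default/token token=' + vault_token + ' url=https://address/vault validate_certs=False') }}"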

Ansible - Set environment path as inventory variable

The OSMC media player needs a specific PATH for playbooks:
https://github.com/osmc/osmc/issues/319
environment:
  PATH: "{{ ansible_env.PATH }}:/sbin:/usr/sbin"
I was wondering whether I can set this as an environment variable in the inventory for those machines, rather than have it in every playbook or create separate playbooks.
In common usage, is that path likely to cause problems for general *nix machines if it is implemented on non-OSMC installations?
If you can't set this as an inventory variable:
Is that just because it's not implemented / not useful to most?
Or because the inventory has no relation to PATH - e.g. it's not invoked at that point?
Or is a better way for all of this to have it as a machine-specific variable/task in a role? How would that look?
New to Ansible and still trying to get my head around some of the concepts.
As said, the environment keyword can be used only at the task or play level.
You will be able to use a standard playbook just by adding the following:
---
- name: Environment
  hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - name: Setup
      setup:
        gather_subset:
          - "!all"
or
---
- name: Environment
  hosts: localhost
  connection: local
  gather_facts: True
  gather_subset:
    - "!all"
If you debug the variable:
---
- name: Environment
  hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - name: Setup
      setup:
        gather_subset:
          - "!all"
    - name: Debug
      debug:
        var: ansible_env.PATH
You will get something like:
TASK [Setup] *******************************************************************************************************************************************************
ok: [localhost]

TASK [Debug] *******************************************************************************************************************************************************
ok: [localhost] => {
    "ansible_env.PATH": "/Users/imjoseangel/source/venv/ansible/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"
}
And what if you want to pass that variable to another play with a different inventory?
Just use hostvars.localhost.ansible_env.PATH:
- name: Environment2
  hosts: windows
  connection: local
  gather_facts: False
  tasks:
    - name: Debug
      debug:
        var: hostvars.localhost.ansible_env.PATH
So the
environment:
  PATH: "{{ ansible_env.PATH }}:/sbin:/usr/sbin"
will be valid only with gather_facts or the setup module run against the defined inventory, but you don't need to split playbooks.
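If you still want the extra directories to come from the inventory, a minimal sketch (using a hypothetical extra_path variable defined in group_vars for the OSMC group) would be:

# group_vars/osmc.yml (hypothetical)
extra_path: "/sbin:/usr/sbin"

# playbook, applying the inventory-defined suffix at play level
- name: OSMC tasks
  hosts: osmc
  gather_facts: True
  environment:
    PATH: "{{ ansible_env.PATH }}:{{ extra_path | default('') }}"
  tasks:
    - name: Check that sbin tools resolve
      command: which shutdown
      changed_when: False

With default(''), non-OSMC hosts that don't define extra_path keep their normal PATH, so the same play stays harmless on other *nix machines.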

Ansible Router Template Generator - Trouble with vars_prompt [duplicate]

I was trying to use vars_prompt in Ansible with default values taken from facts (or otherwise a previously defined variable). The playbook is intended to be used as an ad-hoc one for initial provisioning.
---
- hosts: server01
  gather_facts: True
  vars_prompt:
    - name: new_hostname
      prompt: please enter the name for the target
      default: "{{ ansible_hostname }}"
      private: no
  tasks:
    - debug: msg="{{ new_hostname }}"
Current result:
please enter the name for the target [{{ ansible_hostname }}]:
ERROR! 'ansible_hostname' is undefined
Expected result (assuming ansible_hostname=server01):
please enter the name for the target [server01]:
Is it possible to achieve in Ansible?
This is not possible with vars_prompt, because prompts are evaluated before the play runs and facts are gathered, so ansible_hostname is still undefined at that point. It can be implemented using the pause module instead:
---
- hosts: server01
  gather_facts: True
  tasks:
    - pause:
        prompt: please enter the name for the target [{{ ansible_hostname }}]
      register: prompt
    - debug:
        msg: "{{ prompt.user_input if prompt.user_input else ansible_hostname }}"
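To reuse the entered value in later tasks, a small follow-up sketch (using the same registered prompt variable) could promote it to a fact:

    - name: Store the chosen hostname as a fact
      set_fact:
        new_hostname: "{{ prompt.user_input if prompt.user_input else ansible_hostname }}"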

Ansible - Include environment variables from external YML

I'm attempting to store all my environment variables in a file called variables.yml that looks like so:
---
doo: "external"
Then I have a playbook like so:
---
- hosts: localhost
  tasks:
    - name: "i can totally echo"
      environment:
        include: variables.yml
        ugh: 'internal'
      shell: echo "$doo vs $ugh"
      register: results
    - debug: msg="{{ results.stdout }}"
The result of the echo is ' vs internal'.
How can I change this so that the result is 'external vs internal'? Many thanks!
Assuming the external variable file called variables.ext is structured as follows:
---
EXTERNAL:
  DOO: "external"
then, according to "Setting the remote environment" and "Load variables from files, dynamically within a task", a small test could look like:
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - name: Load environment variables
      include_vars:
        file: variables.ext
    - name: Echo variables
      shell:
        cmd: 'echo "${DOO} vs ${UGH}"'
      environment:
        DOO: "{{ EXTERNAL.DOO }}"
        UGH: "internal"
      register: result
    - name: Show result
      debug:
        msg: "{{ result.stdout }}"
resulting in an output of:
TASK [Show result] ********
ok: [localhost] =>
    msg: external vs internal
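If you would rather not list each key explicitly, the environment keyword also accepts a dictionary, so a sketch (assuming the same EXTERNAL structure from the variables file) could pass the whole loaded dict and merge the internal value with the combine filter:

    - name: Echo variables
      shell:
        cmd: 'echo "${DOO} vs ${UGH}"'
      environment: "{{ EXTERNAL | combine({'UGH': 'internal'}) }}"
      register: result

Every key under EXTERNAL then becomes an environment variable for that task without having to enumerate them one by one.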
