to allow this to work:
https://github.com/ansible/ansible/issues/41313
On Thu, Nov 29, 2018 at 1:09 PM Frank Thommen <mailto:lists.ansi...@drosera.ch>> wrote:
Hello,
our latest yum update brought ansible from 2.4.2.0 to 2.7.2 and now we
are facing
[WARNING]: flush_handlers task does not support when conditional
Hello,
our latest yum update brought ansible from 2.4.2.0 to 2.7.2 and now we
are facing
[WARNING]: flush_handlers task does not support when conditional
we need the handlers to be flushed at that point, as later tasks rely on
previously reconfigured services to be reloaded. The "when"
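One workaround discussed around ansible/ansible#41313: since `meta: flush_handlers` itself cannot take `when`, keep the flush unconditional and put the condition on the tasks that notify the handler. A minimal sketch (task, handler, and variable names are illustrative, not from this thread):

```yaml
# The notifying task carries the condition; flushing is a no-op when
# no handler has been notified, so the unconditional flush is harmless.
- name: Reconfigure service (only where needed)
  template:
    src: myservice.conf.j2
    dest: /etc/myservice.conf
  when: reconfigure_needed | default(false)
  notify: restart myservice

- name: Flush handlers so later tasks see the reloaded service
  meta: flush_handlers
```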
Then your remote file is most probably identical to the one you are
trying to copy ;-)
Did you check that your local file (src) is for sure different from the
remote one (dest)? Run `ansible-playbook` with '-vvv' to see which
local files ansible selects (this can be confusing when you are
This is not the same:
* copy module copies from the controller (where you run the
ansible-playbook command) to the client
* cp -f copies a file locally from client to itself
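A short sketch of the distinction (paths are illustrative): `copy` reads `src` on the controller by default; with `remote_src: yes` it copies a file locally on the client, which covers the `cp -f` case without shelling out.

```yaml
# Default: src is read on the controller, dest written on the client.
- name: Push file from controller to client
  copy:
    src: files/app.conf        # relative to the playbook/role on the controller
    dest: /etc/app.conf

# remote_src: both src and dest are on the client (local copy, like cp -f).
- name: Copy a file locally on the client
  copy:
    remote_src: yes
    src: /etc/app.conf
    dest: /etc/app.conf.bak
```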
frank
On 11/02/2018 10:25 AM, Mohan L wrote:
Try with command or shell module with cp -f.
On Friday,
Consider that with this mechanism you will not detect packages which
have been installed directly, either by a custom installer, the standard
configure / make / make install, or by directly copying binaries or scripts
into some central location.
Also keep in mind that if you omit (or someone removes)
On 10/21/2018 10:16 AM, Kai Stian Olstad wrote:
On Sunday, 21 October 2018 06:09:38 CEST LJMedina wrote:
Hello!
Unfortunately, we are using version 2.4, don't think upgrading is an option
(at least at this point) any other ideas ?
Which part do you struggle with?
For the IP you can
Thanks a lot Kai,
On 08/24/2018 04:31 PM, Kai Stian Olstad wrote:
On Friday, 24 August 2018 15.47.39 CEST Frank Thommen wrote:
since "recently" (we realized just now), tags applied in role's main.yml
task, like:
/role/myrole/tasks/main.yml:
- name: My Task
include_tasks: "mytask.{{ ansible_os_family }}.yml"
Dear all,
since "recently" (we realized just now), tags applied in role's main.yml
task, like:
/role/myrole/tasks/main.yml:
- name: My Task
include_tasks: "mytask.{{ ansible_os_family }}.yml"
tags:
- mytask-only
- always
are not propagated to the included task any more. It still
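For the archive: `include_tasks` became a dynamic include in Ansible 2.5, and tags on a dynamic include are no longer inherited by the included tasks. Since 2.7 the `apply` keyword pushes them down again. A hedged sketch, reusing the tags from the example above:

```yaml
# apply: propagates the tag to the tasks inside the included file;
# the tags on the include itself still control whether the include runs.
- name: My Task
  include_tasks:
    file: "mytask.{{ ansible_os_family }}.yml"
    apply:
      tags:
        - mytask-only
  tags:
    - mytask-only
    - always
```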
On 04/01/18 07:11, Kai Stian Olstad wrote:
On 03.01.2018 18:03, Frank Thommen wrote:
I assumed that host ranges could be used with ansible-playbook's
--limit as they can be used in the inventory itself. However this
doesn't seem to be possible:
It's called patterns
http://docs.ansible.com
Hi,
I assumed that host ranges could be used with ansible-playbook's
--limit as they can be used in the inventory itself. However this
doesn't seem to be possible:
$ ansible-playbook site.yml --limit='host[010:027]'
[WARNING]: Could not match supplied host pattern, ignoring: host
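A sketch of the distinction: numeric ranges like `host[010:027]` are inventory-file expansion syntax, while `--limit` takes patterns (wildcards, unions, intersections). Assuming hostnames host010..host027, a wildcard union covers them; the same pattern syntax also works in a play's `hosts:` field:

```yaml
# Equivalent on the command line:
#   ansible-playbook site.yml --limit 'host01*:host02*'
- hosts: "host01*:host02*"   # union of two wildcard patterns
  gather_facts: no
  tasks:
    - debug:
        msg: "matched {{ inventory_hostname }}"
```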
Thanks a lot
On 07/27/2017 07:25 PM, Kai Stian Olstad wrote:
On 27 July 2017 at 18:42, Frank Thommen wrote:
Hi,
I was looking for a way to run the distro-dependent package manager
w/o too many "when"s and I found this (on
https://ansible-tips-and-tricks.readthedocs.io/en/latest/os
Hi,
I was looking for a way to run the distro-dependent package manager w/o
too many "when"s and I found this (on
https://ansible-tips-and-tricks.readthedocs.io/en/latest/os-dependent-tasks/installing_packages/#installing-packages):
- name: install basic package
action: >
{{
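A newer alternative to the `action:` / `{{ ansible_pkg_mgr }}` trick quoted above: the generic `package` module (available since Ansible 2.0) dispatches to the platform's package manager itself, so no `when` per distro is needed. Package name is illustrative:

```yaml
# `package` resolves to yum/apt/dnf/... based on the gathered facts.
- name: install basic package
  package:
    name: htop
    state: present
```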
Hi
On 07/21/2017 05:17 PM, Branko Majic wrote:
On Fri, 21 Jul 2017 16:34:53 +0200
Frank Thommen <lists.ansi...@drosera.ch> wrote:
Hi,
I have a playbook which executes a local task from which I need the
exit status in later steps:
- name: Get exit status of ./run.sh
local_
Hi,
I have a playbook which executes a local task from which I need the exit
status in later steps:
- name: Get exit status of ./run.sh
local_action: command ./run.sh
register: ES
ignore_errors: yes
- name: Do something if ./run.sh failed
command:
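A sketch completing the pattern from this thread: register the local command's result and branch on its return code. The follow-up command is hypothetical, not from the original message:

```yaml
- name: Get exit status of ./run.sh
  local_action: command ./run.sh
  register: ES
  ignore_errors: yes

- name: Do something if ./run.sh failed
  command: /usr/local/bin/cleanup.sh   # hypothetical follow-up command
  when: ES.rc != 0                      # or equivalently: ES is failed
```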
On 06/21/2017 11:01 AM, jean-y...@lenhof.eu.org wrote:
Hi Frank,
On 21 June 2017 at 10:29, "Frank Thommen" <lists.ansi...@drosera.ch> wrote:
Hi,
from ansible's documentation I don't understand how ansible handles fact
gathering when using
roles: Are a host's facts gathered
Hi,
from ansible's documentation I don't understand how ansible handles fact
gathering when using roles: Are a host's facts gathered at each single
playbook run? Or at each "role" run? Or only once?
In most cases one would want to gather facts only once in the beginning,
but is this the
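For the archive: facts are gathered once at the start of each play (unless disabled), not per role or per task, and they persist for the rest of the playbook run. A sketch with illustrative role names:

```yaml
# First play gathers facts once per host; both roles share them.
- hosts: all
  gather_facts: yes      # the implicit default
  roles:
    - role_a             # illustrative role names
    - role_b             # sees the same facts, no second gather

# A later play in the same run can reuse them without regathering.
- hosts: all
  gather_facts: no
  tasks:
    - debug:
        var: ansible_os_family
```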
Hi,
is it possible to expand a list variable within the inventory? E.g. I
have a list of users which should be allowed on all hosts but on /some/
hosts, additional users should be allowed. I want to define this in the
inventory:
So far I have:
[all]
users=['user1', 'user2']
[servers]
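One way to get this effect is to move the lists into group_vars and merge them where they are used; variable and user names below are illustrative, not from the thread:

```yaml
# group_vars/all.yml:      common_users: ['user1', 'user2']
# group_vars/servers.yml:  extra_users:  ['user3']

# In the play, merge the two; default([]) covers hosts with no extras.
- name: Build the effective user list
  set_fact:
    users: "{{ common_users + (extra_users | default([])) }}"
```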
On 04/27/2017 12:40 PM, Kai Stian Olstad wrote:
On 26.04.2017 16:20, Frank Thommen wrote:
What I'd need is a syntax, which checks for each mount in the
dictionary, if one of its (possibly multiple) classes matches the
(single) class of the server and then applies the mount. I could not
find
, you could just apply those
roles to the (groups holding the) relevant servers.
You'll need to have multiple data structures, one per role, but that's
probably going to 'work with the grain' of Ansible better.
On 26 April 2017 at 15:20, Frank Thommen <lists.ansi...@drosera.ch> wrote:
De
Dear all,
I'm at a loss as to how to filter entries from a dictionary based on
specific host variables. In the concrete case we have a long list of
possible NFS mounts which should be applied according to the hosts' class.
Our hostlist would be like:
host1 class=cluster
host2 class=cluster
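A sketch of one way to express this, assuming the mounts are kept as a list of dicts, each carrying a `classes` list (the data structure is an assumption, not from the thread); `class` is the host variable from the hostlist above:

```yaml
vars:
  nfs_mounts:
    - { path: /data1, src: "nfs1:/export/data1", classes: [cluster, desktop] }
    - { path: /data2, src: "nfs2:/export/data2", classes: [cluster] }

tasks:
  - name: Apply every mount whose classes include this host's class
    mount:
      path: "{{ item.path }}"
      src: "{{ item.src }}"
      fstype: nfs
      state: mounted
    loop: "{{ nfs_mounts }}"        # with_items on pre-2.5 Ansible
    when: class in item.classes
```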
On 08/23/2016 03:38 PM, Frank Thommen wrote:
Hi,
when using the include_vars module with the "file" parameter, it fails with
AttributeError: 'NoneType' object has no attribute 'startswith'.
[...]
Seems to have been fixed in the latest 2.2 release
f.
Hi,
when using the include_vars module with the "file" parameter, it fails with
AttributeError: 'NoneType' object has no attribute 'startswith'.
This works fine:
--- testmyvars_1.yml ---
---
- hosts: '{{ target }}'
gather_facts: no
tasks:
- include_vars: ./myvars_1.yml
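For the archive: on releases where the bug is fixed, the explicit `file:` parameter should behave the same as the bare-string form above. A minimal sketch:

```yaml
- include_vars:
    file: ./myvars_1.yml
```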
On 08/19/2016 02:33 PM, Jean-Yves LENHOF wrote:
On 19/08/2016 at 14:20, Frank Thommen wrote:
On 08/19/2016 01:54 PM, Jean-Yves LENHOF wrote:
The first task ansible runs is gathering facts. The facts include the
mounted filesystems, so the NFS ones too
[...]
However I found
On 08/19/2016 01:54 PM, Jean-Yves LENHOF wrote:
On 19/08/2016 at 13:19, Frank Thommen wrote:
Dear all,
doing my first steps with ansible I noticed, that on some clients
executing playbooks completely hangs. The common problem on these
hosts is, that they are either swapping (even very small
Dear all,
doing my first steps with ansible I noticed, that on some clients
executing playbooks completely hangs. The common problem on these hosts
is, that they are either swapping (even very small amounts of swap used)
or they have problems with hanging/not responding NFS filesystems. In
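Since fact gathering inspects mounted filesystems, a hung NFS mount can stall the run before any task starts. One mitigation (assuming Ansible 2.1 or later, where `gather_subset` is a play keyword) is to skip the hardware subset, which is where the mount facts live:

```yaml
- hosts: all
  gather_facts: yes
  gather_subset:
    - '!hardware'      # skips mount/disk facts that touch NFS
  tasks:
    - debug:
        var: ansible_os_family   # distro facts are still gathered
```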
.
Hope this helps,
Jon
On Monday, July 11, 2016 at 8:21:12 AM UTC+1, Frank Thommen wrote:
Hi,
I'm currently evaluating ansible as a candidate for our future CM tool.
I'm stuck with the issue, that we have "blocks" of resources (e.g. a
list of around 30 NFS mounts)
Hi,
I'm currently evaluating ansible as a candidate for our future CM tool.
I'm stuck with the issue, that we have "blocks" of resources (e.g. a
list of around 30 NFS mounts) which we'd like to be able to switch on
and off w/o repeating the complete resources in two playbooks. How can
we
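One common pattern for switching a whole block of resources on and off without duplicating playbooks: keep the block in its own task file and guard the include with a boolean flipped from the inventory or the command line (`-e enable_nfs_mounts=false`). A sketch with illustrative names, using the dynamic include available on current Ansible:

```yaml
- name: Apply the NFS mount block
  include_tasks: nfs_mounts.yml
  when: enable_nfs_mounts | default(true) | bool
```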
Hi,
we plan to deploy Linux client configurations via Ansible. Since not all
clients are running all the time, we want Ansible to run the deployment in
regular intervals, e.g. every 2-3 hours. What is "the Ansible way" to
implement this? Is there a better way than to have a cronjob on the
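The usual answer for intermittently-online clients is `ansible-pull` from cron on each client, so machines catch up on their next run. A sketch; the repository URL and interval are illustrative:

```yaml
- name: Schedule ansible-pull every 2 hours
  cron:
    name: ansible-pull configuration run
    minute: "0"
    hour: "*/2"
    job: "ansible-pull -U https://git.example.com/config.git local.yml >/dev/null 2>&1"
```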