Hi,

I'm testing a playbook by running it against localhost, and am getting a 
really bizarre error from the synchronize module. I'm connecting back to 
localhost via SSH as a deployer user, which has the correct permissions.

Playbook:
-   name: Deploy to servers
    hosts: cron
    remote_user: "{{ remote_user }}"

    roles:
    -   role: deploy

roles/deploy/tasks/main.yml
---
  - name: Make directory
    file:
      path: "{{ release_dir }}"
      state: directory
      group: www-data

  - name: Copy source to server
    synchronize:
      src: "{{ deploy_dir }}"
      dest: "{{ release_dir }}"
      recursive: true

The remote_user variable is changed depending on the host, and is set to 
deployer for localhost:

local.hosts
[localhost]
localhost ansible_ssh_user=deployer

[cron]
localhost

Output:

[snip]
TASK: [deploy | Make directory] **************************************** 
<localhost> ESTABLISH CONNECTION FOR USER: deployer
<localhost> REMOTE_MODULE file group=www-data state=directory path=/var/www/release/
[snip task output]

TASK: [deploy | Copy source to server] ****************************************
<127.0.0.1> ESTABLISH CONNECTION FOR USER: www-data
<127.0.0.1> EXEC ['ssh', '-C', '-tt', '-q', '-o', 'ControlMaster=auto', '-o', 'ControlPersist=60s', '-o', 'ControlPath=/home/cameron/.ansible/cp/ansible-ssh-%h-%p-%r', '-o', 'Port=22', '-o', 'KbdInteractiveAuthentication=no', '-o', 'PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey', '-o', 'PasswordAuthentication=no', '-o', u'User=www-data', '-o', 'ConnectTimeout=10', '127.0.0.1', "/bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1403830489.07-36564692306368 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1403830489.07-36564692306368 && echo $HOME/.ansible/tmp/ansible-tmp-1403830489.07-36564692306368'"]
fatal: [localhost] => SSH encountered an unknown error during the connection. We recommend you re-run the command using -vvvv, which will enable SSH debugging output to help diagnose the issue

Running the exact same playbook against a non-localhost host works fine.

The issue seems to be that the SSH user is being changed for JUST this one 
task.
TASK: [deploy | Copy source to server] ****************************************
<127.0.0.1> ESTABLISH CONNECTION FOR USER: www-data

I do not have a host called 127.0.0.1 in my inventory file (it's called 
localhost), so the SSH user falls back to the remote_user defined in 
group_vars/all (www-data). Explicitly setting remote_user: deployer on the 
play makes it work.
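
For reference, the workaround looks like this, i.e. hard-coding the user on 
the play instead of relying on the inventory variable (deployer is written 
out literally here just for illustration):

-   name: Deploy to servers
    hosts: cron
    # Hard-coded instead of "{{ remote_user }}" -- with the literal value,
    # the synchronize task no longer falls back to the group_vars user.
    remote_user: deployer

    roles:
    -   role: deploy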

It seems strange to me that a single task can run as a different user from 
the rest of the tasks in the play.

Is this a bug with Synchronize? A bug with Ansible?

Any help would be appreciated.

Cheers, Cameron

-- 
You received this message because you are subscribed to the Google Groups 
"Ansible Project" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/ansible-project/2928631f-42d4-4e32-b5b6-ce6538c73e98%40googlegroups.com.
