2
votes

Note: all the code examples in this question are viewable in context at https://github.com/discopatrick/ansible-pocs/tree/feature/sync (note the specific branch). I have also provided links to specific lines where appropriate.


The question: how can I use the Ansible synchronize module between a vagrant box and a remote host?

I'm going to demonstrate my problem by firstly showing a use case that succeeds, and then showing how my desired use case fails.

synchronize between two remote hosts: succeeds

I've been using the synchronize module between two remote hosts by using the delegate_to parameter, and it works wonderfully:

https://github.com/discopatrick/ansible-pocs/blob/feature/sync/rsync-remote.yml#L39-L52

## This playbook is run against host ansible-pocs-1, but this task is
## delegated to ansible-pocs-2. In practice this means that the task
## first ssh's into ansible-pocs-2 and then runs rsync in PUSH mode
## using ansible-pocs-1 as the destination.
- name: sync remote folder to remote folder
  synchronize:
    src: /home/admin/syncthis-pocs2/
    dest: /home/admin/syncthis-pocs1/
    delete: yes
  delegate_to: ansible-pocs-2

The output:

TASK [sync remote folder to remote folder] *************************************
changed: [ansible-pocs-1 -> None]

Note that both remote machines are using the same key pair for login ('~/.ssh/id_rsa' on my local host), the same user name ('admin'), and the same ssh port (22).

Here's a sample of the verbose output from that task - I believe it may become significant later:

... changed: [ansible-pocs-1 -> None] => { "changed": true, "cmd": "/usr/bin/rsync --delay-updates -F --compress --delete-after --archive --rsh 'ssh -S none -o StrictHostKeyChecking=no -o Port=22' --out-format='<>%i %n%L' \"/home/admin/syncthis-pocs2/\" \"178.62.50.236:/home/admin/syncthis-pocs1/\"", ...

synchronize between vagrant box and remote host: fails

I am now trying to do the same thing between a vagrant box and a remote host, but I get an error. Here's the task code. To be clear, the vagrant box is called 'alpha':

https://github.com/discopatrick/ansible-pocs/blob/feature/sync/rsync-vagrant-1step.yml#L26-L35

## This playbook is run against host ansible-pocs-1, but this task is
## delegated to alpha. In practice this means that the task
## first ssh's into alpha and then runs rsync in PUSH mode
## using ansible-pocs-1 as the destination.
- name: sync vagrant folder to remote folder
  synchronize:
    src: /home/vagrant/syncthis-alpha/
    dest: /home/admin/syncthis-pocs1/
    delete: yes
  delegate_to: alpha

Here's the error:

fatal: [ansible-pocs-1 -> None]: FAILED! => {"changed": false, "cmd": "/usr/bin/rsync --delay-updates -F --compress --delete-after --archive --rsh 'ssh -i /Users/patrick/Documents/Development/ansible-pocs/.vagrant/machines/alpha/virtualbox/private_key -S none -o StrictHostKeyChecking=no -o Port=2200' --out-format='<>%i %n%L' \"/home/vagrant/syncthis-alpha/\" \"178.62.50.236:/home/admin/syncthis-pocs1/\"", "failed": true, "msg": "Warning: Identity file /Users/patrick/Documents/Development/ansible-pocs/.vagrant/machines/alpha/virtualbox/private_key not accessible: No such file or directory.\nssh: connect to host 178.62.50.236 port 2200: Connection refused\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]\n", "rc": 12}

There are two interesting things about this error message: the SSH port, and the path to the private key.

Ansible attempts to use incorrect SSH port

Ansible attempts to connect to port 2200 on my remote host. This won't work: 2200 is the ssh port of the vagrant box, while the remote host listens on port 22. Ansible appears to be carrying over settings from the development group inventory file when connecting to a staging group remote host.

Here's the development inventory:

alpha ansible_ssh_host=127.0.0.1 ansible_ssh_port=2200 ansible_ssh_user='vagrant' ansible_ssh_private_key_file='/Users/patrick/Documents/Development/ansible-pocs/.vagrant/machines/alpha/virtualbox/private_key'
...

Here's the staging inventory with no port specified (implicitly using port 22):

ansible-pocs-1 ansible_ssh_host=178.62.50.236
ansible-pocs-2 ansible_ssh_host=178.62.96.61
...

I can fix this issue by being explicit about the port to use on my remote hosts:

ansible-pocs-1 ansible_ssh_host=178.62.50.236 ansible_ssh_port=22
...

Now the output reads:

... -o Port=22 ...

On to the next point of interest:

Ansible attempts to access the private key via a path that exists on the host machine, not the vagrant guest box

Warning: Identity file /Users/patrick/Documents/Development/ansible-pocs/.vagrant/machines/alpha/virtualbox/private_key not accessible: No such file or directory.
Permission denied (publickey).

That path exists on my host machine. Why is Ansible trying to feed it into the rsync command that is being executed on the guest machine? Could this be a bug in the synchronize module?

Furthermore, why doesn't the synchronize module have this problem when syncing between two remote hosts? If you look above at the verbose output from the successful remote-to-remote task, there is no attempt to access a non-existent path. In that case, I believe ssh agent forwarding is taking care of things quite nicely.
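If agent forwarding really is what makes the remote-to-remote case work, then one speculative fix (untested here, and assuming the relevant key is loaded into ssh-agent on my host machine) would be to forward the agent into the vagrant guest, so that rsync running on the guest can authenticate to the remote host without needing any key file path at all:

```ruby
# Speculative Vagrantfile tweak: forward the host's ssh agent into the
# guest, so ssh (and therefore rsync) on the guest can use the host's
# loaded keys without a private key file existing on the guest.
config.ssh.forward_agent = true
```

With the agent forwarded, running `ssh-add -l` inside the guest should list the host's keys.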

I have tried to fix this issue in the same manner as the previous issue - by being explicit in the staging host file as to which private key file to use:

ansible-pocs-1 ansible_ssh_host=178.62.50.236 ansible_ssh_port=22 ansible_ssh_private_key_file=~/.ssh/id_rsa

...but this results in an identical error message.

Could the problem be that the ssh usernames for the vagrant box and the remote host are different? This doesn't seem to matter for the rest of the tasks in the play, which happily switch between various hosts using delegate_to. Regardless, I can be explicit in the inventory about which user to connect with:

ansible-pocs-1 ansible_ssh_host=178.62.50.236 ansible_ssh_port=22 ansible_ssh_private_key_file=~/.ssh/id_rsa ansible_ssh_user=admin

...again, the same error message.

Further investigation

I have also attempted to insert my default public key into the vagrant box's authorized_keys file, in the hope that using the same key across all machines (both vagrant and remote) might help. Here's the code that does this (currently partially commented out, as it didn't fix the problem):

https://github.com/discopatrick/ansible-pocs/blob/feature/sync/Vagrantfile#L13-L25

  # config.ssh.insert_key = false # don't insert secure key, use default insecure key
  # config.ssh.private_key_path = [
  #   "~/.ssh/id_rsa", # the first key in the list is the one used by ansible
  #   "~/.vagrant.d/insecure_private_key", # vagrant will attempt to use subsequent keys on a `vagrant ssh`
  # ]

  # add host default public ssh key to guest authorized_keys file
  config.vm.provision "file", 
    source: "~/.ssh/id_rsa.pub", 
    destination: "~/host_id_rsa.pub"
  config.vm.provision "shell", 
    inline: "cat ~/host_id_rsa.pub >> ~/.ssh/authorized_keys", 
    privileged: false # runs with sudo by default

The error message is almost identical, except now it's trying to find the path to /Users/patrick/.ssh/id_rsa on the vagrant box, which of course doesn't exist:

fatal: [ansible-pocs-1 -> None]: FAILED! => {"changed": false, "cmd": "/usr/bin/rsync --delay-updates -F --compress --delete-after --archive --rsh 'ssh -i /Users/patrick/.ssh/id_rsa -S none -o StrictHostKeyChecking=no -o Port=22' --out-format='<>%i %n%L' \"/home/vagrant/syncthis-alpha/\" \"178.62.50.236:/home/admin/syncthis-pocs1/\"", "failed": true, "msg": "Warning: Identity file /Users/patrick/.ssh/id_rsa not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]\n", "rc": 12}
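One fallback I haven't ruled out is bypassing synchronize entirely and running rsync via the shell module on the guest, so that Ansible never injects a control-machine key path into the command. This is only a sketch (it reuses this question's paths and IP, and assumes the guest can already authenticate to the remote host, e.g. via the authorized_keys provisioning above):

```yaml
# Sketch of a shell-module fallback: run rsync on the vagrant box
# directly, so no key path from the control machine is injected.
- name: sync vagrant folder to remote folder via plain rsync
  shell: >
    rsync --archive --delete
    /home/vagrant/syncthis-alpha/
    admin@178.62.50.236:/home/admin/syncthis-pocs1/
  delegate_to: alpha
```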

User error, or bug?

I thought I would post this here on Stack Overflow before filing it as a bug with the Ansible team, because as meticulous as I feel I've been, I may have missed something simple.

Can anyone help?

1
I had the same first issue as you, but I couldn't fix it even after being explicit about the port and username for the two different hosts (one is 22 and admin, the other 2209 and root). In the end I accomplished my requirement by using the shell module to run the rsync command directly. Here is my issue for Ansible: github.com/ansible/ansible/issues/28209 – Dai Kaixian

1 Answer

0
votes

Got the same problem and the following workaround did the job if you put it before the synchronize task:

    - name: Set correct ssh key path
      set_fact:
        ansible_ssh_private_key_file: "{{ ansible_ssh_private_key_file | realpath }}"
      when: ansible_ssh_private_key_file is defined
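For anyone wondering what this does: Ansible's `realpath` Jinja2 filter canonicalises a path on the control machine, resolving symlinks and relative segments into an absolute path. A quick way to see its effect (the path below is hypothetical, for illustration only):

```yaml
# Illustration: the realpath filter turns a relative/symlinked path
# into an absolute, canonical one on the control machine.
- name: show what the realpath filter produces
  debug:
    msg: "{{ 'roles/../ansible.cfg' | realpath }}"
```

This prints an absolute, symlink-free path, which is presumably why feeding the canonicalised key path into synchronize avoids the broken key-file reference.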