0 votes

I'm writing a playbook that spins up a configurable number of AWS EC2 instances and then installs some software on them (apt packages and pip modules). When I run the playbook, the shell commands are executed on my local system, because Ansible won't run unless I specify a host and I put localhost.

In the playbook, I've tried specifying "hosts: all" at the top level, but that just makes the playbook run for a second without doing anything.

playbook.yml

- name: Spin up spot instances
  hosts: localhost
  connection: local
  vars_files: ansible-vars.yml
  tasks:
    - name: create {{ spot_count }} spot instances with spot_price of ${{ spot_price }}      
      local_action:
        module: ec2
        region: us-east-2
        spot_price:  '{{ spot_price }}'
        spot_wait_timeout: 180
        keypair: My-keypair
        instance_type: t3a.nano
        image: ami-0f65671a86f061fcd
        group: Allow from Home
        instance_initiated_shutdown_behavior: terminate
        wait: yes
        count:  '{{ spot_count }}'
      register: ec2

    - name: Wait for the instances to boot by checking the ssh port
      wait_for: host={{item.public_ip}} port=22 delay=15 timeout=300 state=started
      with_items: "{{ ec2.instances }}"

    - name: test whoami
      args:
        executable: /bin/bash
      shell: whoami
      with_items: "{{ ec2.instances }}"

    - name: Update apt
      args:
        executable: /bin/bash
      shell: apt update
      become: yes
      with_items: "{{ ec2.instances }}"

    - name: Install Python and Pip
      args:
        executable: /bin/bash
      shell: apt install python3 python3-pip -y        
      become: yes
      with_items: "{{ ec2.instances }}"

    - name: Install Python modules
      args:
        executable: /bin/bash
      shell: pip3 install bs4 requests
      with_items: "{{ ec2.instances }}"

ansible-vars.yml

ansible_ssh_private_key_file: ~/.ssh/my-keypair.pem
spot_count: 1
spot_price: '0.002'
remote_user: ubuntu

The EC2 instances get created just fine and the "wait for SSH" task works, but the shell tasks get run on my local system instead of the remote hosts.

How can I tell Ansible to connect to the EC2 instances without using a hosts file since we're creating them on the fly?

You have to use the add_host module to register your dynamically created hosts, then target a second play containing your shell tasks at them (see the sketch below). – Zeitounator
@Zeitounator Could you post an example of that? I tried an example I found, but I really have no idea where to put it. – Dan S
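
A minimal sketch of that add_host approach, assuming the ec2 variable registered in the playbook above: the add_host task goes at the end of the first play, and the shell tasks move into a second play in the same playbook that targets the in-memory group. The new_instances group name is illustrative; the connection variables come from ansible-vars.yml.

    # Last task of the first play (hosts: localhost): register the new
    # instances in an in-memory inventory group
    - name: Add new spot instances to an in-memory group
      add_host:
        name: "{{ item.public_ip }}"
        groups: new_instances
        ansible_user: "{{ remote_user }}"
        ansible_ssh_private_key_file: "{{ ansible_ssh_private_key_file }}"
      with_items: "{{ ec2.instances }}"

# Second play in the same playbook, targeting the in-memory group
- name: Configure the spot instances
  hosts: new_instances
  tasks:
    - name: test whoami
      shell: whoami
      args:
        executable: /bin/bash

    - name: Update apt
      shell: apt update
      args:
        executable: /bin/bash
      become: yes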

1 Answer

0 votes

Can you try this and see if it works?

- name: test whoami
  args:
    executable: /bin/bash
  shell: whoami
  delegate_to: "{{ item.public_ip }}"
  with_items: "{{ ec2.instances }}"
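
Since the new hosts are not in any inventory, Ansible also needs to be told how to log in to them when delegating. A minimal sketch of one way to pass that along with the task, assuming the ubuntu user and the key pair from ansible-vars.yml (the task-level vars here are an assumption about how the delegated connection picks up its user and key):

- name: test whoami
  args:
    executable: /bin/bash
  shell: whoami
  delegate_to: "{{ item.public_ip }}"
  vars:
    # Assumed connection details for the delegated hosts; adjust to your AMI and key
    ansible_user: ubuntu
    ansible_ssh_private_key_file: ~/.ssh/my-keypair.pem
  with_items: "{{ ec2.instances }}"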