I'm having trouble running my Ansible playbook on an AWS EC2 instance. Here is my version:
$ ansible --version
ansible 2.0.0.2
I created an inventory file as:
[my_ec2_instance]
default ansible_host=MY_EC2_ADDRESS ansible_user='ubuntu' ansible_ssh_private_key_file='/home/MY_USER/MY_KEYS/MY_KEY.pem'
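For reference, I have not set any privilege-escalation variables in the inventory. If they turn out to be needed, I assume (going by the Ansible 2.0 variable names) the entry would look roughly like this, with the same placeholder values:

[my_ec2_instance]
default ansible_host=MY_EC2_ADDRESS ansible_user='ubuntu' ansible_ssh_private_key_file='/home/MY_USER/MY_KEYS/MY_KEY.pem' ansible_become=true ansible_become_method=sudo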
Testing the connection to my server succeeds:
$ ansible -i provisioner/inventory my_ec2_instance -m ping
default | SUCCESS => {"changed": false, "ping": "pong"}
Now, when I run my playbook against this inventory, I get the error Timeout (12s) waiting for privilege escalation prompt, as follows:
$ ansible-playbook -i provisioner/inventory -l my_ec2_instance provisioner/playbook.yml

PLAY [Ubuntu14/Python3/Postgres/Nginx/Gunicorn/Django stack] *****

TASK [setup] *******************************************************************
fatal: [default]: FAILED! => {"failed": true, "msg": "ERROR! Timeout (12s) waiting for privilege escalation prompt: "}

NO MORE HOSTS LEFT *************************************************************

PLAY RECAP *********************************************************************
default                    : ok=0    changed=0    unreachable=0    failed=1
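In case it is relevant: my understanding is that the 12 seconds in the message is derived from the SSH timeout setting in ansible.cfg, so it could presumably be raised like this (30 is an arbitrary value), although that would only hide whatever is actually stalling the escalation:

[defaults]
timeout = 30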
If I run the same playbook using .vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory
as the inventory parameter, it works perfectly on my Vagrant instance (which, I believe, proves there is nothing wrong with the playbook/roles themselves).
Also, if I run it with -vvvv, copy the exec ssh line from the output, and run it manually, it indeed connects to the AWS instance without problems.
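For what it's worth, the command I ran manually was roughly of this form (simplified from the -vvvv output; user, host, and key path match my inventory):

ssh -i /home/MY_USER/MY_KEYS/MY_KEY.pem ubuntu@MY_EC2_ADDRESS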
Do I need to add any other parameter to my inventory file to connect to an EC2 instance? What am I missing?