Can't sync S3 with EC2 folder from AWS Lambda

I am trying to automate data processing using AWS. I have set up an AWS Lambda function in Python that:

  1. Gets triggered by an S3 PUT event (see the small event-parsing sketch after this list)
  2. SSHes into an EC2 instance using a paramiko layer
  3. Copies the new objects from the bucket into a folder on the instance, unzips the file inside the instance, and runs a Python script that cleans the CSV files.
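
To give some context for step 1: the handler below does not actually use `event`, so here is a minimal sketch of how the bucket and key could be read from a standard S3 PUT notification payload (the variable names are just illustrative, not part of my code):

    record = event['Records'][0]                  # first S3 notification record
    bucket_name = record['s3']['bucket']['name']  # bucket that fired the PUT event
    object_key = record['s3']['object']['key']    # key of the newly uploaded object
    print('Triggered by', bucket_name, object_key)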

The problem is that the AWS CLI call to sync the S3 bucket with the EC2 folder is not working, but when I manually SSH into the EC2 instance and run the command, it works. My AWS CLI is configured with my access keys, and the EC2 instance has an S3 role that allows it full access.

    import boto3
    import time
    import paramiko

    def lambda_handler(event, context):
        # Create a low-level client representing S3
        s3 = boto3.client('s3')
        ec2 = boto3.resource('ec2', region_name='eu-west-a')
        instance_id = 'i-058456c79fjcde676'
        instance = ec2.Instance(instance_id)

        # Start the instance and allow some time for it to come up
        instance.start()
        time.sleep(30)

        # Print a few details of the instance
        print("Instance id - ", instance.id)
        print("Instance public IP - ", instance.public_ip_address)
        print("Instance private IP - ", instance.private_ip_address)
        print("Public dns name - ", instance.public_dns_name)
        print("----------------------------------------------------")

        # Download the private key used for SSH
        print('Downloading pem file')
        s3.download_file('some_bucket', 'some_pem_file.pem', '/tmp/some_pem_file.pem')

        # Allow a few more seconds for the instance to finish booting
        print('waiting for instance to start')
        time.sleep(30)

        print('sshing to instance')
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        privkey = paramiko.RSAKey.from_private_key_file('/tmp/some_pem_file.pem')
        # Username is most likely 'ec2-user', 'root' or 'ubuntu' depending on your EC2 AMI
        # s3_path = "s3://some_bucket/" + object_name
        ssh.connect(instance.public_dns_name, username='ubuntu', pkey=privkey)

        print('inside machine...running commands')
        stdin, stdout, stderr = ssh.exec_command(
            'aws s3 sync s3://some_bucket/ ~/ec2_folder; '
            'bash ~/ec2_folder/unzip.sh; '
            'python3 ~/ec2_folder/process.py;')
        stdin.flush()
        data = stdout.read().splitlines()
        for line in data:
            print(line)
        print('done, closing ssh session')
        ssh.close()

        # Stop the instance
        instance.stop()

        return 'Triggered'
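
The code above only reads stdout, so any error from the sync command is dropped. Here is a minimal debugging sketch (reusing the same `ssh` client, not part of the failing code) that also surfaces stderr and the remote exit status, which should show why the `aws s3 sync` call fails; one assumption worth checking is that the non-interactive SSH session has a different PATH than my login shell:

    # Run only the sync to isolate the failure; capture stderr and the exit status.
    stdin, stdout, stderr = ssh.exec_command('aws s3 sync s3://some_bucket/ ~/ec2_folder')
    exit_status = stdout.channel.recv_exit_status()  # blocks until the remote command finishes
    print('exit status:', exit_status)
    print('stdout:', stdout.read().decode())
    print('stderr:', stderr.read().decode())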
