How to make commands execute on Lambda #3

@faromero

Description

How do I run commands from the spark-shell so that they are executed on Lambda? Right now, the commands are being executed locally on my machine, but I would like Lambda to be the backend.
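
For context, a driver-side expression (e.g. evaluating `1 + 1` at the shell prompt) always runs locally in the driver JVM; only RDD actions dispatch tasks to executors. A minimal sketch, assuming the Lambda scheduler backend is wired in as the executor layer, of a job that should fan out to executors rather than run locally:

```scala
// Runs entirely on the driver -- this never leaves the local JVM:
val local = (1 to 100).sum

// An RDD action: Spark ships these map tasks to executors, which
// should be Lambda invocations if the Lambda backend is active.
val distributed = sc.parallelize(1 to 100, 4).map(_ * 2).sum()
println(distributed)
```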

I am running the following command to start the shell (which does start successfully):
```
bin/spark-shell \
  --conf spark.hadoop.fs.s3n.awsAccessKeyId=<my-key> \
  --conf spark.hadoop.fs.s3n.awsSecretAccessKey=<my-secret-key> \
  --conf spark.shuffle.s3.bucket=s3://<my-bucket> \
  --conf spark.lambda.function.name=spark-lambda \
  --conf spark.lambda.s3.bucket=s3://<my-bucket>/lambda \
  --conf spark.lambda.spark.software.version=149
```
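
As a sanity check, you can confirm from inside the shell that these settings were actually picked up (a quick sketch; the config keys are the ones passed on the command line above, and `SparkConf.get` throws if a key is unset):

```scala
// Print the settings the running shell actually received.
println(sc.master)
println(sc.getConf.get("spark.lambda.function.name"))
println(sc.getConf.get("spark.lambda.s3.bucket"))
```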

I have created the Lambda function spark-lambda from the contents of spark-lambda-os.py and have given it S3 and EC2 permissions. In addition, the S3 path s3://<my-bucket>/lambda contains the package spark-lambda-149.zip, which was put together by the spark-lambda script. Is there anything else I need to do to have it execute on Lambda?
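
One way to see where work is really running (a diagnostic sketch, not something from the project docs) is to have each task report its hostname; if the executors are Lambda-backed, these should not be your local machine:

```scala
// Run one tiny task per partition and report where each one executed.
val hosts = sc.parallelize(1 to 4, 4)
  .map(_ => java.net.InetAddress.getLocalHost.getHostName)
  .collect()
  .distinct
hosts.foreach(println)
```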
