
Can You Use a Script to Start Spark Cluster Nodes?


I'm running Hadoop and Spark on a four-node cluster in AWS EC2.

After a lot of web research, the accepted way to start Spark on a cluster (once Hadoop is running) seems to be:

1) Log into the master node and run start-master.sh.

2) Log into each slave node and run start-slave.sh, passing it the master node's DNS name and port.

My question is: if there are, say, 20 nodes, this is pretty tedious and time-consuming. Is there a way to start Spark from one central location, the way Hadoop is started? When you start Hadoop from the master node, it starts all the slave nodes remotely. I'm looking for a solution like that, or for a Python script that can SSH into the nodes and start them.
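Something along these lines is what I have in mind; it's only a rough sketch, and the hostnames, the /opt/spark install path, and the default 7077 master port are placeholders for my setup. It also assumes passwordless SSH from the machine running the script to every node.

    #!/usr/bin/env python3
    """Sketch: start a Spark standalone cluster over SSH.

    Assumptions: passwordless SSH to every node, Spark installed at
    SPARK_HOME on each node, and placeholder hostnames below.
    """
    import subprocess

    SPARK_HOME = "/opt/spark"              # assumed install path on every node
    MASTER = "ec2-master.example.com"      # placeholder master DNS name
    WORKERS = [                            # placeholder worker DNS names
        "ec2-worker1.example.com",
        "ec2-worker2.example.com",
        "ec2-worker3.example.com",
    ]
    MASTER_URL = f"spark://{MASTER}:7077"  # default standalone master port

    def ssh(host: str, command: str) -> None:
        """Run a single command on a remote host over SSH."""
        subprocess.run(["ssh", host, command], check=True)

    if __name__ == "__main__":
        # Step 1: start the master on the master node.
        ssh(MASTER, f"{SPARK_HOME}/sbin/start-master.sh")

        # Step 2: start a worker on each slave node, pointing it at the master.
        for worker in WORKERS:
            ssh(worker, f"{SPARK_HOME}/sbin/start-slave.sh {MASTER_URL}")

Is rolling my own script like this the usual approach, or is there something built in that does the same thing?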

