SPARK_WORKER_INSTANCES for Spark 2.2.0


In Spark 2.2.0, I don't see the SPARK_WORKER_INSTANCES option for launching multiple workers per node. How do I do this?

If you look at the spark-env.sh file inside the conf directory of your Spark installation, you will see the option SPARK_WORKER_INSTANCES=1. You can change it to the number of workers you want. (If spark-env.sh does not exist yet, copy conf/spark-env.sh.template to conf/spark-env.sh first.)
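As a minimal sketch, assuming a standard Spark 2.2.0 layout, the relevant lines in conf/spark-env.sh might look like this (the instance count, core, and memory values below are example assumptions, not values from the original post):

    # conf/spark-env.sh
    # Run two worker processes on this machine (example value)
    export SPARK_WORKER_INSTANCES=2
    # With multiple workers per node, it is common to cap the cores and
    # memory given to each worker so they do not oversubscribe the machine
    # (example values; size these to your hardware)
    export SPARK_WORKER_CORES=4
    export SPARK_WORKER_MEMORY=4g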

So when Spark is started with sbin/start-all.sh, the number of worker instances you defined should be started on the machine.
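For example, assuming you are in the Spark installation directory and the master runs on the default port, you could start the cluster and then verify the worker count in the master web UI:

    # Start the master and workers defined in conf/spark-env.sh and conf/slaves
    sbin/start-all.sh
    # The master web UI (by default at http://localhost:8080) should now
    # list one worker entry per SPARK_WORKER_INSTANCES on each node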

