elasticsearch - Configuring an Elasticsearch node in Spark -


I'm trying to use the Spark/Elasticsearch connector to read/write data from/to Elasticsearch. How do I configure the ES settings at Spark (2.1.1) initialization? I'm in stand-alone mode. The method found here: https://www.elastic.co/guide/en/elasticsearch/hadoop/master/spark.html using a new SparkConf is not working:

import org.apache.spark.SparkConf

val conf = new SparkConf().setAppName(appName).setMaster(master)
conf.set("es.nodes", "true")

Here's the error message:

org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
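That error suggests a SparkContext was already alive in the JVM when the new SparkConf was applied; the es.* settings have to be on the conf before the one and only context is created. A minimal sketch of that ordering, where the app name, master, and port are placeholders:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: set all es.* options on the SparkConf first,
// then build the single SparkContext for this JVM.
val conf = new SparkConf()
  .setAppName("es-demo")          // placeholder app name
  .setMaster("local[*]")          // placeholder master
  .set("es.nodes", "10.0.5.151")  // the Elasticsearch host from the question
  .set("es.port", "9200")         // assumption: default HTTP port

val sc = new SparkContext(conf)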

I've found a fix here that uses a Map at writing/reading time:

import org.elasticsearch.spark.rdd.EsSpark

EsSpark.saveToEs(rdd, "spark/docs", Map("es.nodes" -> "10.0.5.151"))
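For reference, a self-contained version of that per-call write; the context and sample documents are hypothetical, while the spark/docs index and the host come from the snippet above:

import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark.rdd.EsSpark

// Hypothetical context and data, just to make the call runnable.
val sc = new SparkContext(new SparkConf().setAppName("es-demo").setMaster("local[*]"))
val rdd = sc.makeRDD(Seq(Map("message" -> "hello"), Map("message" -> "world")))

// Per-call configuration: the node address has to travel with every read/write.
EsSpark.saveToEs(rdd, "spark/docs", Map("es.nodes" -> "10.0.5.151"))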

However, is there a better way of setting the node once and for all? Thank you!
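One possible "set once" approach, sketched under the assumption of Spark 2.1.1 with the matching elasticsearch-spark 2.x connector on the classpath, is to put the es.* options on the SparkSession builder; app name, master, and port are placeholders:

import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark._

// Sketch: configure es.* once when the session is built.
val spark = SparkSession.builder()
  .appName("es-demo")               // placeholder app name
  .master("local[*]")               // placeholder master
  .config("es.nodes", "10.0.5.151")
  .config("es.port", "9200")        // assumption: default HTTP port
  .getOrCreate()

// With es.nodes set globally, writes no longer need a per-call Map.
spark.sparkContext.makeRDD(Seq(Map("message" -> "hello"))).saveToEs("spark/docs")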

