Cassandra Spark Dataframe, CSV : "query in the SELECT clause of the INSERT INTO/OVERWRITE statement generates the same number of columns as its schema" -


I have a CSV file with around 100 columns. I want to put it into a table with 101 columns (it has 102 columns).

The problem is that I get the following message: `org.apache.spark.sql.cassandra.CassandraSourceRelation requires that the query in the SELECT clause of the INSERT INTO/OVERWRITE statement generates the same number of columns as its schema.`

How can I overcome this problem?

Here is my code:

  df = sqlContext.read()
         .format("csv")
         .option("delimiter", ";")
         .option("header", "true")
         .load("file:///" + namefile);

and then:

  df.repartition(8)
    .select("col1", "col2", ..., "col100")
    .write()
    .mode(SaveMode.Append)
    .saveAsTable("mytable");
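The error indicates that the DataFrame being written must produce exactly as many columns as the target table's schema. Since the CSV has fewer columns than the table, one common workaround (a sketch, not something stated in the original post) is to pad the DataFrame with the missing columns as null literals before writing, e.g. `df = df.withColumn("col101", org.apache.spark.sql.functions.lit(null).cast("string"))` for each column the table has but the CSV lacks. The helper below shows the schema-comparison step in plain Java; the column names used in `main` are hypothetical placeholders:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class MissingColumns {
    // Returns the table columns that the DataFrame lacks; each of these
    // would need to be appended (e.g. as a null literal via withColumn/lit)
    // before the write so the column counts match.
    public static List<String> missing(List<String> dfCols, List<String> tableCols) {
        Set<String> have = new LinkedHashSet<>(dfCols);
        List<String> out = new ArrayList<>();
        for (String c : tableCols) {
            if (!have.contains(c)) {
                out.add(c);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical example: CSV yields 3 columns, table expects 5.
        List<String> dfCols = Arrays.asList("col1", "col2", "col3");
        List<String> tableCols = Arrays.asList("col1", "col2", "col3", "col4", "col5");
        System.out.println(missing(dfCols, tableCols)); // prints [col4, col5]
    }
}
```

In Spark itself, the DataFrame's column list is available via `df.columns()`, and the table's schema can be read from the Cassandra table metadata; looping over the result of a check like this and calling `withColumn(name, lit(null).cast(type))` for each missing column is one way to make the counts line up.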

