apache spark - Scala - Dynamic code generation using Scala reflection


I have a requirement where I need to add multiple columns to a Spark DataFrame.

I am using dataframe.withColumn to add each new column. I want the code to be dynamically generated at run time, because the new columns to be added are defined by the user at run time.
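(As an aside, dynamically adding user-defined columns usually does not require generating code at all: withColumn returns a new DataFrame, so a run-time list of (name, value) pairs can simply be folded over the frame. A minimal sketch of the fold pattern, using a plain Map as a hypothetical stand-in for the DataFrame so the snippet runs without Spark:)

```scala
// Hypothetical user input: column names and literal values defined at run time.
val userColumns = Seq("segment" -> "soft drinks", "channel" -> "retail")

// Plain-Scala stand-in for the starting DataFrame (column name -> value).
val base = Map("k" -> "key1", "v" -> "42")

// Thread the accumulator through foldLeft, adding one "column" per entry.
// With Spark this would be:
//   userColumns.foldLeft(df) { case (acc, (name, v)) => acc.withColumn(name, lit(v)) }
val result = userColumns.foldLeft(base) { case (acc, (name, value)) =>
  acc + (name -> value)
}
```

Each step returns a new value, so no var reassignment (and no eval) is needed.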

I'm using Eval.scala from the gist below for dynamic execution:

Eval.scala: https://gist.github.com/xuwei-k/9ba39fe22f120cb098f4

The code below works:

val df: DataFrame = sqlContext.read.load("somefile.parquet")

val schema = StructType(
  StructField("k", StringType, true) ::
  StructField("v", IntegerType, false) :: Nil)

// Create an empty DataFrame
var df2: DataFrame = sqlContext.createDataFrame(sc.emptyRDD[Row], schema)

// Add the new column
Eval[Unit](s"${df2 = df.withColumn("""segment""", lit("""soft drinks"""))}")

// Display the DataFrame contents
df2.show

When I try to build the same code above as a string and pass it to Eval, it fails:

var strDF: String = "df2 = df.withColumn(" + """"segment"""" + ", lit(" + """"soft drinks"""" + "))"
Eval[Unit](s"${strDF}")

The above code fails with "not found: value df2".

What am I doing wrong here?
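(One observation that may help diagnose the difference between the two snippets: the s-interpolator evaluates ${...} eagerly at the call site, so in the first version the assignment to df2 runs in the enclosing scope before Eval ever sees the string; Eval is only handed the rendered Unit value. The second version passes a genuine code string, and Eval compiles it as a fresh compilation unit that has no access to the method's local df2, hence "not found: value df2". A minimal, Spark-free sketch of the interpolation behaviour:)

```scala
var count = 0
def bump(): Int = { count += 1; count }

// The interpolator evaluates its arguments immediately, at the call site;
// only the rendered text is carried in the resulting string.
val rendered = s"${bump()}"
// count has already been incremented before `rendered` is passed anywhere
```

So anything inside ${...} runs locally, not inside the evaluated code.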

