scala - Sort a Spark DataFrame / Hive result set
I'm trying to retrieve the list of columns of a Hive table and store the result in a Spark DataFrame.
var my_column_list = hiveContext.sql(s"""show columns in $my_hive_table""")
But I'm unable to sort the DataFrame, or the result of the show columns query, alphabetically. I tried using both sort and orderBy().
How can I sort the result alphabetically?
Update: added a sample of my code
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
hiveContext.sql("use my_test_db")
var lv_column_list = hiveContext.sql(s"""show columns in mytable""")
// WARN LazyStruct: bytes detected at end of row! Ignoring similar problems
lv_column_list.show                   // works fine
lv_column_list.orderBy("result").show // error arises
The show columns query produces a DataFrame with a single column named result. If you order by that column, you get what you want:
val df = hiveContext.sql(s"""show columns in $my_hive_table""")
df.orderBy("result").show
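As an alternative sketch: if the table can be loaded as a DataFrame (e.g. via hiveContext.table), you can skip the show columns query entirely, since DataFrame.columns returns a plain Array[String] that Scala's collections can sort. The core sorting step is ordinary Scala and is simulated below with a hard-coded array of hypothetical column names:

```scala
// Sketch: sorting column names with plain Scala collections.
// With Spark, hiveContext.table("mytable").columns returns Array[String];
// here we stand in a hard-coded array (hypothetical column names).
object SortColumns {
  def main(args: Array[String]): Unit = {
    val columns: Array[String] = Array("user_id", "created_at", "amount", "status")
    val sortedColumns = columns.sorted // lexicographic (alphabetical) order
    sortedColumns.foreach(println)
  }
}
```

Note that .sorted is case-sensitive (uppercase letters sort before lowercase); use columns.sortBy(_.toLowerCase) if your column names mix cases.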