mysql - Getting blank table after performing a valid query in Spark SQL
I am new to Spark and facing a weird problem after submitting a valid query in PySpark. The SQL query is:
spark.sql(" select id , entityid , bldgid , leaseid , suiteid , txndate , incomecat , sourcecode , period , dept , actualprojected , ((tchargeamt1*tdays1)/(ttotaldays)+(tchargeamt2*tdays2)/(ttotaldays)) chargeamt , openamt , invoice , currencycode , glclosedstatus , glpostedstatus , paidstatus , frequency , retropd , fcworkbook , fcleaseno , fcsuitid , txndateint fact_cmcharges f join tt on tt.tid = f.id tt.tid <> null ").show()
which works fine. I have saved the dataframe as a temp table using registerTempTable:
spark.sql(" select id, entityid,bldgid,leaseid,suiteid,txndate,incomecat,sourcecode,period,dept,actualprojected,((tchargeamt1*tdays1)/(ttotaldays)+(tchargeamt2*tdays2)/(ttotaldays)) chargeamt ,openamt,invoice,currencycode,glclosedstatus,glpostedstatus,paidstatus,frequency,retropd,fcworkbook,fcleaseno,fcsuitid,txndateint fact_cmcharges f full join tt on tid=f.id tid<>null ").registertemptable('testwithtid')
The registered table shows the same dataframe field values. For testing purposes I have run a test query on it, like:
spark.sql("select id,chargeamt testwithtid").show()
and the result is:
which works fine as well. But when I perform a simple filtered query on the table, like
spark.sql("select id,chargeamt testwithtid id=2740189134848").show()
I am getting this:
It seems weird to me, as I am new to this. There are no missing or incompatible fields in the table, so getting blank values is unexpected. Kindly explain why this is happening and suggest a possible solution. I am using PySpark 2.0. Thanks in advance, Kalyan.
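For reference, a minimal diagnostic sketch, assuming the blank result could come from a type mismatch between the id column and the numeric literal, or from the <> null condition (in SQL's three-valued logic any comparison with null evaluates to null, so such rows are filtered out; the standard form is "is not null"). The cast and the count query below are illustrative assumptions, not a confirmed fix.
spark.sql("select id, chargeamt from testwithtid").printSchema()  # check whether id is a string or a bigint
spark.sql("select id, chargeamt from testwithtid where cast(id as string) = '2740189134848'").show()  # compare as string in case the column type does not match the literal
spark.sql("select count(*) from tt where tid is not null").show()  # 'is not null' instead of '<> null'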