Re: [Spark SQL] does pyspark udf support spark.sql inside def

2020-10-01 Thread Lakshmi Nivedita
Sure, will do that. I am using Impala in PySpark to retrieve the data.
Table A schema: date1 Bigint, date2 Bigint, ctry string
Sample data for table A: date1 = 22-12-2012, date2 = 06-01-2013, ctry = IN
Table B schema: holidate Bigint, Holiday string (0/1); 0 means holiday, 1 means
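A minimal PySpark sketch of one way to handle this scenario with DataFrames: count the holidays from table B that fall between date1 and date2 and subtract them from the total span. The spark.table() names, the column aliases, and the dd-MM-yyyy format (taken from the sample row) are assumptions for illustration, not details confirmed in the thread.

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("holiday-count").getOrCreate()

# Assumption: tables A and B are readable via spark.table(); the sample row
# suggests dd-MM-yyyy values, so cast the bigint dates to string and parse.
a = (spark.table("A")
     .withColumn("d1", F.to_date(F.col("date1").cast("string"), "dd-MM-yyyy"))
     .withColumn("d2", F.to_date(F.col("date2").cast("string"), "dd-MM-yyyy")))
b = (spark.table("B")
     .withColumn("hd", F.to_date(F.col("holidate").cast("string"), "dd-MM-yyyy"))
     .where(F.col("Holiday") == "0"))  # per the post, 0 marks a holiday

# Count the holidays falling inside each [date1, date2] interval,
# then subtract them from the total number of days in the span.
result = (a.join(b, (b["hd"] >= a["d1"]) & (b["hd"] <= a["d2"]), "left")
           .groupBy("d1", "d2", "ctry")
           .agg(F.count(b["hd"]).alias("holidays"))
           .withColumn("total_days", F.datediff(F.col("d2"), F.col("d1")))
           .withColumn("working_days", F.col("total_days") - F.col("holidays")))
result.show()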

Re: [Spark SQL] does pyspark udf support spark.sql inside def

2020-09-30 Thread Amit Joshi
Can you please post the schema of both tables? On Wednesday, September 30, 2020, Lakshmi Nivedita wrote: > Thank you for the clarification. I would like to know how I can proceed for > this kind of scenario in PySpark. > > I have a scenario of subtracting the number of holidays from the total number of

Re: [Spark SQL] does pyspark udf support spark.sql inside def

2020-09-30 Thread Lakshmi Nivedita
Thank you for the clarification. I would like to know how I can proceed for this kind of scenario in PySpark. I have a scenario of subtracting the number of holidays from the total number of days in PySpark using DataFrames. I have dates date1 and date2 in one table and the number of holidays in
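For the "total number of days" part on its own, datediff over two date columns is the usual DataFrame route. A tiny self-contained sketch with toy data (the column names date1 and date2 come from the message; everything else is illustrative):

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()

# Toy rows standing in for the real table; column names taken from the thread.
df = spark.createDataFrame([("2012-12-22", "2013-01-06")], ["date1", "date2"])
df = df.withColumn("total_days",
                   F.datediff(F.to_date("date2"), F.to_date("date1")))
df.show()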

Re: [Spark SQL] does pyspark udf support spark.sql inside def

2020-09-30 Thread Sean Owen
No, you can't use the SparkSession from within a function executed by Spark tasks. On Wed, Sep 30, 2020 at 7:29 AM Lakshmi Nivedita wrote: > Here is a Spark UDF structure as an example > > def sample_fn(x): > spark.sql("select count(Id) from sample where Id = x") > > > spark.udf.regis
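One way to get what the quoted UDF was after, without touching the SparkSession inside a task, is to pre-aggregate the lookup once and join. A sketch assuming registered tables named sample and example as in the quoted code (the left join and the coalesce to 0 for unmatched Ids are illustrative choices, not from the thread):

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()

# Build the per-Id counts once as a DataFrame on the driver ...
counts = spark.table("sample").groupBy("Id").agg(F.count("Id").alias("cnt"))

# ... and join, instead of calling spark.sql() inside a UDF, which would run
# on executors where no SparkSession is available.
result = (spark.table("example")
          .join(counts, on="Id", how="left")
          .select("Id", F.coalesce("cnt", F.lit(0)).alias("cnt")))
result.show()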

[Spark SQL] does pyspark udf support spark.sql inside def

2020-09-30 Thread Lakshmi Nivedita
Here is a Spark UDF structure as an example:

def sample_fn(x):
    spark.sql(f"select count(Id) from sample where Id = {x}")

spark.udf.register("sample_fn", sample_fn)
spark.sql("select Id, sample_fn(Id) from example")

Thanks in advance for the help.
--
k. Lakshmi Nivedita
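If the sample table is small, another pattern (an alternative sketch, not something proposed in the thread) is to collect the counts to the driver and broadcast them, so the registered UDF only reads a plain Python dict and never needs spark.sql:

from pyspark.sql import SparkSession
import pyspark.sql.functions as F
from pyspark.sql.types import LongType

spark = SparkSession.builder.getOrCreate()

# Collect the per-Id counts once on the driver (only sensible for a small table)
# and ship them to the executors as a broadcast variable.
counts = {row["Id"]: row["cnt"]
          for row in spark.table("sample")
                          .groupBy("Id")
                          .agg(F.count("Id").alias("cnt"))
                          .collect()}
bc_counts = spark.sparkContext.broadcast(counts)

@F.udf(returnType=LongType())
def sample_fn(x):
    # The UDF only reads the broadcast dict; it never touches the SparkSession.
    return bc_counts.value.get(x, 0)

spark.table("example").select("Id", sample_fn("Id").alias("cnt")).show()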