Re: Extracting Logical Plan

2023-08-11 Thread Vibhatha Abeykoon
>> Hi Vibhatha, I helped you post this question to another community. There is one answer by someone else for your reference. To access the logical plan or optimized plan, you can register a custom QueryExecutionListener and retrieve the plans during the query execution process.

Re: Extracting Logical Plan

2023-08-02 Thread Vibhatha Abeykoon
Hello Winston, Thanks again for this response, I will check this one out. On Wed, Aug 2, 2023 at 3:50 PM Winston Lai wrote: > > Hi Vibhatha, > > I helped you post this question to another community. There is one answer > by someone else for your reference. > > To ac

Re: Extracting Logical Plan

2023-08-02 Thread Winston Lai
Hi Vibhatha, I helped you post this question to another community. There is one answer by someone else for your reference. To access the logical plan or optimized plan, you can register a custom QueryExecutionListener and retrieve the plans during the query execution process. Here's

Re: Extracting Logical Plan

2023-08-02 Thread Vibhatha Abeykoon
>>> … Java 11.0.19)
>>> Type in expressions to have them evaluated.
>>> Type :help for more information.
>>>
>>> scala> val df = spark.range(0, 10)
>>> df: org.apache.spark.sql.Dataset[Long] = [id: bigint]
>>>
>>> scala> df.queryExecution

Re: Extracting Logical Plan

2023-08-02 Thread Ruifeng Zheng
>> …, Java 11.0.19)
>> Type in expressions to have them evaluated.
>> Type :help for more information.
>>
>> scala> val df = spark.range(0, 10)
>> df: org.apache.spark.sql.Dataset[Long] = [id: bigint]
>>
>> scala> df.queryExecution
>> res0: org.apache.spark

Re: Extracting Logical Plan

2023-08-02 Thread Vibhatha Abeykoon
> … Java 11.0.19)
> Type in expressions to have them evaluated.
> Type :help for more information.
>
> scala> val df = spark.range(0, 10)
> df: org.apache.spark.sql.Dataset[Long] = [id: bigint]
>
> scala> df.queryExecution
> res0: org.apache.spark.sql.execution.QueryExecut

Re: Extracting Logical Plan

2023-08-02 Thread Ruifeng Zheng
…Type in expressions to have them evaluated. Type :help for more information.
scala> val df = spark.range(0, 10)
df: org.apache.spark.sql.Dataset[Long] = [id: bigint]
scala> df.queryExecution
res0: org.apache.spark.sql.execution.QueryExecution =
== Parsed Logical Plan ==
Range (0, 10, step=1, splits=S

Re: Extracting Logical Plan

2023-08-02 Thread Vibhatha Abeykoon
wrote: > Hi Vibhatha, > > How about reading the logical plan from Spark UI, do you have access to > the Spark UI? I am not sure what infra you run your Spark jobs on. Usually > you should be able to view the logical and physical plan under Spark UI in > text version at least. It i

Re: Extracting Logical Plan

2023-08-02 Thread Winston Lai
Hi Vibhatha, How about reading the logical plan from the Spark UI? Do you have access to the Spark UI? I am not sure what infra you run your Spark jobs on. Usually you should be able to view the logical and physical plans in the Spark UI, in text form at least. It is independent of the language

Re: Extracting Logical Plan

2023-08-02 Thread Vibhatha Abeykoon
what platform you are running > your Spark jobs on, what cloud services you are using ... > > On Wednesday, August 2, 2023, Vibhatha Abeykoon > wrote: > >> Hello, >> >> I recently upgraded the Spark version to 3.4.1 and I have encountered a >> few issues.

Re: Extracting Logical Plan

2023-08-01 Thread Winston Lai
ave encountered a > few issues. In my previous code, I was able to extract the logical plan > using `df.queryExecution` (df: DataFrame and in Scala), but it seems like > in the latest API it is not supported. Is there a way to extract the > logical plan or optimized plan from a datafram

Extracting Logical Plan

2023-08-01 Thread Vibhatha Abeykoon
Hello, I recently upgraded the Spark version to 3.4.1 and I have encountered a few issues. In my previous code, I was able to extract the logical plan using `df.queryExecution` (df: DataFrame and in Scala), but it seems like in the latest API it is not supported. Is there a way to extract

restoring SQL text from logical plan

2022-02-16 Thread Wang Cheng
I'm implementing a materialized view feature for Spark. I have built a customized listener that logs the logical plan and physical plan of each SQL query. After some analysis, I can get the most valuable subtree that needs to be materialized. Then I need to restore the subtree of the plan back

Re: toDebugString - RDD Logical Plan

2019-04-23 Thread kanchan tewary
> About the other question, you may use `getNumberPartitions`. > > On Sat, Apr 20, 2019 at 2:40 PM kanchan tewary > wrote: > >> Dear All, >> >> Greetings! >> >> I am new to Apache Spark and working on RDDs using pyspark. I am trying >> to und

Re: toDebugString - RDD Logical Plan

2019-04-20 Thread Dylan Guedes
r question, you may use `getNumberPartitions`. On Sat, Apr 20, 2019 at 2:40 PM kanchan tewary wrote: > Dear All, > > Greetings! > > I am new to Apache Spark and working on RDDs using pyspark. I am trying to > understand the logical plan provided by toDebugString funct

toDebugString - RDD Logical Plan

2019-04-20 Thread kanchan tewary
Dear All, Greetings! I am new to Apache Spark and working on RDDs using pyspark. I am trying to understand the logical plan provided by the toDebugString function, but I find two issues: a) the output is not formatted when I print the result; b) I do not see the number of partitions shown. Can anyone

Re: [Spark Core] Is it possible to insert a function directly into the Logical Plan?

2017-08-14 Thread Jörn Franke
What about accumulators? > On 14. Aug 2017, at 20:15, Lukas Bradley wrote: > > We have had issues with gathering status on long running jobs. We have > attempted to draw parallels between the Spark UI/Monitoring API and our code > base. Due to the separation between

Re: [Spark Core] Is it possible to insert a function directly into the Logical Plan?

2017-08-14 Thread Vadim Semenov
Something like this, maybe?
import org.apache.spark.sql.Dataset
import org.apache.spark.sql.catalyst.expressions.AttributeReference
import org.apache.spark.sql.execution.LogicalRDD
import org.apache.spark.sql.catalyst.encoders.RowEncoder
val df: DataFrame = ???
val spark = df.sparkSession
val

[Spark Core] Is it possible to insert a function directly into the Logical Plan?

2017-08-14 Thread Lukas Bradley
We have had issues with gathering status on long running jobs. We have attempted to draw parallels between the Spark UI/Monitoring API and our code base. Due to the separation between code and the execution plan, even having a guess as to where we are in the process is difficult. The

[Spark Core] Is it possible to insert a function directly into the Logical Plan?

2017-08-11 Thread Lukas Bradley
We have had issues with gathering status on long running jobs. We have attempted to draw parallels between the Spark UI/Monitoring API and our code base. Due to the separation between code and the execution plan, even having a guess as to where we are in the process is difficult. The

Re: Logical Plan

2016-06-30 Thread Mich Talebzadeh
t sure what could be done here. > > Thanks > > On Thu, Jun 30, 2016 at 10:10 PM, Reynold Xin <r...@databricks.com> wrote: > >> Which version are you using here? If the underlying files change, >> technically we should go through optimization again. >> >>

Re: Logical Plan

2016-06-30 Thread Darshan Singh
> Perhaps the real "fix" is to figure out why is logical plan creation so > slow for 700 columns. > > > On Thu, Jun 30, 2016 at 1:58 PM, Darshan Singh <darshan.m...@gmail.com> > wrote: > >> Is there a way I can use same Logical plan for a query. Everything will >

Re: Logical Plan

2016-06-30 Thread Mich Talebzadeh
A logical plan should not change assuming the same DAG diagram is used throughout. Have you tried the Spark GUI page under Stages? This is a Spark 2 example: [image: Inline images 1] HTH Dr Mich Talebzadeh LinkedIn * https://www.linkedin.com/profile/view?id

Re: Logical Plan

2016-06-30 Thread Reynold Xin
Which version are you using here? If the underlying files change, technically we should go through optimization again. Perhaps the real "fix" is to figure out why is logical plan creation so slow for 700 columns. On Thu, Jun 30, 2016 at 1:58 PM, Darshan Singh <darshan.m...@gma

Logical Plan

2016-06-30 Thread Darshan Singh
Is there a way I can use the same logical plan for a query? Everything will be the same except the underlying file will be different. The issue is that my query has around 700 columns, and generating the logical plan takes 20 seconds; it happens every 2 minutes, but every time the underlying file is different. I do