[ 
https://issues.apache.org/jira/browse/SPARK-15443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Saisai Shao updated SPARK-15443:
--------------------------------
    Description: 
Currently, calling `explain()` on a streaming Dataset only returns the parsed 
and analyzed logical plans; it throws exceptions for the optimized logical 
plan and the physical plan, like below:

{code}
scala> res0.explain(true)
== Parsed Logical Plan ==
FileSource[file:///tmp/input]
== Analyzed Logical Plan ==
value: string
FileSource[file:///tmp/input]
== Optimized Logical Plan ==
org.apache.spark.sql.AnalysisException: Queries with streaming sources must be 
executed with write.startStream();
== Physical Plan ==
org.apache.spark.sql.AnalysisException: Queries with streaming sources must be 
executed with write.startStream();
{code}

The reason is that Structured Streaming materializes the plan dynamically at 
runtime, so the optimized logical plan and the physical plan do not exist 
until the query is started.

So here we should figure out a way to properly obtain and display the 
streaming plan.
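One possible direction, sketched only (the query handle's name and an 
`explain()` method on it are assumptions, not the current API): since the 
incremental plan only exists once the query is started, `explain` could be 
answered from the running query handle instead of the unstarted Dataset.

{code}
// Sketch only, not working code: assumes the handle returned by
// startStream() eventually exposes an explain() that prints the
// materialized incremental plan of the most recently executed batch.
val query = ds.write.startStream()
query.explain()
{code}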

  was:
Currently when called `explain()` on streaming dataset, it will only get the 
parsed and analyzed logical plan and exceptions for optimized logical plan and 
physical plan, like below:

{code}
scala> res0.explain(true)
== Parsed Logical Plan ==
FileSource[file:///tmp/input]
== Analyzed Logical Plan ==
value: string
FileSource[file:///tmp/input]
== Optimized Logical Plan ==
org.apache.spark.sql.AnalysisException: Queries with streaming sources must be 
executed with write.startStream();
== Physical Plan ==
org.apache.spark.sql.AnalysisException: Queries with streaming sources must be 
executed with write.startStream();
{code}

The reason is that structure streaming dynamically materialize the plan in the 
run-time.

So here we should figure out a way to properly get the streaming plan. 


> Properly explain the streaming queries
> --------------------------------------
>
>                 Key: SPARK-15443
>                 URL: https://issues.apache.org/jira/browse/SPARK-15443
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Streaming
>    Affects Versions: 2.0.0
>            Reporter: Saisai Shao
>            Priority: Minor
>
> Currently, calling `explain()` on a streaming Dataset only returns the 
> parsed and analyzed logical plans; it throws exceptions for the optimized 
> logical plan and the physical plan, like below:
> {code}
> scala> res0.explain(true)
> == Parsed Logical Plan ==
> FileSource[file:///tmp/input]
> == Analyzed Logical Plan ==
> value: string
> FileSource[file:///tmp/input]
> == Optimized Logical Plan ==
> org.apache.spark.sql.AnalysisException: Queries with streaming sources must 
> be executed with write.startStream();
> == Physical Plan ==
> org.apache.spark.sql.AnalysisException: Queries with streaming sources must 
> be executed with write.startStream();
> {code}
> The reason is that Structured Streaming materializes the plan dynamically 
> at runtime.
> So here we should figure out a way to properly obtain and display the 
> streaming plan. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
