A logical plan should not change, assuming the same DAG is used throughout.


Have you tried the Spark UI page under Stages? This is Spark 2.

Example:

[inline screenshot: Spark UI Stages page]

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 30 June 2016 at 22:10, Reynold Xin <r...@databricks.com> wrote:

> Which version are you using here? If the underlying files change,
> technically we should go through optimization again.
>
> Perhaps the real "fix" is to figure out why logical plan creation is so
> slow for 700 columns.
>
>
> On Thu, Jun 30, 2016 at 1:58 PM, Darshan Singh <darshan.m...@gmail.com>
> wrote:
>
>> Is there a way I can reuse the same logical plan for a query? Everything
>> will be the same except that the underlying file will be different.
>>
>> The issue is that my query has around 700 columns, and generating the
>> logical plan takes 20 seconds. This happens every 2 minutes, but each
>> time the underlying file is different.
>>
>> I do not know these files in advance, so I can't create the table at the
>> directory level. These files are created and then used in the final query.
>>
>> Thanks
>>
>
>
