Thanks a lot for the guidelines.
I could successfully configure and debug.
On Wed, Aug 24, 2016 at 7:05 PM, Jacek Laskowski wrote:
On Wed, Aug 24, 2016 at 2:32 PM, Steve Loughran wrote:
> no reason; the key thing is: not in cluster mode, as there your work happens
> elsewhere
Right! Anything but cluster mode should make it easy (that leaves us
with local).
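For the record, a minimal sketch of what that looks like in practice: local mode with the JDWP agent enabled, so IntelliJ can attach as a Remote debug configuration. The port number is an arbitrary choice, and SPARK_SUBMIT_OPTS is picked up by the launch scripts.

```shell
# Sketch: start spark-shell in local mode with a remote-debug agent enabled.
# suspend=y makes the JVM wait until a debugger attaches on port 5005,
# so breakpoints set before startup are not missed.
export SPARK_SUBMIT_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
./bin/spark-shell --master 'local[*]'
```

With local[*] everything, including query planning, runs in the one JVM you attached to, which is what makes stepping through the code feasible.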
Jacek
--
> On 24 Aug 2016, at 11:38, Jacek Laskowski wrote:
On Wed, Aug 24, 2016 at 11:13 AM, Steve Loughran wrote:
> I'd recommend
...which I mostly agree to with some exceptions :)
> -start spark standalone from there
Why spark standalone, since the OP asked about "learning how query
execution flow occurs in Spark SQL"? How about spark-shell in local mode?
On 24 Aug 2016, at 07:10, Nishadi Kirielle <ndime...@gmail.com> wrote:
Hi,
I'm engaged in learning how query execution flow occurs in Spark SQL. In
order to understand the query execution flow, I'm attempting to run an
example in debug mode with intellij IDEA. It would be great if anyone can
help me with debug configurations.
Thanks & Regards
Nishadi
On Tue, Jun 21,
You can read this documentation to get started with the setup
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ
There was a pyspark setup discussion on SO over here
http://stackoverflow.com/questions/33478218/write-and-run-pyspark-in-intellij-i
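To complement those links, a minimal sketch of the environment variables an IDE run configuration usually needs before `import pyspark` resolves. The install path and the py4j version below are assumptions; adjust them to your own Spark distribution:

```shell
# Sketch: environment for running PySpark code from an IDE run configuration.
# /opt/spark and the py4j zip version are illustrative assumptions.
export SPARK_HOME=/opt/spark
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.1-src.zip:$PYTHONPATH"
```

Once these are set in the run configuration, the same remote-debugging approach discussed above for the JVM side applies, while the Python side can be stepped through with the IDE's ordinary Python debugger.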
Hi all,
I am interested in figuring out how pyspark works at the core/internal level,
and would like to understand the code flow as well.
For that I need to run a simple example in debug mode so that I can trace
the data flow for pyspark.
Can anyone please guide me on how to set up my development environment?