Hi all,
I wrote some code to access a Spark master deployed in standalone mode. I
want to set a breakpoint in the Spark master, which runs in a different
process. I am wondering whether I need to attach to that process in
IntelliJ, so that when AppClient sends the message to
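For what it's worth, one common way to debug a separately running Master process is to start its JVM with the JDWP debug agent enabled, then attach IntelliJ via a "Remote JVM Debug" run configuration. A minimal sketch, assuming the standard `SPARK_MASTER_OPTS` hook and an arbitrary free port 5005:

```shell
# Assumption: port 5005 is free; any free port works.
# JDWP agent options: listen as a server, do not suspend the JVM at startup.
export SPARK_MASTER_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
# Then (re)start the master so its JVM picks up the agent:
# ./sbin/start-master.sh
```

In IntelliJ, create a Remote JVM Debug configuration pointing at the master's host and port 5005; breakpoints in the Master code should then be hit when messages arrive.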
Hi all,
I want to join two DataFrames using Spark SQL in IntelliJ, and wrote the
following code:
df1.as('first).join(df2.as('second), $first._1 === $second._1)
IntelliJ highlights $ and === in red as errors.
I found that $ and === are defined as implicit conversions in
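For reference, those symbols resolve once the session's implicits are imported, and the interpolator form is `$"first._1"` rather than `$first._1`. A minimal sketch, assuming a local SparkSession and two hypothetical DataFrames that each have a column `_1`:

```scala
import org.apache.spark.sql.SparkSession

// Assumption: a local session purely for illustration.
val spark = SparkSession.builder().master("local[*]").appName("join-demo").getOrCreate()
import spark.implicits._ // brings $"..." (StringToColumn) and === on Column into scope

// Hypothetical sample data with columns _1 and _2.
val df1 = Seq((1, "a"), (2, "b")).toDF("_1", "_2")
val df2 = Seq((1, "x"), (3, "y")).toDF("_1", "_2")

// Alias both sides, then join on the _1 column.
// Note the quoted column path $"first._1", not $first._1.
val joined = df1.as("first").join(df2.as("second"), $"first._1" === $"second._1")
```

With the `spark.implicits._` import in scope (and the Spark SQL library on the module classpath), IntelliJ should stop flagging `$` and `===`.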
Hi all,
I ran start-master.sh to start a standalone Spark master at
spark://192.168.1.164:7077. Then I used the command below, and it works:
./bin/spark-shell --master spark://192.168.1.164:7077
The console prints the expected messages, and the Spark context is
initialised correctly.
However, when I