That's string interpolation. You could create your own placeholder convention, for example :bind, and then call replaceAll to substitute each named parameter.
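A minimal sketch of that idea in Java (the class name, method name, and :id placeholder are all illustrative, not an existing API): build the query text by replacing each named placeholder before handing the string to spark.sql().

```java
import java.util.Map;

public class NamedParams {
    // Replace each ":name" placeholder in the query with its value.
    // The \b boundary keeps ":a" from also matching inside ":ab".
    public static String bind(String sql, Map<String, String> params) {
        String out = sql;
        for (Map.Entry<String, String> e : params.entrySet()) {
            out = out.replaceAll(":" + e.getKey() + "\\b", e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        String q = bind("select * from events where id = :id",
                        Map.of("id", "100"));
        System.out.println(q); // select * from events where id = 100
        // spark.sql(q);  // then pass the finished string to Spark
    }
}
```

Note this is plain text substitution, not real bind parameters, so values are not escaped; it is only suitable for trusted inputs.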
On Wed, Nov 28, 2018, 18:55 Mann Du wrote:
> Hello there,
>
> I am trying to pass parameters in spark.sql query in Java code, the same
> as in this link
>
>
Hello there,
I am trying to pass parameters in spark.sql query in Java code, the same
as in this link
https://forums.databricks.com/questions/115/how-do-i-pass-parameters-to-my-sql-statements.html
The link suggests putting 's' before 'select', as in -
val param = 100
spark.sql(s""" select * from
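The s prefix in that snippet is Scala string interpolation, which Java does not have. A minimal Java sketch of the same idea using String.format (the table and column names here are made up for illustration):

```java
public class FormatQuery {
    public static void main(String[] args) {
        int param = 100;
        // Java has no string interpolation; String.format substitutes
        // the parameter into the query text instead.
        String query = String.format(
                "select * from events where count > %d", param);
        System.out.println(query); // select * from events where count > 100
        // spark.sql(query);  // then pass the finished string to Spark
    }
}
```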
Are you referring to having Spark pick up a newly built jar? If so, you can
probably script that in bash.
Thank You,
Irving Duran
On Wed, Nov 28, 2018 at 12:44 PM Mina Aslani wrote:
> Hi,
>
> I have a question for you.
> Do we need to kill a spark job every time we change and deploy it to
>
Hi,
I have a question for you.
Do we need to kill a Spark job every time we change it and redeploy it to the
cluster? Or is there a way for Spark to automatically pick up the most recent
jar version?
Best regards,
Mina
I ran into problems using 5.19, so I went back to 5.17 and that resolved my
issues.
On Wed, Nov 28, 2018 at 2:48 AM Conrad Lee wrote:
> Hello Vadim,
>
> Interesting. I've only been running this job at scale for a couple weeks
> so I can't say whether this is related to recent EMR changes.
>
> Much
Hi Muthuraman,
Prior to 0.9, Kafka had no built-in security features, but DStreams support
only Kafka 0.8. I suggest using Structured Streaming, where Kafka 0.10+
(2.0.0 in Spark 2.4) support is available.
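A minimal sketch of reading Kafka through Structured Streaming in Java, assuming the spark-sql-kafka-0-10 artifact is on the classpath; the broker address and topic name are placeholders:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class KafkaStructuredStream {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-structured-sketch")
                .getOrCreate();

        // The Structured Streaming Kafka source targets Kafka 0.10+,
        // so broker-side security features are usable here.
        Dataset<Row> df = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker:9092") // placeholder
                .option("subscribe", "my-topic")                  // placeholder
                .load();

        // Kafka records arrive as binary key/value columns; cast to strings.
        df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");
    }
}
```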
BR,
G
On Wed, Nov 28, 2018 at 4:15 AM Ramaswamy, Muthuraman <
muthuraman.ramasw...@viasat.com>