Hi all,
This is more of a general architecture question; I have my own idea, but wanted to
confirm or refute it...
When an executor accesses data, where is it stored: at the executor level
or at the worker level?
jg
Hi Sunitha,
Make the class that contains the common function you are calling
serializable (i.e., have it implement java.io.Serializable).
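A minimal sketch of what that looks like, assuming the common function is
called from a map over an RDD; the class and method names here are
hypothetical:

    import org.apache.spark.sql.SparkSession

    // The helper must be Serializable so Spark can ship the closure that
    // references it to the executors; without this the job fails with
    // java.io.NotSerializableException.
    class CommonFunctions extends Serializable {
      def addTen(x: Int): Int = x + 10
    }

    object SerializableExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("serializable-example")
          .master("local[*]")
          .getOrCreate()
        val helper = new CommonFunctions
        // Referencing `helper` inside map() is safe because it is Serializable.
        spark.sparkContext.parallelize(1 to 5).map(helper.addTen).collect().foreach(println)
        spark.stop()
      }
    }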
Thank you,
Naresh
On Wed, Dec 20, 2017 at 9:58 PM Sunitha Chennareddy <
chennareddysuni...@gmail.com> wrote:
> Hi,
>
> Thank You All..
>
> Here is my requirement, I have a dataframe which contains list
You can add a shutdown hook to your JVM and request the Spark
StreamingContext to stop gracefully.
import org.apache.spark.streaming.StreamingContext

/**
 * Registers a JVM shutdown hook that stops the StreamingContext gracefully
 * @param ssCtx the running StreamingContext
 */
def addShutdownHook(ssCtx: StreamingContext): Unit = {
  Runtime.getRuntime.addShutdownHook(new Thread() {
    override def run(): Unit = {
      // stopGracefully = true lets in-flight batches finish before exit
      ssCtx.stop(stopSparkContext = true, stopGracefully = true)
    }
  })
}
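For example (assuming ssc is your StreamingContext), register the hook
before starting the context:

    addShutdownHook(ssc)
    ssc.start()
    ssc.awaitTermination()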
I'm trying to write a deployment job for a Spark application. Basically the
job sends yarn application -kill app_id to the cluster, but after the
application receives the signal it dies without finishing whatever it is
processing or stopping the stream.
I'm using Spark Streaming. What's the best way to shut it down gracefully?
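A minimal sketch of one approach, not taken from this thread: Spark
Streaming's spark.streaming.stopGracefullyOnShutdown setting asks the
built-in shutdown hook to stop the StreamingContext gracefully when the
JVM receives a termination signal. The app name and batch interval below
are hypothetical:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // With this flag set, a SIGTERM (e.g. from `yarn application -kill`)
    // triggers a graceful stop, subject to YARN's own kill timeout.
    val conf = new SparkConf()
      .setAppName("streaming-app") // hypothetical name
      .set("spark.streaming.stopGracefullyOnShutdown", "true")
    val ssc = new StreamingContext(conf, Seconds(10)) // hypothetical interval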
Hi, I think you are on the right track. You can pack all your parameters
into a suitable data structure, such as an array or a dict, and pass that
structure as a single parameter to your UDF.
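A minimal sketch of that idea in Scala (the question below uses Java, where
the same pattern applies via org.apache.spark.sql.api.java.UDF1); the column
and function names are hypothetical:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{array, col, udf}

    val spark = SparkSession.builder()
      .appName("udf-array-example") // hypothetical name
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, 2, 3), (4, 5, 6)).toDF("a", "b", "c")

    // One Seq[Int] argument stands in for any number of integer columns,
    // so the 22-argument limit on udf() is never hit.
    val sumAll = udf((xs: Seq[Int]) => xs.sum)

    df.withColumn("total", sumAll(array(col("a"), col("b"), col("c")))).show()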
On Fri, 22 Dec 2017 at 2:55 pm, Aakash Basu
wrote:
> Hi,
>
> I am using Spark 2.2 using Java, can anyone please suggest me how
Hi,
I am using Spark 2.2 with Java; can anyone please suggest how to take
more than 22 parameters in a UDF? For instance, could I pass all the
parameters as an array of integers?
Thanks,
Aakash.