the path, and the folder on all workers contains the same .jar files.
Thank you for your help,
Regards,
Jan
From: Mich Talebzadeh [mailto:mich.talebza...@gmail.com]
Sent: Tuesday, November 1, 2016 3:22 PM
To: Jan Botorek
Cc: Vinod Mangipudi ; user
Subject: Re: Add jar files on classpath when submitting tasks to Spark
From: Mich Talebzadeh [mailto:mich.talebza...@gmail.com]
Sent: Tuesday, November 1, 2016 2:51 PM
To: Jan Botorek
Cc: Vinod Mangipudi ; user
Subject: Re: Add jar files on classpath when submitting tasks to Spark
Are you submitting your job through spark-submit?
Dr Mich Talebzadeh
LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8
On 1 November 2016 at 13:04, Vinod Mangipudi <vinod...@gmail.com> wrote:
unsubscribe
On Tue, Nov 1, 2016 at 8:56 AM, Jan Botorek <jan.boto...@infor.com> wrote:
Thank you for the reply.
I am aware of the parameters for spark-submit.
Sent: Tuesday, November 1, 2016 1:49 PM
To: Jan Botorek
Cc: user
Subject: Re: Add jar files on classpath when submitting tasks to Spark
There are options to specify external jars, in the form of --jars, --driver-class-path, etc., depending on the Spark version and cluster manager. Please see the Spark documentation for details.
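For example, a spark-submit invocation using those options might look like this (the class name, master URL, and jar paths are placeholders, not taken from this thread):

spark-submit \
  --class com.example.MyApp \
  --master spark://master:7077 \
  --jars /opt/jars/dep1.jar,/opt/jars/dep2.jar \
  --driver-class-path "/opt/jars/*" \
  myApp.jar

Note that --jars takes a comma-separated list and ships those jars to the executors, while --driver-class-path only affects the driver JVM.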
Hello,
I have a problem trying to make jar files available on the classpath when submitting a task to Spark.
In my spark-defaults.conf file I have the configuration:
spark.driver.extraClassPath = path/to/folder/with/jars
All jars in the folder are available in spark-shell. The problem is that the jars are not available when I submit a task to Spark.
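As a side note: a plain directory on a Java classpath only picks up .class files, so to include every jar in a folder the entry usually needs a /* wildcard, and the executors need their own setting. A minimal spark-defaults.conf sketch (the path is the placeholder from the question above):

spark.driver.extraClassPath   path/to/folder/with/jars/*
spark.executor.extraClassPath path/to/folder/with/jars/*

spark.executor.extraClassPath only helps if the folder exists at the same path on every worker, which is what a later reply in this thread confirms.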
Hello,
From my point of view, it would be more efficient and probably more "readable" if you just extracted the required data using a JSON parsing library (GSON, Jackson), constructed a global object (or pre-processed the data), and then began with the Spark operations.
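A minimal sketch of that approach, assuming Jackson (jackson-databind), a SparkSession named spark as in spark-shell, and a hypothetical Record class:

import com.fasterxml.jackson.databind.ObjectMapper

case class Record(id: String, value: Double)  // hypothetical target shape

val mapper = new ObjectMapper()

// Pre-process outside Spark: parse each raw JSON string and keep only
// the required fields.
val rawLines: Seq[String] = Seq("""{"id":"a","value":1.0}""")
val records = rawLines.map { line =>
  val node = mapper.readTree(line)
  Record(node.get("id").asText, node.get("value").asDouble)
}

// Only then hand the cleaned-up objects to Spark.
val rdd = spark.sparkContext.parallelize(records)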
Jan
From: Kappagant
Hello, Nipun,
In my opinion, "converting the dataframe to an RDD" wouldn't be a costly operation, since DataFrame (Dataset) operations are always executed as RDDs under the hood. I don't know which version of Spark you are running, but I suppose you use 2.0.
I would, therefore, go for:
dataframe.rdd
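A minimal sketch of that conversion, assuming the Spark 2.0 Scala API (the input path and column name are placeholders):

val df = spark.read.json("path/to/input.json")
val rdd = df.rdd  // RDD[Row]; the DataFrame is executed as RDDs underneath anyway
val ids = rdd.map(row => row.getAs[String]("id"))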