Use --jars to include your Scala library so that it can be accessed by the JVM
backend.
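For example, the extra jar can be passed when launching the SparkR shell; the jar path and name below are placeholders for your own library:

```shell
# Sketch: start SparkR with an additional Scala library on the classpath
# of the JVM backend. "my-hbase-lib.jar" is a hypothetical jar name.
./bin/sparkR --jars /path/to/my-hbase-lib.jar
```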
From: Michal Haris [michal.ha...@visualdna.com]
Sent: Sunday, July 12, 2015 6:39 PM
To: user@spark.apache.org
Subject: Including additional scala libraries in sparkR
I have a Spark program with a custom optimised RDD for HBase scans and updates.
I have a small library of objects in Scala to support efficient serialisation,
partitioning etc. I would like to use R as an analysis and visualisation
front-end. I have tried to use rJava (i.e. not using sparkR) and I
From: Michal Haris [michal.ha...@visualdna.com]
Sent: Tuesday, July 14, 2015 5:31 PM
To: Sun, Rui
Cc: Michal Haris; user@spark.apache.org
Subject: Re: Including additional scala libraries in sparkR

Ok thanks. It seems that --jars is not behaving correctly; it may be a bug.