Thanks for attaching the code. If I understand your use case correctly, you
want to call the sentiment analysis code from Spark Streaming, right? For
that I think you can just use jvmr, and I don't think you need SparkR.
SparkR is mainly intended as an API for large-scale jobs that are written
in R.
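For what it's worth, here is a minimal sketch of what calling R from the Scala side could look like. It assumes jvmr's `RInScala` interface (the `eval`/`evalS0` method names are from the jvmr package docs as I recall them, so double-check against the version you install), and `sentiment.R` / `score(text)` are hypothetical names for your own script and function:

```scala
import org.ddahl.jvmr.RInScala

object SentimentFromStreaming {
  def main(args: Array[String]): Unit = {
    val R = RInScala()                        // start an embedded R interpreter
    R.eval("""source("sentiment.R")""")       // hypothetical script defining score(text)

    // Inside a Spark Streaming job this would typically run per partition,
    // e.g. in stream.foreachRDD { rdd => rdd.foreachPartition { ... } },
    // creating one RInScala per partition since R interpreters aren't serializable.
    val text = "the service was great"
    val label = R.evalS0(s"""score("$text")""")  // fetch a single R string back
    println(label)
  }
}
```

You'd instantiate `RInScala` inside `foreachPartition` rather than on the driver, so each executor gets its own R process.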
Thanks Shivaram! Will give it a try and let you know.
Regards,
Pawan Venugopal
On Mon, Apr 7, 2014 at 3:38 PM, Shivaram Venkataraman <
shiva...@eecs.berkeley.edu> wrote:
> You can create standalone jobs in SparkR as just R files that are run
> using the sparkR script. These commands will be sent ...
You can create standalone jobs in SparkR as just R files that are run using
the sparkR script. These commands will be sent to a Spark cluster and the
examples on the SparkR repository (
https://github.com/amplab-extras/SparkR-pkg#examples-unit-tests) are in
fact standalone jobs.
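Concretely, a standalone SparkR job is just an R script along the lines of the examples in that repository. A minimal sketch (the master URL and app name here are placeholders; adjust for your cluster):

```r
library(SparkR)

# Connect to Spark; "local[2]" is a placeholder master URL
sc <- sparkR.init(master = "local[2]", appName = "StandaloneExample")

# Distribute a small dataset, transform it, and collect the result
rdd <- parallelize(sc, 1:100)
doubled <- lapply(rdd, function(x) { 2 * x })
print(sum(unlist(collect(doubled))))
```

You then run it with the sparkR script shipped with the package, e.g. `./sparkR my_job.R local[2]`, just like the pi and word-count examples in the repo.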
However I don't th
Hi,
Is it possible to create a standalone job in Scala using SparkR? If so,
can you provide me with information on the setup process (e.g. the
dependencies in SBT and where to include the JAR files)?
This is my use case:
1. I have a Spark Streaming standalone job running on my local machine