1. Seems like it's spending a lot of time in R (slicing the data, I guess?) and not in Spark.
2. Could you write it out to a CSV file locally and then read it from Spark? (Rough sketch below.)
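Something along these lines might work as a minimal, untested sketch; the file path and CSV options are placeholders, not from the original thread:

library(SparkR)
sparkR.session()   # assumes Spark is already configured for this R session

# Plain R write of the local data.frame; Spark is not involved in this step
write.csv(rdf, "/tmp/rdf.csv", row.names = FALSE)

# Read the CSV back through Spark's csv data source; inferSchema saves
# declaring all 29 columns by hand
df <- read.df("/tmp/rdf.csv", source = "csv",
              header = "true", inferSchema = "true")

read.df goes through the Spark data source API, so the 600K rows get parsed on the Spark side instead of being serialized row by row out of the R process.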
From: ayan guha
Sent: Monday, October 8, 2018 11:21 PM
To: user
Subject: SparkR issue
Hi
We are seeing some weird behaviour in Spark R.
We created an R data.frame with 600K records and 29 columns. Then we tried to
convert the R data.frame to a Spark DataFrame using
df <- SparkR::createDataFrame(rdf)
from RStudio. It hung, and we had to kill the process after 1-2 hours.
We also tried the following:
df <-
Hello,
I am seeing this issue when starting the sparkR shell. Please note that I
have R version 2.14.1.
[root@vertica4 bin]# sparkR
R version 2.14.1 (2011-12-22)
Copyright (C) 2011 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: x86_64-unknown-linux-gnu (64-bit)
R is free software and comes with ABSOLUTELY NO WARRANTY.
Yes, right now we have only tested SparkR with R 3.x.
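For reference, a minimal check (hypothetical, not from the original thread) that you could put at the top of a script to fail fast when the installed R is too old:

# SparkR is only tested against R 3.x, so bail out early on older versions
if (getRversion() < "3.0.0") {
  stop("SparkR needs R >= 3.0; found ", R.version.string)
}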
On Fri, Jun 19, 2015 at 5:53 AM, Kulkarni, Vikram
vikram.kulka...@hp.com wrote:
Hello,
I am seeing this issue when starting the sparkR shell. Please note that I
have R version 2.14.1.
[root@vertica4 bin]# sparkR
R version 2.14.1