What parameters did you pass to the classifier, and what is the size of
your training data? You are hitting that issue because one of the blocks
is over 2 GB; repartitioning the data into more partitions will help.
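
Something along these lines (an untested sketch; "df" stands in for your
training SparkDataFrame, and 1000 partitions is just a starting point to
tune) should keep each block well under the 2 GB limit:

  # Repartition before training so no single partition/block exceeds 2 GB.
  # df is assumed to be your training SparkDataFrame; pick numPartitions
  # based on your data size.
  df <- repartition(df, numPartitions = 1000)
  model <- spark.randomForest(df, label ~ ., type = "classification")

Also note that spark.default.parallelism only sets the default partition
count for newly created RDDs; it does not repartition a DataFrame you have
already loaded, which is likely why setting it alone did not help.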

On Fri, Sep 15, 2017 at 7:55 PM, rpulluru <ranjith.pull...@gmail.com> wrote:

> Hi,
>
> I am using the SparkR randomForest function and running into a
> java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE error.
> It looks like I am hitting
> https://issues.apache.org/jira/browse/SPARK-1476; I set
> spark.default.parallelism=1000 but am still facing the same issue.
>
> Thanks


-- 
Cheers!
