Re: Size exceeds Integer.MAX_VALUE issue with RandomForest

2017-09-18 Thread Pulluru Ranjith
Lost task 0.0 in stage 10.0 (TID 66, node1.test, executor 1): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE On Sat, Sep 16, 2017 at 8:54 PM, Akhil Das wrote: > What are the parameters you passed to the classifier and what is the size > of your train data? You are hitting that is

Re: Size exceeds Integer.MAX_VALUE issue with RandomForest

2017-09-16 Thread Akhil Das
randomForest function and running into > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE issue. > Looks like I am running into this issue > https://issues.apache.org/jira/browse/SPARK-1476, I used > spark.default.parallelism=1000 but still facing the same issue. > > Th

Size exceeds Integer.MAX_VALUE issue with RandomForest

2017-09-15 Thread rpulluru
Hi, I am using sparkR randomForest function and running into java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE issue. Looks like I am running into this issue https://issues.apache.org/jira/browse/SPARK-1476, I used spark.default.parallelism=1000 but still facing the same issue
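The partition-count workaround tried above (spark.default.parallelism=1000) only helps if it actually brings each partition's serialized block under the ~2 GB ByteBuffer cap. A rough sizing sketch, using entirely hypothetical dataset and target sizes (not figures from this thread):

```java
// Sketch: choose a partition count that keeps each partition's
// serialized block comfortably under the 2 GB per-block cap.
// The dataset and target sizes below are hypothetical examples.
public class PartitionSizing {
    static final long MAX_BLOCK = Integer.MAX_VALUE; // hard cap per block (~2 GB)

    // Ceiling division: how many partitions so each holds <= targetBlockBytes.
    static int safePartitionCount(long datasetBytes, long targetBlockBytes) {
        if (targetBlockBytes <= 0 || targetBlockBytes > MAX_BLOCK) {
            throw new IllegalArgumentException("target must be in (0, 2GB]");
        }
        return (int) ((datasetBytes + targetBlockBytes - 1) / targetBlockBytes);
    }

    public static void main(String[] args) {
        long dataset = 500L * 1024 * 1024 * 1024; // assume ~500 GB of train data
        long target  = 128L * 1024 * 1024;        // aim for ~128 MB per partition
        System.out.println(safePartitionCount(dataset, target)); // 4000
    }
}
```

In Spark this count would feed `repartition(n)` or `spark.default.parallelism`; the 128 MB target mirrors a common HDFS block size and leaves ample headroom under the limit.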

[StackOverflow] Size exceeds Integer.MAX_VALUE When Joining 2 Large DFs

2016-11-25 Thread Gerard Maas
This question seems to deserve an escalation from Stack Overflow: http://stackoverflow.com/questions/40803969/spark-size-exceeds-integer-max-value-when-joining-2-large-dfs Looks like an important limitation. -kr, Gerard. Meta:PS: What do you think would be the best way to escalate from SO? Should

Random forest classifier error : Size exceeds Integer.MAX_VALUE

2016-10-23 Thread Kürşat Kurt
Hi; I am trying to train a Random forest classifier. I have a predefined classification set (classifications.csv, ~300,000 lines). While fitting, I am getting the "Size exceeds Integer.MAX_VALUE" error. Here is the code: object Test1 { var savePath = "c:/Temp/Spar

Re: Size exceeds Integer.MAX_VALUE

2016-07-24 Thread Andrew Ehrlich
>> Hi, >> >> Please help! >> >> My spark: 1.6.2 >> Java: java8_u40 >> >> I am trying random forest training, I got "Size exceeds Integer.MAX_VALUE". >> >> Any idea how to resolve it?

Re: Size exceeds Integer.MAX_VALUE

2016-07-24 Thread Ascot Moss
ng the data in memory. Make sure you are using Kryo > serialization. > > Andrew > > On Jul 23, 2016, at 9:00 PM, Ascot Moss wrote: > > > Hi, > > Please help! > > My spark: 1.6.2 > Java: java8_u40 > > I am trying random forest training, I got

Re: Size exceeds Integer.MAX_VALUE

2016-07-23 Thread Andrew Ehrlich
, > > Please help! > > My spark: 1.6.2 > Java: java8_u40 > > I am trying random forest training, I got " Size exceeds Integer.MAX_VALUE". > > Any idea how to resolve it? > > > (the log) > 16/07/24 07:59:49 ERROR Executor: Exception in task 0.0 in

Size exceeds Integer.MAX_VALUE

2016-07-23 Thread Ascot Moss
Hi, Please help! My spark: 1.6.2 Java: java8_u40 I am trying random forest training, I got " Size exceeds Integer.MAX_VALUE". Any idea how to resolve it? (the log) 16/07/24 07:59:49 ERROR Executor: Exception in task 0.0 in stage 7.0 (TID 25) java.lang.IllegalArgumentException: Si

Re: Error:java.lang.RuntimeException: java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2016-02-24 Thread Yin Yang
a, the code "channel.map(MapMode.READ_ONLY, offset, > length)" will be called, and the "map" function's parameter "length" has a > type of "long", but the valid range is "Integer". > > This results in the error: Size exce

Error:java.lang.RuntimeException: java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2016-02-24 Thread xiazhuchang
range is "Integer". This results in the error: Size exceeds Integer.MAX_VALUE. Should the valid range of length be changed to Long.MAX_VALUE? Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 in stage 2.0 failed 4 times, most rece
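The root cause described in this thread is easy to reproduce outside Spark: `FileChannel.map` accepts a `long` size but validates it against `Integer.MAX_VALUE` before touching the file, because a `MappedByteBuffer` is indexed by `int`. A minimal, self-contained Java demonstration (the temp file is created on the fly; nothing here is Spark-specific):

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Minimal reproduction of the error in these threads: FileChannel.map()
// takes a long size but rejects anything over Integer.MAX_VALUE up front,
// which is why any single Spark block over ~2 GB fails in DiskStore.getBytes.
public class MapLimitDemo {
    public static String tryMap(long size) throws IOException {
        Path tmp = Files.createTempFile("map-limit", ".bin");
        try (FileChannel ch = FileChannel.open(tmp, StandardOpenOption.READ,
                                               StandardOpenOption.WRITE)) {
            ch.map(FileChannel.MapMode.READ_ONLY, 0, size);
            return "mapped ok";
        } catch (IllegalArgumentException e) {
            return e.getMessage(); // argument check fires before any real I/O
        } finally {
            Files.deleteIfExists(tmp);
        }
    }

    public static void main(String[] args) throws IOException {
        // One byte past the limit triggers the same exception as the threads;
        // on OpenJDK the message is typically "Size exceeds Integer.MAX_VALUE".
        System.out.println(tryMap((long) Integer.MAX_VALUE + 1));
    }
}
```

Because the argument check runs before the mapping is attempted, this fails identically whether the file is 1 byte or 1 TB.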

RE: Size exceeds Integer.MAX_VALUE on EMR 4.0.0 Spark 1.4.1

2015-11-16 Thread Ewan Leith
big it would be, I assume it’s over 2 GB From: Zhang, Jingyu [mailto:jingyu.zh...@news.com.au] Sent: 16 November 2015 10:17 To: user Subject: Size exceeds Integer.MAX_VALUE on EMR 4.0.0 Spark 1.4.1 I am using spark-csv to save files in s3; it shows "Size exceeds". Please let me know how to fix it

Re: Size exceeds Integer.MAX_VALUE on EMR 4.0.0 Spark 1.4.1

2015-11-16 Thread Sabarish Sasidharan
> .format("com.databricks.spark.csv") > .option("header", "true") > .save("s3://newcars.csv"); > > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:860) >

Re: Size exceeds Integer.MAX_VALUE (SparkSQL$TreeNodeException: sort, tree) on EMR 4.0.0 Spark 1.4.1

2015-11-16 Thread Zhang, Jingyu
xceeds. Please let > me know how to fix it. Thanks. > > df.write() > .format("com.databricks.spark.csv") > .option("header", "true") > .save("s3://newcars.csv"); > > java.lang.IllegalArgumentExcep

Size exceeds Integer.MAX_VALUE on EMR 4.0.0 Spark 1.4.1

2015-11-16 Thread Zhang, Jingyu
I am using spark-csv to save files in s3; it shows "Size exceeds". Please let me know how to fix it. Thanks. df.write() .format("com.databricks.spark.csv") .option("header", "true") .save("s3://newcars.csv"); java.lang.IllegalArgumentException: S

Re: [Spark] java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2015-10-30 Thread Yifan LI
>> On 29 Oct 2015, at 12:52, Yifan LI wrote: >> >> Hey, >> >> I was just trying to scan a large RDD sortedRdd, ~1billion elements, >> using toLocalIterator api, but an exception returned as it was almost >> finished: >> >> java.lang.RuntimeException

Re: [Spark] java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2015-10-29 Thread Deng Ching-Mallete
e RDD sortedRdd, ~1billion elements, using > toLocalIterator api, but an exception returned as it was almost finished: > > java.lang.RuntimeException: java.lang.IllegalArgumentException: Size > exceeds Integer.MAX_VALUE > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:821) > a

Re: [Spark] java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2015-10-29 Thread Yifan LI
> java.lang.RuntimeException: java.lang.IllegalArgumentException: Size exceeds > Integer.MAX_VALUE > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:821) > at > org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:125) > at > org.

[Spark] java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2015-10-29 Thread Yifan LI
Hey, I was just trying to scan a large RDD sortedRdd, ~1billion elements, using toLocalIterator api, but an exception returned as it was almost finished: java.lang.RuntimeException: java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE at sun.nio.ch.FileChannelImpl.map

Re: ERROR: "Size exceeds Integer.MAX_VALUE" Spark 1.5

2015-10-05 Thread Anuj Kumar
d unable to > resolve this error, please help > > > org.apache.spark.SparkException: Job aborted due to stage failure: Task > 815 in stage 13.0 failed 4 times, most recent failure: Lost task 815.3 in > stage 13.0 (TID 3612, amd-014.test.com): java.lang.RuntimeException: > java.

ERROR: "Size exceeds Integer.MAX_VALUE" Spark 1.5

2015-10-05 Thread Muhammad Ahsan
error, please help org.apache.spark.SparkException: Job aborted due to stage failure: Task 815 in stage 13.0 failed 4 times, most recent failure: Lost task 815.3 in stage 13.0 (TID 3612, amd-014.test.com): java.lang.RuntimeException: java.lang.IllegalArgumentException: Size exceeds Integer.MA

Re: work around Size exceeds Integer.MAX_VALUE

2015-07-09 Thread Michal Čizmazia
launch worker-0) Executor:96 - > Exception in task 0.0 in stage 0.0 (TID 0) > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:836) > at > org.apache.spark.storag

Re: work around Size exceeds Integer.MAX_VALUE

2015-07-09 Thread Matei Zaharia
mail.com>> wrote: > Which release of Spark are you using ? > > Can you show the complete stack trace ? > > getBytes() could be called from: > getBytes(file, 0, file.length) > or: > getBytes(segment.file, segment.offset, segment.length) > > Cheers >

Re: work around Size exceeds Integer.MAX_VALUE

2015-07-09 Thread Michal Čizmazia
0.0 (TID 0) java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:836) at org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:125) at org.apache.spark.storage.DiskStore$$anonfun

Re: work around Size exceeds Integer.MAX_VALUE

2015-07-09 Thread Ted Yu
could anyone give me pointers for appropriate SparkConf to work > around "Size exceeds Integer.MAX_VALUE"? > > Stacktrace: > > 2015-07-09 20:12:02 INFO (sparkDriver-akka.actor.default-dispatcher-3) > BlockManagerInfo:59 - Added rdd_0_0 on disk on localhost:51132 (size: 29.8

work around Size exceeds Integer.MAX_VALUE

2015-07-09 Thread Michal Čizmazia
Please could anyone give me pointers for appropriate SparkConf to work around "Size exceeds Integer.MAX_VALUE"? Stacktrace: 2015-07-09 20:12:02 INFO (sparkDriver-akka.actor.default-dispatcher-3) BlockManagerInfo:59 - Added rdd_0_0 on disk on localhost:51132 (size: 29.8 GB) 2015-07-0

Re: Size exceeds Integer.MAX_VALUE exception when broadcasting large variable

2015-02-13 Thread Soila Pertet Kavulya
the broadcast variable exceeds 2GB. >> > Any >> > ideas on how I can resolve this issue? >> > >> > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE >> > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:829) >>

Re: Size exceeds Integer.MAX_VALUE exception when broadcasting large variable

2015-02-13 Thread Imran Rashid
> following exception when the size of the broadcast variable exceeds 2GB. > Any > > ideas on how I can resolve this issue? > > > > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE > > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:829)

Re: Size exceeds Integer.MAX_VALUE exception when broadcasting large variable

2015-02-13 Thread Sean Owen
exceeds 2GB. Any > ideas on how I can resolve this issue? > > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:829) > at org.apache.spark.storage.DiskStore.getBytes(DiskSto

Size exceeds Integer.MAX_VALUE exception when broadcasting large variable

2015-02-13 Thread soila
I am trying to broadcast a large 5GB variable using Spark 1.2.0. I get the following exception when the size of the broadcast variable exceeds 2GB. Any ideas on how I can resolve this issue? java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE at
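One workaround suggested for oversized broadcasts like this is to split the payload into sub-2 GB chunks and broadcast each chunk separately. A sketch of just the chunking step, with toy sizes (the 5 GB figure from the post is replaced by a 10-byte array purely for illustration; in real code each chunk would go through its own `sc.broadcast(...)` call):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the chunking half of a "split the broadcast" workaround:
// since a single block cannot exceed ~2 GB, cut the payload into pieces
// no larger than maxChunk bytes and broadcast each piece on its own.
public class ChunkedBroadcast {
    public static List<byte[]> chunk(byte[] payload, int maxChunk) {
        if (maxChunk <= 0) {
            throw new IllegalArgumentException("maxChunk must be positive");
        }
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < payload.length; off += maxChunk) {
            int len = Math.min(maxChunk, payload.length - off);
            byte[] part = new byte[len];
            System.arraycopy(payload, off, part, 0, len);
            chunks.add(part);
        }
        return chunks;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[10];
        System.out.println(chunk(payload, 4).size()); // 3 chunks: 4 + 4 + 2 bytes
    }
}
```

Executors would then reassemble the chunks (or consume them piecewise); for a real 5 GB variable, a maxChunk well under `Integer.MAX_VALUE`, e.g. 512 MB, keeps each broadcast block safely inside the limit.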

Re: java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2014-09-24 Thread Victor Tso-Guillen
task ID 40599 > > java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE > > at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:789) > > at org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:108) > > at > org.apache.spark.storage.BlockManager.doGetLocal

java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

2014-09-24 Thread Victor Tso-Guillen
Really? What should we make of this? 24 Sep 2014 10:03:36,772 ERROR [Executor task launch worker-52] Executor - Exception in task ID 40599 java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:789) at

Re: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread Nicholas Chammas
Which appears in turn to be caused by SPARK-1476. On Wed, Sep 17, 2014 at 9:14 PM, francisco wrote: > Looks like this is a known issue: > > https://issues.apache.org/jira/browse/SPARK-1353 > > > > -- > View this message in context: > http://apac

Re: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread francisco
Looks like this is a known issue: https://issues.apache.org/jira/browse/SPARK-1353 -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Size-exceeds-Integer-MAX-VALUE-in-BlockFetcherIterator-tp14483p14500.html Sent from the Apache Spark User List mailing list ar

Re: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread Burak Yavuz
probably will not work giving the "exceeds Integer.MAX_VALUE" error. Best, Burak - Original Message - From: "francisco" To: u...@spark.incubator.apache.org Sent: Wednesday, September 17, 2014 3:18:29 PM Subject: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread francisco
empty blocks out of 1 blocks 14/09/17 13:33:30 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 0 remote fetches in 8 ms 14/09/17 13:33:30 ERROR BlockFetcherIterator$BasicBlockFetcherIterator: Error occurred while fetching local blocks java.lang.IllegalArgumentException: Size ex