It can be called from:
getBytes(file, 0, file.length)
or:
getBytes(segment.file, segment.offset, segment.length)
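For context on why the size matters in both call paths: the "Size exceeds Integer.MAX_VALUE" message comes from `java.nio` itself. `FileChannel.map` takes a `long` size but rejects anything above `Integer.MAX_VALUE` (2^31 - 1 bytes, roughly 2 GB), so any single block that large cannot be memory-mapped. A minimal standalone sketch of that limit (the file path is illustrative only, not from the thread):

```java
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;

public class MapLimit {
    public static void main(String[] args) throws Exception {
        // The path is a throwaway placeholder; the file's actual size is
        // irrelevant because the argument check happens before any I/O.
        try (RandomAccessFile raf = new RandomAccessFile("/tmp/maplimit-demo.bin", "rw");
             FileChannel channel = raf.getChannel()) {
            long tooBig = (long) Integer.MAX_VALUE + 1; // one byte past 2 GB - 1
            try {
                channel.map(FileChannel.MapMode.READ_ONLY, 0, tooBig);
            } catch (IllegalArgumentException e) {
                // The same message that appears in the Spark stacktrace.
                System.out.println(e.getMessage());
            }
        }
    }
}
```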
Cheers
On Thu, Jul 9, 2015 at 2:50 PM, Michal Čizmazia mici...@gmail.com wrote:
Please could anyone give me pointers for appropriate SparkConf to work
around Size exceeds Integer.MAX_VALUE?
Stacktrace:
2015-07-09 20:12:02 INFO (sparkDriver-akka.actor.default-dispatcher-3)
BlockManagerInfo:59 - Added rdd_0_0 on disk on localhost:51132 (size: 29.8
GB)
2015-07-09 20:12:02
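Since the stacktrace shows a single 29.8 GB block, the usual advice for this error is less a SparkConf tweak than a partitioning one: split the RDD into enough partitions that no single cached block approaches 2 GB (e.g. via `rdd.repartition(n)`). If defaults are set in configuration, the relevant keys are sketched below; the values are illustrative assumptions, not recommendations from this thread:

```
# Illustrative only: pick a parallelism high enough that each
# partition of the cached data stays well under 2 GB on disk.
spark.default.parallelism     200

# Threshold above which Spark memory-maps blocks read from disk.
spark.storage.memoryMapThreshold  2m
```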