Hi Team,

I have been trying the out-of-core graph option with the 1.2 branch of Giraph. I keep receiving a buffer underflow exception exactly when superstep 1 starts. Any pointers would be really helpful.
Logs:

2016-07-25 16:08:20,664 ERROR [ooc-io-0] org.apache.giraph.utils.LogStacktraceCallable: Execution of callable failed
java.lang.RuntimeException: call: execution of IO command LoadPartitionIOCommand: (partitionId = 258, superstep = 1) failed!
        at org.apache.giraph.ooc.OutOfCoreIOCallable.call(OutOfCoreIOCallable.java:115)
        at org.apache.giraph.ooc.OutOfCoreIOCallable.call(OutOfCoreIOCallable.java:36)
        at org.apache.giraph.utils.LogStacktraceCallable.call(LogStacktraceCallable.java:67)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: com.esotericsoftware.kryo.KryoException: Buffer underflow.
        at com.esotericsoftware.kryo.io.Input.require(Input.java:199)
        at com.esotericsoftware.kryo.io.UnsafeInput.readInt(UnsafeInput.java:82)
        at com.esotericsoftware.kryo.io.KryoDataInput.readInt(KryoDataInput.java:87)
        at org.apache.hadoop.io.IntWritable.readFields(IntWritable.java:47)
        at org.apache.giraph.ooc.data.DiskBackedPartitionStore.readOutEdges(DiskBackedPartitionStore.java:245)
        at org.apache.giraph.ooc.data.DiskBackedPartitionStore.loadInMemoryPartitionData(DiskBackedPartitionStore.java:278)
        at org.apache.giraph.ooc.data.DiskBackedDataStore.loadPartitionDataProxy(DiskBackedDataStore.java:233)
        at org.apache.giraph.ooc.data.DiskBackedPartitionStore.loadPartitionData(DiskBackedPartitionStore.java:311)
        at org.apache.giraph.ooc.command.LoadPartitionIOCommand.execute(LoadPartitionIOCommand.java:66)
        at org.apache.giraph.ooc.OutOfCoreIOCallable.call(OutOfCoreIOCallable.java:99)
        ... 6 more

Command:

hadoop jar /usr/local/giraph-1.2/giraph-examples/target/giraph-examples-1.2.0-SNAPSHOT-for-hadoop-2.5.1-jar-with-dependencies.jar \
  org.apache.giraph.GiraphRunner \
  -Dmapreduce.task.timeout=12000000 \
  -Dmapred.job.tracker=ip-172-31-42-220.eu-west-1.compute.internal:8021 \
  -Dmapreduce.map.memory.mb=43480 \
  -Dmapreduce.map.java.opts=-Xmx42480m \
  org.apache.giraph.examples.ConnectedComponentsComputation \
  -vif org.apache.giraph.io.formats.IntIntNullTextInputFormat \
  -vip /VUID/input_2B \
  -vof org.apache.giraph.io.formats.IdWithValueTextOutputFormat \
  -op /VUID/ouput_2B \
  -w 4 \
  -ca giraph.userPartitionCount=400,giraph.SplitMasterWorker=true,giraph.isStaticGraph=true,giraph.maxPartitionsInMemory=10,mapred.map.max.attempts=2,giraph.maxMessagesInMemory=100,giraph.numOutputThreads=1,giraph.useOutOfCoreMessages=true,giraph.numInputThreads=1,giraph.useOutOfCoreGraph=true,giraph.numComputeThreads=1,giraph.messagesBufferSize=8192000,giraph.partitionsDirectory=_bs,giraph.useUnsafeSerialization=false

Thanks,
Ramesh