Hi,

My configuration is as follows:

SPARK_EXECUTOR_INSTANCES=4
SPARK_EXECUTOR_MEMORY=1G
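
For reference, the same settings can also be supplied programmatically through SparkConf instead of environment variables (a minimal sketch only; the app name and master URL below are placeholders, not my actual values):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ConfSketch {
    public static void main(String[] args) {
        // Equivalent of SPARK_EXECUTOR_INSTANCES / SPARK_EXECUTOR_MEMORY,
        // set as Spark properties on the driver side.
        SparkConf conf = new SparkConf()
                .setAppName("ConfSketch")                   // placeholder app name
                .setMaster("spark://master-host:7077")      // placeholder master URL
                .set("spark.executor.instances", "4")
                .set("spark.executor.memory", "1g");
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.stop();
    }
}
```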

But on my spark UI it shows:

 * Alive Workers: 1
 * Cores in use: 4 Total, 0 Used
 * Memory in use: 6.7 GB Total, 0.0 B Used


Also, while running a Java program on Spark I am getting the following error:

15/09/25 10:35:02 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.0.105): java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2421)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1382)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

I do not understand what is happening. Can anyone help?

--
Thanks and Regards
Madhvi Gupta
