TID 3, ..): java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2449)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1385)
    at java.io.ObjectInputStream.defaultReadF
Hi,

I have been running a batch of data through my application for the last
couple of days, and this morning discovered it had fallen over with the
following error:

java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode

I found the reason; it is about sc. Thanks
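The post does not say what about sc was at fault. As an illustration only,
the kind of explicit SparkContext setup that such fixes usually touch is
sketched below (the master URL and app name are hypothetical):

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustration only: the fix above is not spelled out in the post.
    // A mismatched master URL or a missing jar list on the SparkConf is a
    // common sc-related cause of deserialization errors on executors.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077") // hypothetical master URL
      .setAppName("batch-job")               // hypothetical app name
    val sc = new SparkContext(conf)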
On Tue, Jul 14, 2015 at 9:45 PM, Akhil Das wrote:

> Someone else also reported this error with spark 1.4.0
>
> Thanks
> Best Regards
Someone else also reported this error with spark 1.4.0

Thanks
Best Regards

On Tue, Jul 14, 2015 at 6:57 PM, Arthur Chan wrote:

> Hi, Below is the log from the worker.
Hi, Below is the log from the worker.

15/07/14 17:18:56 ERROR FileAppender: Error writing stream to file
/spark/app-20150714171703-0004/5/stderr
java.io.IOException: Stream closed
    at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
    at java.io.BufferedInputStream.read1(Buf
15/07/14 18:27:40 INFO Executor: Running task 4.0 in stage 174.0 (TID 4517)
15/07/14 18:27:40 INFO Executor: Running task 5.0 in stage 174.0 (TID 4518)
15/07/14 18:27:40 INFO Executor: Running task 6.0 in stage 174.0 (TID 4519)
15/07/14 18:27:40 INFO Executor: Running task 7.0 in stage 174.0 (TID 4520)
15/07/14 18:27:40 INFO Executor: Running task 8.0 in stage 174.0 (TID 4521)
15/07/14 18:27:40 ERROR Executor: Exception in task 1.0 in stage 174.0 (TID 4514)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectI
File at <console>:24

scala> textInput take 10
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage
3.0 (TID 17, hadoop-kn-t503.systems.private):
java.lang.IllegalStateException: unread block data
    at java
I got the same problem; maybe the Java serializer is unstable.
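Several posts in this thread point at the default Java serializer. A minimal
sketch of the common experiment of switching to Kryo, assuming the Spark 1.x
Scala API (the app name and registered class are illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    // Swap the default JavaSerializer for Kryo before creating the context.
    val conf = new SparkConf()
      .setAppName("serializer-test") // hypothetical app name
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Optionally register application classes for smaller Kryo output.
      .registerKryoClasses(Array(classOf[String]))
    val sc = new SparkContext(conf)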
Hello all,

When I try to read data from an HBase table, I get an unread block data
exception. I am running HBase and Spark on a single node (my
workstation). My code is in Java, and I'm running it from the Eclipse
IDE. Here are the versions I'm using:

Cloudera: 2.5.0-cdh5.2.1
Hado
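The poster's code is not included. For context, a minimal sketch of the usual
Spark-on-HBase read path (in Scala rather than the poster's Java; the table
name is hypothetical, and sc is assumed to be an existing SparkContext, e.g.
in spark-shell):

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Result
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat

    // Point the Hadoop input format at the table, then read it as an RDD.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "my_table") // hypothetical

    val rows = sc.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])
    println(rows.count())

A classic trigger for unread block data on this path is the driver and the
executors seeing different versions of the HBase or Hadoop jars.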
Same issue here. Can anyone help, please?
I found the solution.

I used HADOOP_MAPRED_HOME in my environment, which clashes with Spark.
After I set HADOOP_MAPRED_HOME to empty, Spark started working.
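A quick way to confirm what the JVM actually sees for that variable, as a
hedged illustration (run in spark-shell or any Scala REPL):

    // Prints the suspect variable as visible to the current JVM; if it is
    // set, clearing it before launching Spark matched the poster's fix.
    println(sys.env.getOrElse("HADOOP_MAPRED_HOME", "<not set>"))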
Someone just posted a very similar question:
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-IllegalStateException-unread-block-data-tt20668.html

I ran into this a few weeks back -- I can't remember if my jar was built
against a different version of Spark or if I had accidentally
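The jar-versus-cluster mismatch described above is commonly avoided by marking
Spark as a provided dependency, so the application jar never bundles its own
Spark classes. A hedged build.sbt sketch (the version number is illustrative):

    // build.sbt: compile against Spark, but rely on the cluster's Spark
    // jars at run time instead of bundling them into the application jar.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"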
14/12/12 20:25:02 WARN scheduler.TaskSetManager: Lost TID 61 (task 1.0:61)
14/12/12 20:25:02 WARN scheduler.TaskSetManager: Loss was due to
java.lang.IllegalStateException
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2421)
; driver. That's one of the reasons I filed SPARK-4048, but I digress.)

On Tue, Nov 18, 2014 at 1:59 PM, Anson Abraham <anson.abra...@gmail.com> wrote:

> I'm essentially loading a file and saving output to another location:
>
> val source = sc.textFile("/tmp/testfile.txt")
> source.saveAsTextFile("/tmp/testsparkoutput")
>
> when i do so, i'm hitting this error:
> 14/11/18 21:15:08 INFO DAGScheduler: Failed to run saveAsTextFile at
> <console>:15
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
> in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage
> 0.0 (TID 6, cloudera-1.testdomain.net): java.lang.IllegalStateException:
> unread block data
> java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2421)
> java.io.ObjectInputStream.readObject0(ObjectInp
The problem is solved. I basically built a fat Spark jar that includes all the
HBase stuff and sent the examples.jar over to the slaves too.
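The same effect -- making the extra classes visible to every executor -- can
also be had without a fat jar by listing the jars on the SparkConf. A hedged
sketch with illustrative paths:

    import org.apache.spark.{SparkConf, SparkContext}

    // Ship application and HBase jars to the executors explicitly instead
    // of (or in addition to) building one fat jar. Paths are illustrative.
    val conf = new SparkConf()
      .setAppName("hbase-job")
      .setJars(Seq("/path/to/examples.jar", "/path/to/hbase-client.jar"))
    val sc = new SparkContext(conf)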
ter.

Can anyone help with this problem? Tons of thanks!

java.lang.IllegalStateException: unread block data
    java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2399)
    java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1
Hi,

I get exactly the same error. It runs on my local machine but not on the
cluster. I am running the pi.py example.

Best,
Tassilo
Association failed with
[akka.tcp://sparkExecutor@node001:37697]] [
akka.remote.EndpointAssociationException: Association failed with
[akka.tcp://sparkExecutor@node001:37697]
Caused by:
akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
Connection refused: node001/10.180.49.228:37697]

Thanks!
14/10/30 17:51:53 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID
1, node001, ANY, 1265 bytes)
14/10/30 17:51:53 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 1) on
executor node001: java.lang.IllegalStateException (unread block data)
[duplicate 1]
14/10/30 17:51:53 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID
2, node001
Did you ever find a solution to this problem? I'm having similar issues.
https://issues.apache.org/jira/browse/SPARK-1867

The exception:

Exception in thread "main" org.apache.spark.SparkException: Job aborted:
Task 0.0:1 failed 32 times (most recent failure: Exception failure:
java.lang.IllegalStateException: unread block data)
    at org.apache.spark.schedule