KafkaLZ4BlockOutputStream is in the kafka-clients jar:

$ jar tvf kafka-clients-0.8.2.0.jar | grep KafkaLZ4BlockOutputStream
  1609 Wed Jan 28 22:30:36 PST 2015 org/apache/kafka/common/message/KafkaLZ4BlockOutputStream$BD.class
  2918 Wed Jan 28 22:30:36 PST 2015 org/apache/kafka/common/message/KafkaLZ4BlockOutputStream$FLG.class
  4578 Wed Jan 28 22:30:36 PST 2015 org/apache/kafka/common/message/KafkaLZ4BlockOutputStream.class

Can you check whether the kafka-clients jar was on the classpath of the
container?
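
If it turns out the jar is not being shipped to the executors, one common fix is to pass it explicitly at submit time. A minimal sketch (the jar path, application jar, and main class below are placeholders, not taken from this thread):

```shell
# Sketch: ship the kafka-clients jar to the YARN containers explicitly
# so executor JVMs can load KafkaLZ4BlockOutputStream at decompress time.
# All paths and the main class are illustrative placeholders.
spark-submit \
  --master yarn \
  --jars /path/to/kafka-clients-0.8.2.0.jar \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```

Building an assembly (fat) jar that bundles kafka-clients would avoid the problem as well; the intermittent failures suggest some executors may be launching without the jar on their classpath.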

Thanks

On Fri, Mar 11, 2016 at 5:00 PM, Siva <sbhavan...@gmail.com> wrote:

> Hi Everyone,
>
> All of a sudden we are encountering the below error from one of the Spark
> consumers. It used to work without any issues before.
>
> When I restart the consumer with the latest offsets, it works fine for a
> while (it executes a few batches) and then fails again; the issue is
> intermittent.
>
> Did anyone come across this issue?
>
> 16/03/11 19:44:50 WARN scheduler.TaskSetManager: Lost task 0.0 in stage
> 1.0 (TID 3, ip-172-31-32-183.us-west-2.compute.internal):
> java.lang.NoClassDefFoundError:
> org/apache/kafka/common/message/KafkaLZ4BlockOutputStream
> at
> kafka.message.ByteBufferMessageSet$.decompress(ByteBufferMessageSet.scala:65)
> at
> kafka.message.ByteBufferMessageSet$$anon$1.makeNextOuter(ByteBufferMessageSet.scala:179)
> at
> kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:192)
> at
> kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:146)
> at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66)
> at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58)
> at scala.collection.Iterator$$anon$1.hasNext(Iterator.scala:847)
> at scala.collection.Iterator$$anon$19.hasNext(Iterator.scala:615)
> at
> org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.getNext(KafkaRDD.scala:160)
> at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
> at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
> at
> org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:199)
> at
> org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:56)
> at
> org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
> at
> org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> at org.apache.spark.scheduler.Task.run(Task.scala:70)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.kafka.common.message.KafkaLZ4BlockOutputStream
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 23 more
>
> Container id: container_1456361466298_0236_01_000002
> Exit code: 50
> Stack trace: ExitCodeException exitCode=50:
>       at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>       at org.apache.hadoop.util.Shell.run(Shell.java:455)
>       at 
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>       at 
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
>       at 
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
>       at 
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
>
>
> Container exited with a non-zero exit code 50
>
> 16/03/11 19:44:55 INFO yarn.YarnAllocator: Completed container 
> container_1456361466298_0236_01_000003 (state: COMPLETE, exit status: 50)
> 16/03/11 19:44:55 INFO yarn.YarnAllocator: Container marked as failed: 
> container_1456361466298_0236_01_000003. Exit status: 50. Diagnostics: 
> Exception from container-launch.
> Container id: container_1456361466298_0236_01_000003
> Exit code: 50
> Stack trace: ExitCodeException exitCode=50:
>       at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>       at org.apache.hadoop.util.Shell.run(Shell.java:455)
>       at 
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>       at 
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
>       at 
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
>       at 
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
>
>
>
>
> Thanks,
> Sivakumar Bhavanari.
>
