Thank you for your help, Mich.

Thanks,
Rajesh
 


    On Wednesday, May 25, 2016 3:14 PM, Mich Talebzadeh 
<mich.talebza...@gmail.com> wrote:
 

You may have some memory issue (OOM, etc.) that terminated the process.
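If you want to confirm that, one quick check (just a sketch; log locations differ between installs) is to look for OOM-killer traces on the node around the time the daemon died:

    # did the kernel OOM killer terminate a JVM? (illustrative commands)
    dmesg | grep -iE 'out of memory|killed process'
    # and see what the daemon logged just before the SIGTERM; the path is an assumption
    grep -i -B 5 'SIGTERM' /var/log/hadoop/*.log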
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
On 25 May 2016 at 10:35, <spark....@yahoo.com> wrote:

Hi Friends,
In the YARN log files of the NodeManager I can see the error below. Can you tell me why I am getting this error?

ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM
Thanks,
Rajesh
 


    On Wednesday, May 25, 2016 1:08 PM, Mich Talebzadeh 
<mich.talebza...@gmail.com> wrote:
 

Yes, check the YARN log files for both the ResourceManager and the NodeManager. Also ensure that you have set up the work directories consistently, especially yarn.nodemanager.local-dirs.
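For example, something along these lines in yarn-site.xml on every node (just a sketch; the directory is made up, point it at whatever local disks you actually use):

    <property>
      <name>yarn.nodemanager.local-dirs</name>
      <!-- comma-separated list of local dirs; /data/yarn/local is only an example -->
      <value>/data/yarn/local</value>
    </property>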
HTH
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
On 25 May 2016 at 08:29, Jeff Zhang <zjf...@gmail.com> wrote:

Could you check the YARN app logs?
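If log aggregation is enabled you can pull them for a finished application with something like (the application id below is just a placeholder):

    yarn logs -applicationId application_1464000000000_0001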

On Wed, May 25, 2016 at 3:23 PM, <spark....@yahoo.com.invalid> wrote:

Hi,
I am running a Spark Streaming job in yarn-client mode. When I run multiple jobs, some of them fail with the error message below (a minimal sketch of how the streaming context is created follows the log). Is there any configuration missing?
ERROR apache.spark.util.Utils - Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
    at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
    at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1749)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1748)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:593)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:878)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:134)
    at com.infinite.spark.SparkTweetStreamingHDFSLoad.init(SparkTweetStreamingHDFSLoad.java:212)
    at com.infinite.spark.SparkTweetStreamingHDFSLoad.main(SparkTweetStreamingHDFSLoad.java:162)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
INFO  org.apache.spark.SparkContext - Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:123)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:878)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:134)
    at com.infinite.spark.SparkTweetStreamingHDFSLoad.init(SparkTweetStreamingHDFSLoad.java:212)
    at com.infinite.spark.SparkTweetStreamingHDFSLoad.main(SparkTweetStreamingHDFSLoad.java:162)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
INFO  apache.spark.storage.DiskBlockManager - Shutdown hook called
INFO  apache.spark.util.ShutdownHookManager - Shutdown hook called
INFO  apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-945fa8f4-477c-4a65-a572-b247e9249061/userFiles-857fece4-83c4-441a-8d3e-2a6ae8e3193a
INFO  apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-945fa8f4-477c-4a65-a572-b247e9249061
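For reference, the streaming context is created roughly like this (a simplified sketch reconstructed from the class names in the stack trace; the app name and batch interval are assumptions):

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class SparkTweetStreamingHDFSLoad {
        public void init() {
            // submitted in yarn-client mode (Spark 1.x style master string)
            SparkConf conf = new SparkConf()
                    .setAppName("SparkTweetStreamingHDFSLoad")   // name is an assumption
                    .setMaster("yarn-client");
            // 10-second batches, chosen only for illustration
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));
            // ... DStream setup, then jssc.start() and jssc.awaitTermination()
        }
    }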
 




-- 
Best Regards

Jeff Zhang



   



  
