Hi,

I have the following code

object MyJob extends org.apache.spark.Logging {
  ...
  val source: DStream[SomeType] = ...

  source.foreachRDD { rdd =>
    logInfo("+++ForEachRDD+++")
    rdd.foreachPartition { partitionOfRecords =>
      logInfo("+++ForEachPartition+++")
    }
  }
}

I was expecting to see both log messages in the job log.
Unfortunately, the string '+++ForEachPartition+++' never appears in the logs,
because the foreachPartition block never seems to execute.
There is also no error message or anything else in the logs.
Is this a bug or known behavior?
I know that org.apache.spark.Logging is a DeveloperApi, but why does it fail
silently, with no messages?
What should I use instead of org.apache.spark.Logging in Spark Streaming jobs?
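For context, here is a minimal local sketch of the logging pattern I would expect to work inside a partition closure. This is not Spark code: foreachPartitionLike and logPartitions are hypothetical stand-ins that just run a body once per partition, and java.util.logging stands in for whatever logging backend the cluster uses. The point is only that the logger is obtained inside the closure rather than captured from the enclosing object.

```scala
import java.util.logging.Logger

object PartitionLogging {
  // Hypothetical stand-in for rdd.foreachPartition: runs the body once per
  // partition, the way Spark runs the closure once per partition on executors.
  def foreachPartitionLike[T](partitions: Seq[Seq[T]])(body: Iterator[T] => Unit): Unit =
    partitions.foreach(p => body(p.iterator))

  // Logs one message per partition and also returns the messages,
  // so the behavior can be checked without inspecting log output.
  def logPartitions[T](partitions: Seq[Seq[T]]): Seq[String] = {
    val messages = scala.collection.mutable.Buffer[String]()
    foreachPartitionLike(partitions) { it =>
      // Obtain the logger *inside* the closure, so it is created where the
      // closure runs instead of being serialized from the driver.
      val log = Logger.getLogger("MyJob")
      val msg = s"+++ForEachPartition+++ ${it.size} records"
      log.info(msg)
      messages += msg
    }
    messages.toSeq
  }
}
```

In real Spark code the same idea would mean fetching a logger at the top of the foreachPartition body; whether that is the recommended replacement for the Logging trait is exactly what I am asking.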

P.S. running spark 1.4.1 (on yarn)

Thanks in advance
