After quite a lot of testing, I have figured out all the issues.

A Function can only be defined inside a static normal function body, or as a 
static member variable. A Function can also be defined as a static inner class, 
in which its own member variables and functions can be defined; those variables 
can be passed in when new-ing the Function object, and inside that inner class 
its own normal member functions can be called. 
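A minimal plain-Java sketch of the pattern described above. The Function interface below is a local stand-in for Spark's org.apache.spark.api.java.function.Function, declared only so the example compiles without Spark on the classpath (it mirrors the real interface's call(T) signature and Serializable bound); ScaleRow and its members are illustrative names, not anything from the original code.

```java
import java.io.Serializable;

public class StaticFunctionDemo {
  // Stand-in for org.apache.spark.api.java.function.Function; the real
  // interface has the same call(T) signature and extends Serializable.
  interface Function<T, R> extends Serializable {
    R call(T t) throws Exception;
  }

  // Static inner class: it can hold its own member variables, receive
  // them through the constructor when the Function object is new-ed,
  // and call its own normal member functions from inside call().
  static class ScaleRow implements Function<double[], double[]> {
    private final double factor;       // value passed in at construction

    ScaleRow(double factor) { this.factor = factor; }

    private double scale(double x) {   // its own inner normal function
      return x * factor;
    }

    @Override
    public double[] call(double[] v) {
      double[] out = new double[v.length];
      for (int i = 0; i < v.length; i++) out[i] = scale(v[i]);
      return out;
    }
  }

  public static void main(String[] args) throws Exception {
    double[] r = new ScaleRow(2.0).call(new double[] {1.0, 3.0});
    System.out.println(r[0] + "," + r[1]);
  }
}
```

With the real Spark interface in place, the same class would be usable directly, e.g. rdd.map(new ScaleRow(2.0)), and because it is static it captures no enclosing instance.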


     On Tuesday, November 10, 2015 5:12 PM, Zhiliang Zhu <zchl.j...@yahoo.com> wrote:

 After more testing: the Function passed to map/sortBy etc. must be defined as 
static; otherwise it can be defined as non-static but must then be used from 
another static normal function. I am really confused by this. 


     On Tuesday, November 10, 2015 4:12 PM, Zhiliang Zhu <zchl.j...@yahoo.com.INVALID> wrote:

 Hi All,
I have run into a bug I cannot understand, as follows:
class A {
  private JavaRDD<Vector> _com_rdd;
  ...
  ...

  // here it must be static, but not every Function used by map etc. is
  // static in the code examples in Spark's own official docs
  static Function<Vector, Vector> mapParseRow = new Function<Vector, Vector>() {
    @Override
    public Vector call(Vector v) {
      System.out.println("mark. map log is here");
      Vector rt;
      ... // if this needs to call some other non-static function, how can it be done?
      return rt;
    }
  };

  public void run() { // called from some other public class through an A object
    ...
    JavaRDD<Vector> rdd = (this._com_rdd).map(mapParseRow); // fails when mapParseRow is not static
    ...
  }
}
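If the failure on the non-static version is a NotSerializableException, the likely cause is Java's anonymous-class capture: a Function defined in a non-static context silently holds a reference to the enclosing A instance, and Spark must serialize the whole Function object to ship it to the executors. A plain-Java sketch of that mechanism (no Spark needed; CaptureDemo, Outer, nonStaticFn, and staticFn are illustrative names for this sketch only):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class CaptureDemo {
  interface Fn extends Serializable { int call(int x); }

  static class Outer {                 // NOT Serializable, like class A
    int offset = 10;

    Fn nonStaticFn() {
      // anonymous class in instance context: implicitly captures Outer.this
      return new Fn() { public int call(int x) { return x + offset; } };
    }

    static Fn staticFn(final int offset) {
      // static context: no enclosing instance, only the int is captured
      return new Fn() { public int call(int x) { return x + offset; } };
    }
  }

  // True if the object survives Java serialization, false on
  // NotSerializableException (what Spark's closure shipping would hit).
  static boolean serializes(Object o) {
    try (ObjectOutputStream out =
             new ObjectOutputStream(new ByteArrayOutputStream())) {
      out.writeObject(o);
      return true;
    } catch (NotSerializableException e) {
      return false;
    } catch (IOException e) {
      throw new RuntimeException(e);
    }
  }

  public static void main(String[] args) {
    System.out.println(serializes(new Outer().nonStaticFn()));
    System.out.println(serializes(Outer.staticFn(10)));
  }
}
```

Under that reading, the usual fixes are either to make A implement java.io.Serializable (marking any non-serializable fields transient), or to keep the Function static and pass whatever state it needs in through its constructor.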

Could you comment on this? What should be done? 

Thank you in advance!
Zhiliang 





     On Tuesday, November 10, 2015 11:42 AM, Deng Ching-Mallete <och...@apache.org> wrote:

 Hi Zhiliang,
You should be able to see them in the executor logs, which you can view via the 
Spark UI, in the Executors page (stderr log).
HTH,
Deng

On Tue, Nov 10, 2015 at 11:33 AM, Zhiliang Zhu <zchl.j...@yahoo.com.invalid> 
wrote:

Hi All,
I need to debug a Spark job; my usual way is to print out logs. However, some 
bugs are inside Spark functions such as mapPartitions etc., and no log printed 
from those functions could be found... Would you help point out how to get at 
the logs from Spark's own functions such as mapPartitions? Or, what is the 
general way to debug a Spark job?
Thank you in advance!
Zhiliang 




