The program does not fail consistently. We are executing the Spark driver as a 
standalone Java program. If the cause were a null value in a field, I would 
expect the program to fail on every execution. Instead, it fails with a 
NullPointerException and a task failure only on certain runs, even though the 
input stays the same and we simply restart the program.
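
One way to rule a data null in or out is to count nulls up front, before 
collectAsList, since that check should behave the same on every run. A rough 
sketch, assuming the DataFrame is named df and the suspect column is "id" 
(both are placeholders for our actual schema):

    // Deterministic check: count rows where the suspect column is null
    long nullRows = df.filter(df.col("id").isNull()).count();
    System.out.println("Rows with null id: " + nullRows);

If that count is zero on a run that still fails, the null is more likely 
coming from the code path than from the data.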

Thanks
Bijuna


Sent from my iPhone

> On Feb 17, 2016, at 3:18 AM, Sudhanshu Janghel 
> <sudhanshu.jang...@cloudwick.com> wrote:
> 
> I think the value in the data frame is null for some field. Why not insert 
> a check for it, e.g. verify the field is not null before converting it to a 
> DataFrame?
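> 
> A minimal sketch of that kind of guard (the DataFrame name df and the 
> column "id" are placeholders, not your actual schema):
> 
>     import java.util.List;
>     import org.apache.spark.sql.DataFrame;
>     import org.apache.spark.sql.Row;
> 
>     // Keep only rows whose suspect column is non-null, so a stray null
>     // can never reach the collect step
>     DataFrame clean = df.filter(df.col("id").isNotNull());
>     List<Row> rows = clean.collectAsList();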
> 
> Kind Regards,
> Sudhanshu
> 
>> On 16 Feb 2016, at 8:03 pm, Bijuna <bij...@gmail.com> wrote:
>> 
>> Spark Experts,
>> 
>> We are using the Spark 1.6 Java API in standalone mode on Windows. Our 
>> program reads data from text files using JavaSparkContext and from SQL 
>> Server using SQLContext. 
>> 
>> We use the DataFrame API to filter, join and aggregate the data read from 
>> those sources. The collectAsList operation on the DataFrame fails 
>> intermittently with a NullPointerException and a task failure exception. I 
>> have attached two screenshots of the errors we see.
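>> 
>> For reference, a rough sketch of the pipeline described above; the master 
>> URL, file path, JDBC URL, table, credentials, and column names are 
>> placeholders, not our actual values:
>> 
>>     import java.util.List;
>>     import java.util.Properties;
>>     import org.apache.spark.SparkConf;
>>     import org.apache.spark.api.java.JavaSparkContext;
>>     import org.apache.spark.sql.DataFrame;
>>     import org.apache.spark.sql.Row;
>>     import org.apache.spark.sql.SQLContext;
>> 
>>     public class PipelineSketch {
>>         public static void main(String[] args) {
>>             // Placeholder master; we actually run against our standalone master
>>             SparkConf conf = new SparkConf()
>>                 .setAppName("PipelineSketch").setMaster("local[*]");
>>             JavaSparkContext jsc = new JavaSparkContext(conf);
>>             SQLContext sqlContext = new SQLContext(jsc);
>> 
>>             // Text input (Spark 1.6 text source: one string column "value")
>>             DataFrame textDf = sqlContext.read().text("C:/data/input.txt");
>>             DataFrame filtered = textDf.filter(textDf.col("value").isNotNull());
>> 
>>             // SQL Server read over JDBC
>>             Properties props = new Properties();
>>             props.setProperty("user", "appuser");
>>             props.setProperty("password", "secret");
>>             DataFrame dbDf = sqlContext.read().jdbc(
>>                 "jdbc:sqlserver://dbhost:1433;databaseName=appdb",
>>                 "dbo.facts", props);
>> 
>>             // Filter, join, aggregate, then collect on the driver;
>>             // collectAsList() is the step that fails intermittently
>>             DataFrame joined = filtered.join(
>>                 dbDf, filtered.col("value").equalTo(dbDf.col("key")));
>>             List<Row> rows = joined.groupBy(dbDf.col("key"))
>>                 .count().collectAsList();
>> 
>>             jsc.stop();
>>         }
>>     }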
>> 
>> Any help in debugging and fixing this error will be much appreciated.
>> 
>> Thank you,
>> Bijuna
>> 
>> 
>> Sent from my iPad
>> 
>> <IMG_0337.JPG>
>> <IMG_0338.JPG>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
