This doesn't seem to be a Spark issue. Does CSVParser work fine with this data
outside of Spark?
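
A quick way to check is to parse one of the sample rows in a plain Scala
program. The exception text suggests "0.00000" is being converted to an Int
somewhere (most likely inside parseLineToTuple, which isn't shown), since
Double parsing accepts that value. The snippet below is only a sketch of that
check; it uses a simple split instead of your CSVParser and assumes the first
decimal column is at index 18:

    import scala.util.Try

    object LocalParseCheck {
      // One of the sample rows from the mail below.
      val sample =
        "android phone,0,0,0,,0,0,0,0,0,0,0,5,0,0,0,5,0,0.00000,0.00000,0.00000,0.00000,0.00000,0,0,0,0,0,0,0,0.00000,0,0,0"

      def main(args: Array[String]): Unit = {
        // limit -1 keeps trailing empty fields
        val fields = sample.split(",", -1)

        // Reproduces the reported failure: "0.00000" is not a valid Int.
        println(Try(fields(18).toInt))    // Failure(java.lang.NumberFormatException: ...)

        // The same field parses fine as a Double.
        println(Try(fields(18).toDouble)) // Success(0.0)
      }
    }

If parseLineToTuple calls .toInt or Integer.parseInt on those columns,
switching them to .toDouble (or wrapping the conversion in Try) should let the
job go through.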

On 20 Sep 2016 2:15 a.m., "Mohamed ismail" <mismai...@yahoo.com.invalid>
wrote:

> Hi all
>
> I am trying to read:
>
> sc.textFile(DataFile).mapPartitions(lines => {
>   val parser = new CSVParser(",")
>   lines.map(line => parseLineToTuple(line, parser))
> })
> Data looks like:
> android phone,0,0,0,,0,0,0,0,0,0,0,5,0,0,0,5,0,0.00000,0.00000,0.00000,0.00000,0.00000,0,0,0,0,0,0,0,0.00000,0,0,0
> ios phone,0,-1,0,,0,0,0,0,0,0,1,0,0,0,0,1,0,0.00000,0.00000,0.00000,0.00000,0.00000,0,0,0,0,0,0,0,0.00000,0,0,0
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 1
> in stage 23055.0 failed 4 times, most recent failure: Lost task 1.3 in
> stage 23055.0 (TID 191607, ):
> java.lang.NumberFormatException: For input string: "0.00000"
>
> Has anyone faced such issues? Is there a solution?
>
> Thanks,
> Mohamed
>
>
