>> but anyway it will print the objectID
>> of the array if the input is the same as you have shown here. Try flatMap()
>> instead of map and check if the problem is the same.
>>
>> --Himanshu
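[Editor's note: Himanshu's point can be sketched with plain Scala collections — no Spark needed, since RDD map/flatMap behave analogously here; the sample lines and object name are made up for illustration.]

```scala
// Sketch with made-up sample lines (plain collections, not an RDD).
// map over split(...) keeps one Array[String] per line, and printing
// an Array shows only its object ID; flatMap flattens the arrays.
object MapVsFlatMapDemo {
  def main(args: Array[String]): Unit = {
    val lines = List("a\tb\tc", "d\te\tf")

    // map: each element stays an Array[String]
    val mapped: List[Array[String]] = lines.map(_.split("\t"))
    println(mapped.head)   // prints something like [Ljava.lang.String;@1b6d3586

    // flatMap: the arrays are flattened into individual fields
    val flattened: List[String] = lines.flatMap(_.split("\t"))
    println(flattened)     // List(a, b, c, d, e, f)
  }
}
```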
It looks like you have an issue with your classpath. I think it is because
you add a jar containing Spark twice: first, you have a dependency on Spark
somewhere in your build tool (this allows you to compile and run your
application); second, you re-add Spark here:

> sc.addJar("/home/hadoop/spark-a
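[Editor's note: one common way to avoid putting Spark on the classpath twice is to mark the dependency as "provided" so the cluster supplies Spark at runtime and no sc.addJar of Spark itself is needed. A build.sbt sketch — the project name and version numbers are assumptions roughly matching this thread's era:]

```scala
// Hypothetical build.sbt sketch; name and versions are assumptions.
// "provided" keeps spark-core out of the assembled application jar,
// so the cluster's own Spark is the only copy on the classpath.
name := "my-spark-app"

scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
```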
--------------
If you reply to this email, your message will be added to the discussion below:
http://apache-spark-user-list.1001560.n3.nabble.com/HELP-I-get-java-lang-String-cannot-be-cast-to-java-lang-Intege-for-a-long-time-tp25666p25667.html
Could you provide some more context? What is rawData?
On 10 December 2015 at 06:38, Bonsen wrote:
> I do like this "val secondData = rawData.flatMap(_.split("\t").take(3))"
>
> and I find:
> 15/12/10 22:36:55 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0,
> 219.216.65.129): java.lang.ClassCastException: java.lang.String cannot be
> cast to java.lang.Integer
> at scala.runtime.BoxesRunTime.u
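[Editor's note: the exception itself can be reproduced without Spark. split() produces Strings, and a String must be parsed with .toInt; an asInstanceOf cast does not convert it, which is exactly what the boxing runtime rejects. A sketch with made-up sample data, not the actual rawData:]

```scala
// Sketch with made-up sample data (not the thread's rawData).
// Parsing converts text to Int; casting does not, and fails at runtime.
object CastDemo {
  def main(args: Array[String]): Unit = {
    val fields: Array[String] = "1\t2\t3".split("\t").take(3)

    val parsed: Array[Int] = fields.map(_.toInt)  // parsing works
    println(parsed.sum)                           // 6

    try {
      // Casting a String to Int goes through unboxing and throws
      // "java.lang.String cannot be cast to java.lang.Integer":
      val bad: Int = fields(0).asInstanceOf[Int]
      println(bad)
    } catch {
      case e: ClassCastException => println("caught: " + e.getMessage)
    }
  }
}
```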