We plan to upgrade our Spark cluster to 1.4, and I just ran a test in
local mode following the example here:
http://hortonworks.com/blog/bringing-orc-support-into-apache-spark/

but an exception was thrown when running the example; the stack trace is below:

Exception in thread "main" java.lang.NoSuchFieldError: defaultVal
    at org.apache.spark.sql.hive.HiveContext$$anonfun$newTemporaryConfiguration$1.apply(HiveContext.scala:536)
    at org.apache.spark.sql.hive.HiveContext$$anonfun$newTemporaryConfiguration$1.apply(HiveContext.scala:534)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:105)
    at org.apache.spark.sql.hive.HiveContext$.newTemporaryConfiguration(HiveContext.scala:534)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:165)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
    at com.newegg.ec.bigdata.ORCSpark$.main(ORCSpark.scala:24)
    at com.newegg.ec.bigdata.ORCSpark.main(ORCSpark.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

Line 24 of my code (ORCSpark.scala:24 in the trace) is:
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I am using Spark core 1.4.1 and Hive 1.1.0-cdh5.4.0.
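For reference, the pattern I am testing is roughly the following (condensed from the blog post; the case class, object, and path names here are illustrative, not the original source). It needs a Spark/Hive 1.4.x classpath to run:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative schema for the ORC round-trip test.
case class Person(name: String, age: Int)

object ORCSpark {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("ORCSpark").setMaster("local[*]"))

    // This is the constructor call (line 24) that fails with
    // NoSuchFieldError: defaultVal in my environment.
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    import sqlContext.implicits._

    val people = sc.parallelize(
      Seq(Person("Alice", 30), Person("Bob", 25))).toDF()

    // Write and read back ORC via the DataFrame API added in Spark 1.4.
    people.write.format("orc").save("people_orc")
    val loaded = sqlContext.read.format("orc").load("people_orc")
    loaded.show()

    sc.stop()
  }
}
```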


On Sat, Aug 22, 2015 at 11:18 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> In Spark 1.4, there was considerable refactoring around interaction with
> Hive, such as SPARK-7491.
>
> It would not be straightforward to port ORC support to 1.3.
>
> FYI
>
> On Fri, Aug 21, 2015 at 10:21 PM, dong.yajun <dongt...@gmail.com> wrote:
>
>> hi Ted,
>>
>> thanks for your reply. Is there any other way to do this with Spark 1.3,
>> such as writing the ORC file manually in a foreachPartition call?
>>
>> On Sat, Aug 22, 2015 at 12:19 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> ORC support was added in Spark 1.4.
>>> See SPARK-2883.
>>>
>>> On Fri, Aug 21, 2015 at 7:36 PM, dong.yajun <dongt...@gmail.com> wrote:
>>>
>>>> Hi list,
>>>>
>>>> Is there a way to save an RDD result as an ORC file in Spark 1.3? For
>>>> some reasons we can't upgrade our Spark version to 1.4 right now.
>>>>
>>>> --
>>>> *Ric Dong*
>>>>
>>>>
>>>
>>
>>
>> --
>> *Ric Dong*
>>
>>
>


-- 
*Ric Dong*
