Spark already provides an explode function on lateral views. Please see
https://issues.apache.org/jira/browse/SPARK-5573.
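For example, a minimal sketch against a HiveContext in Scala, reusing the
contact table and phoneList.phoneNumber column from the question quoted below
(those names come from that question, not verified here; adjust to your schema):

    // Minimal sketch: explode an array column via a lateral view.
    // Assumes sc is an existing SparkContext and the contact table is
    // visible to the Hive metastore.
    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)
    val phones = hiveContext.sql(
      """SELECT name, phone
        |FROM contact
        |LATERAL VIEW explode(phoneList.phoneNumber) phoneTable AS phone
        |""".stripMargin)
    phones.show()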

On Mon, Jul 13, 2015 at 6:47 AM, David Sabater Dinter <
david.sabater.maill...@gmail.com> wrote:

> It seems this feature was added in Hive 0.13.
> https://issues.apache.org/jira/browse/HIVE-4943
>
> I would assume this is supported, as Spark is by default compiled against
> Hive 0.13.1.
>
> On Sun, Jul 12, 2015 at 7:42 PM, Ruslan Dautkhanov <dautkha...@gmail.com>
> wrote:
>
>> You can see what Spark SQL functions are supported in Spark by doing the
>> following in a notebook:
>> %sql show functions
>>
>>
>> https://forums.databricks.com/questions/665/is-hive-coalesce-function-supported-in-sparksql.html
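>>
>> Outside a notebook, a rough equivalent from the Scala shell would be
>> something like the following (a sketch, assuming an existing HiveContext;
>> SHOW FUNCTIONS is passed through to the underlying Hive support):
>>
>>     // List the functions Spark SQL / Hive currently knows about.
>>     hiveContext.sql("SHOW FUNCTIONS").collect().foreach(println)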
>>
>> I think Spark SQL support is currently around Hive ~0.11?
>>
>>
>>
>> --
>> Ruslan Dautkhanov
>>
>> On Tue, Jul 7, 2015 at 3:10 PM, Jeff J Li <l...@us.ibm.com> wrote:
>>
>>> I am trying to use the posexplode function in the HiveContext to
>>> auto-generate a sequence number. This feature is supposed to be available
>>> in Hive 0.13.0.
>>>
>>> SELECT name, phone FROM contact LATERAL VIEW
>>> posexplode(phoneList.phoneNumber) phoneTable AS pos, phone
>>>
>>> My test program failed with the following:
>>>
>>>         java.lang.ClassNotFoundException: posexplode
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:665)
>>>         at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:942)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:851)
>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:827)
>>>         at org.apache.spark.sql.hive.HiveFunctionWrapper.createFunction(Shim13.scala:147)
>>>         at org.apache.spark.sql.hive.HiveGenericUdtf.function$lzycompute(hiveUdfs.scala:274)
>>>         at org.apache.spark.sql.hive.HiveGenericUdtf.function(hiveUdfs.scala:274)
>>>
>>> Does Spark support the Hive function posexplode? If not, how can I patch
>>> Spark to support it? I am on Spark 1.3.1.
>>>
>>> Thanks,
>>> Jeff Li
>>>
>>>
>>>
>>
>>
>


-- 
Best Regards,
Ayan Guha
