>> - Bagel
>> - Support for Hadoop 2.1 and earlier
>> - The ability to configure closure serializer
>> - HTTPBroadcast
>> - TTL-based metadata cleaning
>> - *Semi-private class org.apache.spark.Logging. We suggest you use
>> slf4j directly.*
>> - SparkContext.metricsSystem
>>
>> Thanks,
>>
> *From:* ayan guha [mailto:guha.a...@gmail.com]
> *Sent:* Monday, June 26, 2017 6:26 AM
> *To:* Weiqing Yang
> *Cc:* user
> *Subject:* Re: HDP 2.5 - Python - Spark-On-Hbase
>
> Hi
>
> I am using following:
>
> --packages com.hortonworks:shc:1.0.0-1.6-s_2.10 --repositories
> http://repo.hortonworks.com/content/groups/public/
>
> Is it compatible with Spark 2.X? I would like to use it
>
> Best
> Ayan
>
> On Sat, Jun 24, 2017 at 2:09 AM, Weiqing Yang wrote:
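(The `-1.6-s_2.10` suffix in the coordinates above marks a build for Spark 1.6 and Scala 2.10, which matches the `org.apache.spark.Logging` ClassNotFoundException seen later in this thread: that class is on the Spark 2.0 removals list quoted at the top. For Spark 2.x, a Scala 2.11 build of the connector is needed; a hedged launch sketch, where the exact artifact coordinates are an assumption to be checked against the repository listing:)

```shell
# Launch pyspark with a Spark 2.x-compatible SHC build from the
# Hortonworks repo. shc-core:1.1.1-2.1-s_2.11 is an ASSUMED version
# string; browse the repository to confirm the current one.
pyspark \
  --packages com.hortonworks:shc-core:1.1.1-2.1-s_2.11 \
  --repositories http://repo.hortonworks.com/content/groups/public/
```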
Yes.
What SHC version you were using?
If you hit any issues, you can post them in the SHC GitHub issues. There
are some threads about this already.
On Fri, Jun 23, 2017 at 5:46 AM, ayan guha wrote:
> Hi
>
> Is it possible to use SHC from Hortonworks with pyspark? If so, any
> working code sample available?
>
> Also, I faced an issue while running the samples with Spark 2.0:
> "Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging"
>
> Any workaround?
>
> Thanks in advance
--
Best
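(On the pyspark question: SHC is reachable from Python through the DataFrame reader, since the connector is addressed by its data source name rather than a Scala API. A minimal sketch; the table name, column family, and column names below are made-up examples, and the read call is shown as a comment because it needs a live cluster:)

```python
import json

# HBase table mapping for SHC: a string rowkey plus one column
# in a hypothetical column family "cf1".
catalog = json.dumps({
    "table": {"namespace": "default", "name": "test_table"},
    "rowkey": "key",
    "columns": {
        "key":  {"cf": "rowkey", "col": "key",  "type": "string"},
        "col1": {"cf": "cf1",    "col": "col1", "type": "string"},
    },
})

# With a SparkSession `spark` on a cluster where HBase is reachable,
# the read would look like:
#
#   df = (spark.read
#         .options(catalog=catalog)
#         .format("org.apache.spark.sql.execution.datasources.hbase")
#         .load())
#   df.show()

print(catalog)
```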