Unsubscribe

2020-04-02 Thread Alfredo Marquez
Please unsubscribe me.

Thanks,

Alfredo


Unsubscribe

2020-03-26 Thread Alfredo Marquez
On Thu, Mar 26, 2020, 8:35 PM Andrew Melo  wrote:

> Hello all,
>
> Is there a way to register classes within a datasourcev2 implementation in
> the Kryo serializer?
>
> I've attempted the following in both the constructor and the static block
> of my top-level class:
>
> SparkContext context = SparkContext.getOrCreate();
> SparkConf conf = context.getConf();
> Class[] classesRegistered = new Class[] {
>     edu.vanderbilt.accre.laurelin.spark_ttree.Reader.class,
>     edu.vanderbilt.accre.laurelin.spark_ttree.Partition.class,
>     edu.vanderbilt.accre.laurelin.spark_ttree.SlimTBranch.class
> };
> conf.registerKryoClasses(classesRegistered);
>
> But (if I'm reading correctly) this is too late: the config has already
> been parsed while initializing the SparkContext, so adding classes to the
> SparkConf has no effect. From what I can tell, the Kryo instance behind it
> is private, so I can't add the registration manually either.
>
> Any thoughts?
> Thanks
> Andrew
>
>
>
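
One mechanism Spark does expose for Kryo registration that does not depend
on mutating the SparkConf after startup is a KryoRegistrator named via the
spark.kryo.registrator setting: it runs when each executor builds its Kryo
instance. It still has to be supplied at submit time by the caller, so it
does not fully solve registering from inside the data source itself. A
minimal sketch, with a hypothetical class name:

    import com.esotericsoftware.kryo.Kryo;
    import org.apache.spark.serializer.KryoRegistrator;

    // Hypothetical registrator; registration happens when each executor
    // constructs its Kryo instance, not after the SparkContext is already up.
    public class LaurelinKryoRegistrator implements KryoRegistrator {
        @Override
        public void registerClasses(Kryo kryo) {
            kryo.register(edu.vanderbilt.accre.laurelin.spark_ttree.Reader.class);
            kryo.register(edu.vanderbilt.accre.laurelin.spark_ttree.Partition.class);
            kryo.register(edu.vanderbilt.accre.laurelin.spark_ttree.SlimTBranch.class);
        }
    }

It would be wired up by setting spark.serializer=org.apache.spark.serializer.KryoSerializer
and spark.kryo.registrator to the registrator's fully qualified name before
the SparkContext is created, e.g. as --conf arguments to spark-submit.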


Re: Spark 2.4.4 with Hadoop 3.2.0

2019-11-25 Thread Alfredo Marquez
Thank you, Ismaël! That's what I was looking for. I can take this to our
platform team.

Alfredo

On Mon, Nov 25, 2019, 3:32 PM Ismaël Mejía  wrote:

> Not officially. Apache Spark only announced support for Hadoop 3.x
> starting with the upcoming Spark 3.
> There is a preview release of Spark 3 with support for Hadoop 3.2 that you
> can try now:
>
> https://archive.apache.org/dist/spark/spark-3.0.0-preview/spark-3.0.0-preview-bin-hadoop3.2.tgz
>
> Enjoy!
>
>
>
> On Tue, Nov 19, 2019 at 3:44 PM Alfredo Marquez <
> alfredo.g.marq...@gmail.com> wrote:
>
>> I would also like to know the answer to this question.
>>
>> Thanks,
>>
>> Alfredo
>>
>> On Tue, Nov 19, 2019, 8:24 AM bsikander  wrote:
>>
>>> Hi,
>>> Are Spark 2.4.4 and Hadoop 3.2.0 compatible?
>>> I tried to search the mailing list but couldn't find anything relevant.
>>>
>>>
>>>
>>>
>>>
>>>
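
As a practical check of what a given Spark build actually ships, the bundled
Hadoop client version can be printed at runtime; VersionInfo comes with the
Hadoop client jars on Spark's classpath. A small sketch (the class name is
made up, and local[1] just keeps it self-contained):

    import org.apache.hadoop.util.VersionInfo;
    import org.apache.spark.sql.SparkSession;

    public class HadoopVersionCheck {
        public static void main(String[] args) {
            // Start a throwaway local session just to load Spark and its Hadoop client.
            SparkSession spark = SparkSession.builder()
                    .master("local[1]")
                    .appName("hadoop-version-check")
                    .getOrCreate();
            System.out.println("Spark version:  " + spark.version());
            System.out.println("Hadoop version: " + VersionInfo.getVersion());
            spark.stop();
        }
    }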


Re: SparkR integration with Hive 3 spark-r

2019-11-22 Thread Alfredo Marquez
Does anyone else have any insight into this question?

Thanks,

Alfredo

On Mon, Nov 18, 2019, 3:00 PM Alfredo Marquez 
wrote:

> Hello Nicolas,
>
> Well, the issue is that with Hive 3, Spark gets its own metastore,
> separate from the Hive 3 metastore. So how do you reconcile this
> separation of metastores?
>
> Can you continue to use "enableHiveSupport" and still connect to Hive 3?
> Does this connection take advantage of Hive's LLAP?
>
> Our team doesn't believe that it's possible to make the connection as you
> would in the past. But if it is that simple, I would be ecstatic.
>
> Thanks,
>
> Alfredo
>
> On Mon, Nov 18, 2019, 12:53 PM Nicolas Paris 
> wrote:
>
>> Hi Alfredo
>>
>> my 2 cents:
>> To my knowledge, and from reading the Spark 3 pre-release notes, it will
>> handle Hive metastore 2.3.5; there is no mention of a Hive 3 metastore. I
>> ran several tests on this in the past [1], and Spark seems to handle any
>> Hive metastore version.
>>
>> However, Spark cannot read Hive managed tables, a.k.a. transactional
>> tables. So I would say you should be able to read any regular Hive 3
>> table with any of Spark, PySpark, or SparkR.
>>
>>
>> [1]
>> https://parisni.frama.io/posts/playing-with-hive-spark-metastore-versions/
>>
>> On Mon, Nov 18, 2019 at 11:23:50AM -0600, Alfredo Marquez wrote:
>> > Hello,
>> >
>> > Our company is moving to Hive 3, and they are saying that there is no
>> > SparkR implementation in Spark 2.3.x+ that will connect to Hive 3. Is
>> > this true?
>> >
>> > If it is true, will this be addressed in the Spark 3 release?
>> >
>> > I don't use Python, so losing SparkR to get work done on Hadoop is a
>> > huge loss.
>> >
>> > P.S. This is my first email to this community; if there is something I
>> > should do differently, please let me know.
>> >
>> > Thank you
>> >
>> > Alfredo
>>
>> --
>> nicolas
>>
>>
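
A minimal sketch of the kind of configuration Nicolas describes, assuming
the Hive 3 metastore's thrift URI is known; the URI, app name, and class
name are placeholders, and 2.3.5 is simply the client version mentioned
above rather than anything confirmed against a Hive 3 metastore:

    import org.apache.spark.sql.SparkSession;

    public class HiveMetastoreSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("hive-metastore-sketch")
                    // Placeholder URI for the external Hive metastore.
                    .config("hive.metastore.uris", "thrift://metastore-host:9083")
                    // Metastore client version Spark should use (2.3.5 per the note above).
                    .config("spark.sql.hive.metastore.version", "2.3.5")
                    // Fetch a matching Hive client from Maven instead of using the built-in jars.
                    .config("spark.sql.hive.metastore.jars", "maven")
                    .enableHiveSupport()
                    .getOrCreate();

            // Regular (non-transactional) tables should be readable this way;
            // managed/ACID tables will not be, as noted above.
            spark.sql("SHOW DATABASES").show();
            spark.stop();
        }
    }

These are plain Spark/Hive configs, so the SparkR equivalent would be to
pass them as sparkConfig entries to sparkR.session() with enableHiveSupport
turned on.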


Re: Spark 2.4.4 with Hadoop 3.2.0

2019-11-19 Thread Alfredo Marquez
I would also like to know the answer to this question.

Thanks,

Alfredo

On Tue, Nov 19, 2019, 8:24 AM bsikander  wrote:

> Hi,
> Are Spark 2.4.4 and Hadoop 3.2.0 compatible?
> I tried to search the mailing list but couldn't find anything relevant.
>
>
>
>
>
>


Re: SparkR integration with Hive 3 spark-r

2019-11-18 Thread Alfredo Marquez
Hello Nicolas,

Well, the issue is that with Hive 3, Spark gets its own metastore, separate
from the Hive 3 metastore. So how do you reconcile this separation of
metastores?

Can you continue to use "enableHiveSupport" and still connect to Hive 3?
Does this connection take advantage of Hive's LLAP?

Our team doesn't believe that it's possible to make the connection as you
would in the past. But if it is that simple, I would be ecstatic.

Thanks,

Alfredo

On Mon, Nov 18, 2019, 12:53 PM Nicolas Paris 
wrote:

> Hi Alfredo
>
> my 2 cents:
> To my knowledge, and from reading the Spark 3 pre-release notes, it will
> handle Hive metastore 2.3.5; there is no mention of a Hive 3 metastore. I
> ran several tests on this in the past [1], and Spark seems to handle any
> Hive metastore version.
>
> However, Spark cannot read Hive managed tables, a.k.a. transactional
> tables. So I would say you should be able to read any regular Hive 3
> table with any of Spark, PySpark, or SparkR.
>
>
> [1]
> https://parisni.frama.io/posts/playing-with-hive-spark-metastore-versions/
>
> On Mon, Nov 18, 2019 at 11:23:50AM -0600, Alfredo Marquez wrote:
> > Hello,
> >
> > Our company is moving to Hive 3, and they are saying that there is no
> > SparkR implementation in Spark 2.3.x+ that will connect to Hive 3. Is
> > this true?
> >
> > If it is true, will this be addressed in the Spark 3 release?
> >
> > I don't use Python, so losing SparkR to get work done on Hadoop is a
> > huge loss.
> >
> > P.S. This is my first email to this community; if there is something I
> > should do differently, please let me know.
> >
> > Thank you
> >
> > Alfredo
>
> --
> nicolas
>
>


SparkR integration with Hive 3 spark-r

2019-11-18 Thread Alfredo Marquez
Hello,

Our company is moving to Hive 3, and they are saying that there is no
SparkR implementation in Spark 2.3.x+ that will connect to Hive 3. Is
this true?

If it is true, will this be addressed in the Spark 3 release?

I don't use Python, so losing SparkR to get work done on Hadoop is a huge
loss.

P.S. This is my first email to this community; if there is something I
should do differently, please let me know.

Thank you

Alfredo