Hello guys,
hope all of you are doing well.
I am trying to use HiveContext on Spark 1.6. I am developing in Eclipse, and I placed hive-site.xml on the classpath, so that I use the Hive instance running on my cluster instead of creating a local metastore and a local warehouse.
So far so good, in
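For reference, a minimal sketch of what this setup looks like on Spark 1.6, assuming hive-site.xml is already on the driver classpath (the app name is a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveContextSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveContextSketch")
    val sc = new SparkContext(conf)

    // With hive-site.xml on the classpath, HiveContext picks up the cluster's
    // metastore and warehouse location instead of creating local Derby ones.
    val hiveContext = new HiveContext(sc)

    // Queries now run against the cluster's Hive metastore.
    hiveContext.sql("SHOW TABLES").show()

    sc.stop()
  }
}
```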
Hi,
I have a doubt about Kryo on Spark 1.6.0.
I read that to use Kryo, the class you want to serialize must have a default constructor.
I created a simple class without such a constructor, and if I try to serialize it manually, it does not work.
But if I use that class in Spark
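For what it's worth, the manual check can be sketched with the Kryo library directly (class `Point` here is a made-up example without a zero-arg constructor). By default Kryo instantiates the target class through its zero-arg constructor, so it is deserialization, not writing, that fails; one plausible explanation for the different behaviour inside Spark, worth verifying, is that Spark's chill-based Kryo setup falls back to an instantiator that does not need such a constructor:

```scala
import java.io.ByteArrayOutputStream
import com.esotericsoftware.kryo.Kryo
import com.esotericsoftware.kryo.io.{Input, Output}

// Hypothetical class with no default (zero-arg) constructor.
class Point(val x: Int, val y: Int)

object KryoCtorSketch {
  def main(args: Array[String]): Unit = {
    val kryo = new Kryo()
    kryo.register(classOf[Point])

    // Writing works regardless of constructors...
    val bos = new ByteArrayOutputStream()
    val output = new Output(bos)
    kryo.writeObject(output, new Point(1, 2))
    output.close()

    // ...but reading must instantiate Point, which by default goes
    // through a zero-arg constructor and therefore throws here.
    val input = new Input(bos.toByteArray)
    val p = kryo.readObject(input, classOf[Point])
    println(s"${p.x}, ${p.y}")
  }
}
```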
January 2017 15:12
To: Enrico DUrso
Cc: user@spark.apache.org
Subject: Re: Kryo On Spark 1.6.0
If you don’t mind, could you please share the Scala solution with me? I tried to use Kryo but it seemed not to work at all. I hope to get a practical example. Thanks.
On 10 January 2017, at 19:10, Enrico DUrso
<enrico
according to how Spark works.
How can I register all those classes?
cheers,
From: Richard Startin [mailto:richardstar...@outlook.com]
Sent: 10 January 2017 11:18
To: Enrico DUrso; user@spark.apache.org
Subject: Re: Kryo On Spark 1.6.0
Hi Enrico,
Only set spark.kryo.registrationRequired if you want
Hi,
I am trying to use Kryo on Spark 1.6.0.
I am able to register my own classes and it works, but when I set
"spark.kryo.registrationRequired" to true, I get an error about a Scala class:
"Class is not registered: scala.collection.mutable.WrappedArray$ofRef".
Has any of you already solved
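A sketch of the registration fix usually suggested for this error, assuming Spark 1.6 APIs (the app name and the extra `Array[String]` registration are placeholders; register whatever classes your own error messages name):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object KryoRegistrationSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("KryoRegistrationSketch")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrationRequired", "true")
      // Register not only your own classes but also the Scala internals that
      // end up on the wire, such as the wrapper behind arrays/varargs:
      .registerKryoClasses(Array(
        classOf[scala.collection.mutable.WrappedArray.ofRef[_]],
        classOf[Array[String]]
      ))

    val sc = new SparkContext(conf)
    // ... job code ...
    sc.stop()
  }
}
```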
Hello,
I had a discussion today with a colleague, who said the following:
"We can use Spark as a fast serving layer in our architecture; that is, we can compute an RDD or even a Dataset using Spark SQL, then we can cache it and offer the front-end layer access to our
application in
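As a rough sketch of that idea on Spark 1.6 (the input path and the query are hypothetical): compute a result with Spark SQL, cache it, and let repeated front-end queries hit the cached data:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object CachedServingSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("CachedServingSketch"))
    val sqlContext = new SQLContext(sc)

    // Hypothetical input; any DataFrame source works the same way.
    val events = sqlContext.read.json("hdfs:///data/events.json")
    events.registerTempTable("events")

    val topUsers = sqlContext.sql(
      "SELECT user, COUNT(*) AS hits FROM events GROUP BY user")
    topUsers.cache() // the first action materializes it; later queries reuse it

    topUsers.show()
    sc.stop()
  }
}
```

Whether this makes a good serving layer depends on latency requirements: even a cached DataFrame pays Spark's job-scheduling overhead on every query.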