With the binary distribution I think it might not be possible, but if you can download
the sources and build Spark yourself, you can remove this function
https://github.com/apache/spark/blob/master/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1023
which initializes the SQLContext.
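Short of rebuilding Spark, one workaround sketch (assuming Spark 1.x, where spark-shell exposes `sc` as the SparkContext) is to simply ignore the auto-created `sqlContext` and construct a plain SQLContext yourself:

```scala
// Inside spark-shell: the auto-created `sqlContext` is a HiveContext,
// but nothing forces you to use it. Create a plain SQLContext from the
// existing SparkContext and work with that instead.
// Note: this does not stop the HiveContext from being initialized at
// startup; it only sidesteps it afterwards.
val plainSqlContext = new org.apache.spark.sql.SQLContext(sc)
```

This avoids Hive configuration issues in your own code, though the startup cost of the automatic HiveContext initialization remains.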
The main reasons are Spark's startup time and the need to configure a component I
don't really need (without configs, the HiveContext takes more time to load).
Thanks,
Daniel
On 3 July 2015, at 11:13, Robin East robin.e...@xense.co.uk wrote:
As Akhil mentioned there isn’t AFAIK any kind of
HiveContext should be a superset of SQLContext, so you should be able to
perform all your tasks. Are you facing any problem with HiveContext?
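The superset point can be illustrated with a small sketch: since HiveContext extends SQLContext in Spark 1.x, code written against the SQLContext type accepts either one unchanged (the file name here is hypothetical):

```scala
import org.apache.spark.sql.SQLContext

// Works whether `ctx` is a plain SQLContext or a HiveContext,
// because HiveContext is a subclass of SQLContext.
def countRecords(ctx: SQLContext): Long =
  ctx.read.json("people.json").count() // hypothetical input file
```

So functionally nothing is lost by leaving the HiveContext in place; the concern in this thread is startup time and configuration, not capability.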
On 3 Jul 2015 17:33, Daniel Haviv daniel.ha...@veracity-group.com wrote:
Thanks
I was looking for a less hack-ish way :)
Daniel
On Fri, Jul 3, 2015 at
On Fri, Jul 3, 2015 at 10:15 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:
With the binary distribution I think it might not be possible, but if you can download
the sources and build Spark yourself, you can remove this function
Hi,
I've downloaded the pre-built binaries for Hadoop 2.6, and whenever I start
the spark-shell it always starts with a HiveContext.
How can I disable the HiveContext from being initialized automatically?
Thanks,
Daniel