It's definitely unfortunate that the current naming breaks Scala's
for comprehensions.
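(For anyone following along, here is a minimal, illustrative sketch of why the exact method names matter: the compiler rewrites a for-comprehension into calls to map, flatMap, and withFilter looked up by name, so a container whose equivalents are spelled differently won't compile inside a for block. The Box type below is hypothetical, not Spark's API.)

// Hypothetical container, just to show the desugaring rules.
final case class Box[A](value: A) {
  def map[B](f: A => B): Box[B]          = Box(f(value))
  def flatMap[B](f: A => Box[B]): Box[B] = f(value)
  // If this were named, say, filterWith instead of withFilter,
  // the `if` guard below would not compile.
  def withFilter(p: A => Boolean): Box[A] =
    if (p(value)) this else throw new NoSuchElementException("filtered out")
}

val result: Box[Int] =
  for {
    x <- Box(21)
    if x > 0           // Box(21).withFilter(_ > 0)
    y <- Box(2)
  } yield x * y        // ...flatMap(x => Box(2).map(y => x * y))
// result == Box(42)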
On Sat, Mar 15, 2014 at 2:15 PM, andy petrella wrote:
> [Thanks a *lot* for your answers!]
>
> That's cool! A possible example would be to simply write a
> for-comprehension that does this:
> >
> > val allEve
Hey Nathan,
I don't think this would be possible, because there are at least dozens
of permutations of Hadoop versions (different vendor distros x different
versions x YARN vs. non-YARN, etc.), and maybe hundreds. So publishing
new artifacts for each one would be really difficult.
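As a hedged sketch of the usual alternative (the artifact coordinates and the system property name below are assumptions for illustration, not an official recipe): the application's own build can select the Hadoop client at build time, which is roughly what the SHARK_HADOOP_VERSION-style env vars do for a source build.

// build.sbt (illustrative): pick the Hadoop client version when building
// the application, instead of shipping one Spark artifact per Hadoop permutation.
val hadoopVersion = sys.props.getOrElse("hadoop.version", "2.2.0")

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "0.9.0-incubating",
  "org.apache.hadoop" %  "hadoop-client" % hadoopVersion
)

// e.g.  sbt -Dhadoop.version=2.2.0 package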
What is the exact problem?
After just spending a couple of days fighting with a new Spark installation,
getting Spark and Hadoop version numbers matching everywhere, I have a
suggestion I'd like to put out there.
Can we put the Hadoop version against which the Spark jars were built into
the version number?
I noticed that the
Dear community, I used the following commands to build Shark 0.9:
export SHARK_HADOOP_VERSION=2.2.0
sbt/sbt package
But when I run bin/shark, I get this error:
Exception in thread "main" java.lang.IllegalAccessError: tried to access
field org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator