Re: Exception when using cosh

2015-10-21 Thread Reynold Xin
I think we made a mistake and forgot to register the function in the registry: https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala Do you mind submitting a pull request to fix this? Should be a one-line change. I
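
For reference, a sketch of what such a one-line registration might look like in FunctionRegistry.scala, assuming the expression[...] helper pattern used for the other math functions in that file:

    // hypothetical excerpt from the expressions map in FunctionRegistry.scala;
    // cosh would be registered alongside the existing hyperbolic functions
    expression[Sinh]("sinh"),
    expression[Tanh]("tanh"),
    expression[Cosh]("cosh"),  // the missing entry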

Re: Exception when using cosh

2015-10-21 Thread Shagun Sodhani
Sure! Would do that. Thanks a lot!

Re: Exception when using cosh

2015-10-21 Thread Shagun Sodhani
@Reynold submitted the PR: https://github.com/apache/spark/pull/9199

Bringing up JDBC Tests to trunk

2015-10-21 Thread Luciano Resende
I have started looking into PR-8101 [1] and what is required to merge it into trunk, which will also unblock me around SPARK-10521 [2]. So here is the minimal plan I was thinking about:
- fix the docker image version so we make sure we are using the same image all the time (a sketch follows below)
- pull the
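
As one illustration of the image-pinning step, a minimal sketch using the spotify docker-client (the client choice and the image tag here are assumptions, not details from the PR):

    // pull a fixed image tag rather than "latest" so every test run
    // uses exactly the same image
    import com.spotify.docker.client.DefaultDockerClient

    val docker = DefaultDockerClient.fromEnv().build()
    docker.pull("mysql:5.7.9")  // hypothetical pinned tag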

Re: If you use Spark 1.5 and disabled Tungsten mode ...

2015-10-21 Thread Jerry Lam
Hi guys, There is another memory issue. Not sure if this is related to Tungsten this time because I have it disabled (spark.sql.tungsten.enabled=false). It happens more when there are too many tasks running (300). I need to limit the number of tasks to avoid this. The executor has 6G. Spark 1.5.1 is
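
One way to bound how many tasks run concurrently on an executor is to raise spark.task.cpus relative to the available cores; a minimal sketch (the numbers are illustrative, not from this thread):

    // illustrative only: with 8 cores available and spark.task.cpus=2,
    // at most 4 tasks run concurrently on that executor
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.task.cpus", "2")
      .set("spark.cores.max", "8")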

Re: Bringing up JDBC Tests to trunk

2015-10-21 Thread Josh Rosen
Hey Luciano, This sounds like a reasonable plan to me. One of my colleagues has written some Dockerized MySQL testing utilities, so I'll take a peek at those to see if there are any specifics of their solution that we should adapt for Spark.

Re: If you use Spark 1.5 and disabled Tungsten mode ...

2015-10-21 Thread Reynold Xin
Is this still Mesos fine-grained mode?

Possible bug on Spark Yarn Client (1.5.1) during kerberos mode ?

2015-10-21 Thread Chester Chen
All, just to see if this happens to others as well. This is tested against Spark 1.5.1 (branch 1.5, labeled 1.5.2-SNAPSHOT, commit 84f510c4fa06e43bd35e2dc8e1008d0590cbe266 from Tue Oct 6). Spark deployment mode: Spark-Cluster. Notice that if we enable Kerberos mode,

Re: If you use Spark 1.5 and disabled Tungsten mode ...

2015-10-21 Thread Jerry Lam
Yes. The crazy thing about Mesos running in fine-grained mode is that there is no way (correct me if I'm wrong) to set the number of cores per executor. If one of my slaves on Mesos has 32 cores, fine-grained mode can allocate all 32 cores on this executor for the job, and if there are 32 tasks
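
For comparison, coarse-grained Mesos mode acquires long-lived executors and respects an overall core cap; a minimal sketch (the cap value is an illustrative assumption):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.mesos.coarse", "true") // switch off fine-grained mode
      .set("spark.cores.max", "16")      // cap total cores acquired across the cluster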

Re: Possible bug on Spark Yarn Client (1.5.1) during kerberos mode ?

2015-10-21 Thread Chester Chen
Doug, thanks for responding.

>> I think Spark just needs to be compiled against 1.2.1

Can you elaborate on this, or the specific command you are referring to? In our build.scala, I was including the following:

"org.spark-project.hive" % "hive-exec" % "1.2.1.spark" intransitive()

I am not
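
For context, that dependency line would sit in a Build.scala roughly as follows (a sketch, assuming standard sbt project settings):

    // intransitive() keeps hive-exec's own dependency tree out of the build,
    // so any transitive Hive dependencies must be declared explicitly if needed
    libraryDependencies += "org.spark-project.hive" % "hive-exec" % "1.2.1.spark" intransitive()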

SPARK_DRIVER_MEMORY doc wrong

2015-10-21 Thread tyronecai
In conf/spark-env.sh.template
https://github.com/apache/spark/blob/master/conf/spark-env.sh.template#L42

# - SPARK_DRIVER_MEMORY, Memory for Master (e.g. 1000M, 2G) (Default: 1G)

SPARK_DRIVER_MEMORY is the memory config for the driver, not the master. Thanks!
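
Presumably the corrected template comment would read:

    # - SPARK_DRIVER_MEMORY, Memory for Driver (e.g. 1000M, 2G) (Default: 1G)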

Re: SPARK_DRIVER_MEMORY doc wrong

2015-10-21 Thread Sean Owen
You're welcome to open a little pull request to fix that.

Exception when using cosh

2015-10-21 Thread Shagun Sodhani
Hi! I was trying out different arithmetic functions in Spark SQL and noticed a weird thing. While the *sinh* and *tanh* functions work, using *cosh* results in an error: *Exception in thread "main" org.apache.spark.sql.AnalysisException: undefined function cosh;* The documentation says
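
A minimal reproduction sketch, assuming a Spark 1.5.x SQLContext named sqlContext:

    // sinh and tanh resolve fine, but cosh fails analysis:
    sqlContext.sql("SELECT sinh(1.0), tanh(1.0)").show()  // works
    sqlContext.sql("SELECT cosh(1.0)").show()
    // => org.apache.spark.sql.AnalysisException: undefined function cosh;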

FW: Spark Streaming scheduler delay VS driver.cores

2015-10-21 Thread Adrian Tanase
Apologies for reposting this to the dev list, but I've had no luck getting information about spark.driver.cores on the user list. Happy to create a PR with documentation improvements for the spark.driver.cores config setting after I get some more details. Thanks! -adrian
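
For reference while the docs are pending, a sketch of how the setting is passed; my understanding (an assumption to be confirmed) is that it only takes effect when the driver itself runs inside the cluster, i.e. in cluster deploy mode:

    // illustrative: request 2 cores for the driver process
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.driver.cores", "2")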