Re: zinc invocation examples

2014-12-05 Thread Ryan Williams
FWIW I've been using `zinc -scala-home $SCALA_HOME -nailed -start`, which starts a nailgun server as well and uses my installed Scala 2.{10,11}, as opposed to zinc's default 2.9.2. From https://github.com/typesafehub/zinc#scala: If no options are passed to locate a version of Scala then Scala 2.9.2 is

Re: Unit tests in 5 minutes

2014-12-05 Thread Andrew Or
@Patrick and Josh, actually we went even further than that. We simply disable the UI for most tests, and the UI used to be the single largest source of port conflicts.
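For context, disabling the UI in a test boils down to one configuration flag. Below is a minimal sketch of the idea, assuming a test that builds its own local SparkContext (hypothetical app and table names, not the actual Spark test utilities):

import org.apache.spark.{SparkConf, SparkContext}

// With the web UI disabled, the context never binds a UI port, so it cannot hit a UI port conflict.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("ui-disabled-test")      // hypothetical name
  .set("spark.ui.enabled", "false")    // turn off the web UI for this context
val sc = new SparkContext(conf)
try {
  assert(sc.parallelize(1 to 10).count() == 10)
} finally {
  sc.stop()
}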

Re: zinc invocation examples

2014-12-05 Thread Patrick Wendell
One thing I created a JIRA for a while back was to have a script similar to sbt/sbt that transparently downloads Zinc, Scala, and Maven into a subdirectory of Spark and sets them up correctly, i.e. build/mvn. Outside of brew for MacOS there aren't good Zinc packages, and it's a pain to figure out how

Re: drop table if exists throws exception

2014-12-05 Thread Michael Armbrust
The command runs fine for me on master. Note that Hive does print an exception in the logs, but that exception does not propagate to user code. On Thu, Dec 4, 2014 at 11:31 PM, Jianshi Huang jianshi.hu...@gmail.com wrote: Hi, I got an exception saying Hive: NoSuchObjectException(message:table

Re: drop table if exists throws exception

2014-12-05 Thread Mark Hamstra
And that is no different from how Hive has worked for a long time. On Fri, Dec 5, 2014 at 11:42 AM, Michael Armbrust mich...@databricks.com wrote: The command runs fine for me on master. Note that Hive does print an exception in the logs, but that exception does not propagate to user code.
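To make the observed behavior concrete, here is a minimal sketch (hypothetical app and table names): DROP TABLE IF EXISTS on a missing table may cause Hive to log a NoSuchObjectException, but the call itself returns normally.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("drop-if-exists-demo"))
val hiveContext = new HiveContext(sc)

// Hive may log NoSuchObjectException for the missing table, but nothing propagates to user code.
hiveContext.sql("DROP TABLE IF EXISTS some_table_that_may_not_exist")
sc.stop()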

CREATE TABLE AS SELECT does not work with temp tables in 1.2.0

2014-12-05 Thread kb
I am having trouble getting CREATE TABLE AS SELECT or saveAsTable from a HiveContext to work with temp tables in Spark 1.2. No issues in 1.1.0 or 1.1.1. A simple modification to a test case in the Hive SQLQuerySuite.scala reproduces it: test("double nested data") {

Re: CREATE TABLE AS SELECT does not work with temp tables in 1.2.0

2014-12-05 Thread Michael Armbrust
Thanks for reporting. This looks like a regression related to: https://github.com/apache/spark/pull/2570 I've filed it here: https://issues.apache.org/jira/browse/SPARK-4769 On Fri, Dec 5, 2014 at 12:03 PM, kb kend...@hotmail.com wrote: I am having trouble getting create table as select or
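For readers following along, the failing pattern per the report above is roughly: register a temporary table, then run CREATE TABLE ... AS SELECT over it (works on 1.1.x, fails on the 1.2.0 candidates). A minimal sketch with hypothetical names, assuming the Spark 1.2 HiveContext API; this is not the actual SQLQuerySuite test:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

case class Record(key: Int, value: String)

val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("ctas-temp-table-demo"))
val hiveContext = new HiveContext(sc)
import hiveContext.createSchemaRDD   // implicit RDD-of-case-class to SchemaRDD conversion (1.x API)

// Register an in-memory temp table...
sc.parallelize((1 to 10).map(i => Record(i, s"val_$i"))).registerTempTable("tmp_records")

// ...then materialize it into a Hive table with CREATE TABLE AS SELECT.
hiveContext.sql("CREATE TABLE persisted_records AS SELECT key, value FROM tmp_records")
sc.stop()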

Protobuf version in mvn vs sbt

2014-12-05 Thread spark.dubovsky.jakub
Hi devs, I have been playing with your amazing Spark here in Prague for some time. I have stumbled on something I would like to ask about. I create assembly jars from source and then use them to run simple jobs on our 2.3.0-cdh5.1.3 cluster using YARN. Example of my usage [1]. Formerly I had started to use

Re: Protobuf version in mvn vs sbt

2014-12-05 Thread Marcelo Vanzin
When building against Hadoop 2.x, you need to enable the appropriate profile, aside from just specifying the version, e.g. -Phadoop-2.3 for Hadoop 2.3. On Fri, Dec 5, 2014 at 12:51 PM, spark.dubovsky.ja...@seznam.cz wrote: Hi devs, I have been playing with your amazing Spark here in Prague for some

Re: Protobuf version in mvn vs sbt

2014-12-05 Thread DB Tsai
As Marcelo said, CDH5.3 is based on Hadoop 2.3, so please try ./make-distribution.sh -Pyarn -Phive -Phadoop-2.3 -Dhadoop.version=2.3.0-cdh5.1.3 -DskipTests See the details of how to change profiles at https://spark.apache.org/docs/latest/building-with-maven.html Sincerely, DB Tsai

Re: Protobuf version in mvn vs sbt

2014-12-05 Thread Sean Owen
(Nit: CDH *5.1.x*, including 5.1.3, is derived from Hadoop 2.3.x; 5.3 is based on 2.5.x.) On Fri, Dec 5, 2014 at 3:29 PM, DB Tsai dbt...@dbtsai.com wrote: As Marcelo said, CDH5.3 is based on Hadoop 2.3, so please try

Re: Protobuf version in mvn vs sbt

2014-12-05 Thread DB Tsai
Oh, I meant to say that cdh5.1.3, which Jakub's company uses, is based on 2.3. You can see it from the first part of Cloudera's version number: 2.3.0-cdh5.1.3. Sincerely, DB Tsai

build in IntelliJ IDEA

2014-12-05 Thread Judy Nash
Hi everyone, I have a newbie question on using IntelliJ to build and debug. I followed this wiki to set up IntelliJ: https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-BuildingSparkinIntelliJIDEA Afterward I tried to build via the toolbar (Build Rebuild

Re: [VOTE] Release Apache Spark 1.2.0 (RC1)

2014-12-05 Thread Patrick Wendell
Hey All, Thanks all for the continued testing! The issue I mentioned earlier, SPARK-4498, was fixed earlier this week (hat tip to Mark Hamstra, who contributed the fix). In the interim a few smaller blocker-level issues with Spark SQL were found and fixed (SPARK-4753, SPARK-4552, SPARK-4761).

Re: build in IntelliJ IDEA

2014-12-05 Thread Josh Rosen
If you go to “File -> Project Structure” and click on “Project” under the “Project Settings” heading, do you see an entry for “Project SDK”? If not, you should click “New…” and configure a JDK; by default, I think IntelliJ should figure out a correct path to your system JDK, so you should just