Re: Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
-getlocalhost-throws-unknownhostexception Thanks, -Rick Richard Hillegas/San Francisco/IBM@IBMUS wrote on 10/15/2015 11:15:29 AM:

Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
I am seeing what look like environmental errors when I try to run a test on a clean local branch which has been sync'd to the head of the development trunk. I would appreciate advice about how to debug or hack around this problem. For the record, the test ran cleanly last week. This is the
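The thread above traces the failure to local hostname resolution (the linked page concerns `InetAddress.getLocalHost` throwing `UnknownHostException`). A minimal diagnostic sketch in plain Scala, independent of the test suite, to check whether a given machine has this problem:

```scala
import java.net.{InetAddress, UnknownHostException}

// Quick diagnostic: does the local hostname resolve? If this throws
// UnknownHostException, Spark's test harness is likely to hit the
// same environmental problem.
val resolved: Option[InetAddress] =
  try Some(InetAddress.getLocalHost)
  catch { case _: UnknownHostException => None }

println(resolved.map(_.getHostAddress).getOrElse("local hostname does not resolve"))
```

If the check fails, adding the machine's hostname to /etc/hosts (or pinning Spark to loopback, as in the workaround later in this thread) usually sidesteps it.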

Re: Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
-with-two-workers export SPARK_LOCAL_IP=127.0.0.1 Then I got errors related to booting the metastore_db. So I deleted that directory. After that I was able to run spark-shell again. Now let's see if this hack fixes the tests... Thanks, Rick Hillegas
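A hedged sketch of what the `SPARK_LOCAL_IP` override amounts to (plain Scala; the fallback-to-loopback behavior shown here is an illustration of the workaround's intent, not Spark's exact address-resolution logic):

```scala
import java.net.InetAddress

// If SPARK_LOCAL_IP is set, use it; otherwise fall back to the loopback
// address, mirroring the "pin everything to 127.0.0.1" workaround above.
val bindAddress: String =
  sys.env.getOrElse("SPARK_LOCAL_IP", InetAddress.getLoopbackAddress.getHostAddress)

println(bindAddress)
```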

Re: Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
.scala:120) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) :10: error: not found: value sqlContext import sqlContext.implicits._ ^ :10: error: not found: value sqlContext import sqlContext.sql Thanks, Rick Hillegas

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-28 Thread Richard Hillegas
Thanks, Sean! Sean Owen <so...@cloudera.com> wrote on 09/25/2015 06:35:46 AM:

Re: unsubscribe

2015-09-30 Thread Richard Hillegas
Hi Sukesh, To unsubscribe from the dev list, please send a message to dev-unsubscr...@spark.apache.org. To unsubscribe from the user list, please send a message to user-unsubscr...@spark.apache.org. Please see: http://spark.apache.org/community.html#mailing-lists. Thanks, -Rick

column identifiers in Spark SQL

2015-09-22 Thread Richard Hillegas
I am puzzled by the behavior of column identifiers in Spark SQL. I don't find any guidance in the "Spark SQL and DataFrame Guide" at http://spark.apache.org/docs/latest/sql-programming-guide.html. I am seeing odd behavior related to case-sensitivity and to delimited (quoted) identifiers.
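The case-sensitivity behavior being asked about is governed by a Spark SQL setting (`spark.sql.caseSensitive`; case-insensitive resolution is the default). A toy model in plain Scala, not Spark's actual resolver, showing the two modes against a fixed schema:

```scala
// Toy model (NOT Spark's implementation) of column resolution against a
// fixed schema, under case-insensitive vs case-sensitive matching.
val columns = Seq("A", "b", "c")

def resolve(name: String, caseSensitive: Boolean): Option[String] =
  if (caseSensitive) columns.find(_ == name)
  else columns.find(_.equalsIgnoreCase(name))

assert(resolve("a", caseSensitive = false).contains("A")) // matches despite case
assert(resolve("a", caseSensitive = true).isEmpty)        // exact match required
```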

Derby version in Spark

2015-09-22 Thread Richard Hillegas
I see that lib_managed/jars holds these old Derby versions: lib_managed/jars/derby-10.10.1.1.jar lib_managed/jars/derby-10.10.2.0.jar The Derby 10.10 release family supports some ancient JVMs: Java SE 5 and Java ME CDC/Foundation Profile 1.1. It's hard to imagine anyone running Spark on

Re: column identifiers in Spark SQL

2015-09-22 Thread Richard Hillegas
Thanks, -Rick Michael Armbrust <mich...@databricks.com> wrote on 09/22/2015 10:58:36 AM:

Re: Derby version in Spark

2015-09-22 Thread Richard Hillegas
Thanks, Ted. I'll follow up with the Hive folks. Cheers, -Rick Ted Yu <yuzhih...@gmail.com> wrote on 09/22/2015 03:41:12 PM:

Re: column identifiers in Spark SQL

2015-09-22 Thread Richard Hillegas
ysisException: cannot resolve 'c\"d' given input columns A, b, c"d; line 1 pos 7 sqlContext.sql("""select `c\"d` from test_data""").show Thanks, -Rick Michael Armbrust <mich...@databricks.com> wrote on 09/22/2015 01:16:12 PM:
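One likely contributor to the failure above (an observation about Scala string literals, not about Spark's SQL parser): inside a triple-quoted Scala string, `\"` is not an escape sequence, so the backslash reaches the SQL parser as a literal character in the identifier. A runnable check:

```scala
// In triple-quoted strings, backslash sequences are passed through verbatim,
// so the backtick-quoted identifier below contains a literal backslash.
val withBackslash = """select `c\"d` from test_data"""
assert(withBackslash.contains("\\\"")) // backslash survives into the SQL text

// Without the backslash, the identifier is exactly c"d.
val plain = """select `c"d` from test_data"""
assert(!plain.contains("\\"))
```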

Re: Derby version in Spark

2015-09-22 Thread Richard Hillegas
jersey-guice-1.9.jar parquet-encoding-1.7.0.jar Ted Yu <yuzhih...@gmail.com> wrote on 09/22/2015 01:32:39 PM:

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-24 Thread Richard Hillegas
NY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE." Thanks, -Rick Reynold Xin <r...@databricks.com> wrote on 09/24/2015 10:55:53 AM:

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-24 Thread Richard Hillegas
-howto.html#permissive-deps Thanks, -Rick Sean Owen <so...@cloudera.com> wrote on 09/24/2015 12:07:01 PM:

Re: [VOTE] Release Apache Spark 1.5.1 (RC1)

2015-09-24 Thread Richard Hillegas
ll <pwend...@gmail.com>, 09/24/2015 10:24 AM: > Hey Richard

Re: [VOTE] Release Apache Spark 1.5.1 (RC1)

2015-09-24 Thread Richard Hillegas
-1 (non-binding) I was able to build Spark cleanly from the source distribution using the command in README.md: build/mvn -DskipTests clean package However, while I was waiting for the build to complete, I started going through the NOTICE file. I was confused about where to find licenses

Re: Unsubscribe

2015-09-21 Thread Richard Hillegas
To unsubscribe from the dev list, please send a message to dev-unsubscr...@spark.apache.org as described here: http://spark.apache.org/community.html#mailing-lists. Thanks, -Rick Dulaj Viduranga wrote on 09/21/2015 10:15:58 AM: