Re: Reporting errors from spark sql

2016-08-21 Thread Jacek Laskowski
Hi, See https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ParseDriver.scala#L65 to learn how Spark SQL parses SQL text. It could give you a way out. Regards, Jacek Laskowski https://medium.com/@jaceklaskowski/ Mastering Apache
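Building on Jacek's pointer, a minimal sketch of how one might surface the first error position programmatically. It assumes a Spark 2.x+ session, where `ParseException` (thrown by the parser in ParseDriver) inherits the `line` and `startPosition` options from `AnalysisException`; the helper name is hypothetical.

```scala
// Hedged sketch: get the (line, column) of the first SQL parse error.
// Assumes Spark 2.x+, where ParseException extends AnalysisException
// and exposes the error position as Option[Int] fields.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.parser.ParseException

object SqlErrorPosition {
  // Returns Some((line, column)) for an invalid statement, None if it parses.
  def firstError(spark: SparkSession, sqlText: String): Option[(Int, Int)] =
    try {
      // parsePlan only parses the text; it does not analyze or execute it
      spark.sessionState.sqlParser.parsePlan(sqlText)
      None
    } catch {
      case e: ParseException =>
        for (l <- e.line; c <- e.startPosition) yield (l, c)
    }
}
```

For an SQL editor, calling this on each edit (rather than running the query) keeps validation cheap, since parsing happens entirely on the driver.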

Reporting errors from spark sql

2016-08-18 Thread yael aharon
Hello, I am working on an SQL editor which is powered by spark SQL. When the SQL is not valid, I would like to provide the user with a line number and column number where the first error occurred. I am having a hard time finding a mechanism that will give me that information programmatically.

How to turn off Jetty Http stack errors on Spark web

2015-09-23 Thread Rafal Grzymkowski
Hi, Is it possible to disable the Jetty stack trace shown with errors on the Spark master:8080 web UI? When I trigger an HTTP 500 server error, anyone can read the details. I tried the options available in log4j.properties but it doesn't help. Any hint? Thank you for the answer. MyCo

Re: How to turn off Jetty Http stack errors on Spark web

2015-09-23 Thread Ted Yu
Have you read this ? http://stackoverflow.com/questions/2246074/how-do-i-hide-stack-traces-in-the-browser-using-jetty On Wed, Sep 23, 2015 at 6:56 AM, Rafal Grzymkowski <m...@o2.pl> wrote: > Hi, > > Is it possible to disable Jetty stack trace with errors on Spark > master:8080

Re: How to turn off Jetty Http stack errors on Spark web

2015-09-23 Thread Rafal Grzymkowski
Yes, I've seen it, but there are no web.xml and error.jsp files in the binary installation of Spark. To apply this solution I should probably take the Spark sources, then create the missing files, and then recompile Spark. Right? I am looking for a way to turn off error details without recompilation.
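For reference, this is roughly what the change looks like at the Jetty level. Spark does not expose this as a configuration option, so this is a hedged sketch of the source-level change one would make where Spark constructs its embedded Jetty server; the standalone `Server` here is illustrative, not Spark's actual wiring.

```scala
// Hedged sketch: suppressing stack traces in Jetty error pages.
// Jetty's ErrorHandler has a setShowStacks flag that controls whether
// the stack trace appears in the generated 500 page.
import org.eclipse.jetty.server.Server
import org.eclipse.jetty.server.handler.ErrorHandler

val server = new Server(8080)
val quietErrors = new ErrorHandler()
quietErrors.setShowStacks(false) // omit stack traces from error responses
server.addBean(quietErrors)      // register as the server-wide error handler
```

Without recompiling, the usual workaround is to keep the UI port firewalled or behind a reverse proxy that replaces error responses.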

Getting outofmemory errors on spark

2015-04-10 Thread Anshul Singhle
Hi, I'm reading data stored in S3, then aggregating it and storing it in Cassandra using a Spark job. When I run the job with approximately 3 million records (about 3-4 GB of data) stored in text files, I get the following error: (11529/14925)15/04/10 19:32:43 INFO TaskSetManager: Starting task 11609.0 in
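For an OOM at this scale, the first knobs are usually executor memory and partitioning. A hedged sketch of era-appropriate (Spark 1.x) settings to experiment with; the values and the jar name are illustrative placeholders, not tuned recommendations:

```shell
# Hedged sketch: memory/parallelism settings worth trying for an
# S3 -> aggregate -> Cassandra job that OOMs. Values are illustrative.
spark-submit \
  --executor-memory 4g \
  --conf spark.storage.memoryFraction=0.3 \
  --conf spark.default.parallelism=200 \
  my-aggregation-job.jar
```

Raising parallelism shrinks each task's working set, which often matters more than raw heap size when aggregating skewed data.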

Re: Errors in SPARK

2015-03-24 Thread Denny Lee
The error you're seeing typically means that you cannot connect to the Hive metastore itself. Some quick thoughts: - If you were to run show tables (instead of the CREATE TABLE statement), are you still getting the same error? - To confirm, the Hive metastore (MySQL database) is up and running

Re: Errors in SPARK

2015-03-24 Thread sandeep vura
Hi Denny, Still facing the same issue. Please find the following errors. *scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)* *sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@4e4f880c* *scala> sqlContext.sql(CREATE TABLE IF NOT EXISTS

Re: Errors in SPARK

2015-03-24 Thread sandeep vura
No, I am just running the ./spark-shell command in a terminal. I will try with the above command. On Wed, Mar 25, 2015 at 11:09 AM, Denny Lee denny.g@gmail.com wrote: Did you include the connection to a MySQL connector jar so that way spark-shell / hive can connect to the metastore? For example, when

Re: Errors in SPARK

2015-03-24 Thread Denny Lee
Did you include the connection to a MySQL connector jar so that way spark-shell / hive can connect to the metastore? For example, when I run my spark-shell instance in standalone mode, I use: ./spark-shell --master spark://servername:7077 --driver-class-path /lib/mysql-connector-java-5.1.27.jar
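Alongside the connector jar on the driver classpath, the metastore connection itself is normally defined in hive-site.xml. A hedged sketch of a minimal MySQL-backed configuration; the host, database name, and credentials below are placeholders:

```xml
<!-- Hedged sketch of hive-site.xml for a MySQL metastore.
     metastore-host, metastore, hiveuser, hivepassword are placeholders. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```

If this file is in spark/conf but the connector jar is missing, spark-shell fails when HiveContext first touches the metastore, which matches the errors in this thread.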

Errors in SPARK

2015-03-13 Thread sandeep vura
Hi Sparkers, Can anyone please check the below error and give a solution for this? I am using Hive version 0.13 and Spark 1.2.1. Step 1: I have installed Hive 0.13 with a local metastore (MySQL database). Step 2: Hive is running without any errors and is able to create tables and load data in hive

Re: Errors in spark

2015-02-27 Thread Yana Kadiyska
I was actually just able to reproduce the issue. I do wonder if this is a bug -- the docs say: When not configured by the hive-site.xml, the context automatically creates metastore_db and warehouse in the current directory. But as you can see from the message, warehouse is not in the current

Re: Errors in spark

2015-02-27 Thread sandeep vura
Hi Yana, I have removed hive-site.xml from the spark/conf directory but am still getting the same errors. Is there any other way to work around this? Regards, Sandeep On Fri, Feb 27, 2015 at 9:38 PM, Yana Kadiyska yana.kadiy...@gmail.com wrote: I think you're mixing two things: the docs say When *not* configured

Errors in spark

2015-02-27 Thread sandeep vura
Hi Sparkers, I am using Hive version 0.13, have copied hive-site.xml to spark/conf, and am using the default Derby local metastore. While creating a table in the spark shell I get the following error. Can anyone please take a look and suggest a solution? sqlContext.sql(CREATE TABLE IF NOT EXISTS

Re: Errors in spark

2015-02-27 Thread Yana Kadiyska
I think you're mixing two things: the docs say When *not* configured by the hive-site.xml, the context automatically creates metastore_db and warehouse in the current directory. AFAIK if you want a local metastore, you don't put hive-site.xml anywhere. You only need the file if you're going to
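To illustrate Yana's point, a minimal sketch of the local-metastore path in a Spark 1.x shell. With no hive-site.xml on the classpath, HiveContext falls back to an embedded Derby metastore; the table name is a placeholder:

```scala
// Hedged sketch (Spark 1.x spark-shell): with no hive-site.xml present,
// HiveContext creates metastore_db/ and warehouse/ in the directory
// spark-shell was launched from, using an embedded Derby database.
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc) // sc is the shell's SparkContext
sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
sqlContext.sql("SHOW TABLES").collect().foreach(println)
```

Because Derby allows only one connection, launching a second shell from the same directory fails; that is a separate symptom from the metastore-connection errors earlier in this thread.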

Errors in Spark streaming application due to HDFS append

2014-11-05 Thread Ping Tang
Hi All, I’m trying to write streaming-processed data to HDFS (Hadoop 2). The buffer is flushed and the file closed after each write. The following errors occurred when reopening the same file to append. I know for sure the error is caused by closing the file. Any idea? Here is the code to write to HDFS
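A hedged sketch of the append pattern in question, using the Hadoop 2 FileSystem API. A common cause of this error is the previous writer's lease not yet being released when the file is reopened, so closing cleanly before the next append matters; the output path and payload are placeholders:

```scala
// Hedged sketch: appending to the same HDFS file across batches (Hadoop 2).
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val conf = new Configuration()
conf.setBoolean("dfs.support.append", true) // required on some Hadoop 2 setups
val fs = FileSystem.get(conf)
val path = new Path("/streaming/output/part-0000")

// Reopen for append if the file exists, otherwise create it
val out = if (fs.exists(path)) fs.append(path) else fs.create(path)
out.write("processed batch\n".getBytes("UTF-8"))
out.hflush() // make the written bytes visible to readers
out.close()  // release the HDFS lease so the next append can succeed
```

If the error persists, HDFS may still hold the old lease; the alternative many streaming jobs use is writing each batch to a new file instead of appending.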