"Too Large DataFrame" shuffle Fetch Failed exception in Spark SQL (SPARK-16753) (SPARK-9862)(SPARK-5928)(TAGs - Spark SQL, Intermediate Level, Debug)

2018-02-16 Thread Ashutosh Ranjan
Hi All, my Spark configuration is the following: spark = SparkSession.builder.master(mesos_ip) \ .config('spark.executor.cores','3') \ .config('spark.executor.memory','8g') \ .config('spark.es.scroll.size','1') \ .config('spark.network.timeout','600s') \
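
For reference, the shuffle-related settings usually tried first for this kind of FetchFailed error look like the Scala sketch below; the values are illustrative, not recommendations, and the app name is a placeholder.

    import org.apache.spark.sql.SparkSession

    // Illustrative values only; tune to the cluster and workload.
    val spark = SparkSession.builder()
      .appName("fetch-failed-tuning-sketch")
      .config("spark.network.timeout", "600s")        // allow slow shuffle fetches more time
      .config("spark.shuffle.io.maxRetries", "10")    // retry transient fetch failures
      .config("spark.shuffle.io.retryWait", "30s")    // back off between fetch retries
      .config("spark.sql.shuffle.partitions", "2000") // more, smaller shuffle blocks per task
      .getOrCreate()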

Re: Exception in spark streaming + kafka direct app

2017-02-07 Thread Srikanth
This is running in YARN cluster mode. It was restarted automatically and continued fine; I was trying to see what went wrong. AFAIK there were no task failures, and nothing in the executor logs. The log I gave is from the driver. After some digging, I did see that there was a rebalance in the Kafka logs around this

Re: Exception in spark streaming + kafka direct app

2017-02-07 Thread Tathagata Das
Does restarting after a few minutes solve the problem? It could be a transient issue that lasts long enough for Spark task-level retries to all fail. On Tue, Feb 7, 2017 at 4:34 PM, Srikanth wrote: > Hello, > > I had a spark streaming app that reads from kafka running for a

Exception in spark streaming + kafka direct app

2017-02-07 Thread Srikanth
Hello, I had a Spark Streaming app that reads from Kafka running for a few hours, after which it failed with the error: 17/02/07 20:04:10 ERROR JobScheduler: Error generating jobs for time 148649785 ms java.lang.IllegalStateException: No current assignment for partition mt_event-5 at
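
For context, a 0.10-style direct stream of this kind is typically created along the lines below; the broker, group.id, and the assumption that ssc already exists are placeholders, while the topic name mt_event comes from the error. "No current assignment for partition" is raised by the underlying Kafka consumer when its partition assignment changes, which is consistent with the rebalance mentioned later in the thread.

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker1:9092",        // placeholder
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "mt-event-streaming-app",       // keep this group.id private to the app
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,                                          // an existing StreamingContext
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("mt_event"), kafkaParams)
    )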

FetchFailed exception with Spark 1.6

2016-09-29 Thread Ankur Srivastava
Hi, I am running a simple job on Spark 1.6 in which I am trying to leftOuterJoin a big RDD with a smaller one. I am not broadcasting the smaller RDD yet, but I am still running into FetchFailed errors, and the job finally gets killed. I have already partitioned the data to 5000
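
When the smaller side fits in driver and executor memory, a broadcast map-side join avoids the shuffle that produces the FetchFailed errors. A minimal sketch, assuming hypothetical pair RDDs bigRdd: RDD[(K, V)] and smallRdd: RDD[(K, W)]:

    // Collect the small side to the driver and broadcast it to every executor.
    val smallMap = sc.broadcast(smallRdd.collectAsMap())

    // Left outer join without a shuffle: look each key up in the broadcast map.
    val joined = bigRdd.map { case (k, v) => (k, (v, smallMap.value.get(k))) }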

KeyManager exception in Spark 1.6.2

2016-08-31 Thread Eric Ho
I was trying to enable SSL in Spark 1.6.2 and got this exception. Not sure if I'm missing something or my keystore / truststore files went bad, although keytool showed that both files are fine. 16/09/01 04:01:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your
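
For reference, the spark.ssl.* settings involved usually look like the sketch below; paths and passwords are placeholders, and a KeyManager error often traces back to one of these values (for example, the key password differing from the keystore password).

    import org.apache.spark.SparkConf

    // Placeholder paths and passwords; the keystore must be readable on every node that needs it.
    val conf = new SparkConf()
      .set("spark.ssl.enabled", "true")
      .set("spark.ssl.keyStore", "/path/to/keystore.jks")
      .set("spark.ssl.keyStorePassword", "keystore-password")
      .set("spark.ssl.keyPassword", "key-password")            // password of the private key itself
      .set("spark.ssl.trustStore", "/path/to/truststore.jks")
      .set("spark.ssl.trustStorePassword", "truststore-password")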

Total Task size exception in Spark 1.6.0 when writing a DataFrame

2016-01-17 Thread Night Wolf
Hi all, I am doing some simple column transformations (e.g. trimming strings) on a DataFrame using UDFs. The DataFrame is in Avro format and is being loaded off HDFS. The job has about 16,000 parts/tasks. About half way through, the job fails with the message: org.apache.spark.SparkException: Job
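
For illustration, a trim-style transformation of the kind described; an oversized task usually means the UDF's closure captures large objects, and for plain trimming the built-in functions.trim avoids shipping a user closure at all (df and the column name are hypothetical).

    import org.apache.spark.sql.functions.{trim, udf}

    // UDF version: everything the lambda captures is serialized into every task.
    val trimUdf = udf((s: String) => if (s == null) null else s.trim)
    val viaUdf = df.withColumn("name", trimUdf(df("name")))

    // Built-in version: no user closure to serialize.
    val viaBuiltin = df.withColumn("name", trim(df("name")))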

Re: Exception in Spark-sql insertIntoJDBC command

2016-01-13 Thread RichG
parkSubmit.scala:205) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exception-in-Spark-sql-insertIntoJDBC-com

Re: Exception in Spark-sql insertIntoJDBC command

2015-12-08 Thread kali.tumm...@gmail.com
etprops) Thanks Sri -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exception-in-Spark-sql-insertIntoJDBC-command-tp24655p25640.html Sent from the Apache Spark User List mailing list a

Exception in Spark-sql insertIntoJDBC command

2015-09-11 Thread Baljeet Singh
. or if there is any other way to do the same in 1.4.1 version. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exception-in-Spark-sql-insertIntoJDBC-command-tp24655.html Sent from the Apache Spark User List mailing list archive at Nabble.com
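
On the "any other way in 1.4.1" question: the DataFrameWriter JDBC path has existed since 1.4 and looks roughly like the sketch below; the URL, credentials, driver class, and table name are placeholders.

    import java.util.Properties

    val connProps = new Properties()
    connProps.setProperty("user", "dbuser")                    // placeholder credentials
    connProps.setProperty("password", "dbpass")
    connProps.setProperty("driver", "com.mysql.jdbc.Driver")   // whichever JDBC driver is on the classpath

    // Appends rows to an existing table, roughly what insertIntoJDBC did.
    df.write.mode("append").jdbc("jdbc:mysql://dbhost:3306/mydb", "target_table", connProps)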

Re: Exception Handling : Spark Streaming

2015-09-11 Thread Ted Yu
> wordCountPair.foreachRDD(rdd => rdd.saveToCassandra("nexti","direct_api_test",AllColumns)) ssc.start() ssc.awaitTermination() } catch { case ex: Exception => { println(">>>>>>>

Exception Handling : Spark Streaming

2015-09-11 Thread Samya
"direct_api_test",AllColumns)) ssc.start() ssc.awaitTermination() } catch { case ex: Exception => { println(">>>>>>>> Exception UNKNOWN Only.") } } } I am sure that I am missing out on something; please provide your in
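
One common pattern for this situation, sketched below with the keyspace/table names from the post, is to handle write failures inside the output operation: catching there lets the application log the failed batch and keep going, whereas an exception that reaches ssc.awaitTermination() generally means the streaming context has already stopped. The handling policy shown is illustrative only.

    import com.datastax.spark.connector._   // provides saveToCassandra and AllColumns

    wordCountPair.foreachRDD { rdd =>
      try {
        rdd.saveToCassandra("nexti", "direct_api_test", AllColumns)
      } catch {
        case e: Exception =>
          // Illustrative policy: log and skip this batch instead of letting the whole app die.
          println(s"Cassandra write failed for this batch: ${e.getMessage}")
      }
    }
    ssc.start()
    ssc.awaitTermination()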

Exception in spark

2015-08-14 Thread Ravisankar Mani
Hi all, I got an exception like “org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object” when using some WHERE-condition queries. I am using Spark version 1.4.0, but it works perfectly in Hive. Please refer to the following query. I have

Re: Exception in spark

2015-08-11 Thread Josh Rosen
to dataType on unresolved object” when using some WHERE-condition queries. I am using Spark version 1.4.0. Is this exception resolved in the latest Spark? Regards, Ravi

Re: Exception in spark

2015-08-11 Thread Ravisankar Mani
, We got an exception like “org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object” when using some WHERE-condition queries. I am using Spark version 1.4.0. Is this exception resolved in the latest Spark? Regards, Ravi

Re: Exception in spark

2015-08-11 Thread Ravisankar Mani
“org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object” when using some WHERE-condition queries. I am using Spark version 1.4.0. Is this exception resolved in the latest Spark? Regards, Ravi

Exception in spark

2015-08-11 Thread Ravisankar Mani
Hi all, We got an exception like “org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object” when using some WHERE-condition queries. I am using Spark version 1.4.0. Is this exception resolved in the latest Spark? Regards, Ravi

Kerberos authentication exception when Spark accesses HBase in yarn-cluster mode on a Kerberized YARN cluster

2015-06-17 Thread 马元文
Hi all, I have a question about Spark accessing HBase in yarn-cluster mode on a Kerberized YARN cluster. Is distributing the keytab to each NodeManager the only way to enable Spark to access HBase? It seems that Spark doesn't provide a delegation token the way an MR job does, am I right?
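
On YARN, the usual way to give a long-running Spark application a Kerberos identity is to pass the principal and keytab to spark-submit, which ships the keytab with the application rather than requiring it on every NodeManager; whether HBase delegation tokens are then obtained automatically depends on the Spark version and configuration. A sketch with placeholder names:

    spark-submit \
      --master yarn-cluster \
      --principal user@EXAMPLE.COM \
      --keytab /path/to/user.keytab \
      --files /etc/hbase/conf/hbase-site.xml \
      --class com.example.HBaseJob \
      hbase-job.jar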

Re: Weird exception in Spark job

2015-03-25 Thread Akhil Das
...@gmail.com wrote: Any Ideas on this? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Weird-exception-in-Spark-job-tp22195p22204.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Weird exception in Spark job

2015-03-24 Thread nitinkak001
Any Ideas on this? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Weird-exception-in-Spark-job-tp22195p22204.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Weird exception in Spark job

2015-03-23 Thread nitinkak001
) at javax.security.auth.Subject.doAs(Subject.java:415) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642) ... 4 more -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Weird-exception-in-Spark-job-tp22195.html Sent from the Apache

Re: Custom UDTF with Lateral View throws ClassNotFound exception in Spark SQL CLI

2014-12-16 Thread shenghua
A workaround trick was found and posted in the ticket https://issues.apache.org/jira/browse/SPARK-4854. Hope this is useful. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Custom-UDTF-with-Lateral-View-throws-ClassNotFound-exception-in-Spark-SQL-CLI
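
For orientation only (the specific workaround is in SPARK-4854 itself), using a custom UDTF from the Spark SQL CLI generally takes the shape below; the jar path, class name, function name, and table/columns are hypothetical.

    ADD JAR /path/to/my-udtf.jar;
    CREATE TEMPORARY FUNCTION explode_pairs AS 'com.example.udf.ExplodePairsUDTF';

    SELECT t.id, p.k, p.v
    FROM some_table t
    LATERAL VIEW explode_pairs(t.payload) p AS k, v;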

Custom UDTF with Lateral View throws ClassNotFound exception in Spark SQL CLI

2014-12-15 Thread shenghua
. Thank you. Shenghua -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Custom-UDTF-with-Lateral-View-throws-ClassNotFound-exception-in-Spark-SQL-CLI-tp20689.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Custom UDTF with Lateral View throws ClassNotFound exception in Spark SQL CLI

2014-12-15 Thread Michael Armbrust
.nabble.com/Custom-UDTF-with-Lateral-View-throws-ClassNotFound-exception-in-Spark-SQL-CLI-tp20689.html Sent from the Apache Spark User List mailing list archive at Nabble.com. - To unsubscribe, e-mail: user-unsubscr

Re: KryoSerializer exception in Spark Streaming JAVA

2014-12-11 Thread bonnahu
.n3.nabble.com/KryoSerializer-exception-in-Spark-Streaming-JAVA-tp15479p20647.html Sent from the Apache Spark User List mailing list archive at Nabble.com. - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional

Re: Exception in spark sql when running a group by query

2014-11-18 Thread Sadhan Sood
ah makes sense - Thanks Michael! On Mon, Nov 17, 2014 at 6:08 PM, Michael Armbrust mich...@databricks.com wrote: You are perhaps hitting an issue that was fixed by #3248 https://github.com/apache/spark/pull/3248? On Mon, Nov 17, 2014 at 9:58 AM, Sadhan Sood sadhan.s...@gmail.com wrote:

Exception in spark sql when running a group by query

2014-11-17 Thread Sadhan Sood
While testing Spark SQL, we were running this group-by-with-expression query and got an exception. The same query worked fine in Hive. SELECT from_unixtime(floor(xyz.whenrequestreceived/1000.0 - 25200), '/MM/dd') as pst_date, count(*) as num_xyzs FROM all_matched_abc GROUP BY
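
The query is cut off at the GROUP BY in this archive view; purely for illustration, a group-by-with-expression query of that shape typically repeats the full expression in the GROUP BY clause, along these lines (the date-format literal and the table alias are assumed, since they are garbled or truncated above).

    SELECT from_unixtime(floor(xyz.whenrequestreceived/1000.0 - 25200), 'yyyy/MM/dd') AS pst_date,
           count(*) AS num_xyzs
    FROM all_matched_abc xyz
    GROUP BY from_unixtime(floor(xyz.whenrequestreceived/1000.0 - 25200), 'yyyy/MM/dd')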

Re: Exception in spark sql when running a group by query

2014-11-17 Thread Michael Armbrust
You are perhaps hitting an issue that was fixed by #3248 https://github.com/apache/spark/pull/3248? On Mon, Nov 17, 2014 at 9:58 AM, Sadhan Sood sadhan.s...@gmail.com wrote: While testing Spark SQL, we were running this group-by-with-expression query and got an exception. The same query worked

KryoSerializer exception in Spark Streaming JAVA

2014-10-01 Thread Mudassar Sarwar
) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) at java.lang.Thread.run(Thread.java:722) Please help resolve this. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/KryoSerializer-exception-in-Spark-Streaming-JAVA-tp15479.html
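
As general background for this class of KryoSerializer failure, enabling Kryo and registering the classes that actually travel through the job looks roughly like the Scala sketch below (the Java SparkConf calls are equivalent); MyEvent is a hypothetical class standing in for whatever the stream carries.

    import org.apache.spark.SparkConf
    import org.apache.spark.serializer.KryoSerializer

    case class MyEvent(id: Long, payload: String)   // hypothetical payload class

    val conf = new SparkConf()
      .setAppName("kryo-registration-sketch")
      .set("spark.serializer", classOf[KryoSerializer].getName)
      // Registration is optional but yields smaller payloads and clearer errors
      // than letting Kryo fall back to writing full class names.
      .registerKryoClasses(Array(classOf[MyEvent]))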

Null Pointer Exception in Spark Application with Yarn Client Mode

2014-04-07 Thread Sai Prasanna
Hi All, I wanted to get Spark on YARN up and running. I did SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true ./sbt/sbt assembly. Then I ran SPARK_JAR=./assembly/target/scala-2.9.3/spark-assembly-0.8.1-incubating-hadoop2.3.0.jar