executor is not registered error in pyspark

2021-12-29 Thread fangmin
Hi everybody, I am trying to run a PySpark job. After it has been running for many days I am seeing the following failures: ResultStage 46047 has failed the maximum allowable number of times: 4. Most recent failure reason: org.apache.spark.shuffle.FetchFailedException: Failure while fetching
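The message above is cut off in the archive. As a hedged aside (none of this is from the thread), long-running jobs that hit FetchFailedException after losing executors are often tuned with the shuffle settings below; the configuration keys are real Spark settings, but the app name and values are illustrative only.

```python
from pyspark.sql import SparkSession

# A minimal sketch, assuming the job can be built around a SparkSession;
# the values below are illustrative, not taken from the thread.
spark = (
    SparkSession.builder
    .appName("long-running-job")  # placeholder name
    # Retry shuffle fetches longer before declaring a fetch failure.
    .config("spark.shuffle.io.maxRetries", "10")
    .config("spark.shuffle.io.retryWait", "15s")
    # Allow more stage attempts than the default of 4 seen in the error.
    .config("spark.stage.maxConsecutiveAttempts", "8")
    .getOrCreate()
)
```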

JDBC write error of Pyspark dataframe

2017-04-19 Thread Cinyoung Hur
Hi, I'm trying to write a DataFrame to MariaDB. I got this error message, but I have no clue. Please give me some advice. Py4JJavaError Traceback (most recent call last) in () > 1 result1.filter(result1["gnl_nm_set"] == "").count() /usr/local/linewalks/spark/spark/python/pyspark/sql/dataframe.pyc
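The traceback is cut off above. As a hedged sketch of the operation being attempted: a JDBC write of a PySpark DataFrame to MariaDB typically looks like the following, assuming `result1` is the DataFrame from the post and a MariaDB/MySQL JDBC driver is on the classpath; the URL, table name, and credentials are placeholders, not taken from the thread.

```python
# A minimal sketch; every option value here is a placeholder assumption.
(result1.write
    .format("jdbc")
    .option("url", "jdbc:mysql://localhost:3306/mydb")
    .option("dbtable", "my_table")
    .option("user", "user")
    .option("password", "password")
    .mode("append")
    .save())
```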

pickling error with PySpark and Elasticsearch-py analyzer

2015-08-22 Thread pkphlam
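The post body is lost in the archive; only the subject survives. A common source of pickling errors when mixing PySpark with elasticsearch-py is capturing the Elasticsearch client in a closure that Spark then tries to serialize. A minimal sketch of the usual workaround, building the client inside mapPartitions so it is never pickled on the driver, follows; the host, index name, and data are assumptions, and elasticsearch-py's analyze API has changed across versions.

```python
from pyspark import SparkContext
from elasticsearch import Elasticsearch

sc = SparkContext(appName="es-analyze")  # placeholder name

def analyze_partition(rows):
    # The client is created on each executor, so it is never pickled
    # on the driver and shipped across the wire.
    es = Elasticsearch(["localhost:9200"])
    for text in rows:
        yield es.indices.analyze(index="docs", body={"text": text})

results = (sc.parallelize(["some text", "more text"])
             .mapPartitions(analyze_partition)
             .collect())
```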

SparkContext with error from PySpark

2014-12-30 Thread Jaggu
: org.apache.spark.SparkException ( Any clue how to resolve this? Best regards, Jagan

Re: SparkContext with error from PySpark

2014-12-30 Thread Eric Friedman

Re: SparkContext with error from PySpark

2014-12-30 Thread JAGANADH G

Re: SparkContext with error from PySpark

2014-12-30 Thread Josh Rosen

Re: error with pyspark

2014-08-11 Thread Ron Gonzalez
If you're running on Ubuntu, run ulimit -n, which gives the maximum number of allowed open files. You will have to raise the value in /etc/security/limits.conf to something much larger, then log out and log back in. Thanks, Ron Sent from my iPad On Aug 10, 2014, at 10:19 PM, Davies Liu
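As a side note, the same limit Ron queries with ulimit -n can also be read from Python's standard resource module; a minimal sketch:

```python
import resource

# RLIMIT_NOFILE is the per-process open-file limit that `ulimit -n` reports.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("open files: soft=%d, hard=%d" % (soft, hard))
```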

Re: error with pyspark

2014-08-11 Thread Baoqiang Cao
Thanks Davies and Ron! It was indeed due to a ulimit issue. Thanks a lot! Best, Baoqiang Cao Blog: http://baoqiang.org Email: bqcaom...@gmail.com On Aug 11, 2014, at 3:08 AM, Ron Gonzalez zlgonza...@yahoo.com wrote: If you're running on Ubuntu, run ulimit -n, which gives the max number of

Re: error with pyspark

2014-08-10 Thread Davies Liu
On Fri, Aug 8, 2014 at 9:12 AM, Baoqiang Cao bqcaom...@gmail.com wrote: Hi there, I ran into a problem and can't find a solution. I was running bin/pyspark ../python/wordcount.py. You could use bin/spark-submit ../python/wordcount.py instead. The wordcount.py is here:

error with pyspark

2014-08-08 Thread Baoqiang Cao
Hi there, I ran into a problem and can't find a solution. I was running bin/pyspark ../python/wordcount.py. The wordcount.py is here:

import sys
from operator import add
from pyspark import SparkContext

datafile = '/mnt/data/m1.txt'
sc =
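The script is cut off in the archive at `sc =`. A minimal sketch of the classic PySpark word count the fragment appears to be building; everything past the truncation point is an assumption, not the original script.

```python
import sys
from operator import add
from pyspark import SparkContext

datafile = '/mnt/data/m1.txt'
sc = SparkContext(appName="wordcount")  # assumed completion of the truncated line

# Split lines into words, count each word once, and sum counts per word.
counts = (sc.textFile(datafile)
            .flatMap(lambda line: line.split())
            .map(lambda word: (word, 1))
            .reduceByKey(add))

for word, count in counts.collect():
    print("%s: %d" % (word, count))

sc.stop()
```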