[jira] [Resolved] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Varun Reddy Daram resolved SPARK-26113.
---
Resolution: Invalid

> TypeError: object of type 'NoneType' has no len() in 
> authenticate_and_accum_updates of pyspark/accumulators.py
> --
>
> Key: SPARK-26113
> URL: https://issues.apache.org/jira/browse/SPARK-26113
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes, PySpark
>Affects Versions: 2.4.0
>Reporter: Sai Varun Reddy Daram
>Priority: Major
>
> Machine OS: Ubuntu 16.04
> Kubernetes: Minikube
> Kubernetes version: 1.10.0
> Spark Kubernetes image: pyspark (on Docker Hub: saivarunr/spark-py:2.4), built with the standard Spark docker build.sh script.
> The driver runs inside a pod in the Kubernetes cluster.
> Steps to reproduce:
> 1) Create a Spark session:
> {code:python}
> spark_session = SparkSession.builder \
>     .master('k8s://https://192.168.99.100:8443') \
>     .config('spark.executor.instances', '1') \
>     .config('spark.kubernetes.container.image', 'saivarunr/spark-py:2.4') \
>     .getOrCreate()
> {code}
> 2) Create a sample DataFrame:
> {code:python}
> df = spark_session.createDataFrame([{'a': 1}])
> {code}
> 3) Run an action on this DataFrame:
> {code:python}
> df.count()
> {code}
> This produces the following output:
> {code:python}
> Exception happened during processing of request from ('127.0.0.1', 38690)
> Traceback (most recent call last):
>   File "/usr/lib/python3.6/socketserver.py", line 317, in _handle_request_noblock
>     self.process_request(request, client_address)
>   File "/usr/lib/python3.6/socketserver.py", line 348, in process_request
>     self.finish_request(request, client_address)
>   File "/usr/lib/python3.6/socketserver.py", line 361, in finish_request
>     self.RequestHandlerClass(request, client_address, self)
>   File "/usr/lib/python3.6/socketserver.py", line 721, in __init__
>     self.handle()
>   File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 266, in handle
>     poll(authenticate_and_accum_updates)
>   File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 241, in poll
>     if func():
>   File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 254, in authenticate_and_accum_updates
>     received_token = self.rfile.read(len(auth_token))
> TypeError: object of type 'NoneType' has no len()
> {code}
> 4) Repeat the step above; the error does not appear again.
>
> But now close the session or kill the Python terminal or process, and try again: the same error happens.
>
> Possibly related to https://issues.apache.org/jira/browse/SPARK-26019 ?
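For readers skimming the traceback: the failure is simply `len(None)`. Below is a minimal, hedged sketch of the pattern (illustrative names only, not the actual `pyspark/accumulators.py` code) showing why a missing auth token triggers exactly this TypeError, and how a None guard would surface a clearer error instead:

```python
import io

def read_token(rfile, auth_token):
    # Illustrative stand-in for the read in authenticate_and_accum_updates.
    # If auth_token was never set, len(None) raises exactly the reported error:
    #   TypeError: object of type 'NoneType' has no len()
    if auth_token is None:
        # Guard converts the cryptic TypeError into an explicit failure.
        raise RuntimeError("no auth token configured for this connection")
    return rfile.read(len(auth_token))

# With a token present, the read succeeds.
assert read_token(io.BytesIO(b"secret"), "secret") == b"secret"

# With auth_token=None the guard raises a clear RuntimeError
# instead of letting len(None) blow up inside the handler.
try:
    read_token(io.BytesIO(b"secret"), None)
except RuntimeError:
    pass
```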



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-26019) pyspark/accumulators.py: "TypeError: object of type 'NoneType' has no len()" in authenticate_and_accum_updates()

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26019?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691462#comment-16691462
 ] 

Sai Varun Reddy Daram commented on SPARK-26019:
---

Any help with https://issues.apache.org/jira/browse/SPARK-26113 ?

> pyspark/accumulators.py: "TypeError: object of type 'NoneType' has no len()" 
> in authenticate_and_accum_updates()
> 
>
> Key: SPARK-26019
> URL: https://issues.apache.org/jira/browse/SPARK-26019
> Project: Spark
>  Issue Type: Bug
>  Components: PySpark
>Affects Versions: 2.3.2, 2.4.0
>Reporter: Ruslan Dautkhanov
>Priority: Major
>
> Started happening after the 2.3.1 -> 2.3.2 upgrade.
>  
> {code:python}
> Exception happened during processing of request from ('127.0.0.1', 43418)
>
> Traceback (most recent call last):
>   File "/opt/cloudera/parcels/Anaconda/lib/python2.7/SocketServer.py", line 290, in _handle_request_noblock
>     self.process_request(request, client_address)
>   File "/opt/cloudera/parcels/Anaconda/lib/python2.7/SocketServer.py", line 318, in process_request
>     self.finish_request(request, client_address)
>   File "/opt/cloudera/parcels/Anaconda/lib/python2.7/SocketServer.py", line 331, in finish_request
>     self.RequestHandlerClass(request, client_address, self)
>   File "/opt/cloudera/parcels/Anaconda/lib/python2.7/SocketServer.py", line 652, in __init__
>     self.handle()
>   File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera4-1.cdh5.13.3.p0.611179/lib/spark2/python/lib/pyspark.zip/pyspark/accumulators.py", line 263, in handle
>     poll(authenticate_and_accum_updates)
>   File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera4-1.cdh5.13.3.p0.611179/lib/spark2/python/lib/pyspark.zip/pyspark/accumulators.py", line 238, in poll
>     if func():
>   File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera4-1.cdh5.13.3.p0.611179/lib/spark2/python/lib/pyspark.zip/pyspark/accumulators.py", line 251, in authenticate_and_accum_updates
>     received_token = self.rfile.read(len(auth_token))
> TypeError: object of type 'NoneType' has no len()
> {code}
>  
> The error happens at:
> https://github.com/apache/spark/blob/cb90617f894fd51a092710271823ec7d1cd3a668/python/pyspark/accumulators.py#L254
> The PySpark code was just running a simple pipeline of
> binary_rdd = sc.binaryRecords(full_file_path, record_length).map(lambda .. )
> and then converting it to a DataFrame and running a count on it.
> The error seems to be flaky: on the next rerun it didn't happen.
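The `poll(...)` frame in both tracebacks suggests the accumulator server repeatedly calls a function until it reports success. A minimal, hedged sketch of such a retry loop (names and parameters are illustrative; the real accumulator server's poll differs):

```python
import time

def poll(func, interval=0.01, timeout=2.0):
    # Repeatedly call func() until it returns a truthy value or we time out.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if func():
            return True
        time.sleep(interval)
    return False

# Example: a function that only succeeds on its third call.
calls = []
def ready():
    calls.append(1)
    return len(calls) >= 3

assert poll(ready) is True
assert len(calls) == 3
```

A loop like this retries transient failures, which may be why the reported error is flaky and disappears on rerun; an exception raised inside `func()` (such as the `len(None)` TypeError) escapes the loop instead of being retried.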






[jira] [Comment Edited] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691330#comment-16691330
 ] 

Sai Varun Reddy Daram edited comment on SPARK-26113 at 11/19/18 9:33 AM:
-

Something to help here: https://issues.apache.org/jira/browse/SPARK-26019 ?


was (Author: saivarunvishal):
Something to help here: https://issues.apache.org/jira/browse/SPARK-26113 ?




[jira] [Updated] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Varun Reddy Daram updated SPARK-26113:
--

[jira] [Commented] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691381#comment-16691381
 ] 

Sai Varun Reddy Daram commented on SPARK-26113:
---

[~hyukjin.kwon] sorry for that, I did not know.




[jira] [Updated] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Varun Reddy Daram updated SPARK-26113:
--
Priority: Major  (was: Critical)




[jira] [Updated] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-19 Thread Sai Varun Reddy Daram (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Varun Reddy Daram updated SPARK-26113:
--
Priority: Critical  (was: Major)




[jira] [Commented] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-18 Thread Sai Varun Reddy Daram (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691330#comment-16691330
 ] 

Sai Varun Reddy Daram commented on SPARK-26113:
---

Something to help here: https://issues.apache.org/jira/browse/SPARK-26113 ?




[jira] [Updated] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-18 Thread Sai Varun Reddy Daram (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Varun Reddy Daram updated SPARK-26113:
--

[jira] [Created] (SPARK-26113) TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py

2018-11-18 Thread Sai Varun Reddy Daram (JIRA)
Sai Varun Reddy Daram created SPARK-26113:
-

 Summary: TypeError: object of type 'NoneType' has no len() in 
authenticate_and_accum_updates of pyspark/accumulators.py
 Key: SPARK-26113
 URL: https://issues.apache.org/jira/browse/SPARK-26113
 Project: Spark
  Issue Type: Bug
  Components: Kubernetes, PySpark
Affects Versions: 2.4.0
Reporter: Sai Varun Reddy Daram

