[ https://issues.apache.org/jira/browse/SPARK-7899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14566089#comment-14566089 ]

Justin Uang commented on SPARK-7899:
------------------------------------

Can we get this backported into Spark 1.4, or is it too late for that?

> PySpark sql/tests breaks pylint validation
> ------------------------------------------
>
>                 Key: SPARK-7899
>                 URL: https://issues.apache.org/jira/browse/SPARK-7899
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Tests
>    Affects Versions: 1.4.0
>            Reporter: Michael Nazario
>            Assignee: Michael Nazario
>             Fix For: 1.5.0
>
>
> The {{pyspark.sql.types}} module is dynamically renamed from {{_types}} to 
> {{types}}, which breaks pylint validation.
> From [~justin.uang] below:
> In commit 04e44b37 (the migration to Python 3), {{pyspark/sql/types.py}} was 
> renamed to {{pyspark/sql/_types.py}}, and some magic in 
> {{pyspark/sql/__init__.py}} then dynamically renamed the module back to 
> {{types}}. I imagine this was done to avoid some naming conflict with 
> Python 3, but what was the error that showed up?
> The reason I'm asking is that this is messing with pylint, since pylint can 
> no longer statically find the module. I also tried importing the package in 
> an init-hook so that {{__init__}} would be run, but that isn't what the 
> discovery mechanism uses. I imagine it's probably just crawling the 
> directory structure.
> One way to work around this would be something akin to this 
> (http://stackoverflow.com/questions/9602811/how-to-tell-pylint-to-ignore-certain-imports),
>  where I would have to create a fake module, but I would probably lose a ton 
> of pylint features for users of that module, and it's pretty hacky.
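
For illustration, a minimal sketch of the kind of {{sys.modules}} aliasing 
described above, roughly as it might appear in {{pyspark/sql/__init__.py}}; 
this is an assumed mechanism, not the verbatim Spark code:

{code:python}
# Sketch only (assumed mechanism): re-expose the module implemented in
# _types.py under the public name "types", so that
# "import pyspark.sql.types" keeps working at runtime.
import sys

from . import _types as types

# Point the runtime name at the real module object. Static tools such as
# pylint never execute this, which is why they cannot find the module.
types.__name__ = __name__ + ".types"
sys.modules[__name__ + ".types"] = types
{code}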
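
The fake-module workaround from the Stack Overflow link would amount to 
something like the following hypothetical stub (the path 
{{pylint_stubs/pyspark/sql/types.py}} is made up for illustration), placed 
somewhere pylint's directory-crawling discovery can find it:

{code:python}
# Hypothetical stub module, e.g. pylint_stubs/pyspark/sql/types.py, so that
# static import discovery finds an actual file named types.py.
# Sketch only: re-export everything from the real implementation module.
from pyspark.sql._types import *
{code}

As noted above, pylint would then analyze the stub rather than the real 
module, so checks against that module's actual contents would be lost.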


