Ruslan Dautkhanov created SPARK-14017:
-----------------------------------------

             Summary: dataframe.dtypes -> pyspark.sql.types aliases
                 Key: SPARK-14017
                 URL: https://issues.apache.org/jira/browse/SPARK-14017
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
    Affects Versions: 1.5.0
         Environment: Python 2.7; Spark 1.5; Java 1.7; Hadoop 2.6; Scala 2.10
            Reporter: Ruslan Dautkhanov
            Priority: Minor


Running the following:

# Fix schema for gaid, which should not be Double
from pyspark.sql.types import *

customSchema = StructType()
for (col, typ) in tsp_orig.dtypes:
    if col == 'Agility_GAID':
        typ = 'string'
    customSchema.add(col, typ, True)

Getting:
  ValueError: Could not parse datatype: bigint

It looks like pyspark.sql.types does not recognize bigint, even though dataframe.dtypes reports it.
Should bigint be aliased to LongType in pyspark.sql.types?
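As a workaround until such aliasing exists, the dtype strings could be normalized before being passed to StructType.add. A minimal sketch, assuming the alias table below (the mapping and the normalize_dtype helper are hypothetical; verify the accepted names against your Spark version):

```python
# Workaround sketch: alias the Spark SQL simple type names that
# dataframe.dtypes emits (e.g. 'bigint') onto names that
# pyspark.sql.types is able to parse. The alias table is an
# assumption; check it against your Spark version.
DTYPE_ALIASES = {
    'bigint': 'long',
    'int': 'integer',
    'smallint': 'short',
    'tinyint': 'byte',
}

def normalize_dtype(typ):
    """Return a type-name string that StructType.add() should accept."""
    return DTYPE_ALIASES.get(typ, typ)
```

The loop above would then call customSchema.add(col, normalize_dtype(typ), True) instead of passing typ directly.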





--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
