Hyukjin Kwon created SPARK-42011:
------------------------------------

             Summary: Implement DataFrameReader.csv
                 Key: SPARK-42011
                 URL: https://issues.apache.org/jira/browse/SPARK-42011
             Project: Spark
          Issue Type: Sub-task
          Components: Connect
    Affects Versions: 3.4.0
            Reporter: Hyukjin Kwon


{code}
                                                                          
pyspark/sql/tests/test_datasources.py:147 (DataSourcesParityTests.test_checking_csv_header)

self = <pyspark.sql.tests.connect.test_parity_datasources.DataSourcesParityTests testMethod=test_checking_csv_header>

    def test_checking_csv_header(self):
        path = tempfile.mkdtemp()
        shutil.rmtree(path)
        try:
            self.spark.createDataFrame([[1, 1000], [2000, 2]]).toDF("f1", "f2").write.option(
                "header", "true"
            ).csv(path)
            schema = StructType(
                [
                    StructField("f2", IntegerType(), nullable=True),
                    StructField("f1", IntegerType(), nullable=True),
                ]
            )
            df = (
>               self.spark.read.option("header", "true")
                .schema(schema)
                .csv(path, enforceSchema=False)
            )

../test_datasources.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pyspark.sql.connect.readwriter.DataFrameReader object at 0x7fb118289520>
args = ('/var/folders/0c/q8y15ybd3tn7sr2_jmbmftr80000gp/T/tmp4kdxohcw',)
kwargs = {'enforceSchema': False}

    def csv(self, *args: Any, **kwargs: Any) -> None:
>       raise NotImplementedError("csv() is not implemented.")
E       NotImplementedError: csv() is not implemented.

../../connect/readwriter.py:225: NotImplementedError

{code}
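For context, the contract the parity test exercises is the standard `DataFrameReader.csv` shape: accept a path plus per-call keyword options such as `enforceSchema`, fold them into the reader's option map alongside options set via `.option(...)`, record the format, and dispatch a load. The sketch below models that pattern in a self-contained way; the class name and the `_load` hook are illustrative stand-ins, not the actual Spark Connect internals (which would build a `Read` relation in the Connect proto plan).

```python
from typing import Any, Dict, List, Optional, Union


class SketchDataFrameReader:
    """Illustrative stand-in for pyspark.sql.connect.readwriter.DataFrameReader."""

    def __init__(self) -> None:
        self._format: Optional[str] = None
        self._options: Dict[str, str] = {}
        self._schema: Optional[str] = None

    def option(self, key: str, value: Any) -> "SketchDataFrameReader":
        # Spark stores options as strings; Python booleans become "true"/"false".
        self._options[key] = str(value).lower() if isinstance(value, bool) else str(value)
        return self

    def schema(self, schema: str) -> "SketchDataFrameReader":
        self._schema = schema
        return self

    def csv(self, path: Union[str, List[str]], **options: Any) -> Dict[str, Any]:
        # csv() is sugar: merge per-call keyword options (e.g. enforceSchema),
        # set the source format, and dispatch the load.
        for key, value in options.items():
            self.option(key, value)
        self._format = "csv"
        return self._load(path if isinstance(path, list) else [path])

    def _load(self, paths: List[str]) -> Dict[str, Any]:
        # Hypothetical hook: in Spark Connect this would emit a Read relation
        # to the server; here we just return the assembled request.
        return {
            "format": self._format,
            "paths": paths,
            "schema": self._schema,
            "options": dict(self._options),
        }
```

Under this sketch, the failing call chain `read.option("header", "true").schema(...).csv(path, enforceSchema=False)` reduces to merging `{"header": "true", "enforceSchema": "false"}` into one option map before dispatch.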



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
