davlee1972 opened a new issue, #38485:
URL: https://github.com/apache/arrow/issues/38485

   ### Describe the bug, including details regarding any error messages, 
version, and platform.
   
   This seems to be a problem on Windows with drive letters:
   
   You can read a dataset without partitioning, but if you try to add 
partitioned columns it complains that it can't parse the "c:" drive letter.
   
   ```
   >>> my_dataset = ds.dataset(files)
   >>> my_dataset.files
   ['c:/prices_parquet/2011/10/17/prices.0.parquet', 'c:/prices_parquet/2011/11/19/prices.0.parquet', 'c:/prices_parquet/2011/12/31/prices.0.parquet', 'c:/prices_parquet/2011/9/19/prices.0.parquet', 'c:/prices_parquet/2012/1/5/prices.0.parquet']
   >>>
   
   >>> p = ds.partitioning(schema=pa.schema([pa.field("as_of_year", pa.int32()), pa.field("as_of_month", pa.int32()), pa.field("as_of_day", pa.int32())]))
   
   >>> my_dataset2 = ds.dataset(my_dataset.files, partitioning = p)
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "C:\Users\????\Anaconda3\lib\site-packages\pyarrow\dataset.py", line 766, in dataset
       return _filesystem_dataset(source, **kwargs)
     File "C:\Users\????\Anaconda3\lib\site-packages\pyarrow\dataset.py", line 456, in _filesystem_dataset
       return factory.finish(schema)
     File "pyarrow\_dataset.pyx", line 2752, in pyarrow._dataset.DatasetFactory.finish
     File "pyarrow\error.pxi", line 144, in pyarrow.lib.pyarrow_internal_check_status
     File "pyarrow\error.pxi", line 100, in pyarrow.lib.check_status
   pyarrow.lib.ArrowInvalid: error parsing 'c:' as scalar of type int32
   
   ```
   
   Hive partitioning works fine.
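   For reference, the same `ArrowInvalid` can be reproduced without any files by calling the partitioning's `parse()` directly on a path whose first segment is the drive letter. This is a minimal sketch of what the traceback suggests is happening (the assumption that the drive letter leaks into the partition segments is mine, inferred from the error message):
   
   ```python
   import pyarrow as pa
   import pyarrow.dataset as ds
   
   # Same partitioning as in the report (no flavor -> directory partitioning).
   part = ds.partitioning(pa.schema([
       pa.field("as_of_year", pa.int32()),
       pa.field("as_of_month", pa.int32()),
       pa.field("as_of_day", pa.int32()),
   ]))
   
   # A drive-letter-free path parses cleanly into a filter expression...
   ok_expr = part.parse("/2011/10/17")
   
   # ...but when the first segment is "c:", parsing it as the int32
   # "as_of_year" field fails, matching the ArrowInvalid in the traceback.
   try:
       part.parse("/c:/2011/10")
       drive_letter_failed = False
   except pa.ArrowInvalid:
       drive_letter_failed = True
   ```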
   
   ### Component(s)
   
   Python

