[ https://issues.apache.org/jira/browse/SPARK-38627?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17511765#comment-17511765 ]

Prakhar Sandhu commented on SPARK-38627:
----------------------------------------

Hi [~hyukjin.kwon], great ^^
 # Did it work on Spark 3.3 or Spark 3.2?
 # What environment are you using?

I have set up a conda environment on my local system with Spark 3.2.

I specified NumPy explicitly but got the error below:
{code:java}
df = pd.DataFrame({'Date1': rng.to_numpy(), 'Date2': rng.to_numpy()})
  File "C:\Users\abc\Anaconda3\envs\env2\lib\site-packages\pyspark\pandas\indexes\base.py", line 519, in to_numpy
    result = np.asarray(self._to_internal_pandas()._values, dtype=dtype)
  File "C:\Users\abc\Anaconda3\envs\env2\lib\site-packages\pyspark\pandas\indexes\base.py", line 472, in _to_internal_pandas
    return self._psdf._internal.to_pandas_frame.index
{code}
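
For reference, here is a minimal sketch of how the DatetimeIndex could be produced (the ps.date_range call and its arguments are my reconstruction; the original definition of rng is not shown above):

{code:python}
import pandas as pd
import pyspark.pandas as ps

# Hypothetical reconstruction: rng is a pandas-on-Spark DatetimeIndex.
rng = ps.date_range("2022-01-01", periods=10, freq="D")

# to_numpy() collects the distributed index to the driver
# (via _to_internal_pandas) before converting it to a NumPy array.
df = pd.DataFrame({"Date1": rng.to_numpy(), "Date2": rng.to_numpy()})
{code}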

> TypeError: Datetime subtraction can only be applied to datetime series
> ----------------------------------------------------------------------
>
>                 Key: SPARK-38627
>                 URL: https://issues.apache.org/jira/browse/SPARK-38627
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.2.1
>            Reporter: Prakhar Sandhu
>            Priority: Major
>
> I am trying to replace the pandas library with pyspark.pandas. When I tried this (pdf is a pyspark.pandas DataFrame):
> {code:java}
> pdf["date_diff"] = (pdf["date1"] - pdf["date2"])/pdf.Timedelta(days=30){code}
> I got the error below:
> {code:java}
> File "C:\Users\abc\Anaconda3\envs\test\lib\site-packages\pyspark\pandas\data_type_ops\datetime_ops.py", line 75, in sub
>     raise TypeError("Datetime subtraction can only be applied to datetime series.")
> {code}
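
As a stopgap while this is open, one workaround sketch (my own, not confirmed by the Spark devs) is to drop to the Spark DataFrame API for the subtraction, where datediff() returns the whole-day difference between two timestamp columns:

{code:python}
import pandas as pd
import pyspark.pandas as ps
from pyspark.sql import functions as F

# Hypothetical sample data; column names mirror the snippet in the report.
pdf = ps.from_pandas(pd.DataFrame({
    "date1": pd.to_datetime(["2022-03-01", "2022-06-01"]),
    "date2": pd.to_datetime(["2022-01-01", "2022-01-01"]),
}))

# Do the subtraction with Spark SQL functions instead of pandas-on-Spark
# arithmetic: datediff() yields the day difference, then divide by 30.
sdf = pdf.to_spark()
sdf = sdf.withColumn("date_diff", F.datediff("date1", "date2") / 30)
pdf = sdf.to_pandas_on_spark()
{code}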


