Re: Re: [how to] RDD using JDBC data source in PySpark

2022-09-19 Thread javaca...@163.com
Thank you Bjørn Jørgensen, and thanks also to Sean Owen. DataFrame with .format("jdbc") is a good way to resolve it, but for some reasons I can't use the DataFrame API and can only use the RDD API in PySpark. ...T_T... Thanks for all your help, but I still need a new idea to resolve it. XD javaca...
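
For reference, a minimal sketch of the DataFrame route suggested above, with the .rdd escape hatch it offers; the JDBC URL, table, and credentials are placeholders, and df.rdd still goes through the DataFrame reader, which is what the poster wants to avoid:

from pyspark.sql import SparkSession

# Requires the matching JDBC driver jar (here PostgreSQL) on the Spark classpath.
spark = SparkSession.builder.appName("jdbc-dataframe-example").getOrCreate()

df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/mydb")   # placeholder URL
      .option("dbtable", "users")                              # placeholder table
      .option("user", "reader")
      .option("password", "secret")
      .load())

rdd = df.rdd   # RDD of Row objects; still built through the DataFrame API under the hood
print(rdd.take(5))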

Re: Re: [how to] RDD using JDBC data source in PySpark

2022-09-20 Thread Bjørn Jørgensen
There is a PR for this now: [SPARK-40491][SQL] Expose a jdbcRDD function in SparkContext.

On Mon, 19 Sep 2022 at 12:47, javaca...@163.com wrote:
> Thank you Bjorn Jorgensen and also thank to Sean Owen.
>
> DataFrame and .format("jdbc") is good way to resolve it. ...
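
Until such a jdbcRDD function is actually available from PySpark, one pure-RDD workaround is to parallelize key ranges and fetch each range inside mapPartitions with a Python DB-API driver. This is only a sketch; the table, key column, bounds, and the psycopg2 driver are assumptions for illustration:

from pyspark import SparkContext
import psycopg2  # must be installed on every executor

sc = SparkContext(appName="rdd-jdbc-workaround")

# Split the assumed key range of users.id into one chunk per partition.
lower, upper, num_parts = 1, 1_000_000, 8
step = (upper - lower + 1) // num_parts
bounds = [(lower + i * step, lower + (i + 1) * step - 1) for i in range(num_parts)]
bounds[-1] = (bounds[-1][0], upper)  # last chunk absorbs any remainder

def fetch_chunk(ranges):
    # One connection per partition; yield plain tuples so the result is an ordinary RDD.
    conn = psycopg2.connect(host="db-host", dbname="mydb",
                            user="reader", password="secret")
    try:
        with conn.cursor() as cur:
            for lo, hi in ranges:
                cur.execute("SELECT id, name FROM users WHERE id BETWEEN %s AND %s",
                            (lo, hi))
                for row in cur:
                    yield row
    finally:
        conn.close()

rows_rdd = sc.parallelize(bounds, num_parts).mapPartitions(fetch_chunk)
print(rows_rdd.take(5))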