stoty commented on code in PR #145:
URL: https://github.com/apache/phoenix-connectors/pull/145#discussion_r2120203482
##########
phoenix5-spark/src/it/java/org/apache/phoenix/spark/DataSourceApiIT.java:
##########
@@ -73,10 +73,8 @@ public Configuration getConfiguration(Configuration confToClone) {
@Test
public void basicWriteAndReadBackTest() throws SQLException {
- SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("phoenix-test")
-     .set("spark.hadoopRDD.ignoreEmptySplits", "false");
- JavaSparkContext jsc = new JavaSparkContext(sparkConf);
- SQLContext sqlContext = new SQLContext(jsc);
+
+ SparkSession spark = SparkUtil.getSparkSession();
Review Comment:
That one doesn't set spark.hadoopRDD.ignoreEmptySplits for Spark 3 (which is fine).
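For context, a minimal sketch of what a shared session helper like the SparkUtil.getSparkSession() referenced in the diff might look like. The actual SparkUtil implementation is not shown in this thread, so the class name, builder options, and whether spark.hadoopRDD.ignoreEmptySplits is set at all are assumptions for illustration only:

```java
// Hypothetical sketch of a shared SparkSession helper; the real SparkUtil's
// configuration is not shown in this thread, so these options are illustrative.
import org.apache.spark.sql.SparkSession;

public final class SparkSessionSketch {

    private static volatile SparkSession session;

    public static SparkSession getSparkSession() {
        if (session == null) {
            synchronized (SparkSessionSketch.class) {
                if (session == null) {
                    session = SparkSession.builder()
                        .master("local")
                        .appName("phoenix-test")
                        // Only relevant to Spark 2-era empty-split handling;
                        // as noted in the review, Spark 3 does not need this.
                        .config("spark.hadoopRDD.ignoreEmptySplits", "false")
                        .getOrCreate();
                }
            }
        }
        return session;
    }
}
```

Tests would then obtain the shared session via SparkSessionSketch.getSparkSession() instead of constructing a SparkConf, JavaSparkContext, and SQLContext per test, which is the simplification the diff above makes.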