zhengruifeng commented on code in PR #38460:
URL: https://github.com/apache/spark/pull/38460#discussion_r1010078740


##########
python/pyspark/sql/connect/client.py:
##########
@@ -145,6 +145,39 @@ def _build_metrics(self, metrics: "pb2.Response.Metrics") -> typing.List[PlanMet
     def sql(self, sql_string: str) -> "DataFrame":
         return DataFrame.withPlan(SQL(sql_string), self)
 
+    def range(
+        self,
+        start: int,
+        end: int,
+        step: Optional[int] = None,

Review Comment:
   I think we can use `step: int = 1`.
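   For illustration, a minimal sketch of what the signature could look like with that default (the `num_partitions` parameter and the stubbed body are assumptions based on the test below, not the PR's actual code):

   ```python
   def range(
       self,
       start: int,
       end: int,
       step: int = 1,  # default to 1 instead of Optional[int] = None
       num_partitions: Optional[int] = None,  # assumed parameter, see the test below
   ) -> "DataFrame":
       ...
   ```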



##########
python/pyspark/sql/tests/connect/test_connect_plan_only.py:
##########
@@ -118,6 +118,15 @@ def test_relation_alias(self):
         plan = df.alias("table_alias")._plan.to_proto(self.connect)
         self.assertEqual(plan.root.common.alias, "table_alias")
 
+    def test_range(self):
+        plan = self.connect.range(start=10, end=20, step=3, num_partitions=4)._plan.to_proto(
+            self.connect
+        )
+        self.assertEqual(plan.root.range.start, 10)
+        self.assertEqual(plan.root.range.end, 20)
+        self.assertEqual(plan.root.range.step.step, 3)
+        self.assertEqual(plan.root.range.num_partitions.num_partitions, 4)
+

Review Comment:
   What about adding a new case like `range(start=10, end=20)` and checking that:
   1. `step` is set to 1 (if we make the default value 1);
   2. `num_partitions` is not set (see the sketch below).
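   A rough sketch of such a case (the test name is hypothetical, and the `HasField` check assumes `num_partitions` stays an unset nested message when not provided):

   ```python
   def test_range_defaults(self):
       # Hypothetical test: assumes `step` defaults to 1 on the client side
       # and `num_partitions` is left unset in the proto when not passed.
       plan = self.connect.range(start=10, end=20)._plan.to_proto(self.connect)
       self.assertEqual(plan.root.range.start, 10)
       self.assertEqual(plan.root.range.end, 20)
       self.assertEqual(plan.root.range.step.step, 1)
       self.assertFalse(plan.root.range.HasField("num_partitions"))
   ```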



##########
python/pyspark/sql/connect/client.py:
##########
@@ -145,6 +145,39 @@ def _build_metrics(self, metrics: "pb2.Response.Metrics") -> typing.List[PlanMet
     def sql(self, sql_string: str) -> "DataFrame":
         return DataFrame.withPlan(SQL(sql_string), self)
 
+    def range(
+        self,
+        start: int,
+        end: int,
+        step: Optional[int] = None,

Review Comment:
   Furthermore, I think we can make `step` a required field in the proto.
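   As a rough illustration of what that would buy (hypothetical test, assuming the nested wrapper around `step` goes away so the value is read as `range.step` rather than `range.step.step`):

   ```python
   def test_range_with_required_step(self):
       # Hypothetical: with `step` as a plain required field in the proto,
       # the value is read directly from `range.step`.
       plan = self.connect.range(start=10, end=20, step=3)._plan.to_proto(self.connect)
       self.assertEqual(plan.root.range.step, 3)
   ```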



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

