cdkrot commented on code in PR #42949:
URL: https://github.com/apache/spark/pull/42949#discussion_r1331785918
##########
python/pyspark/sql/connect/client/artifact.py:
##########
@@ -243,11 +244,15 @@ def _create_requests(
         self, *path: str, pyfile: bool, archive: bool, file: bool
     ) -> Iterator[proto.AddArtifactsRequest]:
         """Separated for the testing purpose."""
-        return self._add_artifacts(
-            chain(
-                *(self._parse_artifacts(p, pyfile=pyfile, archive=archive, file=file) for p in path)
+        try:
+            yield from self._add_artifacts(
+                chain(
+                    *(self._parse_artifacts(p, pyfile=pyfile, archive=archive, file=file) for p in path)
+                )
             )
-        )
+        except Exception as e:
+            logger.error(f"Failed to submit addArtifacts request: {e}")

Review Comment:
   I don't think we can do something like that here. We don't have a server-side error response to process; this is a purely client-side error, an exception thrown while grpc consumes the iterator of requests. It's not possible to extract any information outside of the grpc call, since grpc suppresses the error entirely. I think the only option for making proper exception classes is to preload all the artifacts into memory and construct the corresponding requests before streaming them into grpc.
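   A minimal sketch of that preloading alternative, reusing only the names visible in the hunk above (`_parse_artifacts`, `_add_artifacts`, `chain`, `logger`, `proto`) and assuming the module's existing imports; an illustration of the idea, not the final implementation:

   ```python
   def _create_requests(
       self, *path: str, pyfile: bool, archive: bool, file: bool
   ) -> Iterator[proto.AddArtifactsRequest]:
       """Separated for the testing purpose."""
       # Sketch: build the full request list eagerly so that any parsing
       # error raises here, where it can be wrapped in a proper exception
       # class, instead of inside grpc's consumption of a lazy generator.
       try:
           requests = list(
               self._add_artifacts(
                   chain(
                       *(
                           self._parse_artifacts(p, pyfile=pyfile, archive=archive, file=file)
                           for p in path
                       )
                   )
               )
           )
       except Exception as e:
           logger.error(f"Failed to build addArtifacts requests: {e}")
           raise
       # Only reached once every artifact has been parsed successfully.
       yield from requests
   ```

   Materializing the list makes any parsing failure surface on the `list(...)` call, where it can be translated into a dedicated exception class, at the cost of buffering every artifact request in memory before the first one is sent.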