YuanHanzhong commented on code in PR #43626:
URL: https://github.com/apache/spark/pull/43626#discussion_r1379459881


##########
hadoop-cloud/README.md:
##########
@@ -16,5 +16,5 @@ Integration tests will have some extra configurations for example selecting the
 run the test against. Those configs are passed as environment variables and the existence of these
 variables must be checked by the test.
 Like for `AwsS3AbortableStreamBasedCheckpointFileManagerSuite` the S3 bucket used for testing
-is passed in the `S3A_PATH` and the credetinals to access AWS S3 are AWS_ACCESS_KEY_ID and
+is passed in the `S3A_PATH` and the credentials to access AWS S3 are AWS_ACCESS_KEY_ID and
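
The README excerpt above says the integration-test configuration is passed via environment variables whose existence the test must check. As a minimal sketch of that pattern (the variable names `S3A_PATH` and `AWS_ACCESS_KEY_ID` come from the excerpt; the guard script itself is an illustration, not part of the Spark build):

```shell
#!/bin/sh
# Skip the S3A integration suite when the required settings are absent,
# mirroring the README's "existence of these variables must be checked" rule.
if [ -z "${S3A_PATH:-}" ] || [ -z "${AWS_ACCESS_KEY_ID:-}" ]; then
  echo "S3A_PATH and AWS_ACCESS_KEY_ID not set; skipping S3A integration tests." >&2
  exit 0
fi
echo "Running S3A integration tests against ${S3A_PATH}"
```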

Review Comment:
   Yes, of course, I will contribute more. Actually, this typo took me several hours to locate. In the future, I would like to make more technical contributions. Let me participate; let's work together to make Spark even better.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

