This is an automated email from the ASF dual-hosted git repository.

roryqi pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-uniffle.git


The following commit(s) were added to refs/heads/master by this push:
     new e828c504 [MINOR] fix: the description of Spark patches in the README.md (#801)
e828c504 is described below

commit e828c5044a6d33a2e33bcdaa91d27dfdfd99da84
Author: roryqi <[email protected]>
AuthorDate: Fri Apr 7 10:20:42 2023 +0800

    [MINOR] fix: the description of Spark patches in the README.md (#801)
    
    ### What changes were proposed in this pull request?
    
    We modified the directory structure, so the description in the README should be updated to match.
    
    ### Why are the changes needed?
    The directory description in the README is out of date after the restructuring.
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    No tests needed; this is a documentation-only change.
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index db0d9c4c..0b4edecd 100644
--- a/README.md
+++ b/README.md
@@ -67,7 +67,7 @@ The shuffle data is stored with index file and data file. Data file has all bloc
 ## Supported Spark Version
 Currently supports Spark 2.3.x, Spark 2.4.x, Spark 3.0.x, Spark 3.1.x, Spark 3.2.x, Spark 3.3.x
 
-Note: To support dynamic allocation, the patch(which is included in client-spark/patch folder) should be applied to Spark
+Note: To support dynamic allocation, the patch(which is included in patch/spark folder) should be applied to Spark
 
 ## Supported MapReduce Version
 Currently supports the MapReduce framework of Hadoop 2.8.5
@@ -212,7 +212,7 @@ rss-xxx.tgz will be generated for deployment
 ### Support Spark dynamic allocation
 
 To support spark dynamic allocation with Uniffle, spark code should be updated.
-There are 3 patches for spark (2.4.6/3.1.2/3.2.1) in spark-patches folder for reference.
+There are 3 patches for spark (2.4.6/3.1.2/3.2.1) in patch/spark folder for reference.
 
 After apply the patch and rebuild spark, add following configuration in spark conf to enable dynamic allocation:
   ```
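The README text changed above says the patches in the patch/spark folder must be applied to Spark before dynamic allocation works with Uniffle. As a hedged sketch of the mechanics only, the following self-contained example builds a throwaway git repo and applies a patch to it; all file names and paths here are stand-ins invented for illustration, not the real patch/spark file names:

```shell
# Self-contained demo of "apply a patch to a source tree", the same git
# workflow one would use with a real patch from patch/spark (paths below
# are hypothetical).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo && cd demo
printf 'hello\n' > file.txt
git add file.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm init
# Produce a patch file (stand-in for e.g. a patch/spark/<version>.patch).
printf 'hello patched\n' > file.txt
git diff > ../demo.patch
git checkout -q -- file.txt        # back to the pristine checkout
# Apply it, as one would with: git apply <path-to-uniffle>/patch/spark/<file>
git apply ../demo.patch
cat file.txt
```

For Spark itself the checkout would be a Spark source tree at one of the versions listed (2.4.6/3.1.2/3.2.1), followed by a rebuild of Spark as the README describes.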
