LuciferYang commented on code in PR #44183:
URL: https://github.com/apache/spark/pull/44183#discussion_r1416729223


##########
common/network-common/src/main/java/org/apache/spark/network/util/DBProvider.java:
##########
@@ -38,17 +38,18 @@ public static DB initDB(
         StoreVersion version,
         ObjectMapper mapper) throws IOException {
       if (dbFile != null) {
-        switch (dbBackend) {
-          case LEVELDB:
+        return switch (dbBackend) {
+          case LEVELDB -> {
            org.iq80.leveldb.DB levelDB = LevelDBProvider.initLevelDB(dbFile, version, mapper);
            logger.warn("The LEVELDB is deprecated. Please use ROCKSDB instead.");
-            return levelDB != null ? new LevelDB(levelDB) : null;
-          case ROCKSDB:
+            yield levelDB != null ? new LevelDB(levelDB) : null;
+          }
+          case ROCKSDB -> {
            org.rocksdb.RocksDB rocksDB = RocksDBProvider.initRockDB(dbFile, version, mapper);
-            return rocksDB != null ? new RocksDB(rocksDB) : null;
-          default:
-            throw new IllegalArgumentException("Unsupported DBBackend: " + dbBackend);
+            yield rocksDB != null ? new RocksDB(rocksDB) : null;
+          }
+          default -> throw new IllegalArgumentException("Unsupported DBBackend: " + dbBackend);

Review Comment:
   How about
   
   ```java
   return switch (dbBackend) {
             case LEVELDB -> {
               logger.warn("The LEVELDB is deprecated. Please use ROCKSDB instead.");
               yield new LevelDB(LevelDBProvider.initLevelDB(dbFile, version, mapper));
             }
             case ROCKSDB -> new RocksDB(RocksDBProvider.initRockDB(dbFile, version, mapper));
           };
   ```
   Since `dbBackend` is an enum and the switch expression covers every constant, the `default` case is not necessary.
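The exhaustiveness point above can be sketched as follows. This is a minimal, self-contained illustration; the two-constant `DBBackend` enum and the `name` method here are hypothetical stand-ins, not Spark's real classes:

```java
// Hypothetical mirror of a two-constant backend enum, for illustration only.
enum DBBackend { LEVELDB, ROCKSDB }

public class EnumSwitchDemo {
  // A switch *expression* over an enum is checked for exhaustiveness by the
  // compiler: once every constant is listed, `default` is unnecessary, and
  // adding a new constant later turns into a compile error here instead of a
  // silent runtime fallthrough.
  static String name(DBBackend backend) {
    return switch (backend) {
      case LEVELDB -> "leveldb";
      case ROCKSDB -> "rocksdb";
    };
  }

  public static void main(String[] args) {
    System.out.println(name(DBBackend.LEVELDB)); // leveldb
    System.out.println(name(DBBackend.ROCKSDB)); // rocksdb
  }
}
```

Dropping `default` in this situation is generally preferable: it lets the compiler, rather than a runtime exception, catch a newly added enum constant.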



##########
launcher/src/main/java/org/apache/spark/launcher/CommandBuilderUtils.java:
##########
@@ -143,59 +143,61 @@ static List<String> parseOptionString(String s) {
         escapeNext = false;
       } else if (inOpt) {
         switch (c) {
-        case '\\':
-          if (inSingleQuote) {
-            opt.appendCodePoint(c);
-          } else {
-            escapeNext = true;
+          case '\\' -> {

Review Comment:
   For this particular case, I personally don't think `Pattern Matching for switch` demonstrates its advantages.
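The reviewer's reservation can be illustrated with a minimal sketch (class and method names here are hypothetical, loosely modeled on the `parseOptionString` loop): the arrow labels remove fall-through and `break`, but each arm still just mutates local state, so the rewrite is mostly cosmetic.

```java
public class CharSwitchDemo {
  // Counts backslash escape characters in a string, dispatching per char.
  // The arrow-label form avoids `break`, but the body of each arm is the
  // same imperative statement it was in the old statement-style switch.
  static int countEscapes(String s) {
    int escapes = 0;
    for (char c : s.toCharArray()) {
      switch (c) {
        case '\\' -> escapes++;   // no fall-through, no break needed
        default -> { /* other characters: nothing to do */ }
      }
    }
    return escapes;
  }

  public static void main(String[] args) {
    System.out.println(countEscapes("a\\b\\c")); // 2
  }
}
```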



##########
sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:
##########
@@ -84,22 +84,15 @@ public enum _Fields implements org.apache.thrift.TFieldIdEnum {
      * Find the _Fields constant that matches fieldId, or null if its not found.
      */
     public static _Fields findByThriftId(int fieldId) {

Review Comment:
   This file is copied from Hive 0.13; let's leave it as it is.



##########
sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/ParquetVectorUpdaterFactory.java:
##########
@@ -181,9 +181,8 @@ public ParquetVectorUpdater getUpdater(ColumnDescriptor descriptor, DataType spa
         } else if (sparkType == DataTypes.BinaryType) {
           return new FixedLenByteArrayUpdater(arrayLen);
         }
-        break;
-      default:
-        break;
+      }
+      default -> {}

Review Comment:
   Just to confirm, can we move 
   
   ```java
   throw constructConvertNotSupportedException(descriptor, sparkType);
   ``` 
   
   from line 190 into the default branch?
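   
   If the throw does move into the default arm, the shape would be roughly the following. This is a standalone sketch with hypothetical names; the real method dispatches on `descriptor` and `sparkType` rather than a string, and `constructConvertNotSupportedException` is stood in for by a plain `IllegalArgumentException`:

```java
public class DefaultThrowDemo {
  // Hypothetical simplification of getUpdater: supported type names yield
  // an updater description; everything else fails fast in the default arm
  // instead of breaking out of the switch and throwing afterwards.
  static String updaterFor(String typeName, int arrayLen) {
    return switch (typeName) {
      case "binary" -> "FixedLenByteArrayUpdater(" + arrayLen + ")";
      default -> throw new IllegalArgumentException(
          "Unsupported type: " + typeName);
    };
  }

  public static void main(String[] args) {
    System.out.println(updaterFor("binary", 16)); // FixedLenByteArrayUpdater(16)
  }
}
```

One benefit of this shape is that the unsupported-type path lives in exactly one place, and the switch expression guarantees every path either yields a value or throws.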
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

