[ https://issues.apache.org/jira/browse/SPARK-45664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jiaan Geng updated SPARK-45664:
-------------------------------
    Description: 
Currently, Spark supports all of the ORC compression codecs, but the codecs ORC supports and the codecs Spark supports are not completely one-to-one, because Spark introduces two additional codec names: none and UNCOMPRESSED.

There are a lot of magic strings copied from the ORC compression codec names, so developers have to maintain the consistency between them by hand. This is error-prone and reduces development efficiency.

  was:
Currently, Spark supports all of the ORC compression codecs, but the codecs ORC supports and the codecs Spark supports are not completely one-to-one, because Spark introduces two additional codec names: none and UNCOMPRESSED.

There are a lot of magic strings copied from the Parquet compression codec names, so developers have to maintain the consistency between them by hand. This is error-prone and reduces development efficiency.


> Introduce a mapper for orc compression codecs
> ---------------------------------------------
>
>                 Key: SPARK-45664
>                 URL: https://issues.apache.org/jira/browse/SPARK-45664
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Jiaan Geng
>            Assignee: Jiaan Geng
>            Priority: Major
>
> Currently, Spark supports all of the ORC compression codecs, but the codecs
> ORC supports and the codecs Spark supports are not completely one-to-one,
> because Spark introduces two additional codec names: none and UNCOMPRESSED.
> There are a lot of magic strings copied from the ORC compression codec
> names, so developers have to maintain the consistency between them by hand.
> This is error-prone and reduces development efficiency.
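For illustration, here is a minimal sketch of what such a mapper could look like. The object name OrcCompressionCodec and the method fromString are assumptions for this sketch, not the committed implementation; it only shows the idea of centralizing the Spark-to-ORC name mapping in one place, with the two Spark-only names none and UNCOMPRESSED both resolving to ORC's CompressionKind.NONE:

{code:scala}
import java.util.Locale

import org.apache.orc.CompressionKind

// Hypothetical sketch: a single authoritative mapping from Spark's codec
// names to ORC's CompressionKind, replacing scattered magic strings.
object OrcCompressionCodec {
  private val sparkToOrc: Map[String, CompressionKind] = Map(
    "NONE" -> CompressionKind.NONE,         // Spark-only name
    "UNCOMPRESSED" -> CompressionKind.NONE, // Spark-only name
    "ZLIB" -> CompressionKind.ZLIB,
    "SNAPPY" -> CompressionKind.SNAPPY,
    "LZO" -> CompressionKind.LZO,
    "LZ4" -> CompressionKind.LZ4,
    "ZSTD" -> CompressionKind.ZSTD
  )

  // Resolve a Spark-level codec name (case-insensitive) to ORC's enum,
  // failing fast on anything outside the supported set.
  def fromString(name: String): CompressionKind =
    sparkToOrc.getOrElse(
      name.toUpperCase(Locale.ROOT),
      throw new IllegalArgumentException(s"Unsupported ORC codec: $name"))
}
{code}

With a mapper along these lines, the ORC read/write paths would call OrcCompressionCodec.fromString instead of copying codec strings, so any drift from the set of codecs ORC supports surfaces in a single place.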