dongjoon-hyun commented on PR #45408:
URL: https://github.com/apache/spark/pull/45408#issuecomment-1984433848

   Thank you for the confirmation, @ted-jenks . Well, in this case, it's too 
late to change the behavior again. Apache Spark 3.3 has been EOL since last 
year, and I don't think we need to change the behavior for Apache Spark 3.4.3 
and 3.5.2 because the Apache Spark community didn't have such an official 
contract before. It would have been great if you had participated in the 
community's Apache Spark 3.3.0 RC votes at that time.
   
   > > It sounds like you have other systems to read Spark's data.
   >
   > Correct. The issue was that from 3.2 to 3.3 there was a behavior change in 
the base64 encoding used by Spark. Previously, it did not chunk the output; 
now it does. Chunked base64 cannot be read by non-MIME-compatible base64 
decoders, so the data output by Spark appears corrupt to systems that follow 
the basic base64 standard.
   > 
   > I think the best path forward is to use MIME encoding/decoding without 
chunking. This is the most fault-tolerant option: existing use cases will not 
break, and the pre-3.3 base64 behavior is restored.
   
   However, I understand and agree with @ted-jenks 's point as a nice-to-have 
for Apache Spark 4+ officially. In other words, if we want to merge this PR, 
we need to make the behavior official from Apache Spark 4.0.0 onward and 
protect it as a kind of developer interface for all future releases. Do you 
think that's okay, @ted-jenks ?
   
   BTW, what do you think about this proposal, @yaooqinn (the original author 
of #35110), @cloud-fan, and @HyukjinKwon ?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

