Github user scottcarey commented on the issue:

    https://github.com/apache/spark/pull/21070
  
    > This is about getting Parquet updated, not about worrying whether users 
can easily add compression implementations to their classpath.
    
    Yes, of course.
    
    My hunch is that someone else will read in the release notes that Spark 2.3.0 
supports Zstandard and that Parquet 1.10.0 supports Zstandard, then realize the 
two don't work in combination and end up here.  So I feel that this is the 
right place to discuss the state of these features until there is another more 
specific place to do so.
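    For context, a sketch of the configuration such a user would likely try 
after reading both release notes (assuming a running `SparkSession` named 
`spark` and an existing DataFrame `df`; `zstd` is the codec name Parquet uses):

    ```python
    # Configuration sketch, not a definitive reproduction: the settings a
    # user would try after reading the Spark 2.3.0 and Parquet 1.10.0
    # release notes.  `spark` and `df` are assumed to already exist.

    # Session-wide default codec for Parquet output:
    spark.conf.set("spark.sql.parquet.compression.codec", "zstd")

    # Or per write:
    df.write.option("compression", "zstd").parquet("/tmp/zstd-test")

    # With Spark still bundling an older Parquet, this is where the
    # "supported in both, but not in combination" surprise shows up.
    ```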
    
    The discussion here has been useful in getting closer to understanding what 
further tasks remain.  If there are any follow-on issues, the discussion can 
move there.
    
    
    I would love to be able to test this with my full use case and give it a 
big thumbs up if it works.  Unfortunately my only motivation for this upgrade 
is access to Zstandard, and I'm not as excited to say 'works for me as long as 
I don't use the new Parquet codecs'.
    


