[ https://issues.apache.org/jira/browse/SPARK-21280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16071749#comment-16071749 ]
Eran Moscovici edited comment on SPARK-21280 at 7/2/17 6:35 PM:
----------------------------------------------------------------

There is a self-evident use case for using the BloomFilter class as a value class: a table of BloomFilters that filters out requests for entries which do not exist in the table holding the actual data. Other use cases can be thought of. Unfortunately, adding getters/setters to the BloomFilter class will not help, since the BloomFilter class itself is abstract and, among other things, relies on another class for instantiation. If the BloomFilter class was intended as a static utility class, then perhaps it should not be instantiable at all.

was (Author: emoscovici):
There is a self-evident use case for using the BloomFilter class as a value class: a table of BloomFilters that filters out requests for entries which do not exist in the table holding the actual data. Other use cases can be thought of. Unfortunately, adding getters/setters to the BloomFilter class will not help, since the BloomFilter class itself is abstract and, among other things, relies on another class for instantiation.

> org.apache.spark.util.sketch.BloomFilter not bean compliant
> -----------------------------------------------------------
>
>                 Key: SPARK-21280
>                 URL: https://issues.apache.org/jira/browse/SPARK-21280
>             Project: Spark
>          Issue Type: Improvement
>          Components: Java API
>    Affects Versions: 2.1.1
>            Reporter: Eran Moscovici
>            Priority: Minor
>
> Trying to work with Dataset<BloomFilter> fails at runtime with the 'not bean compliant' exception.
> This means that BloomFilter objects cannot be used as values handled within a Spark Dataset or saved (for example, as a Parquet file).
> One would expect an object within the Spark ecosystem ('org.apache.spark.util.sketch.BloomFilter') to be able to do that.
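For reference, a minimal Java sketch of the failing usage described above (untested; the class name and output path are illustrative). It builds a BloomFilter through the public static factory and then tries to wrap it in a Dataset with a bean encoder, which is where the 'not bean compliant' rejection surfaces:

{code:java}
import java.util.Collections;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.util.sketch.BloomFilter;

public class BloomFilterDatasetRepro {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("bloom-filter-dataset-repro")
        .master("local[*]")
        .getOrCreate();

    // BloomFilter is abstract; instances come from the static factory,
    // and the concrete implementation class is not public.
    BloomFilter filter = BloomFilter.create(1000);
    filter.putString("existing-key");

    // Fails at runtime with the 'not bean compliant' error described in the
    // issue: BloomFilter exposes no getters/setters for the bean encoder to map.
    Dataset<BloomFilter> ds = spark.createDataset(
        Collections.singletonList(filter), Encoders.bean(BloomFilter.class));

    // Never reached; saving the filters (illustrative path) would be the eventual goal.
    ds.write().parquet("/tmp/bloom-filters.parquet");

    spark.stop();
  }
}
{code}

As the comment notes, simply adding getters/setters to BloomFilter would not be enough on its own, because the class is abstract and instances are created through a separate implementation class behind the factory method.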