Probably for purposes of chaining, though it won't be very useful here. Something like:
df.unpersist().cache(... some other settings ...)
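
For instance, something along these lines compiles precisely because unpersist() hands back the same Dataset reference (just a sketch; the DataFrame and the storage level are arbitrary):

import org.apache.spark.sql.DataFrame
import org.apache.spark.storage.StorageLevel

// unpersist() returns this.type, so further Dataset methods can be
// chained straight onto the call and the DataFrame type is preserved.
def recache(df: DataFrame): DataFrame =
  df.unpersist(blocking = true).persist(StorageLevel.MEMORY_ONLY)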

foreachBatch wants a function that evaluates to Unit, but this block still
qualifies - it doesn't matter what the last expression's value is if it's
discarded. The example does seem to compile here; are you sure? What error do
you get? It may not be related to the return type at all.
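
And if the block's value really were the issue, you could always end it with () to force it to Unit. A rough sketch only - the "parquet" format and the output paths are placeholders, append mode is added so repeated micro-batches don't fail, and the wrapper function is made up for illustration:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.streaming.StreamingQuery

// Hypothetical helper: write each micro-batch to two locations, caching
// the batch so the source is only read once per trigger.
def writeTwice(streamingDF: DataFrame): StreamingQuery =
  streamingDF.writeStream.foreachBatch { (batchDF: DataFrame, batchId: Long) =>
    batchDF.persist()
    batchDF.write.mode("append").format("parquet").save("/tmp/location1")  // location 1
    batchDF.write.mode("append").format("parquet").save("/tmp/location2")  // location 2
    batchDF.unpersist()
    ()  // make the block's value explicitly Unit
  }.start()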


On Thu, Oct 22, 2020 at 5:40 AM German Schiavon <gschiavonsp...@gmail.com>
wrote:

> Hello!
>
> I'd like to ask if there is any reason to return *this.type* when calling
> *dataframe.unpersist()*
>
> def unpersist(blocking: Boolean): this.type = {
>   sparkSession.sharedState.cacheManager.uncacheQuery(
>     sparkSession, logicalPlan, cascade = false, blocking)
>   this
> }
>
>
> Just pointing it out because this example from the docs doesn't compile,
> since unpersist() is not Unit:
>
> streamingDF.writeStream.foreachBatch { (batchDF: DataFrame, batchId: Long) =>
>   batchDF.persist()
>   batchDF.write.format(...).save(...)  // location 1
>   batchDF.write.format(...).save(...)  // location 2
>   batchDF.unpersist()
> }
>
>
> Thanks!
>
