Hey,
I've only just seen this reply.
We have Ignite persistence enabled. The caches/tables are the primary
source of the data. That's the use case.
If we build an ML model from the data in a cache, Ignite's behaviour of
deleting that cache when the dataset is closed means we'll have lost the data.
We were just lucky this showed up in tests before it got anywhere near
production data.

In our case, we're pushing data into a cache continually and rebuilding the
model periodically.
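
Roughly, the pattern looks like the sketch below (a minimal illustration
against the Ignite 2.8 ML API; the cache name, trainer and vectorizer here
are placeholders, not our production code). The cache is the primary,
persistent store, and the training job only reads from it:

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.dataset.feature.extractor.Vectorizer;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DoubleArrayVectorizer;
import org.apache.ignite.ml.tree.DecisionTreeClassificationTrainer;
import org.apache.ignite.ml.tree.DecisionTreeNode;

public class PeriodicRetrainSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Long-lived cache backed by Ignite persistence: the primary store.
            IgniteCache<Integer, double[]> events = ignite.getOrCreateCache("events");

            // Writers push rows continually: [feature1, feature2, label].
            events.put(1, new double[] {0.5, 1.2, 1.0});
            events.put(2, new double[] {0.1, 0.4, 0.0});

            // Periodically (scheduler omitted) rebuild the model from the live cache.
            DecisionTreeClassificationTrainer trainer =
                new DecisionTreeClassificationTrainer(5, 0.0);

            // fit() builds a cache-based dataset internally; closing that dataset
            // should only drop its internal training caches, never the upstream
            // "events" cache that holds our data.
            DecisionTreeNode mdl = trainer.fit(
                ignite,
                events,
                new DoubleArrayVectorizer<Integer>().labeled(Vectorizer.LabelCoordinate.LAST));

            // Serialize/export the model elsewhere, then keep writing into "events".
            System.out.println("Rebuilt model: " + mdl);
        }
    }
}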

Regards,
Courtney Robinson
Founder and CEO, Hypi
Tel: +44 208 123 2413 (GMT+0) <https://hypi.io>



On Mon, Aug 3, 2020 at 5:28 PM zaleslaw <zaleslaw....@gmail.com> wrote:

> Dear Courtney Robinson, let's discuss here the possible behaviour of this
> CacheBased Dataset closing.
>
> When we designed this feature, we thought that all the training parts and
> related data should be deleted from the caches, and the model should be
> serialized or exported somewhere.
>
> What is your use case? Could you share some code or pseudo-code?
> How are you going to handle data after training?
>
>
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>
