Hi Ara,

Are you running Kafka Streams 0.10.1? If so, there is a known resource leak
in the window store. See: https://github.com/apache/kafka/pull/2122
You can either check out the Apache Kafka 0.10.1 branch (which contains the
fix) and build it yourself, or use the latest Confluent-packaged version of
the library, which is available from Confluent's Maven repository:

<repositories>
    <repository>
        <id>confluent</id>
        <url>http://packages.confluent.io/maven/</url>
    </repository>
</repositories>

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>0.10.1.0-cp2</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.1.0-cp2</version>
</dependency>
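Separately, since you mentioned RocksDBConfigSetter: you can use it to cap
the per-store RocksDB memory (block cache and memtables). A minimal sketch
(the class name and the specific sizes here are illustrative, not
recommendations; the setter interface and StreamsConfig key are from the
0.10.1 Streams API) might look like:

```java
import java.util.Map;

import org.apache.kafka.streams.state.RocksDBConfigSetter;
import org.rocksdb.BlockBasedTableConfig;
import org.rocksdb.Options;

// Hypothetical example class: bounds the off-heap memory each RocksDB
// store is allowed to use. Kafka Streams calls setConfig() once per store.
public class BoundedMemoryRocksDBConfig implements RocksDBConfigSetter {

    @Override
    public void setConfig(final String storeName,
                          final Options options,
                          final Map<String, Object> configs) {
        // Shrink the block cache for this store (example value: 16 MB).
        final BlockBasedTableConfig tableConfig = new BlockBasedTableConfig();
        tableConfig.setBlockCacheSize(16 * 1024 * 1024L);
        options.setTableFormatConfig(tableConfig);

        // Limit the number and size of in-memory write buffers (memtables).
        options.setMaxWriteBufferNumber(2);
        options.setWriteBufferSize(8 * 1024 * 1024L);
    }
}
```

You then register it via the streams config, e.g.
props.put(StreamsConfig.ROCKSDB_CONFIG_SETTER_CLASS_CONFIG,
BoundedMemoryRocksDBConfig.class); — note this bounds each store
individually, so total memory still scales with the number of stores and
partitions.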

Thanks,

Damian


On Wed, 16 Nov 2016 at 14:52 Ara Ebrahimi <ara.ebrah...@argyledata.com>
wrote:

> Hi,
>
> I have a few KTables in my application. Some of them have unlimited
> windows. If I leave the application to run for a few hours, I see the java
> process consume more and more memory, way above the -Xmx limit. I
> understand this is due to the rocksdb native lib used by kafka streams.
> What I don’t understand is how I can control how much rocksdb data should
> be kept in memory. I know I can use the RocksDBConfigSetter class, but I
> need more information on why this unbounded memory allocation happens and
> how to tune it.
>
> Thanks,
> Ara.
>
