Hi,

So I updated the libs locally, rebuilt, and re-ran the example with this 
version, and it now worked without any problems.

Chris



On 04.04.18, 12:58, "Christofer Dutz" <christofer.d...@c-ware.de> wrote:

    Hi all,
    
    reporting back from my easter holidays :-)
    
    Today I had to help a customer with getting a POC working that uses PLC4X 
and Edgent. Unfortunately, it seems that in order to use the Kafka connector I 
can only use 0.x versions of Kafka. When connecting to 1.x versions I get 
stack overflows and OutOfMemory errors. A quick test updating the Kafka libs 
from the ancient 0.8.2.2 to 1.1.0 seemed not to break anything, so I'll do 
some local tests with an updated Kafka client.
    
    @vino yang ... have you been working on adding the Annotations to the 
client?
    
    @all others ... does anyone have objections to updating the Kafka client 
libs to 1.1.0? It shouldn't break anything, as the new client should be 
backward compatible. Since we are currently not using anything above the API 
level of 0.8.2, there shouldn't be any exceptions either (I don't know of 
anything removed that could be a problem).
    
    Chris
    
    
    
    On 20.03.18, 10:33, "Christofer Dutz" <christofer.d...@c-ware.de> wrote:
    
        Ok,
        
        So I just added a new Annotation type to the Kafka module. 
        
        org.apache.edgent.connectors.kafka.annotations.KafkaVersion
        
        It has a fromVersion and a toVersion attribute. Both are optional, so 
just adding the annotation has no effect (besides a few additional CPU 
operations). The annotation can be applied to methods or classes (every method 
then inherits it). I hope that's ok, because implementing this at the 
parameter level would make things extremely difficult.
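        The annotation described above could look roughly like the following 
sketch. The attribute types, defaults, and the sample usage class are 
assumptions for illustration, not the actual Edgent source:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Sketch of the described annotation; the real one lives at
// org.apache.edgent.connectors.kafka.annotations.KafkaVersion.
@Retention(RetentionPolicy.RUNTIME)              // visible at runtime so an Aspect can inspect it
@Target({ElementType.METHOD, ElementType.TYPE})  // methods, or classes (inherited by every method)
@interface KafkaVersion {
    String fromVersion() default "";  // optional lower bound; empty = unconstrained
    String toVersion() default "";    // optional upper bound; empty = unconstrained
}

// Hypothetical usage: a method that requires the 0.9+ consumer API.
class KafkaOps {
    @KafkaVersion(fromVersion = "0.9.0.0")
    public void useNewConsumerApi() { }
}
```

        Since both attributes default to empty, a bare @KafkaVersion carries 
no constraint, matching the "no effect" behaviour described above.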
        
        @vino yang With this you should be able to attach Kafka version 
constraints to your code changes. Just tell me if something's missing or needs 
to be done differently.
        
        For now this annotation will have no effect as I haven't implemented 
the Aspect for doing the checks, but I'll start working on that as soon as you 
have annotated something.
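        The runtime check that such an Aspect might perform can be sketched in 
plain Java. The helper names, the version-comparison scheme, and the exception 
type are all illustrative assumptions, not Edgent code:

```java
// Sketch of the version check a KafkaVersion-aware Aspect could run
// before invoking an annotated method.
class KafkaVersionCheck {

    // Compare dotted version strings numerically, so "0.8.2.2" < "1.1.0"
    // and missing trailing segments count as zero ("0.9" == "0.9.0.0").
    static int compareVersions(String a, String b) {
        String[] pa = a.split("\\."), pb = b.split("\\.");
        int n = Math.max(pa.length, pb.length);
        for (int i = 0; i < n; i++) {
            int ai = i < pa.length ? Integer.parseInt(pa[i]) : 0;
            int bi = i < pb.length ? Integer.parseInt(pb[i]) : 0;
            if (ai != bi) return Integer.compare(ai, bi);
        }
        return 0;
    }

    // Throw if the actual Kafka version falls outside [from, to];
    // an empty bound means "unconstrained", matching the optional attributes.
    static void checkVersion(String actual, String from, String to) {
        if ((!from.isEmpty() && compareVersions(actual, from) < 0)
                || (!to.isEmpty() && compareVersions(actual, to) > 0)) {
            throw new UnsupportedOperationException(
                "Kafka " + actual + " is outside the supported range ["
                + from + ", " + to + "]");
        }
    }
}
```

        For example, checkVersion("0.8.2.2", "0.9", "") would fail for a 
method annotated with fromVersion = "0.9", while checkVersion("1.1.0", "0.9", 
"") would pass.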
        
        Chris
        
        On 20.03.18, 10:11, "Christofer Dutz" 
<christofer.d...@c-ware.de> wrote:
        
            Ok ... maybe I should add the annotation before continuing my 
work on the AWS connector ...
            
            
            Chris
            
            On 04.03.18, 08:10, "vino yang" <yanghua1...@gmail.com> wrote:
            
                The reason is that Kafka 0.9+ provides a new consumer API 
which has more features and better performance.
                
                Just like Flink's implementation:
                https://github.com/apache/flink/tree/master/flink-connectors
                
                vinoyang
                Thanks.
                
            
            
        
        
    
    
