Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-15 Thread Robert Metzger
Great to hear!

On Tue, Mar 15, 2016 at 1:14 PM, Balaji Rajagopalan <
balaji.rajagopa...@olacabs.com> wrote:

> Robert,
>   I got it working for 1.0.0.
>
> balaji

Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-15 Thread Balaji Rajagopalan
Robert,
  I got it working for 1.0.0.

balaji


Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-14 Thread Balaji Rajagopalan
Yep, the same issue as before (class not found) with Flink 0.10.2 and Scala
version 2.11. I was not able to use Scala 2.10, since the flink-connector-kafka
artifact for 0.10.2 is not available.

balaji


Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-14 Thread Balaji Rajagopalan
Yes, figured that out, thanks for pointing that out, my bad. I have put back
0.10.2 as the Flink version and will try to reproduce the problem again; this
time I have explicitly called out the Scala version as 2.11.



Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-14 Thread Robert Metzger
Hi,

flink-connector-kafka_ doesn't exist for 1.0.0. You have to use either
flink-connector-kafka-0.8_ or flink-connector-kafka-0.9_.
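
The advice above can be sketched as a POM fragment. This is a minimal sketch,
assuming a Kafka 0.8 broker and the Scala 2.10 build; the artifact id carries
both the Kafka API version and the Scala suffix, and both have to match your
setup:

```xml
<!-- Sketch: Kafka 0.8 connector for Flink 1.0.0. The artifact name embeds
     the Kafka API version (0.8) and the Scala suffix (2.10); adjust both
     to match the broker and Scala build you actually use. -->
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
   <version>1.0.0</version>
</dependency>
```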



Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-14 Thread Balaji Rajagopalan
What I noticed was that, if I remove the dependency on
flink-connector-kafka, the error goes away, so it clearly has something to do
with that dependency.
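
One way to test that hypothesis without editing the POM back and forth is to
inspect the resolved dependency tree. A sketch, run from the project root
(`dependency:tree` is a standard maven-dependency-plugin goal):

```
# List only the org.apache.flink artifacts the build resolves, so a
# mismatched connector version or Scala suffix stands out:
mvn dependency:tree -Dincludes=org.apache.flink
```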



Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-14 Thread Balaji Rajagopalan
Robert,
   I have moved on to the latest version of Flink, 1.0.0, hoping that will
solve my problem with the Kafka connector. Here is my pom.xml, but now I
cannot get the code to compile.

[ERROR] Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.1:compile (scala-compile-first)
on project flink-streaming-demo: Execution scala-compile-first of goal
net.alchim31.maven:scala-maven-plugin:3.2.1:compile failed: For artifact
{null:null:null:jar}: The groupId cannot be empty. -> [Help 1]

I read about the above error; in most cases people were able to overcome
it by deleting the .m2 directory, but that did not fix the issue for me.

What I noticed was that, if I remove the dependency on flink-connector-kafka,
the error goes away, so it clearly has something to do with that dependency.

Here is my pom.xml



<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
         http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>

   <groupId>com.dataArtisans</groupId>
   <artifactId>flink-streaming-demo</artifactId>
   <version>0.1</version>
   <packaging>jar</packaging>

   <name>Flink Streaming Demo</name>
   <url>http://www.data-artisans.com</url>

   <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <slf4j.version>1.7.12</slf4j.version>
      <flink.version>1.0.0</flink.version>
      <scala.version>2.10</scala.version>
   </properties>

   <dependencies>

      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-streaming-scala_${scala.version}</artifactId>
         <version>${flink.version}</version>
      </dependency>

      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-runtime-web_${scala.version}</artifactId>
         <version>${flink.version}</version>
      </dependency>

      <dependency>
         <groupId>org.elasticsearch</groupId>
         <artifactId>elasticsearch</artifactId>
         <version>1.7.3</version>
         <scope>compile</scope>
      </dependency>

      <dependency>
         <groupId>joda-time</groupId>
         <artifactId>joda-time</artifactId>
         <version>2.7</version>
      </dependency>

      <dependency>
         <groupId>org.apache.kafka</groupId>
         <artifactId>kafka_${scala.version}</artifactId>
         <version>0.8.2.0</version>
      </dependency>

      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-connector-kafka_${scala.version}</artifactId>
         <version>${flink.version}</version>
      </dependency>

      <dependency>
         <groupId>org.json4s</groupId>
         <artifactId>json4s-native_${scala.version}</artifactId>
         <version>3.3.0</version>
      </dependency>

   </dependencies>

   <build>
      <plugins>

         <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.1</version>
            <executions>
               <execution>
                  <id>scala-compile-first</id>
                  <phase>process-resources</phase>
                  <goals>
                     <goal>compile</goal>
                  </goals>
               </execution>
               <execution>
                  <id>scala-test-compile</id>
                  <phase>process-test-resources</phase>
                  <goals>
                     <goal>testCompile</goal>
                  </goals>
               </execution>
            </executions>
            <configuration>
               <jvmArgs>
                  <jvmArg>-Xms128m</jvmArg>
                  <jvmArg>-Xmx512m</jvmArg>
               </jvmArgs>
            </configuration>
         </plugin>

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
            <version>2.9</version>
            <executions>
               <execution>
                  <id>unpack</id>
                  <phase>prepare-package</phase>
                  <goals>
                     <goal>unpack</goal>
                  </goals>
                  <configuration>
                     <artifactItems>
                        <artifactItem>
                           <groupId>org.apache.flink</groupId>
                           <artifactId>flink-connector-kafka_${scala.version}</artifactId>
                           <version>1.0.0</version>
                           <type>jar</type>
                           <overWrite>false</overWrite>
                           <outputDirectory>${project.build.directory}/classes</outputDirectory>
                           <includes>org/apache/flink/**</includes>
                        </artifactItem>
                        <artifactItem>
                           <groupId>org.apache.kafka</groupId>
                           <artifactId>kafka_${scala.version}</artifactId>
                           <version>0.8.2.0</version>
                           <type>jar</type>
                           <overWrite>false</overWrite>
                           <outputDirectory>${project.build.directory}/classes</outputDirectory>
                           <includes>kafka/**</includes>
                        </artifactItem>
                     </artifactItems>
                  </configuration>
               </execution>
            </executions>
         </plugin>

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
               <source>1.8</source>
               <target>1.8</target>
            </configuration>
         </plugin>

         <plugin>
            <groupId>org.apache.rat</groupId>
            <artifactId>apache-rat-plugin</artifactId>
            <version>0.10</version>
            <inherited>false</inherited>
            <executions>
               <execution>
                  <phase>verify</phase>
                  <goals>
                     <goal>check</goal>
                  </goals>
               </execution>
            </executions>
            <configuration>
               <excludeSubProjects>false</excludeSubProjects>
               <numUnapprovedLicenses>0</numUnapprovedLicenses>
               <licenses>
                  <license implementation="org.apache.rat.analysis.license.SimplePatternBasedLicense">
                     <licenseFamilyCategory>AL2</licenseFamilyCategory>
                     <licenseFamilyName>Apache License 2.0</licenseFamilyName>
                     <patterns>
                        <pattern>Copyright 2015 data Artisans GmbH</pattern>
                        <pattern>Licensed under the Apache License, Version 2.0 (the "License");</pattern>
                     </patterns>
                  </license>
               </licenses>
               <licenseFamilies>
                  <licenseFamily implementation="org.apache.rat.license.SimpleLicenseFamily">
                     <familyName>Apache License 2.0</familyName>
                  </licenseFamily>
               </licenseFamilies>
               <excludes>
                  <exclude>**/.*</exclude>
                  <exclude>**/*.prefs</exclude>
                  <exclude>**/*.properties</exclude>
                  <exclude>**/*.log</exclude>
                  <exclude>**/*.txt</exclude>
                  <exclude>**/README.md</exclude>
                  <exclude>CHANGELOG</exclude>
                  <exclude>**/*.iml</exclude>
                  <exclude>**/target/**</exclude>
                  <exclude>**/build/**</exclude>
               </excludes>
            </configuration>
         </plugin>

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-checkstyle-plugin</artifactId>
            <version>2.12.1</version>
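
One hedged reading of the `groupId cannot be empty` failure above: Maven is
trying to resolve an artifact whose coordinates came through empty, which is
exactly what the `{null:null:null:jar}` placeholder in the message describes.
For comparison, a fully specified `artifactItem` for the unpack goal would
look like the following sketch (using the 0.8 connector name that exists for
Flink 1.0.0, per Robert's earlier reply):

```xml
<!-- Every artifactItem needs explicit groupId/artifactId/version; if any
     of the three is missing or unparsed, Maven reports the artifact as
     {null:null:null:jar} and refuses to resolve it. -->
<artifactItem>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
   <version>1.0.0</version>
   <type>jar</type>
   <overWrite>false</overWrite>
   <outputDirectory>${project.build.directory}/classes</outputDirectory>
   <includes>org/apache/flink/**</includes>
</artifactItem>
```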

Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-14 Thread Robert Metzger
Can you send me the full build file to further investigate the issue?



Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-11 Thread Balaji Rajagopalan
Robert,
  That did not fix it (using the same version for Flink and the connector).
I tried with Scala version 2.11, so I will try to see whether Scala 2.10
makes any difference.

balaji
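
A quick sanity check related to the class-not-found report below: confirm
that both the Kafka classes and the Flink connector classes actually made it
into the fat jar. A sketch, using the jar name from this thread:

```
# The consumer class the runtime could not find:
jar tf flink-streaming-demo-0.1.jar | grep kafka/javaapi/consumer/SimpleConsumer
# The Flink connector classes that try to load it:
jar tf flink-streaming-demo-0.1.jar | grep org/apache/flink/streaming/connectors/kafka
```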

On Fri, Mar 11, 2016 at 8:06 PM, Robert Metzger  wrote:

> Hi,
>
> you have to use the same version for all dependencies from the
> "org.apache.flink" group.
>
> You said these are the versions you are using:
>
> flink.version = 0.10.2
> kafka.version = 0.8.2
> flink.kafka.connection.version = 0.9.1
>
> For the connector, you also need to use 0.10.2.
>
>
>
> On Fri, Mar 11, 2016 at 9:56 AM, Balaji Rajagopalan <
> balaji.rajagopa...@olacabs.com> wrote:
>
>> I am trying to use the Flink Kafka connector; for this I have specified
>> the kafka connector dependency and created a fat jar since default flink
>> installation does not contain kafka connector jars. I have made sure that
>> flink-streaming-demo-0.1.jar has the
>> kafka.javaapi.consumer.SimpleConsumer.class but still I see the class not
>> found exception.
>>
>> The code for kafka connector in flink.
>>
>> val env = StreamExecutionEnvironment.getExecutionEnvironment
>> val prop:Properties = new Properties()
>> prop.setProperty("zookeeper.connect","somezookeer:2181")
>> prop.setProperty("group.id","some-group")
>> prop.setProperty("bootstrap.servers","somebroker:9092")
>>
>> val stream = env
>>   .addSource(new FlinkKafkaConsumer082[String]("location", new 
>> SimpleStringSchema, prop))
>>
>> jar tvf flink-streaming-demo-0.1.jar | grep
>> kafka.javaapi.consumer.SimpleConsumer
>>
>>   5111 Fri Mar 11 14:18:36 UTC 2016
>> *kafka/javaapi/consumer/SimpleConsumer*.class
>>
>> flink.version = 0.10.2
>> kafka.version = 0.8.2
>> flink.kafka.connection.version = 0.9.1
>>
>> The command that I use to run the flink program in yarn cluster is below,
>>
>> HADOOP_CONF_DIR=/etc/hadoop/conf /usr/share/flink/bin/flink run -c
>> com.dataartisans.flink_demo.examples.DriverEventConsumer  -m yarn-cluster
>> -yn 2 /home/balajirajagopalan/flink-streaming-demo-0.1.jar
>>
>> java.lang.NoClassDefFoundError: kafka/javaapi/consumer/SimpleConsumer
>>
>> at
>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:691)
>>
>> at
>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.(FlinkKafkaConsumer.java:281)
>>
>> at
>> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.(FlinkKafkaConsumer082.java:49)
>>
>> at
>> com.dataartisans.flink_demo.examples.DriverEventConsumer$.main(DriverEventConsumer.scala:53)
>>
>> at
>> com.dataartisans.flink_demo.examples.DriverEventConsumer.main(DriverEventConsumer.scala)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>
>> at java.lang.reflect.Method.invoke(Method.java:497)
>>
>> at
>> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
>>
>> at
>> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
>>
>> at org.apache.flink.client.program.Client.runBlocking(Client.java:252)
>>
>> at
>> org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)
>>
>> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)
>>
>> at
>> org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)
>>
>> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)
>>
>> Caused by: java.lang.ClassNotFoundException:
>> kafka.javaapi.consumer.SimpleConsumer
>>
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>
>> ... 16 more
>>
>>
>> Any help appreciated.
>>
>>
>> balaji
>>
>
>


Re: kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-11 Thread Robert Metzger
Hi,

you have to use the same version for all dependencies from the
"org.apache.flink" group.

You said these are the versions you are using:

flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connection.version = 0.9.1

For the connector, you also need to use 0.10.2.
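
The rule above can be sketched as a POM fragment. A minimal sketch, assuming
the Scala 2.10 builds of the 0.10.2 artifacts; the point is that one property
drives every org.apache.flink version so the connector cannot drift:

```xml
<properties>
   <flink.version>0.10.2</flink.version>
</properties>

<dependencies>
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-scala_2.10</artifactId>
      <version>${flink.version}</version>
   </dependency>
   <!-- Connector pinned to the same Flink version as the core artifacts -->
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka_2.10</artifactId>
      <version>${flink.version}</version>
   </dependency>
</dependencies>
```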



On Fri, Mar 11, 2016 at 9:56 AM, Balaji Rajagopalan <
balaji.rajagopa...@olacabs.com> wrote:

> I am trying to use the Flink Kafka connector. I have specified the
> Kafka connector dependency and created a fat jar, since the default Flink
> installation does not contain the Kafka connector jars. I have made sure that
> flink-streaming-demo-0.1.jar contains
> kafka.javaapi.consumer.SimpleConsumer.class, but I still see the
> ClassNotFoundException.
>
> The code that uses the Kafka connector in Flink:
>
> val env = StreamExecutionEnvironment.getExecutionEnvironment
> val prop:Properties = new Properties()
> prop.setProperty("zookeeper.connect","somezookeer:2181")
> prop.setProperty("group.id","some-group")
> prop.setProperty("bootstrap.servers","somebroker:9092")
>
> val stream = env
>   .addSource(new FlinkKafkaConsumer082[String]("location", new SimpleStringSchema, prop))
>
> jar tvf flink-streaming-demo-0.1.jar | grep
> kafka.javaapi.consumer.SimpleConsumer
>
>   5111 Fri Mar 11 14:18:36 UTC 2016
> *kafka/javaapi/consumer/SimpleConsumer*.class
>
> flink.version = 0.10.2
> kafka.version = 0.8.2
> flink.kafka.connector.version = 0.9.1
>
> The command that I use to run the Flink program in a YARN cluster is below:
>
> HADOOP_CONF_DIR=/etc/hadoop/conf /usr/share/flink/bin/flink run -c
> com.dataartisans.flink_demo.examples.DriverEventConsumer  -m yarn-cluster
> -yn 2 /home/balajirajagopalan/flink-streaming-demo-0.1.jar
>
> java.lang.NoClassDefFoundError: kafka/javaapi/consumer/SimpleConsumer
>
> at
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:691)
>
> at
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.(FlinkKafkaConsumer.java:281)
>
> at
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.(FlinkKafkaConsumer082.java:49)
>
> at
> com.dataartisans.flink_demo.examples.DriverEventConsumer$.main(DriverEventConsumer.scala:53)
>
> at
> com.dataartisans.flink_demo.examples.DriverEventConsumer.main(DriverEventConsumer.scala)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> at java.lang.reflect.Method.invoke(Method.java:497)
>
> at
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
>
> at
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
>
> at org.apache.flink.client.program.Client.runBlocking(Client.java:252)
>
> at
> org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)
>
> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)
>
> at
> org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)
>
> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)
>
> Caused by: java.lang.ClassNotFoundException:
> kafka.javaapi.consumer.SimpleConsumer
>
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>
> ... 16 more
>
>
> Any help appreciated.
>
>
> balaji
>


kafka.javaapi.consumer.SimpleConsumer class not found

2016-03-11 Thread Balaji Rajagopalan
I am trying to use the Flink Kafka connector. I have specified the
Kafka connector dependency and created a fat jar, since the default Flink
installation does not contain the Kafka connector jars. I have made sure that
flink-streaming-demo-0.1.jar contains
kafka.javaapi.consumer.SimpleConsumer.class, but I still see the
ClassNotFoundException.

The code that uses the Kafka connector in Flink:

val env = StreamExecutionEnvironment.getExecutionEnvironment
val prop: Properties = new Properties()
prop.setProperty("zookeeper.connect", "somezookeer:2181")
prop.setProperty("group.id", "some-group")
prop.setProperty("bootstrap.servers", "somebroker:9092")

val stream = env
  .addSource(new FlinkKafkaConsumer082[String]("location", new SimpleStringSchema, prop))

jar tvf flink-streaming-demo-0.1.jar | grep
kafka.javaapi.consumer.SimpleConsumer

  5111 Fri Mar 11 14:18:36 UTC 2016 *kafka/javaapi/consumer/SimpleConsumer*.class

flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

The command that I use to run the Flink program in a YARN cluster is below:

HADOOP_CONF_DIR=/etc/hadoop/conf /usr/share/flink/bin/flink run -c
com.dataartisans.flink_demo.examples.DriverEventConsumer  -m yarn-cluster
-yn 2 /home/balajirajagopalan/flink-streaming-demo-0.1.jar

java.lang.NoClassDefFoundError: kafka/javaapi/consumer/SimpleConsumer

at
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:691)

at
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.(FlinkKafkaConsumer.java:281)

at
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.(FlinkKafkaConsumer082.java:49)

at
com.dataartisans.flink_demo.examples.DriverEventConsumer$.main(DriverEventConsumer.scala:53)

at
com.dataartisans.flink_demo.examples.DriverEventConsumer.main(DriverEventConsumer.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:497)

at
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)

at
org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)

at org.apache.flink.client.program.Client.runBlocking(Client.java:252)

at
org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)

at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)

at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)

at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)

Caused by: java.lang.ClassNotFoundException:
kafka.javaapi.consumer.SimpleConsumer

at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

... 16 more


Any help appreciated.


balaji