Hi Joe,
 It turns out the wrong Kafka release was being picked up from our local repo.
Sorry for bothering you with a non-issue.
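
In case it helps anyone else hitting this: one way to confirm which kafka
artifact is actually being resolved (assuming Maven here) is something like

  mvn dependency:tree -Dincludes=org.apache.kafka

which lists the resolved kafka version in the dependency tree.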


On Thu, Jan 23, 2014 at 10:43 PM, Joe Stein <joe.st...@stealth.ly> wrote:

> Hi Abhinav,
>
> I just compiled and ran a consumer in java with maven and it worked fine.
>
> [root@localhost ~]# java -version
> java version "1.6.0_29"
> Java(TM) SE Runtime Environment (build 1.6.0_29-b11)
> Java HotSpot(TM) 64-Bit Server VM (build 20.4-b02, mixed mode)
>
> I did this using https://github.com/stealthly/dropwizard-kafka-http
> (changing the target & source in the pom to 1.6).
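> A sketch of that change, in case it helps (the standard maven-compiler-plugin
> configuration; the enclosing <build><plugins> section is assumed):
>
>   <plugin>
>     <groupId>org.apache.maven.plugins</groupId>
>     <artifactId>maven-compiler-plugin</artifactId>
>     <configuration>
>       <source>1.6</source>
>       <target>1.6</target>
>     </configuration>
>   </plugin>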
>
> are you using the Oracle JDK ?
>
> /*******************************************
>  Joe Stein
>  Founder, Principal Consultant
>  Big Data Open Source Security LLC
>  http://www.stealth.ly
>  Twitter: @allthingshadoop <http://www.twitter.com/allthingshadoop>
> ********************************************/
>
>
> On Thu, Jan 23, 2014 at 2:21 AM, Abhinav Anand <ab.rv...@gmail.com> wrote:
>
> > Hi Joe,
> >   I am trying to set up a Kafka consumer in Java. I am using
> > kafka.consumer.Consumer.createJavaConsumerConnector to create a consumer
> > connector, which is then used to get message streams. The connector type is
> > kafka.javaapi.consumer.ConsumerConnector.
> >
> > Code:
> >
> > import java.util.HashMap;
> > import java.util.List;
> > import java.util.Map;
> >
> > import kafka.consumer.*;
> > import kafka.javaapi.consumer.ConsumerConnector;
> >
> > ConsumerConnector consumerConnector =
> >     Consumer.createJavaConsumerConnector(getConsumerConfig());
> > Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
> > topicCountMap.put(topic, 1);
> > Map<String, List<KafkaStream<byte[], byte[]>>> topicStreamMap =
> >     consumerConnector.createMessageStreams(topicCountMap);
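> >
> > For reference, a minimal sketch of consuming from such a stream (not my exact
> > code; it assumes a single stream for the topic and just prints each message):
> >
> > // requires import kafka.message.MessageAndMetadata;
> > KafkaStream<byte[], byte[]> stream = topicStreamMap.get(topic).get(0);
> > for (MessageAndMetadata<byte[], byte[]> msg : stream) {
> >     System.out.println("received: " + new String(msg.message()));
> > }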
> >
> >
> >
> > My maven dependency reads as:
> >
> >     <dependency>
> >       <groupId>org.apache.kafka</groupId>
> >       <artifactId>kafka_2.10</artifactId>
> >       <version>0.8.0</version>
> >     </dependency>
> > JRE version: 1.6
> >
> > The error I am getting is:
> >
> > testHasNext(com.walmartlabs.mupd8.KafkaSourceTest)  Time elapsed: 0.851 sec  <<< ERROR!
> >
> > java.lang.UnsupportedClassVersionError: kafka/javaapi/consumer/ConsumerConnector : Unsupported major.minor version 51.0
> >     at java.lang.ClassLoader.defineClass1(Native Method)
> >     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
> >     at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
> >     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> >     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
> >     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
> >     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >     at kafka.consumer.Consumer.createJavaConsumerConnector(Unknown Source)
> >     at com.walmartlabs.mupd8.KafkaSource.getIterator(KafkaSource.java:111)
> >     at com.walmartlabs.mupd8.KafkaSource.initialize(KafkaSource.java:143)
> >
> >
> >
> >
> >
> > > On Thu, Jan 23, 2014 at 12:05 PM, Joe Stein <joe.st...@stealth.ly> wrote:
> >
> > > It would be helpful if you can reproduce the issue.
> > >
> > > /*******************************************
> > >  Joe Stein
> > >  Founder, Principal Consultant
> > >  Big Data Open Source Security LLC
> > >  http://www.stealth.ly
> > >  Twitter: @allthingshadoop
> > > ********************************************/
> > >
> > >
> > > On Jan 22, 2014, at 11:22 PM, Joe Stein <joe.st...@stealth.ly> wrote:
> > >
> > > > Can you share your reference?
> > > >
> > > >
> > > > /*******************************************
> > > > Joe Stein
> > > > Founder, Principal Consultant
> > > > Big Data Open Source Security LLC
> > > > http://www.stealth.ly
> > > > Twitter: @allthingshadoop
> > > > ********************************************/
> > > >
> > > >
> > > > On Jan 22, 2014, at 10:36 PM, Abhinav Anand <ab.rv...@gmail.com> wrote:
> > > >
> > > >> Hi Joe,
> > > >> I am using jre 1.6 and I don't see any reason for the error, but I am
> > > >> still getting the exception while running the consumer with jre 6. It
> > > >> runs fine with jre 7.
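> > > >>
> > > >> To double-check which bytecode version is in a jar, something like this
> > > >> should work (the path to the kafka jar is an assumption):
> > > >>
> > > >>   javap -verbose -classpath kafka_2.10-0.8.0.jar kafka.javaapi.consumer.ConsumerConnector | grep 'major version'
> > > >>
> > > >> where major version 51 means the class was compiled for Java 7 and 50 means Java 6.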
> > > >>
> > > >> Regards,
> > > >> Abhinav
> > > >>
> > > >>
> > > >> On Thu, Jan 23, 2014 at 3:57 AM, Joe Stein <joe.st...@stealth.ly> wrote:
> > > >>
> > > >>> The 0.8.0 final release (which was RC5) was built with JDK 6.
> > > >>>
> > > >>> /*******************************************
> > > >>> Joe Stein
> > > >>> Founder, Principal Consultant
> > > >>> Big Data Open Source Security LLC
> > > >>> http://www.stealth.ly
> > > >>> Twitter: @allthingshadoop <http://www.twitter.com/allthingshadoop>
> > > >>> ********************************************/
> > > >>>
> > > >>>
> > > >>>> On Wed, Jan 22, 2014 at 5:16 PM, Abhinav Anand <ab.rv...@gmail.com> wrote:
> > > >>>
> > > >>>> I am using jre 1.6.
> > > >>>>
> > > >>>> All the release candidates were built against 1.7. Was the final
> > > >>>> release also built against 1.7?
> > > >>>>
> > > >>>>
> > > >>>>> On Thu, Jan 23, 2014 at 3:25 AM, Abhinav Anand <ab.rv...@gmail.com> wrote:
> > > >>>>
> > > >>>>> Hi,
> > > >>>>> I have kafka_2.10 version 0.8.0 as a maven dependency. I am trying to
> > > >>>>> run a consumer, and it is throwing a major.minor version error.
> > > >>>>>
> > > >>>>> java.lang.UnsupportedClassVersionError:
> > > >>>>> kafka/javaapi/consumer/ConsumerConnector : Unsupported major.minor
> > > >>>>> version 51.0
> > > >>>>>
> > > >>>>> Is the Kafka repo built against jdk 1.7?
> > > >>>>>
> > > >>>>> --
> > > >>>>> Abhinav Anand
> > > >>>>>
> > > >>>>
> > > >>>>
> > > >>>>
> > > >>>> --
> > > >>>> Abhinav Anand
> > > >>>>
> > > >>>
> > > >>
> > > >>
> > > >>
> > > >> --
> > > >> Abhinav Anand
> > >
> >
> >
> >
> > --
> > Abhinav Anand
> >
>



-- 
Abhinav Anand
