RE: [EXTERNAL]Cannot list topics

2017-11-07 Thread Preston, Dale
It's almost certainly a typo in your command lines.  Not sure how to help 
without you posting them as requested.

Also, post the console output from when you created the topic.
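
For reference, the Quickstart steps to create and then list a topic look roughly like this (a sketch, assuming a single broker with ZooKeeper on localhost:2181):

  bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
  bin/kafka-topics.sh --list --zookeeper localhost:2181

If the --list command is pointed at a different ZooKeeper address than the one used for --create, it will come back empty.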

Dale

-Original Message-
From: Donghun Kim [mailto:kimdho...@gmail.com] 
Sent: Monday, November 6, 2017 7:03 PM
To: users@kafka.apache.org
Subject: [EXTERNAL]Cannot list topics

I’m just following the Quickstart document from http://kafka.apache.org/.
Following it as written, I tried to list the topic I made, but nothing shows up.
I might be doing something wrong, but it seems to need a fix.

Thanx :)


RE: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

2017-11-06 Thread Preston, Dale
Thanks, Jaikiran.  I will do that today or tonight.

Dale

-Original Message-
From: Jaikiran Pai [mailto:jai.forums2...@gmail.com] 
Sent: Monday, November 6, 2017 10:43 AM
To: users@kafka.apache.org
Subject: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Can you take a couple of thread dumps with an interval of around 5 seconds each 
when that 60 second delay occurs? You can use a tool like jstack to do that. 
That might give some hint on what’s going on.
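
In case it helps, something along these lines should do it (jps and jstack ship with the JDK; <pid> stands for the process id of the test app):

  jps -l
  jstack <pid> > dump1.txt
  (wait about 5 seconds)
  jstack <pid> > dump2.txt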

-Jaikiran

On Monday, November 6, 2017, Preston, Dale <dale.pres...@conocophillips.com>
wrote:

> (Answering now from my work email so please don't be confused.)
>
> The topic already exists.
>
> -Original Message-
> From: Jaikiran Pai [mailto:jai.forums2...@gmail.com]
> Sent: Sunday, November 5, 2017 10:56 PM
> To: users@kafka.apache.org
> Subject: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup
>
> Is the topic to which the message is being produced already present, 
> or is it auto-created?
>
> -Jaikiran
>
>
> On 05/11/17 3:43 PM, Dale wrote:
> > I am using the 0.9.0.0 log4j appender for Kafka because I have a lot 
> > of
> apps dependent on log4j 1.2.x that cannot be upgraded to use newer versions
> of log4j.   It appears that the appender has become part of log4j code in
> later versions of both tools.
> >
> > When I start my test app, the first message takes an exact and
> consistent 60 seconds plus a couple milliseconds to go out.  The 
> second message takes right around 200 milliseconds, and all the 
> messages after that take a couple of milliseconds.  The timing from 
> message 1 to 2 could be tolerated but the 60 seconds will never work 
> since the production use case app would typically run for 20 to 30 seconds.
> >
> > For testing, I brought the appender code into my project and added 
> > some
> additional console messages so I could see what is going on.  Here’s a 
> snippet of the console output:
> >
> > START LOG SNIPPET*** 
> > G:\kafkademoworkspace\testlog4jgenerator>java -Dlog4j.debug
> > -Dlog4j.configuration=file:///g:\kafkademoworkspace\testlog4jgenerator\log4j.properties
> > -cp .\;G:\kafkademoworkspace\testlog4jgenerator\target\testlog4jgenerator.jar;g:\kafkademoworkspace\testlog4jgenerator\target\libs\log4j-1.2.17.jar;g:\kafkademoworkspace\testlog4jgenerator\target\libs\*;g:\kafkademoworkspace\testlog4jgenerator\target\libs\kafka-clients-0.9.0.0.jar
> > com.mydomainname.messaging.testlog4jgenerator.LogGenerator
> > log4j: Using URL 
> > [file:/g:/kafkademoworkspace/testlog4jgenerator/log4j.properties]
> for automatic log4j configuration.
> > log4j: Reading configuration from URL 
> > file:/g:/kafkademoworkspace/testlog4jgenerator/log4j.properties
> > log4j: Parsing for [root] with value=[DEBUG,file,KAFKA].
> > log4j: Level token is [DEBUG].
> > log4j: Category root set to DEBUG
> > log4j: Parsing appender named "file".
> > log4j: Parsing layout options for "file".
> > log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n].
> > log4j: End of parsing for "file".
> > log4j: Setting property [file] to [/apps/logs/logtest.log].
> > log4j: Setting property [maxBackupIndex] to [10].
> > log4j: Setting property [maxFileSize] to [10MB].
> > log4j: setFile called: /apps/logs/logtest.log, true
> > log4j: setFile ended
> > log4j: Parsed "file" options.
> > log4j: Parsing appender named "KAFKA".
> > log4j: Parsing layout options for "KAFKA".
> > log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n].
> > log4j: End of parsing for "KAFKA".
> > log4j: Setting property [compressionType] to [none].
> > log4j: Setting property [topic] to [test].
> > log4j: Setting property [brokerList] to [localhost:9092].
> > log4j: Setting property [syncSend] to [false].
> > DPLOG: 2017-11-05T09:56:16.072Z - in Producer - creating new 
> > KafkaProducer
> > log4j: Kafka producer connected to localhost:9092
> > log4j: Logging for topic: test
> > log4j: Parsed "KAFKA" options.
> > log4j: Finished configuring.
> > 
> > DPLOG: 2017-11-05T09:56:16.338Z - append START
> > DPLOG: 2017-11-05T09:56:16.339Z - after subAppend.  Message is:
> 2017-11-05 03:56:16,333 DEBUG Sender:123 - Starting Kafka producer I/O 
> thread.
> >
> > log4j: [Sun Nov 05 03:56:16 CST 2017]2017-11-05 03:56:16,333 DEBUG

RE: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

2017-11-06 Thread Preston, Dale
(Answering now from my work email so please don't be confused.)

The topic already exists.
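
For reference, a quick way to double-check that from the command line (assuming the Quickstart topic name and ZooKeeper on localhost:2181):

  bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic test

Whether missing topics get auto-created is controlled by auto.create.topics.enable in the broker's server.properties.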

-Original Message-
From: Jaikiran Pai [mailto:jai.forums2...@gmail.com] 
Sent: Sunday, November 5, 2017 10:56 PM
To: users@kafka.apache.org
Subject: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Is the topic to which the message is being produced already present, or is it 
auto-created?

-Jaikiran


On 05/11/17 3:43 PM, Dale wrote:
> I am using the 0.9.0.0 log4j appender for Kafka because I have a lot of apps 
> dependent on log4j 1.2.x that cannot be upgraded to use newer versions of 
> log4j.   It appears that the appender has become part of log4j code in later 
> versions of both tools.
>
> When I start my test app, the first message takes an exact and consistent 60 
> seconds plus a couple milliseconds to go out.  The second message takes right 
> around 200 milliseconds, and all the messages after that take a couple of 
> milliseconds.  The timing from message 1 to 2 could be tolerated but the 60 
> seconds will never work since the production use case app would typically run 
> for 20 to 30 seconds.
>
> For testing, I brought the appender code into my project and added some 
> additional console messages so I could see what is going on.  Here’s a 
> snippet of the console output:
>
> START LOG SNIPPET***
> G:\kafkademoworkspace\testlog4jgenerator>java -Dlog4j.debug 
> -Dlog4j.configuration=file:///g:\kafkademoworkspace\testlog4jgenerator
> \log4j.properties -cp 
> .\;G:\kafkademoworkspace\testlog4jgenerator\target\testlog4jgenerator.
> jar;g:\kafkademoworkspace\testlog4jgenerator\target\libs\log4j-1.2.17.
> jar;g:\kafkademoworkspace\testlog4jgenerator\target\libs\*;g:\kafkadem
> oworkspace\testlog4jgenerator\target\libs\kafka-clients-0.9.0.0.jar 
> com.mydomainname.messaging.testlog4jgenerator.LogGenerator
> log4j: Using URL 
> [file:/g:/kafkademoworkspace/testlog4jgenerator/log4j.properties] for 
> automatic log4j configuration.
> log4j: Reading configuration from URL 
> file:/g:/kafkademoworkspace/testlog4jgenerator/log4j.properties
> log4j: Parsing for [root] with value=[DEBUG,file,KAFKA].
> log4j: Level token is [DEBUG].
> log4j: Category root set to DEBUG
> log4j: Parsing appender named "file".
> log4j: Parsing layout options for "file".
> log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n].
> log4j: End of parsing for "file".
> log4j: Setting property [file] to [/apps/logs/logtest.log].
> log4j: Setting property [maxBackupIndex] to [10].
> log4j: Setting property [maxFileSize] to [10MB].
> log4j: setFile called: /apps/logs/logtest.log, true
> log4j: setFile ended
> log4j: Parsed "file" options.
> log4j: Parsing appender named "KAFKA".
> log4j: Parsing layout options for "KAFKA".
> log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n].
> log4j: End of parsing for "KAFKA".
> log4j: Setting property [compressionType] to [none].
> log4j: Setting property [topic] to [test].
> log4j: Setting property [brokerList] to [localhost:9092].
> log4j: Setting property [syncSend] to [false].
> DPLOG: 2017-11-05T09:56:16.072Z - in Producer - creating new 
> KafkaProducer
> log4j: Kafka producer connected to localhost:9092
> log4j: Logging for topic: test
> log4j: Parsed "KAFKA" options.
> log4j: Finished configuring.
> 
> DPLOG: 2017-11-05T09:56:16.338Z - append START
> DPLOG: 2017-11-05T09:56:16.339Z - after subAppend.  Message is: 2017-11-05 
> 03:56:16,333 DEBUG Sender:123 - Starting Kafka producer I/O thread.
>
> log4j: [Sun Nov 05 03:56:16 CST 2017]2017-11-05 03:56:16,333 DEBUG Sender:123 
> - Starting Kafka producer I/O thread.
>
> DPLOG: 2017-11-05T09:56:16.342Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.347Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.348Z - append END
> 
> DPLOG: 2017-11-05T09:57:16.352Z - append START
> DPLOG: 2017-11-05T09:57:16.353Z - after subAppend.  Message is: 
> 2017-11-05 03:56:16,338 INFO  root:36 - Logging message: x=0
>
> log4j: [Sun Nov 05 03:56:16 CST 2017]2017-11-05 03:56:16,338 INFO  
> root:36 - Logging message: x=0
>
> DPLOG: 2017-11-05T09:57:16.361Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.526Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.526Z - append END
> 
> DPLOG: 2017-11-05T09:57:16.527Z - append START
> DPLOG: 2017-11-05T09:57:16.528Z - after subAppend.  Message is: 
> 2017-11-05 03:57:16,527 INFO  root:36 - Logging message: x=1
>
> log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,527 INFO  
> root:36 - Logging message: x=1
>
> DPLOG: 2017-11-05T09:57:16.529Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.530Z - after send to producer.
> DPLOG: 

Kafka log4j appender

2017-11-03 Thread Preston, Dale
Did the log4j appender get deprecated?  I've spent a good amount of time on 
Google and don't find any such announcement, but I also don't find any 
documentation since 0.7.

Thanks,

Dale
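
For anyone finding this thread later: the property names visible in the debug output earlier in this digest suggest a log4j.properties along the lines of the sketch below. This is a reconstruction, not official documentation; it assumes the org.apache.kafka.log4jappender.KafkaLog4jAppender class from the kafka-log4j-appender jar, and property names may differ across versions.

  # sketch reconstructed from the debug output above; verify against your Kafka version
  log4j.rootLogger=INFO, KAFKA
  log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
  log4j.appender.KAFKA.brokerList=localhost:9092
  log4j.appender.KAFKA.topic=test
  log4j.appender.KAFKA.compressionType=none
  log4j.appender.KAFKA.syncSend=false
  log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
  log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n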