#general
@subin.tp: Hello:raised_hands: Would like to track the average waiting time of messages in the Pinot input topic. Do we have any metric for that?
@ssubrama: You will need Kafka support for this. You need to get the ingestion time (into Kafka) of the message, and then use that to compute the elapsed time from that point until ingestion into Pinot.
@ssubrama: Pinot provides a `RowMetadata` class that the underlying plugin can populate with this information
@ssubrama: For each query, Pinot reports the largest such time difference seen across all segments, in the metadata sent out by the broker.
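As a rough illustration of the approach described above: the Kafka consumer plugin carries the Kafka-side timestamp per message, and the waiting period is the difference between that and the time Pinot consumes the row. The class below is only a hedged sketch, not the actual Pinot SPI; a real plugin would implement `org.apache.pinot.spi.stream.RowMetadata` and populate it for each record.
```java
// Hypothetical sketch (not the Pinot SPI itself): carries the timestamp Kafka
// attached to the record, ideally LogAppendTime, i.e. the time the broker
// ingested the message.
public class KafkaMessageMetadata {
  private final long kafkaIngestionTimeMs;

  public KafkaMessageMetadata(long kafkaIngestionTimeMs) {
    this.kafkaIngestionTimeMs = kafkaIngestionTimeMs;
  }

  public long getIngestionTimeMs() {
    return kafkaIngestionTimeMs;
  }

  // Waiting period of the message: time of Pinot ingestion (now) minus the
  // time the message entered Kafka.
  public long waitingPeriodMs() {
    return System.currentTimeMillis() - kafkaIngestionTimeMs;
  }
}
```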
@mbshrikanth: @mbshrikanth has joined the channel
@jsegall: @jsegall has joined the channel
@dongxiaoman: @dongxiaoman has joined the channel
#random
@mbshrikanth: @mbshrikanth has joined the channel
@jsegall: @jsegall has joined the channel
@dongxiaoman: @dongxiaoman has joined the channel
#troubleshooting
@cechovsky.jozef: Hi there, any help please? I’m struggling to connect an external Kafka to Pinot. I have Kafka deployed in one Kubernetes cluster and Pinot in a separate one. I’m 100% sure that the communication between these two clusters is working. Deployment of Pinot was done following this tutorial
@mayanks: No, Pinot does not need Kafka to be on the same ZK. What do you see in the server logs?
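For context, a realtime table consumes from whatever brokers are listed in its `streamConfigs` (nested under `tableIndexConfig` in the table config), so an external Kafka cluster only needs to be reachable from the Pinot servers. A minimal sketch with placeholder topic name, broker address, and decoder, not the config from this thread:
```json
{
  "streamConfigs": {
    "streamType": "kafka",
    "stream.kafka.topic.name": "my-topic",
    "stream.kafka.broker.list": "kafka-external.example.com:9092",
    "stream.kafka.consumer.type": "lowlevel",
    "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
    "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
  }
}
```
Since the clusters are separate, the broker list must be an address that is routable from the Pinot server pods (e.g. a LoadBalancer or externally advertised listener), not a cluster-local service name.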
@tanmay.movva: Hello, we are trying to connect Pinot with Trino and we are getting this error: ```No valid brokers found for backendentityview'``` We got to know it is because the trino-pinot connector doesn’t support mixed-case table names. Is anything planned to support mixed-case table names in the connector?
@tanmay.movva: It is failing at this point
@g.kishore: @elon.azoulay ^^
@elon.azoulay: Which version of trino and pinot are you using?
@elon.azoulay: Ah, we have a pr to address this:
@elon.azoulay:
@mbshrikanth: @mbshrikanth has joined the channel
@jsegall: @jsegall has joined the channel
@kchavda: Hi all, I'm pushing a table from Postgres to Kafka (using Debezium) to Pinot. The table has a few geography columns. When creating the realtime table, however, I am getting an error on Pinot (below).
```
java.lang.IllegalStateException: Cannot read single-value from Collection: [AQEAACDmEAAA5no2BviTXcB1T2ijhAxBQA==, 4326] for column: point
	at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:721) ~[pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.recordtransformer.DataTypeTransformer.standardizeCollection(DataTypeTransformer.java:193) ~[pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.recordtransformer.DataTypeTransformer.standardize(DataTypeTransformer.java:138) ~[pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.recordtransformer.DataTypeTransformer.transform(DataTypeTransformer.java:88) ~[pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.recordtransformer.CompositeTransformer.transform(CompositeTransformer.java:82) ~[pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.manager.realtime.LLRealtimeSegmentDataManager.processStreamEvents(LLRealtimeSegmentDataManager.java:491) [pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.manager.realtime.LLRealtimeSegmentDataManager.consumeLoop(LLRealtimeSegmentDataManager.java:402) [pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at org.apache.pinot.core.data.manager.realtime.LLRealtimeSegmentDataManager$PartitionConsumer.run(LLRealtimeSegmentDataManager.java:538) [pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-afa4b252ab1c424ddd6c859bb305b2aa342b66ed]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_282]
```
The point column has this as its value:
```
"point" : { "wkb" : "AQEAACDmEAAA5no2BviTXcB1T2ijhAxBQA==", "srid" : 4326 },
```
Any suggestions on how to resolve this? I have the column as a string in the Pinot table schema.
@jackie.jxt: Can you please share the schema? Are you planning to add a geo index to this column?
@jackie.jxt: Add @yupeng to the discussion
@yupeng: try `ST_GeogFromWKB` from
@jackie.jxt: You should be able to decode the input string using the in-built `base64Decode()` function. See
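Combining the two suggestions, one option would be an ingestion transform that decodes the base64 string and builds the geography value when the row is consumed. This is only a hedged sketch: the flattened source field name `point_wkb` and the exact config nesting are assumptions for illustration, and the destination `point` column would typically need to be a BYTES column in the schema rather than a string.
```json
{
  "ingestionConfig": {
    "transformConfigs": [
      {
        "columnName": "point",
        "transformFunction": "ST_GeogFromWKB(base64Decode(point_wkb))"
      }
    ]
  }
}
```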
@dongxiaoman: @dongxiaoman has joined the channel
#pinot-dev
@atri.sharma: Please review:
#getting-started
@cechovsky.jozef: @cechovsky.jozef has joined the channel