#general


@suresh.intuit: @suresh.intuit has joined the channel
@weixiang.sun: @weixiang.sun has joined the channel
@chengweili402: @chengweili402 has joined the channel

#random


@suresh.intuit: @suresh.intuit has joined the channel
@weixiang.sun: @weixiang.sun has joined the channel
@chengweili402: @chengweili402 has joined the channel

#feat-text-search


@sirsh: @sirsh has joined the channel

#troubleshooting


@suresh.intuit: @suresh.intuit has joined the channel
@surajkmth29: Hi Folks, we have a field of LONG data type that was earlier declared as a "dimensionField" in our schema. This field stores time in epoch milliseconds, and the data was loaded using this schema. However, we now want to make this field a "metricField" so that we can apply range indexing to it. When we tried to update the schema it resulted in the error `Backward incompatible schema <schemaName>. Only allow adding new columns`. What is the way to proceed?
  @mrpringle: Should this not be a dateTimeField? I think we can have multiple, although I'm not sure how Pinot deals with multiple time indexes. I think to update it you need to drop the table and schema and recreate them.
  @mayanks: Yeah, backward incompatible changes to schema are not allowed.
  @npawar: btw you don't need to move the field just to apply a range index. A range index can be applied to a dimensionField
  @npawar: Indexing doesn’t care about the column categorization
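  For illustration, a minimal sketch of the two pieces involved, assuming a hypothetical LONG epoch-millis column named `epochMillis` (the column and schema names here are illustrative, not from this thread). The column stays in `dimensionFieldSpecs` in the schema:
  ```
  {
    "schemaName": "mySchema",
    "dimensionFieldSpecs": [
      { "name": "epochMillis", "dataType": "LONG" }
    ]
  }
  ```
  and the range index is declared in the table config, not the schema:
  ```
  {
    "tableIndexConfig": {
      "rangeIndexColumns": ["epochMillis"]
    }
  }
  ```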
@phuchdh: Got the same issue. I tried to work around it by uploading the jars to GCS, but it doesn't work. Any ideas?
@sina.tamizi: Hi friends, has anyone had issues with the Pinot connector for Trino? I am getting the following errors and the query never runs: `Caught exception while deserializing data table of size: 1505 from server: 10.12.2.133_R` `java.lang.UnsupportedOperationException: Unsupported data table version: 3`
```
2021-09-13T06:35:32.016Z INFO dispatcher-query-15 io.trino.event.QueryMonitor TIMELINE: Query 20210913_063531_00006_7vsrs :: Transaction:[71205803-47f0-475d-88d4-56d5b4719d14] :: elapsed 220ms :: planning 46ms :: waiting 5ms :: scheduling 31ms :: running 20ms :: finishing 123ms :: begin 2021-09-13T06:35:31.783Z :: end 2021-09-13T06:35:32.003Z
2021-09-13T06:35:37.244Z INFO Query-20210913_063537_00007_7vsrs-464 io.trino.plugin.pinot.PinotSplitManager Got routing table for speedtest: {speedtest_REALTIME={Server_10.12.2.133_8098=[speedtest__0__0__20210913T0339Z]}}
2021-09-13T06:35:37.256Z INFO 20210913_063537_00007_7vsrs.1.0-0-135 io.trino.plugin.pinot.PinotSegmentPageSource Query 'SELECT upload_throughput, e_time_stamp, cpeid, download_throughput, avcid, service_plan, state, epoc_time_stamp FROM speedtest_REALTIME LIMIT 10' on host 'Optional[Server_10.12.2.133_8098]' for segment splits: [speedtest__0__0__20210913T0339Z]
2021-09-13T06:35:37.273Z ERROR nioEventLoopGroup-2-2 org.apache.pinot.core.transport.DataTableHandler Caught exception while deserializing data table of size: 1505 from server: 10.12.2.133_R
java.lang.UnsupportedOperationException: Unsupported data table version: 3
	at org.apache.pinot.core.common.datatable.DataTableFactory.getDataTable(DataTableFactory.java:37)
	at org.apache.pinot.core.transport.DataTableHandler.channelRead0(DataTableHandler.java:67)
	at org.apache.pinot.core.transport.DataTableHandler.channelRead0(DataTableHandler.java:36)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2021-09-13T06:36:37.263Z ERROR SplitRunner-28-135 io.trino.execution.executor.TaskExecutor Error processing Split 20210913_063537_00007_7vsrs.1.0-0 PinotSplit{splitType=SEGMENT, segments=[speedtest__0__0__20210913T0339Z], segmentHost=Optional[Server_10.12.2.133_8098]} (start = 4.476924908050052E9, wall = 60007 ms, cpu = 0 ms, wait = 0 ms, calls = 1)
java.lang.NullPointerException: null value in entry: Server_10.12.2.133_8098=null
	at com.google.common.collect.CollectPreconditions.checkEntryNotNull(CollectPreconditions.java:32)
	at com.google.common.collect.SingletonImmutableBiMap.<init>(SingletonImmutableBiMap.java:42)
	at com.google.common.collect.ImmutableBiMap.of(ImmutableBiMap.java:72)
	at com.google.common.collect.ImmutableMap.of(ImmutableMap.java:119)
	at com.google.common.collect.ImmutableMap.copyOf(ImmutableMap.java:454)
	at com.google.common.collect.ImmutableMap.copyOf(ImmutableMap.java:433)
	at io.trino.plugin.pinot.PinotSegmentPageSource.queryPinot(PinotSegmentPageSource.java:221)
	at io.trino.plugin.pinot.PinotSegmentPageSource.fetchPinotData(PinotSegmentPageSource.java:182)
	at io.trino.plugin.pinot.PinotSegmentPageSource.getNextPage(PinotSegmentPageSource.java:150)
	at io.trino.operator.TableScanOperator.getOutput(TableScanOperator.java:311)
	at io.trino.operator.Driver.processInternal(Driver.java:387)
	at io.trino.operator.Driver.lambda$processFor$9(Driver.java:291)
	at io.trino.operator.Driver.tryWithLock(Driver.java:683)
	at io.trino.operator.Driver.processFor(Driver.java:284)
	at io.trino.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1076)
	at io.trino.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:163)
	at io.trino.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:484)
	at io.trino.$gen.Trino_361____20210913_051507_2.run(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
2021-09-13T06:36:37.265Z ERROR remote-task-callback-13 io.trino.execution.StageStateMachine Stage 20210913_063537_00007_7vsrs.1 failed
java.lang.NullPointerException: null value in entry: Server_10.12.2.133_8098=null
	(same stack trace as above)
2021-09-13T06:36:37.280Z INFO dispatcher-query-15 io.trino.event.QueryMonitor TIMELINE: Query 20210913_063537_00007_7vsrs :: Transaction:[e51637ae-dc8c-409b-a9be-fe544757a91a] :: elapsed 60076ms :: planning 55ms :: waiting 3ms :: scheduling 12ms :: running 60007ms :: finishing 2ms :: begin 2021-09-13T06:35:37.189Z :: end 2021-09-13T06:36:37.265Z
```
The client session:
```
trino:default> SHOW TABLES FROM pinot.default;
   Table
------------
 speedtest
 speedtest2
 speedtest3
(3 rows)

Query 20210913_063531_00006_7vsrs, FINISHED, 1 node
Splits: 19 total, 19 done (100.00%)
0.22 [4 rows, 103B] [18 rows/s, 468B/s]

trino:default> select * from speedtest limit 10;

Query 20210913_063537_00007_7vsrs, FAILED, 1 node
Splits: 18 total, 0 done (0.00%)
1:00 [0 rows, 0B] [0 rows/s, 0B/s]

Query 20210913_063537_00007_7vsrs failed: null value in entry: Server_10.12.2.133_8098=null
```
  @xiangfu0: you are running a newer version of Pinot than the Trino connector supports
  @xiangfu0: can you try adding this config to the Pinot server conf file, so Pinot will use the old version of the data table serde format, which current Trino can understand: ```pinot.server.instance.currentDataTableVersion=2```
  @xiangfu0: then restart the Pinot servers
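  For reference, a sketch of where that property would go, assuming a deployment that starts servers with a config file such as `conf/pinot-server.conf` (the exact path is deployment-specific):
  ```
  # conf/pinot-server.conf (path depends on your deployment)
  # Emit the v2 data table format so older Trino connectors can deserialize responses
  pinot.server.instance.currentDataTableVersion=2
  ```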
  @xiangfu0: @elon.azoulay is working on upgrading Trino to Pinot 0.8; once that is in place, this should work without adding a new config
  @elon.azoulay: Speaking of, I just updated it: added support for all Pinot functions and aliases; you can even use date_time_convert and time_convert, and the connector will auto-uppercase the parameters.
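  As a hedged illustration of the kind of query this enables, reusing the `speedtest` table and `epoc_time_stamp` column from the thread above (the pass-through/dynamic-table syntax and the conversion format strings are assumptions, not confirmed for this connector version):
  ```
  -- The double-quoted string is handed to Pinot itself, so Pinot functions
  -- such as dateTimeConvert are available inside it.
  SELECT *
  FROM pinot.default."SELECT dateTimeConvert(epoc_time_stamp, '1:MILLISECONDS:EPOCH', '1:DAYS:EPOCH', '1:DAYS') AS day_bucket, download_throughput FROM speedtest LIMIT 10"
  ```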
@dadelcas: I have a deployment with one node of each type at the moment. One thing I've just noticed is that when the server is redeployed, its segments are moved to deep storage under DeletedSegments and the realtime table doesn't reprocess them, so they are no longer available to queries. I had to delete the table and recreate it to consume all the data again. Is this expected behaviour? I was actually expecting the low-level consumer to restart ingestion from the earliest offset that has not yet been moved to the offline table.
@weixiang.sun: @weixiang.sun has joined the channel
@chengweili402: @chengweili402 has joined the channel

#pinot-dev


@sirsh: @sirsh has joined the channel

#pinot-docsrus


@yash.agarwal: @yash.agarwal has joined the channel

#pinot-trino


@sirsh: @sirsh has joined the channel