#general


@mathur.amol: @mathur.amol has joined the channel
@tymm: Hello, I'm running Pinot on Docker, and am creating and pushing new data/segments (from CSV files) into Pinot every 1 minute. I've noticed that the time to push a segment into Pinot increases as the data/segment size grows, to the point where it now takes more than a minute to push a new segment. How can I make the segment push faster? Thanks.
  @g.kishore: Why every minute? If you plan to push that frequently, it might be better to stream it in via Kafka
  @g.kishore: How big is each file?
  @tymm: The existing services are using Pulsar to stream data, and Pinot doesn't have a Pulsar connector yet, hence the batch ingestion. Each file is around 100 KB to 3 MB.
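For the push-timing question above, here is a minimal sketch (not from the thread) that measures how long each batch push takes. It assumes the CSVs are loaded with the standalone ingestion job (`pinot-admin.sh LaunchDataIngestionJob`) and that `ingestion-job-spec.yaml` is a placeholder for the actual job spec; adjust both for your setup. Timing each push makes it easy to confirm that push time grows with segment size and has outgrown the 1-minute cadence, in which case the streaming route suggested above is likely the better fit.

```python
#!/usr/bin/env python3
"""Rough timing harness for per-minute segment pushes (a sketch, not from the thread).

Assumes pinot-admin.sh is on PATH and that ingestion-job-spec.yaml is a placeholder
name for a standard segment-creation-and-push job spec pointing at the CSV input
directory.
"""
import subprocess
import time

JOB_SPEC = "ingestion-job-spec.yaml"  # hypothetical path; adjust for your deployment


def push_once() -> float:
    """Run one ingestion job and return the elapsed wall-clock seconds."""
    start = time.monotonic()
    subprocess.run(
        ["pinot-admin.sh", "LaunchDataIngestionJob", "-jobSpecFile", JOB_SPEC],
        check=True,
    )
    return time.monotonic() - start


if __name__ == "__main__":
    elapsed = push_once()
    print(f"segment push took {elapsed:.1f}s")
    if elapsed > 60:
        print("push is slower than the 1-minute cadence; consider streaming ingestion")
```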
@rupaligwen: @rupaligwen has joined the channel
@avg1206: @avg1206 has joined the channel
@brijeshbhalodiya.bb: @brijeshbhalodiya.bb has joined the channel

#random


@mathur.amol: @mathur.amol has joined the channel
@rupaligwen: @rupaligwen has joined the channel
@avg1206: @avg1206 has joined the channel
@brijeshbhalodiya.bb: @brijeshbhalodiya.bb has joined the channel

#feat-text-search


@sudhir: @sudhir has joined the channel

#feat-presto-connector


@sudhir: @sudhir has joined the channel

#pql-2-calcite


@sudhir: @sudhir has joined the channel

#troubleshooting


@mathur.amol: @mathur.amol has joined the channel
@elon.azoulay: Hi, our brokers are taking 25s (max time) to return queries, but direct server queries return instantly. I took heap dumps, pmaps, etc. and the one thing that stands out is jstack output. Looks like HelixTaskExecutor threads are all waiting on the same object. Anyone ever see this behavior?
  @elon.azoulay: All waiting on `0x00000007188b2318`
  @elon.azoulay: This is on all brokers
  @elon.azoulay: The Grizzly server is responsive and ZK calls were coming back quickly; the only thing that stood out is all HelixTaskExecutor threads waiting on the same object.
  @elon.azoulay: Looks like 1 server is in a GC loop, maybe the brokers are waiting on it? The strange thing is that server requests issued via Presto return instantly.
  @g.kishore: HelixTaskExecutors are not in the query path
  @g.kishore: Can you check the broker log?
  @elon.azoulay: Seemed fine, anything in particular?
  @elon.azoulay: After restarting the servers, latency went back to 0
  @elon.azoulay: And I still see the same HelixTaskExecutor threads in `WAITING` state, so it looks like that was a red herring
  @elon.azoulay: I did notice from the heap dump on the server that we had a lot of those direct buffer refs
  @elon.azoulay: referring to mmapped segments
  @elon.azoulay: what would cause servers to be responsive to direct requests (i.e. AsyncQueryRequest) and unresponsive to broker requests?
  @g.kishore: Can you find the queries with high latency from the logs?
  @elon.azoulay: I see that not all the servers responded, e.g. only 2/3 servers responded within 25s
  @g.kishore: So the problem is on the server side
  @ssubrama: We have recently seen Helix threads waiting on a full queue in the server at LinkedIn. These things lined up with other problems, so we were not sure if this was a Helix issue or a Pinot issue. I tend to believe that we have a Pinot issue when trying to download segments from the controller, but I could be wrong. @steotia investigated this most recently, but he is away on holiday and may not respond until next year. We will watch out for more at our end. Meanwhile, is this reproducible? Is this an offline-only, offline+realtime, or realtime-only use case?
  @elon.azoulay: It was for all tables on the server. It seems to happen after the server has been running for about 1 week or so. I don't think it was the Helix threads; even after restarting they all still appear to be in `WAITING` state. All I see is that 1 server can do that to the entire cluster, but when you submit an AsyncServerRequest via Presto the servers respond instantly.
  @elon.azoulay: Thanks @ssubrama! I hope you all have a great holiday!
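On the jstack observation above (all HelixTaskExecutor threads waiting on `0x00000007188b2318`), here is a small sketch, not from the thread, that groups a jstack dump's threads by the monitor address they are waiting on. The thread-header and `waiting on <0x...>` patterns follow standard HotSpot jstack output; pipe `jstack <broker-pid>` into it to see whether many threads are piled up on a single lock.

```python
#!/usr/bin/env python3
"""Group jstack threads by the monitor they are waiting on (a sketch, not from the thread).

Parses standard HotSpot jstack output, e.g. lines like
    - waiting on <0x00000007188b2318> (a java.lang.Object)
and reports how many threads wait on each monitor address, so a pile-up such as
the HelixTaskExecutor threads above is easy to spot.
"""
import re
import sys
from collections import defaultdict

THREAD_HEADER = re.compile(r'^"(?P<name>[^"]+)"')
WAITING_ON = re.compile(
    r'-\s+(?:waiting on|waiting to lock|parking to wait for)\s+<(?P<addr>0x[0-9a-f]+)>'
)


def group_waiters(dump: str) -> dict:
    """Map each monitor address to the list of thread names waiting on it."""
    waiters = defaultdict(list)
    current = None
    for line in dump.splitlines():
        header = THREAD_HEADER.match(line)
        if header:
            current = header.group("name")
            continue
        wait = WAITING_ON.search(line)
        if wait and current:
            waiters[wait.group("addr")].append(current)
    return waiters


if __name__ == "__main__":
    dump = sys.stdin.read()  # e.g. jstack <broker-pid> | python3 group_waiters.py
    for addr, names in sorted(group_waiters(dump).items(), key=lambda kv: -len(kv[1])):
        print(f"{addr}: {len(names)} threads, e.g. {names[:3]}")
```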
@elon.azoulay: .
@rupaligwen: @rupaligwen has joined the channel
@avg1206: @avg1206 has joined the channel
@brijeshbhalodiya.bb: @brijeshbhalodiya.bb has joined the channel

#pinot-s3


@brijeshbhalodiya.bb: @brijeshbhalodiya.bb has joined the channel

#docs


@brijeshbhalodiya.bb: @brijeshbhalodiya.bb has joined the channel

#pinot-dev


@mathur.amol: @mathur.amol has joined the channel

#getting-started


@mathur.amol: @mathur.amol has joined the channel
@brijeshbhalodiya.bb: @brijeshbhalodiya.bb has joined the channel