#general
@humengyuk18: Hi, I’m getting an error when using lookup on a local cluster, does anyone know how to solve it? ```[
  {
    "errorCode": 200,
    "message": "QueryExecutionError:
org.apache.pinot.core.query.exception.BadQueryRequestException: Caught exception while initializing transform function: lookup
	at org.apache.pinot.core.operator.transform.function.TransformFunctionFactory.get(TransformFunctionFactory.java:207)
	at org.apache.pinot.core.operator.transform.TransformOperator.<init>(TransformOperator.java:56)
	at org.apache.pinot.core.plan.TransformPlanNode.run(TransformPlanNode.java:52)
	at org.apache.pinot.core.plan.SelectionPlanNode.run(SelectionPlanNode.java:83)
	at org.apache.pinot.core.plan.CombinePlanNode.run(CombinePlanNode.java:100)
	at org.apache.pinot.core.plan.InstanceResponsePlanNode.run(InstanceResponsePlanNode.java:33)
	at org.apache.pinot.core.plan.GlobalPlanImplV0.execute(GlobalPlanImplV0.java:45)
	at org.apache.pinot.core.query.executor.ServerQueryExecutorV1Impl.processQuery(ServerQueryExecutorV1Impl.java:294)
	at org.apache.pinot.core.query.executor.ServerQueryExecutorV1Impl.processQuery(ServerQueryExecutorV1Impl.java:215)
	at org.apache.pinot.core.query.executor.QueryExecutor.processQuery(QueryExecutor.java:60)
	at org.apache.pinot.core.query.scheduler.QueryScheduler.processQueryAndSerialize(QueryScheduler.java:157)
	at org.apache.pinot.core.query.scheduler.QueryScheduler.lambda$createQueryFutureTask$0(QueryScheduler.java:141)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)"
  }
]```
@g.kishore: what's the query?
@humengyuk18: select who, lookup('mango_company', 'name', 'company_id', company_id) from fetrace_biz limit 10
@g.kishore: what's the name of the table that contains the company_id column?
@lvs.pjx: The `lookup` function works with a dimension table (dimTable). But we found that the table could not be created successfully with `isDimTable` = true.
@lvs.pjx: The condition for creating a dim table successfully with `isDimTable` is that the schema needs a primary key.
@fx19880617: @canerbalci can you help here :slightly_smiling_face:
@humengyuk18: got it, the dim table needs to have a primary key as @lvs.pjx said, thanks everyone
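For reference, a minimal sketch of what the `mango_company` dimension table setup might look like. The table and column names come from the query above; the data types, push type, and replication values are assumptions. The schema must declare `primaryKeyColumns`:
```{
  "schemaName": "mango_company",
  "dimensionFieldSpecs": [
    { "name": "company_id", "dataType": "LONG" },
    { "name": "name", "dataType": "STRING" }
  ],
  "primaryKeyColumns": ["company_id"]
}```
and the matching OFFLINE table config must set `isDimTable`:
```{
  "tableName": "mango_company",
  "tableType": "OFFLINE",
  "isDimTable": true,
  "segmentsConfig": {
    "schemaName": "mango_company",
    "segmentPushType": "REFRESH",
    "replication": "1"
  },
  "tenants": {},
  "tableIndexConfig": { "loadMode": "MMAP" },
  "metadata": {}
}```
With that in place, the query above should resolve: `lookup('mango_company', 'name', 'company_id', company_id)` takes the dim table name, the column to fetch from it, the dim table's join key, and the fact table's join column, in that order.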
@huwfoley: @huwfoley has joined the channel
@qiaoliang310: @qiaoliang310 has joined the channel
@humengyuk18: Another question, regarding using HDFS as Pinot deep storage: I have put hadoop-client-3.1.1.3.1.0.0-78.jar, hadoop-common-3.1.1.3.1.0.0-78.jar, hadoop-hdfs-3.1.1.3.1.0.0-78.jar, and hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar on the Pinot controller's classpath, but the controller is still reporting class not found for org/apache/hadoop/fs/FSDataInputStream. What other jars should I include? Below is the stack trace for this error: ```2021/01/18 10:26:32.704 INFO [ControllerStarter] [main] Initializing PinotFSFactory
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
	at java.lang.Class.getDeclaredConstructors0(Native Method)
	at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
	at java.lang.Class.getConstructor0(Class.java:3075)
	at java.lang.Class.getConstructor(Class.java:1825)
	at org.apache.pinot.spi.plugin.PluginManager.createInstance(PluginManager.java:295)
	at org.apache.pinot.spi.plugin.PluginManager.createInstance(PluginManager.java:264)
	at org.apache.pinot.spi.plugin.PluginManager.createInstance(PluginManager.java:245)
	at org.apache.pinot.spi.filesystem.PinotFSFactory.register(PinotFSFactory.java:53)
	at org.apache.pinot.spi.filesystem.PinotFSFactory.init(PinotFSFactory.java:74)
	at org.apache.pinot.controller.ControllerStarter.initPinotFSFactory(ControllerStarter.java:481)
	at org.apache.pinot.controller.ControllerStarter.setUpPinotController(ControllerStarter.java:329)
	at org.apache.pinot.controller.ControllerStarter.start(ControllerStarter.java:287)
	at org.apache.pinot.tools.service.PinotServiceManager.startController(PinotServiceManager.java:116)
	at org.apache.pinot.tools.service.PinotServiceManager.startRole(PinotServiceManager.java:91)
	at org.apache.pinot.tools.admin.command.StartServiceManagerCommand.lambda$startBootstrapServices$0(StartServiceManagerCommand.java:234)
	at org.apache.pinot.tools.admin.command.StartServiceManagerCommand.startPinotService(StartServiceManagerCommand.java:286)
	at org.apache.pinot.tools.admin.command.StartServiceManagerCommand.startBootstrapServices(StartServiceManagerCommand.java:233)
	at org.apache.pinot.tools.admin.command.StartServiceManagerCommand.execute(StartServiceManagerCommand.java:183)
	at org.apache.pinot.tools.admin.command.StartControllerCommand.execute(StartControllerCommand.java:130)
	at org.apache.pinot.tools.admin.PinotAdministrator.execute(PinotAdministrator.java:162)
	at org.apache.pinot.tools.admin.PinotAdministrator.main(PinotAdministrator.java:182)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 21 more``` And below are the startup opts: ```JAVA_OPTS -Xms256M -Xmx1G -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:+PrintGCDetails -XX:+PrintGCDateStamps
  -XX:+PrintGCApplicationStoppedTime -XX:+PrintGCApplicationConcurrentTime -Xloggc:/opt/pinot/gc-pinot-controller.log
  -Dlog4j2.configurationFile=/opt/pinot/conf/pinot-controller-log4j2.xml
  -Dplugins.dir=/opt/pinot/plugins -Dplugins.include=pinot-hdfs
  -classpath /opt/hadoop-lib/hadoop-common-3.1.1.3.1.0.0-78.jar:/opt/hadoop-lib/hadoop-client-3.1.1.3.1.0.0-78.jar:/opt/hadoop-lib/hadoop-hdfs-3.1.1.3.1.0.0-78.jar:/opt/hadoop-lib/hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar```
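For context, a hedged sketch of the controller properties that drive this code path: `PinotFSFactory.init` (visible in the stack trace above) registers a `PinotFS` implementation per URI scheme from properties like the ones below. The property names follow the usual Pinot HDFS deep-storage setup; the HDFS URI, temp dir, and Hadoop conf path are placeholders, not values from this thread.
```# pinot-controller.conf (sketch, not the poster's actual config)
controller.data.dir=hdfs://namenode:8020/pinot/controller/data
controller.local.temp.dir=/tmp/pinot/controller
pinot.controller.storage.factory.class.hdfs=org.apache.pinot.plugin.filesystem.HadoopPinotFS
pinot.controller.storage.factory.hdfs.hadoop.conf.path=/opt/hadoop/etc/hadoop
pinot.controller.segment.fetcher.protocols=file,http,hdfs
pinot.controller.segment.fetcher.hdfs.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher```
Loading `HadoopPinotFS` is what pulls in `org.apache.hadoop.fs.FSDataInputStream`, so that class has to be visible on the controller JVM's effective classpath, not just present on disk.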
@ken: I believe Pinot is built against Hadoop 2.7, so I’d try putting jars for that version of Hadoop on the classpath, not 3.1.1
@ken: Though the Hadoop 3.1.1 source also has FSDataInputStream in hadoop-common, in the same package, so switching to 2.7 seems unlikely to fix your issue, sorry.
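One thing worth checking is how the jars are being passed: a `-classpath` embedded in JAVA_OPTS can be overridden by the `-cp` that the Pinot start scripts assemble themselves, in which case the Hadoop jars never reach the controller JVM even though they exist on disk. Below is a sketch of an alternative, assuming the start scripts honor the `CLASSPATH_PREFIX` environment variable (as in the Pinot HDFS setup docs); the hadoop-auth jar name is a guess at a typical transitive dependency, and the config file path is a placeholder.
```# hadoop-common is the jar that actually contains org.apache.hadoop.fs.FSDataInputStream
export HADOOP_LIB=/opt/hadoop-lib
export CLASSPATH_PREFIX="${HADOOP_LIB}/hadoop-common-3.1.1.3.1.0.0-78.jar:\
${HADOOP_LIB}/hadoop-client-3.1.1.3.1.0.0-78.jar:\
${HADOOP_LIB}/hadoop-hdfs-3.1.1.3.1.0.0-78.jar:\
${HADOOP_LIB}/hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:\
${HADOOP_LIB}/hadoop-auth-3.1.1.3.1.0.0-78.jar"

bin/pinot-admin.sh StartController -configFileName conf/pinot-controller.conf```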
@davideberdin: @davideberdin has joined the channel
@davideberdin: Hello everybody! fantastic project :rocket: I’m totally in love with Apache Pinot :heart: keep up the great work!
@kennybastani: Welcome!
@davideberdin: Maybe I am too late to the party but I started a project for the Apache Pinot Kubernetes Operator :rocket: You can find it here:
@g.kishore: Thanks Davide! This is fantastic. Let's create an issue and continue the discussion there?
@kennybastani: Sweet!
@pavel: @pavel has joined the channel
@gstbimo: @gstbimo has joined the channel
#random
@huwfoley: @huwfoley has joined the channel
@qiaoliang310: @qiaoliang310 has joined the channel
@davideberdin: @davideberdin has joined the channel
@davideberdin: @davideberdin has left the channel
@pavel: @pavel has joined the channel
@gstbimo: @gstbimo has joined the channel
#troubleshooting
@huwfoley: @huwfoley has joined the channel
@qiaoliang310: @qiaoliang310 has joined the channel
@davideberdin: @davideberdin has joined the channel
@pavel: @pavel has joined the channel
@gstbimo: @gstbimo has joined the channel
#pinot-k8s-operator
@davideberdin: @davideberdin has joined the channel
@davideberdin: Hello there! I hope you are doing well :slightly_smiling_face: Maybe I am too late to the party but I started a project for the Apache Pinot Kubernetes Operator :rocket: You can find it here:
@fx19880617: wow
@fx19880617: Many thanks !!!!! I will take a look!
#community
@davideberdin: @davideberdin has joined the channel
#feat-pravega-connector
@pabraham.usa: @pabraham.usa has joined the channel
