juice416021 opened a new issue, #3357:
URL: https://github.com/apache/hop/issues/3357

   ### Apache Hop version?
   
   2.6
   
   ### Java version?
   
   11.0.21
   
   ### Operating system
   
   macOS
   
   ### What happened?
   
   I tested a new pipeline that contains only a Beam BigQuery Input transform. After clicking "Get fields", it retrieved the columns from the specified table as expected. However, when I executed the pipeline, it ran successfully with the Hop engine but failed with the Beam Direct pipeline engine.
   
   
--------------------------------------------------------------------------------------------------------------
   2023/11/06 14:28:48 - Hop - Pipeline opened.
   2023/11/06 14:28:48 - Hop - Launching pipeline [final_test5]...
   2023/11/06 14:28:48 - Hop - Started the pipeline execution.
   2023/11/06 14:28:49 - General - Created Apache Beam pipeline with name 'final-test5'
   2023/11/06 14:28:49 - General - Handled generic transform (TRANSFORM) : Table input, gets data from 0 previous transform(s), targets=0, infos=0
   2023/11/06 14:28:49 - General - Handled transform (BQ OUTPUT) : Beam BigQuery Output, gets data from Table input
   2023/11/06 14:34:27 - Hop - Pipeline opened.
   2023/11/06 14:34:27 - Hop - Launching pipeline [TEST]...
   2023/11/06 14:34:27 - Hop - Started the pipeline execution.
   2023/11/06 14:34:34 - General - Created Apache Beam pipeline with name 'test'
   2023/11/06 14:34:34 - Hop - ERROR: TEST: preparing pipeline execution failed
   2023/11/06 14:34:34 - Hop - ERROR: org.apache.hop.core.exception.HopException:
   2023/11/06 14:34:34 - Hop - Error preparing remote pipeline
   2023/11/06 14:34:34 - Hop - Error converting Hop pipeline to Beam
   2023/11/06 14:34:34 - Hop -
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.engines.BeamPipelineEngine.prepareExecution(BeamPipelineEngine.java:293)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.ui.hopgui.file.pipeline.HopGuiPipelineGraph.lambda$preparePipeline$16(HopGuiPipelineGraph.java:4537)
   2023/11/06 14:34:34 - Hop -  at java.base/java.lang.Thread.run(Thread.java:829)
   2023/11/06 14:34:34 - Hop - Caused by: java.lang.Exception: Error converting Hop pipeline to Beam
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.pipeline.HopPipelineMetaToBeamPipelineConverter.createPipeline(HopPipelineMetaToBeamPipelineConverter.java:308)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.engines.BeamPipelineEngine.prepareExecution(BeamPipelineEngine.java:262)
   2023/11/06 14:34:34 - Hop -  ... 2 more
   2023/11/06 14:34:34 - Hop - Caused by: java.lang.RuntimeException: Error in beam input transform
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.core.transform.BeamBQInputTransform.expand(BeamBQInputTransform.java:111)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.core.transform.BeamBQInputTransform.expand(BeamBQInputTransform.java:38)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:545)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:479)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:44)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:175)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.transforms.bq.BeamBQInputMeta.handleTransform(BeamBQInputMeta.java:145)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.pipeline.HopPipelineMetaToBeamPipelineConverter.handleBeamInputTransforms(HopPipelineMetaToBeamPipelineConverter.java:373)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.pipeline.HopPipelineMetaToBeamPipelineConverter.createPipeline(HopPipelineMetaToBeamPipelineConverter.java:295)
   2023/11/06 14:34:34 - Hop -  ... 3 more
   2023/11/06 14:34:34 - Hop - Caused by: java.lang.IllegalArgumentException: Invalid BigQueryIO.Read: Specifies a table with a SQL dialect preference, which only applies to queries
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.vendor.guava.v32_1_2_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:143)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.expand(BigQueryIO.java:1202)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.expand(BigQueryIO.java:903)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:545)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:479)
   2023/11/06 14:34:34 - Hop -  at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:44)
   2023/11/06 14:34:34 - Hop -  at org.apache.hop.beam.core.transform.BeamBQInputTransform.expand(BeamBQInputTransform.java:104)
   2023/11/06 14:34:34 - Hop -  ... 11 more
   
--------------------------------------------------------------------------------------------------------------
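
   The failure seems to come from the `IllegalArgumentException` at the bottom of the log: BigQueryIO rejects a read that specifies a table together with a SQL dialect preference. Below is a minimal sketch of that constraint as I understand the Beam API (this is not the Hop code itself, and the project/dataset/table names are made up):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class BqReadDialectSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Reading from a query: a SQL dialect preference is accepted here.
    p.apply("read from query",
        BigQueryIO.readTableRows()
            .fromQuery("SELECT * FROM `my-project.my_dataset.my_table`") // made-up names
            .usingStandardSql());

    // Reading from a table: combining a table spec with usingStandardSql()
    // fails at pipeline construction time with
    // "Invalid BigQueryIO.Read: Specifies a table with a SQL dialect preference,
    //  which only applies to queries" -- which matches the stack trace above
    // (BigQueryIO$TypedRead.expand called from BeamBQInputTransform.expand).
    p.apply("read from table",
        BigQueryIO.readTableRows()
            .from("my-project:my_dataset.my_table") // made-up table spec
            .usingStandardSql());

    p.run().waitUntilFinish();
  }
}
```

   So my guess is that, in my setup, the Beam BigQuery Input transform ends up applying a dialect preference even though only a table name is configured.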
   
   
   Later, I added a Table Input and a Beam BigQuery Output transform to the pipeline. The Table Input connects to a local database, and the Beam BigQuery Output connects to Google BigQuery. I'm not sure whether a fat jar is required, but I did configure one.
   
   I then ran this pipeline with the Beam Direct pipeline engine, and it also resulted in an error.
   
--------------------------------------------------------------------------------------------------------------
   2023/11/06 14:36:42 - Hop - Pipeline opened.
   2023/11/06 14:36:42 - Hop - Launching pipeline [final_test5]...
   2023/11/06 14:36:42 - Hop - Started the pipeline execution.
   2023/11/06 14:36:50 - General - Created Apache Beam pipeline with name 'final-test5'
   2023/11/06 14:36:50 - General - Handled generic transform (TRANSFORM) : Table input, gets data from 0 previous transform(s), targets=0, infos=0
   2023/11/06 14:36:50 - General - Handled transform (BQ OUTPUT) : Beam BigQuery Output, gets data from Table input
   2023/11/06 14:36:50 - final_test5 - Executing this pipeline using the Beam Pipeline Engine with run configuration 'Direct'
   2023/11/06 14:36:50 - final_test5 - ERROR: Error starting the Beam pipeline
   2023/11/06 14:36:50 - final_test5 - ERROR: org.apache.hop.core.exception.HopException:
   2023/11/06 14:36:50 - final_test5 - Error executing pipeline with runner Direct
   2023/11/06 14:36:50 - final_test5 - java.lang.NoClassDefFoundError: Could not initialize class com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub
   2023/11/06 14:36:50 - final_test5 -
   2023/11/06 14:36:50 - final_test5 -  at org.apache.hop.beam.engines.BeamPipelineEngine.executePipeline(BeamPipelineEngine.java:319)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.hop.beam.engines.BeamPipelineEngine.lambda$startThreads$0(BeamPipelineEngine.java:369)
   2023/11/06 14:36:50 - final_test5 -  at java.base/java.lang.Thread.run(Thread.java:829)
   2023/11/06 14:36:50 - final_test5 - Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.NoClassDefFoundError: Could not initialize class com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:374)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:342)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:218)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.hop.beam.engines.BeamPipelineEngine.executePipeline(BeamPipelineEngine.java:307)
   2023/11/06 14:36:50 - final_test5 -  ... 2 more
   2023/11/06 14:36:50 - final_test5 - Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub
   2023/11/06 14:36:50 - final_test5 -  at com.google.cloud.bigquery.storage.v1.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:148)
   2023/11/06 14:36:50 - final_test5 -  at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.<init>(BigQueryWriteClient.java:143)
   2023/11/06 14:36:50 - final_test5 -  at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.create(BigQueryWriteClient.java:125)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1552)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:157)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:616)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:552)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:196)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination.getDatasetService(UpdateSchemaDestination.java:371)
   2023/11/06 14:36:50 - final_test5 -  at org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination.finishBundle(UpdateSchemaDestination.java:213)
   
--------------------------------------------------------------------------------------------------------------
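
   This second failure is a `NoClassDefFoundError: Could not initialize class ...GrpcBigQueryWriteStub`. As far as I know, that message usually means the class was found but its static initializer failed earlier (for example because of a missing or conflicting transitive dependency on the classpath), and the original error is swallowed. A small diagnostic sketch I could compile and run with the same fat jar on the classpath to surface the root cause (the class name is taken from the log above; everything else is just an assumption on my side):

```java
public class GrpcWriteStubCheck {
  public static void main(String[] args) {
    try {
      // Force-load the class that fails in the log; if its static initializer
      // throws, this prints the underlying ExceptionInInitializerError instead
      // of the later "Could not initialize class" NoClassDefFoundError.
      Class.forName("com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub");
      System.out.println("GrpcBigQueryWriteStub initialized fine");
    } catch (Throwable t) {
      t.printStackTrace();
    }
  }
}
```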
   
   
![image](https://github.com/apache/hop/assets/130071336/197f78b6-4677-421c-a1e2-daeebaaf220c)
   
![image](https://github.com/apache/hop/assets/130071336/1dffb9d8-1b54-4441-a8d5-d8a7ea19515b)
   
   
   
   ### Issue Priority
   
   Priority: 1
   
   ### Issue Component
   
   Component: Beam

