[ https://issues.apache.org/jira/browse/PIG-5448?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Koji Noguchi resolved PIG-5448.
-------------------------------
    Fix Version/s: 0.19.0
     Hadoop Flags: Reviewed
       Resolution: Fixed

Thanks for the review, Rohini!
Committed to trunk.

> All TestHBaseStorage tests failing on pig-on-spark3
> ---------------------------------------------------
>
>                 Key: PIG-5448
>                 URL: https://issues.apache.org/jira/browse/PIG-5448
>             Project: Pig
>          Issue Type: Bug
>          Components: spark
>            Reporter: Koji Noguchi
>            Assignee: Koji Noguchi
>            Priority: Minor
>             Fix For: 0.19.0
>
>         Attachments: pig-5448-v01.patch
>
>
> For Pig on Spark 3 (with PIG-5439), all of the TestHBaseStorage unit tests are failing with:
> {noformat}
> org.apache.pig.PigException: ERROR 1002: Unable to store alias b
>     at org.apache.pig.PigServer.storeEx(PigServer.java:1127)
>     at org.apache.pig.PigServer.store(PigServer.java:1086)
>     at org.apache.pig.test.TestHBaseStorage.testStoreToHBase_1_with_delete(TestHBaseStorage.java:1251)
> Caused by: org.apache.pig.impl.plan.VisitorException: ERROR 0: fail to get the rdds of this spark operator:
>     at org.apache.pig.backend.hadoop.executionengine.spark.JobGraphBuilder.visitSparkOp(JobGraphBuilder.java:115)
>     at org.apache.pig.backend.hadoop.executionengine.spark.plan.SparkOperator.visit(SparkOperator.java:140)
>     at org.apache.pig.backend.hadoop.executionengine.spark.plan.SparkOperator.visit(SparkOperator.java:37)
>     at org.apache.pig.impl.plan.DependencyOrderWalker.walk(DependencyOrderWalker.java:87)
>     at org.apache.pig.impl.plan.PlanVisitor.visit(PlanVisitor.java:46)
>     at org.apache.pig.backend.hadoop.executionengine.spark.SparkLauncher.launchPig(SparkLauncher.java:241)
>     at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:290)
>     at org.apache.pig.PigServer.launchPlan(PigServer.java:1479)
>     at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1464)
>     at org.apache.pig.PigServer.storeEx(PigServer.java:1123)
> Caused by: java.lang.RuntimeException: No task metrics available for jobId 0
>     at org.apache.pig.tools.pigstats.spark.SparkJobStats.collectStats(SparkJobStats.java:109)
>     at org.apache.pig.tools.pigstats.spark.SparkPigStats.addJobStats(SparkPigStats.java:77)
>     at org.apache.pig.tools.pigstats.spark.SparkStatsUtil.waitForJobAddStats(SparkStatsUtil.java:73)
>     at org.apache.pig.backend.hadoop.executionengine.spark.JobGraphBuilder.sparkOperToRDD(JobGraphBuilder.java:225)
>     at org.apache.pig.backend.hadoop.executionengine.spark.JobGraphBuilder.visitSparkOp(JobGraphBuilder.java:112)
> {noformat}
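
For context, the trace bottoms out in SparkJobStats.collectStats, which throws once no TaskMetrics have been recorded for the finished job. The actual fix is the attached pig-5448-v01.patch; purely as an illustrative sketch of the kind of listener-side bookkeeping involved, the code below shows a Spark listener that buffers per-job TaskMetrics and falls back to empty stats instead of failing. Class, field, and method names here are hypothetical (this is not Pig's JobGraphBuilder/SparkJobStats code), and it assumes one Spark job runs at a time, as in these unit tests.

{noformat}
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.spark.executor.TaskMetrics;
import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerJobStart;
import org.apache.spark.scheduler.SparkListenerTaskEnd;

/**
 * Illustrative sketch only: buffers TaskMetrics per Spark job so a stats
 * collector can degrade gracefully rather than throwing
 * "No task metrics available for jobId N".
 */
public class TaskMetricsBuffer extends SparkListener {

    private final Map<Integer, List<TaskMetrics>> metricsByJob = new HashMap<>();
    private int currentJobId = -1;  // assumes one job at a time, as in the unit tests

    @Override
    public synchronized void onJobStart(SparkListenerJobStart jobStart) {
        // Remember which job is running and prepare its metrics bucket.
        currentJobId = jobStart.jobId();
        metricsByJob.putIfAbsent(currentJobId, new ArrayList<TaskMetrics>());
    }

    @Override
    public synchronized void onTaskEnd(SparkListenerTaskEnd taskEnd) {
        // Record the finished task's metrics against the current job.
        TaskMetrics tm = taskEnd.taskMetrics();
        if (tm != null && currentJobId >= 0) {
            metricsByJob.get(currentJobId).add(tm);
        }
    }

    /** Returns buffered metrics, or an empty list instead of failing the job. */
    public synchronized List<TaskMetrics> metricsFor(int jobId) {
        List<TaskMetrics> tms = metricsByJob.get(jobId);
        return (tms != null) ? tms : new ArrayList<TaskMetrics>();
    }
}
{noformat}

Such a listener would be registered once with sparkContext.addSparkListener(...) before the job is submitted, and the stats layer would read metricsFor(jobId) after the job completes, reporting empty stats when nothing was collected.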



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
