[jira] [Updated] (HBASE-24815) hbase-connectors mvn install error

2023-10-17 Thread Tak-Lon (Stephen) Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/HBASE-24815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tak-Lon (Stephen) Wu updated HBASE-24815:
-
Fix Version/s: hbase-connectors-1.0.1
   (was: hbase-connectors-1.1.0)

> hbase-connectors mvn install error
> --
>
> Key: HBASE-24815
> URL: https://issues.apache.org/jira/browse/HBASE-24815
> Project: HBase
>  Issue Type: Bug
>  Components: hbase-connectors
>Reporter: leookok
>Assignee: Mate Szalay-Beko
>Priority: Blocker
> Fix For: hbase-connectors-1.0.1
>
>
> *When running this Maven command line:*
> mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> the build fails with:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
>  overloaded method value addTaskCompletionListener with alternatives:
>   (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
>   (listener: 
> org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
>  does not take type parameters
> {color:red}[ERROR] {color}one error found
> *but building with spark.version=2.4.0 succeeds:*
> mvn -Dspark.version=2.4.0 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
>  
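> The version dependence comes from a TaskContext API change: Spark 2.4 declares 
> the closure overload of addTaskCompletionListener with a type parameter 
> (addTaskCompletionListener[U](f: TaskContext => U)), while Spark 2.2 only has a 
> non-generic (TaskContext => Unit) overload. A minimal sketch of the difference 
> (an illustrative helper, not the actual hbase-connectors code):
> {code:scala}
> import org.apache.spark.TaskContext
> 
> def registerCleanup(context: TaskContext)(close: () => Unit): Unit = {
>   // Compiles on Spark 2.4+ only; Spark 2.2 rejects the explicit type
>   // parameter with "overloaded method ... does not take type parameters":
>   context.addTaskCompletionListener[Unit](_ => close())
> 
>   // Omitting the type parameter compiles on both versions: 2.2 picks its
>   // (TaskContext => Unit) overload, 2.4 infers U = Unit on the generic one.
>   context.addTaskCompletionListener((_: TaskContext) => close())
> }
> {code}
> 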
> *Another attempt*
> mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 
>  -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> fails with two errors:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
>  object SparkHadoopUtil in package deploy cannot be accessed in package 
> org.apache.spark.deploy
> [ERROR] [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
>  not found: value SparkHadoopUtil
> {color:red}[ERROR]{color} two errors found
> Looking at the [Spark source on 
> GitHub|https://github.com/apache/spark/blob/e1ea806b3075d279b5f08a29fe4c1ad6d3c4191a/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala],
> SparkHadoopUtil is declared private[spark]:
> {code:scala}
> private[spark] class SparkHadoopUtil extends Logging {}
> {code}
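> 
> A possible workaround sketch for the Spark 3 failure, assuming (as in older 
> hbase-spark code) SparkHadoopUtil was only used to fetch the current user's 
> Hadoop credentials; the hypothetical helper below is not necessarily the 
> committed fix. Since org.apache.spark.deploy.SparkHadoopUtil is package-private 
> in Spark 3.x, the same information can be read from the Hadoop API directly:
> {code:scala}
> import org.apache.hadoop.security.{Credentials, UserGroupInformation}
> 
> // Read the current user's Hadoop credentials straight from
> // UserGroupInformation, with no dependency on the package-private
> // SparkHadoopUtil.
> def currentUserCredentials(): Credentials =
>   UserGroupInformation.getCurrentUser.getCredentials
> {code}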



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (HBASE-24815) hbase-connectors mvn install error

2021-02-02 Thread Mate Szalay-Beko (Jira)


 [ 
https://issues.apache.org/jira/browse/HBASE-24815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mate Szalay-Beko updated HBASE-24815:
-
Fix Version/s: connector-1.1.0

> hbase-connectors mvn install error
> --
>
> Key: HBASE-24815
> URL: https://issues.apache.org/jira/browse/HBASE-24815
> Project: HBase
>  Issue Type: Bug
>  Components: hbase-connectors
>Reporter: leookok
>Assignee: Mate Szalay-Beko
>Priority: Blocker
> Fix For: connector-1.1.0
>
>
> *When running this Maven command line:*
> mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> the build fails with:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
>  overloaded method value addTaskCompletionListener with alternatives:
>   (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
>   (listener: 
> org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
>  does not take type parameters
> {color:red}[ERROR] {color}one error found
> *but building with spark.version=2.4.0 succeeds:*
> mvn -Dspark.version=2.4.0 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
>  
> *Another attempt*
> mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 
>  -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> fails with two errors:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
>  object SparkHadoopUtil in package deploy cannot be accessed in package 
> org.apache.spark.deploy
> [ERROR] [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
>  not found: value SparkHadoopUtil
> {color:red}[ERROR]{color} two errors found
> Looking at the [Spark source on 
> GitHub|https://github.com/apache/spark/blob/e1ea806b3075d279b5f08a29fe4c1ad6d3c4191a/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala],
> SparkHadoopUtil is declared private[spark]:
> {code:scala}
> private[spark] class SparkHadoopUtil extends Logging {}
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (HBASE-24815) hbase-connectors mvn install error

2020-08-04 Thread leookok (Jira)


 [ 
https://issues.apache.org/jira/browse/HBASE-24815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

leookok updated HBASE-24815:

Description: 
*When running this Maven command line:*

mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
 overloaded method value addTaskCompletionListener with alternatives:
  (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
  (listener: 
org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
 does not take type parameters
{color:red}[ERROR] {color}one error found

*but building with spark.version=2.4.0 succeeds:*
mvn -Dspark.version=2.4.0 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
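
The version dependence comes from a TaskContext API change: Spark 2.4 declares 
the closure overload of addTaskCompletionListener with a type parameter 
(addTaskCompletionListener[U](f: TaskContext => U)), while Spark 2.2 only has a 
non-generic (TaskContext => Unit) overload. A minimal sketch of the difference 
(an illustrative helper, not the actual hbase-connectors code):
{code:scala}
import org.apache.spark.TaskContext

def registerCleanup(context: TaskContext)(close: () => Unit): Unit = {
  // Compiles on Spark 2.4+ only; Spark 2.2 rejects the explicit type
  // parameter with "overloaded method ... does not take type parameters":
  context.addTaskCompletionListener[Unit](_ => close())

  // Omitting the type parameter compiles on both versions: 2.2 picks its
  // (TaskContext => Unit) overload, 2.4 infers U = Unit on the generic one.
  context.addTaskCompletionListener((_: TaskContext) => close())
}
{code}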
 
*Another attempt*
mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12  
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

fails with two errors:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
 object SparkHadoopUtil in package deploy cannot be accessed in package 
org.apache.spark.deploy
[ERROR] [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
 not found: value SparkHadoopUtil
{color:red}[ERROR]{color} two errors found

Looking at the [Spark source on 
GitHub|https://github.com/apache/spark/blob/e1ea806b3075d279b5f08a29fe4c1ad6d3c4191a/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala],
SparkHadoopUtil is declared private[spark]:
{code:scala}
private[spark] class SparkHadoopUtil extends Logging {}
{code}
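
A possible workaround sketch for the Spark 3 failure, assuming (as in older 
hbase-spark code) SparkHadoopUtil was only used to fetch the current user's 
Hadoop credentials; the hypothetical helper below is not necessarily the 
committed fix. Since org.apache.spark.deploy.SparkHadoopUtil is package-private 
in Spark 3.x, the same information can be read from the Hadoop API directly:
{code:scala}
import org.apache.hadoop.security.{Credentials, UserGroupInformation}

// Read the current user's Hadoop credentials straight from
// UserGroupInformation, with no dependency on the package-private
// SparkHadoopUtil.
def currentUserCredentials(): Credentials =
  UserGroupInformation.getCurrentUser.getCredentials
{code}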

  was:
*When running this Maven command line:*

mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
 overloaded method value addTaskCompletionListener with alternatives:
  (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
  (listener: 
org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
 does not take type parameters
{color:red}[ERROR] {color}one error found
 
*Another attempt*
mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12  
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

fails with two errors:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
 object SparkHadoopUtil in package deploy cannot be accessed in package 
org.apache.spark.deploy
[ERROR] [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
 not found: value SparkHadoopUtil
{color:red}[ERROR]{color} two errors found

> hbase-connectors mvn install error
> --
>
> Key: HBASE-24815
> URL: https://issues.apache.org/jira/browse/HBASE-24815
> Project: HBase
>  Issue Type: Bug
>  Components: hbase-connectors
>Reporter: leookok
>Priority: Blocker
>
> *When running this Maven command line:*
> mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> the build fails with:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
>  overloaded method value addTaskCompletionListener with alternatives:
>   (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
>   (listener: 
> org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
>  does not take type parameters
> {color:red}[ERROR] {color}one error found
> *but building with spark.version=2.4.0 succeeds:*
> mvn -Dspark.version=2.4.0 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
>  
> *Another attempt*
> mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 
>  -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> fails with two errors:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
>  object SparkHadoopUtil in package deploy cannot be accessed in package 
> org.apache.spark.deploy
> [ERROR] [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
>  not found: value SparkHadoopUtil
> {color:red}[ERROR]{color} two errors found

[jira] [Updated] (HBASE-24815) hbase-connectors mvn install error

2020-08-04 Thread leookok (Jira)


 [ 
https://issues.apache.org/jira/browse/HBASE-24815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

leookok updated HBASE-24815:

Description: 
*When running this Maven command line:*

mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
 overloaded method value addTaskCompletionListener with alternatives:
  (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
  (listener: 
org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
 does not take type parameters
{color:red}[ERROR] {color}one error found
 
*Another attempt*
mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12  
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

fails with two errors:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
 object SparkHadoopUtil in package deploy cannot be accessed in package 
org.apache.spark.deploy
[ERROR] [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
 not found: value SparkHadoopUtil
{color:red}[ERROR]{color} two errors found

  was:
*When running this Maven command line:*

mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with:

{color:red}[ERROR]{color} [Error] 
F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
 overloaded method value addTaskCompletionListener with alternatives:
  (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
  (listener: 
org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
 does not take type parameters
{color:red}[ERROR] {color}one error found
 
*Another attempt*
mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12  
-Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

fails with two errors:

{color:red}[ERROR]{color} [Error] 
F:\projects\git-hub\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
 object SparkHadoopUtil in package deploy cannot be accessed in package 
org.apache.spark.deploy
[ERROR] [Error] 
F:\projects\git-hub\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
 not found: value SparkHadoopUtil
{color:red}[ERROR]{color} two errors found

> hbase-connectors mvn install error
> --
>
> Key: HBASE-24815
> URL: https://issues.apache.org/jira/browse/HBASE-24815
> Project: HBase
>  Issue Type: Bug
>  Components: hbase-connectors
>Reporter: leookok
>Priority: Blocker
>
> *When running this Maven command line:*
> mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 
> -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> the build fails with:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216:
>  overloaded method value addTaskCompletionListener with alternatives:
>   (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext 
>   (listener: 
> org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
>  does not take type parameters
> {color:red}[ERROR] {color}one error found
>  
> *Another attempt*
> mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 
>  -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> fails with two errors:
> {color:red}[ERROR]{color} [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439:
>  object SparkHadoopUtil in package deploy cannot be accessed in package 
> org.apache.spark.deploy
> [ERROR] [Error] 
> F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487:
>  not found: value SparkHadoopUtil
> {color:red}[ERROR]{color} two errors found



--
This message was sent by Atlassian Jira
(v8.3.4#803005)