[ 
https://issues.apache.org/jira/browse/SPARK-46072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmitry Kravchuk updated SPARK-46072:
------------------------------------
    Summary: Missing .jars when applying code to spark-connect  (was: Missing .jars when trying to apply code to spark-connect)

> Missing .jars when applying code to spark-connect
> -------------------------------------------------
>
>                 Key: SPARK-46072
>                 URL: https://issues.apache.org/jira/browse/SPARK-46072
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.4.1
>         Environment: python 3.9
> scala 2.12
> spark 3.4.1
> hdfs 3.1.2
> hive 3.1.3
>            Reporter: Dmitry Kravchuk
>            Priority: Major
>             Fix For: 3.4.2, 3.5.1
>
>
> I've built Spark with the following Maven command for our on-prem Hadoop cluster:
> {code:bash}
> ./build/mvn -Pyarn -Pkubernetes -Dhadoop.version=3.1.2 -Pscala-2.12 -Phive \
>   -Phive-thriftserver -DskipTests clean package
> {code}
>  
> Then I start the Connect server like this:
> {code:bash}
> ./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.4.1
> {code}
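>  
> Since Spark was built locally, I assume (just a sketch, I have not verified it) that the Ivy download could be skipped entirely by pointing the server at the locally built connect jar; the path below is my guess at where the 3.4 Maven build places it:
> {code:bash}
> # Hypothetical local path from the Maven build above; adjust to wherever the jar actually lands.
> ./sbin/start-connect-server.sh \
>   --jars connector/connect/server/target/spark-connect_2.12-3.4.1.jar
> {code}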
>  
> When I try to run any code after connecting with the command below, I always get an error on the connect-server side:
> {code:bash}
> ./bin/pyspark --remote "sc://localhost"
> {code}
> Error: 
> {code:bash}
>           /home/zeppelin/.ivy2/local/org.apache.spark/spark-connect_2.12/3.4.1/jars/spark-connect_2.12.jar
>         ==== central: tried
>           https://repo1.maven.org/maven2/org/apache/spark/spark-connect_2.12/3.4.1/spark-connect_2.12-3.4.1.pom
>           -- artifact org.apache.spark#spark-connect_2.12;3.4.1!spark-connect_2.12.jar:
>           https://repo1.maven.org/maven2/org/apache/spark/spark-connect_2.12/3.4.1/spark-connect_2.12-3.4.1.jar
>         ==== spark-packages: tried
>           https://repos.spark-packages.org/org/apache/spark/spark-connect_2.12/3.4.1/spark-connect_2.12-3.4.1.pom
>           -- artifact org.apache.spark#spark-connect_2.12;3.4.1!spark-connect_2.12.jar:
>           https://repos.spark-packages.org/org/apache/spark/spark-connect_2.12/3.4.1/spark-connect_2.12-3.4.1.jar
>                 ::::::::::::::::::::::::::::::::::::::::::::::
>                 ::          UNRESOLVED DEPENDENCIES         ::
>                 ::::::::::::::::::::::::::::::::::::::::::::::
>                 :: org.apache.spark#spark-connect_2.12;3.4.1: not found
>                 ::::::::::::::::::::::::::::::::::::::::::::::
> {code}
>  
> Where am I wrong? I thought it was a firewall issue, but it isn't, because I already set the http_proxy and https_proxy variables with my own credentials.
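> 
> One thing I am not sure about (an assumption on my side): as far as I know, the Ivy resolution happens inside the JVM started by spark-submit, and the JVM does not read the http_proxy/https_proxy environment variables, so the proxy might need to be passed as Java system properties instead. A rough sketch of what I mean:
> {code:bash}
> # proxy.example.com:3128 is a placeholder for the real proxy host and port.
> export SPARK_SUBMIT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=3128"
> ./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.4.1
> {code}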



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
