Can we add Python dependencies as we can do for mvn coordinates, so that we run something like pip install or download from the PyPI index?
From: Mich Talebzadeh
Sent: Wednesday, 24 November 2021 18:28
Cc: user@spark.apache.org
Subject: Re: [issue] not able to add external libs to pyspark job while using spark-submit
The easiest way to set this up is to create a dependencies.zip file.
Assuming that you have a virtual environment already set up, with a
directory called site-packages, go to that directory and just create a
minimal shell script, say package_and_zip_dependencies.sh, to do it for you.
Example:
That's not how you add a library. From the docs:
https://spark.apache.org/docs/latest/api/python/user_guide/python_packaging.html
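For context, the packaging guide linked above covers shipping Python dependencies together with the job rather than installing them on every node; a minimal invocation sketch (file names are illustrative):

```shell
# Ship a zip of dependencies with the job; --py-files distributes the archive
# to every executor and adds it to the Python path on the workers.
spark-submit \
  --master yarn \
  --py-files dependencies.zip \
  my_job.py
```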
On Wed, Nov 24, 2021 at 8:02 AM Atheer Alabdullatif wrote:
Dear Spark team,
Hope my email finds you well.
I am using pyspark 3.0 and facing an issue with adding an external library
[configparser] while running the job using [spark-submit] & [yarn].
Issue:
import configparser
ImportError: No module named configparser
21/11/24 08:54:38 INFO util.ShutdownHo
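For reference, configparser is part of the Python 3 standard library (in Python 2 it was named ConfigParser), so an ImportError like the one above usually means the worker nodes are running a Python 2 interpreter. A version-tolerant import sketch (the self-check at the end uses the Python 3 API):

```python
# The module was renamed between Python 2 and 3; fall back accordingly.
try:
    import configparser                      # Python 3 standard-library name
except ImportError:
    import ConfigParser as configparser      # Python 2 name

# Quick self-check that the imported module behaves as expected (Python 3 API).
parser = configparser.ConfigParser()
parser.read_string("[db]\nhost = localhost\n")
print(parser.get("db", "host"))  # -> localhost
```

The more robust fix is to make sure PYSPARK_PYTHON points at the same Python 3 interpreter on every node, so the driver and executors agree on the standard library.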