Hi all,
I intend to create a Docker image with Python 3.1.1 and Java 8 that
includes Python libraries for Data Science. Other versions with Java 11
will come later. The build process is automated with specific Dockerfiles
for different purposes.
I intend to install the following packages
Hello,
This question has been addressed on Stack Overflow using the Spark shell,
but not PySpark.
I found in the Spark SQL documentation that in PySpark I can load a JAR
through my SparkSession config, for example:
spark = SparkSession\
    .builder\
    .appName("appname")\
    .config(
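For reference, a minimal sketch of how that builder is usually completed, assuming the standard "spark.jars" config property; the JAR path below is a placeholder, not taken from the original message:

from pyspark.sql import SparkSession

# Placeholder path; replace with the JAR you actually want on the
# driver and executor classpaths.
jar_path = "/path/to/example.jar"

spark = SparkSession \
    .builder \
    .appName("appname") \
    .config("spark.jars", jar_path) \
    .getOrCreate()

With this, the JAR is distributed to the driver and executors when the session starts, so its classes are available to the SQL/DataFrame code run through that SparkSession.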