Unable to pickle pySpark PipelineModel

2020-12-10 Thread Pralabh Kumar
Hi Dev, User. I want to store Spark ML models in a database so that I can reuse them later, but I am unable to pickle them. However, while using Scala I am able to convert them into a byte-array stream. So, for example, I am able to do something like the following in Scala but not in Python: val modelToByteArray…
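A minimal Scala sketch of the byte-array conversion the sender describes, assuming `model` is an already fitted PipelineModel; the helper name modelToByteArray echoes the truncated snippet above:

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream}
import org.apache.spark.ml.PipelineModel

// PipelineModel mixes in java.io.Serializable (via Params), so plain
// Java serialization works on the Scala side.
def modelToByteArray(model: PipelineModel): Array[Byte] = {
  val bos = new ByteArrayOutputStream()
  val oos = new ObjectOutputStream(bos)
  try oos.writeObject(model) finally oos.close()
  bos.toByteArray // this blob can be stored in a database column
}
```

On the Python side, a PipelineModel wraps a JVM object that pickle cannot traverse, which is why the usual route there is model.save(path) / PipelineModel.load(path) rather than pickling.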

Re: mysql connector java issue

2020-12-10 Thread lec ssmi
If you cannot assemble the JDBC driver jar into your application jar package, you can put the driver jar on the Spark classpath, generally $SPARK_HOME/jars or $SPARK_HOME/lib. Artemis User wrote on Fri, Dec 11, 2020 at 5:21 AM: > What happened was that you made the mysql jar file only available to the…
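A small diagnostic sketch for the placement described above, assuming a `spark` session is in scope (as in spark-shell): once the jar sits in $SPARK_HOME/jars on every node, the driver class should resolve inside an executor task.

```scala
// Run a single task on an executor and try to load the driver class;
// "missing" means the jar is not on the executor classpath.
val status = spark.sparkContext.parallelize(Seq(1), 1).map { _ =>
  try { Class.forName("com.mysql.jdbc.Driver"); "found" }
  catch { case _: ClassNotFoundException => "missing" }
}.collect().head
println(s"MySQL driver on executors: $status")
```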

Re: mysql connector java issue

2020-12-10 Thread Artemis User
What happened was that you made the mysql jar file only available to the Spark driver, not the executors. Use the --jars parameter instead of --driver-class-path to specify your third-party jar files, or copy the third-party jar files to Spark's jars directory in your HDFS, and specify…
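To illustrate the difference, a sketch assuming the driver jar is shipped with --jars so both the driver and the executors can see it (the URL, table, and credentials here are hypothetical):

```scala
// Submitted with: spark-submit --jars mysql-connector-java-5.1.24-bin.jar ...
// With the jar distributed to the executors, a plain JDBC read works.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://dbhost:3306/mydb")
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "mytable")
  .option("user", "user")
  .option("password", "secret")
  .load()
df.show()
```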

Migration issue upgrading from Spark 2.4.5 to spark 3.0.1

2020-12-10 Thread Nathan Kronenfeld
Hi, everyone. We are in the process of upgrading our Spark version and are hitting what appears to be the bug described in https://issues.apache.org/jira/browse/SPARK-29497, namely that when we pass anonymous functions to the workers, we get an error: Caused by: java.lang.ClassCastException:…
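For illustration, one workaround sometimes applied to this class of SerializedLambda cast failure is to replace the anonymous function with a named, serializable class, so the closure ships as an ordinary object rather than a lambda (a sketch, not the resolution recorded in the JIRA; `spark` is an assumed SparkSession):

```scala
// A named Function1 instead of a lambda; serializes as a plain object.
class AddOne extends (Int => Int) with Serializable {
  def apply(x: Int): Int = x + 1
}

// rdd.map(x => x + 1) can hit the SerializedLambda cast error in the
// reported setup; the named-class form avoids lambda deserialization.
val result = spark.sparkContext.parallelize(1 to 5).map(new AddOne).collect()
```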

mysql connector java issue

2020-12-10 Thread ismail elhammoud
Hello, guys. I have an issue with mysql-connector-java: even though I declared it in the sbt file, it doesn't work unless I give the whole path: spark-submit --master yarn --driver-class-path /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar ./Sdatahub-assembly-0.1.jar Regards, Isma…
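For reference, a build.sbt sketch (the coordinates match the jar version in the command above): declaring the dependency lets the code compile, but it only reaches the cluster at runtime if it ends up in the assembly jar or on the classpath, which is what the replies above address.

```scala
// build.sbt: a normal dependency is compiled against, but it reaches the
// executors only if the assembly plugin packs it into the fat jar.
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.24"
```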

Re: Implementing TableProvider in Spark 3.0

2020-12-10 Thread Timothy Bisson
Following up on this older thread. Looking at the implementation of DataFrameWriter, it doesn't seem possible to use the schema from the dataframe itself when writing out through a V2 interface? In order to pass the dataframe schema to a DataSourceV2 implementation, a custom write DataSource needs to…
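A sketch of the Spark 3.0 hook that forwards the DataFrame's schema (MyTable here is a hypothetical Table implementation): when supportsExternalMetadata() returns true, Spark passes the writing DataFrame's schema into getTable.

```scala
import java.util
import org.apache.spark.sql.connector.catalog.{Table, TableProvider}
import org.apache.spark.sql.connector.expressions.Transform
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class MyProvider extends TableProvider {
  // Accept externally supplied metadata, so on the write path Spark
  // hands the DataFrame's schema to getTable below.
  override def supportsExternalMetadata(): Boolean = true

  // Read-path fallback used when no schema is supplied by the user.
  override def inferSchema(options: CaseInsensitiveStringMap): StructType =
    new StructType()

  override def getTable(
      schema: StructType, // the DataFrame's schema when writing
      partitioning: Array[Transform],
      properties: util.Map[String, String]): Table =
    new MyTable(schema) // hypothetical Table implementation
}
```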