Re: Pyspark ML model Save Error

2022-11-16 Thread Raja bhupati
Share more details on error to help suggesting solutions

On Wed, Nov 16, 2022, 22:13 Artemis User  wrote:

> What problems did you encounter?  Most likely your problem is
> related to the model object being saved across different partitions.  If that's
> the case, just apply the dataframe's coalesce(1) method before saving the
> model to a shared disk drive...
>
> On 11/16/22 1:51 AM, Vajiha Begum S A wrote:
> > Hi,
> > This is Vajiha, Senior Research Analyst. I'm working on predictive
> > analysis with PySpark ML models. Working with Spark's features in
> > Python has been quite good. However, I'm having trouble saving
> > trained PySpark ML models. I have read many articles, Stack Overflow
> > posts, and Spark forum comments, and applied all of those
> > suggestions, but I'm still unable to save the ML model.
> > Kindly help us solve this issue so we can continue working with
> > Spark. I hope I will get support from the Spark team to resolve my
> > issue. Thanks in advance.
> >
> > Spark version- 3.3.0 (we are using this version)
> >
> > Regards,
> > Vajiha Begum
> > Sr.Research Analyst
> > Maestrowiz Solutions Pvt. Ltd
> > India
> >
>
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: Pyspark ML model Save Error

2022-11-16 Thread Artemis User
What problems did you encounter?  Most likely your problem is 
related to the model object being saved across different partitions.  If that's 
the case, just apply the dataframe's coalesce(1) method before saving the 
model to a shared disk drive...


On 11/16/22 1:51 AM, Vajiha Begum S A wrote:

Hi,
This is Vajiha, Senior Research Analyst. I'm working on predictive 
analysis with PySpark ML models. Working with Spark's features in 
Python has been quite good. However, I'm having trouble saving 
trained PySpark ML models. I have read many articles, Stack Overflow 
posts, and Spark forum comments, and applied all of those 
suggestions, but I'm still unable to save the ML model.
Kindly help us solve this issue so we can continue working with 
Spark. I hope I will get support from the Spark team to resolve my 
issue. Thanks in advance.


Spark version- 3.3.0 (we are using this version)

Regards,
Vajiha Begum
Sr.Research Analyst
Maestrowiz Solutions Pvt. Ltd
India




-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org