Github user auskalia commented on the issue:
https://github.com/apache/spark/pull/17919
Hi @mpjlu , you are right. But sometimes we have to use
several Spark jobs to finish our work, especially when resources in the
Hadoop cluster are insufficient. Due to save and reload
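(The comment is cut off here. As a hedged guess at the save-and-reload workflow it refers to, a minimal Scala sketch using mllib's `MatrixFactorizationModel.save`/`load`; the helper names and HDFS path are illustrative, not from the original comment.)

```scala
import org.apache.spark.SparkContext
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel

// Job 1: persist the trained ALS model to a shared path (e.g. HDFS)
// so a later, separately scheduled Spark job can pick it up.
def saveModel(sc: SparkContext, model: MatrixFactorizationModel, path: String): Unit =
  model.save(sc, path)

// Job 2: reload the persisted model and continue with recommendation work.
def loadModel(sc: SparkContext, path: String): MatrixFactorizationModel =
  MatrixFactorizationModel.load(sc, path)

// saveModel(sc, model, "hdfs:///models/als")           // hypothetical path
// val reloaded = loadModel(sc, "hdfs:///models/als")
```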
Github user auskalia commented on the issue:
https://github.com/apache/spark/pull/17919
Hi @MLnick , we find that simply repartitioning userFeatures and
productFeatures can significantly improve the efficiency of ALS
recommendForAll(); see the sketch below.
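(A minimal sketch of what such a repartition might look like, assuming the mllib `MatrixFactorizationModel`; the `repartitionModel` helper and the partition count are illustrative assumptions, not part of the original comment.)

```scala
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.storage.StorageLevel

// Rebuild the model around repartitioned (and cached) factor RDDs before
// calling recommendProductsForUsers, which runs the all-pairs
// recommendForAll computation internally. numPartitions is a tuning knob
// to choose based on cluster size; 200 is only illustrative.
def repartitionModel(model: MatrixFactorizationModel,
                     numPartitions: Int = 200): MatrixFactorizationModel = {
  val userFactors = model.userFeatures
    .repartition(numPartitions)
    .persist(StorageLevel.MEMORY_AND_DISK)
  val productFactors = model.productFeatures
    .repartition(numPartitions)
    .persist(StorageLevel.MEMORY_AND_DISK)
  new MatrixFactorizationModel(model.rank, userFactors, productFactors)
}

// val topK = repartitionModel(model).recommendProductsForUsers(10)
```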
Here is our procedure:
1