Re: Custom Spark Error on Hadoop Cluster

2016-07-18 Thread Xiangrui Meng
Glad to hear. Could you please share your solution on the user mailing list? -Xiangrui

On Mon, Jul 18, 2016 at 2:26 AM Alger Remirata wrote:
> Hi Xiangrui,
>
> We have now solved the problem. Thanks for all the tips you've given.
>
> Best Regards,
>
> Alger
>
> On Thu,

Re: Custom Spark Error on Hadoop Cluster

2016-07-11 Thread Xiangrui Meng
(+user@spark. Please copy user@ so other people can see and help.) The error message means an MLlib jar is on the classpath but it doesn't contain ALS$StandardNNLSSolver. So either the modified jar was not deployed to the workers, or an unmodified MLlib jar is sitting in front of it on the classpath
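
[Editor's note: a minimal spark-shell sketch for checking which jar the ALS class is actually loaded from, on the driver and on each executor. It assumes a live SparkContext `sc` and uses the stock public ALS class as a stand-in for the modified one; if it prints the stock spark-mllib jar rather than the custom build, the unmodified jar is shadowing it.]

    // Which jar does the driver load ALS from?
    val driverSrc = classOf[org.apache.spark.ml.recommendation.ALS]
      .getProtectionDomain.getCodeSource.getLocation
    println(s"Driver loads ALS from: $driverSrc")

    // Repeat the check on the executors (one task per default partition).
    sc.parallelize(1 to sc.defaultParallelism, sc.defaultParallelism)
      .map { _ =>
        classOf[org.apache.spark.ml.recommendation.ALS]
          .getProtectionDomain.getCodeSource.getLocation.toString
      }
      .distinct().collect()
      .foreach(loc => println(s"Executor loads ALS from: $loc"))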

Re: Custom Spark Error on Hadoop Cluster

2016-07-07 Thread Xiangrui Meng
This seems like a deployment or dependency issue. Please check the following:

1. The unmodified Spark jars are not on the classpath (e.g., they already existed on the cluster or were pulled in by other packages).
2. The modified jars were indeed deployed to both master and slave nodes (a quick check is sketched after this message).

On Tue, Jul 5, 2016 at
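
[Editor's note: a hedged way to verify point 2 from spark-shell is to ask every executor whether the custom class resolves. This assumes a live SparkContext `sc`; the fully-qualified class name below is an assumption based on where the stock NNLS solver lives, so substitute the actual name of the modified class.]

    // Assumed class name -- replace with the real modified solver class.
    val className = "org.apache.spark.ml.recommendation.ALS$StandardNNLSSolver"

    // Probe each executor: does the class resolve there?
    val counts = sc.parallelize(1 to sc.defaultParallelism, sc.defaultParallelism)
      .map { _ =>
        try { Class.forName(className); "found" }
        catch { case _: ClassNotFoundException => "missing" }
      }
      .countByValue()
    println(counts)  // all "found" => the modified jar reached every executor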