Re: Maximum size of vector that reduce can handle

2015-01-27 Thread Boromir Widas
> …aggregation? If that is the problem, then how to force it to do aggregation after receiving each portion of data from Workers?
>
> Best regards, Alexander
>
> -----Original Message-----
> From: DB Tsai [mailto:dbt...@dbtsai.com]

Re: Maximum size of vector that reduce can handle

2015-01-27 Thread Xiangrui Meng
> …from Workers?
>
> Best regards, Alexander
>
> -----Original Message-----
> From: DB Tsai [mailto:dbt...@dbtsai.com]
> Sent: Friday, January 23, 2015 11:53 AM
> To: Ulanov, Alexander
> Cc: dev@spark.apache.org
> Subject: Re: Maximum size of vector that reduce can handle

Re: Maximum size of vector that reduce can handle

2015-01-23 Thread DB Tsai
> …then how to force it to do aggregation after receiving each portion of data from Workers?
>
> Best regards, Alexander
>
> -----Original Message-----
> From: DB Tsai [mailto:dbt...@dbtsai.com]
> Sent: Friday, January 23, 2015 11:53 AM
> To: Ulanov, Alexander
> Cc: dev@spark.apache.org
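
The question quoted above ("how to force it to do aggregation after receiving each portion of data from Workers") is the kind of thing Spark's `treeAggregate` is meant for: partial sums are combined on the executors in a few rounds, so the driver only has to merge a handful of partial results at the end. A minimal sketch, not code from this thread; the RDD contents, vector size, and partition count below are made-up values for illustration:

    import org.apache.spark.{SparkConf, SparkContext}

    object TreeAggregateSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("treeAggregate-sketch"))
        val dim = 100000                                   // assumed coefficient-vector size
        // Hypothetical RDD of per-record gradient vectors.
        val gradients = sc.parallelize(0 until 1000, 100).map(_ => Array.fill(dim)(1.0))

        // Element-wise, in-place sum; reused as both seqOp and combOp.
        val add = (acc: Array[Double], v: Array[Double]) => {
          var i = 0
          while (i < acc.length) { acc(i) += v(i); i += 1 }
          acc
        }

        // treeAggregate merges the per-partition sums on the executors in `depth`
        // rounds, so the driver only receives a few partially merged vectors.
        val sum = gradients.treeAggregate(new Array[Double](dim))(add, add, depth = 2)

        println(s"sum(0) = ${sum(0)}")
        sc.stop()
      }
    }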

RE: Maximum size of vector that reduce can handle

2015-01-23 Thread Ulanov, Alexander
…Sent: Friday, January 23, 2015 11:53 AM
To: Ulanov, Alexander
Cc: dev@spark.apache.org
Subject: Re: Maximum size of vector that reduce can handle

Hi Alexander,

When you use `reduce` to aggregate the vectors, those will actually be pulled into the driver and merged over there. Obviously, it's not scalable given you are doing…

Re: Maximum size of vector that reduce can handle

2015-01-23 Thread DB Tsai
Hi Alexander,

When you use `reduce` to aggregate the vectors, those will actually be pulled into the driver and merged over there. Obviously, it's not scalable given you are doing deep neural networks, which have so many coefficients. Please try treeReduce instead, which is what we do in linear regression.
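
For reference, a minimal sketch (an assumed example, not code from the thread) contrasting the two calls. With `reduce`, every partition's result vector is shipped to the driver, which performs all of the remaining merges itself; `treeReduce` first combines partial results on the executors, so only a few vectors ever reach the driver:

    import org.apache.spark.{SparkConf, SparkContext}

    object ReduceVsTreeReduce {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("reduce-vs-treeReduce"))
        val dim = 1000000                                // assumed coefficient-vector size
        // Hypothetical RDD of 200 partial-gradient vectors, one per partition.
        val vectors = sc.parallelize(0 until 200, 200).map(_ => Array.fill(dim)(0.5))

        // Element-wise sum of two vectors.
        val add = (a: Array[Double], b: Array[Double]) => {
          val out = new Array[Double](a.length)
          var i = 0
          while (i < a.length) { out(i) = a(i) + b(i); i += 1 }
          out
        }

        // reduce: all 200 partition results are collected to the driver, which
        // merges them one by one -- driver memory and network become the bottleneck.
        val sumOnDriver = vectors.reduce(add)

        // treeReduce: partial sums are merged on the executors in `depth` rounds,
        // so the driver only merges the last few vectors.
        val sumViaTree = vectors.treeReduce(add, depth = 2)

        println(sumOnDriver(0) == sumViaTree(0))
        sc.stop()
      }
    }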