Hi, Spark users.
When running a Spark application with lots of executors (300+), I see the following
failures:
java.net.SocketTimeoutException: Read timed out
    at java.net.SocketInputStream.socketRead0(Native Method)
    at java.net.SocketInputStream.read(SocketInputStream.java:152)
    at
on my side; not sure if the OP was having the same problem, though.
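In case it helps, a minimal sketch of how one might raise the relevant timeouts when submitting the job; the specific keys and values below are assumptions to tune for your cluster, not a confirmed fix:

import org.apache.spark.{SparkConf, SparkContext}

// Assumed workaround: give slow executors more headroom before a
// socket read is declared timed out under heavy fan-out (300+ executors).
val conf = new SparkConf()
  .setAppName("my-job")
  .set("spark.core.connection.ack.wait.timeout", "600") // seconds, default 60
  .set("spark.network.timeout", "600s")                 // umbrella network timeout
val sc = new SparkContext(conf)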
On Wed, Feb 11, 2015 at 12:03 AM, Arush Kharbanda
ar...@sigmoidanalytics.com wrote:
Hi
Can you share the code you are trying to run?
Thanks
Arush
On Wed, Feb 11, 2015 at 9:12 AM, Tianshuo Deng td...@twitter.com.invalid
wrote:
Hi,
Currently in GradientDescent.scala, weights is constructed as a dense
vector:
initialWeights = Vectors.dense(new Array[Double](numFeatures))
And numFeatures is determined in loadLibSVMFile as the max index of the
features.
But in the case of using a hash function to compute feature
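To make the concern concrete, a small sketch of the allocation difference; the 2^24 hash space and the sparse construction are my illustrative assumptions, not what GradientDescent.scala actually does:

import org.apache.spark.mllib.linalg.Vectors

// With hashed features the index space can be enormous even though
// each example only touches a handful of indices.
val numFeatures = 1 << 24  // assumed ~16M hash buckets

// Current approach: eagerly allocates 16M doubles (~128 MB) of zeros.
val denseWeights = Vectors.dense(new Array[Double](numFeatures))

// Illustrative alternative: a sparse zero vector of the same dimension
// stores nothing until an index becomes non-zero.
val sparseWeights = Vectors.sparse(numFeatures, Array.empty[Int], Array.empty[Double])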