Unsubscribe

2023-12-16 Thread Andrew Milkowski

unsubscribe

2021-01-24 Thread Andrew Milkowski

unsubscribe

2019-01-30 Thread Andrew Milkowski

unsubscribe

Re: freeing up memory occupied by processed Stream Blocks

2017-01-25 Thread Andrew Milkowski
/spark/streaming/dstream/DStream.scala#L463). You can control this behaviour by StreamingContext#remember to some extent. // maropu On Fri, Jan 20, 2017 at 3:17 AM, Andrew Milkowski <amgm2...@gmail.com> wrote: hello using
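The reply points at StreamingContext#remember as the knob for how long generated RDDs (and hence their stream blocks) are kept. A minimal sketch of where that call fits in a streaming app — the app name, batch interval, and 2-minute retention are illustrative, not from the thread, and running it requires a Spark 2.x deployment with the spark-streaming artifact on the classpath:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}

object RememberSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("remember-sketch")
    // 10-second micro-batches; illustrative value
    val ssc = new StreamingContext(conf, Seconds(10))

    // Ask Spark to retain generated RDDs for only 2 minutes, so old
    // stream blocks become eligible for cleanup sooner. The duration
    // here is an assumption for illustration.
    ssc.remember(Minutes(2))

    // ... create the Kinesis (or other) input DStream and transformations here ...

    ssc.start()
    ssc.awaitTermination()
  }
}
```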

freeing up memory occupied by processed Stream Blocks

2017-01-19 Thread Andrew Milkowski
hello, using spark 2.0.2, and while running a sample streaming app with kinesis I noticed (in the admin ui Storage tab) that "Stream Blocks" for each worker keeps climbing; also (on the same ui page), in the Blocks section, I see blocks such as input-0-1484753367056 below that are marked as Memory Serialized

Futures timed out after [120 seconds]

2016-02-08 Thread Andrew Milkowski
Hello, have a question: we are seeing the exceptions below, and at the moment are enabling a JVM profiler to look into gc activity on the workers. If you have any other suggestions please let us know; we don't just want to increase the rpc timeout (from 120 to, say, 600 sec) but want to get to the reason why the workers time out
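The timeouts in play here are governed by a couple of Spark settings. A spark-defaults.conf sketch — the 600s figure mirrors the value floated in the thread, and the GC-logging flags are one way to collect the profiling data the poster mentions, since raising timeouts only masks the problem if long GC pauses are the real cause:

```
# spark-defaults.conf sketch -- values are illustrative, not from the thread
spark.network.timeout            600s
spark.rpc.askTimeout             600s
# surface GC activity on executors to correlate with the timeouts
spark.executor.extraJavaOptions  -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```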

akka.tcp://spark@localhost:7077/user/MapOutputTracker akka.actor.ActorNotFound

2014-07-28 Thread Andrew Milkowski
Hello community. Using the following distros: spark: http://archive.cloudera.com/cdh5/cdh/5/spark-1.0.0-cdh5.1.0-src.tar.gz mesos: http://archive.apache.org/dist/mesos/0.19.0/mesos-0.19.0.tar.gz both assembled with scala 2.10.4 and java 7; my spark-env.sh (#!/usr/bin/env bash) looks as
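The preview cuts off before the actual spark-env.sh contents. For a Spark 1.0.x build running on Mesos, the file typically needs at least the entries below; the hostnames and paths are placeholders, not taken from the thread:

```
#!/usr/bin/env bash
# spark-env.sh sketch for Spark 1.0.x on Mesos -- all paths/hosts are illustrative
export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=hdfs://namenode:8020/frameworks/spark-1.0.0-cdh5.1.0.tar.gz
export MASTER=mesos://mesos-master:5050
```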

running Spark App on Yarn produces: Exception in thread main java.lang.NoSuchFieldException: DEFAULT_YARN_APPLICATION_CLASSPATH

2014-07-16 Thread Andrew Milkowski
Hello community, tried to run a spark app on yarn, using cloudera hadoop and spark distro (from http://archive.cloudera.com/cdh5/cdh/5) hadoop version: hadoop-2.3.0-cdh5.0.3.tar.gz spark version: spark-0.9.0-cdh5.0.3.tar.gz DEFAULT_YARN_APPLICATION_CLASSPATH is part of hadoop-api-yarn jar ...

Re: running Spark App on Yarn produces: Exception in thread main java.lang.NoSuchFieldException: DEFAULT_YARN_APPLICATION_CLASSPATH

2014-07-16 Thread Andrew Milkowski
are not actually running vs Hadoop 2 binaries. Your cluster is certainly Hadoop 2, but your client is not using the Hadoop libs you think it is (or your compiled binary is linking against Hadoop 1, which is the default for Spark -- did you change it?) On Wed, Jul 16, 2014 at 5:45 PM, Andrew Milkowski amgm2

Re: running Spark App on Yarn produces: Exception in thread main java.lang.NoSuchFieldException: DEFAULT_YARN_APPLICATION_CLASSPATH

2014-07-16 Thread Andrew Milkowski
at 5:45 PM, Andrew Milkowski amgm2...@gmail.com wrote: Hello community, tried to run a spark app on yarn, using cloudera hadoop and spark distro (from http://archive.cloudera.com/cdh5/cdh/5) hadoop version: hadoop-2.3.0-cdh5.0.3.tar.gz spark version: spark-0.9.0-cdh5.0.3.tar.gz

Re: running Spark App on Yarn produces: Exception in thread main java.lang.NoSuchFieldException: DEFAULT_YARN_APPLICATION_CLASSPATH

2014-07-16 Thread Andrew Milkowski
/share/hadoop/hdfs/lib/*, $HADOOP_YARN_HOME/share/hadoop/yarn/*, $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*</value> </property> On Wed, Jul 16, 2014 at 1:47 PM, Andrew Milkowski amgm2...@gmail.com wrote: Sandy, perfect! you saved me tons of time! added this in yarn-site.xml and the job ran
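The fix referenced in this thread amounts to defining yarn.application.classpath explicitly in yarn-site.xml. A sketch using the standard Hadoop 2 entries (the same paths the preview shows the tail of); the exact list should be checked against the CDH layout in use:

```
<!-- yarn-site.xml sketch: explicit YARN application classpath (standard Hadoop 2 entries) -->
<property>
  <name>yarn.application.classpath</name>
  <value>$HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*</value>
</property>
```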