Spark 1.5.2.
At 2015-11-26 13:19:39, "张志强(旺轩)" wrote:
What’s your spark version?
From: wyphao.2007 [mailto:wyphao.2...@163.com]
Sent: November 26, 2015 10:04
To: user
Cc: dev@spark.apache.org
Subject: Spark checkpoint problem
I am testing checkpointing to understand how it works. My code is as follows:
scala> val data = sc.parallelize(List("a", "b", "c"))
data: org.apache.spark.rdd.RDD[String] = ParallelCollectionRDD[0] at
parallelize at <console>:15
scala> sc.setCheckpointDir("/tmp/checkpoint")
15/11/25 18:09:07 WARN spark.Sp
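For completeness, a minimal self-contained sketch of the checkpoint flow the snippet above starts. The local master, the app name, and the final action are assumptions added to make the sketch runnable; the original REPL session is cut off before them.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CheckpointDemo {
  def main(args: Array[String]): Unit = {
    // local[2] master and app name are assumptions for a runnable sketch
    val sc = new SparkContext(
      new SparkConf().setAppName("checkpoint-demo").setMaster("local[2]"))
    sc.setCheckpointDir("/tmp/checkpoint")

    val data = sc.parallelize(List("a", "b", "c"))
    data.checkpoint()            // only marks the RDD; nothing is written yet
    data.count()                 // the first action materializes the checkpoint
    println(data.isCheckpointed) // true once the checkpoint has been written
    sc.stop()
  }
}
```

Note that `checkpoint()` is lazy: the data only lands under `/tmp/checkpoint` after an action runs.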
At 2015-04-29 18:48:33, "zhangxiongfei" wrote:
The amount of memory that the driver consumes depends on your program logic. Did
you try to collect the result of the Spark job?
At 2015-04-29 18:42:04, "wyphao.2007" wrote:
Hi, dear developers, I am using Spark Streaming to read data from Kafka. The
program has already run for about 120 hours, but today it failed because of a
driver OOM, as follows:
Container [pid=49133,containerID=container_1429773909253_0050_02_01] is
running beyond physical memory limits. Cu
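As the reply above hints, a common cause of a driver OOM like this is collecting a large result into the driver JVM. A hedged sketch of the difference (the dataset and its size are made up for illustration; `sc` is an existing SparkContext):

```scala
// Assumes an existing SparkContext `sc`; the dataset size is illustrative.
val big = sc.parallelize(1 to 10000000)

// collect() ships every element to the driver JVM -- with a large window of
// Kafka data this can push the driver past the YARN container memory limit.
val everything = big.collect()

// A bounded take(), or writing results out with saveAsTextFile(), keeps the
// driver's memory footprint small.
val sample = big.take(10)
```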
Best Regards
>
>On Tue, Apr 28, 2015 at 7:35 AM, wyphao.2007 wrote:
>
>> Hi everyone, I am using val messages =
>> KafkaUtils.createDirectStream[String, String, StringDecoder,
>> StringDecoder](ssc, kafkaParams, topicsSet) to read data from
>> kafka(1k/second
Hi everyone, I am using val messages = KafkaUtils.createDirectStream[String,
String, StringDecoder, StringDecoder](ssc, kafkaParams, topicsSet) to read data
from Kafka (1k/second) and store the data in windows. The code snippet is as
follows: val windowedStreamChannel =
streamChannel.combin
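The snippet is cut off at `streamChannel.combin`, so the exact windowing call is unknown. Below is a hedged, self-contained sketch of the same setup using `reduceByKeyAndWindow`; the broker address, topic name, and window/slide durations are all assumptions for illustration.

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object WindowedKafkaSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("windowed-kafka"), Seconds(2))
    ssc.checkpoint("/tmp/streaming-checkpoint") // required by the inverse-reduce window

    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // assumed broker
    val topicsSet = Set("channel")                                  // assumed topic
    val messages = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topicsSet)

    // Keep the last 60s of counts, sliding every 10s. The inverse function
    // (_ - _) lets Spark subtract the batches leaving the window instead of
    // re-reducing the whole window -- important at 1k events/second.
    val windowedStreamChannel = messages
      .map { case (_, value) => (value, 1L) }
      .reduceByKeyAndWindow(_ + _, _ - _, Seconds(60), Seconds(10))

    windowedStreamChannel.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```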
application_1428664056212_0017 not application_1428664056212_0016.
At 2015-04-20 11:46:12, "Sean Owen" wrote:
>This is why spark.hadoop.validateOutputSpecs exists, really:
>https://spark.apache.org/docs/latest/configuration.html
>
>On Mon, Apr 20, 2015 at 3:40 AM, wyphao.2007 wr
Hi,
When I recover from a checkpoint in yarn-cluster mode using Spark Streaming,
I found that it reuses the old application id (in my case
application_1428664056212_0016) from before the failure when writing the Spark
eventLog. But my application id is now application_1428664056212_0017, so Spark writes the eventLog
I want to get the removed RDDs from the window as shown below. The old RDDs
will be removed from the current window:
//  _____________________________
// |  previous window   _________|___________________
// |___________________|       current window        |  --> Time
//                     |_____________________________|
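The diagram's "old" RDDs are the batches covered by the previous window but not by the current one. A minimal pure-Scala sketch of that bookkeeping (no Spark needed; the parameters mirror the window length, slide interval, and batch interval, all in one shared time unit):

```scala
// Returns the batch-end times that drop out when the window slides from
// (currentEnd - slide) to currentEnd. All durations share one time unit.
def oldBatches(currentEnd: Long, windowLen: Long,
               slide: Long, batch: Long): Seq[Long] = {
  val previousWindow =
    (currentEnd - slide - windowLen + batch) to (currentEnd - slide) by batch
  val currentWindow =
    (currentEnd - windowLen + batch) to currentEnd by batch
  previousWindow.filterNot(currentWindow.contains)
}

// Example: a 30s window sliding by 10s over 10s batches, now at t=60:
println(oldBatches(60L, 30L, 10L, 10L)) // the batch ending at t=30 has left
```

Spark's windowed DStreams do the analogous computation internally to decide which parent RDDs fall out of each slide.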
Hi all, today I downloaded the Spark source from the
http://spark.apache.org/downloads.html page, and I used
./make-distribution.sh --tgz -Phadoop-2.2 -Pyarn -DskipTests
-Dhadoop.version=2.2.0 -Phive
to build the release, but I encountered an exception as follows:
[INFO] --- build-helper-maven-plugin:1.
On the http://spark.apache.org/downloads.html page, we can't download the
newest Spark release.
At 2014-12-19 17:55:29, "Sean Owen" wrote:
>Tag 1.2.0 is older than 1.2.0-rc2. I wonder if it just didn't get
>updated. I assume it's going to be 1.2.0-rc2 plus a few commits
>related to the r
Hi, when I run a Spark job on YARN and the job finishes successfully, I still
find some error logs in the logfile, as follows (the red text):
14/09/17 18:25:03 INFO ui.SparkUI: Stopped Spark web UI at
http://sparkserver2.cn:63937
14/09/17 18:25:03 INFO scheduler.DAGScheduler: Stopping DAGS
Hi, when I build Spark with Maven it fails; the error message is as
follows. I didn't find a satisfactory solution via Google. Can anyone
help me? Thank you!
[INFO]
[INFO] Reactor Summary:
[INFO]
[INFO] Spark
Hi, I want to know how to use JdbcRDD in Java, not Scala. I'm trying to figure
out the last parameter in the constructor of JdbcRDD.
Thanks
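The last constructor parameter of JdbcRDD is `mapRow`: a function from a `java.sql.ResultSet` to whatever row type you want. A hedged Scala sketch (the JDBC URL, credentials, and table are made up; `sc` is an existing SparkContext); from Java, newer Spark versions offer the `JdbcRDD.create` helper, which takes an `org.apache.spark.api.java.function.Function[ResultSet, T]` in the same position.

```scala
import java.sql.{DriverManager, ResultSet}
import org.apache.spark.SparkContext
import org.apache.spark.rdd.JdbcRDD

// Illustrative only: URL, credentials, and table are assumptions.
def loadUsers(sc: SparkContext): JdbcRDD[(Int, String)] =
  new JdbcRDD(
    sc,
    () => DriverManager.getConnection("jdbc:mysql://dbhost/test", "user", "pass"),
    // the two ?s are bound to each partition's lower/upper id bounds
    "SELECT id, name FROM users WHERE ? <= id AND id <= ?",
    lowerBound = 1L, upperBound = 1000L, numPartitions = 10,
    // the last parameter: turn each ResultSet row into a value
    mapRow = (rs: ResultSet) => (rs.getInt("id"), rs.getString("name")))
```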