When I run a Spark application, I sometimes get the following error:
16/04/21 09:26:45 ERROR SparkContext: Error initializing SparkContext.
java.util.concurrent.TimeoutException: Futures timed out after [1
milliseconds]
at
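For reference, short "Futures timed out" failures during context initialization are usually governed by Spark's RPC/network timeout settings. A minimal sketch of raising them is below; the 120s values are only illustrative, and which setting actually applies depends on where the truncated stack trace points:

```scala
import org.apache.spark.SparkConf

// Sketch only: raise the timeouts that commonly back
// "Futures timed out" errors. Values are illustrative.
val conf = new SparkConf()
  .setAppName("timeout-tuning-sketch")       // placeholder app name
  .set("spark.rpc.askTimeout", "120s")       // assumption: the timeout came from an RPC ask
  .set("spark.network.timeout", "120s")      // default for most network-related timeouts
```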
@All
There is a strange problem. I have been running a Spark Streaming application for a
long time; the application details are:
1) Fetch data from Kafka using the direct API
2) Use SQL to write each RDD of the DStream into Redis
3) Read the data back from Redis
Everything seems ok during
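For context, the direct-API fetch in step 1 (spark-streaming-kafka 0.8 style, current at the time of this post) can be sketched roughly as follows; the broker address and topic name are placeholders, not taken from the post:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Sketch of the Kafka direct-API fetch (step 1 above).
val conf = new SparkConf().setAppName("kafka-direct-sketch")
val ssc = new StreamingContext(conf, Seconds(10))
val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // placeholder broker
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, Set("my-topic"))                            // placeholder topic

// Step 2: process each RDD of the DStream (the Redis write is elided).
stream.foreachRDD { rdd =>
  rdd.map(_._2).foreach(record => println(record))              // stand-in for the Redis write
}
ssc.start()
ssc.awaitTermination()
```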
Hi all,
Here are two examples: one throws Task not serializable when executed in the spark
shell, while the other works. I am puzzled. Can anyone explain what is different
between these two pieces of code and why the second one works?
1. The one that throws Task not serializable:
import org.apache.spark._
import
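Since the original code is cut off above, here is a minimal, hypothetical pair illustrating the usual cause of this difference: the failing version captures an instance of a non-serializable class inside the closure, while the working version captures only serializable values. Both class and method names below are invented for illustration:

```scala
import org.apache.spark.SparkContext

// Hypothetical helper that does NOT extend Serializable.
class UnserializableHelper {
  def double(x: Int): Int = x * 2
}

object SerializationExample {
  // Throws "Task not serializable": the map closure captures `helper`,
  // which Spark cannot serialize when shipping the task to executors.
  def broken(sc: SparkContext): Array[Int] = {
    val helper = new UnserializableHelper
    sc.parallelize(1 to 5).map(x => helper.double(x)).collect()
  }

  // Works: the closure captures only a plain function value,
  // which is serializable, so the task can be shipped.
  def ok(sc: SparkContext): Array[Int] = {
    val double = (x: Int) => x * 2
    sc.parallelize(1 to 5).map(double).collect()
  }
}
```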
Hi all,
If I want to change the /tmp folder to some other folder for Spark unit tests run
via sbt, how can I do that?
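One common approach (a sketch only; the path is a placeholder) is to fork the test JVM in sbt and override both the JVM temp directory and Spark's own scratch directory:

```scala
// build.sbt (sbt 0.13-era syntax), sketch only.
fork in Test := true                        // javaOptions only apply to a forked JVM
javaOptions in Test ++= Seq(
  "-Djava.io.tmpdir=/path/to/other/tmp",    // placeholder: JVM temp directory
  "-Dspark.local.dir=/path/to/other/tmp"    // placeholder: Spark scratch space
)
```

Forking matters here: without `fork in Test := true`, sbt runs tests in its own JVM and the `javaOptions` are ignored.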