Hi,
I am facing a strange issue while using Chronos: the job is not able to find
the main class while invoking spark-submit through Chronos.
The issue I identified is a "colon" in the task name.
Env: Chronos-scheduled job on Mesos
Hi,
While using a reference inside JdbcRDD, it throws a serialization exception.
Does JdbcRDD not accept references from other parts of the code?
confMap= ConfFactory.getConf(ParquetStreaming)
val jdbcRDD = new JdbcRDD(sc, () => {
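This exception usually means the closure passed to JdbcRDD captures a non-serializable object (such as a shared config or connection-factory instance) from elsewhere in the code. A minimal sketch, assuming a hypothetical `employees` table and JDBC URL; the key point is that the `getConnection` closure builds the `Connection` itself and captures only a serializable `String`:

```scala
import java.sql.{DriverManager, ResultSet}
import org.apache.spark.SparkContext
import org.apache.spark.rdd.JdbcRDD

object JdbcRddExample {
  def build(sc: SparkContext, url: String): JdbcRDD[String] = {
    // Capture only the serializable String `url`, not a Connection or a
    // non-serializable factory object defined in another part of the code.
    new JdbcRDD(
      sc,
      () => DriverManager.getConnection(url),                  // created on the executor
      "SELECT name FROM employees WHERE id >= ? AND id <= ?",  // both '?' are required
      1L, 1000L,  // bounds substituted into the two placeholders
      4,          // number of partitions
      (rs: ResultSet) => rs.getString(1)
    )
  }
}
```

Note that JdbcRDD requires the query to contain exactly two `?` placeholders, which it fills with the partition bounds.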
Hi,
Is there anything in sbt equivalent to Maven profiles? I want the Spark build to
pick up endpoints based on the environment the jar is built for.
In build.sbt we are injecting variables (dev, stage, etc.) and picking up all
dependencies. In a similar way, I need to pick up config for external
dependencies.
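sbt has no Maven-style profiles, but a JVM system property can play the same role. A minimal build.sbt sketch, assuming a hypothetical `env` property and per-environment config directories (`conf/dev`, `conf/stage`, ...):

```scala
// build.sbt
val env = sys.props.getOrElse("env", "dev")

// Pick environment-specific dependencies, e.g. different endpoint clients.
libraryDependencies ++= (env match {
  case "prod" => Seq(/* production-only dependencies */)
  case _      => Seq(/* dev/stage dependencies */)
})

// Bundle the matching config files into the jar's resources.
Compile / unmanagedResourceDirectories += baseDirectory.value / "conf" / env
```

Invoke it as, e.g., `sbt -Denv=stage package`, and the jar for each environment carries its own config.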
We are going to use an EMR cluster for Spark jobs in AWS. Any suggestions on the
instance type to use:
m3.xlarge or r3.xlarge?
Details:
1) We are going to run a couple of streaming jobs, so we need the on-demand
instance type.
2) There is no data on HDFS/S3; all data is pulled from Kafka or
Hi,
I am running Spark on YARN using Oozie.
When submitting through the command line with spark-submit, Spark is able to read
the env variable. But when submitting through Oozie, it is not able to get the env
variable, and I don't see the driver log.
Is there any way to specify env variables in the Oozie Spark action?
Any clue on this?
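One common workaround: Oozie does not forward the launcher's shell environment, so the variables usually have to be passed explicitly through Spark's YARN properties inside the Spark action. A sketch of the relevant fragment, assuming a hypothetical variable `MY_ENV` (the `spark.yarn.appMasterEnv.*` and `spark.executorEnv.*` property names are standard Spark-on-YARN settings):

```xml
<spark-opts>
    --conf spark.yarn.appMasterEnv.MY_ENV=prod
    --conf spark.executorEnv.MY_ENV=prod
</spark-opts>
```

`spark.yarn.appMasterEnv.*` sets the variable for the driver in cluster mode; `spark.executorEnv.*` sets it on the executors.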
Jobs are running fine, but I am not able to access the Spark UI in EMR-YARN.
Where can I see statistics such as number of events per second and rows processed
for streaming in the log files (if the UI is not working)?
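If the UI is unreachable, the same per-batch statistics can be written to the driver log with a `StreamingListener`. A minimal sketch (the log format here is my own; `numRecords`, `schedulingDelay`, and `processingDelay` come from Spark's `BatchInfo`):

```scala
import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

// Logs record counts and delays for every completed micro-batch, so
// throughput can be read from the driver log instead of the Spark UI.
class BatchStatsListener extends StreamingListener {
  override def onBatchCompleted(batch: StreamingListenerBatchCompleted): Unit = {
    val info = batch.batchInfo
    println(
      s"batchTime=${info.batchTime} records=${info.numRecords} " +
      s"schedulingDelayMs=${info.schedulingDelay.getOrElse(-1L)} " +
      s"processingDelayMs=${info.processingDelay.getOrElse(-1L)}"
    )
  }
}

// Register it on the StreamingContext before ssc.start():
// ssc.addStreamingListener(new BatchStatsListener())
```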
-Saurabh
From: Saurabh Malviya (samalviy)
Sent: Monday, January 09, 2017 10:59
Hi,
We are using EMR and currently deploying the streaming job (workflow) using Oozie.
I just want to know the best practice for deploying a streaming job. (On Mesos we
deploy using Marathon, but what is the best approach on YARN that enforces only
one instance and restarts it if it fails for any reason?)
The Spark web UI for detailed monitoring of streaming jobs stops rendering after
two weeks; it keeps looping while trying to fetch the page. Is there any way I can
get that page, or logs where I can see how many events come into Spark in each
interval?
-Saurabh