I recommend running it with your unit tests, executed by your build tool.
There is no need to have it running in the background in the IDE.
> On 3. Mar 2018, at 17:57, sujeet jog <sujeet@gmail.com> wrote:
>
> Is there a way to run Spark-JobServer in eclipse ?
Is there a way to run Spark-JobServer in Eclipse? Any pointers in this
regard?
A better forum would be
https://groups.google.com/forum/#!forum/spark-jobserver
or
https://gitter.im/spark-jobserver/spark-jobserver
Regards,
Noorul
Madabhattula Rajesh Kumar <mrajaf...@gmail.com> writes:
> Hi,
>
> I am getting below an exception when I start
Hi,
I am getting the below exception when I start the job-server:
./server_start.sh: line 41: kill: (11482) - No such process
Please let me know how to resolve this error.
Regards,
Rajesh
Hi,
I'm working with the latest version of Spark JobServer together with Spark
2.0.2. I'm able to do almost everything I need, but there is one noisy thing.
I have placed a hive-site.xml to specify a connection to my MySQL DB so I
can have the metastore_db on MySQL; that works fine while
Hi
I'm going to deploy jobserver on my CentOS machine (Spark is installed with CDH 5.7).
I'm using Oracle JDK 1.8, sbt-0.13.13, spark-1.6.0 and jobserver-0.6.2.
When I run the sbt command (after running sbt publish-local) I encounter the
below message:
[cloudera@quickstart spark-jobserver]$ sbt
[info
Reza zade <kntrm...@gmail.com> writes:
> Hi
>
> I have set up a cloudera cluster and work with spark. I want to install
> spark-jobserver on it. What should I do?
Maybe you should send this to the spark-jobserver mailing list.
https://github.com/spark-jobserver/spark-jobserv
Hi
I have set up a cloudera cluster and work with spark. I want to install
spark-jobserver on it. What should I do?
Hi
I am using a shared SparkContext for all of my Spark jobs. Some of the jobs
use HiveContext, but there isn't a getOrCreate method on HiveContext which
will allow reuse of an existing HiveContext. Such a method exists on
SQLContext only (def getOrCreate(sparkContext: SparkContext): SQLContext).
Have you noticed the following method of HiveContext?
* Returns a new HiveContext as new session, which will have separated SQLConf, UDF/UDAF,
* temporary tables and SessionState, but sharing the same CacheManager, IsolatedClientLoader
* and Hive client (both of execution and metadata)
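For illustration, a minimal sketch of that pattern on the Spark 1.6-style API: keep one HiveContext per application in a small holder object of your own (the SharedHive object below is hypothetical, not part of Spark or the jobserver) and give each job its own newSession().

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Hypothetical holder for a single shared HiveContext (an assumption, not a Spark API).
object SharedHive {
  @volatile private var instance: HiveContext = _

  def getOrCreate(sc: SparkContext): HiveContext = synchronized {
    if (instance == null) instance = new HiveContext(sc)
    instance
  }
}

// Each job then works in an isolated session (separate SQLConf, UDFs, temp tables)
// while still sharing the CacheManager and Hive client of the parent context:
// val hive = SharedHive.getOrCreate(sc).newSession()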
On 25 January 2016 at 21:09, Deenar Toraskar <
deenar.toras...@thinkreactive.co.uk> wrote:
> No I hadn't. This is useful, but in some cases we do want to share the
> same temporary tables between jobs, so we really wanted a getOrCreate
> equivalent on HiveContext.
>
> Deenar
>
>
>
> On 25 January
Hi all,
I have some questions about spark-jobserver.
I deployed a spark-jobserver in yarn-client mode using docker.
I'd like to use the dynamic resource allocation option for YARN in spark-jobserver.
How can I add this option?
And when will it support the 1.5.x version?
(https://hub.docker.com/r
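Not an authoritative answer, but the standard Spark properties for dynamic allocation can be set on the SparkConf (or in the jobserver context configuration); a rough Scala sketch, with the app name and executor counts as placeholder values:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: enable YARN dynamic allocation via standard Spark properties.
// The external shuffle service must also be running on the YARN node managers.
val conf = new SparkConf()
  .setAppName("jobserver-dynamic-allocation-sketch") // placeholder name
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.shuffle.service.enabled", "true")
  .set("spark.dynamicAllocation.minExecutors", "1")  // placeholder value
  .set("spark.dynamicAllocation.maxExecutors", "10") // placeholder value
val sc = new SparkContext(conf)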
}
}
*When I push the jar using spark-jobserver and execute it, I get this on the
spark-jobserver terminal:*
job-server[ERROR] Exception in thread pool-1-thread-1
java.lang.NoClassDefFoundError:
org/apache/cassandra/hadoop/cql3/CqlPagingInputFormat
job-server[ERROR] at
spark.jobserver.CassandraCQLTest
I was able to fix the issues by providing the right versions of the cassandra-all
and thrift libraries.
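For illustration only, that fix amounts to pinning matching cassandra-all and libthrift artifacts in the job's build definition; the versions below are placeholders, so pick the ones matching your cluster.

// build.sbt sketch: placeholder versions, adjust to your Cassandra cluster.
libraryDependencies ++= Seq(
  "org.apache.cassandra" % "cassandra-all" % "2.0.11", // placeholder version
  "org.apache.thrift"    % "libthrift"     % "0.9.1"   // placeholder version
)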
],
job.getConfiguration()
)
casRdd.count
}
}
Hi,
I am trying Spark Jobserver
(https://github.com/spark-jobserver/spark-jobserver) for running Spark
SQL jobs.
I was able to start the server, but when I run my application (my Scala class
which extends SparkSqlJob), I am getting
You shouldn't need to do anything special. Are you using a named context?
I'm not sure those work with SparkSqlJob.
By the way, there is a forum on Google groups for the Spark Job Server:
https://groups.google.com/forum/#!forum/spark-jobserver
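For what it's worth, a SparkSqlJob skeleton looks roughly as below; this assumes the job-server-extras trait with validate/runJob over a SQLContext, so check the exact signatures for your jobserver version, and treat the object and config key names as made up.

import com.typesafe.config.Config
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

// Hypothetical example job: runs a SQL statement passed in via the "sql" config key.
object SqlQuerySketch extends SparkSqlJob {
  def validate(sql: SQLContext, config: Config): SparkJobValidation = SparkJobValid

  def runJob(sql: SQLContext, config: Config): Any =
    sql.sql(config.getString("sql")).collect()
}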
On Thu, Apr 2, 2015 at 5:10 AM, Harika matha.har
by Twitter.
Regards,
Vasu C
Thank you very much Vasu. Let me add some more points to my question. We are
developing a Java program for connecting spark-jobserver to Vaadin (a Java
framework). Following is the sample code I wrote for connecting both (the
code works fine):
URL url = null;
HttpURLConnection connection = null;
Hi Sasi,
To pass parameters to spark-jobserver, use curl -d "input.string = a b c
a b see" and in the job server class use config.getString("input.string").
You can pass multiple parameters like starttime, endtime, etc. and use
config.getString() to get the values.
The examples are shown here
https
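For reference, a rough sketch of a job that reads such a parameter, based on the classic SparkJob API with validate/runJob (the object name is made up; adjust to your jobserver version):

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

import scala.util.Try

// Hypothetical example: counts the words passed in via curl -d "input.string = ..."
object WordCountSketch extends SparkJob {
  def validate(sc: SparkContext, config: Config): SparkJobValidation =
    Try(config.getString("input.string"))
      .map(_ => SparkJobValid)
      .getOrElse(SparkJobInvalid("No input.string config param"))

  def runJob(sc: SparkContext, config: Config): Any = {
    val words = config.getString("input.string").split(" ").toSeq
    sc.parallelize(words).countByValue()
  }
}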
Thank you Abhishek. The code works.
Hi all,
I'm able to submit Spark jobs through spark-jobserver. But this only allows
using Spark in yarn-client mode. I want to use Spark in yarn-cluster mode as
well, but jobserver does not allow it, as stated in the README
file https://github.com/spark-jobserver/spark-jobserver.
Could you tell
Dear All,
For our requirement, we need to define a SparkContext with a SparkConf which
has Cassandra connection details. And this SparkContext needs to be shared
for subsequent runJobs and throughout the application.
So, how do we define a SparkContext with a Cassandra connection for
spark-jobserver?
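One common way to do this (assuming the DataStax spark-cassandra-connector; the host and app name below are placeholders) is to put the Cassandra connection details into the SparkConf used to build the shared context:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: a shared SparkContext carrying Cassandra connection details.
val conf = new SparkConf()
  .setAppName("shared-cassandra-context")                   // placeholder name
  .set("spark.cassandra.connection.host", "cassandra-host") // placeholder host
val sc = new SparkContext(conf)

// Jobs run against this context can then use the connector, e.g.:
// import com.datastax.spark.connector._
// sc.cassandraTable("my_keyspace", "my_table").count()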
Thank you Abhishek. That works.
There is a path, /tmp/spark-jobserver/file, where all the jars are kept by
default. Probably deleting them from there should work.
On 11 Jan 2015 12:51, Sasi [via Apache Spark User List]
ml-node+s1001560n21081...@n3.nabble.com wrote:
How to remove submitted JARs from spark-jobserver
How to remove submitted JARs from spark-jobserver?
/scala-2.10/spark-jobserver-examples_2.10-1.0.0.jar
localhost:8090/jars/sparking* command to upload
as mentioned in the https://github.com/fedragon/spark-jobserver-examples link.
We did some samples earlier for connecting Apache Cassandra to Spark using
the Scala language. Initially, we faced the same
We were able to resolve *SparkException: Job aborted due to stage failure: All
masters are unresponsive! Giving up* as well. Spark-jobserver is working fine
now and we need to experiment more.
Thank you guys.
Thank you Pankaj. We were able to create the uber JAR (very good for binding all
dependency JARs together) and run it on spark-jobserver. One step better
than where we were.
However, we are now facing *SparkException: Job aborted due to stage failure: All
masters are unresponsive! Giving up*. We may need
Skype
pankaj.narang
Or you can use:
sc.addJar("/path/to/your/datastax.jar")
Thanks
Best Regards
On Tue, Jan 6, 2015 at 5:53 PM, bchazalet bchaza...@companywatch.net
wrote:
I don't know much about spark-jobserver, but you can set jars
programmatically
using the method setJars on SparkConf. Looking at your code
I don't know much about spark-jobserver, but you can set jars programmatically
using the method setJars on SparkConf. Looking at your code it seems that
you're importing classes from com.datastax.spark.connector._ to load data
from cassandra, so you may need to add that datastax jar to your
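A rough sketch of both suggestions, with the app name, jar path and connector version as placeholders:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: ship the DataStax connector jar to the executors via SparkConf.setJars.
val conf = new SparkConf()
  .setAppName("cassandra-jobserver-sketch") // placeholder name
  .setMaster("local[4]")
  .setJars(Seq("/path/to/spark-cassandra-connector_2.10-1.1.0-alpha3.jar")) // placeholder path
val sc = new SparkContext(conf)

// Alternatively, add the jar to an already-created context:
// sc.addJar("/path/to/spark-cassandra-connector_2.10-1.1.0-alpha3.jar")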
)
.setMaster("local[4]")
.setJars(Seq("C:/spark-jobserver/lib/spark-cassandra-connector_2.10-1.1.0-alpha3.jar"))
Am I missing something?
Meanwhile, I will try Pankaj's suggestion of using an uber jar.
jobserver.
I'm investigating spark for a new project and I'm trying to use
spark-jobserver because... I need to reuse and share RDDs and from what I
read in the forum that's the standard :D
Turns out that spark-jobserver doesn't seem to work on yarn, or at least
it does not on 1.1.1
My config
Thanks Akhil, that will help a lot!
It turned out that spark-jobserver does not work in development mode but
if you deploy a server it works (looks like the dependencies when running
jobserver from sbt are not right)
On Thu, Jan 1, 2015 at 5:22 AM, Akhil Das ak...@sigmoidanalytics.com
Does my question make sense, or does it require some elaboration?
Sasi
The reason being, we have a Vaadin (Java framework) application which displays
data from a Spark RDD, which in turn gets data from Cassandra. As we know, we
need to use Maven for building against the Spark API in Java.
We tested the spark-jobserver using SBT and were able to run it. However, for our
requirement, we
-node+s1001560n20898...@n3.nabble.com wrote:
The reason being, we had Vaadin (Java Framework) application which
displays data from Spark RDD, which in turn gets data from Cassandra. As we
know, we need to use Maven for building Spark API in Java.
We tested the spark-jobserver using SBT and able
Thanks Abhishek. We are good now with an answer to try.
Hi all,
I'm investigating spark for a new project and I'm trying to use
spark-jobserver because... I need to reuse and share RDDs and from what I
read in the forum that's the standard :D
Turns out that spark-jobserver doesn't seem to work on yarn, or at least it
does not on 1.1.1
My config
Dear All,
We are trying to share RDDs across different sessions of the same web
application (Java). We need to share a single RDD between those sessions. As
we understand from some posts, this is possible through Spark-JobServer.
Are there any guidelines you can provide to set up Spark-JobServer for Maven
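For reference, a rough sketch of sharing an RDD between jobs via the jobserver's NamedRddSupport mixin, assuming all jobs run in the same long-lived named context; the object names and data are made up, and the exact API may differ between jobserver versions:

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{NamedRddSupport, SparkJob, SparkJobValid, SparkJobValidation}

// One job caches an RDD under a name...
object CacheRddSketch extends SparkJob with NamedRddSupport {
  def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
  def runJob(sc: SparkContext, config: Config): Any = {
    val rdd = sc.parallelize(1 to 1000) // placeholder data
    namedRdds.update("shared-data", rdd)
    rdd.count()
  }
}

// ...and later jobs (triggered from any session of the web application) look it up again.
object ReadRddSketch extends SparkJob with NamedRddSupport {
  def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
  def runJob(sc: SparkContext, config: Config): Any =
    namedRdds.get[Int]("shared-data").map(_.count()).getOrElse(0L)
}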
Hi,
I'm working on the problem of remotely submitting apps to the spark
master. I'm trying to use the spark-jobserver project
(https://github.com/ooyala/spark-jobserver) for that purpose.
For Scala apps it looks like things are working smoothly, but for Java
apps, I have an issue with implementing
for managing your Spark jobs and job history and status.
In order to make sure the project can continue to move forward
independently, with new features developed and contributions merged, we are
moving the project to a new GitHub organization. The new location is:
https://github.com/spark-jobserver/spark
I'm looking for something like the ooyala spark-jobserver (
https://github.com/ooyala/spark-jobserver) that basically manages a
SparkContext for use from a REST or web application environment, but for
python jobs instead of scala.
Has anyone written something like this? Looking for a project
That's good to know. I will try it out.
Thanks Romain
On Friday, June 27, 2014, Romain Rigaux romain.rig...@gmail.com wrote:
So far Spark Job Server does not work with Spark 1.0:
https://github.com/ooyala/spark-jobserver
So this works only with Spark 0.9 currently:
http://gethue.com/get
So far Spark Job Server does not work with Spark 1.0:
https://github.com/ooyala/spark-jobserver
So this works only with Spark 0.9 currently:
http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/
Romain
On Tue, Jun 24, 2014 at 9:04 AM