And Livy: https://hortonworks.com/blog/livy-a-rest-interface-for-apache-spark/
On Fri, May 12, 2017 at 10:51 PM, Sam Elamin wrote:
> Hi Nipun
>
> Have you checked out the job server?
>
> https://github.com/spark-jobserver/spark-jobserver
>
> Regards
> Sam
> On Fri, 12 May
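To make the Livy suggestion concrete, here is a minimal sketch of submitting a batch job through Livy's REST API (`POST /batches`) from Scala. The Livy endpoint, jar path, and class name are assumptions for illustration; substitute your own.

```scala
// Hedged sketch: submitting a Spark batch job via a Livy server's REST API.
// Assumes Livy listens on http://localhost:8998; jar and class are placeholders.
import java.io.OutputStreamWriter
import java.net.{HttpURLConnection, URL}

object LivySubmit {
  def main(args: Array[String]): Unit = {
    val conn = new URL("http://localhost:8998/batches")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setDoOutput(true)

    // JSON payload: "file" and "className" point at your own job.
    val payload =
      """{"file": "/path/to/my-spark-job.jar",
        | "className": "com.example.MySparkJob"}""".stripMargin

    val writer = new OutputStreamWriter(conn.getOutputStream)
    writer.write(payload)
    writer.close()

    // Livy answers 201 Created with a JSON description of the new batch.
    println(s"Livy responded with HTTP ${conn.getResponseCode}")
  }
}
```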
hello
the article is in Russian ("CI and monitoring of Spark applications"), so use Google Translate:
https://mkdev.me/posts/ci-i-monitoring-spark-prilozheniy
On Mon, May 16, 2016 at 6:13 PM, Ashok Kumar
wrote:
> Hi,
>
> I would like to know the approach and tools please to get the full
> performance for a Spark app running through
spark + zabbix + jmx
https://translate.google.ru/translate?sl=ru&tl=en&u=https%3A%2F%2Fmkdev.me%2Fposts%2Fci-i-monitoring-spark-prilozheniy
On Mon, May 16, 2016 at 6:13 PM, Ashok Kumar
wrote:
> Hi,
>
> I would like to know the approach and tools please to
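The "spark + zabbix + jmx" route usually means turning on Spark's JMX metrics sink so that Zabbix (via its Java gateway) can scrape the exposed MBeans. A minimal sketch of `conf/metrics.properties`, plus the driver JVM options to allow remote JMX access; the port number and the Zabbix-side configuration are assumptions:

```
# conf/metrics.properties
# Enable the JMX sink for all instances (master, worker, driver, executor).
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```

and on submit (insecure settings shown only for a local test):

```
--conf "spark.driver.extraJavaOptions=-Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9999 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false"
```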
hello
you can try to use df.limit(5).show()
just a trick :)
On Fri, Dec 25, 2015 at 2:34 PM, Eugene Morozov
wrote:
> Hello, I'm basically stuck as I have no idea where to look;
>
> Following simple code, given that my Datasource is working gives me an
> exception.
>
>
hello
if I understand correctly: the Spark UI with its REST API is for monitoring, and
spark-jobserver is for submitting jobs.
On Mon, Dec 7, 2015 at 9:42 AM, sunil m <260885smanik...@gmail.com> wrote:
> Dear Spark experts!
>
> I would like to know the best practices used for invoking spark jobs via
> REST API.
>
>
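To illustrate the monitoring half of that answer: the Spark UI exposes a REST API under `/api/v1` (described in Spark's monitoring guide). A minimal sketch, assuming the driver UI is reachable on the default port 4040:

```scala
// Hedged sketch: polling the Spark UI's REST API for running applications.
// Assumes a driver UI at http://localhost:4040 (the default port).
import scala.io.Source

object SparkUiPoll {
  def main(args: Array[String]): Unit = {
    // Returns a JSON array with one entry per application (id, name, attempts).
    val apps = Source.fromURL("http://localhost:4040/api/v1/applications").mkString
    println(apps)
  }
}
```

From the same base path you can also fetch `/applications/[app-id]/jobs` and `/applications/[app-id]/stages` for per-job and per-stage metrics.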
Here is an example of how I mock MySQL:
import java.sql.{Connection, PreparedStatement}
import org.scalamock.scalatest.MockFactory

// `sql` is the statement under test, defined elsewhere in the suite.
val connectionMock = mock[Connection]
val statementMock = mock[PreparedStatement]
(connectionMock.prepareStatement(_: String))
  .expects(sql.toString).returning(statementMock)
(statementMock.executeUpdate _).expects().returning(1)
> You need at least one more core than the number of receivers if you want to get any work
> done.
>
> Try using the direct stream if you can't combine receivers.
>
> On Mon, Sep 14, 2015 at 11:10 PM, Василец Дмитрий <
> pronix.serv...@gmail.com> wrote:
>
>> I use local[*]. And i have 4
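The direct-stream suggestion quoted above looks roughly like this with the spark-streaming-kafka 0.8 artifact. Broker address and topic names are placeholders; the point is that a direct stream has no receivers, so no core is permanently tied up per input stream:

```scala
// Hedged sketch: one direct Kafka stream covering all four topics,
// instead of four receiver-based streams each pinning a core.
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object DirectStreamExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("direct-stream")
    val ssc = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val topics = Set("topic1", "topic2", "topic3", "topic4")

    val stream = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    stream.map(_._2).print()  // print message values each batch
    ssc.start()
    ssc.awaitTermination()
  }
}
```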
hello
I have 4 streams from Kafka and streaming is not working,
without any errors or logs,
but with 3 streams everything works perfectly.
Only the number of streams matters; different combinations of three always
work.
Any ideas how to debug or fix it?
I use local[*], and I have 4 cores on my laptop.
On 14 Sep 2015 23:19, "Gerard Maas" <gerard.m...@gmail.com> wrote:
> How many cores are you assigning to your spark streaming job?
>
> On Mon, Sep 14, 2015 at 10:33 PM, Василец Дмитрий <
> pronix.serv...@gmail.com