Re: Restful API Spark Application

2017-05-12 Thread Василец Дмитрий
And Livy: https://hortonworks.com/blog/livy-a-rest-interface-for-apache-spark/
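
For example, a minimal sketch of submitting a batch job through Livy's REST
API (assuming a Livy server on its default localhost:8998; the jar path,
class name, and batch id 0 are placeholders):

# submit the application jar as a batch
curl -X POST -H "Content-Type: application/json" \
  -d '{"file": "/path/to/your-app.jar", "className": "com.example.YourApp"}' \
  http://localhost:8998/batches

# poll the batch state
curl http://localhost:8998/batches/0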

On Fri, May 12, 2017 at 10:51 PM, Sam Elamin  wrote:
> Hi Nipun
>
> Have you checked out the job server?
>
> https://github.com/spark-jobserver/spark-jobserver
>
> Regards
> Sam
> On Fri, 12 May 2017 at 21:00, Nipun Arora  wrote:
>>
>> Hi,
>>
>> We have written a Java Spark application (it primarily uses Spark SQL). We
>> want to expand this to provide our application "as a service". For this, we
>> are trying to write a REST API. While a simple REST API is easy to build,
>> and I can get Spark to run through the launcher, I wonder how the Spark
>> context can be shared by service requests to process data.
>>
>> Are there any simple Java examples that illustrate this use case? I am sure
>> people have faced this before.
>>
>>
>> Thanks
>> Nipun
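
For Sam's spark-jobserver suggestion, a rough sketch of the flow (the port,
app name, context name, and class are placeholders; the persistent context
is what lets a single SparkContext serve many requests):

# upload the application jar under the name "myapp"
curl --data-binary @target/myapp.jar localhost:8090/jars/myapp

# create a long-lived SparkContext that later jobs can share
curl -d "" 'localhost:8090/contexts/shared-context'

# run a job synchronously on the shared context
curl -d "input.param=value" 'localhost:8090/jobs?appName=myapp&classPath=com.example.MyJob&context=shared-context&sync=true'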




Re: Monitoring Spark application progress

2016-05-16 Thread Василец Дмитрий
Hello.
Use Google Translate on this (Russian) post:
https://mkdev.me/posts/ci-i-monitoring-spark-prilozheniy

On Mon, May 16, 2016 at 6:13 PM, Ashok Kumar wrote:

> Hi,
>
> I would like to know the approaches and tools, please, for getting the
> full performance picture of a Spark app run through spark-shell or
> spark-submit:
>
>    1. Through the Spark GUI at port 4040?
>    2. Through OS utilities such as top and SAR?
>    3. Through Java tools like JBuilder etc.?
>    4. Through integrating Spark with monitoring tools?
>
>
> Thanks
>


Re: Monitoring Spark application progress

2016-05-16 Thread Василец Дмитрий
spark + zabbix + jmx:
https://translate.google.ru/translate?sl=ru&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=https%3A%2F%2Fmkdev.me%2Fposts%2Fci-i-monitoring-spark-prilozheniy
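
A minimal sketch of that wiring, in case it helps (the JMX port is a
placeholder, and the insecure flags are for local testing only):

# conf/metrics.properties: publish all Spark metrics through JMX
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

# spark-defaults.conf: expose the driver JVM over JMX so Zabbix can poll it
spark.driver.extraJavaOptions  -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=9999 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false

Zabbix then reads the beans through its Java gateway.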

On Mon, May 16, 2016 at 6:13 PM, Ashok Kumar wrote:

> Hi,
>
> I would like to know the approaches and tools, please, for getting the
> full performance picture of a Spark app run through spark-shell or
> spark-submit:
>
>    1. Through the Spark GUI at port 4040?
>    2. Through OS utilities such as top and SAR?
>    3. Through Java tools like JBuilder etc.?
>    4. Through integrating Spark with monitoring tools?
>
>
> Thanks
>


Re: Stuck with DataFrame df.select("select * from table");

2015-12-25 Thread Василец Дмитрий
Hello.
You can try df.limit(5).show().
Just a trick :)
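
For the record, the exception happens because DataFrame.select() takes
column names or expressions, not a SQL statement, so the whole string is
treated as a single (unresolvable) column. A sketch of both working forms
on Spark 1.5, reusing the sqlc and df from the original mail:

// SQL strings go through the SQLContext, not through select():
df.registerTempTable("table")
sqlc.sql("select * from table limit 5").show()

// select() itself takes column names:
df.select("VER", "CREATED").show()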

On Fri, Dec 25, 2015 at 2:34 PM, Eugene Morozov wrote:

> Hello, I'm basically stuck, as I have no idea where to look.
>
> The following simple code, given that my data source is working, gives me
> an exception.
>
> DataFrame df = sqlc.load(filename, "com.epam.parso.spark.ds.DefaultSource");
> df.cache();
> df.printSchema();   <-- prints the schema perfectly fine!
>
> df.show();  <-- works perfectly fine (shows a table with 20 lines)!
> df.registerTempTable("table");
> df.select("select * from table limit 5").show(); <-- gives weird exception
>
> Exception is:
>
> AnalysisException: cannot resolve 'select * from table limit 5' given input 
> columns VER, CREATED, SOC, SOCC, HLTC, HLGTC, STATUS
>
> I can do a collect on the DataFrame, but cannot select any specific
> columns with either "select * from table" or "select VER, CREATED from
> table".
>
> I use Spark 1.5.2.
> The same code works perfectly through Zeppelin 0.5.5.
>
> Thanks.
> --
> Be well!
> Jean Morozov
>


Re: Available options for Spark REST API

2015-12-07 Thread Василец Дмитрий
Hello.
If I understand correctly: the Spark UI REST API is for monitoring, and
spark-jobserver is for submitting jobs.
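
For the monitoring side, Spark 1.4+ serves JSON from the UI port; a quick
sketch against a locally running driver ([app-id] is whatever the first
call returns):

# list applications known to this UI
curl http://localhost:4040/api/v1/applications

# drill into one application's jobs and stages
curl http://localhost:4040/api/v1/applications/[app-id]/jobs
curl http://localhost:4040/api/v1/applications/[app-id]/stages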


On Mon, Dec 7, 2015 at 9:42 AM, sunil m <260885smanik...@gmail.com> wrote:

> Dear Spark experts!
>
> I would like to know the best practices used for invoking spark jobs via
> REST API.
>
> We tried out the hidden REST API mentioned here:
> http://arturmkrtchyan.com/apache-spark-hidden-rest-api
>
> It works fine for Spark standalone mode but does not seem to work when I
> specify
>  "spark.master" : "YARN-CLUSTER" or "mesos://..."
> Did anyone encounter a similar problem?
>
> Has anybody used:
> https://github.com/spark-jobserver/spark-jobserver
>
> If yes, please share your experience. Does it work well with both Scala and
> Java classes? I saw only a Scala example. Are there any known disadvantages
> of using it?
>
> Is there anything better available that is used in production
> environments?
>
> Any advice is appreciated. We are using Spark 1.5.1.
>
> Thanks in advance.
>
> Warm regards,
> Sunil M.
>


Re: Mock Cassandra DB Connection in Unit Testing

2015-10-29 Thread Василец Дмитрий
Here is an example of how I mock MySQL (the sql value comes from the
surrounding test):

import java.sql.{Connection, PreparedStatement}
import org.scalamock.scalatest.MockFactory

val connectionMock = mock[Connection]
val statementMock = mock[PreparedStatement]
// expect one prepareStatement call with the given SQL, handing back the statement mock
(connectionMock.prepareStatement(_: String)).expects(sql.toString).returning(statementMock)
// expect one executeUpdate call on the prepared statement
(statementMock.executeUpdate _).expects()
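
The same pattern should carry over to Cassandra. A sketch assuming the
DataStax Java driver's Session, inside a test class that mixes in
MockFactory (the CQL string is a placeholder; if you go through
spark-cassandra-connector, stub whatever hands out the Session instead):

import com.datastax.driver.core.{ResultSet, Session}
import org.scalamock.scalatest.MockFactory

val sessionMock = mock[Session]
val resultSetMock = mock[ResultSet]
// expect one execute call with the given CQL, handing back a mocked result set
(sessionMock.execute(_: String)).expects("INSERT INTO ks.tbl (id) VALUES (1)").returning(resultSetMock)

An embedded Cassandra (e.g. cassandra-unit) is another option if you would
rather hit a real, throwaway instance than mock the driver.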


On Thu, Oct 29, 2015 at 12:27 PM, Priya Ch wrote:

> Hi All,
>
>   For my Spark Streaming code, which writes its results to a Cassandra DB,
> I need to write unit test cases. What test frameworks are available to
> mock the connection to Cassandra?
>


Re: union streams not working for streams > 3

2015-09-15 Thread Василец Дмитрий
Thanks. I will try.

On Tue, Sep 15, 2015 at 4:19 PM, Cody Koeninger <c...@koeninger.org> wrote:

> I assume you're using the receiver-based stream (createStream) rather than
> createDirectStream?
>
> Receivers each get scheduled as if they occupy a core, so you need at
> least one more core than the number of receivers if you want to get any
> work done.
>
> Try using the direct stream if you can't combine receivers.
>
> On Mon, Sep 14, 2015 at 11:10 PM, Василец Дмитрий <
> pronix.serv...@gmail.com> wrote:
>
>> I use local[*], and I have 4 cores on my laptop.
>> On 14 Sep 2015 23:19, "Gerard Maas" <gerard.m...@gmail.com> wrote:
>>
>>> How many cores are you assigning to your Spark Streaming job?
>>>
>>> On Mon, Sep 14, 2015 at 10:33 PM, Василец Дмитрий <
>>> pronix.serv...@gmail.com> wrote:
>>>
>>>> Hello,
>>>> I have 4 streams from Kafka and streaming is not working,
>>>> without any errors or logs,
>>>> but with 3 streams everything is perfect.
>>>> Only the number of streams seems to matter; different triple
>>>> combinations always work.
>>>> Any ideas how to debug or fix this?
>>>>
>>>>
>>>>
>>>
>
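
A minimal sketch of the direct stream Cody suggests, for the Spark 1.x
Kafka integration (it assumes an existing StreamingContext ssc and a broker
on localhost:9092; the topic names are placeholders). One direct stream can
cover all four topics, and since it uses no receivers, no cores stay
permanently occupied:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
val topics = Set("topic1", "topic2", "topic3", "topic4")

// a single receiver-less stream over all four topics
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topics)

stream.map(_._2).print()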


union streams not working for streams > 3

2015-09-14 Thread Василец Дмитрий
Hello,
I have 4 streams from Kafka and streaming is not working,
without any errors or logs,
but with 3 streams everything is perfect.
Only the number of streams seems to matter; different triple combinations
always work.
Any ideas how to debug or fix this?


Re: union streams not working for streams > 3

2015-09-14 Thread Василец Дмитрий
I use local[*], and I have 4 cores on my laptop.
On 14 Sep 2015 23:19, "Gerard Maas" <gerard.m...@gmail.com> wrote:

> How many cores are you assigning to your Spark Streaming job?
>
> On Mon, Sep 14, 2015 at 10:33 PM, Василец Дмитрий <
> pronix.serv...@gmail.com> wrote:
>
>> Hello,
>> I have 4 streams from Kafka and streaming is not working,
>> without any errors or logs,
>> but with 3 streams everything is perfect.
>> Only the number of streams seems to matter; different triple combinations
>> always work.
>> Any ideas how to debug or fix this?
>>
>>
>>
>