RE: Input file as an argument of a Spark code

2017-07-25 Thread Joaquín Silva
Well I think I will build a REST API over Livy in order to import/export data 
into HDFS.

 

Thanks all!

 

Joaquín Silva | Pentagon Security & AKAINIX

Av. Kennedy 4.700, Piso 10, Of. 1002, Edificio New Century, Vitacura | Código 
Postal (ZIP Code) 7561127

Cel: (56-9) 6304 2498 

 

From: Vivek Suvarna [mailto:vikk...@gmail.com] 
Sent: Tuesday, 25 July 2017 1:19
To: user@livy.incubator.apache.org
Subject: Re: Input file as an argument of a Spark code

 

I had a similar requirement. 

I used WebHDFS to first copy the file across to HDFS before starting the Spark 
job via Livy. 
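
A minimal sketch of that WebHDFS upload, assuming simple authentication and the 
default Hadoop 2.x NameNode HTTP port 50070 (9870 on Hadoop 3.x); the host names 
and HDFS path below are placeholders:

    # Step 1: ask the NameNode for a write location; it replies with a 307 redirect
    # whose Location header points at a DataNode.
    curl -i -X PUT "http://NAMENODE_HOST:50070/webhdfs/v1/user/joaquin/input.csv?op=CREATE&overwrite=true&user.name=joaquin"

    # Step 2: upload the local file to the Location URL returned in step 1.
    curl -i -X PUT -T input.csv "DATANODE_URL_FROM_STEP_1_LOCATION_HEADER"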



Sent from my iPhone


On 25 Jul 2017, at 9:39 AM, Saisai Shao <sai.sai.s...@gmail.com> wrote:

I think you have to make this CSV file accessible from the Spark cluster; 
putting it on HDFS is one possible solution. 
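
If the machine that holds the file also has a configured Hadoop client, a plain 
hdfs dfs -put is the simplest way to do that; otherwise the WebHDFS (or HttpFS) 
route above works from any host that has curl. The target path here is only an 
example:

    # Copy the local CSV into HDFS (needs the hadoop client and cluster config locally).
    hdfs dfs -put -f input.csv /user/joaquin/input.csv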

 

On Tue, Jul 25, 2017 at 1:26 AM, Joaquín Silva <joaq...@akainix.com> 
wrote:

Hello,

 

I'm building a Bash program (using curl) that should run Spark code remotely 
using Livy. One of the code's arguments is a CSV file; how can I make Spark 
read this file? The file is going to be on the client side, not on the Spark 
cluster machines.

 

Regards,

 

Joaquín Silva
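
Putting the pieces above together, one possible shape for the curl call in that 
Bash script is a POST to Livy's /batches endpoint, passing the uploaded CSV's 
HDFS path as a job argument. The jar name, class name, and paths are hypothetical 
placeholders; only the endpoint, default port 8998, and the JSON fields come from 
Livy's batch API:

    # Submit the Spark job through Livy, handing it the HDFS path of the uploaded CSV.
    curl -s -X POST -H "Content-Type: application/json" \
         -d '{
               "file": "hdfs:///user/joaquin/jobs/csv-job.jar",
               "className": "com.example.CsvJob",
               "args": ["hdfs:///user/joaquin/input.csv"]
             }' \
         "http://LIVY_HOST:8998/batches"

    # The response includes a batch id; its progress can then be polled with:
    #   curl -s "http://LIVY_HOST:8998/batches/<id>/state"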

 

 


