Hi,
is there any way to write custom attributes into an Avro file schema?
A = load 'data' using PigStorage(',') AS (b1:int,b2:int,b3:bytearray);
STORE A INTO 'testOutput'
USING org.apache.pig.piggybank.storage.avro.AvroStorage(
'schema',
' {"type":"record","name":"X",
"fields":[{"name":"b1","ty
You could write a streaming function that takes a variable number of inputs.
On Wed, Jan 6, 2016 at 1:11 PM, Sunilmanohar Kancharlapalli -X (sunkanch -
ZENSAR TECHNOLOGIES INC at Cisco) wrote:
> Hi All,
>
>
>
> I am trying to automate the loading process using Pig from a CSV file
> using piggyba
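As a sketch of the streaming approach (the script name and its logic are hypothetical, not from this thread): DEFINE a stream command and pipe each row through it, so the external script sees every field on stdin regardless of how many columns the CSV has:

```pig
-- hypothetical external script: reads tab-separated rows on stdin,
-- transforms every field, writes rows back on stdout
DEFINE clean `clean_columns.py` SHIP ('clean_columns.py');

B = LOAD 'path/to/input/file'
    USING org.apache.pig.piggybank.storage.CSVExcelStorage();
C = STREAM B THROUGH clean;
STORE C INTO 'path/to/output' USING PigStorage(',');
```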
Hi All,
I am trying to automate the loading process using Pig from a CSV file using
piggybank's CSVExcelStorage.
Just before the store, I need a UDF applied to each and every column of that file.
Right now, I am using
B = load 'path/to/input/file' using
org.apache.pig.pi
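If streaming feels heavy, the usual Pig-only route is a FOREACH that applies the UDF to every column explicitly, since Pig has no built-in "apply to all columns" shorthand. Here `myudfs.Clean` and the three-column schema are placeholders, not from the original message:

```pig
REGISTER myudfs.jar;  -- hypothetical jar containing the UDF

B = LOAD 'path/to/input/file'
    USING org.apache.pig.piggybank.storage.CSVExcelStorage()
    AS (c1:chararray, c2:chararray, c3:chararray);

-- apply the UDF column by column before storing
C = FOREACH B GENERATE myudfs.Clean(c1) AS c1,
                       myudfs.Clean(c2) AS c2,
                       myudfs.Clean(c3) AS c3;
```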
Hi,
When I run a Pig job on a Kerberos secured cluster it uses the tickets
obtained from the kinit I did just before starting the job.
In some cases the job will run longer than the maximum renewable lifetime of
the Kerberos tickets.
In other Yarn applications (like Apache Flink) I can login using
Hello
I tried reading a text file from HDFS into Pig using the statement

student = LOAD 'hdfs://localhost:8020/tmp/student.txt' USING PigStorage(',')
    AS (id:int, firstname:chararray, lastname:chararray, age:int,
        phone:chararray, city:chararray);
My student.txt file is
001,Raji