python setup.py build is giving this error:
Packaging Java classes
sh: 1: jar: not found
error: Error packaging java component. Command: jar -cf
build/lib.linux-i686-2.7/pydoop/pydoop_1_1_2.jar -C
build/temp.linux-i686-2.7/pipes-1.1.2 ./it
On Sat, Dec 7, 2013 at 12:00 PM, Nitin Pawar wrote:
> Can you share the error?
Can you share the error?
On Dec 7, 2013 8:49 AM, "Haider" wrote:
> Hi All
>
> Thanks for your suggestions.
> But in my case I have thousands of small files and I want to read them one by
> one. I think it is only possible by using listdir().
> As per Nitin's comment I tried to install Pydoop but it is throwing me some strange error.
Hi All
Thanks for your suggestions.
But in my case I have thousands of small files and I want to read them one by
one. I think it is only possible by using listdir().
As per Nitin's comment I tried to install Pydoop but it is throwing me some
strange error, and I am not finding any information on Pydoop.
Haider,
You can use TextLoader to read a file in HDFS line by line, and then you can
pass those lines to your Python UDF. Something like the following should work:
x = load '/tmp/my_file_on_hdfs' using TextLoader() as (line:chararray);
y = foreach x generate my_udf(line);
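If my_udf is written in Python, it also needs to be registered as a script UDF at the top of the Pig script. A minimal sketch, assuming the function lives in a file named my_udf.py (a hypothetical name) and declares its return type with Pig's @outputSchema decorator:
register 'my_udf.py' using jython as myfuncs;  -- hypothetical file name
x = load '/tmp/my_file_on_hdfs' using TextLoader() as (line:chararray);
y = foreach x generate myfuncs.my_udf(line);  -- call the registered Python UDF
dump y;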
Hi,
QUESTION:
Can anyone confirm whether HCatStorer works with a Hive table that was
declared with buckets?
DETAILS:
I have a table in Hive that was created with buckets, but when I tried to
load the data with HCatStorer it fails with the following
You can use the 'matches' keyword in Pig to get LIKE-style matching.
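For example, the query quoted below could be written roughly like this (a sketch, assuming the data is already loaded into a relation named temp with a column1 field):
-- 'matches' does full-string regex matching, so the patterns need .* around them
filtered = filter temp by (column1 matches '.*abc.*' or column1 matches '.*def.*' or column1 matches '.*aacc.*');
-- or, equivalently, with a single pattern:
-- filtered = filter temp by column1 matches '.*(abc|def|aacc).*';
dump filtered;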
On Dec 6, 2013, at 2:17 PM, Krishnan Narayanan wrote:
> Hi All
>
> How do I write the below query in Pig? Can I use REGEX_EXTRACT_ALL?
>
> select * from temp where column1 like ('%abc%' or '%def%' or '%aacc%');
>
> Help much appreciated with an example.
Hi All
How do I write the below query in Pig? Can I use REGEX_EXTRACT_ALL?
select * from temp where column1 like ('%abc%' or '%def%' or '%aacc%');
Help much appreciated with an example.
Thanks
Krishnan
Can you try with Hadoop 0.23.8 or 0.23.9?
-Rohini
On Mon, Dec 2, 2013 at 11:26 AM, Uttam Kumar wrote:
> Hi All,
>
> I am trying to run Pig 0.12 with Hadoop 0.23.1 and I am getting the following
> error message. Can someone please help and suggest what I am missing? I can
> run Pig in local mode without any issues.
Thanks :)
On Fri, Dec 6, 2013 at 4:36 PM, Serega Sheypak wrote:
> mapred.child.java.opts
> On 06.12.2013 at 20:32, "praveenesh kumar" wrote:
>
> > Hi all,
> >
> > I am using my custom-built class and my UDF is acting as a wrapper.
> > I need to pass a command-line argument to my UDF, which can be passed down
> > to the actual class that needs it.
mapred.child.java.opts
On 06.12.2013 at 20:32, "praveenesh kumar" wrote:
> Hi all,
>
> I am using my custom-built class and my UDF is acting as a wrapper.
> I need to pass a command-line argument to my UDF, which can be passed
> down to the actual class that needs it.
>
> One way would be to use the standard command-line passing method and pass
> the parameter via constructors down to the class that needs it.
Hi all,
I am using my custom-built class and my UDF is acting as a wrapper.
I need to pass a command-line argument to my UDF, which can be passed
down to the actual class that needs it.
One way would be to use the standard command-line passing method and pass
the parameter via constructors down to the class that needs it.
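A minimal sketch of that constructor approach, assuming a hypothetical wrapper UDF com.mycompany.MyWrapperUDF whose constructor takes a single string; the value comes in through Pig's -param substitution:
-- invoked as: pig -param my_arg=some_value script.pig
define MyWrapper com.mycompany.MyWrapperUDF('$my_arg');  -- hypothetical UDF class
a = load '/tmp/input' as (line:chararray);
b = foreach a generate MyWrapper(line);
-- alternatively, per the reply above, a JVM property can be set for the tasks:
-- set mapred.child.java.opts '-Dmy.key=some_value';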
Hey there
First, my environment: Hortonworks HDP2 (HBase 0.95.2.2.0.5.0-64, Pig 0.11.1).
I use Pig to load data from HBase, and then I get an exception:
java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
My script looks like this:
samples = LO
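For reference, a typical HBase load in Pig looks roughly like the sketch below (the table, column family, and column names are placeholders):
-- the HBase jars must be on Pig's classpath (register them if they are not)
samples = load 'hbase://my_table'
    using org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:col1 cf:col2', '-loadKey true')
    as (rowkey:chararray, col1:chararray, col2:chararray);
dump samples;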
Haider, you cannot use Python system-level functions on Hadoop directly.
You may want to take a look at the PyDoop project if you want those features.
On Fri, Dec 6, 2013 at 2:22 PM, shashwat shriparv wrote:
> I am not sure that this function can list out HDFS directories and files.
>
>
> On Fri, Dec 6, 2013 at 11:42 AM, Haider wrote:
I am not sure that this function can list out HDFS directories and files.
On Fri, Dec 6, 2013 at 11:42 AM, Haider wrote:
> listdir
Thanks & Regards
∞
Shashwat Shriparv