Issue with filtering an HBase table using SingleColumnValueFilter

2013-11-27 Thread samir das mohapatra
Dear developer, I am looking for a solution where I can apply the *SingleColumnValueFilter* to select only the value which I mention in the value parameter, and nothing other than the value which I pass. *Example:* SingleColumnValueFilter colValFilter = new
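For later readers of the archive, a hedged sketch of how that filter is typically completed (HBase 0.94-era API; the family, qualifier, and value names below are illustrative, not taken from the thread). The `setFilterIfMissing(true)` call is the piece that makes the scan return *only* matching rows, since without it rows that lack the column also pass through:

```java
// Sketch only: select rows whose column exactly equals the given value.
// "cf", "status", and "ACTIVE" are assumed names for illustration.
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class ExactValueScan {
    public static Scan buildScan() {
        SingleColumnValueFilter colValFilter = new SingleColumnValueFilter(
                Bytes.toBytes("cf"),       // column family (assumed)
                Bytes.toBytes("status"),   // qualifier (assumed)
                CompareOp.EQUAL,           // pass only exact matches
                Bytes.toBytes("ACTIVE"));  // the value to select
        // Drop rows that do not contain the column at all; otherwise
        // such rows would be emitted unfiltered.
        colValFilter.setFilterIfMissing(true);

        Scan scan = new Scan();
        scan.setFilter(colValFilter);
        return scan;
    }
}
```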

Did anyone work with HBase MapReduce with multiple tables as input?

2013-11-17 Thread samir das mohapatra
Dear Hadoop/HBase developers, did anyone work with HBase MapReduce with multiple tables as input? Any URL or example will help me a lot. Thanks in advance. Thanks, samir.
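For the archives: one approach that existed around this time (MultiTableInputFormat, added in HBase 0.94.5 as far as I recall) tags each Scan with the table it should read. A hedged sketch, with the table names and mapper class invented for illustration:

```java
// Sketch: one MapReduce job reading from two HBase tables.
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.mapreduce.Job;

public class MultiTableJob {
    public static void configure(Job job) throws Exception {
        List<Scan> scans = new ArrayList<Scan>();
        // "orders" and "customers" are assumed table names.
        for (String table : new String[] {"orders", "customers"}) {
            Scan scan = new Scan();
            // Each Scan carries the name of the table it should read from.
            scan.setAttribute(Scan.SCAN_ATTRIBUTES_TABLE_NAME, Bytes.toBytes(table));
            scans.add(scan);
        }
        // MyTableMapper is a placeholder for your own TableMapper subclass;
        // it sees rows from both tables in one job.
        TableMapReduceUtil.initTableMapperJob(scans, MyTableMapper.class,
                Text.class, Result.class, job);
    }
}
```

Verify the class and method availability against your HBase release; older 0.92/0.94 builds may need the patch from HBASE-3996.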

Re: Getting error from sqoop2 command

2013-10-08 Thread samir das mohapatra
connector --all *Exception has occurred during processing command * *Exception: com.sun.jersey.api.client.ClientHandlerException Message: java.net.ConnectException: Connection refused* Regards, samir. On Tue, Oct 8, 2013 at 12:16 PM, samir das mohapatra samir.help...@gmail.com wrote: Dear Sqoop

Facing issue using Sqoop2

2013-10-08 Thread samir das mohapatra
Dear All, I am getting an error like the one mentioned below; did anyone get this from Sqoop2? Error: sqoop:000 set server --host hostname1 --port 8050 --webapp sqoop Server is set successfully sqoop:000 show server -all Server host: hostname1 Server port: 8050 Server webapp: sqoop sqoop:000 show version --all

How to use a Sqoop command without a hardcoded password

2013-10-04 Thread samir das mohapatra
Dear Hadoop/Sqoop users, is there any way to call a Sqoop command without hardcoding the password for the specific RDBMS? If we hardcode the password, it will be a huge security issue. Regards, samir.

How to ignore empty files coming out of a Hive map-side join

2013-09-13 Thread samir das mohapatra
Dear Hive/Hadoop developers, I was running a Hive map-side join, and along with the output data I could see some empty files from the map stage. Why is that, and how do I ignore these files? Regards, samir.

Why am I not able to query while inserting data into Hive?

2013-07-16 Thread samir das mohapatra
Dear All, did anyone face this issue: while loading a huge dataset into a Hive table, Hive restricts me from querying the same table. I have set hive.support.concurrency=true, but it still shows "conflicting lock present for TABLENAME mode SHARED". <property> <name>hive.support.concurrency</name>

Error while processing a SequenceFile with LZO compression in a Hive external table (CDH4.3)

2013-06-19 Thread samir das mohapatra
Dear All, has anyone faced this type of issue? I am getting some errors while processing a SequenceFile with LZO compression in a Hive query in the CDH4.3.x distribution. Error logs: SET hive.exec.compress.output=true; SET

How to get the intermediate mapper output file name

2013-06-03 Thread samir das mohapatra
Hi all, how can I get the mapper output filename inside the mapper, or how can I change the mapper output file name? By default it looks like part-m-00000, part-m-00001, etc. Regards, samir.
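Two hedged answers that fit this question (API details are from the Hadoop 2.x era; verify against your version). The task index that ends up in part-m-NNNNN can be reconstructed from the task ID inside the mapper, and the "part" prefix is configurable:

```java
// Sketch: recover the default output name, and note how to change its prefix.
import java.util.Locale;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class NameAwareMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void setup(Context context) {
        // (a) Reconstruct the default file name this map task will write.
        int idx = context.getTaskAttemptID().getTaskID().getId();
        String fileName = String.format(Locale.ROOT, "part-m-%05d", idx);
        // (b) To change the prefix: setting mapreduce.output.basename in the
        // job configuration (honored by Hadoop 2.x FileOutputFormat, to the
        // best of my knowledge) yields e.g. myname-m-00000 instead of
        // part-m-00000.
    }
}
```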

Pulling data from secured hadoop cluster to another hadoop cluster

2013-05-28 Thread samir das mohapatra
Hi All, I am able to connect to the (source) Hadoop cluster after SSH is established. But I wanted to know: if I want to pull some data using distcp from the secured source Hadoop box to another Hadoop cluster, I am not able to ping the NameNode machine. In this approach, how to run distcp

Re: Pulling data from secured hadoop cluster to another hadoop cluster

2013-05-28 Thread samir das mohapatra
able to see the hdfs on server1 from server2? On Tue, May 28, 2013 at 5:17 PM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, I could able to connect the hadoop (source ) cluster after ssh is established. But i wanted to know, If I want to pull some data using distcp from

Issue with data Copy from CDH3 to CDH4

2013-05-24 Thread samir das mohapatra
Hi all, we tried to pull data from an upstream cluster running CDH3 to a downstream system running CDH4, using *distcp* to copy the data; it was throwing an exception because of the version issue. I wanted to know whether there is any solution to pull the data from CDH3 to CDH4

Re: how to copy a table from one hbase cluster to another cluster?

2013-03-20 Thread samir das mohapatra
/ops_mgt.html#copytable What kind of help do you need? JM 2013/3/20 samir das mohapatra samir.help...@gmail.com: Hi All, Can you help me to copy one hbase table to another cluster hbase (Table copy) . Regards, samir

Re: how to copy a table from one hbase cluster to another cluster?

2013-03-20 Thread samir das mohapatra
simply open the org.apache.hadoop.hbase.mapreduce.CopyTable, look into it, and do almost the same thing for your needs? JM 2013/3/20 samir das mohapatra samir.help...@gmail.com: Thanks, for reply I need to copy the hbase table into another cluster through the java code. Any example

Re: How to pull Delta data from one cluster to another cluster ?

2013-03-14 Thread samir das mohapatra
...@gmail.com wrote: Use distcp. Warm Regards, Tariq https://mtariq.jux.com/ cloudfront.blogspot.com On Thu, Mar 14, 2013 at 3:40 PM, samir das mohapatra samir.help...@gmail.com wrote: Regards, samir.

Re: How to pull Delta data from one cluster to another cluster ?

2013-03-14 Thread samir das mohapatra
. samir das mohapatra samir.help...@gmail.com wrote: how to pull delta data, that means filtered data, not the whole data. As of now I know we can copy the whole data through distcp; could you please help if I am wrong, or is there any other way to pull efficiently? Like: get data based on a filter condition

Is there any way to get information from Hbase once some record get updated?

2013-03-14 Thread samir das mohapatra
Hi All, is there any way to get information from HBase once some record gets updated, like a database trigger? Regards, samir.
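For later readers: the closest HBase analogue of a database trigger is a coprocessor (available from HBase 0.92 onward). A hedged sketch of a RegionObserver that fires after every put; the 0.94-style postPut signature here is from memory, so check it against your release, and the class name is made up:

```java
// Sketch: a "trigger"-like observer that runs after each write commits.
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.coprocessor.BaseRegionObserver;
import org.apache.hadoop.hbase.coprocessor.ObserverContext;
import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
import org.apache.hadoop.hbase.regionserver.wal.WALEdit;

public class UpdateNotifier extends BaseRegionObserver {
    @Override
    public void postPut(ObserverContext<RegionCoprocessorEnvironment> ctx,
                        Put put, WALEdit edit, boolean writeToWAL) {
        // Runs on the region server after the write has been applied --
        // the "trigger body". Keep it cheap; it sits on the write path.
        System.out.println("row updated: " + put);
    }
}
```

The observer is registered on the table or cluster side (e.g. via the hbase.coprocessor.region.classes property), not in client code.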

Re: How to shuffle (Key,Value) pair from mapper to multiple reducer

2013-03-13 Thread samir das mohapatra
You can use a custom Partitioner for the same. Regards, Samir. On Wed, Mar 13, 2013 at 2:29 PM, Vikas Jadhav vikascjadha...@gmail.com wrote: Hi I am specifying requirement again with example. I have use case where i need to shufffle same (key,value) pair to multiple reducers For
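A partitioner by itself routes each record to exactly one reducer, so to land the same (key,value) pair on several reducers the usual trick behind this advice is to emit the pair once per target reducer with a tag, and route on the tag. A self-contained sketch of the routing logic (class and key format invented for illustration; in a real job this would live in a class extending org.apache.hadoop.mapreduce.Partitioner, registered with job.setPartitionerClass):

```java
public class TaggedKeyRouter {
    // The mapper would emit ("0#key", value), ("1#key", value), ... once per
    // target reducer; this method routes each copy by its numeric tag.
    public static int getPartition(String taggedKey, int numPartitions) {
        int sep = taggedKey.indexOf('#');
        int tag = Integer.parseInt(taggedKey.substring(0, sep));
        return tag % numPartitions;
    }

    public static void main(String[] args) {
        // The copy tagged 3 goes to reducer 3 (of 4).
        System.out.println(getPartition("3#click", 4));
    }
}
```

In the reducer, the tag can be stripped off the key again before processing.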

Why is Hadoop spawning two maps over a file of size 1.5 KB?

2013-03-12 Thread samir das mohapatra
Hi All, I have a very fundamental doubt: I have a file of size 1.5 KB and the block size is the default block size, but I could see two mappers got created during the job. Could you please help me get the whole picture of why that is? Regards, samir.

Re: How can I record some position of context in Reduce()?

2013-03-12 Thread samir das mohapatra
Through the RecordReader and FileStatus you can get it. On Tue, Mar 12, 2013 at 4:08 PM, Roth Effy effyr...@gmail.com wrote: Hi,everyone, I want to join the k-v pairs in Reduce(),but how to get the record position? Now,what I thought is to save the context status,but class Context doesn't

Re: Hadoop cluster hangs on big hive job

2013-03-10 Thread samir das mohapatra
The problem I can see in your log file is: no available free map slot for the job. I think you have to increase the block size to reduce the number of maps, because you are passing big data as input. The ideal approach is to first increase 1) the block size, 2) the map-side sort buffer, 3) JVM re-use, etc.
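The tuning knobs listed in that reply map onto configuration properties roughly as follows (MRv1-era property names; the values are illustrative placeholders to adapt, not recommendations):

```xml
<!-- hdfs-site.xml: larger blocks mean fewer, bigger map tasks -->
<property>
  <name>dfs.block.size</name>
  <value>268435456</value> <!-- 256 MB, illustrative -->
</property>

<!-- mapred-site.xml -->
<property>
  <name>io.sort.mb</name> <!-- map-side sort buffer, in MB -->
  <value>256</value>
</property>
<property>
  <name>mapred.job.reuse.jvm.num.tasks</name> <!-- -1 = unlimited re-use -->
  <value>-1</value>
</property>
```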

Re: Need help optimizing reducer

2013-03-04 Thread samir das mohapatra
Austin, I think you have to use a partitioner to spawn more than one reducer for a small data set. The default partitioner will allow you only one reducer; you have to override it and implement your own logic to spawn more than one reducer. On Tue, Mar 5, 2013 at 1:27 AM, Austin Chungath

Re: Issue with sqoop and HANA/ANY DB Schema name

2013-03-01 Thread samir das mohapatra
Any help... On Fri, Mar 1, 2013 at 12:06 PM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, I am facing one problem , how to specify the schema name before the table while executing the sqoop import statement. $ sqoop import --connect jdbc:sap://host:port/db_name --driver

Re: Issue in Datanode (using CDH4.1.2)

2013-02-28 Thread samir das mohapatra
A few more things: the same setup was working on an Ubuntu machine (dev cluster), and it is only failing under CentOS 6.3 (prod cluster). On Thu, Feb 28, 2013 at 9:06 PM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, I am facing on strange issue, That is In a cluster having 1k machine i could

Issue with sqoop and HANA/ANY DB Schema name

2013-02-28 Thread samir das mohapatra
Hi All, I am facing one problem: how to specify the schema name before the table while executing the sqoop import statement. $ sqoop import --connect jdbc:sap://host:port/db_name --driver com.sap.db.jdbc.Driver --table SchemaName.Test -m 1 --username --password

How to use sqoop import

2013-02-28 Thread samir das mohapatra
Hi All, can anyone share an example of how to run a Sqoop import of the results of a SQL statement? For example: sqoop import --connect jdbc:. --driver xxx After this, if I specify a --query select statement, it is not even recognized as a valid Sqoop statement.

How to take Whole Database From RDBMS to HDFS Instead of Table/Table

2013-02-27 Thread samir das mohapatra
Hi All, using Sqoop, how can I take an entire database into HDFS instead of table by table? How did you guys do it? Is there some trick? Regards, samir.

Re: How to take Whole Database From RDBMS to HDFS Instead of Table/Table

2013-02-27 Thread samir das mohapatra
thanks all. On Wed, Feb 27, 2013 at 4:41 PM, Jagat Singh jagatsi...@gmail.com wrote: You might want to read this http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_literal_sqoop_import_all_tables_literal On Wed, Feb 27, 2013 at 10:09 PM, samir das mohapatra samir.help

Re: How to take Whole Database From RDBMS to HDFS Instead of Table/Table

2013-02-27 Thread samir das mohapatra
... Sent from a remote device. Please excuse any typos... Mike Segel On Feb 27, 2013, at 5:15 AM, samir das mohapatra samir.help...@gmail.com wrote: thanks all. On Wed, Feb 27, 2013 at 4:41 PM, Jagat Singh jagatsi...@gmail.com wrote: You might want to read this http://sqoop.apache.org

Fwd: ISSUE IN CDH4.1.2 : transfer data between different HDFS clusters.(using distch)

2013-02-25 Thread samir das mohapatra
-- Forwarded message -- From: samir das mohapatra samir.help...@gmail.com Date: Mon, Feb 25, 2013 at 3:05 PM Subject: ISSUE IN CDH4.1.2 : transfer data between different HDFS clusters.(using distch) To: cdh-u...@cloudera.org Hi All, I am getting the below error; can anyone help

Re: ISSUE IN CDH4.1.2 : transfer data between different HDFS clusters.(using distch)

2013-02-25 Thread samir das mohapatra
yes On Mon, Feb 25, 2013 at 3:30 PM, Nitin Pawar nitinpawar...@gmail.comwrote: does this match with your issue https://groups.google.com/a/cloudera.org/forum/#!topic/cdh-user/kIPOvrFaQE8 On Mon, Feb 25, 2013 at 3:20 PM, samir das mohapatra samir.help...@gmail.com wrote

Re: ISSUE IN CDH4.1.2 : transfer data between different HDFS clusters.(using distch)

2013-02-25 Thread samir das mohapatra
I am using CDH4.1.2 with MRv1 not YARN. On Mon, Feb 25, 2013 at 3:47 PM, samir das mohapatra samir.help...@gmail.com wrote: yes On Mon, Feb 25, 2013 at 3:30 PM, Nitin Pawar nitinpawar...@gmail.comwrote: does this match with your issue https://groups.google.com/a/cloudera.org/forum

Re: ISSUE :Hadoop with HANA using sqoop

2013-02-21 Thread samir das mohapatra
more like a SAP side fault than a Hadoop side one and you should ask on their forums with the stacktrace posted. On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra samir.help...@gmail.com wrote: Hi All Can you plese tell me why I am getting error while loading data from SAP HANA

Fwd: Delivery Status Notification (Failure)

2013-02-12 Thread samir das mohapatra
Hi All, I wanted to know how to connect Hive (Hadoop CDH4 distribution) with MicroStrategy. Any help is very helpful; waiting for your response. Note: it is a little bit urgent. Does anyone have experience with that? Thanks, samir

Re: Hive Metastore DB Issue ( Cloudera CDH4.1.2 MRv1 with hive-0.9.0-cdh4.1.2)

2013-02-07 Thread samir das mohapatra
mailing list and do not copy this to hdfs-user. On Thu, Feb 7, 2013 at 7:20 AM, samir das mohapatra samir.help...@gmail.com wrote: Any Suggestion... On Thu, Feb 7, 2013 at 4:17 PM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, I could not see the hive meta

Why are all map tasks (Java custom MapReduce program) assigned to one node?

2013-01-31 Thread samir das mohapatra
Hi All, I am using CDH4 with MRv1. When I am running any Hadoop MapReduce program from Java, all the map tasks are assigned to one node. It is supposed to distribute the map tasks among the cluster's nodes. Note: 1) my JobTracker web UI is showing 500 nodes, 2) when it is coming to

How to Integrate MicroStrategy with Hadoop

2013-01-30 Thread samir das mohapatra
Hi All, I wanted to know how to connect Hadoop with MicroStrategy. Any help is very helpful; waiting for your response. Note: any URL and example will be really helpful for me. Thanks, samir

How to Integrate SAP HANA WITH Hadoop

2013-01-30 Thread samir das mohapatra
Hi all, we need connectivity between SAP HANA and Hadoop. If you have any experience with that, can you please share some documents and examples with me? It will be really helpful for me. thanks, samir

Re: How to Integrate MicroStrategy with Hadoop

2013-01-30 Thread samir das mohapatra
We are using Cloudera Hadoop. On Thu, Jan 31, 2013 at 2:12 AM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, I wanted to know how to connect HAdoop with MircoStrategy Any help is very helpfull. Witing for you response Note: Any Url and Example will be really help

Recommendation required for Right Hadoop Distribution (CDH OR HortonWork)

2013-01-30 Thread samir das mohapatra
Hi All, my company wants to choose the right distribution of Apache Hadoop for production as well as dev. Can anyone suggest which one will be good for the future? Hint: they want to know both the pros and cons. Regards, samir.

Re: What is the best way to load data from one cluster to another cluster (Urgent requirement)

2013-01-30 Thread samir das mohapatra
thanks all. On Thu, Jan 31, 2013 at 11:19 AM, Satbeer Lamba satbeer.la...@gmail.comwrote: I might be wrong but have you considered distcp? On Jan 31, 2013 11:15 AM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, Any one knows, how to load data from one hadoop cluster(CDH4

Re: Hadoop Nutch Mkdirs failed to create file

2013-01-24 Thread samir das mohapatra
Just try to apply $ chmod -R 755 /home/wj/apps/apache-nutch-1.6 and then try again. On Wed, Jan 23, 2013 at 9:23 PM, 吴靖 qhwj2...@126.com wrote: hi, everyone! I want use the nutch to crawl the web pages, but problem comes as the log like, I think it maybe some permissions problem,but i am

Re: different input/output formats

2012-05-30 Thread samir das mohapatra
, new Path(args[1])); JobClient.runJob(conf); return 0; } public static void main(String[] args) throws Exception { int exitCode = ToolRunner.run(new SortByNorm1(), args); System.exit(exitCode); } On Tue, May 29, 2012 at 1:55 PM, samir das mohapatra

Re: How to mapreduce in the scenario

2012-05-30 Thread samir das mohapatra
Yes, Hadoop is only for huge-dataset computation; it may not be good for small datasets. On Wed, May 30, 2012 at 6:53 AM, liuzhg liu...@cernet.com wrote: Hi, Mike, Nitin, Devaraj, Soumya, samir, Robert Thank you all for your suggestions. Actually, I want to know if hadoop has any advantage

Re: Small glitch with setting up two node cluster...only secondary node starts (datanode and namenode don't show up in jps)

2012-05-30 Thread samir das mohapatra
In your log details I could not find the NameNode starting; it is a problem with the NameNode itself. Harsh also suggested the same. On Sun, May 27, 2012 at 10:51 PM, Rohit Pandey rohitpandey...@gmail.com wrote: Hello Hadoop community, I have been trying to set up a double node Hadoop cluster

Re: Small glitch with setting up two node cluster...only secondary node starts (datanode and namenode don't show up in jps)

2012-05-30 Thread samir das mohapatra
*Step-wise details (Ubuntu 10.x version): go through it properly and run one step at a time; it will solve your problem (you can change the path, IP, and host name as you like).* - 1. Start the terminal

Re: different input/output formats

2012-05-30 Thread samir das mohapatra
])); SequenceFileOutputFormat.setOutputPath(conf, new Path(args[1])); JobClient.runJob(conf); return 0; } } On Wed, May 30, 2012 at 6:57 PM, samir das mohapatra samir.help...@gmail.com wrote: PFA. On Wed, May 30, 2012 at 2:45 AM, Mark question markq2...@gmail.comwrote: Hi Samir, can you email me your main

Re: How to Integrate LDAP in Hadoop ?

2012-05-29 Thread samir das mohapatra
, 2012, at 7:40 AM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, Did any one work on hadoop with LDAP integration. Please help me for same. Thanks samir

Re: How to mapreduce in the scenario

2012-05-29 Thread samir das mohapatra
Yes, it is possible by using MultipleInputs to feed multiple mappers (basically two different mappers). Step 1: MultipleInputs.addInputPath(conf, new Path(args[0]), TextInputFormat.class, *Mapper1.class*); MultipleInputs.addInputPath(conf, new Path(args[1]), TextInputFormat.class, *Mapper2.class*);
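Fleshed out, a driver following that advice might look like the old-API sketch below (Mapper1, Mapper2, and JoinReducer are placeholders for the poster's own classes; both mappers must emit the same key/value types so one reducer can merge the two streams):

```java
// Sketch: one job, two input paths, two mappers, one reducer (old mapred API).
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.lib.MultipleInputs;

public class TwoSourceJob {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(TwoSourceJob.class);
        // Each input path gets its own mapper class.
        MultipleInputs.addInputPath(conf, new Path(args[0]), TextInputFormat.class, Mapper1.class);
        MultipleInputs.addInputPath(conf, new Path(args[1]), TextInputFormat.class, Mapper2.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);
        // The single reducer sees values from both sources under each key.
        conf.setReducerClass(JoinReducer.class);
        FileOutputFormat.setOutputPath(conf, new Path(args[2]));
        JobClient.runJob(conf);
    }
}
```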

Re: different input/output formats

2012-05-29 Thread samir das mohapatra
Hi Mark, public void map(LongWritable offset, Text val, OutputCollector<FloatWritable, Text> output, Reporter reporter) throws IOException { output.collect(new FloatWritable(*1*), val); *// change 1 to 1.0f and it will work.* } Let me know the status after the change. On Wed, May

Re: different input/output formats

2012-05-29 Thread samir das mohapatra
Hi Mark, see the output for that same application; I am not getting any error. On Wed, May 30, 2012 at 1:27 AM, Mark question markq2...@gmail.com wrote: Hi guys, this is a very simple program, trying to use TextInputFormat and SequenceFileoutputFormat. Should be easy but I get the

How to configure the application for an external jar

2012-05-26 Thread samir das mohapatra
Hi All, how do I configure an external jar which is used by the application internally, for example JDBC, the Hive driver, etc.? Note: I don't have permission to start and stop the Hadoop machines, so I need to configure it at the application level (not the Hadoop level). If we will put

Re: Right way to implement MR ?

2012-05-24 Thread samir das mohapatra
://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/mapreduce/lib/input/MultipleInputs.html . On Thu, May 24, 2012 at 1:17 AM, samir das mohapatra samir.help...@gmail.com wrote: Hi All, How to compare to input file In M/R Job. let A Log file around 30GB and B Log file

Re: RemoteException writing files

2012-05-19 Thread samir das mohapatra
Hi, this could be due to the following reasons: 1) the *NameNode http://wiki.apache.org/hadoop/NameNode* does not have any available DataNodes, 2) the NameNode is not able to start properly, 3) otherwise, some IP issue. Note: please mention localhost instead of 127.0.0.1 (if it is local).

Re: RemoteException writing files

2012-05-19 Thread samir das mohapatra
) Follow URL: http://wiki.apache.org/hadoop/FAQ#What_does_.22file_could_only_be_replicated_to_0_nodes.2C_instead_of_1.22_mean.3F Thanks samir On Sat, May 19, 2012 at 11:30 PM, samir das mohapatra samir.help...@gmail.com wrote: Hi This Could be due to the Following reason 1) The *NameNode http

Re: hadoop File loading

2012-05-15 Thread samir das mohapatra
Hi, your requirement is that your M/R job should use the full XML file while operating. (If that is right, then please use one of the approaches below.) You can put this XML file in the DistributedCache, which is shared across the M/R tasks, so that you get the whole XML instead of a chunk of the data. Thanks, Samir
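The DistributedCache approach described above, as an old-API sketch (the HDFS path and class names are made up for illustration; each task reads its cached copy from the local file system):

```java
// Sketch: ship one full copy of an XML file to every task node.
import java.net.URI;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

public class CacheSetup {
    public static void addLookupXml(JobConf conf) throws Exception {
        // The file is distributed once per node; every mapper sees the
        // whole file rather than an input split of it.
        DistributedCache.addCacheFile(new URI("/user/samir/lookup.xml"), conf);
    }
    // In the mapper's configure(JobConf conf):
    //   Path[] cached = DistributedCache.getLocalCacheFiles(conf);
    //   ... open cached[0] with ordinary java.io and parse the XML ...
}
```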

Re: Moving files from JBoss server to HDFS

2012-05-12 Thread samir das mohapatra
Hi financeturd financet...@yahoo.com, from my point of view the second setup, like below, is the good approach: {Separate server} -- {JBoss server} and then {Separate server} -- HDFS. thanks samir On Sat, May 12, 2012 at 6:00 AM, financeturd financeturd financet...@yahoo.com wrote: Hello, We

Re: java.io.IOException: Task process exit with nonzero status of 1

2012-05-11 Thread samir das mohapatra
Hi Mohit, 1) Hadoop is more portable on Linux, Ubuntu, or any non-DOS file system, but you are running Hadoop on Windows; that could be the problem, because Hadoop will generate some partial output files for temporary use. 2) Another thing is that you are running Hadoop version 0.19; I think