Fwd: Quest Data Connector for Oracle throws error.

2015-06-26 Thread Kumar Jayapal
Hello All, I have installed the Quest Data Connector for Oracle, but it throws an error while importing data using Sqoop. I am able to import the same data from Oracle when I disable the Quest Data Connector. I have copied the debug logs. I didn't find any blog post about resolving this issue. Please let me know

Re: how to assign unique ID (Long Value) in mapper

2015-06-26 Thread Ravikant Dindokar
Yes, there can be loops in the graph. On Fri, Jun 26, 2015 at 9:09 AM, Harshit Mathur mathursh...@gmail.com wrote: Are there loops in your graph? On Thu, Jun 25, 2015 at 10:39 PM, Ravikant Dindokar ravikant.i...@gmail.com wrote: Hi Hadoop user, I have a file containing one line for each

Re: Only one reducer working

2015-06-26 Thread Ravikant Dindokar
Text file with tab-separated values. On Fri, Jun 26, 2015 at 7:35 PM, Krishna Kalyan krishnakaly...@gmail.com wrote: What is the file format? Thanks, Krishna On 26 Jun 2015 7:33 pm, Ravikant Dindokar ravikant.i...@gmail.com wrote: Hi Hadoop User, I am processing a 23 GB file on 21 nodes.

Re: Invalid key type in the map task

2015-06-26 Thread Dieter De Witte
You need to show the driver class as well. Are you using TextInputFormat? Are you aware that this standard input format takes as its value the complete line (up to the newline separator), and that the key in that case is the byte offset in the file, definitely not the number you assume it will be?
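Dieter's point can be demonstrated without Hadoop. Below is a minimal plain-Python sketch (the function name is made up for illustration) of the (key, value) pairs TextInputFormat would emit for a small file: each key is the byte offset of the line's first byte, not a line number.

```python
def text_input_records(data: bytes):
    """Yield (key, value) pairs the way TextInputFormat keys records:
    the key is the byte offset of each line's first byte."""
    offset = 0
    for line in data.splitlines(keepends=True):
        yield offset, line.rstrip(b"\n")
        offset += len(line)

pairs = list(text_input_records(b"alpha\nbravo\ncharlie\n"))
# keys are 0, 6, 12 (byte offsets), not line numbers 0, 1, 2
```

So a mapper that interprets the LongWritable key as a record number read from the file will see surprising values as soon as line lengths vary.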

Invalid key type in the map task

2015-06-26 Thread xeonmailinglist-gmail
Hi, I have this map class that accepts input files with a key of LongWritable and a value of Text. The input file is in [1]. Here we can see that it contains a key as a Long (I think) and bytes as the value. In [2] is my map class. The goal of the map class is to read the input data,

Only one reducer working

2015-06-26 Thread Ravikant Dindokar
Hi Hadoop User, I am processing a 23 GB file on 21 nodes. I have tried both options: mapreduce.job.reduces=50 and mapred.tasktracker.reduce.tasks.maximum=5 in mapred-site.xml, but still only one reducer is running. Is there any configuration setting still to be corrected? Thanks Ravikant

Re: Only one reducer working

2015-06-26 Thread Krishna Kalyan
What is the file format? Thanks, Krishna On 26 Jun 2015 7:33 pm, Ravikant Dindokar ravikant.i...@gmail.com wrote: Hi Hadoop User, I am processing a 23 GB file on 21 nodes. I have tried both options: mapreduce.job.reduces=50 mapred.tasktracker.reduce.tasks.maximum=5 in

HBASE Region server failing to start after Kerberos is enabled

2015-06-26 Thread Gangavarupu, Venkata - Contingent Worker
Hi All, the region servers are failing to start after Kerberos is enabled, with the error below. Hadoop 2.6.0, HBase 0.98.4. 2015-06-24 15:58:48,884 DEBUG [RS_OPEN_META-mdcthdpdas06lp:60020-0] regionserver.HRegion: Registered coprocessor service: region=hbase:meta,,1 service=AuthenticationService

Container isolation

2015-06-26 Thread Daniel Haviv
Hi, Is there some kind of security aspect to a container in terms of local filesystem access? Is it possible, for example, to chroot containers so they won't be able to read/write anywhere on the local FS except their own home dir? Thanks, Daniel
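(No reply in this digest, but for context: YARN has no chroot option as such; the usual approach is the LinuxContainerExecutor, which launches each container as the submitting user, so ordinary Unix permissions confine what a container can read or write. A sketch of the relevant yarn-site.xml properties, to be verified against your Hadoop release; the `hadoop` group value is an example:)

```xml
<!-- Run containers under the LinuxContainerExecutor so each container
     process runs as the submitting user, not the NodeManager user. -->
<property>
  <name>yarn.nodemanager.container-executor.class</name>
  <value>org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor</value>
</property>
<!-- Group that owns the setuid container-executor binary (example value). -->
<property>
  <name>yarn.nodemanager.linux-container-executor.group</name>
  <value>hadoop</value>
</property>
```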

Re: HBASE Region server failing to start after Kerberos is enabled

2015-06-26 Thread Ted Yu
Can you post the complete stack trace for 'Failed to get FileSystem instance'? What's the permission for /apps/hbase/staging? Looking at the commit log of SecureBulkLoadEndpoint.java, there have been a lot of bug fixes since 0.98.4. Please consider upgrading HBase. Cheers On Fri, Jun 26, 2015 at

Re: Only one reducer working

2015-06-26 Thread Ravikant Dindokar
Set the value directly in code: given a JobConf instance job, call job.setNumReduceTasks(100); this worked for me. Thanks On Fri, Jun 26, 2015 at 9:07 PM, Ravikant Dindokar ravikant.i...@gmail.com wrote: text file with tab separated values On Fri, Jun 26, 2015 at 7:35 PM, Krishna Kalyan
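To see why the reduce count matters: the default HashPartitioner routes each key to partition hash(key) mod numReduceTasks, so with the reduce count effectively at 1, every record lands in the same partition. A plain-Python sketch (not Hadoop code; the deterministic stand-in hash is made up so the example is reproducible):

```python
def partition(key: str, num_reduces: int) -> int:
    """Mimic Hadoop's default HashPartitioner: hash(key) mod #reducers.
    (Stand-in byte-sum hash for reproducibility, not Hadoop's real hash.)"""
    return sum(key.encode()) % num_reduces

keys = ["user1", "user2", "user3", "user4"]
with_one = {partition(k, 1) for k in keys}   # every key -> partition 0
with_fifty = {partition(k, 50) for k in keys}  # keys spread across partitions
```

With one reducer there is exactly one partition, no matter how the cluster is configured; raising the job's reduce count (as in setNumReduceTasks above) is what actually creates more partitions.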

Re: how to assign unique ID (Long Value) in mapper

2015-06-26 Thread Ravikant Dindokar
The problem can be thought of as assigning a line number to each line. Is there any built-in functionality in Hadoop which can do this? On Fri, Jun 26, 2015 at 1:11 PM, Ravikant Dindokar ravikant.i...@gmail.com wrote: yes, there can be loop in the graph On Fri, Jun 26, 2015 at 9:09 AM, Harshit

ResourceManager fails to start

2015-06-26 Thread Alexandru Pacurar
Hello, I'm running Hadoop 2.6 and I have encountered a problem with the ResourceManager. After a restart, the ResourceManager refuses to start with the following error: 2015-06-26 08:54:10,342 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:recover(796)) - Recovering attempt:

Re: how to assign unique ID (Long Value) in mapper

2015-06-26 Thread Shahab Yunus
I see 2 issues here which go against the architecture and idea of M/R (or distributed and parallel programming models). 1- The map and reduce tasks are supposed to be shared-nothing, independent tasks. If you add functionality like this, where you need to make sure that some data is
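One common workaround for the unique-ID question above (a sketch, not a built-in Hadoop feature): give each map task a disjoint ID range derived from its task index, so tasks can assign globally unique IDs without any coordination. Plain-Python illustration, with a made-up stride:

```python
STRIDE = 10_000_000  # assumed upper bound on records handled by one map task

def unique_id(task_index: int, local_counter: int) -> int:
    """Globally unique ID: task i owns the disjoint range
    [i * STRIDE, (i + 1) * STRIDE)."""
    assert 0 <= local_counter < STRIDE
    return task_index * STRIDE + local_counter

ids_task0 = [unique_id(0, i) for i in range(3)]  # 0, 1, 2
ids_task1 = [unique_id(1, i) for i in range(3)]  # 10000000, 10000001, 10000002
```

In a real mapper, the task index could come from the task attempt ID (e.g. context.getTaskAttemptID().getTaskID().getId() in the Java mapreduce API), and the local counter simply increments per record. The IDs are unique but not contiguous; true global line numbering would require a coordinated second pass.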