Re: Missing Snapshots for 2.5.0

2014-08-26 Thread Karthik Kambatla
There was an issue with the infrastructure. It is now fixed and the 2.5.0 artifacts are available. Mark, can you please retry now? Thanks, Karthik On Tue, Aug 26, 2014 at 6:54 AM, Karthik Kambatla wrote: > Thanks for reporting this, Mark. > > It appears the artifacts are published to > https:

Re: Running job issues

2014-08-26 Thread Blanca Hernandez
Hi, thanks for your answers. Sorry, I forgot to add it; I couldn't run the command either:
C:\development\tools\hadoop>%HADOOP_PREFIX%\bin\hdfs dfs -format
-format: Unknown command
C:\development\tools\hadoop>echo %HADOOP_PREFIX%
C:\development\tools\hadoop
By using the –help command there is no f
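The "Unknown command" error is because formatting is a subcommand of `hdfs namenode`, not `hdfs dfs`, so `hdfs dfs -format` can never work. A sketch of the likely intended command, assuming the standard layout from the Hadoop2OnWindows wiki page (this wipes any existing HDFS metadata):

```shell
%HADOOP_PREFIX%\bin\hdfs namenode -format
```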

Re: Appending to HDFS file

2014-08-26 Thread rab ra
Hello, here is the code snippet I use to append:
def outFile = "${outputFile}.txt"
Path pt = new Path("${hdfsName}/${dir}/${outFile}")
def fs = org.apache.hadoop.fs.FileSystem.get(configuration);
FSDataOutputStream fp = fs.create(pt, true)
fp << "${key} ${value}\n"
On 27 Aug 2014 09:46, "Stanley
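For comparison, HDFS also exposes append at the command line via `hdfs dfs -appendToFile`, which can help rule out configuration problems before debugging the Java API path. A hedged sketch against a running cluster (the paths are made up for illustration; requires append support, which is on by default in Hadoop 2.x):

```shell
# stage one record locally, then append it to an existing HDFS file
echo "key value" > /tmp/record.txt
hdfs dfs -appendToFile /tmp/record.txt /user/demo/out.txt
```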

Re: connecting hiveserver2 through ssh tunnel - time out

2014-08-26 Thread murat migdisoglu
Hello, I solved the problem by tunneling ports 1 and 8021. But besides that, the ugly odbc of the ugly windows uses the ugly registries to keep the url of the webhdfs. Thus regedit and change HKEY_LOCAL_MACHINE/SOFTWARE/ODBC/ODBC.INI/[systemdsn]/WEBHDFSHost to localhost. That did the t
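The tunneling step described above can be sketched with `ssh -L` port forwarding. The host name is a made-up placeholder; 10000 is HiveServer2's default port, and 8021 is taken from the message (not from your cluster, which may differ):

```shell
# forward local ports to HiveServer2 and port 8021 through a gateway host
ssh -L 10000:localhost:10000 -L 8021:localhost:8021 user@gateway.example.com
```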

Re: Hadoop on Safe Mode because Resources are low on NameNode

2014-08-26 Thread unmesha sreeveni
You can leave safe mode; see "Name node is in safe mode, how to leave": http://www.unmeshasreeveni.blogspot.in/2014/04/name-node-is-in-safe-mode-how-to-leave.html On Wed, Aug 27, 2014 at 9:38 AM, Stanley Shi wrote: > You can force the namenode to get out of safe mode: hadoop dfsadmin > -safemode leave >

Re: Appending to HDFS file

2014-08-26 Thread Stanley Shi
Would you please paste the code in the loop? On Sat, Aug 23, 2014 at 2:47 PM, rab ra wrote: > Hi > > By default, it is true in hadoop 2.4.1. Nevertheless, I have set it to > true explicitly in hdfs-site.xml. Still, I am not able to achieve append. > > Regards > On 23 Aug 2014 11:20, "Jagat Singh

Re: Local file system to access hdfs blocks

2014-08-26 Thread Stanley Shi
I am not sure this is what you want, but you can try this shell command: find [DATANODE_DIR] -name [blockname] On Tue, Aug 26, 2014 at 6:42 AM, Demai Ni wrote: > Hi, folks, > > New in this area. Hoping to get a couple of pointers. > > I am using Centos and have Hadoop set up using cdh5.1(Hadoop
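The `find` approach can be tried without a cluster; a self-contained sketch that simulates a datanode data directory and locates a block file by name (the directory layout and block name are made up for illustration):

```shell
# simulate a datanode data directory containing one block file
DN_DIR=$(mktemp -d)
mkdir -p "$DN_DIR/current/finalized/subdir0"
touch "$DN_DIR/current/finalized/subdir0/blk_1073741825"

# locate the block file by name, as suggested above
find "$DN_DIR" -name 'blk_1073741825'
```

On a real datanode, substitute the directory configured as `dfs.datanode.data.dir` and the block name reported by `hdfs fsck -files -blocks`.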

Re: Hadoop on Safe Mode because Resources are low on NameNode

2014-08-26 Thread Stanley Shi
You can force the namenode to get out of safe mode: hadoop dfsadmin -safemode leave On Tue, Aug 26, 2014 at 11:05 PM, Vincent Emonet wrote: > Hello, > > We have a 11 nodes Hadoop cluster installed from Hortonworks RPM doc: > > http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.9.1/bk_install

RE: winutils and security

2014-08-26 Thread John Lilley
One more follow up, in case someone stumbles across this in the future. From what we can tell, the Hadoop security initialization is very sensitive to startup order, and this has been confirmed by discussions with other people. The only thing that we've been able to make work at all reliably u

Re: Running job issues

2014-08-26 Thread Arpit Agarwal
> And the namenode does not even start: 14/08/26 12:01:09 WARN > namenode.FSNamesystem: Encountered exception loading fsimage > java.io.IOException: NameNode is not formatted. Have you formatted HDFS (step 3.4)? On Tue, Aug 26, 2014 at 3:08 AM, Blanca Hernandez < blanca.hernan...@willhaben.at>

Filter data set by applying many rules

2014-08-26 Thread Amit Mittal
Hi All, I have a data set in text CSV files, compressed using gzip. Each record has around 100 fields. I need to filter the data by applying various checks like "1. type of field", "2. nullable?", "3. min & max length", "4. value belongs to predefined list", "5. value subs
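Rules like these can be prototyped on a sample before writing the MapReduce job. A minimal awk sketch, assuming a two-field comma-delimited file where field 1 must be non-empty and field 2 must belong to a predefined list (the field positions, values, and file path are made-up examples):

```shell
# build a small gzipped CSV sample (id,color)
printf '1,red\n2,blue\n,green\n4,purple\n' | gzip > /tmp/sample.csv.gz

# keep only records where field 1 is non-empty and field 2 is in the allowed list
gzip -dc /tmp/sample.csv.gz | awk -F',' '
  BEGIN { allowed["red"]; allowed["blue"]; allowed["green"] }
  $1 != "" && ($2 in allowed)
'
```

In a MapReduce job the same per-record predicate would live in the mapper, which emits only records that pass every rule; gzipped text inputs are decompressed transparently by the framework, though each .gz file is not splittable.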

Re: Running job issues

2014-08-26 Thread Olivier Renault
Looking at the error message, it looks like your namenode is not formatted. On the namenode, could you run: hadoop namenode -format Hope it helps. Kind regards, Olivier On 26 Aug 2014 11:08, "Blanca Hernandez" wrote: > Hi! > > > > I have just installed hadoop in my windows x64 machine.l followd

Hadoop on Safe Mode because Resources are low on NameNode

2014-08-26 Thread Vincent Emonet
Hello, We have an 11-node Hadoop cluster installed from the Hortonworks RPM doc: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.9.1/bk_installing_manually_book/content/rpm-chap1.html The cluster was working fine until it went into Safe Mode during the execution of a job, with this message on the
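The recovery suggested in the replies boils down to the `dfsadmin` safe-mode controls; a short command sketch for reference (cluster-dependent, shown with the 2.x `hdfs` entry point):

```shell
# check whether the namenode is in safe mode
hdfs dfsadmin -safemode get

# force it to leave; for "resources low" safe mode, free disk space on the
# namenode's metadata volume first, or the namenode may re-enter safe mode
hdfs dfsadmin -safemode leave
```

Note that "resources low" safe mode is triggered when free space on a namenode storage directory falls below a reserved threshold, so leaving safe mode without freeing space only treats the symptom.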

Re: Missing Snapshots for 2.5.0

2014-08-26 Thread Karthik Kambatla
Thanks for reporting this, Mark. It appears the artifacts are published to https://repository.apache.org/content/repositories/releases/org/apache/hadoop/hadoop-common/2.5.0/, but haven't propagated to http://central.maven.org/maven2/org/apache/hadoop/hadoop-common/ I am following up on this, and

Running job issues

2014-08-26 Thread Blanca Hernandez
Hi! I have just installed hadoop on my Windows x64 machine. I carefully followed the instructions at https://wiki.apache.org/hadoop/Hadoop2OnWindows but at points 3.5 and 3.6 I have some problems I cannot handle. %HADOO

Re: connecting hiveserver2 through ssh tunnel - time out

2014-08-26 Thread Kadir Sert
http://community.jaspersoft.com/jaspersoft-aws/connect-emr I think that's because of the Hive version. 2014-08-26 12:13 GMT+03:00 murat migdisoglu : > Hi, thx for your answer. I don't think that ssh tunnel is the issue(btw, why > port 10004 and not 1?) > > > Does ODBC driver connects to any other

Re: Hadoop InputFormat - Processing large number of small files

2014-08-26 Thread rab ra
Hi, Is it not a good idea to model the key as Text type? I have a large number of sequence files that have a bunch of key-value pairs. I will read these seq files inside the map. Hence my map needs only filenames. I believe, with CombineFileInputFormat, the map will run on nodes where the data is already ava

Re: connecting hiveserver2 through ssh tunnel - time out

2014-08-26 Thread murat migdisoglu
Hi, thanks for your answer. I don't think the ssh tunnel is the issue (btw, why port 10004 and not 1?) Does the ODBC driver connect to any other port/service (Hive metastore, etc.)? On Tue, Aug 26, 2014 at 9:55 AM, Kadir Sert wrote: > Hi, > > could you please try, > > ssh -o ServerAliveInterval=1