Hello and my first request for help: courses.

2024-03-06 Thread SysAdm CID
Hi, I hope this question is appropriate for this forum. If not, please advise. Total beginner here. I just finished a free introductory Hadoop course at a very popular online school; it was very instructive and got me started. But it was based on a very old version of Hadoop (2.6.0), and

Re: hadoop-hdfs-native-client Help

2021-09-13 Thread Masatake Iwasaki
does, however, include tools.jar which seems odd to me. -Original Message- From: Paula Logan To: iwasak...@oss.nttdata.co.jp Sent: Mon, Sep 13, 2021 9:08 am Subject: Re: hadoop-hdfs-native-client Help Thank you for taking the time to do a check.  RHEL 8.x must have some incompatibil

Re: hadoop-hdfs-native-client Help

2021-09-12 Thread Masatake Iwasaki
doop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml [ERROR] -> [Help 1] This RuntimeException error also appears for Native Test Case 2 but that test case doesn't fail. Also, see a lot of "File not found" messages.  Assume at this point th

Re: hadoop-hdfs-native-client Help

2021-09-11 Thread Paula Logan
-hdfs-native-client Help Hi Paula, I am not sure how to answer your questions, but is there a reason why you are using an EC2 instance instead of an Amazon EMR (Elastic MapReduce) Hadoop cluster? As far as I know you can set that up to work with HDFS setup as well as S3 buckets if you don’t n

RE: hadoop-hdfs-native-client Help

2021-09-10 Thread Jonathan Aquilina
stay online. Regards, Jonathan From: Paula Logan Sent: 10 September 2021 16:13 To: user@hadoop.apache.org Subject: hadoop-hdfs-native-client Help Hello, I am new to building Hadoop locally, and am having some issues. Please let me know if this information should be sent to a different distro

hadoop-hdfs-native-client Help

2021-09-10 Thread Paula Logan
hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 8[ERROR] around Ant part .. @ 6:150 in /home/ec2-user/workspaces/hadoop-3.3.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/antrun/build-main.xml[ERROR] -> [Help 1] This RuntimeException error also appe

Need Help - java.io.EOFException happened when visiting remote HDFS cluster from windows10 os

2021-06-25 Thread wind.fly....@outlook.com
org.apache.hadoop hadoop-client ${hadoop.version} Besides, my env info are: os: windows 10 jdk: 1.8 hadoop version: 3.0.0 remote cluster hadoop version: 3.0.0 From the stacktrace info we can see exception occured when reading the response inputStream's first four bytes, but why? This problem has been bothering me for several days, sincerely hope to get your help! Best, Junbao Zhang

Need help with a compilation issue

2021-06-09 Thread Sharan T
-Dtar I seem to be running into the below CMake error. Can someone please help? [INFO] --- hadoop-maven-plugins:3.2.2:cmake-compile (cmake-compile) @ hadoop-common --- [INFO] Running cmake /root/hadoop/hadoop-3.2.2-src/hadoop-common-project/hadoop-common/src -DGENERATED_JAVAH=/root/hadoop/hado

need help diagnosing errors

2019-10-21 Thread Manuel Sopena Ballesteros
xplanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] For more detailed output, check the application tracking page: http://gl-hdp-ctrl03-mlx.mlx:8088/cluster/app/application_1570749574365_0050 Then click on links to logs of each attempt. Could someone please help me to u

[NEED HELP] Hadoop 3.x Production Deployment Can be Publicly Talk About?

2019-09-24 Thread Wangda Tan
Hi devs and users, Tomorrow (sorry for the short notice) we will do a presentation at Strata Data Conf @ NY for a community update of Hadoop 3.x. I'm thinking of creating a slide about existing production deployments on Hadoop 3.x. Basically, I want to put up a logo wall with a list of big names so we c

Fwd: help on a nodemanager API

2018-02-09 Thread Mohan
Hi folks, I see that the node manager API ws/v1/node/apps returns the list of apps that ran on it. I referred to https://hadoop.apache.org/docs/r2.6.0/hadoop-yarn/hadoop-yarn-site/NodeManagerRest.html#Applications_API The response as per the page looks like below, which has *the complete

RE: Help me understand hadoop caching behavior

2017-12-27 Thread Frank Luo
First, Hadoop itself doesn’t have any caching. Secondly, if it is a mapper only job, then the data doesn’t go through the network. So look at somewhere else 😉 From: Avery, John [mailto:jav...@akamai.com] Sent: Wednesday, December 27, 2017 3:20 PM To: user@hadoop.apache.org Subject: Help me

Re: Help me understand hadoop caching behavior

2017-12-27 Thread Avery, John
Nevermind. I found my stupid mistake. I didn’t reset a variable…this fact had escaped me for the past two days. From: "Avery, John" Date: Wednesday, December 27, 2017 at 4:20 PM To: "user@hadoop.apache.org" Subject: Help me understand hadoop caching behavior I’m writing a

Help me understand hadoop caching behavior

2017-12-27 Thread Avery, John
I’m writing a program using the C API for Hadoop. I have a 4-node cluster. (Cluster was setup according to https://www.tutorialspoint.com/hadoop/hadoop_multi_node_cluster.htm) Of the 4 nodes, one is the namenode and a datanode, the others are datanodes (with one being a secondary namenode). I’

Re: spark on yarn error -- Please help

2017-09-01 Thread Akira Ajisaka
-shell gives below error. Kindly help me to resolve this issue *SPARK-DEFAULT.CONF* spark.master spark://master2:7077 spark.eventLog.enabled true spark.eventLog.dir hdfs://ha-cluster/user/spark/ApplicationHistory spark.shuffle.service.enabled

spark on yarn error -- Please help

2017-08-28 Thread sidharth kumar
Hi, I have configured apache spark over yarn. I am able to run map reduce jobs successfully but spark-shell gives the below error. Kindly help me to resolve this issue *SPARK-DEFAULT.CONF* spark.master spark://master2:7077 spark.eventLog.enabled true

Help with WebHDFS authentication: simple vs simple-dt

2016-09-27 Thread Benjamin Ross
All, I'm in the process of setting up encryption at rest on a cluster, but I want to make sure that everything else remains permissive - otherwise it will break existing processes that we have in place. I'm very close to getting this working - the last piece is that webhdfs is not permissive:

RE: Help starting RangerKMS

2016-08-26 Thread Benjamin Ross
riday, August 26, 2016 9:49 AM To: user@hadoop.apache.org Subject: Help starting RangerKMS Hey guys, I'm trying to start the RangerKMS server and I'm running into this very obscure error. Any help would be appreciated. We have confirmed JCE is installed on the node running RangerKMS. W

Help starting RangerKMS

2016-08-26 Thread Benjamin Ross
Hey guys, I'm trying to start the RangerKMS server and I'm running into this very obscure error. Any help would be appreciated. We have confirmed JCE is installed on the node running RangerKMS. We're using Java JDK 1.7 and Ranger 0.5.0.2.3 (HDP 2.3.6.0-3796). [root@bodcde

Re: New cluster help

2016-07-14 Thread Ravi Prakash
ion at java.io.DataInputStream.readInt(DataInputStream.java:392) at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:358) at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178) Is this in relation to my ssl configuration? I'm confused about what's going on here. Thank you in advance for any help.

New cluster help

2016-07-14 Thread tombin
is in relation to my ssl configuration? I'm confused about what's going on here. Thank you in advance for any help.

Re: Help designing application architecture

2016-07-09 Thread venito camelas
"instant" : "2016-06-20T13:28:06.419Z" }, "temp" : { "value" : 25.6, "measurement_unit" : "Celsius", "instant" : "2016-06-20T13:28:06.419Z" }, "instant" : "2016-06-20T13:28:06.419Z" } That piece of

Re: Help designing application architecture

2016-07-07 Thread Ted Yu
For 1) you don't have to introduce external storage. You can define case classes for the known formats. FYI On Thu, Jul 7, 2016 at 4:40 PM, venito camelas wrote: > I'm pretty new to this and I have a use case I'm not sure how to > implement, I'll try to explain it and I'd appreciate if anyone

Help designing application architecture

2016-07-07 Thread venito camelas
I'm pretty new to this and I have a use case I'm not sure how to implement. I'll try to explain it and I'd appreciate it if anyone could point me in the right direction. The case has these requirements: 1 - Any user should be able to define the format of the information they want to store (channel).

[HELP] Failed to work with DockerContainerExecutor in Yarn

2016-04-22 Thread Shuai Zhang
Hi there, I’m trying to work with DockerContainerExecutor in Yarn, which is described in https://hadoop.apache.org/docs/r2.7.2/hadoop-yarn/hadoop-yarn-site/DockerContainerExecutor.html I followed the document and prepared everything, but failed to run even a simple wordcount job with DockerCont

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-08 Thread Josh Mahonin
the data back to Phoenix table I am getting below error: org.apache.spark.sql.AnalysisException: org.apache.phoenix.spark.DefaultSource does not allow user-specified schemas.; Can anybody help in resolving the above errors or any other solution of saving Spark

[HELP:]Save Spark Dataframe in Phoenix Table

2016-04-07 Thread Divya Gehlot
: org.apache.spark.sql.AnalysisException: org.apache.phoenix.spark.DefaultSource does not allow user-specified schemas.; Can anybody help in resolving the above errors, or suggest any other solution for saving Spark Dataframes to Phoenix? Would really appreciate the help. Thanks, Divya

A Mapreduce job failed. Need Help!

2016-03-09 Thread Juri Yanase Triantaphyllou
give me any feedback about my case? Do you have any ideas of why I am failing? If you need more information, I would be glad to send it! Thank you for your help in advance! --Juri Here is some relevant information: Yarn nodemanager shows: 16/03/08 15:24:25 INFO

RE: Just switched to yarn/Hadoop 2.6.0 - help with logs please!

2016-02-22 Thread Tony Burton
elong to this node at all." What have I missed? Can you provide a link to a foolproof Hadoop 2 setup guide? Thanks, Tony -Original Message- From: Susheel Kumar Gadalay [mailto:skgada...@gmail.com] Sent: 25 November 2015 10:47 To: user@hadoop.apache.org Subject: Re: Just switched

Re: Need help :Does anybody has HDP cluster on EC2?

2016-02-15 Thread Chandeep Singh
You could also fire up a VNC session and access all internal pages from there. > On Feb 15, 2016, at 9:19 AM, Divya Gehlot wrote: > > Hi Sabarish, > Thanks a lot for your help. > I am able to view the logs now > > Thank you very much. > > Cheers, > Divya >

Re: Need help :Does anybody has HDP cluster on EC2?

2016-02-15 Thread Divya Gehlot
Hi Sabarish, Thanks a lot for your help. I am able to view the logs now. Thank you very much. Cheers, Divya On 15 February 2016 at 16:51, Sabarish Sasidharan < sabarish.sasidha...@manthan.com> wrote: > You can setup SSH tunneling. > > > http://docs.aws.amazon.com/Elast
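For readers landing on this thread: the SSH tunneling suggested above can be sketched roughly as follows. The key file, user, and host names are placeholders, not details from the thread:

```
# Option 1: forward one UI port (e.g. the ResourceManager on 8088) to localhost,
# then browse http://localhost:8088
ssh -i mykey.pem -N -L 8088:internal-master-hostname:8088 ec2-user@master-public-ip

# Option 2: open a SOCKS proxy and point the browser's proxy at localhost:8157,
# so links to internal-only hostnames also resolve through the tunnel
ssh -i mykey.pem -N -D 8157 ec2-user@master-public-ip
```

Option 2 is usually the more convenient choice for cluster UIs, because the pages link to other internal hostnames that a single port forward cannot reach.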

Need help :Does anybody has HDP cluster on EC2?

2016-02-15 Thread Divya Gehlot
one or redirecting to external ? Attached screenshots for better understanding of my issue. Would really appreciate help. Thanks, Divya

Re: Help for a hadoop build error: "Failed to parse plugin descriptor for org.apache.hadoop:hadoop-maven-plugins:2.7.1"

2016-01-14 Thread Boric Tan
sub-module, and Maven > will be able to find hadoop-maven-plugins from the local repository cache. > > --Chris Nauroth > > From: Boric Tan > Date: Thursday, January 14, 2016 at 1:09 PM > To: "user@hadoop.apache.org" > Subject: Help for a hadoop build error: "F

Re: Help for a hadoop build error: "Failed to parse plugin descriptor for org.apache.hadoop:hadoop-maven-plugins:2.7.1"

2016-01-14 Thread Chris Nauroth
tre...@gmail.com> Date: Thursday, January 14, 2016 at 1:09 PM To: "user@hadoop.apache.org" Subject: Help for a hadoop build error: "Failed to parse plugin descriptor for org.apache.hadoop:hadoop-ma

Help for a hadoop build error: "Failed to parse plugin descriptor for org.apache.hadoop:hadoop-maven-plugins:2.7.1"

2016-01-14 Thread Boric Tan
plugin descriptor found at META-INF/maven/plugin.xml -> [Help 1] org.apache.maven.plugin.PluginDescriptorParsingException: Failed to parse plugin descriptor for org.apache.hadoop:hadoop-maven-plugins:2.7.1 (/home/long/builds/hadoop-2.7.1-src/hadoop-maven-plugins/target/classes): No plugin descrip

Re: Help on perl streaming

2015-12-07 Thread Dingcheng Li
Thanks for your quick response. It seems to make sense that I should put the resource file and script into the same directory. Sigh, I cannot test it now since our hadoop environment is down for maintenance this week. I will keep you posted if this will work. Thanks a lot, Dingcheng On Sun, Dec 6

Re: Help on perl streaming

2015-12-06 Thread Dingcheng Li
Without it, it works well after I comment the script to create and read the resource file. For python, exactly the same file structure, it works. I do not think that the resource file ("salesData/salesFilter.txt") should be in HDFS directory since the resource file is like a dictionary which I use

Help on perl streaming

2015-12-06 Thread Dingcheng Li
Hi, folks, I am using hadoop streaming to call perl scripts as mapper. Things are working well. But I found that the resource file reading is a problem. Basically I think that I am on the right track, -file option is the correct way to get resource file read. I tested on python script. But for pe

RE: Just switched to yarn/Hadoop 2.6.0 - help with logs please!

2015-11-25 Thread Tony Burton
apache.org Subject: Re: Just switched to yarn/Hadoop 2.6.0 - help with logs please! Use port 8088, i.e. <resourcemanager-host>:8088. On 11/25/15, Tony Burton wrote: > Hi, > > After a long time using Hadoop 1.x, I've recently switched to Hadoop 2.6.0. > I've got a MapReduce program running, but I want

Re: Just switched to yarn/Hadoop 2.6.0 - help with logs please!

2015-11-25 Thread Susheel Kumar Gadalay
Use port 8088, i.e. <resourcemanager-host>:8088. On 11/25/15, Tony Burton wrote: > Hi, > > After a long time using Hadoop 1.x, I've recently switched to Hadoop 2.6.0. > I've got a MapReduce program running, but I want to see the logs and debug > info that I used to be able to view via the JobTracker at > http://localhost:5
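As a sketch of where the Hadoop 1.x JobTracker pages went in YARN (host names are placeholders, the application id below is made up for illustration, and `yarn logs` requires log aggregation to be enabled):

```
# The old JobTracker UI (http://localhost:50030/jobtracker.jsp) is replaced by:
#   ResourceManager UI:       http://<rm-host>:8088
#   MapReduce JobHistory UI:  http://<jobhistory-host>:19888
# Aggregated container logs for a finished application can be fetched with:
yarn logs -applicationId application_1448443049891_0001
```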

Just switched to yarn/Hadoop 2.6.0 - help with logs please!

2015-11-25 Thread Tony Burton
Hi, After a long time using Hadoop 1.x, I've recently switched to Hadoop 2.6.0. I've got a MapReduce program running, but I want to see the logs and debug info that I used to be able to view via the JobTracker at http://localhost:50030/jobtracker.jsp. Googling around so far has suggested I ena

Re: New to hadoop. Need help

2015-10-31 Thread Tenghuan He
Best, Caesar. From: Farhan Iqbal [mailto:farhan.iq...@gmail.com] Sent: Friday, October 30, 2015 12:27 PM To: user@hadoop.apache.org Subject: New to hadoop. Need help Hi guys I am new to Hadoop. I just installed Hadoop 2.7.1 in w

RE: New to hadoop. Need help

2015-10-30 Thread Caesar Samsi
@hadoop.apache.org Subject: New to hadoop. Need help Hi guys I am new to Hadoop. I just installed Hadoop 2.7.1 in windows 7 using https://wiki.apache.org/hadoop/Hadoop2OnWindows Now my question is How can I browse file in windows which are in hdfs system e.g. when I executed

Re: New to hadoop. Need help

2015-10-30 Thread Muhammad Atif
You can browse the output files via the NameNode web UI (http://localhost:50070 in Hadoop 2.x, under Utilities > Browse the file system). They are stored in the hadoop file system, which is different from the windows file system. On Fri, Oct 30, 2015 at 12:36 PM, Farhan Iqbal wrote: > No sir, my question is can I browse these files in windows? Like what is > the physical

Re: New to hadoop. Need help

2015-10-30 Thread Farhan Iqbal
No sir, my question is can I browse these files in windows? Like what is the physical location of these files in windows? Farhan Iqbal On Fri, Oct 30, 2015 at 10:28 AM, Namikaze Minato wrote: > %HADOOP_PREFIX%\bin\hdfs dfs -ls / > %HADOOP_PREFIX%\bin\hdfs dfs -cat /myfile.txt > > ? > > Regard

New to hadoop. Need help

2015-10-30 Thread Farhan Iqbal
Hi guys I am new to Hadoop. I just installed Hadoop 2.7.1 on windows 7 using https://wiki.apache.org/hadoop/Hadoop2OnWindows Now my question is: how can I browse files in windows which are in the hdfs system? e.g. when I executed this command %HADOOP_PREFIX%\bin\yarn jar %HADOOP_PREFIX%\share\hadoo

Re: New to hadoop. Need help

2015-10-30 Thread Namikaze Minato
%HADOOP_PREFIX%\bin\hdfs dfs -ls / %HADOOP_PREFIX%\bin\hdfs dfs -cat /myfile.txt ? Regards, LLoyd
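To the question above: HDFS files have no single browsable Windows path — DataNodes store them as block files under `dfs.datanode.data.dir` — but you can copy a file out into the Windows filesystem. A sketch (the source and destination paths are illustrative):

```
%HADOOP_PREFIX%\bin\hdfs dfs -get /myfile.txt C:\temp\myfile.txt
```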

Re: Help troubleshooting multi-cluster setup

2015-09-23 Thread Kuhu Shukla
Hi Daniel, The RM will list only NodeManagers and not the datanodes. You can view the datanodes on the NameNode page (eg. 192.168.51.4:50070). The one node you see on the RM page 'Nodes' list is from this: hadoop@hadoop-master:~$ jps 24641 SecondaryNameNode 24435 DataNode 24261 NameNode 24791 Resourc

Re: Help troubleshooting multi-cluster setup

2015-09-23 Thread Daniel Watrous
I was able to get the jobs submitting to the cluster by adding the following property to mapred-site.xml mapreduce.framework.name yarn I also had to add the following properties to yarn-site.xml yarn.nodemanager.aux-services mapreduce_shuffle yarn.nodemanager
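The property names and values quoted above lost their XML markup in the archive; a minimal sketch of what they look like in mapred-site.xml and yarn-site.xml:

```
<!-- mapred-site.xml: submit MapReduce jobs to YARN instead of running locally -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

<!-- yarn-site.xml: enable the shuffle auxiliary service on NodeManagers -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
```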

Re: Help troubleshooting multi-cluster setup

2015-09-23 Thread Daniel Watrous
I'm not sure if this is related, but I'm seeing some errors in hadoop-hadoop-namenode-hadoop-master.log 2015-09-23 19:56:27,798 WARN org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: Unresolved datanode registration: hostname cannot be resolved (ip=192.168.51.1, hostname=192.168.51.1)

Help troubleshooting multi-cluster setup

2015-09-23 Thread Daniel Watrous
Hi, I have deployed a multi-node cluster with one master and two data nodes. Here's what jps shows: hadoop@hadoop-master:~$ jps 24641 SecondaryNameNode 24435 DataNode 24261 NameNode 24791 ResourceManager 25483 Jps 24940 NodeManager hadoop@hadoop-data1:~$ jps 15556 DataNode 16198 NodeManager 1639

Help with hadoop 2.5.2 in windows

2015-06-18 Thread Nishanth S
Hey friends. I built hadoop 2.5.2 on my pc and I am able to run map reduce jobs locally after setting hadoop_home. I am trying to set this up on another machine by using the same tar file that I built on mine, but getting the below error. Can you please help? Exception in thread "

Re: Help with implementing a Storm topology to stream tweets

2015-05-08 Thread Shahab Yunus
hdfs. > > Is there a tutorial to do the exact same thing? > > How do I start? > > Thanks for the help. > > PS: I have already installed Storm in HDP using Ambari. > > -- > Thanks, > *Manikandan Ramakrishnan* >

Help with implementing a Storm topology to stream tweets

2015-05-08 Thread mani kandan
Hi I'm new to Storm, and I would like to create a Storm topology to stream tweets, do analysis and store on hdfs. Is there a tutorial to do the exact same thing? How do I start? Thanks for the help. PS: I have already installed Storm in HDP using Ambari. -- Thanks, *Manikandan Ramakrishnan*

Re: Help with Kerberos

2015-05-06 Thread Nishanth S
resolution. Any help is appreciated. Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/abc.domain@domain.com; Host Details : local host is: "xyz/10.a.b.c";

Re: Map Reduce Help

2015-05-05 Thread Drake민영근
Hi. The mapreduce example is the case. See this: https://github.com/apache/hadoop/blob/trunk/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/java/org/apache/hadoop/examples/ExampleDriver.java Drake 민영근 Ph.D kt NexR On Wed, May 6, 2015 at 2:00 AM, Chandrashekhar Kotekar < shekhar.kote
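The ExampleDriver linked above registers several jobs in one jar behind a single driver class, which then dispatches on the first argument. Invoking such a jar looks roughly like this (the jar name, driver class, job names, and paths are hypothetical):

```
# One jar, many jobs: the driver's main() picks the job by its first argument,
# mirroring how hadoop-mapreduce-examples' ExampleDriver/ProgramDriver works
hadoop jar my-jobs.jar org.example.MyJobDriver wordcount   /input /out-wc
hadoop jar my-jobs.jar org.example.MyJobDriver binaryparse /input /out-bin
```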

RE: Help with Kerberos

2015-05-05 Thread Birender Saini
Are those principals in the KDC? Try to get a kerberos ticket for that principal using the same keytab that's in the config. Original message From: Nishanth S Date: 05/05/2015 6:33 PM (GMT-05:00) To: user@hadoop.apache.org Subject: Re: Help with Kerberos Thanks Chris. I checked

Re: Help with Kerberos

2015-05-05 Thread Nishanth S
on. It has more > details on this and other relevant configuration properties. > > http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/Secu > reMode.html > > I hope this helps. > > --Chris Nauroth > > > From: Nishanth S > Reply-To: "us

Re: Help with Kerberos

2015-05-05 Thread Chris Nauroth
configuration properties. http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/Secu reMode.html I hope this helps. --Chris Nauroth From: Nishanth S Reply-To: "user@hadoop.apache.org" Date: Tuesday, May 5, 2015 at 12:09 PM To: "user@hadoop.apache.org"

Help with Kerberos

2015-05-05 Thread Nishanth S
Hello, I am getting the below exception when trying to run a simple map reduce from a machine outside the hadoop cluster. This is the kerberos code I am using. Please help: org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration(); conf.set
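When debugging an "invalid Kerberos principal" error like the one in this thread, a few client-side checks usually narrow it down; a sketch, with the keytab path, principal, and realm as placeholders:

```
# 1. Can the client actually obtain a ticket for its own principal?
kinit -kt /path/to/user.keytab user@EXAMPLE.COM
klist

# 2. Which server principal does the client expect for the NameNode?
hdfs getconf -confKey dfs.namenode.kerberos.principal

# 3. Does forward/reverse DNS for the NameNode host match that principal?
#    (Hadoop substitutes the _HOST placeholder with the canonical hostname,
#    so a DNS mismatch produces exactly this kind of error.)
```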

Re: Map Reduce Help

2015-05-05 Thread Chandrashekhar Kotekar
Technically yes, you can keep all map reduce jobs in a single jar file because all map reduce jobs are nothing but java classes, but I think it's better to keep each map-reduce job isolated so that you will be able to modify them easily in the future. Regards, Chandrash3khar Kotekar Mobile - +91 860001145

Map Reduce Help

2015-05-05 Thread Nishanth S
Hello, I am very new to map reduce. We need to write a few map reduce jobs to process different binary files. Can all the different map reduce programs be packaged into a single jar file? Thanks, Chinchu

Re: New to hadoop..plz help

2015-04-18 Thread Marco Shaw
Hi, We are usually here to help, but not really to do this all for you. If you need someone to write this for you, there are some freelance sites on the Internet where you can post a job or request and have people bid on it. Marco On Sat, Apr 18, 2015 at 7:13 PM, shanthi k wrote: > I n

Re: New to hadoop...plz help

2015-04-18 Thread Ted Yu
Dropping hive mailing list since you mentioned mapreduce in your email. Can you give us a bit more detail on what you're trying to do? Cheers On Sat, Apr 18, 2015 at 3:08 PM, shanthi k wrote: > I need mapreduce program in java for this input and output.... plz help >

New to hadoop..plz help

2015-04-18 Thread shanthi k
I need a mapreduce program in java for the following input and output

New to hadoop...plz help

2015-04-18 Thread shanthi k
I need mapreduce program in java for this input and output plz help

Need help on Hadoop cluster capacity and hardware specification

2015-01-29 Thread Saravanan Nagarajan
Hi, Need help on Hadoop cluster capacity and hardware specification: = We plan to migrate the existing “Enterprise Data Warehouse”/“Business Intelligence” system to a Hadoop based solution. The current system has Teradata as storage, Abinitio

Re: Can you help me to install HDFS Federation and test?

2015-01-22 Thread Visioner Sadak
for federation test ) install, I have big trouble when running a file put test. I ran this hadoop command: ./bin/hadoop fs -put test.txt /NN1/ and there is an error message: "put: Renames across FileSystems not suppor

Re: Need some help with RecordReader

2014-10-28 Thread jay vyas
I only can see that records ended after reading the first line of the next record. I would be very thankful for any help with implementing such a RecordReader. Thanks in advance, John. -- jay vyas

Re: Need some help with RecordReader

2014-10-28 Thread Steve Lewis
alue. As far as I understand, I need to implement some kind of custom RecordReader class to parse that format. But all examples I found on the Internet deal with formats where there is some mark at the end of the record, but in my case I only can see that records ended afte

Need some help with RecordReader

2014-10-28 Thread John Dison
fter reading the first line of the next record. I would be very thankful for any help with implementing such a RecordReader. Thanks in advance, John.

Re: help with pig

2014-10-01 Thread Ted Yu
bproject. I was wondering if you could help me with this, I have been > trying to compile and run tests for 4 days and I cannot make it work. > I really hope you can hit me back with some hints, it would save me a lot of > work and time. > Thanks a lot for your time, > > seb.

help with pig

2014-10-01 Thread Sebastian Lamelas_marcote
7e I need to recreate the bug reported there (infinite loop) in the piggybank subproject. I was wondering if you could help me with this, I have been trying to compile and run tests for 4 days and I cannot make it work. I really hope you can hit me back with some hints, it would save me a l

Re: libhdfs result in JVM crash issue, please help me

2014-08-28 Thread Vincent,Wei
#0 0x7f1e3872c425 in raise () from /lib/x86_64-linux-gnu/libc.so.6 (gdb) bt #0 0x7f1e3872c425 in raise () from /lib/x86_64-linux-gnu/libc.so.6 #1 0x7f1e3872fb8b in abort () from /lib/x86_64-linux-gnu/libc.so.6 #2 0x7f1e380a4405 in os::abort(bool) () from /usr/jdk1.7.0_51/jre/lib

libhdfs result in JVM crash issue, please help me

2014-08-27 Thread Vincent,Wei
All, I am using libhdfs. I need some usage like the following, and when the JNI call returns, it results in a crash in the JVM. The attachment has the detailed information. Call path: JAVA -> JNI -> C lib -> libhdfs. Crash info: # # A fatal error has be

Re: Can anyone help me resolve this Error: unable to create new native thread

2014-08-14 Thread Chris MacKenzie
oUk/posts> <http://www.linkedin.com/in/chrismackenziephotography/> From: Ravi Prakash Reply-To: Date: Friday, 15 August 2014 01:31 To: "user@hadoop.apache.org" Subject: Re: Can anyone help me resolve this Error: unable to create new native thread Hi Chris! When is this

Re: Can anyone help me resolve this Error: unable to create new native thread

2014-08-14 Thread Ravi Prakash
Hi Chris! When is this error caused? Which logs do you see this in? Are you sure you are setting the ulimit for the correct user? What application are you trying to run which is causing you to run up against this limit? HTH Ravi On Saturday, August 9, 2014 6:07 AM, Chris MacKenzie wrote:

Can anyone help me resolve this Error: unable to create new native thread

2014-08-09 Thread Chris MacKenzie
Hi, I've scrabbled around looking for a fix for a while and have set the soft ulimit size to 13172. I'm using Hadoop 2.4.1. Thanks in advance, Chris MacKenzie telephone: 0131 332 6967 email: stu...@chrismackenziephotography.co.uk corporate: www.chrismackenziephotography.co.uk
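On Linux, "unable to create new native thread" is usually the per-user process limit (`nproc`) rather than the open-files limit, and it has to be checked as the user that actually runs the JVM. A quick sketch (the `hadoop` user name in the comment is an assumption):

```shell
# Threads count against the "max user processes" limit on Linux
ulimit -u
# The open-files limit matters for DataNodes, but not for this particular error
ulimit -n
# Check as the service user, e.g.:  sudo -u hadoop bash -c 'ulimit -u'
# To raise the limit persistently, add lines like these to
# /etc/security/limits.conf and log in again:
#   hadoop  soft  nproc  32768
#   hadoop  hard  nproc  32768
```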

Re: Help running hadoop example

2014-06-13 Thread Sindhu Hosamane
Maybe you should give the complete path of the jar file, i.e. the absolute path. Best Regards, Sindhu > On 14.06.2014, at 01:22, Ryan de Vera wrote: > > Hello everyone, > > I just downloaded Hadoop version 1.2.1 to my macbook and I am having trouble > running the example. I used this tutorial

Help running hadoop example

2014-06-13 Thread Ryan de Vera
Hello everyone, I just downloaded Hadoop verision 1.2.1 to my macbook and I am having trouble running the example. I used this tutorial http://wiki.apache.org/hadoop/Running_Hadoop_On_OS_X_10.5_64-bit_(Single-Node_Cluster)

Re: Need help get the hadoop cluster started in EC2

2014-03-28 Thread Yusaku Sako
, Max Zhao wrote: > Hi Everybody, > > I am trying to get my first hadoop cluster started using the Amazon EC2. I > tried quite a few times and searched the web for the solutions, yet I still > cannot get it up. I hope somebody can help out here. > > Here is what I did based on t

Need help get the hadoop cluster started in EC2

2014-03-28 Thread Max Zhao
Hi Everybody, I am trying to get my first hadoop cluster started using the Amazon EC2. I tried quite a few times and searched the web for the solutions, yet I still cannot get it up. I hope somebody can help out here. Here is what I did based on the Apache Whirr Quick Guide ( http

Re: I am about to lose all my data please help

2014-03-24 Thread Fatih Haltas
Regards, *Stanley Shi* On Sun, Mar 16, 2014 at 5:29 PM, Mirko Kämpf < mirko.kae

Re: I am about to lose all my data please help

2014-03-24 Thread praveenesh kumar
Hi, what is the location of the namenode's fsimage and editlogs? And how much memory has the NameNod

Re: I am about to lose all my data please help

2014-03-23 Thread Stanley Shi
Did you work with a Secondary NameNode or a Standby NameNode for checkpointing? Where are your HDFS blocks located, are those still safe?

Re: I am about to lose all my data please help

2014-03-23 Thread Fatih Haltas
blocks located, are those still safe? With this information at hand, one might be able to fix your setup, but do not format the old namenode before all is working with a fresh

Re: I am about to lose all my data please help

2014-03-19 Thread praveenesh kumar
at hand, one might be able to fix your setup, but do not format the old namenode before all is working with a fresh one. Grab a copy of the maintenance guide: http://shop.oreilly.com/product/06

Re: I am about to lose all my data please help

2014-03-19 Thread Fatih Haltas
roblems as well. Best wishes Mirko 2014-03-16 9:07 GMT+00:00 Fatih Haltas : Dear All, I

Re: I am about to lose all my data please help

2014-03-17 Thread Stanley Shi
Mirko 2014-03-16 9:07 GMT+00:00 Fatih Haltas : Dear All, I have just restarted machines of my hadoop clusters. Now, I am trying to restart hadoop clusters again, but g

Re: I am about to lose all my data please help

2014-03-17 Thread Azuryy Yu
I have just restarted machines of my hadoop clusters. Now, I am trying to restart hadoop clusters again, but getting an error on namenode restart. I am afraid of losing my data as it was properly running for more than 3 months. Currently, I believe if I

Re: I am about to lose all my data please help

2014-03-17 Thread Azuryy Yu
tas : Dear All, I have just restarted machines of my hadoop clusters. Now, I am trying to restart hadoop clusters again, but getting an error on namenode restart. I am afraid of losing my

Re: I am about to lose all my data please help

2014-03-17 Thread Stanley Shi
restart hadoop clusters again, but getting an error on namenode restart. I am afraid of losing my data as it was properly running for more than 3 months. Currently, I believe if I do namenode formatting, it will work again; however, data will be lost. Is there any way t

Re: I am about to lose all my data please help

2014-03-16 Thread Mirko Kämpf
e if I do namenode formatting, it will work again; however, data will be lost. Is there any way to solve this without losing the data? I will really appreciate any help. Thanks. = Here are the logs;

I am about to lose all my data please help

2014-03-16 Thread Fatih Haltas
work again; however, data will be lost. Is there any way to solve this without losing the data? I will really appreciate any help. Thanks. = Here are the logs; 2014-02-26 16:02:39,698 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG

RE: Need help: fsck FAILs, refuses to clean up corrupt fs

2014-03-04 Thread divye sheth
a:549) at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404) at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410) at

RE: Need help: fsck FAILs, refuses to clean up corrupt fs

2014-03-04 Thread John Lilley
Ah... found the answer. I had to manually leave safe mode to delete the corrupt files. john From: John Lilley [mailto:john.lil...@redpoint.net] Sent: Tuesday, March 04, 2014 9:33 AM To: user@hadoop.apache.org Subject: RE: Need help: fsck FAILs, refuses to clean up corrupt fs More information
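The fix described above (leave safe mode, then delete the corrupt files) can be sketched as follows; the file path is a placeholder, since the one in the thread is truncated:

```
hdfs dfsadmin -safemode get          # confirm the NameNode is in safe mode
hdfs dfsadmin -safemode leave        # fsck -delete mutates the namespace,
                                     # so it is refused while in safe mode
hdfs fsck / -list-corruptfileblocks  # list what is actually corrupt
hdfs fsck /path/to/corrupt/file -delete
```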

RE: Need help: fsck FAILs, refuses to clean up corrupt fs

2014-03-04 Thread John Lilley
dPoint.run(SelectChannelEndPoint.java:410) at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582) From: John Lilley [mailto:john.lil...@redpoint.net] Sent: Tuesday, March 04, 2014 6:08 AM To: user@hadoop.apache.org Subject: Need help: fsck FAILs, refuses to clean up corrupt fs

Need help: fsck FAILs, refuses to clean up corrupt fs

2014-03-04 Thread John Lilley
I have a file system with some missing/corrupt blocks. However, running hdfs fsck -delete also fails with errors. How do I get around this? Thanks John [hdfs@metallica yarn]$ hdfs fsck -delete /rpdm/tmp/ProjectTemp_461_40/TempFolder_4/data00012_00.dld Connecting to namenode via http://anth

Re: Need help to understand hadoop.tmp.dir

2014-03-03 Thread Chengwei Yang
On Mon, Mar 03, 2014 at 09:03:28AM -0500, JCAD Cell 1 wrote: > With the services stopped you would change the setting in core-site.xml: > <property> <name>hadoop.tmp.dir</name> <value>/var/hadoop/tmp</value> </property> > Then move your /tmp/hadoop folder over to the new location: > mv /tmp/hadoop /var/hadoop/tmp Thank you,

Re: Need help to understand hadoop.tmp.dir

2014-03-03 Thread JCAD Cell 1
With the services stopped you would change the setting in core-site.xml: <property> <name>hadoop.tmp.dir</name> <value>/var/hadoop/tmp</value> </property> Then move your /tmp/hadoop folder over to the new location: mv /tmp/hadoop /var/hadoop/tmp On Mon, Mar 3, 2014 at 5:55 AM, Chengwei Yang wrote: > On Mon, Mar 03, 2014 at 01:57:
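The whole procedure above, in order, might look like this; the paths follow the thread, but the service scripts vary by install, so treat them as assumptions:

```
# 1. Stop HDFS/YARN daemons first
sbin/stop-dfs.sh && sbin/stop-yarn.sh
# 2. In core-site.xml set:
#    <property><name>hadoop.tmp.dir</name><value>/var/hadoop/tmp</value></property>
# 3. Move the existing data so block and namespace state survive the change
mkdir -p /var/hadoop
mv /tmp/hadoop /var/hadoop/tmp
# 4. Restart
sbin/start-dfs.sh && sbin/start-yarn.sh
```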
