[no subject]

2024-02-03 Thread Gavin McDonald
Hello to all users, contributors and committers! The Travel Assistance Committee (TAC) is pleased to announce that travel assistance applications for Community over Code EU 2024 are now open! We will be supporting Community over Code EU, Bratislava, Slovakia, June 3rd - 5th, 2024. TAC exists to

[no subject]

2022-03-30 Thread Dmitrii Kuzin
Unsubscribe

[no subject]

2022-03-30 Thread Megan Liu
Unsubscribe

[no subject]

2021-03-29 Thread Theresa vail1
Hi?

[no subject]

2020-11-03 Thread Jay Johnson
unsubscribe

[no subject]

2020-02-05 Thread Paul Rimba
unsubscribe

[no subject]

2019-02-17 Thread Shuubham Ojha
Hello, I am trying to use Hadoop 3.1.1 on my cluster. I wish to experiment with the Hitchhiker code, which I believe was introduced in Hadoop 3 itself. I don't understand how to activate the Hitchhiker feature for the blocks of files I put on the datanode. I also don't know which erasure coding po

[no subject]

2018-12-19 Thread Shuubham Ojha
Hello All, I am Shuubham Ojha, a graduate researcher at the University of Melbourne. We have developed a block placement strategy which optimises the delay associated with reconstruction. As a result of this optimisation problem, we get a placement matrix for blocks which tells

[no subject]

2017-12-21 Thread 学生张洪斌
Why is the following displayed when I start Hadoop: starting org.apache.spark.deploy.master.Master, logging to /opt/apps/spark-2.0.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-head2.out node4: starting org.apache.spark.deploy.worker.Worker, logging to /opt/apps/spark-

[no subject]

2017-10-27 Thread gu.yizhou
Hi All, As an application over Hadoop, is it recommended to use "org.apache.hadoop.fs Class FileContext" rather than "org.apache.hadoop.fs Class FileSystem"? And why, or why not? Besides, my target version will be Apache Hadoop V2.7.3, and the application will be running over both HDFS HA and Fed
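
[Editor's note] Not part of the original thread, but a minimal Java sketch contrasting the two client APIs being compared; the paths are hypothetical and this shows usage only, not a recommendation either way.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileContext;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class FsApiComparison {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Older, widely used API: a concrete FileSystem bound to the default scheme.
            FileSystem fs = FileSystem.get(conf);
            fs.mkdirs(new Path("/tmp/via-filesystem"));        // hypothetical path

            // Newer client-facing API layered over AbstractFileSystem implementations.
            FileContext fc = FileContext.getFileContext(conf);
            fc.mkdir(new Path("/tmp/via-filecontext"),         // hypothetical path
                     FsPermission.getDirDefault(), true);
        }
    }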

[no subject]

2017-08-26 Thread Dominique Rozenberg
unsubscribe Dominique Rozenberg, Project Manager. Mobile: 052-7722006, Office: 08-6343595, Fax: 08-9202801. d...@datacube.co.il www.datacube.co.il

[no subject]

2017-08-26 Thread Mohammed Q. Hussian
unsubscribe

[no subject]

2017-07-04 Thread Nishant Verma
Not sure if this is the exact forum for this query. HDFS is 2.7.3, Kafka Connect is 3.2.0, and Kafka is 0.10.2.0. We are using the Kafka Connect HDFS Connector to pull records from a Kafka topic to HDFS. Until yesterday, we had just one namenode in our cluster and we were using "hdfs.url" as "hdfs://abc.
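
[Editor's note] For context, with NameNode HA the client normally points at a logical nameservice rather than a single host. A hedged sketch of the client-side hdfs-site.xml; the nameservice name, hosts and ports below are hypothetical examples:

    <property><name>dfs.nameservices</name><value>mycluster</value></property>
    <property><name>dfs.ha.namenodes.mycluster</name><value>nn1,nn2</value></property>
    <property><name>dfs.namenode.rpc-address.mycluster.nn1</name><value>abc1.example.com:8020</value></property>
    <property><name>dfs.namenode.rpc-address.mycluster.nn2</name><value>abc2.example.com:8020</value></property>
    <property>
      <name>dfs.client.failover.proxy.provider.mycluster</name>
      <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
    </property>

With that in place the client URL would take the form hdfs://mycluster rather than naming one namenode host.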

[no subject]

2016-10-10 Thread Fei Hu
Hi All, I am running some Spark Scala code on Zeppelin on CDH 5.5.1 (Spark version 1.5.0). I customized the Spark interpreter to use org.apache.spark.serializer.KryoSerializer as spark.serializer. And in the dependency I added Kryo 3.0.3 as follows: com.esotericsoftware:kryo:3.0.3 When I wrot

[no subject]

2016-09-19 Thread Vinodh Nagaraj
Hi All, When I execute *hdfs dfs -ls*, it shows all the directories. I have created only one directory in Hadoop; the remaining files were created at the OS level. I am executing from Hadoop home/bin. Thanks,

[no subject]

2016-08-30 Thread Alexandru Calin
Hello I want to measure the time taken to read/write from HDFS and feed data to the mapper/reducer vs the actual map/reduce time for the WordCount example. I have enabled HTrace with

[no subject]

2016-06-07 Thread Anit Alexander's i
unsubscribe

[no subject]

2016-06-04 Thread Vishal Kharde
Unsubscribe

[no subject]

2015-12-10 Thread Aditya Vyas
unsubscribe

[no subject]

2015-11-29 Thread MONTMORY Alain
Unsubscribe [@@ THALES GROUP INTERNAL @@]

[no subject]

2015-11-27 Thread vedavyasa reddy
Unsubscribe

[no subject]

2015-11-27 Thread Vikram Bajaj
Unsubscribe

[no subject]

2015-11-26 Thread Caesar Samsi
Hi, I got further in running TeraSort; there is just one error left, related to the Java engine, below. How do I debug this further? Thank you, Caesar. Container: container_1448509237184_0002_01_55 on berry3_32841 === Lo

[no subject]

2015-11-02 Thread Joe Doherty
Unsubscribe

[no subject]

2015-10-29 Thread andreina j
Subscribe me to the users mailing list. Andreina J, Huawei Technologies Co., Ltd., Bantian, Longgang District, Shenzhen 518129, P.R. China. http://www.huawei.com

[no subject]

2015-10-09 Thread James Teng

[no subject]

2015-09-22 Thread Ilya Karpov
-- Ilya Karpov Developer CleverDATA make your data clever

[no subject]

2015-09-22 Thread vedavyasa reddy
unsubscribe

[no subject]

2015-08-13 Thread Anil Thirunagari

[no subject]

2015-05-07 Thread Kumar Jayapal
Can someone please help me? I am running a simple sqoop command to import a table with the split-by option and I am getting this error. Has anyone solved this error before? I searched the site; no resolution so far. sqoop command: sqoop import --connect "jdbc:oracle:thin:@mysql.1521/PR" --username "

[no subject]

2015-05-02 Thread Nishanth S
Hello All, I am looking to write a map-only program in Java, the output of which is two Avro files. There can be two types of records in the input file (let us say lion and tiger), and based on an identifier they are processed and written to two different Avro files using 'AvroMultipleOutputs'
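
[Editor's note] A hedged sketch of the kind of map-only routing described above, using the org.apache.avro.mapreduce.AvroMultipleOutputs API; the record schema, the tab-separated input layout, and the "lion"/"tiger" identifiers are assumptions for illustration only.

    import java.io.IOException;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.mapred.AvroKey;
    import org.apache.avro.mapreduce.AvroMultipleOutputs;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class AnimalSplitMapper
            extends Mapper<LongWritable, Text, AvroKey<GenericRecord>, NullWritable> {

        // Hypothetical schema: each input line is "<type>\t<payload>".
        static final Schema SCHEMA = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Animal\",\"fields\":"
            + "[{\"name\":\"type\",\"type\":\"string\"},{\"name\":\"payload\",\"type\":\"string\"}]}");

        private AvroMultipleOutputs amos;

        @Override
        protected void setup(Context ctx) {
            amos = new AvroMultipleOutputs(ctx);
        }

        @Override
        protected void map(LongWritable key, Text line, Context ctx)
                throws IOException, InterruptedException {
            String[] parts = line.toString().split("\t", 2);
            GenericRecord rec = new GenericData.Record(SCHEMA);
            rec.put("type", parts[0]);
            rec.put("payload", parts.length > 1 ? parts[1] : "");
            // Route each record to the named output matching its identifier.
            String out = "lion".equals(parts[0]) ? "lion" : "tiger";
            amos.write(out, new AvroKey<GenericRecord>(rec), NullWritable.get());
        }

        @Override
        protected void cleanup(Context ctx) throws IOException, InterruptedException {
            amos.close();  // flush both Avro outputs
        }

        // Driver-side registration (sketch), one named output per record type:
        //   AvroMultipleOutputs.addNamedOutput(job, "lion",  AvroKeyOutputFormat.class, SCHEMA);
        //   AvroMultipleOutputs.addNamedOutput(job, "tiger", AvroKeyOutputFormat.class, SCHEMA);
        //   job.setNumReduceTasks(0);  // map-only
    }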

[no subject]

2015-04-02 Thread Sanjeev Tripurari
-- _ The information contained in this communication is intended solely for the use of the individual or entity to whom it is addressed and others authorized to receive it. It may contain confidential or legally privileged informatio

[no subject]

2015-03-12 Thread Deepak Bansal
-- - Thanks & Regards Deepak Bansal

Re: (no subject)

2015-03-05 Thread SP
I resolved it by downloading slf4j-simple-1.7.10.jar and copying it to $HADOOP_HOME/lib, and in .bashrc I added this variable: export HADOOP_CLASSPATH=$HADOOP_HOME/lib. The issue got resolved. Thanks a lot for your response, Raj. Thanks SP On Thu, Mar 5, 2015 at 11:52 AM, Raj K Singh wrote: > just confi

Re: (no subject)

2015-03-05 Thread Raj K Singh
Just configure a logging appender in the log4j settings and rerun the command. On Mar 5, 2015 12:30 AM, "SP" wrote: > Hello All, > > Why am I getting this error every time I execute a command? It was working > fine with the CDH4 version. When I upgraded to the CDH5 version this message > started showing up. > >
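
[Editor's note] For context, a minimal sketch of the kind of appender configuration the reply refers to, in log4j 1.x properties style; the file location and log level are assumptions:

    # e.g. in $HADOOP_CONF_DIR/log4j.properties
    log4j.rootLogger=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n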

[no subject]

2015-03-04 Thread SP
Hello All, Why am I getting this error every time I execute a command? It was working fine with the CDH4 version. When I upgraded to the CDH5 version this message started showing up. Does anyone have a resolution for this error? sudo -u hdfs hadoop fs -ls / SLF4J: Failed to load class "org.slf4j.impl.Stat

[no subject]

2015-01-11 Thread dinesh dakshan
unsubscribe

[no subject]

2015-01-04 Thread Verma, Indrajeet
unsubscribe "This e-mail and any attachments transmitted with it are for the sole use of the intended recipient(s) and may contain confidential , proprietary or privileged information. If you are not the intended recipient, please contact the sender by reply e-mail and destroy all copies of the

[no subject]

2014-11-26 Thread Jian Feng

[no subject]

2014-11-11 Thread Premal Shah
Hi, We recently upgraded from 1.0.4 to 1.2.1 and now have a ton of DFS balancing to do. On 1.0.4, when I set the bandwidth to 1 or 10 Gbps, the datanodes got working and were using a ton of network to get the nodes balanced. However, it looks like 1.2.1 is not honoring the setBandWidth setting, no mat
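
[Editor's note] For reference, a hedged sketch of the two usual balancer-bandwidth knobs on the 1.x line; the value is an example only, and whether the setting actually takes effect is exactly what this thread is questioning:

    # hdfs-site.xml (Hadoop 1.x key), persistent datanode-side limit in bytes/sec:
    #   dfs.balance.bandwidthPerSec = 104857600
    # runtime override pushed out to the datanodes through the namenode:
    hadoop dfsadmin -setBalancerBandwidth 104857600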

[no subject]

2014-09-26 Thread Naganarasimha G R (Naga)
Hi All, I have the following doubts on pluggable FileSystem and YARN: 1. If all the implementations should extend FileSystem, then why is there a parallel class AbstractFileSystem, which ViewFS extends? 2. Is YARN supposed to run on any pluggable org.apache.hadoop.fs.FileSystem like S3? If it

[no subject]

2014-09-23 Thread Poorvi Ahirwal
Hi, I am executing a MapReduce program with HCatalog and a Hive database. Even though the jars are included, it's showing this error: Exception in thread "main" java.io.IOException: com.google.common.util.concurrent.UncheckedExecutionException: javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jd

[no subject]

2014-09-11 Thread Sheena O'Connell
Hi, I have a Python script that throws an obvious error (NameError: name 'asooasdhoasdhio' is not defined), and I'm using that script as both the mapper and reducer for a streaming task. /usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.4.0.jar

[no subject]

2014-08-08 Thread Susheel Kumar Gadalay
Hi, I have a question. How do I selectively open a port range for the Hadoop YARN App Master on a cluster? I have seen the jira issue in http://mail-archives.apache.org/mod_mbox/hadoop-mapreduce-issues/201204.mbox/%3c74835698.75.1335357881103.javamail.tom...@hel.zones.apache.org%3E fixed in version 0.
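
[Editor's note] A possible starting point, hedged on the assumption that the property introduced by the JIRA referenced above is present in the running version, is the MapReduce AM client port-range setting in mapred-site.xml; the range below is an example only:

    <property>
      <name>yarn.app.mapreduce.am.job.client.port-range</name>
      <value>50100-50200</value>
    </property>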

[no subject]

2014-03-16 Thread Eric Chiu
Hi all, Could anyone tell me how to install and use this Hadoop plug-in? https://issues.apache.org/jira/browse/HDFS-385 I read the code but do not know where to install it or what command to use to install it all. Another problem is that there are .txt and .patch files; which one should be applied

[no subject]

2014-03-06 Thread Avinash Kujur
While importing jar files using mvn clean install -DskipTests -Pdist I am getting this error: [ERROR] The goal you specified requires a project to execute but there is no POM in this directory (/home/cloudera). Please verify you invoked Maven from the correct directory. -> [Help 1] Help me ou
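
[Editor's note] That particular error usually just means Maven was not run from the source checkout. A minimal sketch, assuming the Hadoop sources were cloned to ~/hadoop-common (path hypothetical):

    cd ~/hadoop-common            # run Maven from the directory containing the top-level pom.xml
    mvn clean install -DskipTests -Pdist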

[no subject]

2014-03-05 Thread Avinash Kujur
Hi, I am getting an error partway through downloading all the jars using the Maven command mvn clean install -DskipTests -Pdist. The error is: [INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-common --- [WARNING] [protoc, --version] failed with error code 1 Help me out.
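
[Editor's note] A quick check that often explains this warning (hedged; Hadoop builds of that era expected a specific Protocol Buffers compiler release, commonly 2.5.0, on the PATH):

    which protoc && protoc --version    # should print the release the build expects, e.g. libprotoc 2.5.0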

[no subject]

2014-03-05 Thread Avinash Kujur
When I am using the command mvn clean install -DskipTests -Pdist it is giving this error: [cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist [INFO] Scanning for projects... [INFO] [INFO] BUILD FAILURE [INFO] ---

[no subject]

2014-03-05 Thread Avinash Kujur
I am getting this error while cloning the Hadoop trunk code from git.apache.org using the terminal. The error is: [cloudera@localhost ~]$ git clone git://git.apache.org/hadoop-common.git hadoop Initialized empty Git repository in /home/cloudera/hadoop/.git/ fatal: Unable to look up git.apache.org (port 9418

[no subject]

2014-02-27 Thread Avinash Kujur
I am new to Hadoop. What are the issues I should start working with? I need some proper guidance; it would be helpful if someone shared his/her experience with me. I need to go through the code which fixed some issue. Please help me.

[no subject]

2014-02-26 Thread Avinash Kujur
Hi, Can I solve the Hadoop issues on https://koding.com/ ?

[no subject]

2014-02-21 Thread Aaron Zimmerman
The worker nodes on my version 2.2 cluster won't use more than 11 of the 30 total (24 allocated) for MapReduce jobs running in YARN. Does anyone have an idea what might be constraining the usage of RAM? I followed the steps listed here: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0/b

[no subject]

2014-02-20 Thread x

[no subject]

2014-02-14 Thread VJ Shalish
Can anyone send me valid comparison points or links covering big data offerings - MapR, Cloudera, Hortonworks, Oracle Big Data appliances, etc.? Thanks, Shalish.

[no subject]

2014-01-16 Thread Sameer Awasekar
I am new to Hadoop. I have to create a prefix tree from the data set (which is divided across nodes), use the output of the phase-one reducer as input to phase two, work on the prefix tree, and output the key-value pairs for phase two. The tree is constructed only once, during phase one

[no subject]

2014-01-10 Thread Andrea Barbato
Hi, I have a simple question. I have this example code: class WordCountMapper : public HadoopPipes::Mapper {public: // constructor: does nothing WordCountMapper( HadoopPipes::TaskContext& context ) { } // map function: receives a line, outputs (word,"1") to reducer. void map( HadoopPipes::

[no subject]

2014-01-05 Thread chandu banavaram
Hi experts, please clarify the following doubt: I am a Hadoop learner; how do I generate Hive reports? With regards, chandu.

[no subject]

2014-01-01 Thread Saeed Adel Mehraban
I have a Hadoop 2.2.0 installation on 3 VMs, one as master and 2 as slaves. When I try to run simple jobs like the provided wordcount sample, a job on 1 or a few files may or may not succeed (roughly a 50-50 chance of failure), but with more files I get a failure most of the tim

[no subject]

2013-12-16 Thread xeon Mailinglist
Is it possible to access the YARN webpages in a text browser?

[no subject]

2013-12-12 Thread chandu banavaram
Hi Expert, I want to know: when a client wants to store data into HDFS, who divides the big data into blocks that are then stored on DataNodes? I mean, when the client approaches the NameNode to store data, who divides the data into blocks, and how, before it is sent to the DataNo

[no subject]

2013-10-22 Thread Gopi Krishna M

[no subject]

2013-10-06 Thread dwld0...@gmail.com
Hi, I want to deploy a distributed Cloudera Hadoop. I deployed CDH with the tarball before, but libhadoop.so is not available in the tarball; besides, I cannot use the service command to start processes. Deploying Hadoop with the Cloudera RPM is inconvenient, so I want to know which kinds of mean

[no subject]

2013-09-20 Thread jamal sasha
Hi, So in native Hadoop streaming, how do I send a helper file? In core Hadoop, you can write your code in multiple files and then jar it up... but if I am using Hadoop streaming, must all my code be in a single file? Is that so?
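
[Editor's note] For illustration, a hedged sketch using the generic -files option, which ships extra local files into each task's working directory; the jar path, script names, and input/output paths are hypothetical:

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
      -files mapper.py,helper.py \
      -mapper "python mapper.py" \
      -reducer "python reducer.py" \
      -input /data/in -output /data/out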

[no subject]

2013-09-19 Thread Indrajeet, Verma
-- "This e-mail and any attachments transmitted with it are for the sole use of the intended recipient(s) and may contain confidential , proprietary or privileged information. If you are not the intended recipient, please contact the sender by reply e-mail and destroy all copies of the origi

[no subject]

2013-09-13 Thread Young-Geun Park

[no subject]

2013-07-12 Thread Anit Alexander
Hello, I am encountering a problem in a cdh4 environment. I can successfully run the map reduce job in the Hadoop cluster, but when I migrated the same map reduce job to my cdh4 environment it produces an error stating that it cannot read the next block (each block is 64 MB). Why is that so? Hadoop envir

[no subject]

2013-07-02 Thread Chui-Hui Chiu
Hello, I have a Hadoop 2.0.5 Alpha cluster. When I execute any Hadoop command, I see the following message. WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable Is it at the lib/native folder? How do I configure the s
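
[Editor's note] A commonly suggested workaround, sketched here under the assumption that a platform-native build actually exists under $HADOOP_HOME/lib/native (if it does not, the warning is expected and harmless):

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"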

[no subject]

2013-06-26 Thread ch huang
Hi, I built a new Hadoop cluster, but I cannot access HDFS. Why? I use CDH3u4 and RedHat 6.2. # hadoop fs -put /opt/test hdfs://192.168.10.22:9000/user/test 13/06/26 15:00:47 INFO ipc.Client: Retrying connect to server: / 192.168.10.22:9000. Already tried 0 time(s). 13/06/26 15:00:48 INFO ipc.Client: R

[no subject]

2013-06-19 Thread Lewis John Mcgibbney
Hi, I have various MR jobs running over at Nutch. An example is generating fetch lists from large numbers of seeds, fetching those fetch lists, etc. I never noticed it before, but although it seems that some tasks complete (both Map and Reduce) successfully, in a number of tasks the Map indicates

[no subject]

2013-06-12 Thread Job Thomas
Hi all, In the Hadoop 2 alpha package there is no conf directory; all configurations are in the hadoop2home/etc/hadoop directory. Also, there is no default configuration file (e.g. core-default.xml) in the same package. Can I proceed with this? Best Regards, Job M Thomas

[no subject]

2013-06-05 Thread Job Thomas
Hi all, When I am starting my JobTracker in a combined GridGain and Hadoop project I am getting the following error: Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: org.gridgain.grid.ggfs.hadoop1.GridGgfsHadoopFileSystem. Can anybody help me? Best

[no subject]

2013-06-01 Thread Lanati, Matteo
Hi all, I stumbled upon this problem as well while trying to run the default wordcount shipped with Hadoop 1.2.0. My testbed is made up of 2 virtual machines: Debian 7, Oracle Java 7, 2 GB RAM, 25 GB hard disk. One node is used as JT+NN, the other as TT+DN. Security is enabled. The input file i

[no subject]

2013-05-30 Thread Job Thomas
Hi All, I am in a team developing Hadoop with Hive. We are using the fair scheduler, but all Hive jobs are going to the same pool, whose name is the same as the username under which the Hive server is installed. That is: my Hive server runs as the user named 'hadoop', my Hive client program runs as the user named 'abc', but
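
[Editor's note] For context, a hedged sketch of the MR1 fair-scheduler approach this usually involves; the property value 'pool.name' and the pool 'abc' are examples, with names taken from the Hadoop 1.x fair-scheduler documentation:

    <!-- mapred-site.xml on the JobTracker: take the pool name from an explicit job
         property instead of the default user.name -->
    <property>
      <name>mapred.fairscheduler.poolnameproperty</name>
      <value>pool.name</value>
    </property>

    -- then from each Hive client session (e.g. the user 'abc'):
    set pool.name=abc;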

[no subject]

2013-05-28 Thread Kabjin Kwon

[no subject]

2013-04-30 Thread Sandeep Nemuri
-- Regards N.H Sandeep

[no subject]

2013-04-30 Thread Niketh Nikky

[no subject]

2013-04-26 Thread Mohsen B.Sarmadi
Hi, I am a newbie in Hadoop. I am running Hadoop on Mac OS X 10 and I can't load any files into HDFS. First of all, I am getting this error: localhost: 2013-04-26 19:08:31.330 java[14436:1b03] Unable to load realm info from SCDynamicStore which, from some posts, I understand means I should add this line to hado

[no subject]

2013-04-22 Thread suneel hadoop
Can anyone help me to change this SQL to Pig Latin? SELECT ('CSS'||DB.DISTRICT_CODE||DB.BILLING_ACCOUNT_NO) BAC_KEY, CASE WHEN T1.TAC_142 IS NULL THEN 'N' ELSE T1.TAC_142 END TAC_142 FROM ( SELECT DISTRICT_CODE,BILLING_ACCOUNT_NO, MAX(CASE WHEN TAC_1 = 'Y' AND (TAC_2 = 'Y' OR TAC_3 = 'Y')

[no subject]

2013-04-08 Thread Edd Grant
Hi all, I'm new to Hadoop and am posting my first message on this list. I have downloaded and installed the hadoop_1.1.1-1_x86_64.deb distro and have a couple of issues which are blocking me from progressing. I'm working through the 'Hadoop - The Definitive Guide' book and am trying to set up a t

[no subject]

2013-04-01 Thread oualid ait wafli
Hello, are there any French speakers here? :) Thanks

[no subject]

2013-03-29 Thread Mohit Vadhera
Hi, I am getting the error below while mounting fuse_dfs; it is a shared-library error while running the command mount -a. Can anybody tell me how to fix this, please? # cat /etc/fstab | grep hadoop hadoop-fuse-dfs#dfs://localhost:8020 /mnt/san1/hadoop_mount fuse allow_other,usetrash,rw 2 0 # mount -

[no subject]

2013-03-28 Thread oualid ait wafli
Hi, does someone know something about the EMC distribution for Big Data which integrates Hadoop and other tools? Thanks

[no subject]

2013-03-27 Thread Mix Nin
I wrote a Pig script as follows and stored it in an x.pig file: Data = LOAD '/' as ( ) NoNullData= FILTER Data by qe is not null; STORE (foreach (group NoNullData all) generate flatten($1)) into 'exp/$inputDatePig'; evnt_dtl =LOAD 'exp/$inputDatePig/part-r-0' AS (cust,) I execut

[no subject]

2013-03-24 Thread Fan Bai
Dear Sir, I have a question about Hadoop: when I use Hadoop and MapReduce to finish a job (only one job here), can I control which node a file is processed on? For example, I have only one job and this job has 10 files (10 mappers need to run). Also, among my servers, I have one head node and four

[no subject]

2013-03-20 Thread Jensen, Daniel
unsubscribe From: turboc...@gmail.com [turboc...@gmail.com] on behalf of John Conwell [j...@iamjohn.me] Sent: Wednesday, March 20, 2013 12:31 PM To: user@hadoop.apache.org Subject: Re: unsubscribe Totally off topic, but kind'a not. Why the hell are we st

[no subject]

2013-03-11 Thread preethi ganeshan
Hi, I want to modify Task.java so that it gives additional information in the userlogs files. How do I go about the modification? I am new to Hadoop. Shall I simply open the appropriate file under src/mapred in Eclipse, modify it, and save? Will that help? Thank you. Regards, Preethi Ganeshan

[no subject]

2013-03-06 Thread ashish_kumar_gupta
Unsubscribe me. How many more times do I have to mail you?

[no subject]

2013-02-25 Thread lahir marni
Hi, when I open the source of hadoop-1.0.4 I can see many files. I do not understand which one to start with. Can you suggest a way to understand the source code of hadoop-1.0.4? Thanks, Lahir

[no subject]

2013-02-22 Thread T. Kuro Kurosaka

[no subject]

2013-02-09 Thread Glen Mazza

[no subject]

2012-10-25 Thread lei liu
http://blog.csdn.net/onlyqi/article/details/6544989 https://issues.apache.org/jira/browse/HDFS-2185 http://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/HDFSHighAvailability.html http://blog.csdn.net/chenpingbupt/article/details/7922042 https://issues.apache.org/jira/browse/HADOOP-816

[no subject]

2012-10-17 Thread Zheng, Kai
Hi, When Kerberos authentication is used instead of the default "simple" method, is a Linux user account needed to run a MapReduce job for a principal? Why? For example, for a Kerberos principal "j...@whatever-company.com", if he needs to run a job, is the foll

[no subject]

2012-09-20 Thread Ivan Tretyakov
Hello! I need to change some TaskTracker options. After the options are changed I need to restart every TaskTracker to apply the changes. If there are jobs running on the cluster, I'll lose the results of map tasks on these TaskTrackers. Is there a way to do it without losing map task results? If not, is th

[no subject]

2012-09-10 Thread Anjish Bhondwe
-- *Regards,* *Anjish Bhondwe.*

[no subject]

2012-08-09 Thread 刘鎏
-- 刘鎏

[no subject]

2012-08-08 Thread Yingchao
unsubscribe Sent from my iPhone

[no subject]

2012-08-08 Thread Adeel Qureshi
Unsubscribe

[no subject]

2012-08-08 Thread A Ashwin
Hi, Is this the mail ID to contact Hadoop for any queries? Thanks, Ashwini. A Ashwini ASE Tata Consultancy Services Mailto: a.ash...@tcs.com Website: http://www.tcs.com Experience certainty. IT Services Business Solutions Outsourcing _