enable regular expression on which parameter?

2014-05-13 Thread Avinash Kujur
mapreduce-5851
I can see many parameters in the DistCp class. Which parameter do we need to
enable regular expressions on?

private static final String usage = NAME
  + " [OPTIONS] <srcurl>* <desturl>" +
  "\n\nOPTIONS:" +
  "\n-p[rbugp]              Preserve status" +
  "\n                       r: replication number" +
  "\n                       b: block size" +
  "\n                       u: user" +
  "\n                       g: group" +
  "\n                       p: permission" +
  "\n                       -p alone is equivalent to -prbugp" +
  "\n-i                     Ignore failures" +
  "\n-log <logdir>          Write logs to <logdir>" +
  "\n-m <num_maps>          Maximum number of simultaneous copies" +
  "\n-overwrite             Overwrite destination" +
  "\n-update                Overwrite if src size different from dst size" +
  "\n-f <urilist_uri>       Use list at <urilist_uri> as src list" +
  "\n-filelimit <n>         Limit the total number of files to be <= n" +
  "\n-sizelimit <n>         Limit the total size to be <= n bytes" +
  "\n-delete                Delete the files existing in the dst but not in src" +
  "\n-mapredSslConf <f>     Filename of SSL configuration for mapper task" +

  "\n\nNOTE 1: if -overwrite or -update are set, each source URI is " +
  "\n      interpreted as an isomorphic update to an existing directory." +
  "\nFor example:" +
  "\nhadoop " + NAME + " -p -update \"hdfs://A:8020/user/foo/bar\" " +
  "\"hdfs://B:8020/user/foo/baz\"\n" +
  "\n     would update all descendants of 'baz' also in 'bar'; it would " +
  "\n     *not* update /user/foo/baz/bar" +

  "\n\nNOTE 2: The parameter <n> in -filelimit and -sizelimit can be " +
  "\n     specified with symbolic representation.  For examples," +
  "\n       1230k = 1230 * 1024 = 1259520" +
  "\n       891g = 891 * 1024^3 = 956703965184" +

  "\n";
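As an aside, the symbolic sizes described in NOTE 2 are plain powers-of-1024 multipliers, so the examples can be checked directly with shell arithmetic:

```shell
# Verify the symbolic-size examples from NOTE 2: k = 1024, g = 1024^3.
echo $((1230 * 1024))                 # 1230k -> 1259520
echo $((891 * 1024 * 1024 * 1024))    # 891g  -> 956703965184
```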


how to make patch?

2014-04-01 Thread Avinash Kujur
hi,

How can I make a patch from my updated file? Please point me to the
procedure if possible. When I submitted the .patch file, it gave an
error like:

MAPREDUCE-5742 patch is being downloaded at Tue Apr  1 13:51:32 UTC
2014 
from http://issues.apache.org/jira/secure/attachment/12638063/mapreduce-5742.patch
cp: cannot stat `/home/jenkins/buildSupport/lib/*': No such file or directory
The patch does not appear to apply with p0 to p2
PATCH APPLICATION FAILED

please provide me some link for the procedure.

Regards,

Avinash
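As context for the question above: the QA bot tries to apply an attached patch at strip levels p0 through p2, so a patch whose `---`/`+++` headers match none of those levels fails as shown. A minimal sketch of producing a unified diff and inspecting its headers (the file names here are hypothetical stand-ins, not from the actual JIRA):

```shell
# Hypothetical files standing in for an original and an edited source file.
printf 'hello\n'       > Foo.java.orig
printf 'hello world\n' > Foo.java

# diff -u exits with status 1 when the files differ, so tolerate that code.
diff -u Foo.java.orig Foo.java > mapreduce-5742.patch || true

# The ---/+++ header paths determine which -p strip level applies.
head -n 2 mapreduce-5742.patch
```

For a git checkout, `git diff --no-prefix > file.patch` produces a patch that applies with -p0, while plain `git diff` (with its a/ and b/ prefixes) applies with -p1.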


patch format

2014-04-01 Thread Avinash Kujur
hello everyone,

Is there any specific format in which we need to write the patch file? I
submitted one patch but it failed. I looked at some of the patches that
were submitted previously. Is there any kind of format for a patch? If
there is, please give me the link to the format.

Regards,
Avinash


hadoop version

2014-03-31 Thread Avinash Kujur
hi,

How can I find out which Hadoop version I have built on my system (apart
from the version that came built into Cloudera)?

regards,
Avinash


any link for hadoop 1.3.0

2014-03-31 Thread Avinash Kujur
hi,

can anyone provide the link for hadoop 1.3.0?

regards,
Avinash


how to be assignee ?

2014-03-28 Thread Avinash Kujur
hi,

How can I be the assignee for a particular issue?
I can't see any option on the page for becoming the assignee.

Thanks.


Re: HADOOP_MAPRED_HOME not found!

2014-03-28 Thread Avinash Kujur
I am not getting where to set HADOOP_MAPRED_HOME or how to set it.

thanks


On Fri, Mar 28, 2014 at 12:06 AM, divye sheth  wrote:

> You can execute this command on any machine where you have set the
> HADOOP_MAPRED_HOME
>
> Thanks
> Divye Sheth
>
>
> On Fri, Mar 28, 2014 at 12:31 PM, Avinash Kujur  wrote:
>
>> we can execute the above command anywhere or do i need to execute it in
>> any particular directory?
>>
>> thanks
>>
>>
>> On Thu, Mar 27, 2014 at 11:41 PM, divye sheth wrote:
>>
>>> I believe you are using Hadoop 2. In order to get the mapred working you
>>> need to set the HADOOP_MAPRED_HOME path in either your /etc/profile or
>>> .bashrc file or you can use the command given below to temporarily set the
>>> variable.
>>>
>>> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
>>>
>>> $HADOOP_INSTALL is the location where the hadoop tar ball is extracted.
>>>
>>> This should work for you.
>>>
>>> Thanks
>>> Divye Sheth
>>>
>>>
>>>
>>> On Fri, Mar 28, 2014 at 11:53 AM, Rahul Singh <
>>> smart.rahul.i...@gmail.com> wrote:
>>>
>>>> Try adding the hadoop bin path to system path.
>>>>
>>>>
>>>> -Rahul Singh
>>>>
>>>>
>>>> On Fri, Mar 28, 2014 at 11:32 AM, Azuryy Yu  wrote:
>>>>
>>>>> it was defined at hadoop-config.sh
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Mar 28, 2014 at 1:19 PM, divye sheth wrote:
>>>>>
>>>>>> Which version of hadoop are u using? AFAIK the hadoop mapred home is
>>>>>> the directory where hadoop is installed or in other words untarred.
>>>>>>
>>>>>> Thanks
>>>>>> Divye Sheth
>>>>>> On Mar 28, 2014 10:43 AM, "Avinash Kujur"  wrote:
>>>>>>
>>>>>>> hi,
>>>>>>>
>>>>>>> when i am trying to execute this command:
>>>>>>> hadoop job -history ~/1
>>>>>>> its giving error like:
>>>>>>> DEPRECATED: Use of this script to execute mapred command is
>>>>>>> deprecated.
>>>>>>> Instead use the mapred command for it.
>>>>>>>
>>>>>>> HADOOP_MAPRED_HOME not found!
>>>>>>>
>>>>>>> from where can i get HADOOP_MAPRED_HOME?
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
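The suggestion in the thread above, as a short sketch; the /opt/hadoop path is an assumption, so substitute wherever your tarball was actually extracted:

```shell
# Assumed extraction directory of the Hadoop tarball - substitute your own.
export HADOOP_INSTALL=/opt/hadoop

# Point the mapred scripts at the install, as suggested in the thread.
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL

echo "HADOOP_MAPRED_HOME=$HADOOP_MAPRED_HOME"
```

To persist the setting across sessions, append the same `export` line to `~/.bashrc` or `/etc/profile`, as mentioned above.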


Re: HADOOP_MAPRED_HOME not found!

2014-03-28 Thread Avinash Kujur
Can we execute the above command anywhere, or do I need to execute it in a
particular directory?

thanks


On Thu, Mar 27, 2014 at 11:41 PM, divye sheth  wrote:

> I believe you are using Hadoop 2. In order to get the mapred working you
> need to set the HADOOP_MAPRED_HOME path in either your /etc/profile or
> .bashrc file or you can use the command given below to temporarily set the
> variable.
>
> export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
>
> $HADOOP_INSTALL is the location where the hadoop tar ball is extracted.
>
> This should work for you.
>
> Thanks
> Divye Sheth
>
>
>
> On Fri, Mar 28, 2014 at 11:53 AM, Rahul Singh 
> wrote:
>
>> Try adding the hadoop bin path to system path.
>>
>>
>> -Rahul Singh
>>
>>
>> On Fri, Mar 28, 2014 at 11:32 AM, Azuryy Yu  wrote:
>>
>>> it was defined at hadoop-config.sh
>>>
>>>
>>>
>>> On Fri, Mar 28, 2014 at 1:19 PM, divye sheth wrote:
>>>
>>>> Which version of hadoop are u using? AFAIK the hadoop mapred home is
>>>> the directory where hadoop is installed or in other words untarred.
>>>>
>>>> Thanks
>>>> Divye Sheth
>>>> On Mar 28, 2014 10:43 AM, "Avinash Kujur"  wrote:
>>>>
>>>>> hi,
>>>>>
>>>>> when i am trying to execute this command:
>>>>> hadoop job -history ~/1
>>>>> its giving error like:
>>>>> DEPRECATED: Use of this script to execute mapred command is deprecated.
>>>>> Instead use the mapred command for it.
>>>>>
>>>>> HADOOP_MAPRED_HOME not found!
>>>>>
>>>>> from where can i get HADOOP_MAPRED_HOME?
>>>>>
>>>>> thanks.
>>>>>
>>>>
>>>
>>
>


Re: HADOOP_MAPRED_HOME not found!

2014-03-27 Thread Avinash Kujur
 ; then
  CLASS=org.apache.hadoop.tools.HadoopArchives
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [[ "$COMMAND" = -* ]] ; then
  # class and package names cannot begin with a -
  echo "Error: No command named \`$COMMAND' was found. Perhaps you meant \`hadoop ${COMMAND#-}'"
  exit 1
else
  CLASS=$COMMAND
fi
shift

# Always respect HADOOP_OPTS and HADOOP_CLIENT_OPTS
HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"

#make sure security appender is turned off
HADOOP_OPTS="$HADOOP_OPTS
-Dhadoop.security.logger=${HADOOP_SECURITY_LOGGER:-INFO,NullAppender}"
if $cygwin; then
  CLASSPATH=`cygpath -p -w "$CLASSPATH"`
fi
export CLASSPATH=$CLASSPATH
exec "$JAVA" $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@"
;;

esac

thank you.



On Thu, Mar 27, 2014 at 10:19 PM, divye sheth  wrote:

> Which version of hadoop are u using? AFAIK the hadoop mapred home is the
> directory where hadoop is installed or in other words untarred.
>
> Thanks
> Divye Sheth
> On Mar 28, 2014 10:43 AM, "Avinash Kujur"  wrote:
>
>> hi,
>>
>> when i am trying to execute this command:
>> hadoop job -history ~/1
>> its giving error like:
>> DEPRECATED: Use of this script to execute mapred command is deprecated.
>> Instead use the mapred command for it.
>>
>> HADOOP_MAPRED_HOME not found!
>>
>> from where can i get HADOOP_MAPRED_HOME?
>>
>> thanks.
>>
>


HADOOP_MAPRED_HOME not found!

2014-03-27 Thread Avinash Kujur
hi,

when i am trying to execute this command:
hadoop job -history ~/1
its giving error like:
DEPRECATED: Use of this script to execute mapred command is deprecated.
Instead use the mapred command for it.

HADOOP_MAPRED_HOME not found!

from where can i get HADOOP_MAPRED_HOME?

thanks.


Re: execution of imported source code

2014-03-25 Thread Avinash Kujur
i have already imported the files. i want to build it.


On Tue, Mar 25, 2014 at 3:57 AM, Nitin Pawar wrote:

> you want to import the project into eclipse or you have got the code in
> eclipse and now want to do a build ?
>
>
> On Tue, Mar 25, 2014 at 4:22 PM, Avinash Kujur  wrote:
>
>> hi,
>>
>> how to execute the imported hadoop source code into eclipse?
>>
>> thanks
>>
>
>
>
> --
> Nitin Pawar
>


execution of imported source code

2014-03-25 Thread Avinash Kujur
hi,

How do I execute the Hadoop source code imported into Eclipse?

thanks


hadoop source code

2014-03-24 Thread Avinash Kujur
hi,

I downloaded the Hadoop source code from GitHub. After importing those
files into Eclipse, some of the classes and packages are missing, and I am
not able to get those files online.
Help me out to get all the files at once, and point me to which files I
need to import into Eclipse.

thanks.


history viewer issue

2014-03-18 Thread Avinash Kujur
hi,

How to solve this problem.

[cloudera@localhost ~]$ hadoop job -history ~/1
DEPRECATED: Use of this script to execute mapred command is deprecated.
Instead use the mapred command for it.

Exception in thread "main" java.io.IOException: Not able to initialize
History viewer
at org.apache.hadoop.mapred.HistoryViewer.<init>(HistoryViewer.java:95)
at org.apache.hadoop.mapred.JobClient.viewHistory(JobClient.java:1945)
at org.apache.hadoop.mapred.JobClient.run(JobClient.java:1894)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.mapred.JobClient.main(JobClient.java:2162)
Caused by: java.io.IOException: History directory
/home/cloudera/1/_logs/history does not exist
at org.apache.hadoop.mapred.HistoryViewer.<init>(HistoryViewer.java:76)
... 5 more

Regards,
Avinash


Re: error in hadoop hdfs while building the code.

2014-03-11 Thread Avinash Kujur
+import org.apache.hadoop.ipc.RefreshCallQueueProtocol;
+import org.apache.hadoop.ipc.protocolPB.RefreshCallQueueProtocolPB;
+import org.apache.hadoop.ipc.protocolPB.RefreshCallQueueProtocolClientSideTranslatorPB;

+private static RefreshCallQueueProtocol
+    createNNProxyWithRefreshCallQueueProtocol(InetSocketAddress address,
+        Configuration conf, UserGroupInformation ugi) throws IOException {
+  RefreshCallQueueProtocolPB proxy = (RefreshCallQueueProtocolPB)
+      createNameNodeProxy(address, conf, ugi, RefreshCallQueueProtocolPB.class, 0);
+  return new RefreshCallQueueProtocolClientSideTranslatorPB(proxy);
+}


All the lines marked with + signs show an error with the message
"RefreshCallQueueProtocol cannot be resolved", and the problem type is
"Java problem".


regards,
Avinash


On Tue, Mar 11, 2014 at 11:09 PM, unmesha sreeveni wrote:

> I think it is Hadoop problem not java
> https://issues.apache.org/jira/browse/HADOOP-5396
>
>
> On Wed, Mar 12, 2014 at 11:37 AM, Avinash Kujur  wrote:
>
>> hi,
>>  i am getting error like "RefreshCallQueueProtocol can not be resolved".
>> it is a java problem.
>>
>> help me out.
>>
>> Regards,
>> Avinash
>>
>
>
>
> --
> *Thanks & Regards*
>
> Unmesha Sreeveni U.B
> Junior Developer
>
> http://www.unmeshasreeveni.blogspot.in/
>
>
>


error in hadoop hdfs while building the code.

2014-03-11 Thread Avinash Kujur
hi,
 i am getting error like "RefreshCallQueueProtocol can not be resolved".
it is a java problem.

help me out.

Regards,
Avinash


Re: regarding hadoop source code

2014-03-11 Thread Avinash Kujur
[INFO] BUILD FAILURE
[INFO]

[INFO] Total time: 32:00 min
[INFO] Finished at: 2014-03-11T05:01:00-08:00
[INFO] Final Memory: 52M/238M
[INFO]

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on
project hadoop-common: There are test failures.
[ERROR]
[ERROR] Please refer to
/home/cloudera/hadoop/hadoop-common-project/hadoop-common/target/surefire-reports
for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

I have already updated my Java to 1.7 as you mentioned earlier.
Please help me out.

regards,
avinash



On Mon, Mar 10, 2014 at 9:10 PM, Oleg Zhurakousky <
oleg.zhurakou...@gmail.com> wrote:

> You must be using Java 1.5 or below where @Override is not allowed on any
> method that implements its counterpart from interface.
> Remember, both 1.5 and 1.6 are EOL, so I would suggest upgrading to 1.7.
> Oleg
>
>
> On Mon, Mar 10, 2014 at 10:49 AM, Avinash Kujur  wrote:
>
>>
>> hi,
>>
>> i downloaded the code from https://github.com/apache/hadoop-common.git .
>>
>> but while executing the command
>>
>> mvn install -DskipTests
>>
>> its giving this error in between:
>>
>> [INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ 
>> hadoop-hdfs-httpfs ---
>>
>>
>> [INFO] Compiling 56 source files to 
>> /home/cloudera/hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes
>> [INFO] -
>> [ERROR] COMPILATION ERROR :
>> [INFO] -
>>
>>
>> [ERROR] 
>> /home/cloudera/hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/client/HttpFSFileSystem.java:[921,2]
>>  method does not override or implement a method from a supertype
>> [ERROR] 
>> /home/cloudera/hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/client/HttpsFSFileSystem.java:[26,7]
>>  org.apache.hadoop.fs.http.client.HttpsFSFileSystem is not abstract and does 
>> not override abstract method 
>> setDelegationToken(org.apache.hadoop.security.token.Token) in 
>> org.apache.hadoop.fs.DelegationTokenRenewer.Renewable
>>
>>
>> [INFO] 2 errors
>> [INFO] -
>> [INFO] 
>> 
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] Apache Hadoop Main  SUCCESS [  8.062 
>> s]
>>
>>
>> [INFO] Apache Hadoop Project POM . SUCCESS [  2.231 
>> s]
>> [INFO] Apache Hadoop Annotations . SUCCESS [  4.954 
>> s]
>> [INFO] Apache Hadoop Project Dist POM  SUCCESS [  5.232 
>> s]
>>
>>
>> [INFO] Apache Hadoop Assemblies .. SUCCESS [  1.923 
>> s]
>> [INFO] Apache Hadoop Maven Plugins ... SUCCESS [ 18.385 
>> s]
>> [INFO] Apache Hadoop MiniKDC . SUCCESS [  8.739 
>> s]
>>
>>
>> [INFO] Apache Hadoop Auth  SUCCESS [  7.932 
>> s]
>> [INFO] Apache Hadoop Auth Examples ... SUCCESS [  2.803 
>> s]
>> [INFO] Apache Hadoop Common .. SUCCESS [ 55.787 
>> s]
>>
>>
>> [INFO] Apache Hadoop NFS . SUCCESS [  3.162 
>> s]
>> [INFO] Apache Hadoop Common Project .. SUCCESS [  0.256 
>> s]
>> [INFO] Apache Hadoop HDFS  SUCCESS [01:40 
>> min]
>>
>>
>> [INFO] Apache Hadoop HttpFS .. FAILURE [  2.917 
>> s]
>> [INFO] Apache Hadoop HDFS BookKeeper Journal . SKIPPED
>> [INFO] Apache Hadoop HDFS-NFS  SKIPPED
>>
>>
>> [INFO] Apache Hadoop HDFS Project  SKIPPED
>> [INFO] hadoop-yarn ... SKIPPED
>> [INFO] hadoop-yarn-api ... SKIPPED
>>
>>
>> did you guys faced the same problem?
>>
>> plz give me some suggestion.
>>
>> Regards Avinash
>>
>>
>


Re: regarding hadoop source code

2014-03-10 Thread Avinash Kujur
hi,

i downloaded the code from https://github.com/apache/hadoop-common.git .

but while executing the command

mvn install -DskipTests

its giving this error in between:

[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @
hadoop-hdfs-httpfs ---
[INFO] Compiling 56 source files to
/home/cloudera/hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes
[INFO] -
[ERROR] COMPILATION ERROR :
[INFO] -
[ERROR] 
/home/cloudera/hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/client/HttpFSFileSystem.java:[921,2]
method does not override or implement a method from a supertype
[ERROR] 
/home/cloudera/hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/client/HttpsFSFileSystem.java:[26,7]
org.apache.hadoop.fs.http.client.HttpsFSFileSystem is not abstract and
does not override abstract method
setDelegationToken(org.apache.hadoop.security.token.Token) in
org.apache.hadoop.fs.DelegationTokenRenewer.Renewable
[INFO] 2 errors
[INFO] -
[INFO] 
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main  SUCCESS [  8.062 s]
[INFO] Apache Hadoop Project POM . SUCCESS [  2.231 s]
[INFO] Apache Hadoop Annotations . SUCCESS [  4.954 s]
[INFO] Apache Hadoop Project Dist POM  SUCCESS [  5.232 s]
[INFO] Apache Hadoop Assemblies .. SUCCESS [  1.923 s]
[INFO] Apache Hadoop Maven Plugins ... SUCCESS [ 18.385 s]
[INFO] Apache Hadoop MiniKDC . SUCCESS [  8.739 s]
[INFO] Apache Hadoop Auth  SUCCESS [  7.932 s]
[INFO] Apache Hadoop Auth Examples ... SUCCESS [  2.803 s]
[INFO] Apache Hadoop Common .. SUCCESS [ 55.787 s]
[INFO] Apache Hadoop NFS . SUCCESS [  3.162 s]
[INFO] Apache Hadoop Common Project .. SUCCESS [  0.256 s]
[INFO] Apache Hadoop HDFS  SUCCESS [01:40 min]
[INFO] Apache Hadoop HttpFS .. FAILURE [  2.917 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal . SKIPPED
[INFO] Apache Hadoop HDFS-NFS  SKIPPED
[INFO] Apache Hadoop HDFS Project  SKIPPED
[INFO] hadoop-yarn ... SKIPPED
[INFO] hadoop-yarn-api ... SKIPPED


Did you guys face the same problem?

Please give me some suggestions.

Regards Avinash


History viewer issue

2014-03-07 Thread Avinash Kujur
[cloudera@localhost bin]$ hadoop job -history ~/1
DEPRECATED: Use of this script to execute mapred command is deprecated.
Instead use the mapred command for it.

Exception in thread "main" java.io.IOException: Not able to initialize
History viewer
at org.apache.hadoop.mapred.HistoryViewer.<init>(HistoryViewer.java:95)
at org.apache.hadoop.mapred.JobClient.viewHistory(JobClient.java:1945)
at org.apache.hadoop.mapred.JobClient.run(JobClient.java:1894)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.mapred.JobClient.main(JobClient.java:2162)
Caused by: java.io.IOException: History directory
/home/cloudera/1/_logs/history does not exist
at org.apache.hadoop.mapred.HistoryViewer.<init>(HistoryViewer.java:76)


do i need to make directory for history?
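The stack trace shows the viewer failing a directory-existence check: `hadoop job -history <outdir>` expects `<outdir>/_logs/history` to exist, which the framework writes when job history logging is enabled. A sketch of that path check, with `./job-out` as a hypothetical stand-in for ~/1:

```shell
# Stand-in for the real job output directory passed to 'hadoop job -history'.
OUT=./job-out

# Lay out the directory the history viewer looks for.
mkdir -p "$OUT/_logs/history"

[ -d "$OUT/_logs/history" ] && echo "history directory present"
```

Note that creating the directory by hand only satisfies the existence check; the actual history files are produced by the job run itself.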


Re: how to import the hadoop code into eclipse.

2014-03-06 Thread Avinash Kujur
I did that, but I have some doubts about importing the code, because it
shows some warnings and errors on the imported modules. I was wondering if
you could give me a link to the proper procedure.


On Thu, Mar 6, 2014 at 9:21 PM, Zhijie Shen  wrote:

> mvn eclipse:eclipse, and then import the existing projects in eclipse.
>
> - Zhijie
>
>
> On Thu, Mar 6, 2014 at 9:00 PM, Avinash Kujur  wrote:
>
>> hi,
>>
>> i have downloaded the hadoop code. And executed maven command
>> successfully. how to import hadoop source code cleanly. because its showing
>> red exclamation mark on some of the modules while i am importing it.
>> help me out.
>>  thanks in advance.
>>
>>
>
>
>
> --
> Zhijie Shen
> Hortonworks Inc.
> http://hortonworks.com/
>
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity
> to which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.


how to import the hadoop code into eclipse.

2014-03-06 Thread Avinash Kujur
hi,

I have downloaded the Hadoop code and executed the Maven command
successfully. How do I import the Hadoop source code cleanly? It shows a
red exclamation mark on some of the modules while I am importing them.
Help me out.
Thanks in advance.


[no subject]

2014-03-06 Thread Avinash Kujur
while importing jar files using..
mvn clean install -DskipTests -Pdist

 i am getting this error,


[ERROR] The goal you specified requires a project to execute but there is
no POM in this directory (/home/cloudera). Please verify you invoked Maven
from the correct directory. -> [Help 1]

help me out
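The "no POM in this directory" error means Maven was launched outside the source checkout, as the later replies in this archive point out. A sketch of the pre-flight check (the checkout path is an assumption; substitute your own):

```shell
# Assumed location of the Hadoop source checkout - substitute your own.
SRC=$HOME/hadoop-common-trunk

# Maven only works from a directory containing the top-level pom.xml.
if [ -f "$SRC/pom.xml" ]; then
  MSG="pom.xml found - run 'mvn clean install -DskipTests -Pdist' from $SRC"
else
  MSG="no pom.xml in $SRC - cd into the source checkout first"
fi
echo "$MSG"
```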


Re:

2014-03-05 Thread Avinash Kujur
Yes, protobuf is installed: libprotoc 2.4.1.
I checked.


On Wed, Mar 5, 2014 at 11:04 PM, Gordon Wang  wrote:

> Do you have protobuf installed on your build box?
> you can use "which protoc" to check.
> Looks like protobuf is missing.
>
>
> On Thu, Mar 6, 2014 at 2:55 PM, Avinash Kujur  wrote:
>
>> hi,
>>
I am getting an error partway through when downloading all the jars using
the Maven command:
>>
>> mvn clean install -DskipTests -Pdist
>>
>>
>> the error is:
>>
>>
>> [INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @
>> hadoop-common ---
>> [WARNING] [protoc, --version] failed with error code 1
>>
>> help me out.
>>
>>  thanks in advance.
>>
>
>
>
> --
> Regards
> Gordon Wang
>
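Gordon's suggested check above, as a sketch. Note that trunk around this time required protobuf 2.5.0, so the libprotoc 2.4.1 reported in the reply would be consistent with the protoc plugin failing even though `which protoc` succeeds:

```shell
# Check whether protoc is on PATH, as suggested in the thread, and report
# its version so it can be compared against what the build requires.
if command -v protoc >/dev/null 2>&1; then
  PROTOC_MSG=$(protoc --version)
else
  PROTOC_MSG="protoc not found - install protobuf before building"
fi
echo "$PROTOC_MSG"
```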


[no subject]

2014-03-05 Thread Avinash Kujur
hi,

I am getting an error partway through when downloading all the jars using
the Maven command:

mvn clean install -DskipTests -Pdist


the error is:


[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @
hadoop-common ---
[WARNING] [protoc, --version] failed with error code 1

help me out.

 thanks in advance.


Re:

2014-03-05 Thread Avinash Kujur
yes ming.


On Wed, Mar 5, 2014 at 2:56 AM, Mingjiang Shi  wrote:

> Can you access this link?
>
> http://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.4.0/maven-bundle-plugin-2.4.0.pom
>
>
> On Wed, Mar 5, 2014 at 6:54 PM, Avinash Kujur  wrote:
>
If I follow the repo.maven.apache.org link in my browser, it shows this
message:
>>
>> Browsing for this directory has been disabled.
>>
>> View <http://search.maven.org/#browse> this directory's contents on
>> http://search.maven.org <http://search.maven.org/#browse> instead.
>>
so how can I change the link from repo.maven.apache.org to
http://search.maven.org ?
>>
>>
>> On Wed, Mar 5, 2014 at 2:49 AM, Avinash Kujur  wrote:
>>
>>> yes. it has internet access.
>>>
>>>
>>> On Wed, Mar 5, 2014 at 2:47 AM, Mingjiang Shi wrote:
>>>
>>>> see the error message:
>>>>
>>>> Unknown host repo.maven.apache.org -> [Help 2]
>>>>
>>>>
>>>> Does your machine has internet access?
>>>>
>>>>
>>>> On Wed, Mar 5, 2014 at 6:42 PM, Avinash Kujur wrote:
>>>>
>>>>> home/cloudera/ contains hadoop files.
>>>>>
>>>>>
>>>>> On Wed, Mar 5, 2014 at 2:40 AM, Avinash Kujur wrote:
>>>>>
>>>>>> [cloudera@localhost hadoop-common-trunk]$ mvn clean install
>>>>>> -DskipTests -Pdist
>>>>>> [INFO] Scanning for projects...
>>>>>> Downloading:
>>>>>> http://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.4.0/maven-bundle-plugin-2.4.0.pom
>>>>>> [ERROR] The build could not read 1 project -> [Help 1]
>>>>>> [ERROR]
>>>>>> [ERROR]   The project org.apache.hadoop:hadoop-main:3.0.0-SNAPSHOT
>>>>>> (/home/cloudera/hadoop-common-trunk/pom.xml) has 1 error
>>>>>> [ERROR] Unresolveable build extension: Plugin
>>>>>> org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies 
>>>>>> could
>>>>>> not be resolved: Failed to read artifact descriptor for
>>>>>> org.apache.felix:maven-bundle-plugin:jar:2.4.0: Could not transfer 
>>>>>> artifact
>>>>>> org.apache.felix:maven-bundle-plugin:pom:2.4.0 from/to central (
>>>>>> http://repo.maven.apache.org/maven2): repo.maven.apache.org: Unknown
>>>>>> host repo.maven.apache.org -> [Help 2]
>>>>>>
>>>>>> [ERROR]
>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>>>>> the -e switch.
>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>> [ERROR]
>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>> please read the following articles:
>>>>>>  [ERROR] [Help 1]
>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
>>>>>> [ERROR] [Help 2]
>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
>>>>>>
>>>>>>
>>>>>> when i execute this command from hadoop directory this is the error i
>>>>>> am getting.
>>>>>>
>>>>>>
>>>>>> On Wed, Mar 5, 2014 at 2:33 AM, Mingjiang Shi wrote:
>>>>>>
>>>>>>> Did you execute the command from /home/cloudera? Does it contains
>>>>>>> the hadoop source code? You need to execute the command from the source
>>>>>>> code directory.
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Mar 5, 2014 at 6:28 PM, Avinash Kujur wrote:
>>>>>>>
>>>>>>>> when i am using this command
>>>>>>>> mvn clean install -DskipTests -Pdist
>>>>>>>>
>>>>>>>> its giving this error:
>>>>>>>>
>>>>>>>> [cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist
>>>>>>>> [INFO] Scanning for projects...
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>>>> [INFO] Total time: 0.170 s
>>>>>>>> [INFO] Finished at: 2014-03-05T02:25:52-08:00
>>>>>>>> [INFO] Final Memory: 2M/43M
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>>>> [WARNING] The requested profile "dist" could not be activated
>>>>>>>> because it does not exist.
>>>>>>>> [ERROR] The goal you specified requires a project to execute but
>>>>>>>> there is no POM in this directory (/home/cloudera). Please verify you
>>>>>>>> invoked Maven from the correct directory. -> [Help 1]
>>>>>>>> [ERROR]
>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven
>>>>>>>> with the -e switch.
>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
>>>>>>>> logging.
>>>>>>>> [ERROR]
>>>>>>>> [ERROR] For more information about the errors and possible
>>>>>>>> solutions, please read the following articles:
>>>>>>>> [ERROR] [Help 1]
>>>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> help me out.
>>>>>>>> Thanks in advance. :)
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Cheers
>>>>>>> -MJ
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Cheers
>>>> -MJ
>>>>
>>>
>>>
>>
>
>
> --
> Cheers
> -MJ
>


Re:

2014-03-05 Thread Avinash Kujur
If I follow the repo.maven.apache.org link in my browser, it shows this
message:

Browsing for this directory has been disabled.

View <http://search.maven.org/#browse> this directory's contents on
http://search.maven.org <http://search.maven.org/#browse> instead.

so how can I change the link from repo.maven.apache.org to
http://search.maven.org ?


On Wed, Mar 5, 2014 at 2:49 AM, Avinash Kujur  wrote:

> yes. it has internet access.
>
>
> On Wed, Mar 5, 2014 at 2:47 AM, Mingjiang Shi  wrote:
>
>> see the error message:
>>
>> Unknown host repo.maven.apache.org -> [Help 2]
>>
>>
>> Does your machine has internet access?
>>
>>
>> On Wed, Mar 5, 2014 at 6:42 PM, Avinash Kujur  wrote:
>>
>>> home/cloudera/ contains hadoop files.
>>>
>>>
>>> On Wed, Mar 5, 2014 at 2:40 AM, Avinash Kujur  wrote:
>>>
>>>> [cloudera@localhost hadoop-common-trunk]$ mvn clean install
>>>> -DskipTests -Pdist
>>>> [INFO] Scanning for projects...
>>>> Downloading:
>>>> http://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.4.0/maven-bundle-plugin-2.4.0.pom
>>>> [ERROR] The build could not read 1 project -> [Help 1]
>>>> [ERROR]
>>>> [ERROR]   The project org.apache.hadoop:hadoop-main:3.0.0-SNAPSHOT
>>>> (/home/cloudera/hadoop-common-trunk/pom.xml) has 1 error
>>>> [ERROR] Unresolveable build extension: Plugin
>>>> org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could
>>>> not be resolved: Failed to read artifact descriptor for
>>>> org.apache.felix:maven-bundle-plugin:jar:2.4.0: Could not transfer artifact
>>>> org.apache.felix:maven-bundle-plugin:pom:2.4.0 from/to central (
>>>> http://repo.maven.apache.org/maven2): repo.maven.apache.org: Unknown
>>>> host repo.maven.apache.org -> [Help 2]
>>>>
>>>> [ERROR]
>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>>> the -e switch.
>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>> [ERROR]
>>>> [ERROR] For more information about the errors and possible solutions,
>>>> please read the following articles:
>>>>  [ERROR] [Help 1]
>>>> http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
>>>> [ERROR] [Help 2]
>>>> http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
>>>>
>>>>
>>>> when i execute this command from hadoop directory this is the error i
>>>> am getting.
>>>>
>>>>
>>>> On Wed, Mar 5, 2014 at 2:33 AM, Mingjiang Shi wrote:
>>>>
>>>>> Did you execute the command from /home/cloudera? Does it contains the
>>>>> hadoop source code? You need to execute the command from the source code
>>>>> directory.
>>>>>
>>>>>
>>>>> On Wed, Mar 5, 2014 at 6:28 PM, Avinash Kujur wrote:
>>>>>
>>>>>> when i am using this command
>>>>>> mvn clean install -DskipTests -Pdist
>>>>>>
>>>>>> its giving this error:
>>>>>>
>>>>>> [cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist
>>>>>> [INFO] Scanning for projects...
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO] BUILD FAILURE
>>>>>> [INFO]
>>>>>> 
>>>>>> [INFO] Total time: 0.170 s
>>>>>> [INFO] Finished at: 2014-03-05T02:25:52-08:00
>>>>>> [INFO] Final Memory: 2M/43M
>>>>>> [INFO]
>>>>>> 
>>>>>> [WARNING] The requested profile "dist" could not be activated because
>>>>>> it does not exist.
>>>>>> [ERROR] The goal you specified requires a project to execute but
>>>>>> there is no POM in this directory (/home/cloudera). Please verify you
>>>>>> invoked Maven from the correct directory. -> [Help 1]
>>>>>> [ERROR]
>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>>>>> the -e switch.
>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>> [ERROR]
>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>> please read the following articles:
>>>>>> [ERROR] [Help 1]
>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
>>>>>>
>>>>>>
>>>>>>
>>>>>> help me out.
>>>>>> Thanks in advance. :)
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Cheers
>>>>> -MJ
>>>>>
>>>>
>>>>
>>>
>>
>>
>> --
>> Cheers
>> -MJ
>>
>
>


Re:

2014-03-05 Thread Avinash Kujur
yes. it has internet access.


On Wed, Mar 5, 2014 at 2:47 AM, Mingjiang Shi  wrote:

> see the error message:
>
> Unknown host repo.maven.apache.org -> [Help 2]
>
>
> Does your machine has internet access?
>
>
> On Wed, Mar 5, 2014 at 6:42 PM, Avinash Kujur  wrote:
>
>> home/cloudera/ contains hadoop files.
>>
>>
>> On Wed, Mar 5, 2014 at 2:40 AM, Avinash Kujur  wrote:
>>
>>> [cloudera@localhost hadoop-common-trunk]$ mvn clean install -DskipTests
>>> -Pdist
>>> [INFO] Scanning for projects...
>>> Downloading:
>>> http://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.4.0/maven-bundle-plugin-2.4.0.pom
>>> [ERROR] The build could not read 1 project -> [Help 1]
>>> [ERROR]
>>> [ERROR]   The project org.apache.hadoop:hadoop-main:3.0.0-SNAPSHOT
>>> (/home/cloudera/hadoop-common-trunk/pom.xml) has 1 error
>>> [ERROR] Unresolveable build extension: Plugin
>>> org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could
>>> not be resolved: Failed to read artifact descriptor for
>>> org.apache.felix:maven-bundle-plugin:jar:2.4.0: Could not transfer artifact
>>> org.apache.felix:maven-bundle-plugin:pom:2.4.0 from/to central (
>>> http://repo.maven.apache.org/maven2): repo.maven.apache.org: Unknown
>>> host repo.maven.apache.org -> [Help 2]
>>>
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>> -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>>  [ERROR] [Help 1]
>>> http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
>>> [ERROR] [Help 2]
>>> http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
>>>
>>>
>>> when i execute this command from hadoop directory this is the error i am
>>> getting.
>>>
>>>
>>> On Wed, Mar 5, 2014 at 2:33 AM, Mingjiang Shi wrote:
>>>
>>>> Did you execute the command from /home/cloudera? Does it contain the
>>>> Hadoop source code? You need to execute the command from the source code
>>>> directory.
>>>>
>>>>
>>>> On Wed, Mar 5, 2014 at 6:28 PM, Avinash Kujur wrote:
>>>>
>>>>> when i am using this command
>>>>> mvn clean install -DskipTests -Pdist
>>>>>
>>>>> its giving this error:
>>>>>
>>>>> [cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist
>>>>> [INFO] Scanning for projects...
>>>>> [INFO]
>>>>> 
>>>>> [INFO] BUILD FAILURE
>>>>> [INFO]
>>>>> 
>>>>> [INFO] Total time: 0.170 s
>>>>> [INFO] Finished at: 2014-03-05T02:25:52-08:00
>>>>> [INFO] Final Memory: 2M/43M
>>>>> [INFO]
>>>>> 
>>>>> [WARNING] The requested profile "dist" could not be activated because
>>>>> it does not exist.
>>>>> [ERROR] The goal you specified requires a project to execute but there
>>>>> is no POM in this directory (/home/cloudera). Please verify you invoked
>>>>> Maven from the correct directory. -> [Help 1]
>>>>> [ERROR]
>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>>>> the -e switch.
>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>> [ERROR]
>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>> please read the following articles:
>>>>> [ERROR] [Help 1]
>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
>>>>>
>>>>>
>>>>>
>>>>> help me out.
>>>>> Thanks in advance. :)
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Cheers
>>>> -MJ
>>>>
>>>
>>>
>>
>
>
> --
> Cheers
> -MJ
>


Re:

2014-03-05 Thread Avinash Kujur
/home/cloudera/ contains the Hadoop files.


On Wed, Mar 5, 2014 at 2:40 AM, Avinash Kujur  wrote:

> [cloudera@localhost hadoop-common-trunk]$ mvn clean install -DskipTests
> -Pdist
> [INFO] Scanning for projects...
> Downloading:
> http://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.4.0/maven-bundle-plugin-2.4.0.pom
> [ERROR] The build could not read 1 project -> [Help 1]
> [ERROR]
> [ERROR]   The project org.apache.hadoop:hadoop-main:3.0.0-SNAPSHOT
> (/home/cloudera/hadoop-common-trunk/pom.xml) has 1 error
> [ERROR] Unresolveable build extension: Plugin
> org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could
> not be resolved: Failed to read artifact descriptor for
> org.apache.felix:maven-bundle-plugin:jar:2.4.0: Could not transfer artifact
> org.apache.felix:maven-bundle-plugin:pom:2.4.0 from/to central (
> http://repo.maven.apache.org/maven2): repo.maven.apache.org: Unknown host
> repo.maven.apache.org -> [Help 2]
>
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
> [ERROR] [Help 2]
> http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
>
>
> When I execute this command from the Hadoop directory, this is the error I
> am getting.
>
>
> On Wed, Mar 5, 2014 at 2:33 AM, Mingjiang Shi  wrote:
>
>> Did you execute the command from /home/cloudera? Does it contain the
>> Hadoop source code? You need to execute the command from the source code
>> directory.
>>
>>
>> On Wed, Mar 5, 2014 at 6:28 PM, Avinash Kujur  wrote:
>>
>>> when i am using this command
>>> mvn clean install -DskipTests -Pdist
>>>
>>> its giving this error:
>>>
>>> [cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist
>>> [INFO] Scanning for projects...
>>> [INFO]
>>> 
>>> [INFO] BUILD FAILURE
>>> [INFO]
>>> 
>>> [INFO] Total time: 0.170 s
>>> [INFO] Finished at: 2014-03-05T02:25:52-08:00
>>> [INFO] Final Memory: 2M/43M
>>> [INFO]
>>> 
>>> [WARNING] The requested profile "dist" could not be activated because it
>>> does not exist.
>>> [ERROR] The goal you specified requires a project to execute but there
>>> is no POM in this directory (/home/cloudera). Please verify you invoked
>>> Maven from the correct directory. -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>> -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>> [ERROR] [Help 1]
>>> http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
>>>
>>>
>>>
>>> help me out.
>>> Thanks in advance. :)
>>>
>>
>>
>>
>> --
>> Cheers
>> -MJ
>>
>
>


Re:

2014-03-05 Thread Avinash Kujur
[cloudera@localhost hadoop-common-trunk]$ mvn clean install -DskipTests
-Pdist
[INFO] Scanning for projects...
Downloading:
http://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.4.0/maven-bundle-plugin-2.4.0.pom
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR]   The project org.apache.hadoop:hadoop-main:3.0.0-SNAPSHOT
(/home/cloudera/hadoop-common-trunk/pom.xml) has 1 error
[ERROR] Unresolveable build extension: Plugin
org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could
not be resolved: Failed to read artifact descriptor for
org.apache.felix:maven-bundle-plugin:jar:2.4.0: Could not transfer artifact
org.apache.felix:maven-bundle-plugin:pom:2.4.0 from/to central (
http://repo.maven.apache.org/maven2): repo.maven.apache.org: Unknown host
repo.maven.apache.org -> [Help 2]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
[ERROR] [Help 2]
http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException


When I execute this command from the Hadoop directory, this is the error I am
getting.


On Wed, Mar 5, 2014 at 2:33 AM, Mingjiang Shi  wrote:

> Did you execute the command from /home/cloudera? Does it contain the
> Hadoop source code? You need to execute the command from the source code
> directory.
>
>
> On Wed, Mar 5, 2014 at 6:28 PM, Avinash Kujur  wrote:
>
>> when i am using this command
>> mvn clean install -DskipTests -Pdist
>>
>> its giving this error:
>>
>> [cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist
>> [INFO] Scanning for projects...
>> [INFO]
>> 
>> [INFO] BUILD FAILURE
>> [INFO]
>> 
>> [INFO] Total time: 0.170 s
>> [INFO] Finished at: 2014-03-05T02:25:52-08:00
>> [INFO] Final Memory: 2M/43M
>> [INFO]
>> 
>> [WARNING] The requested profile "dist" could not be activated because it
>> does not exist.
>> [ERROR] The goal you specified requires a project to execute but there is
>> no POM in this directory (/home/cloudera). Please verify you invoked Maven
>> from the correct directory. -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>> -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>> [ERROR] [Help 1]
>> http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
>>
>>
>>
>> help me out.
>> Thanks in advance. :)
>>
>
>
>
> --
> Cheers
> -MJ
>
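The "Unknown host repo.maven.apache.org" failure above means Maven could not resolve Central's hostname at all, so the fix is DNS or proxy configuration, not the POM. A minimal diagnostic sketch (the proxy host and port below are placeholders, not values from this thread):

```shell
# Check whether the repository host resolves at all.
host=repo.maven.apache.org
if getent hosts "$host" > /dev/null 2>&1; then
  echo "$host resolves -- check firewall or proxy next"
else
  echo "$host does not resolve -- fix DNS or declare a proxy in ~/.m2/settings.xml"
fi

# If the machine sits behind an HTTP proxy, Maven needs it declared in
# ~/.m2/settings.xml; a minimal <proxy> entry looks like this
# (proxy.example.com:3128 is a placeholder):
proxy_xml='<proxy><active>true</active><protocol>http</protocol><host>proxy.example.com</host><port>3128</port></proxy>'
echo "$proxy_xml"
```

If the host resolves but the build still cannot download, an offline build against an already-populated local repository (`mvn -o`) can confirm the problem is purely network-side.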


[no subject]

2014-03-05 Thread Avinash Kujur
When I use this command:
mvn clean install -DskipTests -Pdist

it gives this error:

[cloudera@localhost ~]$ mvn clean install -DskipTests -Pdist
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.170 s
[INFO] Finished at: 2014-03-05T02:25:52-08:00
[INFO] Final Memory: 2M/43M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "dist" could not be activated because it
does not exist.
[ERROR] The goal you specified requires a project to execute but there is
no POM in this directory (/home/cloudera). Please verify you invoked Maven
from the correct directory. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException



help me out.
Thanks in advance. :)
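The error above ("there is no POM in this directory (/home/cloudera)") simply means `mvn` was run outside the checked-out source tree. A small guard sketch; the source directory name is the one used later in this thread and is otherwise an assumption:

```shell
# Return success only if the given directory contains a pom.xml.
has_pom() { [ -f "$1/pom.xml" ]; }

SRC_DIR="$HOME/hadoop-common-trunk"
if has_pom "$SRC_DIR"; then
  echo "pom.xml found -- run the build from $SRC_DIR"
else
  echo "no pom.xml in $SRC_DIR -- cd into the checked-out source tree first"
fi
```

The related `[WARNING] The requested profile "dist" could not be activated` disappears for the same reason: profiles are declared in the project's POM, so Maven cannot see them until it is run from a directory that has one.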


Re:

2014-03-05 Thread Avinash Kujur
After downloading 150 MB it gave this error:
error: RPC failed; result=18, HTTP code = 200
47 MiB | 24 KiB/s
I did not understand what it means.


On Wed, Mar 5, 2014 at 12:30 AM, Nitin Pawar wrote:

> try this git clone https://github.com/apache/hadoop-common.git hadoop
>
>
> On Wed, Mar 5, 2014 at 1:58 PM, Avinash Kujur  wrote:
>
>> i am getting this error while cloning the hadoop trunk code from
>> git.apache.org using terminal.
>> error is:
>> [cloudera@localhost ~]$ git clone git://git.apache.org/hadoop-common.git hadoop
>> Initialized empty Git repository in /home/cloudera/hadoop/.git/
>> fatal: Unable to look up git.apache.org (port 9418) (Name or service not
>> known)
>>
>>
>> help me out.
>>
>> thanks in advance.
>>
>
>
>
> --
> Nitin Pawar
>
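On the "RPC failed; result=18" error above: result=18 is curl's "transfer closed with outstanding read data", i.e. the HTTP connection dropped partway through the clone. Two commonly suggested git-side workarounds, sketched here as commands to try (general git remedies, not project-specific guidance):

```shell
# A larger HTTP post buffer (~500 MB) helps some flaky connections,
# and a shallow clone transfers far less history in one go.
POST_BUFFER=524288000
CLONE_URL="https://github.com/apache/hadoop-common.git"

echo "git config --global http.postBuffer $POST_BUFFER"
echo "git clone --depth 1 $CLONE_URL hadoop"
# Once the shallow clone succeeds, the full history can be fetched:
echo "cd hadoop && git fetch --unshallow"
```

Re-running the plain `git clone` is also worth a try first; the failure is often transient.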


[no subject]

2014-03-05 Thread Avinash Kujur
I am getting this error while cloning the Hadoop trunk code from
git.apache.org using the terminal. The error is:
[cloudera@localhost ~]$ git clone git://git.apache.org/hadoop-common.git hadoop
Initialized empty Git repository in /home/cloudera/hadoop/.git/
fatal: Unable to look up git.apache.org (port 9418) (Name or service not
known)


help me out.

thanks in advance.
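Two things are going on in the failing command above: the target directory was fused onto the URL (`hadoop-common.githadoop` should be `hadoop-common.git hadoop`), and the `git://` protocol uses TCP port 9418, which many networks block; "Unable to look up git.apache.org" can also be a plain DNS failure. Rewriting the URL to HTTPS (port 443) sidesteps the port issue. A sketch, using the GitHub mirror URL suggested elsewhere in this thread:

```shell
# git:// needs outbound TCP 9418; https:// only needs 443.
GIT_URL="git://git.apache.org/hadoop-common.git"
HTTPS_URL="https://github.com/apache/hadoop-common.git"

echo "often blocked: git clone $GIT_URL hadoop"
echo "fallback:      git clone $HTTPS_URL hadoop"
```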


[no subject]

2014-02-27 Thread Avinash Kujur
I am new to Hadoop. Which issues should I start working on? I need some
proper guidance, and it would help if someone shared his/her experience with
me. I would also like to read through the code that fixed some existing
issue. Please help me.


[no subject]

2014-02-26 Thread Avinash Kujur
Hi, can i solve the hadoop issues in https://koding.com/. ?