Re: Unable to build pig from Trunk

2011-12-29 Thread praveenesh kumar
Yeah.. it was an Ant issue. Upgrading Ant to 1.7 fixed it.

Thanks,
Praveenesh
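
For reference, a minimal sketch of such an upgrade on a Linux box (the
archive URL, the 1.7.1 version, and the /opt prefix are assumptions, not
details from this thread):

  ant -version    # reports 1.6.5 on the failing machine
  wget http://archive.apache.org/dist/ant/binaries/apache-ant-1.7.1-bin.tar.gz
  tar xzf apache-ant-1.7.1-bin.tar.gz -C /opt
  export ANT_HOME=/opt/apache-ant-1.7.1
  export PATH=$ANT_HOME/bin:$PATH
  ant -version    # should now report 1.7.1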

On Fri, Dec 30, 2011 at 12:13 AM, Joey Krabacher wrote:

> Looks like you may have an unsupported/older version of Ant.
> Try upgrading Ant to 1.7 or newer.
>
> --Joey
>
> On Thu, Dec 29, 2011 at 11:54 PM, praveenesh kumar 
> wrote:
> > I set up the proxy. Now I am getting the following error:
> >
> > root@lxe9700 [/usr/local/hadoop/pig/new/trunk] $ --> ant
> jar-withouthadoop
> > -verbose
> > Apache Ant version 1.6.5 compiled on June 5 2007
> > Buildfile: build.xml
> > Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
> > Detected OS: Linux
> > parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
> > file:///usr/local/hadoop/pig/new/trunk/build.xml
> > Project base dir set to: /usr/local/hadoop/pig/new/trunk
> >  [property] Loading /root/build.properties
> >  [property] Unable to find property file: /root/build.properties
> >  [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
> >  [property] Unable to find property file:
> > /usr/local/hadoop/pig/new/trunk/build.properties
> > Override ignored for property test.log.dir
> > Property ${clover.home} has not been set
> > [available] Unable to find ${clover.home}/lib/clover.jar to set property
> > clover.present
> > Property ${repo} has not been set
> > Override ignored for property build.dir
> > Override ignored for property dist.dir
> > Property ${zookeeper.jarfile} has not been set
> > Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
> > ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
> > ivy-compile, init, cc-compile, prepare, genLexer, genParser,
> genTreeParser,
> > gen, compile, jar-withouthadoop]
> > Complete build sequence is [ivy-download, ivy-init-dirs,
> ivy-probe-antlib,
> > ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
> > prepare, genLexer, genParser, genTreeParser, gen, compile,
> > jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
> > ivy-jdiff, write-null, api-xml, api-report, jar, package, tar,
> source-jar,
> > patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
> > compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
> > clover.setup, jarWithSvn, piggybank, test-e2e-local,
> assert-pig-jar-exists,
> > ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
> > compile-sources, clover.info, clover, clean-sign, sign,
> > test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
> > test-commit, test-smoke, copypom, maven-artifacts, published,
> set-version,
> > test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
> > clover.check, check-for-findbugs, test-core, ivy-buildJar,
> > checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
> > test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
> > simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
> > buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
> > hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
> > include-meta, deb, eclipse-files, generate-clover-reports, ]
> >
> > ivy-download:
> >  [get] Getting:
> > http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
> >  [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
> >
> > ivy-init-dirs:
> >
> > ivy-probe-antlib:
> >
> > BUILD FAILED
> > /usr/local/hadoop/pig/new/trunk/build.xml:1437: Class
> > org.apache.tools.ant.taskdefs.ConditionTask doesn't support the nested
> > "typefound" element.
> >at
> >
> org.apache.tools.ant.IntrospectionHelper.throwNotSupported(IntrospectionHelper.java:463)
> >at
> >
> org.apache.tools.ant.IntrospectionHelper.getNestedCreator(IntrospectionHelper.java:517)
> >at
> >
> org.apache.tools.ant.IntrospectionHelper.getElementCreator(IntrospectionHelper.java:583)
> >at
> > org.apache.tools.ant.UnknownElement.handleChild(UnknownElement.java:546)
> >at
> >
> org.apache.tools.ant.UnknownElement.handleChildren(UnknownElement.java:326)
> >at
> > org.apache.tools.ant.UnknownElement.configure(UnknownElement.java:182)
> >at
> >
> org.apache.tools.ant.UnknownElement.maybeConfigure(UnknownElement.java:158)
> >at org.apache.tools.ant.Task.perform(Task.java:363)
> >at org.apache.tools.ant.Target.execute(Target.java:341)
> >at org.apache.tools.ant.Target.performTasks(Target.java:369)
> >at
> > org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
> >at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
> >at
> >
> org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
> >at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
> >at org.apache.tools.ant.Main.runBuild(Main.java:668)
> >at org.apache.tools.ant.Main.startAnt(Main.java:187)
> >  

Re: Unable to build pig from Trunk

2011-12-29 Thread Joey Krabacher
Looks like you may have an unsupported/older version of Ant.
Try upgrading Ant to 1.7 or newer.

--Joey

On Thu, Dec 29, 2011 at 11:54 PM, praveenesh kumar  wrote:
> I set up the proxy. Now I am getting the following error:
>
> root@lxe9700 [/usr/local/hadoop/pig/new/trunk] $ --> ant jar-withouthadoop
> -verbose
> Apache Ant version 1.6.5 compiled on June 5 2007
> Buildfile: build.xml
> Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
> Detected OS: Linux
> parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
> file:///usr/local/hadoop/pig/new/trunk/build.xml
> Project base dir set to: /usr/local/hadoop/pig/new/trunk
>  [property] Loading /root/build.properties
>  [property] Unable to find property file: /root/build.properties
>  [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
>  [property] Unable to find property file:
> /usr/local/hadoop/pig/new/trunk/build.properties
> Override ignored for property test.log.dir
> Property ${clover.home} has not been set
> [available] Unable to find ${clover.home}/lib/clover.jar to set property
> clover.present
> Property ${repo} has not been set
> Override ignored for property build.dir
> Override ignored for property dist.dir
> Property ${zookeeper.jarfile} has not been set
> Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
> ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
> ivy-compile, init, cc-compile, prepare, genLexer, genParser, genTreeParser,
> gen, compile, jar-withouthadoop]
> Complete build sequence is [ivy-download, ivy-init-dirs, ivy-probe-antlib,
> ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
> prepare, genLexer, genParser, genTreeParser, gen, compile,
> jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
> ivy-jdiff, write-null, api-xml, api-report, jar, package, tar, source-jar,
> patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
> compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
> clover.setup, jarWithSvn, piggybank, test-e2e-local, assert-pig-jar-exists,
> ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
> compile-sources, clover.info, clover, clean-sign, sign,
> test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
> test-commit, test-smoke, copypom, maven-artifacts, published, set-version,
> test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
> clover.check, check-for-findbugs, test-core, ivy-buildJar,
> checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
> test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
> simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
> buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
> hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
> include-meta, deb, eclipse-files, generate-clover-reports, ]
>
> ivy-download:
>      [get] Getting:
> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
>      [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
>
> ivy-init-dirs:
>
> ivy-probe-antlib:
>
> BUILD FAILED
> /usr/local/hadoop/pig/new/trunk/build.xml:1437: Class
> org.apache.tools.ant.taskdefs.ConditionTask doesn't support the nested
> "typefound" element.
>        at
> org.apache.tools.ant.IntrospectionHelper.throwNotSupported(IntrospectionHelper.java:463)
>        at
> org.apache.tools.ant.IntrospectionHelper.getNestedCreator(IntrospectionHelper.java:517)
>        at
> org.apache.tools.ant.IntrospectionHelper.getElementCreator(IntrospectionHelper.java:583)
>        at
> org.apache.tools.ant.UnknownElement.handleChild(UnknownElement.java:546)
>        at
> org.apache.tools.ant.UnknownElement.handleChildren(UnknownElement.java:326)
>        at
> org.apache.tools.ant.UnknownElement.configure(UnknownElement.java:182)
>        at
> org.apache.tools.ant.UnknownElement.maybeConfigure(UnknownElement.java:158)
>        at org.apache.tools.ant.Task.perform(Task.java:363)
>        at org.apache.tools.ant.Target.execute(Target.java:341)
>        at org.apache.tools.ant.Target.performTasks(Target.java:369)
>        at
> org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
>        at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
>        at
> org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
>        at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
>        at org.apache.tools.ant.Main.runBuild(Main.java:668)
>        at org.apache.tools.ant.Main.startAnt(Main.java:187)
>        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:246)
>        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:67)
>
> Total time: 0 seconds
>
>
>
> On Fri, Dec 30, 2011 at 11:11 AM, praveenesh kumar 
> wrote:
>
>> When I ping, it says "Unknown host"...
>> Is there any kind of proxy setting we need to do when building with Ant?

Re: Unable to build pig from Trunk

2011-12-29 Thread praveenesh kumar
I set up the proxy. Now I am getting the following error:

root@lxe9700 [/usr/local/hadoop/pig/new/trunk] $ --> ant jar-withouthadoop
-verbose
Apache Ant version 1.6.5 compiled on June 5 2007
Buildfile: build.xml
Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
Detected OS: Linux
parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
file:///usr/local/hadoop/pig/new/trunk/build.xml
Project base dir set to: /usr/local/hadoop/pig/new/trunk
 [property] Loading /root/build.properties
 [property] Unable to find property file: /root/build.properties
 [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
 [property] Unable to find property file:
/usr/local/hadoop/pig/new/trunk/build.properties
Override ignored for property test.log.dir
Property ${clover.home} has not been set
[available] Unable to find ${clover.home}/lib/clover.jar to set property
clover.present
Property ${repo} has not been set
Override ignored for property build.dir
Override ignored for property dist.dir
Property ${zookeeper.jarfile} has not been set
Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
ivy-compile, init, cc-compile, prepare, genLexer, genParser, genTreeParser,
gen, compile, jar-withouthadoop]
Complete build sequence is [ivy-download, ivy-init-dirs, ivy-probe-antlib,
ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
prepare, genLexer, genParser, genTreeParser, gen, compile,
jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
ivy-jdiff, write-null, api-xml, api-report, jar, package, tar, source-jar,
patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
clover.setup, jarWithSvn, piggybank, test-e2e-local, assert-pig-jar-exists,
ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
compile-sources, clover.info, clover, clean-sign, sign,
test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
test-commit, test-smoke, copypom, maven-artifacts, published, set-version,
test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
clover.check, check-for-findbugs, test-core, ivy-buildJar,
checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
include-meta, deb, eclipse-files, generate-clover-reports, ]

ivy-download:
  [get] Getting:
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
  [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar

ivy-init-dirs:

ivy-probe-antlib:

BUILD FAILED
/usr/local/hadoop/pig/new/trunk/build.xml:1437: Class
org.apache.tools.ant.taskdefs.ConditionTask doesn't support the nested
"typefound" element.
at
org.apache.tools.ant.IntrospectionHelper.throwNotSupported(IntrospectionHelper.java:463)
at
org.apache.tools.ant.IntrospectionHelper.getNestedCreator(IntrospectionHelper.java:517)
at
org.apache.tools.ant.IntrospectionHelper.getElementCreator(IntrospectionHelper.java:583)
at
org.apache.tools.ant.UnknownElement.handleChild(UnknownElement.java:546)
at
org.apache.tools.ant.UnknownElement.handleChildren(UnknownElement.java:326)
at
org.apache.tools.ant.UnknownElement.configure(UnknownElement.java:182)
at
org.apache.tools.ant.UnknownElement.maybeConfigure(UnknownElement.java:158)
at org.apache.tools.ant.Task.perform(Task.java:363)
at org.apache.tools.ant.Target.execute(Target.java:341)
at org.apache.tools.ant.Target.performTasks(Target.java:369)
at
org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
at
org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
at org.apache.tools.ant.Main.runBuild(Main.java:668)
at org.apache.tools.ant.Main.startAnt(Main.java:187)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:246)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:67)

Total time: 0 seconds



On Fri, Dec 30, 2011 at 11:11 AM, praveenesh kumar wrote:

> When I ping, it says "Unknown host"...
> Is there any kind of proxy setting we need to do when building with Ant?
>
> Thanks,
> Praveenesh
>
>
>
> On Fri, Dec 30, 2011 at 11:02 AM, Joey Krabacher wrote:
>
>> Try pinging
>> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
>> to see if your server can connect to that URL.
>> If not, you have some kind of connection issue with outgoing requests.
>>
>> --Joey
>>
>> On 

Re: Unable to build pig from Trunk

2011-12-29 Thread Joey Krabacher
Can you ping any URL successfully?
Try www.google.com, www.yahoo.com or something like that.

If you can't ping any of those then you are probably behind a firewall
and you'll have to poke a hole into it to get to the outside world.

Or you can download the jar that it is trying to find (ivy-2.2.0.jar)
from another computer and copy it to the one you are building on.
I would put it in this folder: /usr/local/hadoop/pig/new/trunk/ivy/

--Joey
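
A sketch of that manual workaround: fetch the jar on any machine with
outbound access, then copy it into the ivy directory of the build tree
(the build host name here is taken from the logs in this thread and may
differ):

  wget http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
  scp ivy-2.2.0.jar root@lxe9700:/usr/local/hadoop/pig/new/trunk/ivy/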

On Thu, Dec 29, 2011 at 11:41 PM, praveenesh kumar  wrote:
> When I ping, it says "Unknown host"...
> Is there any kind of proxy setting we need to do when building with Ant?
>
> Thanks,
> Praveenesh
>
>
> On Fri, Dec 30, 2011 at 11:02 AM, Joey Krabacher wrote:
>
>> Try pinging
>> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
>> to see if your server can connect to that URL.
>> If not, you have some kind of connection issue with outgoing requests.
>>
>> --Joey
>>
>> On Thu, Dec 29, 2011 at 11:28 PM, praveenesh kumar 
>> wrote:
>> > Hi everyone,
>> > I am trying to build Pig from SVN trunk on Hadoop 0.20.205.
>> > While doing that, I am getting the following error. Any idea why it's
>> > happening?
>> >
>> > Thanks,
>> > Praveenesh
>> >
>> >
>> > root@lxe [/usr/local/hadoop/pig/new/trunk] $ --> ant jar-withouthadoop
>> > -verbose
>> > Apache Ant version 1.6.5 compiled on June 5 2007
>> > Buildfile: build.xml
>> > Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
>> > Detected OS: Linux
>> > parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
>> > file:///usr/local/hadoop/pig/new/trunk/build.xml
>> > Project base dir set to: /usr/local/hadoop/pig/new/trunk
>> >  [property] Loading /root/build.properties
>> >  [property] Unable to find property file: /root/build.properties
>> >  [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
>> >  [property] Unable to find property file:
>> > /usr/local/hadoop/pig/new/trunk/build.properties
>> > Override ignored for property test.log.dir
>> > Property ${clover.home} has not been set
>> > [available] Unable to find ${clover.home}/lib/clover.jar to set property
>> > clover.present
>> > Property ${repo} has not been set
>> > Override ignored for property build.dir
>> > Override ignored for property dist.dir
>> > Property ${zookeeper.jarfile} has not been set
>> > Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
>> > ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
>> > ivy-compile, init, cc-compile, prepare, genLexer, genParser,
>> genTreeParser,
>> > gen, compile, jar-withouthadoop]
>> > Complete build sequence is [ivy-download, ivy-init-dirs,
>> ivy-probe-antlib,
>> > ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
>> > prepare, genLexer, genParser, genTreeParser, gen, compile,
>> > jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
>> > ivy-jdiff, write-null, api-xml, api-report, jar, package, tar,
>> source-jar,
>> > patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
>> > compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
>> > clover.setup, jarWithSvn, piggybank, test-e2e-local,
>> assert-pig-jar-exists,
>> > ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
>> > compile-sources, clover.info, clover, clean-sign, sign,
>> > test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
>> > test-commit, test-smoke, copypom, maven-artifacts, published,
>> set-version,
>> > test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
>> > clover.check, check-for-findbugs, test-core, ivy-buildJar,
>> > checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
>> > test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
>> > simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
>> > buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
>> > hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
>> > include-meta, deb, eclipse-files, generate-clover-reports, ]
>> >
>> > ivy-download:
>> >      [get] Getting:
>> > http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
>> >      [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
>> >      [get] Error getting
>> > http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar to
>> > /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
>> >
>> > BUILD FAILED
>> > /usr/local/hadoop/pig/new/trunk/build.xml:1443:
>> java.net.ConnectException:
>> > Connection timed out
>> >        at org.apache.tools.ant.taskdefs.Get.execute(Get.java:80)
>> >        at
>> > org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:275)
>> >        at org.apache.tools.ant.Task.perform(Task.java:364)
>> >        at org.apache.tools.ant.Target.execute(Target.java:341)
>> >        at org.apache.tools.ant.Target.performTasks(Target.java:369)
>> >        at
>> > org.apache.tools.ant.Project.

Re: Unable to build pig from Trunk

2011-12-29 Thread praveenesh kumar
When I ping, it says "Unknown host"...
Is there any kind of proxy setting we need to do when building with Ant?

Thanks,
Praveenesh
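
Ant reads the JVM's standard proxy properties, so one minimal sketch of
such a setting (proxy.example.com:8080 is a placeholder for the real
proxy host and port) would be:

  export ANT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080"
  ant jar-withouthadoop -verbose   # Ant 1.7+ can also try the -autoproxy flag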


On Fri, Dec 30, 2011 at 11:02 AM, Joey Krabacher wrote:

> Try pinging
> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
> to see if your server can connect to that URL.
> If not, you have some kind of connection issue with outgoing requests.
>
> --Joey
>
> On Thu, Dec 29, 2011 at 11:28 PM, praveenesh kumar 
> wrote:
> > Hi everyone,
> > I am trying to build Pig from SVN trunk on Hadoop 0.20.205.
> > While doing that, I am getting the following error. Any idea why it's
> > happening?
> >
> > Thanks,
> > Praveenesh
> >
> >
> > root@lxe [/usr/local/hadoop/pig/new/trunk] $ --> ant jar-withouthadoop
> > -verbose
> > Apache Ant version 1.6.5 compiled on June 5 2007
> > Buildfile: build.xml
> > Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
> > Detected OS: Linux
> > parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
> > file:///usr/local/hadoop/pig/new/trunk/build.xml
> > Project base dir set to: /usr/local/hadoop/pig/new/trunk
> >  [property] Loading /root/build.properties
> >  [property] Unable to find property file: /root/build.properties
> >  [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
> >  [property] Unable to find property file:
> > /usr/local/hadoop/pig/new/trunk/build.properties
> > Override ignored for property test.log.dir
> > Property ${clover.home} has not been set
> > [available] Unable to find ${clover.home}/lib/clover.jar to set property
> > clover.present
> > Property ${repo} has not been set
> > Override ignored for property build.dir
> > Override ignored for property dist.dir
> > Property ${zookeeper.jarfile} has not been set
> > Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
> > ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
> > ivy-compile, init, cc-compile, prepare, genLexer, genParser,
> genTreeParser,
> > gen, compile, jar-withouthadoop]
> > Complete build sequence is [ivy-download, ivy-init-dirs,
> ivy-probe-antlib,
> > ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
> > prepare, genLexer, genParser, genTreeParser, gen, compile,
> > jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
> > ivy-jdiff, write-null, api-xml, api-report, jar, package, tar,
> source-jar,
> > patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
> > compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
> > clover.setup, jarWithSvn, piggybank, test-e2e-local,
> assert-pig-jar-exists,
> > ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
> > compile-sources, clover.info, clover, clean-sign, sign,
> > test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
> > test-commit, test-smoke, copypom, maven-artifacts, published,
> set-version,
> > test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
> > clover.check, check-for-findbugs, test-core, ivy-buildJar,
> > checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
> > test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
> > simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
> > buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
> > hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
> > include-meta, deb, eclipse-files, generate-clover-reports, ]
> >
> > ivy-download:
> >  [get] Getting:
> > http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
> >  [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
> >  [get] Error getting
> > http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar to
> > /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
> >
> > BUILD FAILED
> > /usr/local/hadoop/pig/new/trunk/build.xml:1443:
> java.net.ConnectException:
> > Connection timed out
> >at org.apache.tools.ant.taskdefs.Get.execute(Get.java:80)
> >at
> > org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:275)
> >at org.apache.tools.ant.Task.perform(Task.java:364)
> >at org.apache.tools.ant.Target.execute(Target.java:341)
> >at org.apache.tools.ant.Target.performTasks(Target.java:369)
> >at
> > org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
> >at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
> >at
> >
> org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
> >at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
> >at org.apache.tools.ant.Main.runBuild(Main.java:668)
> >at org.apache.tools.ant.Main.startAnt(Main.java:187)
> >at org.apache.tools.ant.launch.Launcher.run(Launcher.java:246)
> >at org.apache.tools.ant.launch.Launcher.main(Launcher.java:67)
> > Caused by: java.net.ConnectExceptio

Re: Hadoop MySQL database access

2011-12-29 Thread JAGANADH G
@Praveen
Thanks. I got it.


-- 
**
JAGANADH G
http://jaganadhg.in
*ILUGCBE*
http://ilugcbe.org.in


Re: Hadoop MySQL database access

2011-12-29 Thread Praveen Sripati
Check the `mapreduce.job.reduce.slowstart.completedmaps` parameter. The
reducers cannot start processing the data from the mappers until all the
map tasks are complete, but the reducers can start fetching the data
from the nodes on which the map tasks have completed.

Praveen
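
As a hedged sketch of tuning it per job, assuming the job is driven
through ToolRunner so the generic -D options are honored (the jar and
class names are hypothetical; on 0.20.x clusters the older key
mapred.reduce.slowstart.completed.maps applies instead):

  # let reducers launch once half the maps are done (the default is 0.05)
  hadoop jar wordcount.jar WordCount \
    -D mapreduce.job.reduce.slowstart.completedmaps=0.5 \
    input output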

On Thu, Dec 29, 2011 at 12:44 AM, Prashant Kommireddi
wrote:

> By design, reduce would start only after all the maps finish. There is
> no way for the reduce to begin grouping/merging by key unless all the
> maps have finished.
>
> Sent from my iPhone
>
> On Dec 28, 2011, at 8:53 AM, JAGANADH G  wrote:
>
> > Hi All,
> >
> > I wrote a map reduce program to fetch data from MySQL and process the
> > data (word count).
> > The program executes successfully, but I noticed that the reduce task
> > starts only after the map tasks finish.
> > Is there any way to run the map and reduce in parallel?
> >
> > The program fetches data from MySQL and writes the processed output to
> > HDFS.
> > I am using Hadoop in pseudo-distributed mode.
> > --
> > **
> > JAGANADH G
> > http://jaganadhg.in
> > *ILUGCBE*
> > http://ilugcbe.org.in
>


Re: Unable to build pig from Trunk

2011-12-29 Thread Joey Krabacher
Try pinging http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
to see if your server can connect to that URL.
If not, you have some kind of connection issue with outgoing requests.

--Joey
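
Note that ping takes a host name rather than a URL, and ICMP may be
blocked even where HTTP is open; a sketch that also exercises the actual
HTTP fetch (assuming wget is installed):

  ping -c 3 repo2.maven.org
  wget -S --spider \
    http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar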

On Thu, Dec 29, 2011 at 11:28 PM, praveenesh kumar  wrote:
> Hi everyone,
> I am trying to build Pig from SVN trunk on Hadoop 0.20.205.
> While doing that, I am getting the following error. Any idea why it's
> happening?
>
> Thanks,
> Praveenesh
>
>
> root@lxe [/usr/local/hadoop/pig/new/trunk] $ --> ant jar-withouthadoop
> -verbose
> Apache Ant version 1.6.5 compiled on June 5 2007
> Buildfile: build.xml
> Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
> Detected OS: Linux
> parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
> file:///usr/local/hadoop/pig/new/trunk/build.xml
> Project base dir set to: /usr/local/hadoop/pig/new/trunk
>  [property] Loading /root/build.properties
>  [property] Unable to find property file: /root/build.properties
>  [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
>  [property] Unable to find property file:
> /usr/local/hadoop/pig/new/trunk/build.properties
> Override ignored for property test.log.dir
> Property ${clover.home} has not been set
> [available] Unable to find ${clover.home}/lib/clover.jar to set property
> clover.present
> Property ${repo} has not been set
> Override ignored for property build.dir
> Override ignored for property dist.dir
> Property ${zookeeper.jarfile} has not been set
> Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
> ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
> ivy-compile, init, cc-compile, prepare, genLexer, genParser, genTreeParser,
> gen, compile, jar-withouthadoop]
> Complete build sequence is [ivy-download, ivy-init-dirs, ivy-probe-antlib,
> ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
> prepare, genLexer, genParser, genTreeParser, gen, compile,
> jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
> ivy-jdiff, write-null, api-xml, api-report, jar, package, tar, source-jar,
> patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
> compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
> clover.setup, jarWithSvn, piggybank, test-e2e-local, assert-pig-jar-exists,
> ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
> compile-sources, clover.info, clover, clean-sign, sign,
> test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
> test-commit, test-smoke, copypom, maven-artifacts, published, set-version,
> test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
> clover.check, check-for-findbugs, test-core, ivy-buildJar,
> checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
> test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
> simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
> buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
> hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
> include-meta, deb, eclipse-files, generate-clover-reports, ]
>
> ivy-download:
>      [get] Getting:
> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
>      [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
>      [get] Error getting
> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar to
> /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
>
> BUILD FAILED
> /usr/local/hadoop/pig/new/trunk/build.xml:1443: java.net.ConnectException:
> Connection timed out
>        at org.apache.tools.ant.taskdefs.Get.execute(Get.java:80)
>        at
> org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:275)
>        at org.apache.tools.ant.Task.perform(Task.java:364)
>        at org.apache.tools.ant.Target.execute(Target.java:341)
>        at org.apache.tools.ant.Target.performTasks(Target.java:369)
>        at
> org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
>        at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
>        at
> org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
>        at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
>        at org.apache.tools.ant.Main.runBuild(Main.java:668)
>        at org.apache.tools.ant.Main.startAnt(Main.java:187)
>        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:246)
>        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:67)
> Caused by: java.net.ConnectException: Connection timed out
>        at java.net.PlainSocketImpl.socketConnect(Native Method)
>        at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
>        at
> java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
>        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
>        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
>        at 

Unable to build pig from Trunk

2011-12-29 Thread praveenesh kumar
Hi everyone,
I am trying to build Pig from SVN trunk on Hadoop 0.20.205.
While doing that, I am getting the following error. Any idea why it's
happening?

Thanks,
Praveenesh


root@lxe [/usr/local/hadoop/pig/new/trunk] $ --> ant jar-withouthadoop
-verbose
Apache Ant version 1.6.5 compiled on June 5 2007
Buildfile: build.xml
Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
Detected OS: Linux
parsing buildfile /usr/local/hadoop/pig/new/trunk/build.xml with URI =
file:///usr/local/hadoop/pig/new/trunk/build.xml
Project base dir set to: /usr/local/hadoop/pig/new/trunk
 [property] Loading /root/build.properties
 [property] Unable to find property file: /root/build.properties
 [property] Loading /usr/local/hadoop/pig/new/trunk/build.properties
 [property] Unable to find property file:
/usr/local/hadoop/pig/new/trunk/build.properties
Override ignored for property test.log.dir
Property ${clover.home} has not been set
[available] Unable to find ${clover.home}/lib/clover.jar to set property
clover.present
Property ${repo} has not been set
Override ignored for property build.dir
Override ignored for property dist.dir
Property ${zookeeper.jarfile} has not been set
Build sequence for target(s) `jar-withouthadoop' is [ivy-download,
ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve,
ivy-compile, init, cc-compile, prepare, genLexer, genParser, genTreeParser,
gen, compile, jar-withouthadoop]
Complete build sequence is [ivy-download, ivy-init-dirs, ivy-probe-antlib,
ivy-init-antlib, ivy-init, ivy-resolve, ivy-compile, init, cc-compile,
prepare, genLexer, genParser, genTreeParser, gen, compile,
jar-withouthadoop, forrest.check, ivy-javadoc, javadoc-all, docs,
ivy-jdiff, write-null, api-xml, api-report, jar, package, tar, source-jar,
patch.check, makepom, ivy-releaseaudit, releaseaudit, ivy-test,
compile-test, pigunit-jar, javadoc, javadoc-jar, package-release,
clover.setup, jarWithSvn, piggybank, test-e2e-local, assert-pig-jar-exists,
ready-to-publish, copy-jar-to-maven, jar-withouthadoopWithOutSvn,
compile-sources, clover.info, clover, clean-sign, sign,
test-e2e-deploy-local, ivy-publish-local, ant-task-download, mvn-taskdef,
test-commit, test-smoke, copypom, maven-artifacts, published, set-version,
test-unit, test-e2e, test-contrib, test, jar-withouthadoopWithSvn,
clover.check, check-for-findbugs, test-core, ivy-buildJar,
checkstyle.check, tar-release, rpm, clean, smoketests-jar, mvn-install,
test-e2e-undeploy, ivy-checkstyle, jar-all, test-pigunit, signanddeploy,
simpledeploy, mvn-deploy, findbugs, buildJar-withouthadoop, checkstyle,
buildJar, findbugs.check, test-patch, jarWithOutSvn, test-e2e-deploy,
hudson-test-patch, compile-sources-all-warnings, test-contrib-internal,
include-meta, deb, eclipse-files, generate-clover-reports, ]

ivy-download:
  [get] Getting:
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
  [get] To: /usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar
  [get] Error getting
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar to
/usr/local/hadoop/pig/new/trunk/ivy/ivy-2.2.0.jar

BUILD FAILED
/usr/local/hadoop/pig/new/trunk/build.xml:1443: java.net.ConnectException:
Connection timed out
at org.apache.tools.ant.taskdefs.Get.execute(Get.java:80)
at
org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:275)
at org.apache.tools.ant.Task.perform(Task.java:364)
at org.apache.tools.ant.Target.execute(Target.java:341)
at org.apache.tools.ant.Target.performTasks(Target.java:369)
at
org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
at
org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
at org.apache.tools.ant.Main.runBuild(Main.java:668)
at org.apache.tools.ant.Main.startAnt(Main.java:187)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:246)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:67)
Caused by: java.net.ConnectException: Connection timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
at
java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:529)
at java.net.Socket.connect(Socket.java:478)
at sun.net.NetworkClient.doConnect(NetworkClient.java:163)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:394)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:529)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:233)
at sun.net.www.http.HttpClient.New(HttpClient.java

Re: Kudos to Harsh J for closing out stale JIRA tickets

2011-12-29 Thread Harsh J
We don't want newcomers to JIRA to shy away at the colossal numbers and
give expressions such as this:
http://www.youtube.com/watch?v=SiMHTK15Pik :-)

Sorry about all the email noise, though; I can't find a way to avoid
it. There's still a lot left in each. But I think this is good
learning for anyone who's been around only since the 0.20s.

On Fri, Dec 30, 2011 at 1:38 AM, Brian Bockelman  wrote:
> Let's face it, trolling through old JIRA tickets is neither sexy nor fun.
> Kudos to Harsh for spending some time in the last two days doing exactly
> that!
>
> Thanks Harsh!
>
> Brian

Thanks for the support!

-- 
Harsh J


Re: Another newbie - problem with grep example

2011-12-29 Thread patf


I got Hadoop 0.22.0 running with Windows.  The most useful instructions
I found were these:


http://knowlspace.wordpress.com/2011/06/21/setting-up-hadoop-on-windows/

I was able to run examples grep, pi and WordCount.

Saw that 1.0 just came out.  Downloaded it, installed it, and tried it
out.  I don't even get as far as the first example; the tasktracker won't run:


2011-12-29 14:41:27,798 ERROR org.apache.hadoop.mapred.TaskTracker: 
Can not start task tracker because java.io.IOException: Failed to set 
permissions of path: \tmp\hadoop-cyg_server\mapred\local\ttprivate to 0700


The above may be this bug:  
https://issues.apache.org/jira/browse/HADOOP-7682


I can see that it has something to do with permissions between my user
id and the account cyg_server, which seems to be part of the standard way
to get sshd running on Windows.


The most confusing thing is this: if that bug is the cause, why did I
get as far as I did with 0.22.0?


Went back to 0.22.0 and it still runs fine.

People claim that computers are deterministic machines - which is not 
true.  We get them to work well enough for a great many applications - 
some even life-critical.  But in fact computers are complex systems 
where minute problems can cascade into complete failures of the system.


Pat

On 12/24/2011 3:56 PM, patf wrote:


Thanks Prashant.

That was yesterday on Linux at work.  It's Saturday; I'm at home,
trying out Hadoop on Windows 7.  The Hadoop Windows install was
something of a pain in the ass, but I've got Hadoop basically working
(though not computing yet).


Neither the grep nor the pi example works.  Here's the error I see in
my Windows installation (Windows 7 with a new cygwin install).


c:/PROGRA~2/Java/jdk1.6.0_25/bin/java -Xmx1000m 
-Dhadoop.log.dir=D:\downloads\hadoop\logs 
-Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=D:\downloads\hadoop\ 
-Dhadoop.id.str= -Dhadoop.root.logger=INFO,console 
-Dhadoop.security.logger=INFO,console 
-Djava.library.path=/cygdrive/d/downloads/hadoop/lib/native/Windows_7-x86-32 
-Dhadoop.policy.file=hadoop-policy.xml 
-Djava.net.preferIPv4Stack=true org.apache.hadoop.util.RunJar 
hadoop-mapred-examples-0.22.0.jar grep input output dfs[a-z.]+
11/12/24 15:03:27 WARN conf.Configuration: 
mapred.used.genericoptionsparser is deprecated. Instead, use 
mapreduce.client.genericoptionsparser.used
11/12/24 15:03:27 WARN mapreduce.JobSubmitter: No job jar file set.  
User classes may not be found. See Job or Job#setJar(String).
11/12/24 15:03:27 INFO input.FileInputFormat: Total input paths to 
process : 19

11/12/24 15:03:28 INFO mapreduce.JobSubmitter: number of splits:19
11/12/24 15:03:29 INFO mapreduce.Job: Running job: job_201112241459_0003
11/12/24 15:03:30 INFO mapreduce.Job:  map 0% reduce 0%
11/12/24 15:03:33 INFO mapreduce.Job: Task Id : 
attempt_201112241459_0003_m_20_0, Status : FAILED

Error initializing attempt_201112241459_0003_m_20_0:
org.apache.hadoop.security.AccessControlException: Permission denied: 
user=cyg_server, access=EXECUTE, 
inode="system":MyId:supergroup:rwx--
at 
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
at 
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:161)
at 
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4465)
at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:4442)
at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:2117)
at 
org.apache.hadoop.hdfs.server.namenode.NameNode.getFileInfo(NameNode.java:1022)

at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMeth
11/12/24 15:03:33 WARN mapreduce.Job: Error reading task output
http://MyId-PC:50060/tasklog?plaintext=true&attemptid=attempt_201112241459_0003_m_20_0&filter=stdout

User cyg_server?  That's because I'm coming in under cygwin ssh.
Looks like I need to either change the privs in the Hadoop file system
or extend more privs to cyg_server.  Although there may be an error
(WARN) before I get to the cyg_server permissions crash?


Pat
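
One hedged workaround for the EXECUTE-denied error above is to loosen the
permissions on the mapred system directory in HDFS; the /system path below
is only inferred from the inode named in the trace and may differ on your
cluster:

  # grant other users read/execute on the mapred system dir (run as MyId)
  hadoop fs -chmod -R 755 /system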


On 12/23/2011 5:21 PM, Prashant Kommireddi wrote:

Seems like you do not have "/user/MyId/input/conf" on HDFS.

Try this.

cd $HADOOP_HOME_DIR (this should be your hadoop root dir)
hadoop fs -put conf input/conf

And then run the MR job again.

-Prashant Kommireddi

On Fri, Dec 23, 2011 at 3:40 PM, Pat Flaherty  wrote:


Hi,

Installed 0.22.0 on CentOS 5.7.  I can start dfs and mapred and see their
processes.

Ran the first grep example: bin/hadoop jar hadoop-*-examples.jar grep
input output 'dfs[a-z.]+'.  It seems the correct jar name is
hadoop-mapred-examples-0.22.0.jar - there are no 

Re: Kudos to Harsh J for closing out stale JIRA tickets

2011-12-29 Thread Arun C Murthy
+1!

On Dec 29, 2011, at 12:08 PM, Brian Bockelman wrote:

> Let's face it, trolling through old JIRA tickets is neither sexy nor fun.
> Kudos to Harsh for spending some time in the last two days doing exactly
> that!
> 
> Thanks Harsh!
> 
> Brian



Kudos to Harsh J for closing out stale JIRA tickets

2011-12-29 Thread Brian Bockelman
Let's face it, trolling through old JIRA tickets is neither sexy nor fun.
Kudos to Harsh for spending some time in the last two days doing exactly
that!

Thanks Harsh!

Brian



Re: Some question about fault Injection

2011-12-29 Thread Konstantin Boudnik
I suggest starting with the fault injection tests. They can be found under
  src/test/aop/org/apache/hadoop
for HDFS in 0.22. HDFS has the best coverage by fault injection.

The tests exist in a similar location in trunk, but they aren't hooked up to
the Maven build system yet.

Cos
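
A sketch of driving those tests from the 0.22 ant build; the target names
below follow the fault injection framework guide, so verify them against
the build.xml in your checkout:

  ant injectfaults                 # weave the AspectJ fault hooks into the classes
  ant run-test-hdfs-fault-inject   # run the HDFS tests against the instrumented build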

On Thu, Dec 29, 2011 at 03:04AM, sangroya wrote:
> Hi,
> 
> Is there any good documentation to start with fault injection? Please
> share a link to any examples that demonstrate the use of fault
> injection.
> 
> 
> Thanks,
> Amit


Re: Map Task Capacity Not Changing

2011-12-29 Thread Joey Krabacher
To follow up on what I have found:

I opened up some of the logs on the datanodes and found this message:
"Can not start task tracker because java.net.BindException: Address
already in use"

It was using the default port setting from mapred-default.xml, which was 50060.
I decided to try adding

  <property>
    <name>mapred.task.tracker.http.address</name>
    <value>0.0.0.0:0</value>
  </property>

to mapred-site.xml so that the first open port would be selected.
This works, and it also allows the tasktracker to start normally,
which in turn allows the mapred.tasktracker.map.tasks.maximum setting
to take effect.
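
For what it's worth, a quick way to see what was holding the default port
before falling back to an ephemeral one (assuming net-tools is installed
on the datanode):

  netstat -tlnp | grep 50060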

Thanks for all who helped.

--Joey

On Sat, Dec 17, 2011 at 1:42 AM, Joey Krabacher  wrote:
> pid files are there. I checked for running processes with the same IDs
> and they all checked out.
> --Joey
> On Fri, Dec 16, 2011 at 5:40 PM, Rahul Jain  wrote:
>> You might be suffering from HADOOP-7822; I'd suggest you verify your pid
>> files and fix the problem by hand if it is the same issue.
>>
>> -Rahul
>>
>> On Fri, Dec 16, 2011 at 2:40 PM, Joey Krabacher wrote:
>>
>>> Turns out my tasktrackers (on the datanodes) are not starting properly
>>>
>>> so I guess they are taking some alternate route??
>>>
>>> because they are up and running...even though when I run
>>> stop-mapred.sh it says "data01: no tasktracker to stop"
>>>
>>> --Joey
>>>
>>> On Thu, Dec 15, 2011 at 5:37 PM, James Warren
>>>  wrote:
>>> > (moving to mapreduce-user@, bcc'ing common-user@)
>>> >
>>> > Hi Joey -
>>> >
>>> > You'll want to change the value on all of your servers running
>>> tasktrackers
>>> > and then restart each tasktracker to reread the configuration.
>>> >
>>> > cheers,
>>> > -James
>>> >
>>> > On Thu, Dec 15, 2011 at 3:30 PM, Joey Krabacher wrote:
>>> >
>>> >> I have looked up how to up this value on the web and have tried all
>>> >> suggestions to no avail.
>>> >>
>>> >> Any help would be great.
>>> >>
>>> >> Here is some background:
>>> >>
>>> >> Version: 0.20.2, r911707
>>> >> Compiled: Fri Feb 19 08:07:34 UTC 2010 by chrisdo
>>> >>
>>> >> Nodes: 5
>>> >> Current Map Task Capacity : 10  <--- this is what I want to increase.
>>> >>
>>> >> What I have tried :
>>> >>
>>> >> Adding
>>> >>   <property>
>>> >>     <name>mapred.tasktracker.map.tasks.maximum</name>
>>> >>     <value>8</value>
>>> >>     <final>true</final>
>>> >>   </property>
>>> >> to mapred-site.xml on the NameNode.  I also added this to one of the
>>> >> datanodes for the hell of it, and that didn't work either.
>>> >>
>>> >> Thanks.
>>> >>
>>>


Re: Multi user Hadoop 0.20.205 ?

2011-12-29 Thread Joey Echeverria
Why do you want multiple users starting daemons?

As for submitting jobs, that should work out of the box. If you want the child 
JVMs running the map and reduce tasks to execute as the submitting user, then 
you need to configure your cluster with Kerberos. 

The CDH3 security guide 
(https://ccp.cloudera.com/display/CDHDOC/CDH3+Security+Guide) has a good 
tutorial that should apply to the 0.20.20x and 1.0.0 releases as well. 

-Joey
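
For the non-Kerberos, trusted-cluster case, a minimal sketch of letting
another user submit jobs (the user and group names are placeholders):
give each user an HDFS home directory and a shared Unix group on the
nodes.

  groupadd hadoop                     # run as root on the cluster nodes
  usermod -a -G hadoop alice
  hadoop fs -mkdir /user/alice        # run as the HDFS superuser
  hadoop fs -chown alice:hadoop /user/alice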



On Dec 29, 2011, at 7:12, praveenesh kumar  wrote:

> yup.. exactly that... :-) And I also want multiple users to submit jobs.
> 
> Thanks,
> Praveenesh
> 
> On Thu, Dec 29, 2011 at 4:46 PM, Joey Echeverria  wrote:
> 
>> Hey Praveenesh,
>> 
>> What do you mean by multiuser? Do you want to support multiple users
>> starting/stopping daemons?
>> 
>> -Joey
>> 
>> 
>> 
>> On Dec 29, 2011, at 2:49, praveenesh kumar wrote:
>>> Guys,
>>> 
>>> Did someone try this thing?
>>> 
>>> Thanks
>>> 
>>> On Tue, Dec 27, 2011 at 4:36 PM, praveenesh kumar wrote:
>>> 
 Hey guys,
 
 How can we make Hadoop multi-user?
 
 One way I can think of: whatever group we currently assigned to use Hadoop,
 add users to the same group, and change permissions on hadoop.tmp.dir,
 mapred.system.dir, dfs.data.dir, and so on.
 I was playing with Hadoop 0.20.205 and observed that we can't change the
 permissions on the above-mentioned folders.
 If that's the case, what are the best practices for making Hadoop a
 multi-user entity, especially on Hadoop 0.20.205?
 
 Thanks,
 Praveenesh
 
>> 


Re: Multi user Hadoop 0.20.205 ?

2011-12-29 Thread praveenesh kumar
yup.. exactly that... :-) And I also want multiple users to submit jobs.

Thanks,
Praveenesh

On Thu, Dec 29, 2011 at 4:46 PM, Joey Echeverria  wrote:

> Hey Praveenesh,
>
> What do you mean by multiuser? Do you want to support multiple users
> starting/stopping daemons?
>
> -Joey
>
>
>
> On Dec 29, 2011, at 2:49, praveenesh kumar wrote:
> > Guys,
> >
> > Did someone try this thing?
> >
> > Thanks
> >
> > On Tue, Dec 27, 2011 at 4:36 PM, praveenesh kumar wrote:
> >
> >> Hey guys,
> >>
> >> How can we make Hadoop multi-user?
> >>
> >> One way I can think of: whatever group we currently assigned to use Hadoop,
> >> add users to the same group, and change permissions on hadoop.tmp.dir,
> >> mapred.system.dir, dfs.data.dir, and so on.
> >> I was playing with Hadoop 0.20.205 and observed that we can't change the
> >> permissions on the above-mentioned folders.
> >> If that's the case, what are the best practices for making Hadoop a
> >> multi-user entity, especially on Hadoop 0.20.205?
> >>
> >> Thanks,
> >> Praveenesh
> >>
>


Re: Multi user Hadoop 0.20.205 ?

2011-12-29 Thread Joey Echeverria
Hey Praveenesh,

What do you mean by multiuser? Do you want to support multiple users 
starting/stopping daemons?

-Joey



On Dec 29, 2011, at 2:49, praveenesh kumar  wrote:

> Guys,
> 
> Did someone try this thing?
> 
> Thanks
> 
> On Tue, Dec 27, 2011 at 4:36 PM, praveenesh kumar wrote:
> 
>> Hey guys,
>> 
>> How can we make Hadoop multi-user?
>> 
>> One way I can think of: whatever group we currently assigned to use Hadoop,
>> add users to the same group, and change permissions on hadoop.tmp.dir,
>> mapred.system.dir, dfs.data.dir, and so on.
>> I was playing with Hadoop 0.20.205 and observed that we can't change the
>> permissions on the above-mentioned folders.
>> If that's the case, what are the best practices for making Hadoop a
>> multi-user entity, especially on Hadoop 0.20.205?
>> 
>> Thanks,
>> Praveenesh
>> 


Re: Some question about fault Injection

2011-12-29 Thread sangroya
Hi,

Is there any good documentation to start with fault injection? Please
share a link to any examples that demonstrate the use of fault
injection.


Thanks,
Amit