Not sure what's so disappointing here; it was never officially announced that
Mahout 0.9 had Hadoop 2.x support.
From trunk, can you build Mahout for Hadoop 2 using this command:
mvn clean package -Dhadoop2.version=YOUR_HADOOP2_VERSION
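As a hedged sketch of that invocation: the property name is hadoop2.version, and 2.3.0 below is only an example taken from later in this thread; substitute your own Hadoop 2 release and run the command from the top of a trunk checkout.

```shell
# Example only: substitute your actual Hadoop 2 release for 2.3.0.
HADOOP2_VERSION="2.3.0"
BUILD_CMD="mvn clean package -Dhadoop2.version=${HADOOP2_VERSION}"
# Echoed here for illustration; run it from the Mahout trunk root.
echo "$BUILD_CMD"
```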
On Friday, March 7, 2014 12:12 PM, Mahmood Naderan
You have upper-case in your command but lower-case in your declaration in
the properties file; correct that and it should work.
Note:
org.apache.mahout.text.wikipedia.WikipediaXmlSplitter =
wikipediaXmlSplitter : wikipedia splitter
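The case mismatch is easy to reproduce outside Mahout. This toy sketch writes a scratch copy of the alias line (not the real props file) and shows that an exact-text lookup finds only the lower-case 'Xml' spelling declared in the file, not the upper-case 'XML' spelling typed on the command line:

```shell
# Scratch copy of the alias line from driver.classes.default.props.
workdir=$(mktemp -d)
props="$workdir/driver.classes.default.props"
echo 'org.apache.mahout.text.wikipedia.WikipediaXmlSplitter = wikipediaXmlSplitter : wikipedia splitter' > "$props"

# Upper-case XML (as typed on the command line): no match, count is 0.
grep -c ' wikipediaXMLSplitter ' "$props" || true
# Lower-case Xml (as declared in the props file): one match.
grep -c ' wikipediaXmlSplitter ' "$props"
```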
hadoop@solaris:~/mahout-distribution-0.9$ bin/mahout
Thanks Andrew, that seems to have been the issue all the while.
Nevertheless, it is better to run from Head if running on Hadoop 2.3.0
On Saturday, March 8, 2014 2:42 PM, Andrew Musselman
andrew.mussel...@gmail.com wrote:
You have upper-case in your command but lower-case in your
Oh yes... Thanks Andrew you are right
Meanwhile I see two warnings
WARN driver.MahoutDriver: No wikipediaXMLSplitter.props found on classpath,
will use command-line arguments only
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes
You can ignore the warnings.
On Saturday, March 8, 2014 2:58 PM, Mahmood Naderan nt_mahm...@yahoo.com
wrote:
Oh yes... Thanks Andrew you are right
Meanwhile I see two warnings
WARN driver.MahoutDriver: No wikipediaXMLSplitter.props found on classpath,
will use command-line arguments
What a fast reply... Thanks a lot Suneel,
Regards,
Mahmood
On Saturday, March 8, 2014 11:29 PM, Suneel Marthi suneel_mar...@yahoo.com
wrote:
You can ignore the warnings.
On Saturday, March 8, 2014 2:58 PM, Mahmood Naderan nt_mahm...@yahoo.com
wrote:
Oh yes... Thanks Andrew you
Hi
When I run
mahout wikipediaXMLSplitter -d
examples/temp/enwiki-latest-pages-articles.xml -o wikipedia/chunks -c 64
I get this error
14/03/07 16:24:13 WARN driver.MahoutDriver: Unable to add class:
wikipediaXMLSplitter
java.lang.ClassNotFoundException: wikipediaXMLSplitter
at
Mehmood,
wikipediaXMLSplitter is not present in driver.classes.default.props. To
accomplish what you are trying to do, you can edit
src/conf/driver.classes.default.props and add an entry for wikipediaXmlSplitter:
org.apache.mahout.text.wikipedia.WikipediaXmlSplitter = wikipediaXmlSplitter : wikipedia splitter
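A scripted version of that edit might look like the sketch below. It works on a scratch copy, so the path is illustrative; in the real source tree the file is src/conf/driver.classes.default.props, and after editing it you would repackage so the updated props lands on the classpath.

```shell
# Append the missing alias to a scratch copy of the props file, then verify.
workdir=$(mktemp -d)
props="$workdir/driver.classes.default.props"
touch "$props"
echo 'org.apache.mahout.text.wikipedia.WikipediaXmlSplitter = wikipediaXmlSplitter : wikipedia splitter' >> "$props"
# Confirm the entry is present (prints the match count, 1).
grep -c 'WikipediaXmlSplitter' "$props"
```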
In fact, see this file
src/conf/driver.classes.default.props
which is not exactly what you said. I still have the same problem. Please
see the complete log:
hadoop@solaris:~/mahout-distribution-0.9$ head -n 5
src/conf/driver.classes.default.props
FYI, I am trying to complete the wikipedia example from Apache's document
https://cwiki.apache.org/confluence/display/MAHOUT/Wikipedia+Bayes+Example
Regards,
Mahmood
On Friday, March 7, 2014 5:23 PM, Mahmood Naderan nt_mahm...@yahoo.com wrote:
In fact, see this file
The example as documented on the Wiki should work. The issue is that you seem
to be running the Mahout 0.9 distro, built with the hadoop 1.2.1 profile, in a
Hadoop 2.3 environment. I don't think that's gonna work.
Suggest that you either:
a) Switch to a Hadoop 1.2.1 environment
b) Work off of present Head and build with Hadoop 2.x profile.
That is rather disappointing
b) Work off of present Head and build with Hadoop 2.x profile.
Can you explain more?
Regards,
Mahmood
On Friday, March 7, 2014 8:09 PM, Suneel Marthi suneel_mar...@yahoo.com wrote:
The example as documented on the Wiki should work. The issue u seem to
In the wiki page: 'Quick tour of text analysis using the Mahout command
line'.
https://cwiki.apache.org/confluence/display/MAHOUT/Quick+tour+of+text+analysis+using+the+Mahout+command+line
At the very bottom it is said that
1. This will generate the 10 most similar docs to each doc
On Feb 25, 2014, at 9:22 AM, Juan José Ramos jjar...@gmail.com wrote:
In the wiki page: 'Quick tour of text analysis using the Mahout command
line'.
https://cwiki.apache.org/confluence/display/MAHOUT/Quick+tour+of+text+analysis+using+the+Mahout+command+line
At the very bottom
On Thu, Jan 30, 2014 at 10:20 PM, qiaoresearcher
qiaoresearc...@gmail.comwrote:
Andrew Musselman and Suneel Marthi,
Thank you two and I really appreciate it! Wish you two very successful in
your company!
I was wondering is there any documentation or tutorial which can go through
how the
Andrew Musselman and Suneel Marthi,
Thank you two and I really appreciate it! Wish you two very successful in
your company!
I was wondering if there is any documentation or tutorial that goes through
how the Hadoop code is written, not like the Definitive Guide one.
For example, from the very
When I run a command like:
mahout seq2sparse -i inputfile -o outputfile
where is the command seq2sparse defined? How does the system know to
actually run the SparseVectorsFromSequenceFiles class?
What is the language used in the mahout command script, such as the language
given below:
Those aliases are defined in src/conf/driver.classes.default.props.
That language is shell-scripting, e.g. bash.
On Wed, Jan 29, 2014 at 2:15 PM, qiaoresearcher qiaoresearc...@gmail.comwrote:
when run the command like:
mahout seq2sparse -i inputfile -o outputfile
where is the command
That's a bash script that invokes a Java class, MahoutDriver, which reads the
props file mentioned earlier.
The props file is a mapping from the command name to the actual Java program.
For example, seq2sparse is mapped to SparseVectorsFromSequenceFiles in the
props.
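A rough shell sketch of that alias-to-class lookup (simplified: MahoutDriver does this in Java, and the props file below is trimmed to one entry with an approximate description):

```shell
# Simplified model of the props lookup MahoutDriver performs (alias -> class).
workdir=$(mktemp -d)
props="$workdir/driver.classes.default.props"
cat > "$props" <<'EOF'
org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles = seq2sparse : sparse vector generation
EOF

alias=seq2sparse
# Left of ' = ' is the class; the right side begins with the alias and ' : '.
class=$(awk -v a="$alias" -F' = ' '$2 ~ "^"a" :" {print $1}' "$props")
echo "$class"
```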
On Wednesday, January
I can run the examples under the mahout-distribution directory fine.
However, I cannot run the bin/mahout command; it gives the following errors.
mvn exec runs fine with mainClass (standard syntax) given (I ran a few
example classes from cf.taste and they run fine).
Following is the command that gives the error:
Did you build Mahout?
On Nov 27, 2011 12:03 PM, bish maten bishma...@gmail.com wrote:
I can run the examples under the mahout-distribution directory fine.
However cannot run the bin/mahout command, gives following errors.
mvn exec runs ok fine with mainClass (standard syntax) given (ran few
mvn compile done under subdirectory of mahout-distribution.
I think you need mvn package to create the artifacts, but that is just a
first guess.
On Nov 27, 2011 12:14 PM, bish maten bishma...@gmail.com wrote:
mvn compile done under subdirectory of mahout-distribution.
Now it gives the following error:
no HADOOP_HOME set, running locally
SLF4J: Failed to load class org.slf4j.impl.StaticLoggerBinder.
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further
details.
Exception in thread
On 27.11.2011 bish maten wrote:
mvn compile done under subdirectory of mahout-distribution.
Did you also run a mvn package from the mahout root directory?
Isabel
mvn install is the better command for this stuff.
mvn install -DskipTests
On Nov 27, 2011, at 11:59 AM, Isabel Drost wrote:
On 27.11.2011 bish maten wrote:
mvn compile done under subdirectory of mahout-distribution.
Did you also run a mvn package from the mahout root directory?
Isabel
Independent of maven: you are using an old version of Mahout. You may find
that some of the programs work better in the trunk than in the 0.5 release.
Also, the wiki documentation generally moves forward with the current trunk
and so it may be confusing.