Hello! I'm deploying Nutch on two computers. When I run the start-all.sh script everything appears to start fine, but the datanode on the slave machine does not log anything. All the other Hadoop daemons (the namenode, the jobtracker, both tasktrackers, and the datanode on the master) log their output properly. Also, when I put files from the local file system into the Hadoop fs, they end up only in the master's data folder; the slave's data folder stays empty. Yet when I run the stop-all.sh script, I get a message that the slave's datanode is being stopped, which means it must have been running before. Do you know what may cause this problem?
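For context, this is roughly the shape of the configuration I understand both machines need to share (a sketch only — `master` and the ports are placeholders for my actual namenode hostname and ports, and I am assuming the classic conf/hadoop-site.xml layout):

```xml
<!-- conf/hadoop-site.xml (sketch; hostname and ports are placeholders).
     Both the master and the slave should point fs.default.name at the
     master's namenode, not at localhost; otherwise a slave datanode
     may try to register with a namenode on its own machine. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>master:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>master:9001</value>
  </property>
</configuration>
```

If someone can confirm whether a localhost value here on the slave would produce exactly these symptoms (datanode process starts but never registers or logs), that would help me narrow it down.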
_______________________________________________
Nutch-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-general
