Should I add dependencies for spark-core_2.10, spark-yarn_2.10, spark-streaming_2.10, spark-mllib_2.10, spark-hive_2.10 and spark-graphx_2.10 (all under the org.apache.spark group) in pom.xml? If so, there are 7 pom.xml files in HiBench, listed below; which one should I modify?
[root@spark-study HiBench-master]# find ./ -name pom.xml
./src/nutchindexing/pom.xml
./src/pegasus/pom.xml
./src/sparkbench/pom.xml
./src/mahout/pom.xml
./src/hivebench/pom.xml
./src/pom.xml
./src/autogen/pom.xml
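
If they do need to be declared, the natural place would be ./src/sparkbench/pom.xml, since that is the module whose build is failing. A minimal sketch of what the entries could look like is below; the Scala 2.10 artifact ids are the ones listed above, but the ${spark.version} property is an assumption, so check what the sparkbench pom actually defines:

  <!-- sketch only: assumes sparkbench/pom.xml defines a spark.version property -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <!-- repeat for spark-yarn_2.10, spark-streaming_2.10, spark-hive_2.10 and spark-graphx_2.10 -->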


--------------------------------

 

Thanks & Best regards!
San.Luo

----- Original Message -----
From: <luohui20...@sina.com>
To: "jie.huang" <jie.hu...@intel.com>, "Ted Yu" <yuzhih...@gmail.com>
Cc: "user" <user@spark.apache.org>
Subject: Re: RE: Hibench build fail
Date: 2015-07-08 15:38

Hi Ted and Grace,
     I retried with Spark 1.4.0 and it still failed in the same way; here is a log, FYI. What other details would help?
     BTW, is building HiBench a required step before running the HiBench tests against my Spark cluster? I also tried skipping the HiBench build and running "bin/run-all.sh" directly, but that produced errors as well.
     Thanks.
      
       


--------------------------------

 

Thanks & Best regards!
San.Luo

----- Original Message -----
From: "Huang, Jie" <jie.hu...@intel.com>
To: Ted Yu <yuzhih...@gmail.com>, 罗辉 <luohui20...@sina.com>
Cc: user <user@spark.apache.org>
Subject: RE: Hibench build fail
Date: 2015-07-08 09:20





Hi Hui,
 
Could you please add more description (about the failure) in the HiBench GitHub issues?
 
HiBench works with Spark 1.2 and above.

 
Thank you && Best Regards,
Grace
(Huang Jie)
 
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Wednesday, July 8, 2015 12:50 AM
To: 罗辉
Cc: user; Huang, Jie
Subject: Re: Hibench build fail
 

bq. Need I specify my spark version

 


Looks like the build used 1.4.0 SNAPSHOT. Please use 1.4.0 release.
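
In pom.xml terms that normally means pointing the Spark dependency version at the published release rather than a snapshot, for example (the spark.version property name is only an assumption about how the HiBench poms are parameterized):

  <properties>
    <!-- 1.4.0 release artifacts are published to Maven Central; 1.4.0-SNAPSHOT is not -->
    <spark.version>1.4.0</spark.version>
  </properties>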


 


Cheers



 

On Mon, Jul 6, 2015 at 11:50 PM, <luohui20...@sina.com> wrote:

Hi Grace,
     Recently I have been trying HiBench to evaluate my Spark cluster, but I ran into a problem building it; would you help take a look? Thanks.
     The build fails at the Sparkbench module; please see the attached picture for more info.
     My Spark version is 1.3.1, Hadoop version 2.7.0, HiBench version 4.0, Python 2.6.6. The build reports failures for Spark 1.4 and MR1, neither of which is installed in my cluster. Do I need to specify my Spark version and Hadoop version when running "bin/build-all.sh"?
     Thanks.
 

--------------------------------

Thanks & Best regards!
San.Luo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org

 



