Re: Eclipse: Wrong project dependencies in projects generated by "sbt eclipse"

2016-02-25 Thread Allen Zhang
Well, I am using IDEA to import the code base. At 2016-02-25 22:13:11, "Łukasz Gieroń" <lgie...@gmail.com> wrote: I've just checked, and "mvn eclipse:eclipse" generates incorrect projects as well. On Thu, Feb 25, 2016 at 3:04 PM, Allen Zhang <allenzhang...

Re: Eclipse: Wrong project dependencies in projects generated by "sbt eclipse"

2016-02-25 Thread Allen Zhang
Running dev/change-scala-version 2.10 may help you. At 2016-02-25 21:55:49, "lgieron" wrote: >The Spark projects generated by the sbt eclipse plugin have incorrect dependent >projects (as visible on Properties -> Java Build Path -> Projects tab). All >dependent projects are missing

Re: Eclipse: Wrong project dependencies in projects generated by "sbt eclipse"

2016-02-25 Thread Allen Zhang
Why not use Maven? At 2016-02-25 21:55:49, "lgieron" wrote: >The Spark projects generated by the sbt eclipse plugin have incorrect dependent >projects (as visible on Properties -> Java Build Path -> Projects tab). All >dependent projects are missing the "_2.11" suffix (for

RE: Using CUDA within Spark / boosting linear algebra

2016-01-21 Thread Allen Zhang
Hi Kazuaki, JCuda is actually a wrapper around **pure** CUDA, and as your wiki page shows, the 3.15x performance boost for logistic regression seems slower than BIDMat-cublas or pure CUDA. Could you elaborate on why you chose JCuda rather than JNI to call CUDA directly? Regards, Allen Zhang

Re: [discuss] dropping Python 2.6 support

2016-01-05 Thread Allen Zhang
+1, we are currently using Python 2.7.2 in our production environment. On 2016-01-05 18:11:45, "Meethu Mathew" wrote: +1 We use Python 2.7 Regards, Meethu Mathew On Tue, Jan 5, 2016 at 12:47 PM, Reynold Xin wrote: Does anybody here care
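For context on why dropping 2.6 matters (my own illustration, not taken from the thread): Python 2.7 introduced syntax that fails outright on 2.6, so keeping 2.6 support forbids using it anywhere in the codebase. A minimal sketch:

```python
# Dict and set comprehensions were added in Python 2.7; on Python 2.6
# the next two lines are SyntaxErrors at parse time, so a codebase that
# must still run on 2.6 cannot use them at all.
squares = {n: n * n for n in range(5)}
evens = {n for n in range(10) if n % 2 == 0}

print(squares[4])  # 16
print(len(evens))  # 5
```

Dropping 2.6 lets such constructs (and 2.7-only stdlib additions) be used freely.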

Re: Support off-loading computations to a GPU

2016-01-03 Thread Allen Zhang
Hi Kazuaki, I am looking at http://kiszk.github.io/spark-gpu/ ; can you point me to the kick-start scripts so that I can give it a go? To be more specific, what does *"off-loading"* mean? Does it aim to reduce the copy overhead between CPU and GPU? I am a newbie with GPUs; how can I specify

Re: latest Spark build error

2015-12-25 Thread Allen Zhang
Try the -pl option in the mvn command, and append -am or -amd for more options. For instance: mvn clean install -pl :spark-mllib_2.10 -DskipTests At 2015-12-25 17:57:41, "salexln" wrote: >One more question: >Is there a way to build only MLlib from the command line?

Re: [VOTE] Release Apache Spark 1.6.0 (RC4)

2015-12-23 Thread Allen Zhang
+1 (non-binding) I have just built a new binary tarball and manually tested am.nodelabelexpression and executor.nodelabelexpression; the results are as expected. At 2015-12-23 21:44:08, "Iulian Dragoș" wrote: +1 (non-binding) Tested Mesos deployments (client and

Re: A proposal for Spark 2.0

2015-12-21 Thread Allen Zhang
...about GPUs, please start a new thread. On Mon, Dec 21, 2015 at 11:18 PM, Allen Zhang <allenzhang...@126.com> wrote: plus dev On 2015-12-22 15:15:59, "Allen Zhang" <allenzhang...@126.com> wrote: Hi Reynold, Any new API support for GPU computing in our 2.0 new version

Re: A proposal for Spark 2.0

2015-12-21 Thread Allen Zhang
Plus dev. On 2015-12-22 15:15:59, "Allen Zhang" <allenzhang...@126.com> wrote: Hi Reynold, any new API support for GPU computing in our new 2.0 version? -Allen On 2015-12-22 14:12:50, "Reynold Xin" <r...@databricks.com> wrote: FYI I updated the master

Re: Re: [VOTE] Release Apache Spark 1.6.0 (RC3)

2015-12-16 Thread Allen Zhang
+1 On 2015-12-17 09:39:39, "Joseph Bradley" wrote: +1 On Wed, Dec 16, 2015 at 5:26 PM, Reynold Xin wrote: +1 On Wed, Dec 16, 2015 at 5:24 PM, Mark Hamstra wrote: +1 On Wed, Dec 16, 2015 at 1:32 PM, Michael

Re: does spark really support label expr like && or || ?

2015-12-16 Thread Allen Zhang
In more detail, the commands were: yarn rmadmin -replaceLabelsOnNode spark-dev:54321,foo; yarn rmadmin -replaceLabelsOnNode sut-1:54321,bar; yarn rmadmin -replaceLabelsOnNode sut-2:54321,bye; yarn rmadmin -replaceLabelsOnNode sut-3:54321,foo; At 2015-12-17 10:31:20, "Allen

Re: Re: does spark really support label expr like && or || ?

2015-12-16 Thread Allen Zhang
...in building DataFrame boolean expressions. Example: >>> df = sqlContext.range(10) >>> df.where( (df.id==1) | ~(df.id==1) ) DataFrame[id: bigint] On Wed, Dec 16, 2015 at 4:32 PM, Allen Zhang <allenzhang...@126.com> wrote: Hi All, does spark label expression really suppo
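The reason DataFrame expressions use `|` and `~` rather than `or` and `not` is that Python only lets a class overload the bitwise operators; the boolean keywords always go through truthiness and cannot build an expression tree. A toy sketch of the idea (my own illustration mimicking a Column-like class, not Spark's actual implementation):

```python
class Expr:
    """Toy stand-in for a DataFrame Column expression tree."""
    def __init__(self, text):
        self.text = text

    # Python allows overloading the bitwise operators | & ~ ...
    def __or__(self, other):
        return Expr("(%s OR %s)" % (self.text, other.text))

    def __and__(self, other):
        return Expr("(%s AND %s)" % (self.text, other.text))

    def __invert__(self):
        return Expr("(NOT %s)" % self.text)

    # ...but `or`, `and`, and `not` cannot be overloaded to return
    # an Expr, which is why DataFrame code must use | & ~ instead.

cond = Expr("id = 1") | ~Expr("id = 1")
print(cond.text)  # (id = 1 OR (NOT id = 1))
```

This is why `df.where((df.id == 1) | ~(df.id == 1))` works while `df.where((df.id == 1) or not (df.id == 1))` cannot.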

does spark really support label expr like && or || ?

2015-12-16 Thread Allen Zhang
Hi All, does Spark's label expression really support "&&", "||", or even "!" for label-based scheduling? I tried that but it does NOT work. Best Regards, Allen
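For readers wondering what such an expression would mean if it were supported: a node label expression like `foo && !bar` would select nodes whose label set contains `foo` and does not contain `bar`. Whether the YARN version in use accepted these operators is exactly what the thread is asking; the sketch below (my own illustration, not anything YARN or Spark ships) only shows the intended boolean semantics:

```python
import re

def eval_label_expr(expr, labels):
    """Evaluate a label expression such as 'foo && !bar || bye' against a
    node's label set: && / || / ! become Python's and / or / not, and each
    label name becomes a membership test against the set."""
    py = expr.replace("&&", " and ").replace("||", " or ").replace("!", " not ")
    # Replace every remaining identifier (except the keywords we introduced)
    # with the literal result of its membership test.
    py = re.sub(r"\b(?!and\b|or\b|not\b)([A-Za-z_]\w*)",
                lambda m: repr(m.group(1) in labels), py)
    return eval(py)  # safe here: only True/False/and/or/not remain

print(eval_label_expr("foo && !bar", {"foo"}))         # True
print(eval_label_expr("foo && !bar", {"foo", "bar"}))  # False
```

The `yarn rmadmin -replaceLabelsOnNode` commands earlier in the thread only assign a single label per node; combining labels in an expression is a separate question from assigning them.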