this took me a bit to get working, but I finally got it up and running with 
the package that Burak pointed out.

here are some relevant links to my project that should give you some clues:

https://github.com/fluxcapacitor/pipeline/blob/master/myapps/spark/ml/src/main/scala/com/advancedspark/ml/nlp/ItemDescriptionsDF.scala

https://github.com/fluxcapacitor/pipeline/blob/master/myapps/spark/ml/build.sbt

https://github.com/fluxcapacitor/pipeline/blob/master/config/bash/pipeline.bashrc
 (look for the SPARK_SUBMIT_PACKAGES export var)
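for reference, the dependency wiring in build.sbt looks roughly like the sketch 
below. the coordinates and versions here are just illustrative - check the 
linked build.sbt and bashrc for the real ones - and the same spark-corenlp 
coordinate is what ends up in SPARK_SUBMIT_PACKAGES for spark-submit --packages:

  // build.sbt (sketch): the spark-corenlp wrapper plus the Stanford CoreNLP
  // models jar, which the wrapper does not pull in on its own.
  // coordinates/versions are illustrative - check the linked build.sbt.
  resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"

  libraryDependencies ++= Seq(
    "databricks" % "spark-corenlp" % "0.1",
    "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0",
    "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0" classifier "models"
  )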

there are also a few Zeppelin notebooks in that repo that will show you how to 
use it.  I'm doing sentiment analysis in one - as well as entity recognition 
and other fun stuff.
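if it helps, the core of what those notebooks do looks roughly like this. the 
function names (ssplit, ner, sentiment) come from the spark-corenlp wrapper, 
but treat the exact column names and setup as my assumptions rather than a 
copy of the notebooks:

  // sketch: sentence-level sentiment + named-entity tags on a DataFrame of
  // text, using the UDF-style helpers from databricks/spark-corenlp.
  // assumes a Zeppelin/spark-shell style sqlContext is already in scope.
  import org.apache.spark.sql.functions._
  import com.databricks.spark.corenlp.functions._
  import sqlContext.implicits._

  val docs = Seq(
    (1, "Stanford University is located in California. It is a great university.")
  ).toDF("id", "text")

  val annotated = docs
    .select('id, explode(ssplit('text)).as('sentence))   // split text into sentences
    .select('id, 'sentence,
      ner('sentence).as('entities),                      // entity recognition tags
      sentiment('sentence).as('sentiment))               // sentiment score per sentence

  annotated.show()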

it's a pain to set up, unfortunately.  not sure why.  there are a lot of 
missing pieces that had to be manually cobbled together.

> On Apr 19, 2016, at 5:00 PM, Burak Yavuz <brk...@gmail.com> wrote:
> 
> A quick search on spark-packages returns: 
> http://spark-packages.org/package/databricks/spark-corenlp.
> 
> You may need to build it locally and add it to your session by --jars.
> 
> On Tue, Apr 19, 2016 at 10:47 AM, Gavin Yue <yue.yuany...@gmail.com> wrote:
>> Hey, 
>> 
>> Want to try the NLP on the spark. Could anyone recommend any easy to run NLP 
>> open source lib on spark?
>> 
>> Also is there any recommended semantic network? 
>> 
>> Thanks a lot.  
> 
