I think there is already an example of this shipped with Spark. However,
you do not really benefit from any Spark functionality in this scenario.
If you want to do something more advanced, you should look at Elasticsearch
or Solr.
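
For the simple keyword-to-level case you describe, something like the
following would cover it end to end. This is only a minimal sketch using the
plain RDD API; the input path and the matching rules (whitespace tokens,
case-sensitive keywords) are assumptions you would adjust to your data:

import org.apache.spark.{SparkConf, SparkContext}

object LogLevelCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("LogLevelCount")
    val sc = new SparkContext(conf)

    // Keyword -> category mapping taken from the original question.
    val levels = Map("Error" -> "Level 1", "Warn" -> "Level 2", "Info" -> "Level 3")

    // Hypothetical input path; point it at wherever the documents live.
    val lines = sc.textFile("hdfs:///logs/*.txt")

    val counts = lines
      .flatMap(_.split("\\s+"))          // tokenize each line on whitespace
      .filter(levels.contains)           // keep only the keywords of interest
      .map(word => (levels(word), 1))    // tag each occurrence with its level
      .reduceByKey(_ + _)                // count occurrences per level

    counts.collect().foreach { case (level, n) => println(s"$level: $n") }

    sc.stop()
  }
}

At 200 words per document this is plain dictionary lookup, which is why a
search engine like Elasticsearch or Solr pays off once you need anything
beyond exact keyword matches.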

On Fri, Aug 28, 2015 at 16:15, Darksu <nick_tou...@hotmail.com> wrote:

> Hello,
>
> This is my first post, so I would like to congratulate the Spark team for
> the great work!
>
> In short, I have been studying Spark for the past week in order to create
> a feasibility project.
>
> The main goal of the project is to process text documents (each no more
> than 200 words) in order to find specific keywords:
>
> e.g., a log file can contain Error, Warn, Info
>
> Once the keywords are found, I would like to categorize them, e.g.:
>
> Error -> Level 1
> Warn  -> Level 2
> Info  -> Level 3
>
> The question is: what is the best approach to solving my problem?
>
> Thanks for the help!
>
> Best Regards,
>
> Darksu
