Sean,

Would you kindly suggest on which forum, mailing list, or issue tracker I should
ask questions about the AAS book?
Or is no such provision made?

regards,
Deepak
--------------------------------------------
On Thu, 2/26/15, Sean Owen <so...@cloudera.com> wrote:

 Subject: Re: value foreach is not a member of java.util.List[edu.stanford.nlp.util.CoreMap]
 To: "Deepak Vohra" <dvohr...@yahoo.com>
 Cc: "user@spark.apache.org" <user@spark.apache.org>
 Date: Thursday, February 26, 2015, 10:43 AM
 
 (Books on Spark are not produced by the Spark project, and this is not
 the right place to ask about them. This question was already answered
 offline, too.)
 
 On Thu, Feb 26, 2015 at 6:38 PM, Deepak Vohra
 <dvohr...@yahoo.com.invalid> wrote:
 > Ch 6 listing from Advanced Analytics with Spark generates error. The
 > listing is
 >
 > def plainTextToLemmas(text: String, stopWords: Set[String],
 >     pipeline: StanfordCoreNLP): Seq[String] = {
 >   val doc = new Annotation(text)
 >   pipeline.annotate(doc)
 >   val lemmas = new ArrayBuffer[String]()
 >   val sentences = doc.get(classOf[SentencesAnnotation])
 >   for (sentence <- sentences;
 >        token <- sentence.get(classOf[TokensAnnotation])) {
 >     val lemma = token.get(classOf[LemmaAnnotation])
 >     if (lemma.length > 2 && !stopWords.contains(lemma) &&
 >         isOnlyLetters(lemma)) {
 >       lemmas += lemma.toLowerCase
 >     }
 >   }
 >   lemmas
 > }
 >
 > The error is
 >
 > <console>:37: error: value foreach is not a member of
 > java.util.List[edu.stanford.nlp.util.CoreMap]
 >          for (sentence <- sentences; token <- sentence.get(classOf[TokensAnnotation])) {
 >                           ^
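 
 [For later readers: the compile error likely arises because CoreNLP's
 doc.get(...) returns a java.util.List, which is not a Scala collection and
 so has no foreach to drive the for-comprehension. A minimal sketch of the
 fix, assuming Scala 2.10/2.11 as used by the book: bring in the Java/Scala
 collection conversions, either the implicit scala.collection.JavaConversions._
 (which the book's full listing imports) or the explicit .asScala decorator
 from scala.collection.JavaConverters. The demo below uses a plain
 java.util.List so it stands alone without the CoreNLP jars:]

```scala
// Explicit conversion via .asScala; the implicit alternative is
// `import scala.collection.JavaConversions._`, which converts silently.
import scala.collection.JavaConverters._

object ForeachOnJavaList {
  def main(args: Array[String]): Unit = {
    val javaList: java.util.List[String] =
      java.util.Arrays.asList("one", "two", "three")

    // javaList.foreach(println)  // would NOT compile: java.util.List has no foreach

    // .asScala wraps the Java list as a scala.collection.mutable.Buffer,
    // which supports foreach and therefore for-comprehensions.
    for (s <- javaList.asScala) {
      println(s)
    }
  }
}
```

 [The same import makes the book's `for (sentence <- sentences; ...)` loop
 compile once `sentences.asScala` (or the implicit conversion) is in scope.]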
 
 ---------------------------------------------------------------------
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
 

