You need to add
spark/external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro
to the build path (mark it as a source root in IntelliJ). If I remember
correctly, that is the only thing you need to do manually.
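
In case it helps, here is roughly what I do; the exact command is from my
own checkout, so it may need adjusting for your branch:

    # One-time: let a command-line build generate the Avro sources
    # (SparkFlumeProtocol, EventBatch, ...) under compiled_avro.
    build/mvn -DskipTests package

Then, in IntelliJ, right-click the compiled_avro directory and choose
"Mark Directory as" -> "Generated Sources Root" so the editor can resolve
the generated classes.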



On Thu, Jun 23, 2016 at 2:30 PM, Stephen Boesch <java...@gmail.com> wrote:

> Hi Jeff,
>   I'd like to understand what may be different. I have rebuilt and
> reimported many times. Just now I blew away the .idea/* and *.iml files to
> start from scratch, then opened the $SPARK_HOME directory from IntelliJ via
> File | Open. After it finished the initial import, I tried to run one of
> the examples, and it fails in the build:
>
> Here are the errors I see:
>
> Error:(45, 66) not found: type SparkFlumeProtocol
>   val transactionTimeout: Int, val backOffInterval: Int) extends
> SparkFlumeProtocol with Logging {
>                                                                  ^
> Error:(70, 39) not found: type EventBatch
>   override def getEventBatch(n: Int): EventBatch = {
>                                       ^
> Error:(85, 13) not found: type EventBatch
>         new EventBatch("Spark sink has been stopped!", "",
> java.util.Collections.emptyList())
>             ^
>
> /git/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala
> Error:(80, 22) not found: type EventBatch
>   def getEventBatch: EventBatch = {
>                      ^
> Error:(48, 37) not found: type EventBatch
>   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
> Error", "",
>                                     ^
> Error:(48, 54) not found: type EventBatch
>   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
> Error", "",
>                                                      ^
> Error:(115, 41) not found: type SparkSinkEvent
>         val events = new util.ArrayList[SparkSinkEvent](maxBatchSize)
>                                         ^
> Error:(146, 28) not found: type EventBatch
>           eventBatch = new EventBatch("", seqNum, events)
>                            ^
>
> /git/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala
> Error:(25, 27) not found: type EventBatch
>   def isErrorBatch(batch: EventBatch): Boolean = {
>                           ^
>
> /git/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala
> Error:(86, 51) not found: type SparkFlumeProtocol
>     val responder = new SpecificResponder(classOf[SparkFlumeProtocol],
> handler.get)
>                                                   ^
>
>
> Note: this is just the first batch of errors.
>
>
>
>
> 2016-06-22 20:50 GMT-07:00 Jeff Zhang <zjf...@gmail.com>:
>
>> It works fine for me. You can try reimporting it into IntelliJ.
>>
>> On Thu, Jun 23, 2016 at 10:25 AM, Stephen Boesch <java...@gmail.com>
>> wrote:
>>
>>>
>>> Building inside IntelliJ is an ever-moving target. Does anyone have the
>>> magical procedure to get it going for 2.X?
>>>
>>> There are numerous library references that - although included in the
>>> pom.xml build - are for some reason not found when processed within
>>> IntelliJ.
>>>
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
>


-- 
Best Regards

Jeff Zhang
