> my
> Scala code. 'com.snowplowanalytics:scala-maxmind-iplookups:0.2.0' was
> imported in Gradle.
>
> spark version: 1.6.0
> scala: 2.10.4
> scala-maxmind-iplookups: 0.2.0
>
> I ran my test and got the error below:
> java.lang.NoClassDefFoundError:
> scala/collection/JavaCon
  "com.snowplowanalytics" % "scala-maxmind-iplookups" % "0.2.0"
)
Otherwise, to process streaming logs I use Logstash with Kafka as input.
You can set Kafka as output if you need to do some extra computation
with Spark.
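A minimal sketch of that Logstash pipeline (the topic names, the `clientip`
field, and the exact option spellings are illustrative assumptions; Kafka
plugin options vary a lot between Logstash versions):

```conf
input {
  kafka { topic_id => "raw-logs" }        # hypothetical input topic
}
filter {
  geoip { source => "clientip" }          # hypothetical field holding the IP
}
output {
  kafka { topic_id => "enriched-logs" }   # hand the enriched events to Spark
}
```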
On 23/02/2016 15:07, Romain Sagean wrote:
Hi,
I use MaxMind GeoIP with Spark (no streaming). To make it work you should
use mapPartitions. I don't know if something similar exists for Spark
Streaming.
My code, for reference:
def parseIP(ip: String, ipLookups: IpLookups): List[String] = {
  val lookupResult = ipLookups.performLookups(ip) // library's lookup call
  List(lookupResult._1.map(_.countryName).getOrElse("")) // assumed completion; rest of the original body was cut off
}
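The point of mapPartitions here is to build the expensive IpLookups object
once per partition rather than once per record. A pure-Scala sketch of that
pattern, with a hypothetical `Lookup` class standing in for IpLookups so it
runs without Spark:

```scala
object PerPartitionSketch {
  // Hypothetical stand-in for IpLookups: pretend construction is expensive.
  final class Lookup {
    def resolve(ip: String): String = s"geo:$ip"
  }

  // Same shape as the function passed to rdd.mapPartitions: the Lookup is
  // created once, then reused for every record the iterator yields.
  def lookupPartition(records: Iterator[String]): Iterator[String] = {
    val lookup = new Lookup // one instance per partition
    records.map(lookup.resolve)
  }

  def main(args: Array[String]): Unit = {
    // Two "partitions" simulated with plain Seqs.
    val parts = Seq(Seq("1.2.3.4", "5.6.7.8"), Seq("9.9.9.9"))
    println(parts.flatMap(p => lookupPartition(p.iterator)).mkString(" "))
  }
}
```

With Spark this becomes `rdd.mapPartitions(lookupPartition)`, so the costly
constructor runs once per partition instead of once per element.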
> > Thanks,
> >
> > Alex.
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
--
*Romain Sagean*
*romain.sag...@hupi.fr*
long instant = getChronology().days().add(getMillis(), days);
>> Maybe catch the NPE and print out the value of currentDate to see if
>> there is more of a clue?
>>
>> Cheers
>>
>> On Tue, Nov 10, 2015 at 12:55 PM, Romain Sagean <romain.sag...@hupi.fr>
>> wrote:
>>
Hi community,
I am trying to apply the function below inside a flatMapValues or a map,
but I get a NullPointerException at the plusDays(1) call. What did I miss?
def allDates(dateSeq: Seq[DateTime], dateEnd: DateTime): Seq[DateTime] = {
  if (dateSeq.last.isBefore(dateEnd))
    allDates(dateSeq :+ dateSeq.last.plusDays(1), dateEnd) // assumed completion of the truncated line
  else
    dateSeq
}
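For what it's worth, the recursion itself terminates and builds the expected
date range as long as dateSeq is non-empty and contains no nulls. A runnable
sketch of the same logic, using java.time.LocalDate in place of Joda's
DateTime purely so it runs standalone (an assumption, not the poster's code):

```scala
import java.time.LocalDate

object AllDatesSketch {
  // Same recursion as above, on java.time.LocalDate instead of Joda DateTime.
  def allDates(dateSeq: Seq[LocalDate], dateEnd: LocalDate): Seq[LocalDate] =
    if (dateSeq.last.isBefore(dateEnd))
      allDates(dateSeq :+ dateSeq.last.plusDays(1), dateEnd)
    else
      dateSeq

  def main(args: Array[String]): Unit = {
    val out = allDates(Seq(LocalDate.of(2015, 11, 1)), LocalDate.of(2015, 11, 3))
    println(out.mkString(",")) // 2015-11-01,2015-11-02,2015-11-03
  }
}
```

Since the logic is sound on non-null input, it is worth checking whether the
Seq can be empty (dateSeq.last throws) or hold null DateTimes after a
shuffle before blaming plusDays itself.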
2015-11-10 18:39 GMT+01:00 Ted Yu <yuzhih...@gmail.com>:
> Can you show the stack trace for the NPE ?
>
> Which release of Spark are you using ?
>
> Cheers
>
> On Tue, Nov 10, 2015 at 8:20 AM, romain sagean <romain.sag...@hupi.fr>
> wrote:
>
>> Hi c
--
Romain Sagean