Hi Vikas,

I think you want something like Snowplow Analytics for clickstream
analytics, plus a Snowplow relay (code that sends a copy of those events)
to Unomi.
I've been thinking about that too. Let me know if you want to join forces
to build that Snowplow relay.
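Conceptually, the relay could be a small consumer that maps each Snowplow event to Unomi's event shape and POSTs a copy to Unomi's event collector. Here's a minimal sketch, assuming a simplified event with fields like `event_name`, `page_url`, and `user_id` (the real Snowplow enriched-event schema is richer), and assuming Unomi's default `/eventcollector` endpoint on port 8181 (check your deployment's actual URL and auth):

```python
import json
import urllib.request

# Assumed default Unomi endpoint; adjust host/port/path for your deployment.
UNOMI_URL = "http://localhost:8181/eventcollector"

def to_unomi_event(sp_event):
    """Map a (simplified) Snowplow event to a Unomi eventcollector payload.
    Field names on the Snowplow side are illustrative, not the real
    enriched-event schema."""
    return {
        "events": [{
            "eventType": sp_event.get("event_name", "view"),
            "scope": "snowplow-relay",
            "properties": {
                "pageUrl": sp_event.get("page_url"),
                "userId": sp_event.get("user_id"),
            },
        }]
    }

def relay(sp_event):
    """POST a copy of one event to Unomi (fire-and-forget)."""
    payload = json.dumps(to_unomi_event(sp_event)).encode("utf-8")
    req = urllib.request.Request(
        UNOMI_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    sample = {"event_name": "page_view", "page_url": "/home", "user_id": "u42"}
    print(json.dumps(to_unomi_event(sample), indent=2))
```

In practice you'd drive `relay()` from a Kafka consumer loop on the Snowplow enriched topic, so the same stream can fan out to both Spark and Unomi.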

For Snowplow you can set up your own https://github.com/snowplow/snowplow or
use a hosted version https://www.snowcatcloud.com/ .

See what Snowplow is here: https://www.youtube.com/watch?v=Yx3Z733ElgI

Thanks
Joao Correia

--

*João Correia*

San Diego, CA
mobile: +1 (858) 284-6010
web: https://joaocorreia.io

On Thu, Jan 2, 2020 at 1:29 AM Vikas Yadav <vikas.ya...@mattea.io> wrote:

> Hi folks,
> We are currently working with Unomi for personalization in a local setup,
> and I have some questions. We are building an application that uses
> Apache Spark for preprocessing data and Apache Unomi for personalization,
> connected through Kafka. When a user comes to our website, can we track
> their clickstream data, send it to Unomi and Spark at the same time, and
> perform analytics on that data? Also, does Unomi perform analytics itself,
> or does it only store the personalization data? And can we run analytics
> on the data by other approaches alongside Unomi and Spark?
>
> ---
> Thanks
>
> Vikas Yadav
>
> Enterprise Software Engineer
>
> MATTEA
>
> The Magic of Personalization
>
>