Yes, there are Kafka consumers/producers for almost all languages; you
can read more over here:
https://cwiki.apache.org/confluence/display/KAFKA/Clients#Clients-PHP
Here's a repo for the PHP client: https://github.com/EVODelavega/phpkafka
Thanks
Best Regards
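Akhil's point (any language with a Kafka client can produce to a topic directly, without a socket in between) can be sketched as follows. This is a minimal Python illustration, not the PHP client linked above; the broker address, topic name, and helper names are placeholders of mine, and the `kafka-python` package is assumed:

```python
# Hypothetical sketch: publishing a web request straight to Kafka instead
# of a raw socket. Broker address and topic name are placeholders.
import json


def encode_request(request: dict) -> bytes:
    """Serialize a request dict to the bytes a Kafka producer sends."""
    return json.dumps(request, sort_keys=True).encode("utf-8")


def publish(request: dict, topic: str = "web-requests") -> None:
    # Requires a running broker and the third-party kafka-python package.
    from kafka import KafkaProducer  # pip install kafka-python
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send(topic, encode_request(request))
    producer.flush()  # block until the message is actually delivered


# Example (only works against a live broker):
# publish({"user": "tarek", "action": "search"})
```

The same shape applies from PHP via the client linked above: serialize the request, hand it to a producer, and let Spark Streaming consume the topic on the other side.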
On Sun, Oct 18, 2015 at 12:58 PM:
[reply truncated; it refers to the WAL based implementation]
Hope this helps,
-adrian
From: "tarek.abouzei...@yahoo.com.INVALID"
Reply-To: "tarek.abouzei...@yahoo.com"
Date: Sunday, October 18, 2015 at 10:28 AM
To: Xiao Li, Akhil Das
Cc: "user@spark.apache.org"
Subject: Re: Spark handling parallel requests
Hi Akhil,
It's a must to push data to a socket, as I am using PHP as a web service to
push data to the socket; Spark then catches the data on that socket and
processes it. Is there a way to push data from PHP to Kafka directly?
--
Best Regards,
Tarek Abouzeid
On Sunday, October 18, 2015
Instead of pushing your requests to the socket, why don't you push them to
Kafka or another message queue and use Spark Streaming to process them?
Thanks
Best Regards
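The decoupling Akhil recommends can be sketched in miniature with Python's stdlib; `queue.Queue` stands in for Kafka here purely to show the shape of the design (all names are illustrative, and the `upper()` call is a stand-in for the Spark job):

```python
# Minimal sketch of the suggestion above: producers drop requests onto a
# queue and a separate consumer drains it, instead of producers talking to
# the processing socket directly.
import queue
import threading

requests = queue.Queue()
results = []


def consumer():
    # Drain the queue until a None sentinel arrives. In production this is
    # a consumer polling a Kafka topic rather than blocking on a socket.
    while True:
        item = requests.get()
        if item is None:
            break
        results.append(item.upper())  # stand-in for the Spark job


# Many producers (the web tier) can enqueue concurrently without waiting
# for processing to finish -- that is the decoupling being recommended.
t = threading.Thread(target=consumer)
t.start()
for msg in ["req-1", "req-2", "req-3"]:
    requests.put(msg)
requests.put(None)  # sentinel: no more work
t.join()
print(results)  # ['REQ-1', 'REQ-2', 'REQ-3']
```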
On Mon, Oct 5, 2015 at 6:46 PM, wrote:
> Hi,
> I am using Scala, doing a socket
Hi Tarek,
It is hard to answer your question. Are these requests similar? Are you
caching your results or intermediate results in your applications? Does that
mean your throughput requirement is very high? Are you throttling the number
of concurrent requests? ...
As Akhil said, Kafka might help in your case.
Hi,
I am using Scala, doing a socket program to catch multiple requests at the
same time and then call a function which uses Spark to handle each process.
I have a multi-threaded server to handle the multiple requests and pass each
to Spark, but there's a bottleneck as Spark doesn't
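For reference, the setup Tarek describes (a multi-threaded server catching concurrent requests and dispatching each to a processing function) can be sketched like this. Python stdlib only, purely illustrative; the original is in Scala, and `handle_request` is a hypothetical stand-in for the call into Spark:

```python
# Illustrative sketch: a multi-threaded socket server that accepts many
# requests at once and hands each one to a processing function.
import socket
import socketserver
import threading


def handle_request(payload: bytes) -> bytes:
    # Placeholder for the per-request Spark call in the original design.
    return payload.upper()


class Handler(socketserver.BaseRequestHandler):
    def handle(self):
        data = self.request.recv(1024)
        self.request.sendall(handle_request(data))


class ThreadedServer(socketserver.ThreadingMixIn, socketserver.TCPServer):
    daemon_threads = True  # one handler thread per connection


def demo() -> bytes:
    # Bind to an ephemeral port, serve in the background, send one request.
    with ThreadedServer(("127.0.0.1", 0), Handler) as server:
        threading.Thread(target=server.serve_forever, daemon=True).start()
        with socket.create_connection(server.server_address) as conn:
            conn.sendall(b"hello spark")
            reply = conn.recv(1024)
        server.shutdown()
    return reply


print(demo())  # b'HELLO SPARK'
```

The bottleneck in the thread is exactly why the replies above suggest putting a queue (Kafka) between this accept loop and the processing side.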