ally. I would suggest that instead of
> communicating through the shardRegion actor, create a singleton actor
> locally to the system that "wraps"/proxies-to the shardRegion actor
> emulating its interface but evaluating your constraints.
>
> -Endre
>
>
>
> On Tue, Jan
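Endre's proxy-singleton idea above can be sketched without Akka on the classpath. In a real system the proxy would be an actor (e.g. a cluster singleton) forwarding to the `ClusterSharding` region `ActorRef`; here `ShardEnvelope`, `forward`, and `allowed` are illustrative names standing in for that machinery, not Akka API:

```scala
// Minimal sketch of a constraint-checking proxy in front of a shard region.
// All names are illustrative; in a real system this would be an actor
// (e.g. a cluster singleton) forwarding to the sharding region ref.
final case class ShardEnvelope(entityId: String, payload: Any)

class ShardRegionProxy(
    forward: ShardEnvelope => Unit,   // stands in for `shardRegion ! msg`
    allowed: ShardEnvelope => Boolean // the security constraint to evaluate
) {
  // Emulates the shard region's message interface, but filters first.
  def !(msg: ShardEnvelope): Unit =
    if (allowed(msg)) forward(msg)
    else () // in a real actor: drop, log, or reply with an access-denied message
}

object ProxyDemo {
  def main(args: Array[String]): Unit = {
    val delivered = scala.collection.mutable.ListBuffer.empty[ShardEnvelope]
    // Example constraint: only entity ids with a "user-" prefix may be addressed.
    val proxy = new ShardRegionProxy(delivered += _, _.entityId.startsWith("user-"))
    proxy ! ShardEnvelope("user-42", "hello")
    proxy ! ShardEnvelope("admin-1", "nope") // filtered out
    println(delivered.map(_.entityId).mkString(",")) // user-42
  }
}
```

The point is that callers never get a direct reference to the shard region, so every message passes through the constraint check.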
Hi all,
I'm currently implementing an Akka Sharding based infrastructure and found
a limitation in the solution that could become a security problem.
There is a web layer (Play) and a data access layer in the Akka
cluster. The web layer communicates through an Akka Sharding proxy
tnks,
Rod
On Wednesday, December 2, 2015 at 11:52:07 AM UTC, Rodrigo Boavida wrote:
>
> Hi Iulian,
>
> I'm getting build errors using sbt due to the issues you found in the
> below thread in July of this year.
>
> https://mail-archives.apache.org/mod_mbox/spark-dev/201
errors somehow? (also sent you a pvt
msg)
tnks,
Rod
On Tuesday, December 1, 2015 at 9:40:17 PM UTC, Iulian Dragoș wrote:
>
>
>
> On Tuesday, December 1, 2015 at 8:52:30 PM UTC+1, Rodrigo Boavida wrote:
>>
>> Hi Iulian,
>>
>> Thanks for the hint. Just to make
t 6:09:39 PM UTC+1, drewhk wrote:
On Tue, Dec 1, 2015 at 5:37 PM, Rodrigo Boavida <rodrigo...@gmail.com> wrote:
If I revert to Akka 2.3.11 on Scala 2.11 I do not have any problem running
Spark. Seems something about the way Spark is calling the API that internally is
creating a c
Is in the Spark
> project, or some binary incompatibility in our release process leaked
> through. I have not suggested anything, I assessed the situation.
>
> Also, Akka 2.4 is built against Java 8, maybe that is your problem?
>
> -Endre
>
> On Tue, Dec 1, 2015 at 3:54 PM
in central
(https://repo1.maven.org/maven2) -> [Help 1]
On Tuesday, December 1, 2015 at 3:10:23 PM UTC, Rodrigo Boavida wrote:
>
> Hi Endre,
>
> This behavior is occurring after building Spark on Java 8. I'm tempted to
> assume, from what I've shown on this thread, that there is so
e binary compatibility of user facing API
>> (unlikely)
>>
>> -Endre
>>
>> On Tue, Dec 1, 2015 at 2:35 PM, Rodrigo Boavida <rodrigo...@gmail.com
>> > wrote:
>>
>>> Hello Akka users,
>>>
>>> I'm currently trying to build spark wi
nteed to be binary
> compatible (likely)
> - We accidentally broke binary compatibility of user facing API (unlikely)
>
> -Endre
>
> On Tue, Dec 1, 2015 at 2:35 PM, Rodrigo Boavida <rodrigo...@gmail.com
> > wrote:
>
>> Hello Akka users,
>>
>>
Hello Akka users,
I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0.
I've changed both the scala settings in Spark build files and the main
pom.xml file to corresponding akka version - 2.4.0 - and am getting the
following exception when starting the master on standalone:
.
If there are any hints on where to look I will appreciate it.
Tnks,
Rod
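For reference, the kind of build change described above would look roughly like this, assuming the Spark build of that era exposes the Scala and Akka coordinates as Maven properties in the root `pom.xml` (the exact property names are an assumption here; check the actual file):

```xml
<!-- spark/pom.xml — hedged sketch, hypothetical property names -->
<properties>
  <scala.version>2.11.7</scala.version>
  <scala.binary.version>2.11</scala.binary.version>
  <akka.version>2.4.0</akka.version>
</properties>
```

Any Akka artifacts resolved through those properties would then pull in 2.4.0 built for Scala 2.11.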
On Tuesday, December 1, 2015 at 3:50:40 PM UTC, Rodrigo Boavida wrote:
>
> Hi Endre,
>
> Spark is supposed to be binary compatible with Scala 2.11. It's actually
> documented. Here is where they docum
in upcoming 2.4.1
>
> /Patrik
>
> On Thu, Nov 12, 2015 at 6:15 PM, Rodrigo Boavida <rodrigo...@gmail.com
> > wrote:
>
>> Brice,
>>
>> Thanks for the answer. I will definitely have a look into what the events
>> could give me and how the strate
Hi all,
I'm currently starting to implement an Akka Sharding based infrastructure,
and given its dynamic load balancing and recoverability, part of the
monitoring process would involve tracking the current shards and the
entities owned by each shard.
It would be nice to know if there is
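Depending on the Akka version, the local shard region can be queried for exactly this. A sketch using the classic `ShardRegion.GetShardRegionState` query (requires the `akka-cluster-sharding` module; check that the query exists in your Akka version before relying on it):

```scala
// Requires akka-cluster-sharding on the classpath; not runnable standalone.
import akka.actor.ActorRef
import akka.cluster.sharding.ShardRegion
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Ask the local region which shards it owns and which entity ids each holds.
def logRegionState(region: ActorRef): Unit = {
  implicit val timeout: Timeout = 3.seconds
  (region ? ShardRegion.GetShardRegionState)
    .mapTo[ShardRegion.CurrentShardRegionState]
    .foreach { state =>
      state.shards.foreach { shard =>
        println(s"shard ${shard.shardId}: ${shard.entityIds.mkString(", ")}")
      }
    }
}
```

Note this reports only the shards hosted by the node the region ref lives on; cluster-wide stats need a separate query per node or an aggregate query where the version supports one.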
Brice,
Thanks for the answer. I will definitely have a look into what the events
could give me and how the strategy could be tuned to provide such events.
tnks,
Rod
On Thursday, November 12, 2015 at 4:24:27 PM UTC, Brice Figureau wrote:
>
> On Thu, 2015-11-12 at 07:41 -0800, Rodrigo B
viktor.kl...@gmail.com
wrote:
Tried making StateCategory extends Serializable?
On Mon, Sep 15, 2014 at 2:43 PM, Rodrigo Boavida
rodrigo.boav...@gmail.com wrote:
Hi,
I had a similar problem with case classes which also derive from an
abstract one. Did you get around it?
tnks.
Rod
On Tuesday
Hi,
I had a similar problem with case classes which also derive from an abstract
one. Did you get around it?
tnks.
Rod
On Tuesday, December 11, 2012 10:58:33 AM UTC, rkuhn wrote:
Hi,
case objects are still objects (i.e. not static), so they will be
serialized. Are you sure that
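The point about case objects being real objects (not Java statics) can be checked directly with a plain Java serialization round trip. `StateCategory` echoes the type named earlier in the thread; `Active`/`Archived` are illustrative. Scala also generates a `readResolve` for case objects, so the singleton identity survives deserialization:

```scala
import java.io._

// The abstract parent extends Serializable, as Viktor suggested —
// this is what lets case classes/objects deriving from it serialize.
sealed abstract class StateCategory extends Serializable
case object Active extends StateCategory
case object Archived extends StateCategory

object SerDemo {
  // Standard Java serialization round trip.
  def roundTrip[A <: AnyRef](a: A): AnyRef = {
    val bytes = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(bytes)
    out.writeObject(a); out.close()
    new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray)).readObject()
  }

  def main(args: Array[String]): Unit = {
    // A case object is serialized like any value...
    val back = roundTrip(Active)
    // ...and the compiler-generated readResolve restores the singleton.
    println(back eq Active) // true
  }
}
```

So the case object itself going over the wire is expected behavior; problems usually come from a non-serializable enclosing class being captured, not from the case object.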
on both layers
start successfully.
Looking forward to your comments.
Tnks,
Rod
On Monday, September 15, 2014 11:08:36 AM UTC+1, Patrik Nordwall wrote:
On Fri, Sep 12, 2014 at 6:26 PM, Rodrigo Boavida rodrigo...@gmail.com
wrote:
Hi Konrad,
We have the same requirement
Hi Konrad,
We have the same requirement. The reason we need this approach is that we
need two types of cluster: a Web and a Processing cluster. Both have
different lifecycles and concerns, but the Web cluster needs to subscribe
to data sources processed by the backend cluster. The