The server's ulimit is 50,000.
On Monday, November 9, 2015 at 12:31:45 PM UTC+7, Hengky Sucanda wrote:
>
> Hi all,
>
> Recently my system has been hit with a "too many open files" error.
> Attached is the result of running netstat -apn .
>
> The setup is a Play Framework 2.4.2 node sending messages to sharded actors
> using ClusterSharding(system).startProxy(...).
Hi all,
Recently my system has been hit with a "too many open files" error.
Attached is the result of running netstat -apn.
The setup is a Play Framework 2.4.2 node sending messages to sharded actors
using ClusterSharding(system).startProxy(...).
The region is provided by a class called Route.
With some fixes on the client side, explicitly adding Content-Type now
works. Thanks, Viktor.
On Sun, Nov 8, 2015 at 9:28 PM, Viktor Klang wrote:
>
> http://stackoverflow.com/questions/5661596/do-i-need-a-content-type-for-http-get-requests
> On 8 Nov 2015 09:48, "Akira Hayakawa" wrote:
Hi Gary,
Yes, you should read the AsyncStage => GraphStage migration guide:
http://doc.akka.io/docs/akka-stream-and-http-experimental/2.0-M1/scala/migration-guide-1.0-2.x-scala.html#asyncstage-has-been-replaced-by-graphstage
It's the most complex translation, I think, so once you grasp that one you
should be fine.
>
> The 2.0-M1 docs have a TODO for GraphStage. In the meantime, is there
> anything out there to help start using them?
>
Gary
--
>> Read the docs: http://akka.io/docs/
>> Check the FAQ:
>> http://doc.akka.io/docs/akka/current/additional/faq.html
>>
On Sun, Nov 8, 2015 at 11:38 PM, wrote:
> >Why would you need to turn them into Seqs for that?
> Because that's just the way I did it and it seems to be working. I find
> it useful because it meets my business requirement. I'm not saying it's *the
> answer*, I'm just seeking a code critique.
On Sun, Nov 8, 2015 at 10:57 PM, wrote:
> > That sounds like a feature of questionable value
> I have multiple sources of time series data. Each source is over the same
> time range and I'd like to aggregate them into one time series. So I use
> this method then on each of the resulting source's
That sounds like a feature of questionable value; it's very easy to arrive
at OOMEs with such a method.
On Sun, Nov 8, 2015 at 9:24 PM, wrote:
> I wasn't very clear in my original requirements. In the resulting Source,
> I want its 1st element (i.e. its 1st sequence) to contain all the 1st
> elements of the input sources.
I wasn't very clear in my original requirements. In the resulting Source, I
want its 1st element (i.e. its 1st sequence) to contain all the 1st
elements of the input sources; its 2nd sequence to contain all the 2nd
elements of the input sources; and so on until the shortest input source
ends.
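Those semantics can be sketched with plain collections, using ordinary Seqs in place of Sources (the name `sequenced` and the values are made up for illustration):

```scala
// sequenced: element i of the result gathers element i of every input,
// stopping once the shortest input runs out (a length-tolerant transpose).
def sequenced[T](inputs: Seq[Seq[T]]): Seq[Seq[T]] = {
  val shortest = if (inputs.isEmpty) 0 else inputs.map(_.length).min
  (0 until shortest).map(i => inputs.map(_(i)))
}
```

For example, sequenced(Seq(Seq(1, 2, 3), Seq(4, 5))) yields Seq(Seq(1, 4), Seq(2, 5)): the third element of the longer input is dropped because the shorter input has ended.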
Hi Francesco,
You could emulate what you want using this:
scala> val (q, f) = Source.queue[Option[Int]](10, OverflowStrategy.backpressure)
     |   .takeWhile(_.isDefined)
     |   .map(_.get)
     |   .toMat(Sink.foreach(println))(Keep.both)
     |   .run()
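The effect of the Option wrapper can be modeled with plain collections (illustrative values only): pushing None acts as an in-band completion signal, and the takeWhile/map pair unwraps everything before it.

```scala
// None acts as the "complete" marker; everything after it is dropped.
val pushed = List(Some(1), Some(2), None, Some(3))
val consumed = pushed.takeWhile(_.isDefined).map(_.get)
// consumed == List(1, 2)
```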
Why not just something like the following?
val seqOfSources: Seq[Source[Int, Unit]] =
  Seq(List(1, 2), List(3, 4), List(5, 6)).map(Source(_))
val sourceOfSeqs: Source[Seq[Int], Unit] =
  Source(seqOfSources.toList).flatMapConcat(identity).grouped(100)
On Sunday, 8 November 2015 19:26:56 UTC+1, osa.
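Worth noting: that pipeline concatenates the sources end to end rather than interleaving them element-wise. A plain-collections model of the same shape (illustrative values only):

```scala
// flatMapConcat + grouped drains each source fully before the next,
// then chunks the concatenation -- it does not interleave element-wise.
val flattened = List(List(1, 2), List(3, 4), List(5, 6)).flatten
val chunked = flattened.grouped(100).toList
// chunked == List(List(1, 2, 3, 4, 5, 6))
```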
Hi,
the new Source.queue in Akka Streams 2.0-M1 offers an interesting way to
imperatively push elements into the stream without losing backpressure
information, but how can the completion signal be sent to the source?
In the case of Source.actorRef it is possible by sending a PoisonPill or a
Status.Success message.
Hello,
I've been working on a way to turn a *Seq[Source[T]]* into *Source[Seq[T]]*
using Akka Streams 2.0. I believe I've succeeded, but I'd like people to
critique my code. Thanks in advance.
object StreamUtils {
  def sequencedSource[T](sources: immutable.Seq[Source[T, Unit]]):
      Source[immutable.Seq[T], Unit]
Hello,
recently I read through the new akka-persistence-query-experimental module and
would like to see some tutorials and examples of how it can work.
The documentation is really abstract for me. Unfortunately, I'm not able to
find any Activator templates or tutorials.
What I understand is how
I am using LevelDB.
On Sunday, 8 November 2015 22:57:58 UTC+5:30, Konrad Malawski wrote:
>
> Yes it should.
> It's "delete messages to sequence nr 12".
>
> We have tests for it in the TCK, but it's up to the Journal
> implementations to correctly implement this.
> Which journal impl are you talking about?
Yes it should.
It's "delete messages to sequence nr 12".
We have tests for it in the TCK, but it's up to the Journal implementations to
correctly implement this.
Which journal impl are you talking about?
By the way, these things are very simple to check for yourself - just write a
test which does this.
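Such a check can be modeled outside Akka with plain collections (the helper name `deleteTo` is hypothetical, chosen to mirror deleteMessages's "delete up to this sequence number" contract):

```scala
// Model of deleteMessages(toSequenceNr): removes every event whose
// sequence number is <= toSequenceNr, even when toSequenceNr points
// past the end of the journal.
def deleteTo(journal: List[Long], toSequenceNr: Long): List[Long] =
  journal.filter(_ > toSequenceNr)
```

So with a journal of sequence numbers 1 to 4, deleteTo(journal, 12) leaves the journal empty, matching the "yes it should" answer above.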
if the journal has 4 messages, what would deleteMessages(12) result in?
Will it delete those 4 messages in the journal?
http://stackoverflow.com/questions/5661596/do-i-need-a-content-type-for-http-get-requests
On 8 Nov 2015 09:48, "Akira Hayakawa" wrote:
> def contentType(cth: Option[`Content-Type`]) = cth match {
>   case Some(x) ⇒ x.contentType
>   case None    ⇒ ContentTypes.`application/octet-stream`
>
def contentType(cth: Option[`Content-Type`]) = cth match {
  case Some(x) ⇒ x.contentType
  case None    ⇒ ContentTypes.`application/octet-stream`
}
I suspect this code in HttpMessageParser is what sets the harmful default
Content-Type.
I guess this "cth" is of what's p
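The defaulting behaviour of that match can be modeled with plain strings (hypothetical names, not the real akka-http types):

```scala
// Mirrors the match above: a missing Content-Type header collapses
// to application/octet-stream.
def contentTypeOrDefault(cth: Option[String]): String =
  cth.getOrElse("application/octet-stream")
```

A request sent without a Content-Type header thus gets signed by the client over an empty value, while the server-side default sees "application/octet-stream", which is exactly the mismatch described below.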
OK. I see that Content-Type is modeled into Entity.
But this is harmful if you try to implement S3-compatible storage, because
its signature calculation depends on the Content-Type value on the client side.
With some work it's revealed that akka-http puts a generic
"application/octet-stream" type