Still, isn't it a bug that when stream caching is not enabled in the context,
the REST consumer first uses the cache anyway and then loads the whole input
into memory? Shouldn't the default strategy be to spool to a file after some
size limit?
BTW: I have found a way to disable
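The spool-to-file behaviour the poster asks about can be configured explicitly today. A minimal Spring XML sketch, assuming the threshold value and spool directory are illustrative choices, not defaults:

```xml
<!-- enable stream caching at context level and spool large payloads to disk -->
<camelContext streamCache="true" xmlns="http://camel.apache.org/schema/spring">
  <!-- streams larger than 128 KB (illustrative value) are moved from memory to file -->
  <streamCaching id="myCacheConfig"
                 spoolThreshold="131072"
                 spoolDirectory="/tmp/cachedir"/>
  <!-- routes go here -->
</camelContext>
```

With this in place, payloads below the threshold stay in memory and larger ones are spooled, which is the behaviour the poster suggests should be the default.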
Hello,
I am still very new to using Apache Camel and am trying to adapt an example
provided by someone (https://www.javainuse.com/camel/camel-consume-rest) to
essentially do the same thing the original application does, except by using a
combination of Spring Boot and Camel with XML route confi
Hi
Been struggling with this for a long time now and I can’t seem to get it right.
I process transactions from a JMS queue, +1/24h.
I need to aggregate them before sending them further down the chain.
Whenever I receive a transaction with the same transactionNumber I have to
check what h
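Correlating JMS messages by a header like this is the Aggregator EIP. A minimal Spring XML sketch, assuming the queue name, the bean id `txAggregationStrategy`, and the one-minute timeout are placeholders for the poster's actual setup:

```xml
<route>
  <from uri="jms:queue:transactions"/>
  <!-- group messages that share the same transactionNumber header -->
  <aggregate strategyRef="txAggregationStrategy" completionTimeout="60000">
    <correlationExpression>
      <header>transactionNumber</header>
    </correlationExpression>
    <!-- the aggregated exchange is sent further down the chain -->
    <to uri="direct:downstream"/>
  </aggregate>
</route>
```

The referenced `txAggregationStrategy` bean would implement Camel's `AggregationStrategy` interface and decide how two exchanges with the same transactionNumber are merged.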
Hi Claus,
I am not sure I understand it fully: We use separate xml files for each
customer which we keep in the database. With these we start a separate
CamelContext instance from within our code (war file). So, even though
we have multiple xml files, all Camel contexts are created and
contro
I have created https://issues.apache.org/jira/browse/CAMEL-14823
Thanks.
From: Claus Ibsen
Sent: Wednesday, April 1, 2020 12:11
To: users@camel.apache.org
Subject: [EXTERNAL] - Re: stream caching not configurable for rest consumer?
Hi
Ah yeah looks like the cod
Hi
Ah yeah, looks like the code should be identical for those 2 (or maybe
just one common method, as it takes an InputStream as input).
And then they should check the CamelContext for whether stream caching is in
use or not. Then you can turn it on|off at context level and have it work.
Can you create a new JIRA tic
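Turning stream caching on or off at context level, as suggested here, is done with the `streamCache` attribute in Spring XML; individual routes can override it. A small sketch (route contents are placeholders):

```xml
<!-- stream caching toggled once at context level; routes inherit it -->
<camelContext streamCache="true" xmlns="http://camel.apache.org/schema/spring">
  <!-- a single route can still opt out of the context-level setting -->
  <route streamCache="false">
    <from uri="direct:start"/>
    <to uri="mock:result"/>
  </route>
</camelContext>
```

The fix discussed in the thread is for the REST consumer to honour this context-level flag instead of enabling caching on its own.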
Hi,
I am trying to fix the upload of huge files via a Camel REST endpoint. The
problem is that stream caching enables itself in this case (without being
enabled in the Camel context or the route), but when Camel asks a
StreamCachingStrategy whether the data should be moved from memory to file, it
always says
Hi
You can run as many camel contexts as you like in a JVM.
For example on WildFly or Karaf you can just deploy multiple XML files
(deployments) each with their own .
In other words, manage each customer in their own XML file and keep them
separated.
Then each deployment is per customer and they c
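The one-deployment-per-customer idea can be sketched as one Spring XML file per customer, each declaring its own CamelContext. The context id, file paths, and queue name below are hypothetical examples:

```xml
<!-- customer-a-camel-context.xml : one such file per customer/deployment -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://camel.apache.org/schema/spring
                           http://camel.apache.org/schema/spring/camel-spring.xsd">

  <!-- each customer gets an isolated CamelContext with its own routes -->
  <camelContext id="customer-a" xmlns="http://camel.apache.org/schema/spring">
    <route>
      <from uri="file:data/customer-a/in"/>
      <to uri="jms:queue:customerA"/>
    </route>
  </camelContext>
</beans>
```

Deploying several such files to the same container gives multiple independent Camel contexts in one JVM, which can be started and stopped per customer.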
Hi Raymond,
thank you for your answer. We have dismissed the option you mention
because, although that would separate Camel contexts, it would introduce
a plethora of new problems:
We would need to interact with the operating system (OS) to create,
remove, start, stop, restart and monitor the
Please use a newer version, 2.19.0 is really old and we don't release that
branch anymore.
--
Andrea Cosentino
--
Apache Camel PMC Chair
Apache Karaf Committer
Apache Servicemix PMC Member
Email: ancosen1...@yahoo.com
Twitter: @oscerd2
Github: oscerd
On Wedn
Hello,
I am using the Apache Camel Spring DSL to connect to Kafka topics. Everything
works fine with a non-SSL Kafka cluster, but the moment I switch to an SSL
cluster, the JVM heap keeps increasing and eventually kills the
service after some time. However, I am able to post and consume messages
suc