I have tried using the default queue mechanism. There too I am seeing the same
log, "using shared queue", so I am trying to understand what it actually
means.
I am also sending the same object to two different seda routes using multicast.
I want to understand whether both routes use the same object.
I am using a simple seda route and I am seeing in the log that seda is using a
shared queue:

2016-09-14 21:33:51,952 INFO SedaEndpoint:170 -
Endpoint[seda://elasticSearchStore?concurrentConsumers=5&queue=%23elasticsearchStoreLinkedQueue]
is using shared queue: seda://elasticSearchStore with size: 2147483647
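For context, that log line means every seda endpoint resolving to the same queue name hands messages through one shared in-memory BlockingQueue (2147483647 is Integer.MAX_VALUE, i.e. effectively unbounded), and each message is taken by exactly one of the concurrent consumers. A minimal plain-Java sketch of those semantics, not Camel's actual implementation:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class SharedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // One queue shared by the producer and all concurrent consumers,
        // like the single backing queue of seda://elasticSearchStore.
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(); // capacity Integer.MAX_VALUE

        queue.put("msg1");
        queue.put("msg2");

        // Each element is handed to exactly one consumer, never duplicated.
        System.out.println(queue.take());
        System.out.println(queue.take());
    }
}
```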
I have a route which ingests data into Elasticsearch, but it throws a
ConcurrentModificationException while ingesting:

from("file://somefile")
    .multicast()
    .to("seda:someroute?concurrentConsumers=8",
        "seda:elasticsearchStore?concurrentConsumers=8");
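One likely cause, stated here as an assumption rather than a diagnosis: multicast copies the Exchange for each destination, but by default the copies still reference the same body object, so when both seda consumers mutate a shared collection concurrently a ConcurrentModificationException can result. A plain-Java sketch of the shared-reference problem and a defensive deep copy (Camel's multicast offers an onPrepare hook where such a copy could be made):

```java
import java.util.ArrayList;
import java.util.List;

public class MulticastCopyDemo {
    // Default behaviour: the "copy" handed to each route is the same reference.
    public static List<String> shallowBody(List<String> body) {
        return body;
    }

    // Defensive copy: each route gets an independent list to mutate.
    public static List<String> deepBody(List<String> body) {
        return new ArrayList<>(body);
    }

    public static void main(String[] args) {
        List<String> body = new ArrayList<>(List.of("a"));

        shallowBody(body).add("b");   // mutation is visible to every route
        deepBody(body).add("c");      // mutation stays local to one route

        System.out.println(body);     // [a, b]
    }
}
```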
We are using the route below to pull data from S3:

from("aws-s3://bucketName?amazonS3Client=#awss3Client&deleteAfterRead=false&region=us-east-1&prefix=folder1/folder")
    .to("stream:out")
I am using camel aws-s3 to consume files from S3, but it is throwing the
exception below:

org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
	at ...
I am polling files from S3 using the camel aws-s3 component, but it is
downloading files that have already been downloaded. Is there any way to avoid
this behavior and keep track of the files already downloaded, so that each
poll only downloads the files newly created in S3?
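One approach, sketched under the assumption that the set of already-seen S3 keys fits in memory: record every consumed key and only pass through the new ones. This is essentially what Camel's idempotent consumer does with an IdempotentRepository; a persistent repository would be needed to survive restarts. A minimal hand-rolled version with illustrative key names:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class SeenKeysFilter {
    private final Set<String> seen = new HashSet<>();

    // Records each polled key and returns only those not seen in earlier polls.
    public List<String> newKeys(List<String> polledKeys) {
        List<String> fresh = new ArrayList<>();
        for (String key : polledKeys) {
            if (seen.add(key)) {  // add() returns false when the key was already present
                fresh.add(key);
            }
        }
        return fresh;
    }
}
```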
The camel zookeeper component is not actually creating the znode. I found the
following in the logs:
Reading reply sessionid:0x100088b818d0024, packet::
clientPath:/WVI/temp/testx.txt serverPath:/WVI/temp/testx.txt finished:false
header:: 1,5 replyHeader:: 1,154,-101 request::
I am using camel-zookeeper and I am able to get the data of the znode, but not
able to create it if it does not exist. In the logs I can see "node do not
exist. creating it", but the same znode is not available in zookeeper.
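A guess worth checking against the log above: the -101 in `replyHeader:: 1,154,-101` matches ZooKeeper's NONODE error code, and ZooKeeper's create() is not recursive, so creating /WVI/temp/testx.txt fails unless /WVI and /WVI/temp already exist. Each ancestor has to be created in order; this helper (illustrative, not part of camel-zookeeper) builds that chain:

```java
import java.util.ArrayList;
import java.util.List;

public class ZnodePaths {
    // Expands "/WVI/temp/testx.txt" into the ordered list of paths that must
    // exist before the leaf can be created: /WVI, /WVI/temp, /WVI/temp/testx.txt
    public static List<String> creationChain(String path) {
        List<String> chain = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (String segment : path.split("/")) {
            if (segment.isEmpty()) {
                continue; // skip the empty segment before the leading slash
            }
            current.append('/').append(segment);
            chain.add(current.toString());
        }
        return chain;
    }
}
```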
from("direct:start")
Is there a way to consume all the files in an S3 bucket without removing them
from S3? If we set deleteAfterRead=false then it polls the same messages again
and again. I noticed the Jira ticket for this has been resolved, but I am
still facing the same issue. I am using camel 2.15.
I have resolved the issue. It is an issue with the quartz component: Quartz
was being shut down, but the webapp didn't wait for Quartz to finish before it
shut down, so Tomcat decided that it had left threads running and complained.
So I overrode the shutdown method in QuartzComponent as follows and it
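The override itself is cut off above, so here is only the general idea it describes, sketched with a plain ExecutorService standing in for the Quartz thread pool: stop accepting work, then block until the worker threads have actually finished, so the container does not find leftover threads at undeploy time.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class GracefulShutdownDemo {
    // Stops accepting new work, then waits for in-flight jobs to finish.
    // Returns true when every worker terminated within the timeout.
    public static boolean stopAndWait(ExecutorService pool, long timeoutSeconds) {
        pool.shutdown();
        try {
            return pool.awaitTermination(timeoutSeconds, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(1);
        pool.submit(() -> { /* a simulated scheduled job */ });
        System.out.println(stopAndWait(pool, 5));
    }
}
```

Quartz's own Scheduler exposes the equivalent shutdown(true), which waits for executing jobs to complete before returning.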
I am setting the exchange body to a set of HTML lines, but when I execute the
program it shows only the static HTML, not the javascript and css.
Here is my sample code:

String htmlString = "<html><head><link rel=\"stylesheet\" type=\"text/css\" href=\"js/timeline.css\"/></head><body> Hello world </body></html>";
Thanks for your replies. I was able to manage it by using MBeans via JMX:
programmatically getting an MBeanServerConnection and stopping the cron route.
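For the archive, the mechanism looks roughly like this. Camel registers routes as MBeans exposing a stop operation; the commented ObjectName below is hypothetical and depends on the management naming of your application. The helper simply performs a no-argument JMX invoke:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxInvoker {
    // Invokes a no-argument operation on the named MBean.
    public static Object invoke(MBeanServer mbs, String name, String operation) {
        try {
            return mbs.invoke(new ObjectName(name), operation, new Object[0], new String[0]);
        } catch (Exception e) {
            throw new RuntimeException("JMX invoke failed for " + name, e);
        }
    }

    public static void main(String[] args) {
        MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
        // Hypothetical Camel route MBean name; the real one depends on your context id:
        // invoke(mbs, "org.apache.camel:context=camel-1,type=routes,name=\"cronRoute\"", "stop");
        System.out.println(mbs.getMBeanCount());
    }
}
```

For a remote process, JMXConnectorFactory.connect(...) would supply an MBeanServerConnection in place of the platform server.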
--
View this message in context:
http://camel.465427.n5.nabble.com/Is-there-any-way-to-unschedule-the-Camel-quartz-job-tp5767916p5770635.html
Sent from the Camel - Users mailing list archive at Nabble.com.
I am using camel 2.15.1. I stop the route using JMX and then restart it using
JMX. I also get the same error if I restart the entire camel context.
--
View this message in context:
http://camel.465427.n5.nabble.com/Camel-quartz-memory-leak-tp5768063p5768275.html
Sent from the Camel - Users mailing list archive at Nabble.com.
I am using camel-quartz to run a job every minute, but whenever I restart the
quartz route I get the message:

appears to have started a thread named [MyScheduler_Worker-1] but has failed
to stop it. This is very likely to create a memory leak. Stack trace of
thread: java.lang.Object.wait(Native Method)
I am triggering the quartz cron scheduler in one route. The requirement is to
stop/remove the scheduled quartz job from another route. I have tried the code
below, but in vain.

// Quartz route
from("quartz://myGroup/everyMinute?cron=0+0/1+*+*+*+?&stateful=true")
    .to("stream:out");
//Another route to
I needed to track changes to a file. I thought it would look like this:

from("file://mydirectory?noop=true&fileName=myfile.json&idempotentKey=${file:modified}-${file:size}")

If the file (myfile.json) changes then I should get an Exchange. But this is
not working; it is treating the file as new
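For reference, the intent of an idempotentKey built from ${file:modified} and ${file:size} is that the key changes whenever the file's timestamp or length changes, so an updated file no longer matches its entry in the idempotent repository. A plain-Java sketch of that key (class and method names are illustrative):

```java
import java.io.File;

public class FileChangeKey {
    // Mirrors the ${file:modified}-${file:size} expression.
    public static String key(long lastModified, long size) {
        return lastModified + "-" + size;
    }

    public static String key(File f) {
        return key(f.lastModified(), f.length());
    }
}
```

When the key stays constant across edits, the repository still considers the file consumed; when either component changes, the consumer sees a new key.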
Then having an entry in the idempotent repo is of no use. If I add content to
the current file, it will still read the file from the beginning only.
I'm attempting to consume messages from a file using an idempotent key, to
ensure that I consume the messages from the newly generated files created by
rolling over the old one.
Here is my file component route:
I tried it, but the retry kept reading the same file again and again before
the new file was generated.
--
View this message in context:
http://camel.465427.n5.nabble.com/Reading-a-file-using-camel-stream-component-tp5766108p5766160.html
Sent from the Camel - Users mailing list archive at Nabble.com.
Hi,
I am reading a file using the camel stream component. The problem is that
when the file gets rolled over, camel does not read the newly generated file
with the same name.

from("stream:file?fileName=myfile&scanStream=true&scanStreamDelay=1000")

The above route will read the current file
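One hand-rolled way to cope with rollover, offered as a workaround sketch rather than a stream-component feature: watch the file's length between polls, and treat a shrink as a signal that the file was replaced and the consumer should reopen it from the start.

```java
public class RollDetector {
    private long lastLength;

    // Reports true when the observed length is smaller than last time,
    // which for an append-only log means the file was replaced.
    public boolean rolledOver(long currentLength) {
        boolean rolled = currentLength < lastLength;
        lastLength = currentLength;
        return rolled;
    }
}
```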
Hi Jakub,
Thanks for your reply. I did the same thing, and then it worked fine.
--
View this message in context:
http://camel.465427.n5.nabble.com/camel-elasticsearch-component-tp5764662p5765540.html
Sent from the Camel - Users mailing list archive at Nabble.com.
Any quick suggestions would be very helpful...
--
View this message in context:
http://camel.465427.n5.nabble.com/camel-elasticsearch-component-tp5764662p5765090.html
Sent from the Camel - Users mailing list archive at Nabble.com.
Thanks a lot for the quick reply.
--
View this message in context:
http://camel.465427.n5.nabble.com/camel-elasticsearch-component-tp5764483p5764663.html
Sent from the Camel - Users mailing list archive at Nabble.com.
I am new to camel and am using the elasticsearch component. I want to write my
own query based on fields in elasticsearch and fetch the data. Is there any
way to do this using the camel elasticsearch component? I checked the
documentation but didn't find anything related to this.
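One option, assuming your component version accepts a raw JSON query body for its search operation (worth verifying against your Camel release): build the Elasticsearch query DSL by hand and set it as the message body. A minimal term-query builder, with hypothetical field names:

```java
public class TermQuery {
    // Produces a standard Elasticsearch term-query body, e.g.
    // {"query":{"term":{"user":"john"}}}
    public static String json(String field, String value) {
        return "{\"query\":{\"term\":{\"" + field + "\":\"" + value + "\"}}}";
    }
}
```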