Camel HTTPS - timing out

2017-06-29 Thread ganga_camel
Hi,

I have a requirement to download data from a REST API, and I am using an https
endpoint to call it.

The code is listed below. When I access the REST API through Postman it works
fine and downloads a 50 MB file, whereas when I access the API through the
code listed below, it always times out.

Am I missing something? Any help or guidance is highly appreciated.

public void configure() throws Exception {
    from("timer://foo?fixedRate=true&period=60")
        .routeId("Locations file")
        .log(LoggingLevel.INFO, "File fetch route started")
        .removeHeaders("*")
        .setHeader("CamelHttpMethod", constant("GET"))
        .setHeader(Exchange.HTTP_URI,
            constant("https://apiurl?key=keyvalue&httpClient.soTimeout=1200"))
        .to("http4://myhost.com")
        .log(LoggingLevel.INFO, "File fetch started")
        // note: file endpoint options belong after a '?', not a '/'
        .to("file:Out1?fileName=loc.xml&noop=true")
        .end();

    //  My properties file has this url
}
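As far as I can tell, httpClient.* options only take effect when they are part of the http4 endpoint URI itself; placed inside the Exchange.HTTP_URI header they are simply sent to the remote API as an ordinary query parameter. Below is a minimal sketch of the same route with the socket timeout configured on the endpoint instead — the host, key, one-minute timer period and 5-minute timeout are placeholder values, assuming Camel 2.x camel-http4 where httpClient.socketTimeout maps to Apache HttpClient's socket timeout:

import org.apache.camel.Exchange;
import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;

public class LocationsFileRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // period is in milliseconds; 60000 = once a minute (assumed intent)
        from("timer://foo?fixedRate=true&period=60000")
            .routeId("Locations file")
            .removeHeaders("*")
            .setHeader(Exchange.HTTP_METHOD, constant("GET"))
            // socket timeout in milliseconds; 300000 = 5 minutes (placeholder)
            .to("https4://myhost.com/api?key=keyvalue&httpClient.socketTimeout=300000")
            .log(LoggingLevel.INFO, "File fetch finished")
            .to("file:Out1?fileName=loc.xml");
    }
}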

Thanks



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-HTTPS-timing-out-tp5805429.html
Sent from the Camel - Users mailing list archive at Nabble.com.


File readlock on multiple files in a directory

2017-05-05 Thread ganga_camel
Hi,

I have a scenario where I need to SFTP files from a directory to which files
are being written by another route. I am using readLock=changed on the file
consumer for that directory. However, the read lock is taken per file, not on
all the files. Hence, as soon as the file component gets the read lock on any
one of the files in the directory, it triggers the SFTP step, which results in
a partial transfer of the files to the remote server.

Is there a way in Camel to obtain a lock on the whole directory, so that the
next step is triggered only once writing to all the files in that directory is
complete?
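As far as I know the file component has no directory-level read lock, but one pattern that may fit is to have the producing route drop a marker file once it has written everything, and make the consumer wait for it via the doneFileName option. A rough sketch, with made-up directory names and a fixed ready.done marker (the exact semantics of a shared done file are worth double-checking in the file component docs for your Camel version):

import org.apache.camel.builder.RouteBuilder;

public class SftpAfterDoneFileRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Files are only picked up once the writer has dropped "ready.done"
        // into the same directory, signalling that all files are complete.
        from("file:data/outbox?doneFileName=ready.done&readLock=changed&delete=true")
            .routeId("UploadWhenDone")
            .to("sftp://user@remotehost/upload?password=secret");
    }
}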

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/File-readlock-on-multiple-files-in-a-directory-tp5798696.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel File: Produce files based on size

2017-02-27 Thread ganga_camel
Hi All,

I am working on a use case where the need is to create files based on size.
To elaborate: the file component should roll over to a new file every time the
file currently being written exceeds a particular size, say 2 GB.

So, if I am in the process of writing to a file and it reaches 2 GB, a new
file should be created, and so on, until all the records are processed.

From the Camel documentation I could not find any clue on how to implement
this logic. Is it possible to implement this requirement in Camel, or should
we combine it with vanilla Java code?

Please advise.
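As far as I can tell there is no built-in "roll over at N bytes" option on the file producer, so one possible sketch is to compute the target file name in a processor and let the file endpoint pick it up from the CamelFileName header. The 2 GB threshold, endpoint names and byte counting below are assumptions (and the counter presumes a single-threaded route):

import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;

import org.apache.camel.Exchange;
import org.apache.camel.builder.RouteBuilder;

public class SizeBasedRollingRoute extends RouteBuilder {

    private static final long MAX_BYTES = 2L * 1024 * 1024 * 1024; // assumed 2 GB threshold
    private final AtomicLong bytesInCurrentFile = new AtomicLong();
    private final AtomicInteger fileIndex = new AtomicInteger(1);

    @Override
    public void configure() throws Exception {
        from("direct:records")
            .process(exchange -> {
                byte[] body = exchange.getIn().getBody(byte[].class);
                long size = body == null ? 0 : body.length;
                // roll to a new file name once the running total exceeds the threshold
                if (bytesInCurrentFile.addAndGet(size) > MAX_BYTES) {
                    fileIndex.incrementAndGet();
                    bytesInCurrentFile.set(size);
                }
                exchange.getIn().setHeader(Exchange.FILE_NAME,
                        "records-part-" + fileIndex.get() + ".dat");
            })
            .to("file:target/output?fileExist=Append");
    }
}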

Thanks,
Ganga





--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-File-Produce-files-based-on-size-tp5794522.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel Cache - Trigger an event when the cache expires

2017-02-07 Thread ganga_camel
Hi,

I have a scenario wherein I need to perform a set of operations when the
cache expires (based on the timeToLive set). As part of this operation, the
cache will be repopulated with the new value.

From the documentation I could not find anything specific. Is this possible,
and if yes, any guidance is appreciated.
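I am not aware of an out-of-the-box "on expiry" callback exposed by the camel-cache (EHCache) component, but one workaround sketch is to refresh the entry proactively from a timer route shortly before the timeToLive elapses. The cache name, key, intervals and the setBody placeholder below are all assumptions:

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.cache.CacheConstants;

public class CacheRefreshRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // refresh every 50 minutes, assuming a 60-minute timeToLive on the cache
        from("timer://cacheRefresh?period=3000000")
            .routeId("RefreshMyCacheEntry")
            .setBody(constant("newValue")) // placeholder for the real re-computation
            .setHeader(CacheConstants.CACHE_OPERATION, constant(CacheConstants.CACHE_OPERATION_ADD))
            .setHeader(CacheConstants.CACHE_KEY, constant("myKey"))
            .to("cache://myCacheName?timeToLiveSeconds=3600");
    }
}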

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Cache-Trigger-an-event-when-the-cache-expires-tp5793574.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel Aggregator - Partial Data written to file

2017-01-29 Thread ganga_camel
Hi,

I have written a piece of code that performs the below operations:

1. Reads multiple files from an input folder
2. Applies XSLT to each file
3. Aggregates the transformed data
4. Writes it to a file in append mode

The problem I am facing is that when all of the transformed data (from all the
multiple files) is written into a single file in append mode, a lot of partial
data ends up in the output file.

However, if I write the transformed output of each input file into individual
files instead of a single file, there is no data loss.

Somehow, aggregating the data and writing it into a single file in append mode
is causing data loss.

My code looks something like this:

from("file:APIDataSamples?noop=true&maxMessagesPerPoll=1")
    .log("working on Vendor")
    .to("xslt:xslt/VendorTest.xsl")
    .aggregate(new Aggregaterecords()).constant(true)
        .completionTimeout(1500).completionSize(750)
    .to("file:test/output/OutputFiles/final?fileName=finalOutput.csv&fileExist=Append")
    .end();
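One thing that might be worth trying, sketched below: completionFromBatchConsumer() completes the aggregation when the file consumer has finished its current batch, so the aggregated body is written once rather than racing a timeout against an appending file producer. This assumes maxMessagesPerPoll is left at its default so the whole directory forms one batch, and Aggregaterecords stands in for your own strategy:

from("file:APIDataSamples?noop=true")
    .log("working on Vendor")
    .to("xslt:xslt/VendorTest.xsl")
    // correlate everything into one group; complete when the polled batch is done
    .aggregate(constant(true), new Aggregaterecords())
        .completionFromBatchConsumer()
    .to("file:test/output/OutputFiles/final?fileName=finalOutput.csv")
    .end();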

Appreciate any help or guidance.

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Aggregator-Partial-Data-written-to-file-tp5793153.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Accessing Websphere MQ using credentials

2016-10-24 Thread ganga_camel
Hi,

I am trying to connect to WMQ using a username and password. As per one of the
posts in the user group, I tried the below code for creating a wmq component:

@Bean
MQQueueConnectionFactory jmsConnectionFactory()
{
jmsConnectionFactory = new MQQueueConnectionFactory();
try {
jmsConnectionFactory.setPort(port);
jmsConnectionFactory.setChannel("channel");
jmsConnectionFactory.setHostName("host");
jmsConnectionFactory.setQueueManager("QM");
jmsConnectionFactory.setTransportType(type);
}catch (Exception e){}
return jmsConnectionFactory;
}

@Bean
UserCredentialsConnectionFactoryAdapter adapter(){
adapter = new UserCredentialsConnectionFactoryAdapter();
adapter.setTargetConnectionFactory(jmsConnectionFactory);
adapter.setUsername("username");
adapter.setPassword("password");

return adapter;
}

@Bean
JmsComponent wmq()
{
JmsComponent wmq=new JmsComponent();
wmq.setConnectionFactory(adapter);
return wmq;
}

and my route looks like

from("wmq:queue:queueName")
.log(LoggingLevel.INFO, "Message read from WMQ is ${body}")
.end();

When I run my route, I get the below error

Caused by: org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.boot.autoconfigure.jms.JmsAutoConfiguration': Injection
of autowired dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not autowire
field: private javax.jms.ConnectionFactory
org.springframework.boot.autoconfigure.jms.JmsAutoConfiguration.connectionFactory;
nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No
qualifying bean of type [javax.jms.ConnectionFactory] is defined: expected
single matching bean but found 2: jmsConnectionFactory,adapter
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:334) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1214) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:368) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1123) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1018) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:510) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(Abst
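The exception says two javax.jms.ConnectionFactory beans exist (jmsConnectionFactory and adapter), so Spring Boot's JMS auto-configuration cannot choose between them. One possible way out, sketched under the assumption of Spring 4.x / Boot 1.x as in the stack trace, is to mark the credentials adapter as @Primary and inject it into the JmsComponent (host, port, channel and credentials are placeholders):

import com.ibm.mq.jms.MQQueueConnectionFactory;
import org.apache.camel.component.jms.JmsComponent;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jms.connection.UserCredentialsConnectionFactoryAdapter;

@Configuration
public class WmqConfig {

    @Bean
    MQQueueConnectionFactory mqConnectionFactory() throws Exception {
        MQQueueConnectionFactory cf = new MQQueueConnectionFactory();
        cf.setHostName("host");        // placeholder values
        cf.setPort(1414);
        cf.setChannel("channel");
        cf.setQueueManager("QM");
        cf.setTransportType(1);        // assumed client transport type
        return cf;
    }

    @Bean
    @Primary   // lets the auto-configuration pick this ConnectionFactory unambiguously
    UserCredentialsConnectionFactoryAdapter credentialsAdapter(MQQueueConnectionFactory mqConnectionFactory) {
        UserCredentialsConnectionFactoryAdapter adapter = new UserCredentialsConnectionFactoryAdapter();
        adapter.setTargetConnectionFactory(mqConnectionFactory);
        adapter.setUsername("username");
        adapter.setPassword("password");
        return adapter;
    }

    @Bean
    JmsComponent wmq(UserCredentialsConnectionFactoryAdapter credentialsAdapter) {
        JmsComponent wmq = new JmsComponent();
        wmq.setConnectionFactory(credentialsAdapter);
        return wmq;
    }
}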

RE: Rest DSL with restlet Component, unable to access API when hosted on remote server

2016-10-14 Thread ganga_camel
This is the error message that I get

Could not get any response
There was an error connecting to
http://:8081/bnt?9780226519791,9780415762564.

Why this might have happened:
The server couldn't send a response:
Ensure that the backend is working properly
SSL connections are being blocked:
Fix this by importing SSL certificates in Chrome
Cookies not being sent:
Use the Postman Interceptor extension
Request timeout:
Change request timeout in Settings > General

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Rest-DSL-with-restlet-Component-unable-to-access-API-when-hosted-on-remote-server-tp5788789p5788794.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Rest DSL with restlet Component, unable to access API when hosted on remote server

2016-10-14 Thread ganga_camel
Hi,

I have a camel route with Rest DSL using the restlet component, configured as
listed below:

restConfiguration().component("restlet").host("{{hostname}}").port("{{port}}")
    .dataFormatProperty("prettyPrint", "true");

rest("/books").consumes(MediaType.ALL.toString())
    .produces(MediaType.ALL.toString())
    .post().to("direct:ProcessData");

When I run the camel service on my local machine with the port set to anything
other than 8080 (e.g. 9091, 8081, etc.), I am able to successfully post to the
Rest API. However, when I deploy the rest API code on the remote server to
listen on port 9091 or 8081, etc. (anything other than port 8080), I am unable
to post to the rest API. But I am able to successfully post to the rest API
deployed on the remote server when the port is set to 8080.

This behavior is observed only when the rest API is deployed on the remote
host; it works fine with any port number when the rest API is running on the
local machine.

Any suggestions would be highly appreciated.

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Rest-DSL-with-restlet-Component-unable-to-access-API-when-hosted-on-remote-server-tp5788789.html
Sent from the Camel - Users mailing list archive at Nabble.com.

Re: Read file one after other in sequence

2016-08-09 Thread ganga_camel
Thanks for the inputs. It works as expected now...

Thanks Vitalii.



--
View this message in context: 
http://camel.465427.n5.nabble.com/Read-file-one-after-other-in-sequence-tp5786216p5786223.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Read file one after other in sequence

2016-08-09 Thread ganga_camel
Hi,

My requirement is to split a large file into smaller files and then process
one file at a time.

However, the Camel file component takes a lock on all the split files and
starts processing them. This consumes all of the CPU, memory usage goes high,
and at one point I get an OutOfMemoryException.

My route looks like

from("file:data/ReadyForProcess?delete=true")
.routeId("FetchFilesForProcessing")
.to("seda:RemoveDeletes")
.end();

I wanted to know if there is a way I can configure my route to obtain a lock
on one file at a time and process the files one after another.

Any suggestions are highly appreciated...
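One way to throttle this, sketched below, is to limit the consumer to one file per poll and keep the handoff synchronous with direct: instead of seda:, so the next file is only picked up after the current one has been fully processed (the seda handoff in the route above lets the consumer race ahead of the processing). The option names are from the file component; the sort order is just an example:

from("file:data/ReadyForProcess?delete=true&maxMessagesPerPoll=1&eagerMaxMessagesPerPoll=false&sortBy=file:name")
    .routeId("FetchFilesForProcessing")
    .to("direct:RemoveDeletes")   // direct keeps processing in the same thread, one file at a time
    .end();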

Thanks,
Ganga 



--
View this message in context: 
http://camel.465427.n5.nabble.com/Read-file-one-after-other-in-sequence-tp5786216.html
Sent from the Camel - Users mailing list archive at Nabble.com.


camel xslt parsing not working with Camel 2.17.1 - Nullpointer Exception

2016-06-24 Thread ganga_camel
Hi,

I have a camel route which will transform the incoming XML to a specified
format. I am using the Camel xslt component for this.

Everything was working fine when I was using Camel 2.16.2. When I upgraded to
Camel 2.17.1 the XML transformation stopped working with the below error. I am
using the camel-saxon maven dependency...

Any help would be highly appreciated, as we have a prod deployment coming up
in a week's time.

2016-06-24 17:25:26 INFO  o.a.c.converter.jaxp.StaxConverter - Created
XMLInputFactory: com.sun.xml.internal.stream.XMLInputFactoryImpl@4398c559.
DOMSource/DOMResult may have issues with
com.sun.xml.internal.stream.XMLInputFactoryImpl@4398c559. We suggest using
Woodstox.
java.lang.NullPointerException
at net.sf.saxon.event.ReceivingContentHandler.startPrefixMapping(ReceivingContentHandler.java:256)
at org.apache.camel.converter.jaxp.StAX2SAXSource.parse(StAX2SAXSource.java:140)
at org.apache.camel.converter.jaxp.StAX2SAXSource.parse(StAX2SAXSource.java:343)
at net.sf.saxon.event.Sender.sendSAXSource(Sender.java:396)
at net.sf.saxon.event.Sender.send(Sender.java:143)
at net.sf.saxon.Controller.transform(Controller.java:1890)
at org.apache.camel.builder.xml.XsltBuilder.process(XsltBuilder.java:142)
at org.apache.camel.impl.ProcessorEndpoint.onExchange(ProcessorEndpoint.java:103)
at org.apache.camel.component.xslt.XsltEndpoint.onExchange(XsltEndpoint.java:128)
at org.apache.camel.impl.ProcessorEndpoint$1.process(ProcessorEndpoint.java:71)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:145)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:468)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:121)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.ChoiceProcessor.process(ChoiceProcessor.java:117)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:468)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:121)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.seda.SedaConsumer.sendToConsumers(SedaConsumer.java:298)
at org.apache.camel.component.seda.SedaConsumer.doRun(SedaConsumer.java:207)
at org.apache.camel.component.seda.SedaConsumer.run(SedaConsumer.java:154)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/camel-xslt-parsing-not-working-with-Camel-2-17-1-Nullpointer-Exception-tp5784368.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: How to Mock Kafka Consumer Endpoint with Spock Framework Unit Test cases

2016-06-12 Thread ganga_camel
Thanks Steve for your inputs. I changed the end point to seda from mock and
things fell in place, the test ran fine

-Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/How-to-Mock-Kafka-Consumer-Endpoint-with-Spock-Framework-Unit-Test-cases-tp5783856p5783952.html
Sent from the Camel - Users mailing list archive at Nabble.com.


How to Mock Kafka Consumer Endpoint with Spock Framework Unit Test cases

2016-06-10 Thread ganga_camel
Hi,

I am using Spock Framework to perform Unit Testing of Camel Routes. My first
route consumes from Kafka Consumer Endpoint. I need to mock this end point
and send the message to mock endpoint as part of the unit testing, below is
how I have tried

camelctx.getRouteDefinitions().get(0).adviceWith(camelctx, new
AdviceWithRouteBuilder() {
@Override
void configure() throws Exception {
   
interceptSendToEndpoint("kafka:{{server}}?topic={{topic}}&zookeeperHost={{zookeeper}}&zookeeperPort={{zookeeperport}}&groupId={{group}}&autoCommitEnable=true&autoOffsetReset=smallest&autoCommitIntervalMs=1000")
.skipSendToOriginalEndpoint()
.to("mock:kafkaCon")
}
})

Here I get the error "java.lang.NullPointerException: Cannot invoke method
getIn() on null object" .

I have also tried 
 camelctx.getRouteDefinitions().get(0).adviceWith(camelctx, new
AdviceWithRouteBuilder() {
@Override
void configure() throws Exception {
   replaceFromWith("mock:kafkaCon")
}
})
I get the error "java.lang.UnsupportedOperationException: You cannot consume
from this endpoint"

Any suggestion on how to mock the Kafka consumer endpoint would be welcome,
and please advise if I am going about it the wrong way.
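For what it is worth, a mock: endpoint cannot be consumed from, which would explain the UnsupportedOperationException. A common sketch (shown in plain Java; direct:kafkaIn is a made-up name) is to replace the Kafka consumer with a direct: or seda: endpoint in adviceWith and then feed test messages into it with a ProducerTemplate:

camelctx.getRouteDefinitions().get(0).adviceWith(camelctx, new AdviceWithRouteBuilder() {
    @Override
    public void configure() throws Exception {
        // swap the kafka: consumer for an endpoint that can be consumed from in tests
        replaceFromWith("direct:kafkaIn");
    }
});
camelctx.start();

// drive the route from the test
ProducerTemplate template = camelctx.createProducerTemplate();
template.sendBody("direct:kafkaIn", "test message");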

-Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/How-to-Mock-Kafka-Consumer-Endpoint-with-Spock-Framework-Unit-Test-cases-tp5783856.html
Sent from the Camel - Users mailing list archive at Nabble.com.


xalan 2.7.2 working in IntelliJ IDE and not as .jar

2016-06-09 Thread ganga_camel
Hi,

I am trying to transform xml using XSLT in Camel. The dependency I have is 
compile group: 'xalan', name: 'xalan', version: '2.7.2'

I am trying to perform this step in xslt 

 




I am using this variable later in my XSLT for further transformation.

When I run my Camel routes through the IntelliJ IDE everything runs fine;
however, when I build a .jar file out of it and run it from the command line I
get the below error:

/*"javax.xml.transform.TransformerException: "select" attribute is not
allowed on the xsl:sequence element!"*/

Below is the stack trace of the error. Is there something that I am missing in
the dependencies?

Any help is highly appreciated

2016-06-09 16:32:28.978  INFO 11904 --- [main]
o.a.camel.spring.SpringCamelContext  : Apache Camel 2.16.2
(CamelContext: camel-1) is shutting down
2016-06-09 16:32:28.985  INFO 11904 --- [main]
o.a.camel.spring.SpringCamelContext  : Apache Camel 2.16.2
(CamelContext: camel-1) uptime 1.150 seconds
2016-06-09 16:32:28.986  INFO 11904 --- [main]
o.a.camel.spring.SpringCamelContext  : Apache Camel 2.16.2
(CamelContext: camel-1) is shutdown in 0.006 seconds
2016-06-09 16:32:28.993  INFO 11904 --- [main]
o.e.jetty.server.handler.ContextHandler  : Stopped
o.s.b.c.e.j.JettyEmbeddedWebAppContext@7cf8f87b{/,jar:file:/C:/Users/z062335/IdeaProjects/IDL-Loader/build/libs/IDL-Loader-1.0-SNAPSHOT.war!/,UNAVAILABLE}
2016-06-09 16:32:29.001 ERROR 11904 --- [main] o.s.boot.SpringApplication : Application startup failed

org.apache.camel.spring.boot.CamelSpringBootInitializationException:
org.apache.camel.FailedToCreateRouteException: Failed to create route
activeProductsRoute at: >>> Choice[[When[xpath{//ContentCafe/Message/text()
= 'Valid Item'} -> [Log[Valid Item; sending to IDL Queue],
SetBody[simple{Simple: ${property.apiData}}], To[xslt:{{bntToIdlFormat}}],
To[file:src/data?fileName=IDLXml.xml&fileExist=Append],
To[seda:writetoIdlQ Otherwise[[Log[Rejected Item, sending to reject Q],
To[seda:rejectRecords <<< in route:
Route(activeProductsRoute)[[From[seda:processActiveProducts]... because of
Failed to resolve endpoint: xslt://xslt/MMBIDLFormat.xsl due to:
javax.xml.transform.TransformerException: org.xml.sax.SAXException: "select"
attribute is not allowed on the xsl:sequence element!
javax.xml.transform.TransformerException: "select" attribute is not allowed
on the xsl:sequence element!
at org.apache.camel.spring.boot.RoutesCollector.onApplicationEvent(RoutesCollector.java:94)
at org.apache.camel.spring.boot.RoutesCollector.onApplicationEvent(RoutesCollector.java:38)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:163)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:136)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:381)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:335)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:855)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.finishRefresh(EmbeddedWebApplicationContext.java:140)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:118)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:766)
at org.springframework.boot.SpringApplication.createAndRefreshContext(SpringApplication.java:361)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307)
at com.tci.item.idlLoader.Application.main(Application.java:21)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:54)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.camel.FailedToCreateRouteException: Failed to create
route activeProductsRoute at: >>>
Choice[[When[xpath{//ContentCafe/Message/text() = 'Valid Item'} ->
[Log[Valid Item; sending to IDL Queue], SetBody[simple{Simple:
${property.apiData}}], To[xslt:{{bntToIdlFormat}}],
To[file:src/data?fileName=IDLXml.xml&fileExist=Append],
To[seda:writetoIdlQ Otherwise[[Log[Rejected Item, sending to reject Q],
To[seda:rejectRecords <<< in route:
Route(activeProductsRoute)[[From[seda:processActiveProducts]... because of

camel-ftp - Limitation or performance hit on Huge file transfers

2016-05-26 Thread ganga_camel
Hi,

I am exploring the camel-ftp component. My requirement is to FTP a file that
is several GBs in size from a server in one data center to another data
center.

I wanted to know if the camel-ftp component has any limitation or poses any
performance hit while transferring a file of such huge size.

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/camel-ftp-Limitation-or-performance-hit-on-Huge-file-transfers-tp5783111.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Caching POJOs (no serializable and no InputStream)

2016-03-03 Thread ganga_camel
Hi,

By having the POJO implement Serializable, I was able to successfully add
POJOs into the cache and retrieve them without any type conversion.
-Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Caching-POJOs-no-serializable-and-no-InputStream-tp5738138p5778543.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Caching POJOs (no serializable and no InputStream)

2016-03-03 Thread ganga_camel
Hi,

Was a patch implemented for this issue? I also have a case where I need to add
POJO objects into the cache, and I am not able to do so unless I convert the
POJO to a String.

If no fix was implemented, is there a workaround to achieve this?

Appreciate response.



--
View this message in context: 
http://camel.465427.n5.nabble.com/Caching-POJOs-no-serializable-and-no-InputStream-tp5738138p5778541.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Camel Spring Boot - Adding JmsComponent for WMQ to Camel Context

2016-02-11 Thread ganga_camel
Sure, I am new to Spring annotations; will explore further... Thanks for the
insight...

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Spring-Boot-Adding-JmsComponent-for-WMQ-to-Camel-Context-tp5777593p5777627.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Camel Spring Boot - Adding JmsComponent for WMQ to Camel Context

2016-02-11 Thread ganga_camel
Thanks for the response. I posted my finding just before I saw your response.
@Bean worked for me...

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Spring-Boot-Adding-JmsComponent-for-WMQ-to-Camel-Context-tp5777593p5777597.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Camel Spring Boot - Adding JmsComponent for WMQ to Camel Context

2016-02-11 Thread ganga_camel
I found the answer

I had to create a method that returns the JmsComponent and annotate it with
@Bean. The @Bean annotation is the equivalent of a <bean> definition in
camel-context.xml.

The name of the method is the bean name, which I used in my routes to
communicate with the queues.

Listed below; this will register a component by the name "wmq". I referenced
this id in my route as listed below:

from("file:src/data?noop=true").to("wmq:QName").log("writing to Queue is
complete");

@Bean
JmsComponent wmq() {
JmsComponent jmsComp = new JmsComponent();
try {

MQQueueConnectionFactory jmsConF = new 
MQQueueConnectionFactory();
jmsConF.setHostName("<>");
jmsConF.setQueueManager("<>");
jmsConF.setPort(<>);
jmsConF.setChannel("<>");
jmsConF.setTransportType(<>);

JmsTransactionManager trsnMgr = new 
JmsTransactionManager(jmsConF);

jmsComp.setConnectionFactory(jmsConF);
jmsComp.setTransactionManager(trsnMgr);
jmsComp.setTransacted(true);
} catch (Exception e) {
e.printStackTrace();
}

return jmsComp;
}



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Spring-Boot-Adding-JmsComponent-for-WMQ-to-Camel-Context-tp5777593p5777596.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel Spring Boot - Adding JmsComponent for WMQ to Camel Context

2016-02-11 Thread ganga_camel
Hi,

I am exploring Spring Boot to write the Camel routes... I started off by
creating a Maven project using the camel-archetype-spring-boot archetype.

The project contained 2 classes:
1. a class that extends FatJarRouter, which houses the camel routes
2. a class named WarInitializer which extends FatWarInitializer, which I
understand is the starting point that triggers the class containing the routes

The requirement that I am working on is to add a WebSphere MQ component to
the Camel Context so that I will be able to use it in my routes to
communicate to the WMQ Queues.

Since there is neither a Main class nor a camel-context.xml, I am left
wondering where to register the component in the Camel context.

Any help would be appreciated.

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Spring-Boot-Adding-JmsComponent-for-WMQ-to-Camel-Context-tp5777593.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Camel Spring boots

2016-02-11 Thread ganga_camel
Use the following archetype when creating the project:

GroupId=org.apache.camel.archetypes
Artifactid=camel-archetype-spring-boot
version=2.16.1

This will create a Spring Boot Template to run the camel routes.



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Spring-boots-tp5773560p5777591.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel Routes: Identify end of Processing in a Route

2015-12-17 Thread ganga_camel
Hi,

I am new to Camel. I understand that Camel routes, once started, remain active
and in running status unless they are manually stopped.

I have a requirement where the route should stop automatically after the
processing of all the records is complete.

Is there a way we can programmatically force-stop the route once all the
record processing is complete?
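One common sketch for this is to let the route ask the control bus to stop it once the "all done" condition is seen; stopping a route directly from its own thread can block, which is why async=true is used. This assumes the route is fed by a batch consumer such as the file component, which sets the CamelBatchComplete exchange property on the last exchange of a poll (route id and endpoints are placeholders):

from("file:data/in?delete=true")
    .routeId("recordsRoute")
    .process(exchange -> {
        // process the record here
    })
    .choice()
        .when(exchangeProperty("CamelBatchComplete").isEqualTo(true))
            // ask Camel to stop this route asynchronously once the last record of the batch is done
            .to("controlbus:route?routeId=recordsRoute&action=stop&async=true")
    .end();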

Thanks,
Ganga





--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Routes-Identify-end-of-Processing-in-a-Route-tp5775224.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel Aggregator: Using up the complete heap

2015-12-09 Thread ganga_camel
Hi, 

I have a camel route which splits a file on the newline character and writes
the data into 3 different files based on some business logic.

The file is 135 MB in size. While running the route, the observation is that
the initial processing is pretty good; however, as the records get processed
the aggregator constantly fills up the heap, until at one point the heap is
full, the processing gets dead slow, and it comes to a complete HOLD.

The aggregator configuration is as below. Is there a way to ensure the heap is
released as and when the records are handed off to the seda block?




true





-Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Aggregator-Using-up-the-complete-heap-tp5774854.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Camel: Processing Large fixed width files (no header)

2015-12-07 Thread ganga_camel
Hi,

I have a requirement to read a file (fixed width, without headers), filter out
rejects, then append headers, zip it, and send it downstream. The output file
should have headers added to it.

The file sizes I will need to deal with are in the GBs, more than 10 GB.

What is the best strategy to handle a 10 GB file in Camel?
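For files of that size the main concern is usually to avoid loading the whole payload into memory. A common sketch is to stream the file and split it record by record with a streaming splitter, so only one record is held at a time; the endpoint names and the direct:filterAndTransform step below are placeholders:

from("file:data/in?noop=true")
    .split(body().tokenize("\n")).streaming()
        .to("direct:filterAndTransform")   // placeholder for the reject filtering / transformation
    .end();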

Thanks,
Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Processing-Large-fixed-width-files-no-header-tp5774800.html
Sent from the Camel - Users mailing list archive at Nabble.com.


How to identify the last record that is getting processed?

2015-12-07 Thread ganga_camel
Hi,

I am trying to process a 10GB file (fixed width file) by performing the
below steps
1. Split the file per record based on the newline








2. Filter and transform the data, resulting in 3 different CSV files
(| delimited)
3. Add a header column to all 3 files
4. Zip the end result into a GZip file

The problem I am facing is in steps 3 and 4.
The code below is how I am trying to aggregate and write to a file. I add the
header before writing the data into the file; however, the header gets added
multiple times within the output file, once every 30 seconds, as the
aggregation completionInterval is set to "3". I am facing the same issue with
creating the ZIP file.

Is there a way to identify the last record being processed and store a flag in
a global variable? Using this variable's value I could add the header and zip
the file only once all the records are processed.







true










ready_attributeList_inventory_onhand_Rejects







--
View this message in context: 
http://camel.465427.n5.nabble.com/How-to-identify-the-last-record-that-is-getting-processed-tp5774794.html
Sent from the Camel - Users mailing list archive at Nabble.com.


Re: Camel Aggregator - Completion based on CamelSplitComplete

2015-12-04 Thread ganga_camel
CamelSplitComplete is set per exchange as the message gets split. If the block
below aggregates only a specific set of filtered data, then the exchanges
reaching this block may not include the last exchange of the larger set, and
hence the posted block will never get executed.
This flag works fine when no filtering is applied.

However, is there a better way of identifying the end of record (file)
processing when the records are filtered?

-Ganga



--
View this message in context: 
http://camel.465427.n5.nabble.com/Camel-Aggregator-Completion-based-on-CamelSplitComplete-tp5774653p5774659.html
Sent from the Camel - Users mailing list archive at Nabble.com.