Hey Claus,

Having the consumer acquire a lock on the file sounds like a good way to 
implement the "don't start attempting to read an empty file" semantics I'm 
looking for.

That said, the documentation on read locks is somewhat misleading. It mentions 
a boolean URI parameter called consumer.exclusiveReadLock, but the 
configuration processor doesn't seem to accept it as a valid option - maybe 
I'm doing something wrong.

Turning to the source, the FileProcessStrategyFactory appears to accept a flag 
called "readLock", which can be none, markerFile, fileLock, rename or changed, 
so I went for fileLock. However, this strategy does not seem to be scoped to an 
individual endpoint; rather, it appears to be set globally for the entire Camel 
context (I got this impression by debugging the 2.12.1 release). So whichever 
file endpoint is processed first seems to set the strategy for the whole 
context.

Or am I missing the point?
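
For what it's worth, what I was hoping to end up with is a lock that is scoped 
to the consuming endpoint itself, something along these lines (an untested 
sketch - I'm assuming here that readLock really is a per-endpoint URI option 
rather than a context-wide setting):

<route id="post-msgpack-payload">
     <!-- don't pick the file up until an exclusive lock can be acquired -->
     <from uri="file:/tmp/b?readLock=fileLock"/>
     <process ref="httpDataPump"/>
</route>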

Cheers,

Ben

> On Nov 5, 2013, at 15:19, Claus Ibsen <claus.ib...@gmail.com> wrote:
> 
> If you are talking about how to not pickup new files in a Camel from
> route, then take a look at the various read lock documentation on the
> file component.
> 
>> On Tue, Nov 5, 2013 at 2:39 PM, Ben Hood <0x6e6...@gmail.com> wrote:
>> Hi,
>> 
>> In my first attempt to use Camel I’ve run into an intra-route timing issue 
>> that I’ve only been able to solve with a hack, so I was wondering whether 
>> there are any best practices for handling timing issues across multiple 
>> processing steps in a batch file pipeline.
>> 
>> Basically I am trying to avoid doing an HTTP POST with an empty payload, 
>> since the route performing the HTTP POST is triggered before the file that 
>> it is wired to upload has been written.
>> 
>> If I turn stream caching on, this problem goes away. However, since some 
>> files can be quite big, I’d prefer not to have to do stream caching.
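>> 
>> In case it matters, I think I enabled it via the streamCache attribute on 
>> the route (I may be misremembering the attribute name), i.e. something like:
>> 
>> <route id="post-msgpack-payload" streamCache="true">
>>      ...
>> </route>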
>> 
>> So to solve the issue, I’ve written a workaround bean that just does a 
>> Thread.sleep() in order to wait for the upload file to actually get some 
>> data in it before firing off the HTTP POST.
>> 
>> I’ve got a two step pipeline that:
>> 
>> 1. Transcodes a batch input file into an intermediate format (using msgpack 
>> serialization);
>> 2. Performs an HTTP POST of the intermediate format to a remote server;
>> 
>> I’d like to keep the intermediate format around on disk for debugging and 
>> manual replay tasks.
>> 
>> My camel context has two routes:
>> 
>> <route id="transcode-to-msgpack">
>>      <from uri="file:/tmp/d"/>
>>      <log message="Transcoding ${file:name} to msgpack" />
>>      <to uri="bean:transcoder"/>
>>      <to uri="file:/tmp/b?fileName=${file:name.noext}.msgpack"/>
>> </route>
>> 
>> <route id="post-msgpack-payload">
>>      <from uri="file:/tmp/b"/>
>>      <from uri="file:/tmp/e"/>
>>      <log message="POSTing ${file:name} to the rating API" />
>>      <setProperty propertyName="url.template">
>>              <constant>http://localhost:9999/calls/:source/:sequence</constant>
>>      </setProperty>
>>      <process ref="httpDataPump"/>
>> </route>
>> 
>> I have two custom beans doing the work:
>> 
>> 1. transcoder - This takes the file’s InputStream and returns a wrapping 
>> InputStream that transcodes it;
>> 2. httpDataPump - This contains an HTTP client that uploads the 
>> FileInputStream that the InMessage of the Exchange refers to.
>> 
>> Doing a Thread.sleep() seems like a real hack to me, so I was wondering if 
>> there is a more idiomatic way to solve the issue. I’ve looked into the 
>> preMoveNamePrefix options, but they appear to apply only to input files. 
>> Ideally I’m looking for something that can move the output file after it has 
>> been written.
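>> 
>> Conceptually I’m after something like the following - writing the 
>> intermediate file under a temporary name and only renaming it once it is 
>> complete, and/or having the consumer wait until the file has stopped 
>> changing. This is just a sketch of what I mean, and I’m not sure whether 
>> the tempPrefix and readLock options actually exist or behave this way:
>> 
>> <route id="transcode-to-msgpack">
>>      <from uri="file:/tmp/d"/>
>>      <to uri="bean:transcoder"/>
>>      <!-- write to inprogress.<name> first, rename once fully written -->
>>      <to uri="file:/tmp/b?fileName=${file:name.noext}.msgpack&amp;tempPrefix=inprogress."/>
>> </route>
>> 
>> <route id="post-msgpack-payload">
>>      <!-- only consume files that are no longer changing -->
>>      <from uri="file:/tmp/b?readLock=changed"/>
>>      <process ref="httpDataPump"/>
>> </route>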
>> 
>> Any pointers are appreciated.
>> 
>> Cheers,
>> 
>> Ben
> 
> 
> 
> -- 
> Claus Ibsen
> -----------------
> Red Hat, Inc.
> Email: cib...@redhat.com
> Twitter: davsclaus
> Blog: http://davsclaus.com
> Author of Camel in Action: http://www.manning.com/ibsen
