Hi,

I am working on a MINA-based application that speaks protocols from a wide variety of manufacturers' devices; some are serial, others use socket connections. All have been designed to be transport-transparent, allowing us to simulate devices and specific situations via socket-based simulator connections.

I added a CaptureLogFilter object to log data flow to a separate file. I hook that object into the filter chain as each connection is created.

Inside CaptureLogFilter, in both the messageSent() and messageReceived() hooks, I check the message parameter, and if it is an IoBuffer I emit a capture line: SENT (or RECEIVED), the number of bytes, and a hex dump of those bytes.

For simulated data sets (socket-based connections) this works great; my capture log shows:
SENT x bytes blah blah blah
SENT 0 bytes
RECEIVED y bytes blah blah blah

The extra "SENT 0 bytes" seemed odd but harmless... until I noticed what happens on serial connections. For those I invariably see:

SENT 0 bytes
RECEIVED y bytes blah blah blah

Each and every SENT line from a serial-based connection reads 0 bytes, and I never see the actual command packet that I sent out. The data clearly reaches the far-end device, though, as the RECEIVED lines show all the right responses.

My filter looks like:
public class CaptureLogFilter extends IoFilterAdapter {
...
    @Override
    public void messageReceived(NextFilter nextFilter, IoSession session, Object message) throws Exception {
        log("RECEIVED: ", message);
        if (nextFilter != null) nextFilter.messageReceived(session, message);
    }

    @Override
    public void messageSent(NextFilter nextFilter, IoSession session, WriteRequest writeRequest) throws Exception {
        log("SENT: ", writeRequest.getMessage());
        if (nextFilter != null) nextFilter.messageSent(session, writeRequest);
    }
...
    private void log(String event, Object arg) {
        if (arg instanceof IoBuffer) {  // instanceof is false for null, so no separate null check needed
            byte[] b = IoBufferUtils.asBytes((IoBuffer) arg);
            log(event + b.length + " bytes: " + ByteUtils.toHex(b));
//            IoBuffer i = (IoBuffer) arg;
//            log("DBG: " + event + "pos: " + i.position() + ", lim: " + i.limit());
//            ^^^ This debug just confirmed IoBufferUtils was telling the truth:
//            the IoBuffer is empty (position and limit are both 0).
        } else {
            log(event);
        }
    }
...
}
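
For clarity on what I mean by "empty": MINA's IoBuffer wraps java.nio.ByteBuffer, so position and limit both 0 really does mean zero readable bytes. Here's a quick stand-alone illustration of the states involved (plain ByteBuffer, so it runs without MINA on the classpath; the second half shows that a duplicate()-based read, which is how I'd expect a capture helper like asBytes to work, can't be what's consuming the data):

```java
import java.nio.ByteBuffer;

// Plain java.nio.ByteBuffer demo; MINA's IoBuffer wraps ByteBuffer,
// so the position/limit semantics are identical.
public class BufferStateDemo {
    public static void main(String[] args) {
        // A buffer flipped before anything was written into it:
        // position == 0 and limit == 0, i.e. nothing readable -- the
        // exact state my debug line reported for the serial SENT buffers.
        ByteBuffer empty = ByteBuffer.allocate(16);
        empty.flip();
        System.out.println("empty: pos=" + empty.position()
                + " lim=" + empty.limit()
                + " remaining=" + empty.remaining());

        // A buffer with real data, read via duplicate() so the original's
        // position is untouched -- a capture done this way cannot consume
        // the bytes before the transport sees them.
        ByteBuffer data = ByteBuffer.wrap(new byte[] {0x01, 0x02, 0x03});
        ByteBuffer dup = data.duplicate();
        byte[] copy = new byte[dup.remaining()];
        dup.get(copy);
        System.out.println("original still readable: remaining="
                + data.remaining());
    }
}
```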

And it is hooked in during IoConnector setup:

    private final IoConnector buildIoConnector() {
...
        if (codecFilter == null)
            codecFilter = new ProtocolCodecFilter(createCodecFactory());
...
        DefaultIoFilterChainBuilder fc = connector.getFilterChain();
        if (fc.contains("logging")) fc.remove("logging");
        if (fc.contains("codec")) fc.remove("codec");
        fc.addLast("logging", new CaptureLogFilter(getClass().getSimpleName()));
        fc.addLast("codec", codecFilter);
...
    }

Anything obvious jump out at you experts? :-) I'm baffled as to why one type of connection would behave any differently from the other.

I can switch the same protocol handler between simulated (socket) and real (serial) connections and watch the SENT lines appear or disappear based on the connection.
Thanks in advance!!
boB

--
boB Gage
Software Engineer
Merge Healthcare
