330k tps is a very large number; honestly, it is hard for me to believe that
you can get such results with a single JVM (or did the test use multiple
hosts and JVMs?). But if it can, I am really impressed. It would be nice to see
some hardware specs and details of the actual test. I will have to run some tests
on our hosts, which have a bit more juice. Anyway, 10k tps with a single JVM
is sufficient for me.
I may also take a look at MINA 2.0.
Anyway, here is the code:
// Create an acceptor with 20 I/O processor threads
IoAcceptor acceptor = new SocketAcceptor(20, Executors.newCachedThreadPool());

// Create a service configuration
SocketAcceptorConfig cfg = new SocketAcceptorConfig();
cfg.setReuseAddress(true);
// MANUAL thread model: MINA adds no implicit thread pool filter, so events
// are processed in the I/O threads unless an ExecutorFilter is added.
cfg.setThreadModel(ThreadModel.MANUAL);
cfg.getFilterChain().addLast(
        "protocolFilter",
        new ProtocolCodecFilter(new DiameterProtocolCodecFactory()));
//cfg.getFilterChain().addLast("logger", new LoggingFilter());
//cfg.getFilterChain().addLast("threadPool",
//        new ExecutorFilter(Executors.newFixedThreadPool(10)));

// Use heap buffers with a simple (non-pooling) allocator
ByteBuffer.setUseDirectBuffers(false);
ByteBuffer.setAllocator(new SimpleByteBufferAllocator());

acceptor.bind(new InetSocketAddress(port), new ServerHandler(), cfg);
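
ServerHandler and DiameterProtocolCodecFactory above are our own classes and
are not included here. Just to illustrate the shape of the handler side, a
minimal MINA 1.x handler could look roughly like this (a sketch only, not our
actual class; it just echoes the decoded message instead of building a real
Diameter answer):

import org.apache.mina.common.IoHandlerAdapter;
import org.apache.mina.common.IoSession;

public class ServerHandler extends IoHandlerAdapter {

    @Override
    public void messageReceived(IoSession session, Object message) throws Exception {
        // The ProtocolCodecFilter has already decoded the bytes, so
        // "message" is whatever object the Diameter decoder produced.
        // Echo it back as a stand-in for building a real answer;
        // session.write() is asynchronous and returns immediately.
        session.write(message);
    }

    @Override
    public void exceptionCaught(IoSession session, Throwable cause) throws Exception {
        // Drop the connection on unexpected errors.
        session.close();
    }
}

With ThreadModel.MANUAL and no ExecutorFilter in the chain, messageReceived()
runs directly in the I/O processor threads, so it must not block.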
Trustin Lee wrote:
> Can I see your code where an IoAcceptor is created and configured? There's
> a known issue with unfair I/O as I mentioned in the previous post, so it
> can happen somehow. There's a known patch here, but we didn't review it
> yet. The patch is also known to improve performance in some cases, but it
> didn't in my case.
>
> http://issues.apache.org/jira/browse/DIRMINA-301
>
> To be more accurate, it depends on the machine you run on and the size of
> a message. If your machine is powerful enough, it can perform that well.
> The biggest throughput number I heard from a MINA user was around 330k
> msgs/sec, though the number of clients was not 500.