Hi there,

I have a problem with a Camel route using a Netty producer endpoint.

There is a deadlock when the socket connection is lost while data is being sent.
I wrote two small applications that reproduce the problem:

The Camel version in use is 2.16.1.

*1. A socket server that closes the connection in the middle of the communication*

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class InterruptingSocketServer {
    
    public static void main(String[] args) {
        int listenPort = 2022;
        try (ServerSocket serverSocket = new ServerSocket(listenPort);
                Socket clientSocket = serverSocket.accept();
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(clientSocket.getInputStream()))) {
            // read single bytes until a value of 90 or more arrives, then leave the
            // loop so try-with-resources closes the socket mid-communication
            int value = -1;
            while (value < 90) {
                value = in.read();
                System.out.println(value);
            }
        } catch (IOException e) {
            System.out.println(e.toString());
            System.exit(1);
        }
        System.exit(0);
    }
}

*2. A Camel route using a Netty producer to send messages to the server*

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class NettyDeadlock {

    public static void main(String[] args) {
        
        CamelContext context = new DefaultCamelContext();

        try {
            context.addRoutes(new RouteBuilder() {
                public void configure() {
                    from("direct:in")
                   
.to("netty4:tcp://127.0.0.1:2022?clientMode=true&textline=false&reuseAddress=false&sync=false");
                }
            });
            
            ProducerTemplate template = context.createProducerTemplate();
            context.start();
            
            // send 100 integers; the server closes the connection part-way through
            for (Integer i = 0; i < 100; i++) {
                template.sendBody("direct:in", i);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.exit(0);
    }
}


We always end up in a deadlock with the following call stack:

Thread [main] (Suspended)
        Unsafe.park(boolean, long) line: not available [native method]
        LockSupport.park(Object) line: 175
        CountDownLatch$Sync(AbstractQueuedSynchronizer).parkAndCheckInterrupt() line: 836
        CountDownLatch$Sync(AbstractQueuedSynchronizer).doAcquireSharedInterruptibly(int) line: 997
        CountDownLatch$Sync(AbstractQueuedSynchronizer).acquireSharedInterruptibly(int) line: 1304
        CountDownLatch.await() line: 231
        DefaultAsyncProcessorAwaitManager.await(Exchange, CountDownLatch) line: 74
        AsyncProcessorHelper.process(AsyncProcessor, Exchange) line: 122
        UnitOfWorkProducer.process(Exchange) line: 68
        ProducerCache$2.doInProducer(Producer, Exchange, ExchangePattern) line: 412
        ProducerCache$2.doInProducer(Producer, Exchange, ExchangePattern) line: 380
        ProducerCache.doInProducer(Endpoint, Exchange, ExchangePattern, ProducerCallback<T>) line: 270
        ProducerCache.sendExchange(Endpoint, ExchangePattern, Processor, Exchange) line: 380
        ProducerCache.send(Endpoint, Processor) line: 221
        DefaultProducerTemplate.send(Endpoint, Processor) line: 124
        DefaultProducerTemplate.sendBody(Endpoint, Object) line: 137
        DefaultProducerTemplate.sendBody(String, Object) line: 144
        NettyDeadlock.main(String[]) line: 26
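
The main thread blocks forever in CountDownLatch.await(), so the application never finishes. As a temporary mitigation (a minimal sketch on our side, not a fix for the underlying issue; class and method names below are just for illustration), we can at least keep the calling thread from hanging indefinitely by sending asynchronously via the ProducerTemplate and bounding the wait with a Future timeout. This only moves the blocked exchange onto the template's async executor, though, so we would still like to understand why the exchange never completes when the connection drops.

import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

import org.apache.camel.ProducerTemplate;

public class NettyDeadlockWorkaroundSketch {

    // Sketch only: send one body asynchronously and give up waiting after a
    // timeout instead of blocking the calling thread indefinitely. The exchange
    // itself may still never complete on the executor thread.
    static void sendWithTimeout(ProducerTemplate template, Object body) throws Exception {
        Future<Object> future = template.asyncSendBody("direct:in", body);
        try {
            future.get(5, TimeUnit.SECONDS);
        } catch (TimeoutException e) {
            System.out.println("Send did not complete within 5 seconds: " + e);
        }
    }
}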



