Croway opened a new pull request, #21602:
URL: https://github.com/apache/camel/pull/21602

   # Description
   
   Adds MCP (Model Context Protocol) client integration to the camel-openai 
component. Users can configure MCP servers directly on the endpoint URI, and 
the component automatically lists tools, converts them to OpenAI format, and 
executes an agentic tool loop.
   
   # Example
   ```java
   // Automatic tool loop — the model calls tools, the component executes them via MCP,
   // feeds results back, and repeats until the model produces a final text answer.
   from("direct:chat")
       .to("openai:chat-completion?model=gpt-4"
           + "&mcpServer.fs.transportType=stdio"
           + "&mcpServer.fs.command=npx"
           + "&mcpServer.fs.args=-y,@modelcontextprotocol/server-filesystem,/tmp")
       .log("${body}");
   ```
   
   - MCP server configuration via the `mcpServer.` prefix pattern (stdio, SSE, and streamable HTTP transports)
   - Agentic tool loop with configurable `maxToolIterations` (default 50)
   - `autoToolExecution` toggle: when false, raw tool calls are returned for manual handling
   - `tool-execution` operation: executes MCP tool calls as a Camel endpoint, enabling pure DSL tool loops with `loopDoWhile`
   - `returnDirect` support: tools annotated with `returnDirect=true` short-circuit the loop
   - Conversation memory: the full tool call chain (assistant + tool results) is stored in the conversation history
   - Connection recovery (`mcpReconnect=true`): auto-reconnects on transport failures and retries once
   - Configurable timeout (`mcpTimeout`, default 20s) for all MCP operations
   - Configurable protocol versions (`mcpProtocolVersions`)
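
   For the manual-handling path, a pure DSL tool loop could combine `autoToolExecution=false`, the `tool-execution` operation, and `loopDoWhile` roughly as sketched below. This is an illustrative route-configuration sketch only: the `openai:tool-execution` URI follows the operation name above, and the loop-condition header name is a hypothetical placeholder, not necessarily what the component exposes.
   ```java
   // Sketch of a manual tool loop (assumptions: the "openai:tool-execution" URI
   // and the "CamelOpenAiToolCallsPresent" header name are illustrative).
   from("direct:manual-chat")
       // With autoToolExecution=false, raw tool calls are returned
       // instead of being executed by the component.
       .to("openai:chat-completion?model=gpt-4&autoToolExecution=false"
           + "&mcpServer.fs.transportType=stdio"
           + "&mcpServer.fs.command=npx"
           + "&mcpServer.fs.args=-y,@modelcontextprotocol/server-filesystem,/tmp")
       // Keep looping while the model is still requesting tool calls.
       .loopDoWhile(header("CamelOpenAiToolCallsPresent"))
           // Execute the pending MCP tool calls as a plain Camel endpoint...
           .to("openai:tool-execution")
           // ...then feed the results back to the model for the next turn.
           .to("openai:chat-completion?model=gpt-4&autoToolExecution=false")
       .end()
       .log("${body}");
   ```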
   
   @jamesnetherton I've added the `mcp-core` dependency; do you think it might cause problems for camel-quarkus? `mcp-core` has no transitive Spring dependencies, but IIRC it uses Reactor.
   
   @ibek
   
   All the integration tests are successful with:
   ```
   mvn clean verify -Dollama.instance.type=openai \
       -Dopenai.endpoint=http://localhost:11434/v1/ \
       -Dopenai.model=qwen3-vl:8b-instruct -Dopenai.api.key=dummy \
       -Dopenai.embedding.model=embeddinggemma:300m
   ```
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
