Federico Mariani created CAMEL-23500:
----------------------------------------

             Summary: Document camel-openai usage with OpenAI-compatible 
providers (OpenRouter)
                 Key: CAMEL-23500
                 URL: https://issues.apache.org/jira/browse/CAMEL-23500
             Project: Camel
          Issue Type: Task
          Components: camel-openai
            Reporter: Federico Mariani


The camel-openai component supports any OpenAI-compatible API via the baseUrl 
parameter, but the documentation doesn't show how to use it with popular 
third-party providers like OpenRouter, Ollama, or LM Studio.
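A minimal sketch of the kind of example the docs could lead with, here pointing the component at a local Ollama server via _baseUrl_. Only _baseUrl_ itself is confirmed by this issue; the endpoint path and the _model_ option name below are assumptions to verify against the actual component:

```yaml
# Hypothetical Camel YAML DSL route sketch.
# "openai:chat/completions" and the "model" option are placeholders
# to check against the real camel-openai URI syntax; baseUrl is the
# documented option this issue is about.
- route:
    from:
      uri: "direct:chat"
      steps:
        - to:
            uri: "openai:chat/completions"
            parameters:
              baseUrl: "http://localhost:11434/v1"
              model: "llama3.1"
```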

We should add a section to the *openai-component.adoc* docs with configuration 
examples for common providers. This would help users discover that a separate 
component is not needed and show how to use provider-specific features through 
existing escape hatches (_additionalBodyProperty_, exchange headers).

Example providers to cover:

- *OpenRouter* (https://openrouter.ai/api/v1) — multi-model gateway with 
provider routing and fallbacks
- *Ollama* (http://localhost:11434/v1) — local LLM server
- *LM Studio* (http://localhost:1234/v1) — local model runner
- *vLLM* (http://localhost:8000/v1) — high-throughput LLM serving engine with 
OpenAI-compatible API

For OpenRouter specifically, document how to:

- Set attribution headers (HTTP-Referer, X-Title) via exchange headers
- Configure provider routing preferences via _additionalBodyProperty_
- Use cross-provider model identifiers (e.g., 
anthropic/claude-sonnet-4-20250514)
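A sketch of what the OpenRouter section could show, combining the three points above. The HTTP-Referer/X-Title header names and the provider-routing body shape come from OpenRouter's API; the camel-openai endpoint path and the exact key/value syntax accepted by _additionalBodyProperty_ are assumptions to verify:

```yaml
# Hypothetical route: OpenRouter attribution headers plus provider routing.
- route:
    from:
      uri: "direct:chat"
      steps:
        # Attribution headers OpenRouter reads for app identification
        - setHeader:
            name: "HTTP-Referer"
            expression:
              constant: "https://myapp.example.org"
        - setHeader:
            name: "X-Title"
            expression:
              constant: "My Camel App"
        - to:
            uri: "openai:chat/completions"
            parameters:
              baseUrl: "https://openrouter.ai/api/v1"
              # cross-provider model identifier
              model: "anthropic/claude-sonnet-4-20250514"
              # how additionalBodyProperty encodes a nested JSON object
              # is an assumption to verify against the component
              additionalBodyProperty.provider: '{"order": ["anthropic"], "allow_fallbacks": true}'
```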

Evaluate whether an OpenRouter Kamelet would make sense.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
