Re: [I] docs: plugin ai-proxy-multi, follow the official document does not work [apisix]
Plus-L closed issue #12142: docs: plugin ai-proxy-multi, follow the official document does not work
URL: https://github.com/apache/apisix/issues/12142
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
Re: [I] docs: plugin ai-proxy-multi, follow the official document does not work [apisix]
Plus-L commented on issue #12142:
URL: https://github.com/apache/apisix/issues/12142#issuecomment-2812087946
> [@Plus-L](https://github.com/Plus-L), yes, in the new version's source code `instances` is required.
>
> Can you try this format:
>
> ```
> "override": {
>     "endpoint": "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"
> }
> ```
>
> replace
>
> ```
> "endpoint": "https://openrouter.ai/api/v1/chat/completions"
> ```
Thank you for your reply! After adding "override", the plugin works normally.
Could you please update the documentation in a timely manner, so that others don't run into the same problem?
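For reference, a complete route configuration combining the two fixes from this thread (the `instances` array from the schema error, plus the endpoint moved inside `override`) might look like the sketch below. This is an assumption assembled from the snippets above, not a config confirmed by the maintainers; verify the exact schema against the ai-proxy-multi documentation for your APISIX version, and substitute your own API key, model, and endpoint.

```json
{
  "id": "ai-proxy-multi-route",
  "uri": "/anything",
  "methods": ["POST"],
  "plugins": {
    "ai-proxy-multi": {
      "instances": [
        {
          "name": "openai-compatible",
          "provider": "openai-compatible",
          "model": "qwen-plus",
          "weight": 1,
          "priority": 1,
          "auth": {
            "header": {
              "Authorization": "Bearer <your-api-key>"
            }
          },
          "override": {
            "endpoint": "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"
          }
        }
      ],
      "passthrough": false
    }
  },
  "upstream": {
    "type": "roundrobin",
    "nodes": {
      "httpbin.org": 1
    }
  }
}
```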
Re: [I] docs: plugin ai-proxy-multi, follow the official document does not work [apisix]
hanqingwu commented on issue #12142:
URL: https://github.com/apache/apisix/issues/12142#issuecomment-2811944674
Re: [I] docs: plugin ai-proxy-multi, follow the official document does not work [apisix]
hanqingwu commented on issue #12142:
URL: https://github.com/apache/apisix/issues/12142#issuecomment-2811937794
@Plus-L, yes, in the new version's source code `instances` is required.
Can you try this:
```
"override": {
    "endpoint": "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"
}
```
replace
```
"endpoint": "https://openrouter.ai/api/v1/chat/completions"
```
[I] docs: plugin ai-proxy-multi, follow the official document does not work [apisix]
Plus-L opened a new issue, #12142:
URL: https://github.com/apache/apisix/issues/12142
### Current State
Hello, I encountered an issue while using the APISIX plugin ai-proxy-multi. Following this section of the documentation resulted in an error:
https://apisix.apache.org/zh/docs/apisix/plugins/ai-proxy-multi/#send-request-to-an-openai-compatible-llm
```
curl "http://127.0.0.1:9180/apisix/admin/routes" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "id": "ai-proxy-multi-route",
    "uri": "/anything",
    "methods": ["POST"],
    "plugins": {
      "ai-proxy-multi": {
        "providers": [
          {
            "name": "openai-compatible",
            "model": "qwen-plus",
            "weight": 1,
            "priority": 1,
            "auth": {
              "header": {
                "Authorization": "Bearer '"$OPENAI_API_KEY"'"
              }
            },
            "override": {
              "endpoint": "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"
            }
          }
        ],
        "passthrough": false
      }
    },
    "upstream": {
      "type": "roundrobin",
      "nodes": {
        "httpbin.org": 1
      }
    }
  }'
```
Running this returns:
```
{
  "error_msg": "failed to check the configuration of plugin ai-proxy-multi err: property \"instances\" is required"
}
```
### Desired State
Based on the error message, I changed `providers` to `instances` and configured the provider inside a single plugin entry:
```
{
  "id": "ai-proxy-multi-route",
  "uri": "/chat/completions",
  "methods": ["POST"],
  "plugins": {
    "ai-proxy-multi": {
      "instances": [
        {
          "name": "openai-compatible",
          "provider": "openai-compatible",
          "model": "deepseek/deepseek-chat-v3-0324:free",
          "weight": 1,
          "priority": 1,
          "auth": {
            "header": {
              "Authorization": "Bearer xxx"
            }
          },
          "endpoint": "https://openrouter.ai/api/v1/chat/completions"
        }
      ],
      "passthrough": false
    }
  },
  "upstream": {
    "type": "roundrobin",
    "scheme": "https",
    "nodes": {
      "openrouter.ai:443": 1
    }
  }
}
```
After this modification, the route was created successfully, but calling /chat/completions produced an error:
```
2025/04/15 08:46:50 [error] 51#51: *1514703 lua entry thread aborted: runtime error: .../local/apisix//deps/share/lua/5.1/resty/http_connect.lua:179: attempt to concatenate local 'request_host' (a nil value)
stack traceback:
coroutine 0:
    .../local/apisix//deps/share/lua/5.1/resty/http_connect.lua: in function 'connect'
    /usr/local/apisix/apisix/plugins/ai-drivers/openai-base.lua:78: in function 'request'
    /usr/local/apisix/apisix/plugins/ai-proxy/base.lua:47: in function 'phase_func'
    /usr/local/apisix/apisix/plugin.lua:1205: in function 'common_phase'
    /usr/local/apisix/apisix/init.lua:458: in function 'handle_upstream'
    /usr/local/apisix/apisix/init.lua:723: in function 'http_access_phase'
    access_by_lua(nginx.conf:323):2: in main chunk, client: , server: _, request: "POST /chat/completions HTTP/1.1", host: ""
```
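One possible reading of that trace (my assumption, not confirmed in the thread): with `endpoint` placed at the top level of the instance instead of inside `override`, the openai-compatible driver never receives a host to connect to, so `request_host` is nil when `http_connect.lua` builds the connection. Schematically, the difference between the two placements would be:

```
# fails at request time: a top-level "endpoint" is not where the driver looks
"endpoint": "https://openrouter.ai/api/v1/chat/completions"

# works (per the fix later in this thread): the endpoint lives under "override"
"override": {
  "endpoint": "https://openrouter.ai/api/v1/chat/completions"
}
```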
