kayx23 commented on code in PR #12518:
URL: https://github.com/apache/apisix/pull/12518#discussion_r2281450983


##########
docs/en/latest/plugins/ai-proxy-multi.md:
##########
@@ -935,3 +935,71 @@ curl "http://127.0.0.1:9180/apisix/admin/routes"; -X PUT \
 ```
 
 For verification, the behaviours should be consistent with the verification in 
[active health checks](../tutorials/health-check.md).
+
+### Include LLM Information in Access Log
+
+The following example demonstrates how you can log LLM request-related information in the gateway's access log to improve analytics and auditing. The following variables are available:
+
+* `request_type`: Type of request, where the value can be `traditional_http`, `ai_chat`, or `ai_stream`.
+* `llm_time_to_first_token`: Duration in milliseconds from when the request is sent until the first token is received from the LLM service.
+* `llm_model`: Name of the LLM model serving the request.
+* `llm_prompt_tokens`: Number of tokens in the prompt.
+* `llm_completion_tokens`: Number of tokens in the completion returned by the LLM.
+
+:::note
+
+The usage in this example will become available in APISIX 3.13.0.

Review Comment:
   remove this note pls
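For readers of this thread, a minimal sketch of how these variables could surface in the access log by customizing the log format in `conf/config.yaml`. The `access_log_format` key under `nginx_config.http` is standard APISIX configuration; the specific format string below is an illustrative assumption, not taken from the PR:

```yaml
# Hypothetical sketch: include the LLM variables in APISIX's access log.
# Variable names match the docs above; the layout of the format string
# is an assumption for illustration.
nginx_config:
  http:
    access_log_format: >-
      $remote_addr [$time_local] "$request" $status
      request_type=$request_type llm_model=$llm_model
      prompt_tokens=$llm_prompt_tokens
      completion_tokens=$llm_completion_tokens
      ttft_ms=$llm_time_to_first_token
```

For non-LLM traffic, the LLM-specific variables would be empty, so a JSON-escaped format (`access_log_format_escape: json`) may be preferable when feeding logs into an analytics pipeline.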



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: notifications-unsubscr...@apisix.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
