bobbai00 opened a new issue, #4034:
URL: https://github.com/apache/texera/issues/4034

   ### Feature Summary
   
   Texera has over 130 operators. For users who are not familiar with data science and workflows, it is hard to get started with Texera.
   
   Large language models are becoming widely used in many applications as copilots that help users understand system functionalities and complete certain tasks. 
   
   Therefore, it would be great if Texera could incorporate LLMs as workflow copilots, helping users understand workflow functionalities and construct workflows.
   
   ### Proposed Solution or Design
   
   ### System Architecture
   At a high level, here is the system architecture with the LLM copilot:
   
   <img width="1884" height="968" alt="Image" src="https://github.com/user-attachments/assets/b7eb6c56-52bc-4b18-bf1f-d53e5c74c8a2" />
   
   Three major components are needed:
   
   ### 1. Frontend: Agent Panel
   A panel where users can create, read, update, and delete (CRUD) agents, and interact with an agent through a chat interface.
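   A minimal sketch of the data model and CRUD operations such a panel might manage (all names here are illustrative assumptions, not Texera's actual frontend code; a real implementation would persist agents through a backend API rather than in memory):

   ```typescript
   // Hypothetical data model for the Agent Panel (names are illustrative).
   interface Agent {
     id: string;
     name: string;
     model: string;        // LLM model identifier, routed by the proxy service
     systemPrompt: string; // instructions shaping the copilot's behavior
   }

   interface ChatMessage {
     role: "user" | "assistant";
     content: string;
   }

   // In-memory store backing the panel's CRUD operations.
   class AgentStore {
     private agents = new Map<string, Agent>();

     create(agent: Agent): void { this.agents.set(agent.id, agent); }
     read(id: string): Agent | undefined { return this.agents.get(id); }
     update(id: string, patch: Partial<Agent>): void {
       const existing = this.agents.get(id);
       if (existing) this.agents.set(id, { ...existing, ...patch });
     }
     delete(id: string): boolean { return this.agents.delete(id); }
     list(): Agent[] { return [...this.agents.values()]; }
   }
   ```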
   
   ### 2. Backend: LLM Proxy Service
   A new microservice where:
   - API keys of different LLM providers are managed
   - Request and response are handled and monitored
   
   
   [litellm](https://docs.litellm.ai/#when-to-use-litellm-proxy-server-llm-gateway) is a good out-of-the-box option for these functionalities.
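   The litellm proxy exposes an OpenAI-compatible API, so the copilot can target a single endpoint regardless of the underlying provider. A sketch of the request the frontend might build (the model alias and system prompt are assumptions; the actual request goes to the proxy, which attaches the provider key, logs the call, and forwards the response, so keys never reach the frontend):

   ```typescript
   // OpenAI-compatible chat-completions payload, as accepted by a litellm proxy.
   interface ChatCompletionRequest {
     model: string;
     messages: { role: "system" | "user" | "assistant"; content: string }[];
   }

   function buildCopilotRequest(userQuestion: string): ChatCompletionRequest {
     return {
       model: "texera-copilot", // hypothetical model alias configured in the proxy
       messages: [
         { role: "system", content: "You are a Texera workflow copilot." },
         { role: "user", content: userQuestion },
       ],
     };
   }
   ```

   The frontend would then POST this payload to the proxy's `/chat/completions` route, keeping provider credentials and usage monitoring entirely server-side.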
   
   ### 3. Backend: Access control
   Access control logic to authenticate LLM-related traffic. Adding this logic to the existing `AccessControlService` should suffice.
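   A hedged sketch of what such a check might look like (the context fields, quota value, and rules below are hypothetical, not the actual `AccessControlService` interface):

   ```typescript
   // Hypothetical request context for an LLM access check.
   interface LlmRequestContext {
     userId: string;
     authenticated: boolean;
     tokensUsedToday: number;
   }

   const DAILY_TOKEN_QUOTA = 100_000; // illustrative per-user quota

   function canUseLlm(ctx: LlmRequestContext): boolean {
     // Reject unauthenticated traffic before it reaches the LLM proxy.
     if (!ctx.authenticated) return false;
     // Enforce a simple per-user quota to cap provider spend.
     return ctx.tokensUsedToday < DAILY_TOKEN_QUOTA;
   }
   ```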
   
   
   
   ### Impact / Priority
   
   (P1) High – significantly improves user experience
   
   ### Affected Area
   
   Workflow UI

