imbajin commented on issue #183:
URL: 
https://github.com/apache/incubator-hugegraph-ai/issues/183#issuecomment-2690318285

   > [@Aryankb](https://github.com/Aryankb?rgh-link-date=2025-02-28T07%3A12%3A17.000Z) I think most people won't be proficient enough to write their own queries. I worked quite a bit with graph RAG during my internship, and at first even I had some trouble writing them. So I would suggest that we ask for a description of the knowledge the user will be providing; if they don't supply one, we default to using an LLM to determine what the knowledge or text is about, then have an agent write the query for us and use it. [@imbajin](https://github.com/imbajin?rgh-link-date=2025-02-28T07%3A12%3A17.000Z) sir, what is your opinion on this?
   
   @chiruu12 @Aryankb 
   First, regarding the `text2gql` part: it is an independent matter, and as I understand it, it is not strongly related to the choice of agentic framework or workflow implementation.
   
   Here is a brief description of the actual situation. Our earlier implementation and approach used both model fine-tuning and **user templates** simultaneously (see below ↓: by default, we use a GQL query template to improve text2gql results).
   
   <img width="1610" alt="Image" src="https://github.com/user-attachments/assets/fc278898-4cbf-46b4-8d4d-90dcf0e7df6d" />
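   To make the template idea concrete, here is a minimal sketch of how user-provided GQL templates might be embedded as few-shot examples in a text2gql prompt. All names (`GQL_TEMPLATES`, `build_text2gql_prompt`) and the prompt wording are hypothetical illustrations, not the actual hugegraph-llm API:

   ```python
   # Hypothetical sketch: template-augmented text2gql prompting.
   # The idea is to show the model known-good Gremlin templates so it
   # imitates their structure instead of free-forming query syntax.

   GQL_TEMPLATES = [
       # Each entry pairs a natural-language intent with a Gremlin template.
       ("find a vertex by name",
        "g.V().has('person', 'name', '{name}')"),
       ("list direct neighbors of a vertex",
        "g.V().has('person', 'name', '{name}').both()"),
   ]

   def build_text2gql_prompt(question: str) -> str:
       """Build an LLM prompt that embeds the user's query templates
       as few-shot examples ahead of the actual question."""
       examples = "\n".join(
           f"Q: {intent}\nGremlin: {tpl}" for intent, tpl in GQL_TEMPLATES
       )
       return (
           "Translate the question into a Gremlin query.\n"
           f"Follow the style of these templates:\n{examples}\n\n"
           f"Q: {question}\nGremlin:"
       )

   prompt = build_text2gql_prompt("Who are Alice's neighbors?")
   ```

   The same prompt scaffolding works whether the backing model is a fine-tuned one or a general-purpose LLM, which is why the two approaches can be combined.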
   
   Fine-tuning a general encoder model in the `7-14B` range can be a significant task, especially when it comes to generating a GQL corpus (HugeGraph uses Gremlin queries by default and is compatible with most Cypher syntax); refer to the [wiki](https://github.com/apache/incubator-hugegraph-ai/wiki/HugeGraph-LLM-Roadmap#4-graph-query-core-1) for more context.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

