Hi,

> -----Original Message-----
> From: Caiyin Yang <ycy...@gmail.com>
> Sent: Tuesday, August 13, 2024 8:17 AM
> To: dev@iotdb.apache.org
> Subject: Introduce AINode for IoTDB's Integrated Machine Learning Solution
>
> Hi all,
>
> I hope this message finds you well. My name is Caiyin Yang, a committer
> to the IoTDB community. I'm excited to share that several members of our
> community (Minghui Liu, Yong Liu, Guo Qin, Haoran Zhang, Chenyu Li, and
> Hang Zhou) and I have collaboratively developed a new component that I
> believe can significantly enhance the analytical capabilities of IoTDB:
> AINode.
>
> AINode is envisioned as a peer to the existing DataNode and ConfigNode,
> designed to expand IoTDB's capabilities by allowing external machine
> learning models to be introduced and executed through simple SQL
> statements.
>
> AINode offers these basic features:
>
> 1. Model Loading: AINode will allow machine learning models to be
>    imported seamlessly into the IoTDB ecosystem.
> 2. Model Management: It will provide a robust framework for managing
>    these models.
> 3. Inference Capability: AINode will enable IoTDB to run inference with
>    the imported models, allowing advanced data analysis directly within
>    the database environment.
>
> Please let me know your thoughts and whether you have any specific
> concerns or questions regarding the AINode component. I look forward to
> your feedback and to contributing further to the IoTDB project.
>
> Thank you for your time and consideration.
>
> Best regards,
> Caiyin Yang
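To make the SQL-driven workflow described above concrete, here is a minimal
sketch of how model loading, management, and inference might be driven from
the apache-iotdb Python client. The CREATE MODEL, SHOW MODELS, and CALL
INFERENCE statements, the model name, and the URI are illustrative
assumptions based on the proposal's description rather than confirmed
syntax:

    # Illustrative sketch: statement syntax, model name, and URI are assumed.
    from iotdb.Session import Session

    session = Session("127.0.0.1", "6667", "root", "root")
    session.open(False)

    # 1. Model loading: register an externally trained model from a URI.
    session.execute_non_query_statement(
        "CREATE MODEL weather_forecast USING URI 'file:///tmp/weather_forecast'"
    )

    # 2. Model management: list the models AINode currently knows about.
    models = session.execute_query_statement("SHOW MODELS")
    while models.has_next():
        print(models.next())

    # 3. Inference: apply the registered model to a time-series query result.
    predictions = session.execute_query_statement(
        "CALL INFERENCE(weather_forecast, 'SELECT temperature FROM root.sg.station1')"
    )
    while predictions.has_next():
        print(predictions.next())

    session.close()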
Hi Caiyin,

This sounds very interesting. I have a few questions about your project.

Q1. I can see the benefit of being able to execute models via SQL
statements. Does the project also expect performance benefits from bringing
AI execution inside the TS engine? There may be some benefit to bringing
model execution closer to TS retrievals, for example.

Q2. AI is of course a hugely competitive area, with a wide range of AI
frameworks targeting a wide range of execution hardware, from TinyML up to
highly scaled cloud execution. At a high level, how does the project plan
to support integration with that rapidly changing diversity? Do you have a
short list of frameworks and use cases in mind to start with? Will there be
an AI framework abstraction layer against which the community could
integrate currently unsupported frameworks?

In the COVESA Central Data Service Platform project we are doing some
related inference work you might find interesting. With BMW Research we are
working on integrating rules-based semantic reasoning with IoTDB, although
we are not looking to go as far as your project and integrate directly into
IoTDB itself. We are focused on bridging the information and knowledge
layers of a DIKW pyramid. I think your work is complementary: a semantic
rule could use the result of ML inference from AINode, for example.

Best Wishes,
Stephen Lawrence
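On the abstraction-layer question in Q2, one possible shape, sketched very
loosely in Python below, is a common backend interface plus a registry that
community-contributed adapters plug into. Every name here (ModelBackend,
register_backend, and so on) is invented for illustration and is not part
of the AINode proposal:

    # Hypothetical sketch of an AI framework abstraction layer; all names
    # are illustrative and not taken from the AINode proposal.
    from abc import ABC, abstractmethod
    from typing import Dict, List, Type


    class ModelBackend(ABC):
        """One adapter per ML framework (e.g. PyTorch, ONNX Runtime, TFLite)."""

        @abstractmethod
        def load(self, model_uri: str) -> None:
            """Load a model artifact exported by the external framework."""

        @abstractmethod
        def infer(self, series: List[List[float]]) -> List[List[float]]:
            """Run inference over aligned time-series values, return predictions."""


    # Registry mapping a framework name to its adapter class.
    _BACKENDS: Dict[str, Type[ModelBackend]] = {}


    def register_backend(name: str, backend: Type[ModelBackend]) -> None:
        """Let the community plug in adapters for frameworks not shipped by default."""
        _BACKENDS[name] = backend


    def create_backend(name: str) -> ModelBackend:
        """Instantiate the adapter registered under the given framework name."""
        return _BACKENDS[name]()

Under such a scheme, supporting a new framework would only require
implementing load() and infer() and registering the adapter, leaving the
SQL surface unchanged.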