Hi Steven,

I think the design looks very good and I am looking forward to using the feature.

I have one more question: where does the code for a specific sink run?
Do we have to add a specific StreamPipes sink within the IoTDB code base, or
can this API be used from the client?
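To make my question concrete: here is roughly the kind of client-side code we would hope to write. All class and method names below are invented for illustration only; [IOTDB-516] is still a proposal, so none of this exists in IoTDB yet.

```java
// Hypothetical sketch -- none of these classes exist in IoTDB;
// all names are invented to illustrate the client-side usage we have in mind.

import java.util.function.Consumer;

// An event delivered by a trigger when a data point is written (hypothetical).
final class DataPoint {
    final String timeseries;
    final long timestamp;
    final double value;

    DataPoint(String timeseries, long timestamp, double value) {
        this.timeseries = timeseries;
        this.timestamp = timestamp;
        this.value = value;
    }
}

// The data sink interface we imagine implementing: override only the
// data-related method to decide where each event is forwarded (hypothetical).
interface DataSink {
    void onEvent(DataPoint point);
}

// A sink that hands each event to a callback -- e.g. a StreamPipes adapter
// could pass in a callback that pushes the event into a pipeline.
final class CallbackSink implements DataSink {
    private final Consumer<DataPoint> callback;

    CallbackSink(Consumer<DataPoint> callback) {
        this.callback = callback;
    }

    @Override
    public void onEvent(DataPoint point) {
        callback.accept(point);
    }
}
```

If a sink like this could run inside the client process, the StreamPipes adapter would not need any code inside the IoTDB code base.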


Philipp

> On 2. Apr 2020, at 04:00, Steve Yurong Su <[email protected]> wrote:
> 
> Hi Philipp,
> 
> I am currently working on [IOTDB-516].
> 
> The data sink module in my proposal is designed for forwarding events.
> When users want to push data somewhere after trigger processing is done,
> the data sink module helps with that. By implementing the interfaces
> provided by the data sink module, triggers can push data to brokers, or
> directly to a specific stream processing engine (e.g., StreamPipes).
> We would provide implementations for some common data sinks, so users
> only need to override the data-related methods to implement data
> forwarding.
> 
> In addition, the triggers would provide Java hooks that allow users to
> observe data changes and to register a callback to process them.
> 
> Therefore, the adapter could be built like this:
> 1. subscribe to data changes by registering a trigger in IoTDB
> 2. forward data by implementing the data sink APIs
> 
> I don't know if I've answered your question. What do you think of the design? 
> :)
> 
> Steve
> 
> Philipp Zehnder <[email protected]> 于2020年4月1日周三 上午5:02写道:
>> 
>> Hi Xiangdong,
>> 
>> Yes, I think this feature is exactly what we need.
>> 
>> I also had a look at the proposal in the comments and it looks very 
>> interesting.
>> 
>> I have one question regarding the sinks. Do you also plan to provide a
>> programming API (e.g. in Java), or do you plan to rely on broker
>> technologies?
>> For us it would be good to have a Java API that registers the query and a
>> callback receiving the events.
>> Alternatively, we could use a message broker to forward the events into
>> the StreamPipes adapter, but then we would depend on another service.
>> What do you think about that?
>> 
>> Philipp
>> 
>> 
>>> On 23. Mar 2020, at 15:17, Xiangdong Huang <[email protected]> wrote:
>>> 
>>> Hi Philipp,
>>> 
>>> That is really great news!
>>> 
>>>> I think data set integration should be straightforward, but what about
>>> the data stream adapter? Is it possible to subscribe to changes in IoTDB?
>>> 
>>> Yes, the data set adapter is straightforward.
>>> 
>>> As for the data stream adapter....
>>> 
>>>> Does this trigger have to be set in the database or can a client
>>> subscribe to changes?
>>> 
>>> We are working on that! Have a look at issue [1]: it is called a trigger
>>> in IoTDB.
>>> In our design, if a user registers a trigger on a time series and asks for
>>> new data points to be sent somewhere (e.g., StreamPipes),
>>> then you can receive the changes on that time series as a stream.
>>> It is just a design for now; we have not implemented it yet.
>>> What do you think about this?
>>> 
>>> [1]
>>> https://issues.apache.org/jira/projects/IOTDB/issues/IOTDB-516?filter=allopenissues
>>> 
>>> Best,
>>> -----------------------------------
>>> Xiangdong Huang
>>> School of Software, Tsinghua University
>>> 
>>> 黄向东
>>> 清华大学 软件学院
>>> 
>>> 
>>> Philipp Zehnder <[email protected]> 于2020年3月23日周一 下午8:17写道:
>>> 
>>>> Hi Xiangdong,
>>>> 
>>>> I come from the StreamPipes community and I also think it's a great idea
>>>> to work together.
>>>> We have already integrated IoTDB as a data sink in StreamPipes. Users can
>>>> model their analytics pipelines and write the results directly into IoTDB.
>>>> The next step would be to also get data from IoTDB, then we can read data,
>>>> do stream calculations and write the results back.
>>>> 
>>>> We have already integrated several adapters for databases (e.g. InfluxDB,
>>>> MySQL, ...) [1]. I would like to add such an adapter for IoTDB.
>>>> Usually we distinguish between dataset and datastream adapters. With a
>>>> data set adapter we poll the data once from the database and stream it
>>>> through the pipeline.
>>>> With data stream adapters we start the adapter and continuously read and
>>>> process events.
>>>> I think data set integration should be straightforward, but what about the
>>>> data stream adapter? Is it possible to subscribe to changes in IoTDB?
>>>> 
>>>> In [2] it is described that a user must add a trigger. Does this trigger
>>>> have to be set in the database or can a client subscribe to changes?
>>>> 
>>>> Philipp
>>>> 
>>>> [1]
>>>> https://github.com/apache/incubator-streampipes-extensions/tree/dev/streampipes-connect-adapters/streampipes-connect-adapter/src/main/java/org/apache/streampipes/connect/adapters
>>>> [2]
>>>> https://issues.apache.org/jira/projects/IOTDB/issues/IOTDB-516?filter=allopenissues
>>>> 
>>>>> On 20. Mar 2020, at 10:36, Julian Feinauer <[email protected]>
>>>> wrote:
>>>>> 
>>>>> Hi Xiangdong,
>>>>> 
>>>>> very nice to share it here!
>>>>> Looking forward to Prometheus and Streampipes!
>>>>> And if I can support with DBCP (or commons-pool2 or whatever), I'm happy
>>>>> to help!
>>>>> 
>>>>> Julian
>>>>> 
>>>>> Am 20.03.20, 02:55 schrieb "Xiangdong Huang" <[email protected]>:
>>>>> 
>>>>>  Hi all,
>>>>> 
>>>>>  I come from the IoTDB community.
>>>>>  Following Christofer's suggestion, I'd like to share the progress of the
>>>>>  integration work between IoTDB and some other Apache projects, to let
>>>>>  users manage time-series data more easily.
>>>>> 
>>>>>  1. PLC4X (done)
>>>>>  With the help of Julian and others, PLC4X now provides an example of
>>>>>  writing data directly to IoTDB [1].
>>>>> 
>>>>>  2. Prometheus (just beginning)
>>>>>  Prometheus is a popular data collection and alerting system for many
>>>>>  applications (though perhaps less common in IoT).
>>>>>  We'd like to integrate Prometheus with IoTDB by replacing Prometheus's
>>>>>  data store with IoTDB [2].
>>>>> 
>>>>>  3. Flink and RocketMQ (in-progress)
>>>>>  In many applications, Flink (a stream processing engine) and RocketMQ
>>>>>  (a message queue) serve as the entry points for data, with IoTDB behind
>>>>>  them [3] [4] [5].
>>>>> 
>>>>>  4. MiNiFi (just beginning)
>>>>>  MiNiFi is a dataflow management system. We'd like to integrate IoTDB
>>>>>  with it to allow writing data from a processor to IoTDB, and consuming
>>>>>  data from IoTDB in other processors [6].
>>>>> 
>>>>>  5. StreamPipes (just beginning)
>>>>>  We'd like to add a trigger function in IoTDB to allow publishing alerts
>>>>>  or doing some stream calculations. A possible solution is to integrate
>>>>>  with StreamPipes [8].
>>>>> 
>>>>>  IoTDB has also begun to integrate with some other projects (unrelated
>>>>>  to IoT) that make IoTDB friendlier to use, such as:
>>>>> 
>>>>>  6. Calcite (almost done)
>>>>>  Calcite provides standard SQL support for IoTDB, which makes it easier
>>>>>  to use. A PR is open and awaiting code review [7].
>>>>> 
>>>>>  7. Zeppelin (in-progress)
>>>>>  Zeppelin gives us a web-based GUI that lets users operate IoTDB
>>>>>  interactively [9].
>>>>> 
>>>>>  8. Database Connection Pool (just beginning)
>>>>>  With a database connection pool (DBCP), developers do not need to write
>>>>>  as much boilerplate in their business logic code.
>>>>> 
>>>>>  Those are the integrations we are considering.
>>>>>  We would also like to hear more ideas for making IoTDB easier to use
>>>>>  in IoT applications.
>>>>>  You are welcome to join us if you are interested in any of these ideas.
>>>>> 
>>>>>  [1]
>>>>>  https://github.com/apache/plc4x/tree/develop/plc4j/examples/hello-integration-iotdb
>>>>>  [2] https://issues.apache.org/jira/projects/IOTDB/issues/IOTDB-519
>>>>>  [3] https://issues.apache.org/jira/browse/COMDEV-350
>>>>>  [4] https://issues.apache.org/jira/browse/IOTDB-560
>>>>>  [5] https://github.com/apache/incubator-iotdb/pull/817
>>>>>  [6] https://issues.apache.org/jira/projects/IOTDB/issues/IOTDB-518
>>>>>  [7] https://github.com/apache/incubator-iotdb/pull/902
>>>>>  [8] https://issues.apache.org/jira/projects/IOTDB/issues/IOTDB-516
>>>>>  [9] https://issues.apache.org/jira/projects/IOTDB/issues/IOTDB-515
>>>>> 
>>>>>  Best,
>>>>>  -----------------------------------
>>>>>  Xiangdong Huang
>>>>>  School of Software, Tsinghua University
>>>>> 
>>>>>   黄向东
>>>>>  清华大学 软件学院
>>>>> 
>>>>> 
>>>>> 
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe, e-mail: [email protected]
>>>>> For additional commands, e-mail: [email protected]
>>>>> 
>>>> 
>>>> 
>>>> 
>> 
>> 
>> 
>> 
> 
> 

.........................................................
M. Sc. Philipp Zehnder
Wissenschaftlicher Mitarbeiter | Research Scientist
Information Process Engineering (IPE)
 
FZI Forschungszentrum Informatik
Haid-und-Neu-Str. 10–14 
76131 Karlsruhe, Germany
Tel.: +49 721 9654-805
Fax: +49 721 9654-806

[email protected] <mailto:[email protected]>
https://www.fzi.de/mitarbeiter/philipp-zehnder
 
.........................................................
FZI Forschungszentrum Informatik
Stiftung des bürgerlichen Rechts
Stiftung Az: 14-0563.1 Regierungspräsidium Karlsruhe
Vorstand: Prof. Dr. Andreas Oberweis, Jan Wiesenberger, Prof. Dr.-Ing. J. 
Marius Zöllner
Vorsitzender des Kuratoriums: Ministerialdirigent Günther Leßnerkraus
.........................................................
