For the stream schema model, you could refer to:

stream {
    name = "MonitoredStream"
    executor = "MonitoredStream"
    attributes = [
        {
            name = "value",
            type = "double",
            // more attribute properties
        },
        // more attribute definitions
    ]
}
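
Each event on that stream is then just a JSON object whose fields match the
declared attributes. As a rough sketch (the topic name and broker address
below are made up for illustration, not anything Eagle prescribes), you could
push one conforming event into kafka with the stock console producer:

# Publish one event matching the "MonitoredStream" schema above; any extra
# fields would follow whatever other attributes the schema declares.
echo '{"value": 0.95}' | \
  kafka-console-producer.sh --broker-list localhost:9092 --topic monitored_stream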

For the backend service entities, you could refer to:

## AlertStreamService: alert streams generated from data source
echo ""
echo "Importing AlertStreamService for HDFS... "
curl -u ${EAGLE_SERVICE_USER}:${EAGLE_SERVICE_PASSWD} -X POST \
  -H 'Content-Type:application/json' \
  "http://${EAGLE_SERVICE_HOST}:${EAGLE_SERVICE_PORT}/eagle-service/rest/entities?serviceName=AlertStreamService" \
  -d '[{"prefix":"alertStream","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream"},"desc":"alert event stream from hdfs audit log"}]'

## AlertExecutorService: which alert streams are consumed by the alert executor
echo ""
echo "Importing AlertExecutorService for HDFS... "
curl -u ${EAGLE_SERVICE_USER}:${EAGLE_SERVICE_PASSWD} -X POST \
  -H 'Content-Type:application/json' \
  "http://${EAGLE_SERVICE_HOST}:${EAGLE_SERVICE_PORT}/eagle-service/rest/entities?serviceName=AlertExecutorService" \
  -d '[{"prefix":"alertExecutor","tags":{"dataSource":"hdfsAuditLog","alertExecutorId":"hdfsAuditLogAlertExecutor","streamName":"hdfsAuditLogEventStream"},"desc":"alert executor for hdfs audit log event stream"}]'

## AlertStreamSchemaService: schema for events on the alert stream
echo ""
echo "Importing AlertStreamSchemaService for HDFS... "
curl -u ${EAGLE_SERVICE_USER}:${EAGLE_SERVICE_PASSWD} -X POST \
  -H 'Content-Type:application/json' \
  "http://${EAGLE_SERVICE_HOST}:${EAGLE_SERVICE_PORT}/eagle-service/rest/entities?serviceName=AlertStreamSchemaService" \
  -d '[
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"src"},"attrDescription":"source directory or file, such as /tmp","attrType":"string","category":"","attrValueResolver":"eagle.service.security.hdfs.resolver.HDFSResourceResolver"},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"dst"},"attrDescription":"destination directory, such as /tmp","attrType":"string","category":"","attrValueResolver":"eagle.service.security.hdfs.resolver.HDFSResourceResolver"},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"host"},"attrDescription":"hostname, such as localhost","attrType":"string","category":"","attrValueResolver":""},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"timestamp"},"attrDescription":"milliseconds of the datetime","attrType":"long","category":"","attrValueResolver":""},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"allowed"},"attrDescription":"true, false or none","attrType":"bool","category":"","attrValueResolver":""},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"user"},"attrDescription":"process user","attrType":"string","category":"","attrValueResolver":""},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"cmd"},"attrDescription":"file/directory operation, such as getfileinfo, open, listStatus and so on","attrType":"string","category":"","attrValueResolver":"eagle.service.security.hdfs.resolver.HDFSCommandResolver"},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"sensitivityType"},"attrDescription":"mark such as AUDITLOG, SECURITYLOG","attrType":"string","category":"","attrValueResolver":"eagle.service.security.hdfs.resolver.HDFSSensitivityTypeResolver"},
    {"prefix":"alertStreamSchema","tags":{"dataSource":"hdfsAuditLog","streamName":"hdfsAuditLogEventStream","attrName":"securityZone"},"attrDescription":"","attrType":"string","category":"","attrValueResolver":""}
  ]'
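
The script assumes the eagle-service connection settings are exported
beforehand. A minimal sketch of that setup, plus a read-back query to verify
the import (the values are placeholders, and the generic rest/list query
endpoint is my assumption from Eagle's entity API, so please double-check it
against your build):

# Placeholder Eagle service connection settings; adjust host, port and
# credentials to your own deployment before running the import script.
export EAGLE_SERVICE_HOST=localhost
export EAGLE_SERVICE_PORT=9099
export EAGLE_SERVICE_USER=admin
export EAGLE_SERVICE_PASSWD=secret

# Read back the imported schema entities to verify the import; -g stops
# curl from globbing the []/{} characters in the query expression.
curl -g -u ${EAGLE_SERVICE_USER}:${EAGLE_SERVICE_PASSWD} \
  "http://${EAGLE_SERVICE_HOST}:${EAGLE_SERVICE_PORT}/eagle-service/rest/list?query=AlertStreamSchemaService[@dataSource=\"hdfsAuditLog\"]{*}&pageSize=100"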

Regards,
Hao


On Tue, Nov 10, 2015 at 4:29 PM, 蒋吉麟 <smith3...@gmail.com> wrote:

> Agreed. Using the API to create a stream schema is not user friendly. If
> you're not familiar with the entity creation structure, it's not easy to use.
> I think we can design the UI to make adding a stream schema simple. :)
>
> 2015-11-10 16:06 GMT+08:00 Hao Chen <h...@apache.org>:
>
> > *Jira*
> >
> > https://issues.apache.org/jira/browse/EAGLE-5
> >
> > *Use Cases*
> >
> > Currently Eagle supports a very complex data processing pipeline for hadoop
> > audit/security logs, but I think we could reuse some valuable components of
> > Eagle:
> >
> > 1) the distributed policy engine
> >
> > 2) the highly abstracted streaming program API
> >
> > 3) the user-friendly policy & alert management UI
> >
> > for more general cases, like what traditional monitoring products do.
> >
> > *Use Case One:* For example, ops teams such as DBA, hardware, or cloud
> > teams often assume that the monitoring data format is known, e.g. typical
> > time series data points. In such a case, the user just needs to tell Eagle
> > the stream schema and preprocess the data into kafka with an external
> > program such as scripts or agents, and Eagle could provide a generic
> > topology to monitor the stream without any programming, just like most
> > traditional monitoring products' paradigm.
> >
> > *Use Case Two:* Some advanced users with development skills may want to
> > easily use the Eagle streaming program API to process complex monitoring
> > data such as complex logs, connect to Eagle's metadata engine to manage
> > policies in the UI, and execute the policies in the eagle distributed
> > policy engine in real time.
> >
> > *Design*
> >
> > So we need to do the following work:
> >
> > 1. Implement a generic pipeline topology as a starting point, like: Kafka ->
> > EventParser(JSON) -> Metadata Manager -> PolicyEngine, which could be reused
> > for lots of simple use cases like metrics monitoring.
> >
> > 2. Allow users to import or design a stream schema in the UI. Today we
> > assume the stream schema is already defined in the database, but for the
> > most general cases we should allow the stream schema to be defined with an
> > eagle tool such as the UI, for a more generic purpose.
> >
> > 3. For advanced users like developers, we should make our streaming
> > framework easier to use. One of the most critical pain points is that
> > developers have to define the stream schema in the database and write code
> > separately. In fact, for most cases like hadoop security monitoring the
> > schema will never change independently; in such cases we should be able to
> > define the stream schema in code, and even define policies inline as well,
> > so we could run the eagle monitoring engine without the metadata store
> > (hbase).
> >
> > Regards,
> > Hao Chen
> >
>
