[ https://issues.apache.org/jira/browse/MADLIB-1338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nikhil updated MADLIB-1338:
---------------------------
    Description: 
The current `madlib_keras.fit()` code reports accuracy as the only metric, 
along with the loss value. But the compile params can ask for different metrics 
(`mae`, `binary_accuracy`, etc.), in which case `Keras.evaluate()` returns 
`loss` (by default) plus the requested metrics, e.g. `mean_absolute_error` or 
`binary_accuracy`. This JIRA requests support for reporting any one of these 
metrics in the output table.
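For illustration, a minimal sketch in plain Keras (not MADlib code; the toy model, data, and the `mae` metric are made up for the example, and exact metric names vary by Keras version) of how the values returned by `evaluate()` follow the metrics requested in the compile params:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Toy model; whatever metric is requested in compile() is what evaluate() reports.
model = Sequential([Dense(1, input_shape=(4,))])
model.compile(optimizer='adam', loss='mse', metrics=['mae'])

x = np.random.rand(8, 4)
y = np.random.rand(8, 1)

# evaluate() returns [loss, <metric values...>], aligned with model.metrics_names,
# e.g. ['loss', 'mean_absolute_error'] for the compile params above.
results = model.evaluate(x, y, verbose=0)
print(dict(zip(model.metrics_names, results)))
```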
Other requirements:
 1. Remove training loss/accuracy computation from `fit_transition` and instead 
use the evaluate function to calculate the training loss/metric; a rough sketch 
follows after this list. See PR 
[https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details.

2. The metric param can be optional.

3. Maybe we should rename all the related output columns to `metric` instead of 
`metrics`.
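A rough sketch of what requirement 1 could look like (a hypothetical helper, not the actual `fit_transition` or PR code; the function and variable names are illustrative only):

```python
def compute_loss_and_metric(model, x_train, y_train):
    """Compute training loss/metric via evaluate() instead of accumulating
    them inside fit_transition (illustrative sketch, not MADlib code)."""
    results = model.evaluate(x_train, y_train, verbose=0)
    # evaluate() returns the loss followed by the compiled metrics, ordered as
    # in model.metrics_names, e.g. ['loss', 'mean_absolute_error'].
    results = results if isinstance(results, list) else [results]
    loss = results[0]
    # If no metric was requested in the compile params (requirement 2: the
    # metric param is optional), only the loss is available.
    metric = results[1] if len(results) > 1 else None
    return loss, metric
```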

  was:
The current `madlib_keras.fit()` code reports accuracy as the only metric, 
along with the loss value. But we could ask for different metrics in compile params 
(`mae`, `binary_accuracy`, etc.), then `Keras.evaluate()` would return 
`loss` (by default) and `mean_absolute_error` or `binary_accuracy` (metrics).
This JIRA requests support for reporting any one of these metrics in the 
output table.
Other requirements:
1. Remove loss/accuracy computation from `fit_transition`.


> DL: Add support for reporting various metrics in fit/evaluate
> -------------------------------------------------------------
>
>                 Key: MADLIB-1338
>                 URL: https://issues.apache.org/jira/browse/MADLIB-1338
>             Project: Apache MADlib
>          Issue Type: New Feature
>          Components: Deep Learning
>            Reporter: Nandish Jayaram
>            Priority: Major
>             Fix For: v1.16
>
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
