[jira] [Updated] (MADLIB-1338) DL: Add support for reporting various metrics in fit/evaluate
[ https://issues.apache.org/jira/browse/MADLIB-1338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nikhil updated MADLIB-1338:
---------------------------
    Description: 
The current `madlib_keras.fit()` code reports accuracy as the only metric, along with the loss value. But we could ask for different metrics in compile params (`mae`, `binary_accuracy`, etc.), and then `Keras.evaluate()` would return `loss` (by default) and `mean_absolute_error` or `binary_accuracy` (metrics).

This JIRA requests support for reporting any one of these metrics in the output table.

Other requirements:
1. Remove the training loss/accuracy computation from `fit_transition` and instead use the evaluate function to calculate the training loss/metric. See PR [https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details.
2. The metric param can be optional.
3. Maybe we should rename all the related output columns to metric instead of metrics.

  was:
The current `madlib_keras.fit()` code reports accuracy as the only metric, along with loss value. But we could ask for different metrics in compile params (`mae`, `binary_accuracy` etc.), then `Keras.evaluate()` would return back `loss` (by default) and `mean_absolute_error` or `binary_accuracy` (metrics).

This JIRA requests support to be able to report any one of these metrics in the output table.

Other requirements:
1. Remove training loss/accuracy computation from `fit_transition` and instead use the evaluate function to calculate the training loss/metric. See PR [https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details
2. metric param can be optional
3. Maybe we should rename al the related output column as metric instead of metrics


> DL: Add support for reporting various metrics in fit/evaluate
> --------------------------------------------------------------
>
>                 Key: MADLIB-1338
>                 URL: https://issues.apache.org/jira/browse/MADLIB-1338
>             Project: Apache MADlib
>          Issue Type: New Feature
>          Components: Deep Learning
>            Reporter: Nandish Jayaram
>            Priority: Major
>             Fix For: v1.16
>
>
> The current `madlib_keras.fit()` code reports accuracy as the only metric,
> along with the loss value. But we could ask for different metrics in compile
> params (`mae`, `binary_accuracy`, etc.), and then `Keras.evaluate()` would
> return `loss` (by default) and `mean_absolute_error` or `binary_accuracy`
> (metrics).
> This JIRA requests support for reporting any one of these metrics in the
> output table.
> Other requirements:
> 1. Remove the training loss/accuracy computation from `fit_transition` and
> instead use the evaluate function to calculate the training loss/metric.
> See PR [https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details.
> 2. The metric param can be optional.
> 3. Maybe we should rename all the related output columns to metric instead
> of metrics.
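A minimal sketch of the Keras behaviour the description relies on (toy model and random data, purely illustrative; tf.keras shown, though standalone Keras behaves the same): when a metric is requested in the compile params, evaluate() returns the loss followed by that metric, which is what lets the evaluate function replace the loss/metric computation currently done in `fit_transition`.

    import numpy as np
    from tensorflow import keras  # standalone `import keras` behaves the same

    # Toy regression model; layer sizes and data are illustrative only.
    model = keras.models.Sequential([keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer='sgd', loss='mse', metrics=['mae'])

    x = np.random.rand(32, 4).astype('float32')
    y = np.random.rand(32, 1).astype('float32')
    model.fit(x, y, epochs=1, verbose=0)

    # With metrics=['mae'] compiled in, evaluate() returns [loss, mae];
    # with no metrics it returns just the scalar loss.
    loss, mae = model.evaluate(x, y, verbose=0)
    print(loss, mae)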
[jira] [Updated] (MADLIB-1338) DL: Add support for reporting various metrics in fit/evaluate
[ https://issues.apache.org/jira/browse/MADLIB-1338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nikhil updated MADLIB-1338:
---------------------------
    Description: 
The current `madlib_keras.fit()` code reports accuracy as the only metric, along with the loss value. But we could ask for different metrics in compile params (`mae`, `binary_accuracy`, etc.), and then `Keras.evaluate()` would return `loss` (by default) and `mean_absolute_error` or `binary_accuracy` (metrics).

This JIRA requests support for reporting any one of these metrics in the output table.

Other requirements:
1. Remove the training loss/accuracy computation from `fit_transition` and instead use the evaluate function to calculate the training loss/metric. See PR [https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details.
2. The metric param can be optional.
3. Maybe we should rename all the related output columns to metric instead of metrics.

  was:
The current `madlib_keras.fit()` code reports accuracy as the only metric, along with loss value. But we could ask for different metrics in compile params (`mae`, `binary_accuracy` etc.), then `Keras.evaluate()` would return back `loss` (by default) and `mean_absolute_error` or `binary_accuracy` (metrics).

This JIRA requests support to be able to report any one of these metrics in the output table.

Other requirements:
1. Remove loss/accuracy computation from `fit_transition`.


> DL: Add support for reporting various metrics in fit/evaluate
> --------------------------------------------------------------
>
>                 Key: MADLIB-1338
>                 URL: https://issues.apache.org/jira/browse/MADLIB-1338
>             Project: Apache MADlib
>          Issue Type: New Feature
>          Components: Deep Learning
>            Reporter: Nandish Jayaram
>            Priority: Major
>             Fix For: v1.16
>
>
> The current `madlib_keras.fit()` code reports accuracy as the only metric,
> along with the loss value. But we could ask for different metrics in compile
> params (`mae`, `binary_accuracy`, etc.), and then `Keras.evaluate()` would
> return `loss` (by default) and `mean_absolute_error` or `binary_accuracy`
> (metrics).
> This JIRA requests support for reporting any one of these metrics in the
> output table.
> Other requirements:
> 1. Remove the training loss/accuracy computation from `fit_transition` and
> instead use the evaluate function to calculate the training loss/metric.
> See PR [https://github.com/apache/madlib/pull/388|https://github.com/apache/madlib/pull/388/files] for more details.
> 2. The metric param can be optional.
> 3. Maybe we should rename all the related output columns to metric instead
> of metrics.
[jira] [Updated] (MADLIB-1338) DL: Add support for reporting various metrics in fit/evaluate
[ https://issues.apache.org/jira/browse/MADLIB-1338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nandish Jayaram updated MADLIB-1338:
------------------------------------
    Description: 
The current `madlib_keras.fit()` code reports accuracy as the only metric, along with the loss value. But we could ask for different metrics in compile params (`mae`, `binary_accuracy`, etc.), and then `Keras.evaluate()` would return `loss` (by default) and `mean_absolute_error` or `binary_accuracy` (metrics).

This JIRA requests support for reporting any one of these metrics in the output table.

Other requirements:
1. Remove loss/accuracy computation from `fit_transition`.

  was:
The current {{madlib_keras.fit()}} code reports accuracy as the only metric, along with loss value. But we could ask for different metrics in compile params ({{mae, binary_accuracy}} etc.), then {{Keras.evaluate()}} would return back {{loss}} (by default) and {{mean_absolute_error}} or {{binary_accuracy}} (metrics).

This JIRA requests support to report all of these metrics in the output table.

Other requirements:
Output summary table must have the metrics' labels (instead of just accuracy)
Remove loss/accuracy computation from fit_transition.


> DL: Add support for reporting various metrics in fit/evaluate
> --------------------------------------------------------------
>
>                 Key: MADLIB-1338
>                 URL: https://issues.apache.org/jira/browse/MADLIB-1338
>             Project: Apache MADlib
>          Issue Type: New Feature
>          Components: Deep Learning
>            Reporter: Nandish Jayaram
>            Priority: Major
>             Fix For: v1.16
>
>
> The current `madlib_keras.fit()` code reports accuracy as the only metric,
> along with the loss value. But we could ask for different metrics in compile
> params (`mae`, `binary_accuracy`, etc.), and then `Keras.evaluate()` would
> return `loss` (by default) and `mean_absolute_error` or `binary_accuracy`
> (metrics).
> This JIRA requests support for reporting any one of these metrics in the
> output table.
> Other requirements:
> 1. Remove loss/accuracy computation from `fit_transition`.
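The metrics' labels requirement mentioned above maps onto Keras's metrics_names attribute, which pairs names with the values evaluate() returns. A small sketch under the same assumptions as before (toy model, illustrative data; the exact label strings, e.g. 'mean_absolute_error' vs 'mae', vary across Keras versions):

    import numpy as np
    from tensorflow import keras  # standalone Keras exposes the same attribute

    model = keras.models.Sequential([keras.layers.Dense(1, input_shape=(3,))])
    model.compile(optimizer='sgd', loss='mse', metrics=['mae'])

    x = np.zeros((8, 3), dtype='float32')
    y = np.zeros((8, 1), dtype='float32')

    # metrics_names is populated once the model has been trained/evaluated;
    # it gives the labels (e.g. ['loss', 'mae']) that a summary table could
    # record instead of a hard-coded "accuracy".
    results = model.evaluate(x, y, verbose=0)
    print(dict(zip(model.metrics_names, results)))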
[jira] [Updated] (MADLIB-1338) DL: Add support for reporting various metrics in fit/evaluate
[ https://issues.apache.org/jira/browse/MADLIB-1338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Orhan Kislal updated MADLIB-1338:
---------------------------------
    Description: 
The current {{madlib_keras.fit()}} code reports accuracy as the only metric, along with the loss value. But we could ask for different metrics in compile params ({{mae, binary_accuracy}}, etc.), and then {{Keras.evaluate()}} would return {{loss}} (by default) and {{mean_absolute_error}} or {{binary_accuracy}} (metrics).

This JIRA requests support to report all of these metrics in the output table.

Other requirements:
Output summary table must have the metrics' labels (instead of just accuracy)
Remove loss/accuracy computation from fit_transition.

  was:
The current {{madlib_keras.fit()}} code reports accuracy as the only metric, along with loss value. But we could ask for multiple metrics in compile params (e.g., {{metrics=['mae','accuracy']}}), then {{Keras.evaluate()}} would return {{loss}} (by default), {{mean_absolute_error}} and {{accuracy}} (metrics).

This JIRA requests support to report all of these metrics in the output table.

Other requirements:
1. Output summary table must have a 2-D array to report {{metrics}}. The inner dimension corresponds to all metric values for the iteration at which they are computed.
2. Output summary table must have the metrics' labels (e.g., [mean_absolute_error, accuracy])

        Summary: DL: Add support for reporting various metrics in fit/evaluate  (was: DL: Add support for reporting multiple metrics in fit/evaluate)


> DL: Add support for reporting various metrics in fit/evaluate
> --------------------------------------------------------------
>
>                 Key: MADLIB-1338
>                 URL: https://issues.apache.org/jira/browse/MADLIB-1338
>             Project: Apache MADlib
>          Issue Type: New Feature
>          Components: Deep Learning
>            Reporter: Nandish Jayaram
>            Priority: Major
>             Fix For: v1.16
>
>
> The current {{madlib_keras.fit()}} code reports accuracy as the only metric,
> along with the loss value. But we could ask for different metrics in compile
> params ({{mae, binary_accuracy}}, etc.), and then {{Keras.evaluate()}} would
> return {{loss}} (by default) and {{mean_absolute_error}} or
> {{binary_accuracy}} (metrics).
> This JIRA requests support to report all of these metrics in the output table.
> Other requirements:
> Output summary table must have the metrics' labels (instead of just accuracy)
> Remove loss/accuracy computation from fit_transition.


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)