juliusshufan opened a new pull request #9805: Enable reporting of the cross-entropy or NLL loss value when training CNN networks using the models defined by example/image-classification
URL: https://github.com/apache/incubator-mxnet/pull/9805
 
 
   ## Description ##
   MXNet already implements loss computation at the Python layer (in python/mxnet/metric.py).
   For most CNN models used for image classification, softmax is the common output layer, so the cross-entropy or negative log-likelihood loss value is helpful for monitoring the convergence trend during training.
   example/image-classification/common/fit.py already provides an extensible mechanism for reporting useful information during training, such as accuracy. This PR builds on that design and enables reporting of the cross-entropy or NLL loss value during training as well.
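   For reference, the metric classes in python/mxnet/metric.py compose the same way fit.py assembles its accuracy metrics. A minimal sketch (assuming the 'ce' alias registered for mx.metric.CrossEntropy in metric.py; this is illustrative, not the exact code of this PR):

   ```python
   import mxnet as mx

   # Compose evaluation metrics the same way fit.py builds its metric list.
   eval_metrics = mx.metric.CompositeEvalMetric()
   eval_metrics.add(mx.metric.Accuracy())
   eval_metrics.add(mx.metric.TopKAccuracy(top_k=5))
   eval_metrics.add(mx.metric.create('ce'))  # 'ce' resolves to mx.metric.CrossEntropy
   ```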
   
   My understanding is that cross-entropy loss and negative log-likelihood loss are the losses most commonly used with a softmax output, so I chose to report either cross-entropy or negative log-likelihood loss whenever the output layer is a softmax.
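   For a softmax output with one-hot labels, cross-entropy reduces to the negative log-likelihood of the true class, which is why either metric is a natural fit here. A quick NumPy illustration (not code from this PR):

   ```python
   import numpy as np

   probs = np.array([[0.7, 0.2, 0.1],
                     [0.1, 0.8, 0.1]])  # softmax outputs for two samples
   labels = np.array([0, 1])            # true class indices

   # Cross-entropy against one-hot labels picks out -log(p[true class]),
   # i.e. the per-sample negative log-likelihood.
   nll = -np.log(probs[np.arange(len(labels)), labels])
   print(nll.mean())  # ~0.29, the value a cross-entropy metric would report
   ```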
   
   This PR adds a string-type argument, `--loss`, and the corresponding type of loss is reported accordingly. _(If the user wants to report more than one type of loss value, the argument can be a comma-separated list, such as 'ce, nll'.)_
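   One way the comma-separated form could be handled (a sketch only, not necessarily the exact code in this PR; it assumes 'ce' and 'nll_loss' are the aliases registered in mxnet.metric):

   ```python
   import argparse
   import mxnet as mx

   parser = argparse.ArgumentParser()
   parser.add_argument('--loss', type=str, default='',
                       help="loss metric(s) to report, e.g. 'ce' or 'ce,nll_loss'")
   args = parser.parse_args(['--loss', 'ce,nll_loss'])

   # Map each requested name to a registered metric class
   # (assumes the names are registered aliases in mxnet.metric).
   loss_metrics = [mx.metric.create(name.strip())
                   for name in args.loss.split(',') if name.strip()]
   ```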
   
   ## Checklist ##
   ### Essentials ###
   - [ ] Passed code style checking (`make lint`)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, the source of the dataset, expected performance on the test set, and a reference to the original paper if applicable
   - [ ] To my best knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change
   
   ### Changes ###
   All the code changes are in example/image-classification/common/fit.py.
   A new argument `--loss` is introduced; it can be set to either a single string or a comma-separated list naming the type(s) of loss value to report.
   Taking example/image-classification/train_cifar10.py as an example, the cross-entropy loss value will be reported once the highlighted line below is added:

   ```python
   parser.set_defaults(
       ......
       loss='ce',  # <-- the added line
       ......
   )
   ```
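   Since `--loss` goes through the common argument parser, the same effect should also be reachable from the command line without editing the defaults, e.g. `python train_cifar10.py --loss ce`.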
   
   The output log then follows this pattern (the cross-entropy entries are the new output):

   ```
   INFO:root:Epoch[0] Batch [380] Speed: 504.42 samples/sec accuracy=0.087109 top_k_accuracy_5=0.469531 cross-entropy=2.305971
   INFO:root:Epoch[0] Train-accuracy=0.098437
   INFO:root:Epoch[0] Train-top_k_accuracy_5=0.506250
   INFO:root:Epoch[0] Train-cross-entropy=2.304529
   INFO:root:Epoch[0] Time cost=100.989
   INFO:root:Epoch[0] Validation-accuracy=0.098101
   INFO:root:Epoch[0] Validation-top_k_accuracy_5=0.486946
   INFO:root:Epoch[0] Validation-cross-entropy=2.302873
   ```
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
   
