apeforest commented on issue #16735: Use single-bit for mask in dropout operator

URL: https://github.com/apache/incubator-mxnet/pull/16735#issuecomment-584766500

@TaoLv Thanks for your review and suggestion. Using blocking does help:

```
[{'Dropout': [{'avg_time_Dropout': 2.9556092573329806,
               'p50_time_Dropout': 2.9536145739257336,
               'p90_time_Dropout': 2.9735330026596785,
               'p99_time_Dropout': 3.0410749791190033,
               'inputs': {'data': (1024, 1024)}}]}]
```

Here is my performance script:

```python
#!/usr/bin/python
import mxnet as mx
from mxnet import nd
from benchmark.opperf.utils.benchmark_utils import run_performance_test

mx.random.seed(17)
res = run_performance_test(nd.Dropout,
                           run_backward=True,
                           dtype='float32',
                           ctx=mx.cpu(),
                           inputs=[{"data": (1024, 1024)}],
                           warmup=20,
                           runs=100,
                           profiler='python')
print(res)
```

I am not in favor of adding an option. It would expose an internal implementation detail (whether a bit mask is used) to users, and it is counterintuitive (why would anyone opt out of the bit mask?). If sacrificing some performance can improve usability, I think that trade-off is worth considering.
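For readers following the discussion: the idea of a single-bit mask can be sketched in plain NumPy. This is a hypothetical illustration of the technique, not the PR's actual CPU kernel; the variable names (`keep`, `bits`) and the use of `np.packbits`/`np.unpackbits` are my own, chosen to show the 8x memory reduction over a one-byte-per-element boolean mask.

```python
# Illustrative sketch only (not the PR's kernel): pack a dropout keep-mask
# into single bits, then unpack it to apply dropout with inverted scaling.
import numpy as np

rng = np.random.default_rng(17)
p = 0.5                                    # dropout probability
data = rng.standard_normal((1024, 1024)).astype(np.float32)

keep = rng.random(data.shape) >= p         # boolean mask: 1 byte per element
bits = np.packbits(keep, axis=-1)          # bit mask: 1 bit per element

# Recover the boolean mask and apply dropout with the usual 1/(1-p) scaling.
unpacked = np.unpackbits(bits, axis=-1, count=data.shape[-1]).astype(bool)
out = np.where(unpacked, data / (1.0 - p), 0.0).astype(np.float32)

# The packed mask is 8x smaller, and the round trip is lossless.
assert bits.nbytes * 8 == keep.size
assert (unpacked == keep).all()
```

The usability question in the thread is exactly the pack/unpack step above: the bit mask saves memory and bandwidth, but any consumer of the mask must know it is bit-packed rather than one value per element.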