cbalint13 commented on code in PR #18545:
URL: https://github.com/apache/tvm/pull/18545#discussion_r2594790744


##########
docs/how_to/tutorials/e2e_opt_model.py:
##########
@@ -95,13 +95,38 @@
 # leverage MetaSchedule to tune the model and store the tuning logs to the database. We also
 # apply the database to the model to get the best performance.
 #
+# The ResNet18 model will be divided into 20 independent tuning tasks during compilation.
+# To ensure each task receives adequate tuning resources in one iteration while providing
+# early feedback:
+#
+# - To quickly observe tuning progress, each task is allocated a maximum of 16 trials per
+#   iteration (controlled by ``MAX_TRIALS_PER_TASK=16``). We should set ``TOTAL_TRIALS``
+#   to at least ``320 (20 tasks * 16 trials)`` ensures every task receives one full iteration

Review Comment:
   * I am not sure that a demanding, imperative statement like "We should" has its place here.
   * I would remove "We should set TOTAL_TRIALS to at least ``320 (20 tasks * 16 trials)`` ensures every task receives one full iteration of tuning." I believe it can mislead readers about the way MetaSchedule really works internally. The main goal is to let the MetaSchedule explorer unfold as much of its exploration space as possible (it does not matter much whether one parameter evenly divides the other).
   
   Something like "We set it here to a lower value of 512 for quick demonstration purposes" or similar is fair enough. There is already the "# Change to 20000 for better performance if needed" comment, which also hints to the user how to really get good results beyond the scope of this "quick" tutorial.
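   
   For reference, a rough sketch of how such a demonstration-sized budget would be wired into the tutorial's tuning call (based on the existing ``static_shape_tuning`` pipeline; the ``max_trials_per_task`` knob is only my assumption of what this PR introduces):
   
   ```python
   import tvm
   from tvm import relax

   TOTAL_TRIALS = 512  # small budget for a quick demonstration; change to 20000 for better performance
   MAX_TRIALS_PER_TASK = 16  # assumed new knob from this PR: per-task trial cap per iteration

   target = tvm.target.Target("nvidia/geforce-rtx-3090-ti")  # change to your target device

   # `mod` is the Relax IRModule exported earlier in the tutorial.
   mod = relax.get_pipeline(
       "static_shape_tuning",
       target=target,
       work_dir="tuning_logs",
       total_trials=TOTAL_TRIALS,
       # max_trials_per_task=MAX_TRIALS_PER_TASK,  # hypothetical argument, if the PR exposes it here
   )(mod)
   ```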
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

