Hello all,

In the context of https://github.com/ghc-proposals/ghc-proposals/pull/500,
I am looking for a strategy to measure the impact of resolving module
dependencies with a lexical scan during the downsweep phase.
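
To make the idea concrete, here is a much-simplified, self-contained
sketch of such a scan: it collects the module prefixes of qualified
names by splitting the source into words. This is only an illustration
(the helper names are mine); the pass in the branch below uses a real
tokenizer rather than this naive word split.

  import Data.Char (isAlphaNum, isUpper)
  import Data.List (intercalate, nub)

  -- Collect candidate module prefixes of qualified names, e.g.
  -- "Data.Map.insert" yields "Data.Map".  Purely illustrative: it
  -- misses names hidden behind comments, CPP, string literals, etc.
  scanQualifiedNames :: String -> [String]
  scanQualifiedNames = nub . concatMap qualPrefix . words
    where
      qualPrefix w =
        case splitDots (takeWhile isIdentChar w) of
          parts@(_ : _ : _) | all startsUpper (init parts)
            -> [intercalate "." (init parts)]
          _ -> []
      isIdentChar c = isAlphaNum c || c `elem` "._'"
      startsUpper (c : _) = isUpper c
      startsUpper _       = False
      splitDots s = case break (== '.') s of
        (a, '.' : rest) -> a : splitDots rest
        (a, _)          -> [a]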

For example, in this branch:
https://gitlab.haskell.org/TristanCacqueray/ghc/-/tree/make-lexical-analysis
I've added a tokenizer pass to the GHC.Driver.Make.getPreprocessedImports
function. Running
`/usr/bin/time -o /dev/stdout -v _build/stage1/bin/ghc compiler/GHC/Tc/TyCl.hs 2> /dev/null`
I measure:

  LexicalAnalysis found 91 qualified names, and lasted: 150msec
  Elapsed (wall clock) time (h:mm:ss or m:ss): 0:00.72
  Maximum resident set size (kbytes): 142396

With a clean (unmodified) build I get:

  Elapsed (wall clock) time (h:mm:ss or m:ss): 0:00.68
  Maximum resident set size (kbytes): 140012
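
For anyone who wants to reproduce the in-process timing behind the
"lasted: 150msec" line, a minimal sketch looks like this (`timePass`
is just an illustrative name, but GHC.Clock.getMonotonicTimeNSec is a
real function from base >= 4.11):

  import GHC.Clock (getMonotonicTimeNSec)
  import Text.Printf (printf)

  -- Time an IO action and print its duration in milliseconds.
  -- Beware of laziness: if the action returns an unevaluated thunk,
  -- the work may be accounted to a later phase instead.
  timePass :: String -> IO a -> IO a
  timePass label act = do
    t0 <- getMonotonicTimeNSec
    r  <- act
    t1 <- getMonotonicTimeNSec
    printf "%s lasted: %dmsec\n" label ((t1 - t0) `div` 1000000)
    pure r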


Now my question is: how would you measure the time and space cost of
such a change? For example, what profiling tool would you recommend, and
how can I use a modified GHC to build GHC itself?
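
So far I have only used /usr/bin/time. I know the RTS can report
similar numbers itself when the program is run with +RTS -T, e.g. via
GHC.Stats from base:

  import GHC.Stats (RTSStats (elapsed_ns, max_live_bytes),
                    getRTSStats, getRTSStatsEnabled)

  -- Print wall-clock time and max live heap (a rough analogue of the
  -- resident-set numbers above).  Requires running with +RTS -T.
  reportRtsStats :: IO ()
  reportRtsStats = do
    enabled <- getRTSStatsEnabled
    if not enabled
      then putStrLn "run with +RTS -T to enable RTS stats"
      else do
        s <- getRTSStats
        putStrLn ("elapsed: " ++ show (elapsed_ns s `div` 1000000) ++ " msec")
        putStrLn ("max live: " ++ show (max_live_bytes s `div` 1024) ++ " kbytes")

but I am not sure that is the right tool for comparing two compiler
builds, hence the question.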

Thanks in advance,
-Tristan
