First things first: I am new to Nim, and loving it so far :) With that being 
said...

I think it's a shame to take the attitude that download size doesn't matter. 
Around 63% of the world's population accesses the internet 
(<https://www.statista.com/statistics/617136/digital-population-worldwide/>), 
but the vast majority of those users have comparatively slow connections 
(<https://www.speedtest.net/global-index#mobile>). Bloated websites are a 
modern plague, and they disproportionately impact the poorer nations of the 
world, which in fact account for the majority of people globally.

Internet energy consumption is a serious issue. Bitcoin gets all the bad 
press, but every kB transferred requires energy, and the more data we 
transfer, the more fossil fuels are burned. This is an existential issue, and 
if we're not part of the solution, we're part of the problem.

Every kB counts!

Now, I can understand that minification isn't a priority for Nim. But I wonder 
if there could be a 'recommended' approach that allows Nim's JavaScript output 
to play well with best-in-class JS minification tools?

I've been fiddling with some post-compile processing. Running Nim's output JS 
files through `google-closure-compiler --compilation_level 
ADVANCED_OPTIMIZATIONS` followed by `babel-minify` seems to produce useful 
results.
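
For concreteness, here's a minimal sketch of that pipeline as a NimScript task 
in `config.nims`. The file names are placeholders, it assumes the npm packages 
`google-closure-compiler` and `babel-minify` (whose CLI binary is `minify`) 
are installed, and the exact flags may need adjusting for your setup:

```nim
# config.nims -- minify task: compile to JS, then post-process.
# File names are placeholders; adjust to your project.
task minify, "Compile app.nim to JS and minify the output":
  # Let Nim emit release-mode JS first.
  exec "nim js -d:release -o:app.js app.nim"
  # Aggressive whole-program minification via Closure Compiler.
  exec "google-closure-compiler --compilation_level ADVANCED_OPTIMIZATIONS " &
       "--js app.js --js_output_file app.cc.js"
  # A second pass with babel-minify sometimes shaves off a little more.
  exec "minify app.cc.js --out-file app.min.js"
```

With that in place, `nim minify` runs the whole chain in one step.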

It would be encouraging to see the Nim project pick this up in some way. Like 
I said, I don't expect Nim to focus on minifying its JS output. But perhaps a 
set of post-processing steps (or something similar) could be added to the 
standard tests, to ensure that minified output keeps working as expected? (A 
rough sketch follows below.) Perhaps this is something for 'the community', 
but to be frank, I believe in 'being the change you want to see in the world', 
and it would be great to see the Nim team actively engage with the ridiculous 
amount of bloat on the web.
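
As a sketch of the kind of check I mean (again with hypothetical file names, 
and assuming `node` and `diff` are available), a task that verifies the 
minified bundle still behaves like the unminified one:

```nim
# config.nims -- smoke test: minified output must match the original.
task testmin, "Check that the minified JS behaves like the unminified JS":
  exec "node app.js > expected.txt"
  exec "node app.min.js > actual.txt"
  # Fails (non-zero exit) if the two runs diverge.
  exec "diff expected.txt actual.txt"
```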
