A late reply, as I was hoping to dive into this a bit deeper before
responding. Due to lack of time, some high-level feedback must suffice: this
looks awesome!

I completely agree with your observation that there is a gap between
developing prototypes, e.g. in Python, and bringing them into production --
not only in deep learning, but in data science in general. And I also think
that Nim's feature set would be perfect to fill this gap.

A quick question on using statically typed tensors: I assume this implies
that the topology of a network cannot be dynamic at all? I'm wondering
whether there are good workarounds for situations where dynamic network
topologies are required, for instance when a search procedure iteratively
varies the number of hidden-layer nodes and picks the best model variant
(see the sketch below). Are dynamically typed tensors an option, or would
that defeat the design / performance?
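
To make the scenario concrete, here is a minimal Python/numpy sketch of what
I mean by a dynamic topology (the toy data, the train_mlp helper, and all
hyperparameters are made up for illustration): the hidden width is only
known at runtime, inside the search loop, so the weight shapes cannot be
fixed ahead of time.

    import numpy as np

    rng = np.random.default_rng(0)

    # Made-up toy data: 100 samples, 4 features, scalar target.
    X = rng.normal(size=(100, 4))
    y = rng.normal(size=(100, 1))

    def train_mlp(n_hidden: int, epochs: int = 200, lr: float = 0.01) -> float:
        # One-hidden-layer MLP; the weight shapes depend on the runtime
        # value of n_hidden -- this is the "dynamic topology" in question.
        W1 = rng.normal(scale=0.1, size=(X.shape[1], n_hidden))
        W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
        for _ in range(epochs):
            h = np.tanh(X @ W1)                 # forward pass
            err = h @ W2 - y
            gW2 = h.T @ err / len(X)            # backward pass,
            gh = (err @ W2.T) * (1.0 - h * h)   # plain gradient descent
            W1 -= lr * (X.T @ gh / len(X))
            W2 -= lr * gW2
        # Return the final training MSE for this width.
        return float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))

    # Pick the hidden width that yields the lowest training loss.
    best = min(range(2, 34, 4), key=train_mlp)
    print("best hidden width:", best)

With shapes baked into the tensor type, each value of n_hidden here would
presumably need its own instantiation -- hence the question.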
