As a data scientist, I feel that Nim has tremendous potential for data science,
machine learning and deep learning.
In particular, it's currently non-trivial to bridge the gap between deep
learning research (mostly Python and sometimes Lua) and production (C for
embedded devices, JavaScript for the web).
You might be interested in my autograd library:
[nim-rmad](https://github.com/mratsim/nim-rmad).
It automatically computes the gradient of any function with regard to any of
its inputs. It uses reverse-mode auto-differentiation behind the scenes.
```nim
let ctx = newContext[float32]()
```
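In case it helps to see the technique rather than the library: below is a
minimal, self-contained sketch of tape-based reverse-mode autodiff. This is
*not* nim-rmad's API (only `newContext` above comes from the library); every
name in the sketch is illustrative.

```nim
type
  Node = object
    parents: array[2, int]    # tape indices of the inputs
    weights: array[2, float]  # local partial derivatives w.r.t. each input
    nParents: int
  Tape = ref object
    nodes: seq[Node]
  Var = object
    tape: Tape
    idx: int
    value: float

proc variable(t: Tape, value: float): Var =
  ## Register a leaf variable on the tape.
  t.nodes.add Node(nParents: 0)
  Var(tape: t, idx: t.nodes.high, value: value)

proc binaryOp(a, b: Var, value, da, db: float): Var =
  ## Record one operation: its inputs and local derivatives.
  a.tape.nodes.add Node(parents: [a.idx, b.idx], weights: [da, db], nParents: 2)
  Var(tape: a.tape, idx: a.tape.nodes.high, value: value)

proc `+`(a, b: Var): Var = binaryOp(a, b, a.value + b.value, 1.0, 1.0)
proc `*`(a, b: Var): Var = binaryOp(a, b, a.value * b.value, b.value, a.value)

proc grad(y: Var): seq[float] =
  ## Backward sweep: seed dy/dy = 1 and push gradients to the parents.
  result = newSeq[float](y.tape.nodes.len)
  result[y.idx] = 1.0
  for i in countdown(y.idx, 0):
    for p in 0 ..< y.tape.nodes[i].nParents:
      result[y.tape.nodes[i].parents[p]] += y.tape.nodes[i].weights[p] * result[i]

let tape = Tape()
let x = tape.variable(3.0)
let y = tape.variable(4.0)
let z = x * y + x             # z = x*y + x
let g = z.grad()
echo g[x.idx], " ", g[y.idx]  # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```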
Hi Mratsim,
I have seen nim-rmad; however, in my case the requirement is somewhat
different. The derivative calculation should not slow down or change the
interface of the original function. Hence the macro approach, which generates
a separate set of functions to compute the derivatives.
Well, not that I know of. You should know that a dot expression can be a
function call, but it can also be an access to an object member. I think you
have to write your own normalization function.
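To illustrate that last point, here is a rough sketch (my own, not from the
thread) of such a normalization pass: it rewrites method-call syntax into
plain calls so a later pass only has to handle one call shape, and it leaves
bare dot expressions alone because they may be field accesses.

```nim
import std/macros

# Rewrite method-call syntax `a.f(b)` into the plain call `f(a, b)`.
# Bare dot expressions like `a.x` are left untouched, since those may
# be field accesses rather than calls.
proc normalizeCalls(n: NimNode): NimNode =
  result = copyNimNode(n)          # shallow copy, children added below
  for child in n:
    result.add normalizeCalls(child)
  if result.kind == nnkCall and result[0].kind == nnkDotExpr:
    let dot = result[0]
    var call = newCall(dot[1], dot[0])  # f(a)
    for i in 1 ..< result.len:
      call.add result[i]                # f(a, b, ...)
    result = call

# Compile-time demo: prints `f(a, b) + a.x`.
macro showNormalized(body: untyped): untyped =
  echo normalizeCalls(body).repr
  result = newStmtList()

showNormalized(a.f(b) + a.x)
```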
Hi, I am working on a macro that should generate a derivative calculation:
for a function

```nim
proc f1(x, y: float): float {.noSideEffect.} =
```

it generates

```nim
proc f1_aad(x, y: float): float
```

which computes the derivative of f1 at the point (x, y).
The difficulty I am having is that
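(Not the poster's code, just a sketch of the shape such a macro could take.
It cheats: instead of true adjoint/AAD code generation it falls back to
central finite differences, it is hard-coded to two `float` parameters, and
the generated proc returns both partials as a tuple rather than the single
`float` above. All names besides `f1` are illustrative.)

```nim
import std/macros

# Given a proc `f`, generate `f_aad` returning the partial
# derivatives of f at (x, y) via central finite differences.
# A real AAD macro would instead walk the proc body's AST and
# emit analytic derivative statements.
macro genDeriv(fn: typed): untyped =
  let name = ident($fn & "_aad")
  let x = ident"x"
  let y = ident"y"
  result = quote do:
    proc `name`(`x`, `y`: float): (float, float) =
      const h = 1e-6
      let dx = (`fn`(`x` + h, `y`) - `fn`(`x` - h, `y`)) / (2 * h)
      let dy = (`fn`(`x`, `y` + h) - `fn`(`x`, `y` - h)) / (2 * h)
      (dx, dy)

proc f1(x, y: float): float {.noSideEffect.} =
  x * x * y + y          # example body

genDeriv(f1)             # defines f1_aad(x, y)

echo f1_aad(2.0, 3.0)    # approximately (12.0, 5.0)
```

A real AAD macro would walk the proc body's AST instead, which is where the
dot-expression normalization discussed above comes in.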
f32 (and even f16) is critical for fast computation in deep learning,
especially on GPUs, just as it is for games. f32 is more than enough
precision for the majority of data (images, sound, text, financial data).
However, from a library writer's point of view, declaring a type/data
structure/function signature
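For readers wondering what such declarations look like, here is a small
illustrative sketch (mine, not from the thread) using Nim's `SomeFloat` type
class so a single signature covers both float32 and float64:

```nim
# One generic container and proc parameterized over the float width,
# so the caller picks the precision.
type Tensor[T: SomeFloat] = object
  data: seq[T]

proc scale[T: SomeFloat](t: Tensor[T], factor: T): Tensor[T] =
  result.data = newSeq[T](t.data.len)
  for i, v in t.data:
    result.data[i] = v * factor

let t32 = Tensor[float32](data: @[1'f32, 2, 3])
echo scale(t32, 2'f32).data   # @[2.0, 4.0, 6.0]
```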
Very nice library! @Ward: How did you translate the interfaces? Manually,
with c2nim, or another tool? It would be nice to have a tool to import type
libraries.
I'm probably missing something **really** obvious, but how are you supposed to
handle a scenario where you read a file from stdin and then need to prompt for
a password after doing so?
e.g. something like this:

```
cat sometextfile.txt | myapp -
Password: blahblah
```
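(Not an authoritative answer, but one common POSIX workaround is to drain the
piped stdin first and then open the controlling terminal `/dev/tty` directly
for the prompt. A rough Nim sketch, with no echo suppression:)

```nim
# POSIX-only sketch: read the piped data from stdin, then talk to the
# controlling terminal directly so the prompt isn't consumed by the pipe.
# Note: echo is not disabled here, so the password is visible as typed.
let piped = stdin.readAll()

let tty = open("/dev/tty", fmReadWrite)
tty.write("Password: ")
let password = tty.readLine()
tty.close()

echo "read ", piped.len, " bytes from stdin; password length ", password.len
```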