Congratulations!
I especially like the neural network examples and hope more
will be forthcoming.
The new version of Arraymancer, v0.4.0 "The Name of the Wind" is live today.
Here is the changelog:
* * *
* Core:
  * OpenCL tensors are now available! However, Arraymancer will naively select
    the first backend available: it can be CPU, it can be GPU. They support basic
    and broadcasted operations.
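For context, here is a minimal sketch of what using the new backend might look like. It assumes an `opencl()` conversion proc analogous to the existing `cuda()` one and a matching `cpu()` to copy results back; the exact names and the set of supported operations should be checked against the v0.4.0 docs.

```nim
import arraymancer

# Hedged sketch: the `opencl()` / `cpu()` conversion procs are assumed to
# mirror the existing `cuda()` one; check the v0.4.0 docs for the exact
# names and for which operations the OpenCL backend supports.
let a = [[1.0'f32, 2.0'f32], [3.0'f32, 4.0'f32]].toTensor().opencl()
let b = [[10.0'f32, 20.0'f32], [30.0'f32, 40.0'f32]].toTensor().opencl()

let c = a + b   # element-wise addition runs on the OpenCL device
echo c.cpu()    # copy the result back to a CPU tensor to print it
```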
Arraymancer v0.3.0 Dec. 14 2017
Finally, after much struggle, here is Arraymancer's new version, available now on
Nimble. It comes with a shiny new doc (thanks to @flyx and the NimYAML doc):
[https://mratsim.github.io/Arraymancer](https://mratsim.github.io/Arraymancer)
Changes:
* **Very** Breaking
@mratsim Would you mind if I referenced your lib in my bachelor thesis
about optimization?
This looks like a very useful library for me. I shall certainly be checking it
out. Nice one!
I am very excited to announce the second release of Arraymancer which includes
numerous improvements `blablabla` ...
Without further ado:
* Community
  * There is a Gitter room!
* Breaking
  * `shallowCopy` is now `unsafeView` and accepts `let` arguments (see the sketch after this list)
* Element-wise
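To illustrate the rename mentioned above, here is a hedged sketch of the new call, assuming `unsafeView` keeps the old shallow-copy semantics (a view sharing the underlying storage, now callable on a `let` tensor):

```nim
import arraymancer

# Hedged sketch of the renamed API: `unsafeView` (formerly `shallowCopy`) is
# assumed to return a tensor sharing the same underlying storage as its
# argument, and to be callable on a `let` tensor.
let a = [1, 2, 3, 4].toTensor()
let v = a.unsafeView()   # no data copy; `v` aliases `a`'s buffer
echo v
```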
The only static parts of the Tensor types are the Backend (Cpu, CUDA, ...) and
the internal type (int32, float32, object ...).
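To make that statement concrete, here is a toy (non-Arraymancer) Nim definition showing what "only the backend and element type are static" means: the shape is an ordinary runtime field, while the backend and element type are compile-time parameters.

```nim
# Toy illustration only, not Arraymancer's actual definition: the backend and
# the element type are compile-time (static) parameters, while the shape is an
# ordinary runtime field.
type
  Backend = enum
    Cpu, Cuda
  Tensor[B: static Backend; T] = object
    shape: seq[int]   # dynamic: only known at runtime
    data: seq[T]      # element type T is fixed at compile time

var t: Tensor[Cpu, float32]
t.shape = @[2, 3]
t.data = newSeq[float32](6)
echo t.shape          # prints @[2, 3]
```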
The network topology will be dynamic, using dynamic graphs more akin to
PyTorch/Chainer/DyNet than Theano/Tensorflow/Keras.
My next step is to build an autograd so
A late reply because I was hoping to dive into this a bit deeper before
replying. But due to lack of time, high-level feedback must suffice: this
looks awesome!
I completely agree with your observation that there is a gap between developing
prototypes e.g. in Python and bringing them into
`get_data_ptr` is now public.
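A hedged sketch of why exposing it is useful, assuming `get_data_ptr` returns a raw `ptr T` to the tensor's buffer (e.g. for passing data to a C library without copying):

```nim
import arraymancer

# Hedged sketch: `get_data_ptr` is assumed to return a raw `ptr T` to the
# first element of the tensor's storage, e.g. to hand the buffer to a C
# library without copying.
let t = [1.0'f32, 2.0'f32, 3.0'f32].toTensor()
let p: ptr float32 = t.get_data_ptr()
echo p[]   # dereferences the first element: 1.0
```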
For now, I will add the neural network functionality directly in Arraymancer.
The directory structure will probably be:
* src/arraymancer ==> core Tensor stuff
* src/autograd ==> automatic gradient computation (i.e.
I've been following this for a while on GitHub and I think it is a very
impressive project. Nim would be a great language for scientific computing, but
it needs to have the numerical libraries and this is an excellent first step in
creating them.
A couple of questions. First, are you planning
As a data scientist, I feel that Nim has tremendous potential for data science,
machine learning and deep learning.
In particular, it's currently non-trivial to bridge the gap between deep
learning research (mostly Python and sometimes Lua) and production (C for
embedded devices, javascript