Re: Advent of Nim

2017-10-07 Thread cdunn2001
I commented on your github repo. I also found an odd case where a space is 
needed:

  * [https://github.com/nim-lang/Nim/issues/1311](https://github.com/nim-lang/Nim/issues/1311)




Re: Python-like with, context managers, and the RAII pattern

2017-10-07 Thread Tiberium
cdunn2001: "There's actually a lot of places I'd really like to use "withAs" 
in, e.g. to make sure that resources are closed, but without needing to add a 
resource-specific "defer" line." So wizzardx would need to create a template 
like this for every type he'll use.
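A minimal sketch of what such a per-type template could look like, here for `File` (the `withFile` name and signature are my own illustration, not from the thread):

```nim
# Hypothetical per-type "with" template: opens the file, runs the body,
# and guarantees the file is closed even if the body raises.
template withFile(f, path, mode, body: untyped) =
  var f: File = open(path, mode)
  defer: f.close()
  body

withFile(txt, "/tmp/example.txt", fmWrite):
  txt.writeLine("hello")
```

Every resource type (sockets, DB handles, ...) would need its own such template, which is the point being made here.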


Re: Python-like with, context managers, and the RAII pattern

2017-10-07 Thread cdunn2001
Since you don't want to dive into macros and you don't need anything 
super-generic, I think you're better off with @Benjaminel's idea: 


import os

template withcd(newdir: string, statements: untyped) =
  let olddir = os.getCurrentDir()
  os.setCurrentDir(newdir)
  defer: os.setCurrentDir(olddir)
  statements

proc hi() =
  echo "in:", os.getCurrentDir()

hi()

withcd("bar"):
  echo "It works!"
  hi()

echo "Finish it."
hi()



in:/foo
It works!
in:/foo/bar
Finish it.
in:/foo



Re: "Cross-platform C" target for Nim?

2017-10-07 Thread Tiberium
wizzardx: yes, and Nim uses niminst


Re: Hacktoberfest project contributions?

2017-10-07 Thread mratsim
Arraymancer was born in between linalg and neo and is targeted at machine 
learning, with a focus on deep learning and computer vision (and hopefully 
speech and language later).

I started Arraymancer because you need 4-dimensional tensors for deep learning 
on images (Number of images, RGB colors, height, width) and 5d for videos or 3D 
images.

It takes a Numpy/Julia/Matlab-like multidimensional array approach, meaning 
operations like reshaping, concatenating arrays over arbitrary dimensions, 
permuting dimensions, and working with stacked 2D matrices in a 3D tensor are 
possible and efficient.

* * *

There are a few twists:

  * Like Andrea, I like linear algebra operators like `+` and `*` to mean math 
(linear algebra) things, so broadcasting and the element-wise matrix product 
are not implicit as in Numpy but opt-in: you must use the custom operators 
`.+` and `.*`. The benefit is that if you get strange results you can grep 
for those operators.
  * Like Nim, I chose value semantics for my tensors, meaning assignment 
copies the tensor. You can opt in to "unsafe" operations like `unsafeReshape` 
or `unsafeTranspose` that provide a view over the same data.

This avoids a lot of beginner gotchas and, like explicit broadcasting, makes 
it easy to grep for potential issues.

Note: for CudaTensors I currently can't provide value semantics, due to `=` 
overloading issues.
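Since broadcasting and the element-wise product are opt-in, here is a short sketch of how that looks in user code (based on the Arraymancer API around these 2017 releases; exact operator and proc names may have changed since):

```nim
import arraymancer

let a = @[@[1.0, 2.0], @[3.0, 4.0]].toTensor()
let b = @[@[10.0, 20.0], @[30.0, 40.0]].toTensor()

echo a * b    # `*` keeps its linear-algebra meaning: matrix product
echo a .* b   # `.*` is the explicit, grep-able element-wise product

# Value semantics: plain assignment copies the tensor's data,
# so mutating `c` leaves `a` untouched.
var c = a
```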




* * *

**My short-term plans (consolidating the base):**

  * Convolutional neural networks are just around the corner
  * Improve CudaTensors support
  * Doing demos
  * Cover common deep-learning use cases outside of Recurrent Neural Networks.



**Mid-term plans (focus on data):**

  * CSV loading and/or conversion from a Nim dataframe package like NimData
  * Basic image manipulation (rotating, flipping, blurring ...), in an extra 
package like Arraymancer-vision.
  * Saving and loading models from disk.
  * Provide popular datasets (IRIS, MNIST ...)
  * Recurrent neural networks.



**Long-term plans (hardware and format):**

  * More hardware support (AMD GPUs at least)
  * Provide utilities for general machine learning (and not only deep learning):
    * Sparse tensors, but as Andrea said, apart from Cuda there is no standard :/
    * PCA and SVD matrix decompositions
    * Clustering tools like KMeans and DBSCAN



**Very long-term plans (wishlist):**

  * Bindings to gradient boosted trees libraries 
([XGboost](https://github.com/dmlc/xgboost) / 
[LightGBM](https://github.com/Microsoft/LightGBM))
  * Video loading and analysis
  * New backends: OpenCL, Javascript, Apple Metal



**Very (very!) long-term plans (wishlist reloaded):**

  * More natural language utilities or bindings like 
[spaCy](https://github.com/explosion/spaCy)
  * A REST API: for example, send Arraymancer an image and it returns what was 
detected in it.
  * [Bayesian Neural 
Networks](https://alpha-i.co/blog/MNIST-for-ML-beginners-The-Bayesian-Way.html), 
which work with uncertainty built in



**What is not a focus:**

  * Linear algebra/LAPACK functionality not used in machine learning (like 
solving equations)
  * 2D/3D geometry and quaternions 
([nim-glm](https://github.com/stavenko/nim-glm) is probably more suited than 
either neo or Arraymancer)




Re: Trying to write readable code, need help

2017-10-07 Thread mratsim
I agree, this:self makes code less grep-able.

`var vsl = self.vsl` works because `vsl` is a ref type: the pointer is 
copied, but it still points to the same (shared) data location.
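A minimal illustration of that ref-copy behavior (the type and field names here are made up for the sketch):

```nim
type
  Data = ref object
    x: int
  Holder = object
    vsl: Data

var self = Holder(vsl: Data(x: 1))

var vsl = self.vsl  # copies only the reference, not the object
vsl.x = 42

echo self.vsl.x  # prints 42: both names share the same data
```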