Hi Stephan,
Thanks for replying to my concerns.
I have looked a bit more into it, and `intersect1d` is not the right concept for
my problem anyway. It is great, though, that it works faster than the reduce
approach. Also, I did more speed tests and, after all, the `in1d` approach is
not generally faster.
Dear Dom,
I just checked, and on my computer the new version is about a factor of 2
faster than the reduce approach if the arrays are shuffled. For sorted
arrays, the new version is a factor of 3.4 faster:
import numpy as np
from functools import reduce

idss = [np.random.permutation(np.arange(a*100, int(1e5) + a*100, 1))
        for a in range(3)]  # number of arrays assumed; the original line was cut off here
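A minimal sketch of the kind of comparison being discussed, assuming the two variants are a pairwise `reduce` over `np.intersect1d` and an `np.in1d`-based filter (the exact "new version" is not shown in the thread, so the second variant here is an illustration, not the actual patch):

```python
import numpy as np
from functools import reduce

# Overlapping shuffled ranges, as in the snippet above.
idss = [np.random.permutation(np.arange(a * 100, int(1e5) + a * 100, 1))
        for a in range(3)]

# Variant 1: the reduce approach, pairwise np.intersect1d.
res_reduce = reduce(np.intersect1d, idss)

# Variant 2 (assumed): filter with np.in1d, sorting once at the end.
res_in1d = idss[0]
for ids in idss[1:]:
    res_in1d = res_in1d[np.in1d(res_in1d, ids)]
res_in1d = np.sort(res_in1d)

# Both variants must agree on the common elements.
assert np.array_equal(res_reduce, res_in1d)
```

Wrapping each variant in `timeit` on shuffled versus pre-sorted inputs would reproduce the factor-of-2 and factor-of-3.4 measurements mentioned above.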
Dear Dom,
thanks for bringing up the possible restriction. I agree that this
would be a serious argument against the change.
However, as you said, the overlapping/non-overlapping indices would
become ambiguous with more than two arrays. And calling the function
with only two arrays at a time w
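The ambiguity can be seen from the existing two-array signature: `np.intersect1d(a, b, return_indices=True)` returns exactly one index array per input, so with three or more inputs the return shape would have to change. A short illustration of the current behavior:

```python
import numpy as np

a = np.array([1, 3, 5, 7])
b = np.array([3, 4, 5, 6])

# One index array per input maps the common values back into each array.
vals, ia, ib = np.intersect1d(a, b, return_indices=True)

assert np.array_equal(vals, [3, 5])
assert np.array_equal(a[ia], vals)
assert np.array_equal(b[ib], vals)
# With N > 2 input arrays there is no single (ia, ib) pair;
# N index arrays would be needed, changing the return signature.
```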