I solved Expert 4 using Alireza's approach, as follows. (I don't understand how to use 
TensorOperations.jl in this case; a rough guess at what that might look like is sketched after the code.)
Thanks, Alireza!

```
p, n = 10, 20
M = ones(n, n, p)    # n x n x p tensor
V = ones(n, p)       # n x p matrix
# Sum the slices M[i,:,j], each scaled by V[i], over all i and j,
# then transpose the resulting length-n vector into a row vector.
S = reduce(+, [M[i,:,j] * V[i] for i = 1:n, j = 1:p])'
S
```
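
For reference, here is my rough guess at how this kind of contraction might be written with TensorOperations.jl. I am assuming the intended contraction is S[k] = sum over i and j of M[i,k,j] * V[i,j]; the @tensor macro sums over index labels that appear twice. This is only a sketch, not something I have tested:

```
using TensorOperations   # assumes the TensorOperations.jl package is installed

p, n = 10, 20
M = ones(n, n, p)
V = ones(n, p)

# Indices appearing in both factors (i and j) are contracted (summed);
# the remaining free index k labels the resulting vector.
@tensor S[k] := M[i, k, j] * V[i, j]
```

If that is right, S comes back as a plain length-n vector rather than the transposed row vector my reduce-based version produces.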

On Friday, June 27, 2014 at 5:59:15 UTC+9, Steven G. Johnson wrote:
>
> On Thursday, June 26, 2014 9:54:34 AM UTC-4, Michiaki Ariga wrote:
>>
>> In the original numpy version (as follows), the matrix and vector are 3-dimensional 
>> arrays. Is there any way to compute tensordot as in numpy?
>>
>>
>
> There is no built-in tensor contraction function at the moment (
> https://github.com/JuliaLang/julia/issues/3250), but you can try out the 
> TensorOperations package:
>
>          https://github.com/Jutho/TensorOperations.jl
>
> You can also just write your own loop and the performance should be fine; 
> it's pretty easy to do this if you know the dimensionality in advance, and 
> it's only complicated to write tensor contraction code if you want to 
> handle arbitrary dimensionality and arbitrary contraction index sets.  (It 
> takes a while to get used to the fact that you don't need to call a library 
> function for every inner loop in Julia.)
>
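
To illustrate Steven's suggestion of just writing the loop, here is a sketch of what an explicit version of the same contraction could look like (again assuming the contraction S[k] = sum over i and j of M[i,k,j] * V[i,j]; the function name `contract` is just for illustration):

```
# Explicit-loop contraction: S[k] = sum_{i,j} M[i,k,j] * V[i,j]
function contract(M, V)
    n = size(M, 1)
    p = size(M, 3)
    S = zeros(n)
    # i is innermost so the first (fastest-varying) dimension of M
    # is traversed contiguously in column-major order.
    for j = 1:p, k = 1:n, i = 1:n
        S[k] += M[i, k, j] * V[i, j]
    end
    return S
end

S = contract(ones(20, 20, 10), ones(20, 10))
```

Written this way there is no temporary array allocated for every (i, j) pair, unlike the comprehension in my version above.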
