This is a classic case where the vectorization that Python and Matlab demand
for speed makes the algorithm completely incomprehensible. You're often much
better off porting to Julia from simple for-loop based C or Java code instead.
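For example, the odd-order case needs nothing more than the classical siamese method, a handful of transparent loops. A minimal sketch in Python (the function name and structure are mine, not from either benchmarked implementation):

```python
import numpy as np

def magic_odd(n):
    """Odd-order magic square via the siamese method: place 1..n^2 by
    moving up and right, stepping down one row whenever the target
    cell is already occupied."""
    assert n % 2 == 1 and n >= 1
    M = np.zeros((n, n), dtype=int)
    i, j = 0, n // 2                       # start in the middle of the top row
    for v in range(1, n * n + 1):
        M[i, j] = v
        ni, nj = (i - 1) % n, (j + 1) % n  # up-right, wrapping around
        if M[ni, nj]:                      # occupied: drop down one row instead
            ni, nj = (i + 1) % n, j
        i, j = ni, nj
    return M
```

For n = 3 this yields the familiar square whose rows, columns, and both diagonals all sum to 15.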


On Mon, Aug 25, 2014 at 2:05 PM, Iain Dunning <iaindunn...@gmail.com> wrote:

> Updated gist for the doubly-even order case
> https://gist.github.com/IainNZ/9b5f1eb1bcf923ed02d9
>
> For a magic square of size 10000 (times in seconds):
> Mine: 0.47
> Matlab-style: 1.7
> Python-style: 1.0
>
> So, probably faster than MATLAB-in-MATLAB at this point, maybe same as
> PyPy?
>
>
> On Monday, August 25, 2014 9:38:11 AM UTC-4, Phillip Berndt wrote:
>
>> Hi julia-users,
>>
>> I've recently stumbled over Julia and wanted to give it a try.
>>
>> To assess its speed, I've implemented another micro-benchmark, namely a
>> version of Matlab's magic() function that generates magic squares. Since I
>> have no experience writing optimal Julia code, I started off with literal
>> translations of two different implementations - Matlab's and the one from
>> magic_square.py from PyPy, which is an optimized version for NumPy. I then
>> timed the calculation of all magic squares from N=3 to N=1000. The table
>> from Julia's homepage suggests that in most cases, it is significantly
>> faster than Python and Matlab. In my case, it's significantly slower, which
>> is somewhat disappointing ;) My question now is:
>>
>> Can the implementation be optimized to outperform the other two?
>>
>> *The times:*
>>
>> Julia, Matlab version: elapsed time: 18.495374216 seconds (13404087428
>> bytes allocated, 12.54% gc time)
>> Julia, Python version: elapsed time: 8.107275449 seconds (13532473792
>> bytes allocated, 26.99% gc time)
>> Matlab: Elapsed time is 4.994960 seconds.
>> Python: 1 loops, best of 3: 2.09 s per loop
>>
>> My test machine is a 4-core i7-4600 notebook with 2.1 GHz and 8 GiB RAM,
>> running a current Linux Mint and Julia 0.3 stable. To be fair, Python does
>> not seem to gc during this loop (disabling gc doesn't alter the time here),
>> so one should compare with 8.1 s * (1.-.2699) = 5.91 s for Julia. That's
>> still much slower than Python. (By the way, even Octave only needs 4.46
>> seconds.) If I translate the matrices in magic_python to account for
>> column-major storage, the execution time does not significantly improve.
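As a quick illustration of that layout difference (NumPy is row-major by default, while Julia, Matlab, and Fortran store arrays column-major), the same values can be held in either order using only standard NumPy:

```python
import numpy as np

# NumPy defaults to row-major (C) order; Julia/Matlab use column-major
# (Fortran) order. Same values, two different in-memory layouts:
a = np.arange(1, 7).reshape(2, 3)   # row-major by default
f = np.asfortranarray(a)            # column-major copy of the same data

# ravel(order='K') walks the array in memory order:
# a -> [1, 2, 3, 4, 5, 6]   (rows are contiguous)
# f -> [1, 4, 2, 5, 3, 6]   (columns are contiguous)
```

Loops and slices that follow the storage order touch memory contiguously, which is why a direct transliteration between the two families can shift the access pattern even when the indices are unchanged.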
>>
>> *The code:*
>>
>> Matlab: tic; arrayfun(@magic, 3:1000, 'UniformOutput', false); toc
>> IPython: import magic_square; %timeit [magic_square.magic(x) for x in range(3, 1001)]
>> Julia: I've uploaded the code to a Gist at https://gist.github.com/phillipberndt/2db94bf5e0c16161dedc and will paste a copy below this post.
>>
>>
>> Cheers,
>> Phillip
>>
>>
>> function magic_matlab(n::Int64)
>>     # Works exactly as Matlab's magic.m
>>
>>     if n % 2 == 1
>>         p = (1:n)
>>         M = n * mod(broadcast(+, p', p - div(n+3, 2)), n) + mod(broadcast(+, p', 2p - 2), n) + 1
>>         return M
>>     elseif n % 4 == 0
>>         J = div([1:n] % 4, 2)
>>         K = J' .== J
>>         M = broadcast(+, [1:n:(n*n)]', [0:n-1])
>>         M[K] = n^2 + 1 - M[K]
>>         return M
>>     else
>>         p = div(n, 2)
>>         M = magic_matlab(p)
>>         M = [M M+2p^2; M+3p^2 M+p^2]
>>         if n == 2
>>             return M
>>         end
>>         i = (1:p)
>>         k = (n-2)/4
>>         j = convert(Array{Int}, [(1:k); ((n-k+2):n)])
>>         M[[i; i+p],j] = M[[i+p; i],j]
>>         i = k+1
>>         j = [1; i]
>>         M[[i; i+p],j] = M[[i+p; i],j]
>>         return M
>>     end
>> end
>> @vectorize_1arg Int magic_matlab
>>
>> function magic_python(n::Int64)
>>     # Works exactly as magic_square.py (from pypy)
>>
>>     if n % 2 == 1
>>         m = (n >> 1) + 1
>>         b = n^2 + 1
>>
>>         M = reshape(repmat(1:n:b-n, 1, n+2)[m:end-m], n+1, n)[2:end, :] +
>>             reshape(repmat(0:(n-1), 1, n+2), n+2, n)[2:end-1, :]'
>>         return M
>>     elseif n % 4 == 0
>>         b = n^2 + 1
>>         d = reshape(1:b-1, n, n)
>>
>>         d[1:4:n, 1:4:n] = b - d[1:4:n, 1:4:n]
>>         d[1:4:n, 4:4:n] = b - d[1:4:n, 4:4:n]
>>         d[4:4:n, 1:4:n] = b - d[4:4:n, 1:4:n]
>>         d[4:4:n, 4:4:n] = b - d[4:4:n, 4:4:n]
>>         d[2:4:n, 2:4:n] = b - d[2:4:n, 2:4:n]
>>         d[2:4:n, 3:4:n] = b - d[2:4:n, 3:4:n]
>>         d[3:4:n, 2:4:n] = b - d[3:4:n, 2:4:n]
>>         d[3:4:n, 3:4:n] = b - d[3:4:n, 3:4:n]
>>
>>         return d
>>     else
>>         m = n >> 1
>>         k = m >> 1
>>         b = m^2
>>
>>         d = repmat(magic_python(m), 2, 2)
>>
>>         d[1:m, 1:k] += 3*b
>>         d[1+m:end, 1+k:m] += 3*b
>>         d[1+k, 1+k] += 3*b
>>         d[1+k, 1] -= 3*b
>>         d[1+m+k, 1] += 3*b
>>         d[1+m+k, 1+k] -= 3*b
>>         d[1:m,1+m:n-k+1] += b+b
>>         d[1+m:end, 1+m:n-k+1] += b
>>         d[1:m, 1+n-k+1:end] += b
>>         d[1+m:end, 1+n-k+1:end] += b+b
>>
>>         return d
>>     end
>> end
>> @vectorize_1arg Int magic_python
>>
>> print("Matlab version: ")
>> @time magic_matlab(3:1000)
>>
>> print("Python version: ")
>> @time magic_python(3:1000)
>>
>>
>>

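For reference, the doubly-even branch that both quoted implementations share (fill 1..n^2 row by row, then complement a fixed pattern of cells against n^2 + 1) can be sketched in a few NumPy lines. This mirrors the eight complement assignments in magic_python rather than reproducing the benchmarked code:

```python
import numpy as np

def magic_doubly_even(n):
    """Doubly-even (n % 4 == 0) magic square: fill 1..n^2 row-wise,
    then replace entries by their complement n^2 + 1 - x wherever the
    row and column positions fall in matching 4-periodic classes."""
    assert n % 4 == 0
    M = np.arange(1, n * n + 1).reshape(n, n)
    edge = np.isin(np.arange(n) % 4, (0, 3))  # positions 0 and 3 of each 4-block
    flip = edge[:, None] == edge[None, :]     # complement where classes agree
    M[flip] = n * n + 1 - M[flip]
    return M
```

The boolean mask `flip` covers exactly the cells that magic_python complements with its eight slice assignments (rows and columns both in {1, 4} of each 4-block, or both in {2, 3}).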