(Those times are on 0.3.0 on OS X, by the way; they may be better or
different on newer versions.)

On Sunday, November 30, 2014 11:33:58 PM UTC-5, Iain Dunning wrote:
>
> Check out
>
> https://gist.github.com/IainNZ/1afb9318c841c9bd2234
>
> I get
>
> IAINMAC:Desktop idunning$ julia test.jl
> elapsed time: 0.006008895 seconds (2800048 bytes allocated)
> elapsed time: 0.011813825 seconds (2800048 bytes allocated)
> elapsed time: 0.33829981 seconds (2800048 bytes allocated)
>
> elapsed time: 0.004994186 seconds (2800048 bytes allocated)
> elapsed time: 0.006508385 seconds (2800048 bytes allocated)
> elapsed time: 0.056528559 seconds (2800048 bytes allocated)
>
> Which is pretty neat, and is generalized to work with any number type.
>
> I wouldn't put much stock in the DataFrames times - the performance isn't 
> going to be great. The "pure" data times, though, surprise me - I don't 
> get how LuaJIT is generating better code for something so simple.
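>
> (The gist isn't reproduced here, but a minimal sketch of the kind of 
> type-generic SMA loop it describes, in Julia 0.3-era syntax, might look 
> like the following; the function name and the ramp-up handling are 
> assumptions on my part.)
>
>     # simple moving average over a trailing window of n values; works for
>     # any Real element type because the method is parameterized on T
>     function sma{T<:Real}(x::Vector{T}, n::Int)
>         out = Array(Float64, length(x))
>         s = zero(T)
>         for i = 1:length(x)
>             s += x[i]
>             if i > n
>                 s -= x[i-n]
>                 out[i] = s / n
>             else
>                 out[i] = s / i    # partial mean until the window fills
>             end
>         end
>         return out
>     end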
>
> On Sunday, November 30, 2014 9:10:37 PM UTC-5, Andreas Noack wrote:
>>
>> Hi Joseph
>>
>> I just tried to run your code, and I get approximately the same numbers 
>> for Julia; I couldn't see any obvious errors in your implementation. 
>> DataVectors have some overhead to allow for missing values, so I don't 
>> know if they can be made faster.
>>
>> It might be cheating in the comparison with LuaJIT, but the code 
>> vectorizes very nicely, so if I add @simd before the inner for loop and 
>> @inbounds before the line in that loop, I get a fivefold speed-up on my 
>> fairly new MacBook Pro.
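>>
>> (A minimal sketch of that annotation, applied to a windowed-sum inner 
>> loop; the surrounding function and the names are assumptions, not 
>> Joseph's actual code.)
>>
>>     # sum x[lo:hi] with bounds checks removed and SIMD vectorization
>>     # allowed on the accumulation
>>     function window_sum(x::Vector{Float64}, lo::Int, hi::Int)
>>         s = 0.0
>>         @simd for i = lo:hi
>>             @inbounds s += x[i]
>>         end
>>         return s
>>     end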
>>
>> Andreas
>>
>> 2014-11-30 20:04 GMT-05:00 John Myles White <johnmyl...@gmail.com>:
>>
>>> Hi Joseph,
>>>
>>> Have you read 
>>> http://julia.readthedocs.org/en/release-0.3/manual/performance-tips/ ?
>>>
>>> I didn't read your code in detail, but a superficial read suggests that 
>>> your code has a lot of type-instability, which is a showstopper for Julia.
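>>>
>>> (For illustration only, and not taken from your code: the kind of 
>>> instability that page warns about is an accumulator that starts as an 
>>> Int and becomes a Float64 inside a loop, versus one that keeps a single 
>>> concrete type throughout.)
>>>
>>>     # type-unstable: s is Int before the loop, Float64 inside it
>>>     function mean_unstable(x::Vector{Float64})
>>>         s = 0
>>>         for v in x
>>>             s += v
>>>         end
>>>         return s / length(x)
>>>     end
>>>
>>>     # type-stable: s is Float64 throughout
>>>     function mean_stable(x::Vector{Float64})
>>>         s = 0.0
>>>         for v in x
>>>             s += v
>>>         end
>>>         return s / length(x)
>>>     end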
>>>
>>>  -- John
>>>
>>> On Nov 30, 2014, at 4:58 PM, Joseph Ellsworth <joex...@gmail.com> wrote:
>>>
>>> Just finished some basic tests comparing LuaJIT and Julia for the kinds 
>>> of statistical functions we commonly compute. The test essentially loads 
>>> 70K one-minute bar records and computes an sma(14) and an sma(600) for 
>>> every row in the file. This time I included source code so others can 
>>> figure out what I missed. It is admittedly a simplified case, but I have 
>>> found that if this function runs fast, the rest of our system tends to 
>>> run fast, so I consider it a realistic starting benchmark.
>>>
>>> http://bayesanalytic.com/lua_jit_faster_than_julia_stock_prediction/   
>>>
>>> The results were not what I expected. I expected Julia to blow away Lua, 
>>> even with a JIT, because in Julia I could allocate memory for the result 
>>> arrays as typed blocks and couldn't figure out how to do the same in Lua. 
>>> In addition, Lua array index access seems more like a hash lookup than a 
>>> pure numeric array index, which should give Julia a substantial advantage 
>>> when looping across items in an array. What I found is that LuaJIT 
>>> outperformed Julia in all but one test, even if you don't consider 
>>> Julia's horrible start-up performance.
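>>>
>>> (For reference, preallocating the result arrays as typed blocks looks 
>>> like this in Julia 0.3 syntax; the names and row count here are 
>>> placeholders.)
>>>
>>>     # one contiguous, concretely typed block per result series
>>>     nrows  = 70000                    # number of bar records loaded
>>>     sma14  = Array(Float64, nrows)
>>>     sma600 = Array(Float64, nrows)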
>>>
>>> I am hoping that somebody finds a mistake that would make Julia 
>>> outperform, because I really want to love it. I like the Julia community, 
>>> and I also really like the multiple-dispatch function system. The Julia 
>>> community seems to be working at an incredible velocity, but Julia's poor 
>>> error messages, slow startup time, and letting Lua beat it make me 
>>> skeptical about investing in it for larger projects. On the other hand, 
>>> Lua has been around for a long time, is used as a scripting engine in 
>>> many games and consoles, and is unlikely to go away anytime soon.
>>>
>>> If any of you produce a Julia version that performs better, let me know 
>>> and I will add it to the original article. If any of you have a chance 
>>> to port the same code to Python using PyPy, Java, Scala, or C, let me 
>>> know and I will add it to the original article.
>>>
>>>
>>>
>>
