Not being able to change the type of an object is one of the constraints on
totally dynamic behavior that Julia imposes to keep everyone – including but
not limited to the compiler – sane. If the type of an object could change,
this sort of thing would happen all over the place:

A = zeros(m,n)
f(A)
# now we have no idea what the type of A is


When you don't know the type of something, you can't generate efficient
code to manipulate it, so you're back to running at interpreted
Python/Matlab/R speeds. In systems with lots of built-in functions
(implemented in a low-level language), you could simply know which built-in
functions do this sort of thing, but in Julia almost everything is
implemented in Julia itself, so that knowledge isn't available up front. It
is possible to analyze function definitions to try to determine which
functions might change the type of an argument, but in a language with as
much open-ended polymorphism as Julia, that can be surprisingly difficult.
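
To make this concrete, here is a minimal sketch (the function name
total_after is mine, purely illustrative) of what the compiler gets to rely
on precisely because f cannot change the type of its argument:

function total_after(f, m, n)
    A = zeros(m, n)   # A is a Matrix{Float64} and stays one
    f(A)              # may mutate the contents of A, but never its type
    s = 0.0
    for x in A        # so this loop compiles to tight Float64 code
        s += x
    end
    return s
end

If f could silently turn A into, say, an array of Any or a sparse matrix,
the loop after the call would have to fall back to dynamic dispatch on
every iteration.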


On Thu, Apr 24, 2014 at 6:19 AM, Tobias Knopp
<tobias.kn...@googlemail.com> wrote:

> As explained in https://github.com/JuliaLang/julia/issues/4211, a
> mutating reshape conflicts with the number of array dimensions being a
> type parameter.
>
> On Thursday, April 24, 2014 10:07:31 UTC+2, Tobias Knopp wrote:
>
>> A few things to add:
>> - Note that writing vector expressions in an efficient manner is not
>> trivial and is actually still an open issue in Julia. If you write
>>
>> x = a + b .* c
>>
>> temporary arrays will be created, which leads to a major slowdown. There
>> is a devectorization macro, though, that can solve this by transforming
>> it into
>>
>> for n = 1:length(x)
>>   x[n] = a[n] + b[n] * c[n]
>> end
>>
>> So although vector expressions might seem easy for a beginner to use,
>> writing them efficiently in practical programming is really non-trivial.
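>>
>> To see the effect, here is a small self-contained comparison (the names
>> vectorized and devectorized are my own, just for illustration):
>>
>> vectorized(a, b, c) = a + b .* c    # allocates a temporary for b .* c
>>
>> function devectorized(a, b, c)
>>   x = similar(a)
>>   for n = 1:length(a)
>>     x[n] = a[n] + b[n] * c[n]       # one pass, no temporary arrays
>>   end
>>   return x
>> end
>>
>> a = rand(10^6); b = rand(10^6); c = rand(10^6)
>> @time vectorized(a, b, c)
>> @time devectorized(a, b, c)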
>>
>> - Matlab is actually clever enough that it only copies an array
>> internally when one writes to the copy. This is called copy-on-write
>> (COW).
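>>
>> Plain assignment and argument passing in Julia, by contrast, never copy
>> an array; they alias the same data, and copies are made explicitly with
>> copy(). A quick sketch:
>>
>> A = zeros(3)
>> B = A            # B and A are the same array, no copy
>> B[1] = 1.0       # A[1] is now 1.0 as well
>> C = copy(A)      # an independent copy, made eagerly
>> C[1] = 2.0       # A is unaffected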
>>
>> - I think it is true that Julia is a little harder to learn than Matlab
>> if one considers only absolute newbie programmers. But if you dig a
>> little deeper into programming, Julia offers several serious advantages
>> that pay off after taking that first step. The type system with multiple
>> dispatch is one of the most awesome things I have seen so far. And once
>> you start learning how to use MEX in Matlab, or the equivalent things in
>> Python, you have reached a point where Julia is much easier to use.
>>
>> - For a mathematician, statements like "x = x + 1" can be very confusing
>> no matter how they are implemented internally... :-)
>>
>> - I have to admit that a reshape!(A, dims) would make a lot of sense. But
>> there might be technical reasons why it is not possible to change the
>> dimensions of an existing array in place.
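>>
>> One thing that softens this: reshape on an Array already shares the
>> underlying data; only the dimensions (and hence the N in Array{T,N})
>> differ between the two objects, and that type parameter is exactly what a
>> mutating reshape!(A, dims) would have to change in place. A small sketch:
>>
>> A = zeros(2, 3)        # an Array{Float64,2}
>> v = reshape(A, 6)      # an Array{Float64,1} sharing A's data
>> v[1] = 42.0
>> A[1, 1] == 42.0        # true, no data was copied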
>>
>>
>>
>>
>> On Thursday, April 24, 2014 07:16:26 UTC+2, Ethan Anderes wrote:
>>>
>>> Jameson:
>>>
>>> Yes, the Matlab choice is slow and doesn't scale, but it's very easy to
>>> reason about. It was instructive for me to try to think of a real-life
>>> bug that would realize my worries. I came to realize that most of the
>>> code where I was using vec() was embedded in a chain of function calls
>>> like
>>>
>>> b = M * exp(vec(a))
>>>
>>> ...so I see your point about fast, easily composable functions whose
>>> output and input share memory.
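>>>
>>> In fact vec itself already shares memory with its argument, so the
>>> composition above costs nothing beyond the exp and the multiply. A quick
>>> check:
>>>
>>> a = rand(3, 3)
>>> v = vec(a)       # same underlying data, just viewed as a vector
>>> v[1] = 0.0
>>> a[1, 1] == 0.0   # true, no copy was made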
>>>
>>> I recently had a discussion with a colleague who commented that R was
>>> essentially developed by statisticians, which is why it doesn't scale
>>> well (not sure if that is actually true, but that's beside my point). On
>>> the other hand, things like Python are written by CS folks, which
>>> hinders access for mathy science folks who just want to prototype an
>>> idea every now and again without investing the time to upgrade their
>>> programming skills. I think Julia can have the best of both: easy to
>>> learn + modern CS.
>>>
>>>
>>>
