On Fri, Oct 23, 2015 at 7:18 AM, Kris De Meyer <k...@aniku.co.uk> wrote:

> Thanks for the tips. There are no deprecation warnings in my own code,
> but the Sundials package which I rely on has quite a few. Although the
> tests which are really slow don't print out deprecation warnings (the
> printed warnings occur earlier), I suppose deprecation checking can still
> be slowing things down, even if no warnings are printed anymore? I guess
> I'll just have to wait to see if Sundials can catch up (I see it's failing
> to pass tests under 0.4.0). This is very annoying as I wanted to move up
> from 0.3.11 because of problems in the PyCall/PyPlot/Conda packages that I
> don't seem to find a solution for in 0.3.11 but have been told are resolved
> in 0.4.0.
>

Sorry for the annoyance; the situation is quite frustrating. Sundials
should get updated soon.
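
In the meantime, one thing worth knowing about why suppressed warnings can
still hurt: deprecated methods in 0.4 are ordinary methods generated by the
@deprecate macro, and every call goes through Base.depwarn, which (as far as
I understand) collects a backtrace so it can deduplicate the printing. So
even when nothing new is printed, a deprecated method in a hot loop keeps
paying that cost. Starting Julia with --depwarn=no should skip most of it.
A minimal sketch, with a made-up oldnorm name just for illustration:

    # A hypothetical deprecated alias, defined the way Base deprecations are.
    # Calling it works, but every call goes through a stub that invokes
    # Base.depwarn before forwarding to the real function.
    @deprecate oldnorm(x) norm(x)

    oldnorm([1.0, 2.0])   # prints a warning once, but the stub runs every call

    # Disabling the deprecation machinery at startup avoids the backtrace
    # and printing work entirely:
    #     julia --depwarn=no myscript.jl

That won't make Sundials pass its tests, but it may claw back some of the
lost time until the package is updated for 0.4.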


> Please accept the following not as unbridled criticism but as a way to
> improve working procedures. I've been developing Julia code since release
> v0.3.7. Unfortunately, this isn't the first time I've lost many hours
> figuring out why something that works in one release stops working
> in a subsequent one. To be honest, it's putting me off Julia and it must
> have similar effects on other potential users too. To me this points to the
> need for better procedures and guidelines in the way the language
> progresses and how the "official" packages catch up. I worked a couple of
> years for the MathWorks. Breaking backwards compatibility was generally not
> allowed, and the coordinated testing and release procedures made that
> nearly impossible. For Julia to be taken seriously by people outside the
> academic community, it would do well to start looking at how similar
> procedures can be adopted into an open-source development model. It's one
> thing to write good code and develop the "ideal" language, it's another
> thing altogether to release workable software to an outside community.
>

This is a real issue. Julia is not yet at the point where prohibiting
backwards incompatible changes is realistic. But we're getting there – in
another year or so, after 1.0 is released, that will change. This is also
one of the reasons we founded Julia Computing – one of our first products
is a supported Julia distribution, including a set of packages that will be
updated in sync. We will support these versions for years, including making
sure supported packages continue to work and get bug fixes.

> On a slightly different note, in 2 or 3 release cycles, Matlab will have
> caught up on any performance gains Julia may have introduced (by using the
> same LLVM compiler procedures Julia uses) and then the only thing Julia
> will have going for it is that it's free. But my cost to my employers is
> such that if I lose as little as 3 days a year on compatibility issues,
> they would be better off paying for a Matlab license...
>

Matlab has made good progress with their JIT performance recently –
probably in no small part due to Julia's existence. I suspect, however,
that most of the low-hanging performance fruit has already been picked:
going from slow to decently fast is relatively easy compared to going from
decently fast to really fast. So I don't think you can necessarily
extrapolate that rate of performance improvement into the future. Yes,
Matlab uses LLVM which Julia also uses, but that doesn't mean it can do the
same things – the language designs are very different.

The notion of a closing gap between Matlab and Julia also supposes that
Julia is standing still, which is far from true. Julia is currently gaining
language-level multithreading support (as compared to having a few kernels
that are written in C to use threads). Soon we'll have the ability to
statically compile Julia programs without any dependency on LLVM – you will
get a standalone binary that only depends on a fairly small Julia runtime
library (i.e. no JIT). There's a lot of work being done on distributed
parallel computing as well – the current state of affairs is somewhat
usable but not nearly what we've envisioned for it. And of course we've
only begun to touch the kinds of fancy performance optimizations that can
be done – compared to, say, the kinds of stuff that's done by JavaScript
JITs. So even if Matlab catches up to where we are now, by the time that
happens, Julia will be way off in the distance, speeding into the future :-)
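
To give a flavor of the threading work, here's roughly what the experimental
interface looks like on current 0.5-dev builds (this is the Base.Threads
module; the syntax is still in flux, and threaded_fill! is just an
illustrative name):

    # Start Julia with e.g. JULIA_NUM_THREADS=4 set in the environment.
    function threaded_fill!(a)
        # Iterations are split across the available Julia threads;
        # threadid() records which thread handled each slot.
        Threads.@threads for i in 1:length(a)
            a[i] = Threads.threadid()
        end
        return a
    end

    threaded_fill!(zeros(Int, 8))

The loop body is ordinary Julia code rather than one of a fixed set of C
kernels, which is the whole point: any function you write will be able to use
the hardware threads directly.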
