[julia-users] Re: Setting socket options in ZMQ.jl

2016-07-25 Thread Jeffrey Sarnoff

>
> How are ZMQ socket options specified in Julia?

 
The attached file may help you.
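
In case the attachment doesn't come through, here is a minimal sketch of two routes (the ZMQ_CONFLATE value, the `s.data` field, and the `ZMQ.zmq` library handle are assumptions about ZMQ.jl's internals circa Julia 0.4 - check your installed version):

~~~
using ZMQ

ctx = Context()
s = Socket(ctx, SUB)

# Route 1: use a wrapped setter when ZMQ.jl provides one, e.g. the receive
# high-water mark:
ZMQ.set_rcvhwm(s, 1)

# Route 2: for options ZMQ.jl does not wrap (such as ZMQ_CONFLATE, value 54
# in zmq.h), call zmq_setsockopt directly; s.data holds the raw socket pointer.
const ZMQ_CONFLATE = 54
function set_conflate(s::Socket, flag::Integer)
    val = Ref{Cint}(flag)
    rc = ccall((:zmq_setsockopt, ZMQ.zmq), Cint,
               (Ptr{Void}, Cint, Ptr{Cint}, Csize_t),
               s.data, ZMQ_CONFLATE, val, sizeof(Cint))
    rc == 0 || error("zmq_setsockopt failed")
end
set_conflate(s, 1)

ZMQ.subscribe(s, "")
ZMQ.connect(s, "tcp://127.0.0.1:5001")
~~~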


On Monday, July 18, 2016 at 6:15:52 PM UTC-4, Salman Haider wrote:
>
>
> ZeroMQ has an option where the subscribe socket only keeps the last 
> message. I was wondering if there is a way to specify that.
> Something like the following:
>
> using ZMQ
> ctx = Context()
> s = Socket(ctx, SUB)
> ZMQ.subscribe(s)
> ZMQ.set_subscribe(ZMQ.CONFLATE, 1)   # ERROR: LoadError: UndefVarError: 
> CONFLATE not defined
> ZMQ.connect(s, "tcp://127.0.0.1:5001")
>
>
> while true
>  msg = bytestring(ZMQ.recv(s))
>  println(msg)
> end
>
>
>
> The C++ analog would go something like this:
>   zmq::context_t context (1);
>   zmq::socket_t subscriber (context, ZMQ_SUB);
>
>   int conflate = 1;
>
>   subscriber.setsockopt(ZMQ_CONFLATE, &conflate, sizeof(conflate));   // need this in julia
>   subscriber.connect("tcp://localhost:5556");
>   subscriber.setsockopt(ZMQ_SUBSCRIBE, "", 0); 
>
>
> More generally, how are socket options specified in Julia?
> http://api.zeromq.org/4-0:zmq-setsockopt
>
> Thanks.
>


ZMQ_sockets.jl
Description: Binary data


[julia-users] Re: Issue with Pkg.update() after fresh Julia installation on Windows 7 system

2016-07-25 Thread Jeffrey Sarnoff
Why not follow their advice:
>
>
> *Go to the Packages → Julia → Open Terminal menu and*
> *run Pkg.update() in Julia, then try again.*
> *If you still see an issue, please report it to:*
> *http://discuss.junolab.org/*



On Wednesday, July 20, 2016 at 4:16:03 PM UTC-4, Pedro Hussain wrote:
>
> After installing the Julia language on my Windows 7 PC (Windows 
> Self-Extracting Archive, 64 bit), I started the Julia console and ran 
> Pkg.update(). This resulted in the following error message:
> 
> *ERROR: couldn't update julia\v0.4.cache\JuliaParser using git remote 
> update*
> 
> I also installed Atom, and after the first start Atom recompiled a number 
> of modules and then halted with the following error message:
> 
> *Error loading Atom.jl package*
> *Go to the Packages → Julia → Open Terminal menu and*
> *run Pkg.update() in Julia, then try again.*
> *If you still see an issue, please report it to:*
> *http://discuss.junolab.org/*
> 
> I don't know whether the same cause is in effect in both cases. Can anyone 
> advise on the cause and how to overcome this issue? Thanks
>


[julia-users] Effects of globals on dispatch + inlining

2016-07-25 Thread Fábio Cardeal

Hello . . .



*Backstory: previously on julia-users...*
Recently, on another thread, I tried to explain to someone how globals were 
messing with type dispatch and inlining.
They probably already knew, but it was just a quick "make sure" explanation; 
then I tripped over it myself, and now I want to make sure **I** know.
I experimented a little and posted a better explanation...
I STILL missed a few things, as I am prone to do, so I made a little 
Jupyter notebook to link to and revise when needed.
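
A tiny self-contained illustration of the core effect, separate from the notebook:

~~~
# A non-const global's type can change at any time, so functions that read it
# can't be specialized on it: calls through it are dynamically dispatched and
# won't be inlined. A const global has a fixed type and behaves like a literal.
gx = 1.0          # non-const global: inferred as Any inside other functions
const cx = 1.0    # const global: type fixed at Float64

f(a) = 2a

g() = f(gx)       # dynamic dispatch on every call
h() = f(cx)       # f(::Float64) can be statically dispatched and inlined

# Compare @code_warntype g() (shows Any) with @code_warntype h() (shows Float64).
~~~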

*---*

The notebook: https://gist.github.com/fcard/1bf78e9d4f6ea3be76518f6a0fbe0283
It would be lovely if someone could take a quick look and see if the 
information there is correct and if I missed anything.

Thanks!


Re: [julia-users] Re: A Very Simple Benchmark for Brutal-force Loops in Several Languages: revised, Julia is fast!

2016-07-25 Thread Tim Holy
Given the apparent interest in the topic and the decisions that people seem to 
be making, it seems worth pointing out that folks are still using apples-to-
oranges comparisons on this benchmark.

There are at least two important differences:
- in the other languages, `linspace` allocates a vector, but in Julia it 
(currently) creates a compact object from which values are computed on-the-fly. 
That computation involves a division, and division is slow.
- The languages like C aren't doing bounds-checking. You might imagine adding 
`@inbounds` to the Julia version. But `@inbounds` has no impact on LinSpace 
objects in julia 0.4. In julia 0.5, it does work like you'd expect (thanks, 
Blake Johnson).

Combining these observations, we can `collect` the values into a `Vector` and 
then use `@inbounds`. For me the version below is nearly twice as fast as the 
original:

function benchmark()
    nsamples = 100
    x = collect(linspace(0, 5, nsamples))
    y = zeros(nsamples)
    # attempt to trigger JIT to compile all functions needed in the loops
    # before profiling
    a = cos(0.0); a = 1.5*2.5; a = 1.5+2.5;
    println("\nBrutal-force loops, 100 times:")
    @time begin
        for m = 1:100
            @inbounds for n = 1:nsamples
                y[n] = cos(2*x[n]+5);
            end
        end
    end
end
benchmark();

Best,
--Tim

On Monday, July 25, 2016 2:02:41 PM CDT Zhong Pan wrote:
> Agree that while raw speed is important, in most situations it wouldn't be
> the most important reason to choose one programming language over another.
> 
> I came from the angle of an engineer in a small company. For myself, the
> main attraction of Julia was the easiness to achieve decent speed without
> making much explicit effort: that means what feels more natural vectorized
> will be vectorized, while what feels more natural in a loop will be in a
> loop; that means I don't need to resort to another language or a library
> only for improving speed; and that means apart from sticking to a couple
> good habits, I can use objects, functions etc. the same way inside a loop
> vs. outside. None of these is critical by itself, but they add up to an
> uninterrupted flow of thoughts while writing code to explore, try, fail,
> and retry, for many iterations.
> 
> During this "less careful" prototyping, a 1-2x slowdown is fine, but with
> Julia I know I won't sit there for tens of minutes waiting for a result
> while debating with myself whether I should rewrite it in C++ or overhaul
> the code with Cython etc.; instead I can rest assured that as long as my algo
> and coding have no mistakes or major flaws, the speed is close to what I will
> get even if I make several times more effort to rewrite it in C++.
> 
> Another big deal for me is the resulting removal of the barrier between
> prototype and production code. For production I can review and improve my
> code carefully, but rewriting it in a less expressive language is too much.
> 
> I was a huge fan of Python (heck, I even persuaded my previous boss, a VP,
> to pick up Python - though I don't know if he really had time to finish it
> :-)). However, the slow raw speed and the over-freedom to change class
> definition anywhere always gave me the itch to find something better. My
> brother at JPL who worked on Python projects also complained about having
> to think really hard to vectorize almost everything and then couldn't
> easily understand what he was doing a few months later because the code was
> too unnatural for the problem; the indentation was also a big headache as
> collaborators use different editors with different tab definitions.
> 
> So I'm really happy to have found Julia, which gave me the same joy as
> coding in Python and removed the main itches.
> 
> -Zhong




[julia-users] Re: A Very Simple Benchmark for Brutal-force Loops in Several Languages: revised, Julia is fast!

2016-07-25 Thread Zhong Pan
Agree that while raw speed is important, in most situations it wouldn't be 
the most important reason to choose one programming language over another.

I came from the angle of an engineer in a small company. For myself, the 
main attraction of Julia was the easiness to achieve decent speed without 
making much explicit effort: that means what feels more natural vectorized 
will be vectorized, while what feels more natural in a loop will be in a 
loop; that means I don't need to resort to another language or a library 
only for improving speed; and that means apart from sticking to a couple 
good habits, I can use objects, functions etc. the same way inside a loop 
vs. outside. None of these is critical by itself, but they add up to an 
uninterrupted flow of thoughts while writing code to explore, try, fail, 
and retry, for many iterations.

During this "less careful" prototyping, a 1-2x slowdown is fine, but with 
Julia I know I won't sit there for tens of minutes waiting for a result 
while debating with myself whether I should rewrite it in C++ or overhaul the 
code with Cython etc.; instead I can rest assured that as long as my algo and 
coding have no mistakes or major flaws, the speed is close to what I will 
get even if I make several times more effort to rewrite it in C++.

Another big deal for me is the resulting removal of the barrier between 
prototype and production code. For production I can review and improve my 
code carefully, but rewriting it in a less expressive language is too much.

I was a huge fan of Python (heck, I even persuaded my previous boss, a VP, 
to pick up Python - though I don't know if he really had time to finish it 
:-)). However, the slow raw speed and the over-freedom to change class 
definition anywhere always gave me the itch to find something better. My 
brother at JPL who worked on Python projects also complained about having 
to think really hard to vectorize almost everything and then couldn't 
easily understand what he was doing a few months later because the code was 
too unnatural for the problem; the indentation was also a big headache as 
collaborators use different editors with different tab definitions. 

So I'm really happy to have found Julia, which gave me the same joy as 
coding in Python and removed the main itches.

-Zhong


Re: [julia-users] ReadOnlyMemoryError() on Windows 64

2016-07-25 Thread 'Bill Hart' via julia-users
It turns out that the improperly initialised struct was a coincidence and 
unrelated to the ReadOnlyMemoryError() we were getting on Windows.

We have tracked the issue down, and it was very subtle. Basically the dll 
we built for MPIR (our GMP drop-in replacement) was being built 
incorrectly. It was using Linux assembly code instead of Windows assembly, causing 
all sorts of problems.

The following is a bit off topic, but just in case anyone else encounters 
this issue themselves, here is the correct way to build MPIR and MPFR dlls 
on Windows 64 under msys2 with mingw-w64 compiler installed and in the PATH 
(assuming MPIR's ./config.guess claims you have a core2):

wget http://mpir.org/mpir-2.7.2.tar.bz2
tar -xvf mpir-2.7.2.tar.bz2
cd mpir-2.7.2
./configure --enable-shared --disable-static --enable-gmpcompat \
    --build=core2-w64-mingw64 LDFLAGS=-static-libgcc ABI=64
make -j
cd ..
wget http://www.mpfr.org/mpfr-current/mpfr-3.1.4.tar.bz2
tar -xvf mpfr-3.1.4.tar.bz2
cd mpfr-3.1.4
./configure --with-gmp-build=/home/User/mpir-2.7.2 --enable-shared \
    --disable-static
make -j
cd ..

Bill.


[julia-users] Re: A Very Simple Benchmark for Brutal-force Loops in Several Languages: revised, Julia is fast!

2016-07-25 Thread dextorious
I haven't done any systematic benchmarking since Numba introduced the 
ability to JIT compile entire classes. In my experience, very well written 
Julia code is usually equivalent or better (in cases when @simd is helpful) 
compared to Numba JIT'd code. The Python code is sometimes easier to write, 
since Numba takes care of everything, but it's a double edged sword - if 
you run into a case where Numba doesn't work well or at all, you're just 
out of luck. In my personal view, the availability of Numba and other 
libraries just means Python vs Julia performance comparisons aren't 
particularly relevant, you should pick the language you prefer, not because 
you think it's faster, whereas if absolute performance is the only metric, 
you have to resort to Fortran/C++ anyway. I'm writing most of my code in 
Julia because I prefer the type system and multiple dispatch to Python's 
OOP, but I don't think it's meaningfully faster as long as you're willing 
to use appropriate libraries.

The one outlier is the fact that GPU programming is much easier in Python 
at the moment. Hopefully that will change soon, some of the progress in 
ArrayFire.jl and CUDA libraries is very promising.

On Monday, July 25, 2016 at 7:34:04 PM UTC+3, dnm wrote:
>
> Interesting. Did you use the updated Julia code? 
>
> Have you done any comparisons between reading and writing  Numba JIT 
> classes and Julia types in tight loops?
>
> On Monday, July 25, 2016 at 10:41:48 AM UTC-4, dexto...@gmail.com wrote:
>>
>> Just for the sake of comprehensiveness, I ran your Python benchmark 
>> through the Numba JIT library (which uses the same underlying LLVM 
>> infrastructure that Julia does) and on my computer the Python code is 
>> faster than Julia by 68%. Vanilla CPython is terrible for this kind of 
>> simple explicit loop code, but Numba and other JIT libraries largely solve 
>> that issue with minimal effort as long as the code is simple enough. That 
>> by no means solves all of Python's issues in the context of numerical 
>> programming and I'm sure the Julia benchmark could be improved as others 
>> have already mentioned, but benchmarking Python this way isn't necessarily 
>> representative of how a performance-conscious programmer would reasonably 
>> approach a problem of this kind.
>>
>

Re: [julia-users] Re: Which package downgrades other packages?

2016-07-25 Thread Stefan Karpinski
On Mon, Jul 25, 2016 at 10:26 AM, Jeffrey Kuhn  wrote:

> In the age of TB hard drives, package versioning is what node gets right.
> Packages can either be added globally or specific versions can be
> automatically downloaded into a subfolder of the current project directory.


This approach doesn't work for native languages where shared libraries must
actually be shared. It also doesn't work for a language like Julia with
generic functions. Note that npm is in the process of backing off of this
model at least partially.


> I wish Julia would allow projects with local package environments, rather
> than one big hidden `.julia` folder for everything.


You can already have project-local environments using the JULIA_PKGDIR
environment variable. The new Pkg3 will significantly increase support for this
mode of working.
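
A minimal sketch of that workflow (the `.pkg` directory name here is an arbitrary choice, not a convention):

~~~
# Point Pkg at a project-local directory instead of the default ~/.julia,
# then initialize and use it as usual.
ENV["JULIA_PKGDIR"] = joinpath(pwd(), ".pkg")
Pkg.init()          # sets up METADATA and the version directory under that path
Pkg.add("JSON")     # installs into the project-local environment
println(Pkg.dir())  # shows which package directory is active
~~~

Setting the variable in the shell before launching Julia works just as well.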


Re: [julia-users] Re: Which package downgrades other packages?

2016-07-25 Thread Stefan Karpinski
On Mon, Jul 25, 2016 at 12:05 PM, Andreas Lobinger 
wrote:

> Hello colleagues,
>
> On Monday, July 25, 2016 at 4:40:10 PM UTC+2, Stefan Karpinski wrote:
>>
>> The difference between the Julia package ecosystem and DLL hell is that
>> DLLs expose static interfaces and cannot adapt to their environment. If
>> libA.dll expects one interface from libC.dll and libB.dll expects another,
>> you're just stuck – you can use libA.dll or libB.dll but not both (unless
>> you do some crazy stuff with loading multiple copies of libC.dll). Julia
>> packages are quite a bit more flexible: upon loading, A.jl can introspect
>> the version and interface of C.jl and adapt to it, so that it works with a
>> newer or older version correctly. If you want to be able to use A and B at
>> the same time with a newer version of C in the scenario you describe, the
>> solution is to update A so that it works with the newer version of C, and
>> ideally continues to work with the older version as well.
>>
>
> Can you give us a good example of situation A.jl and C.jl you described?
>

Compat.jl does this extensively with respect to Julia itself.
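
For instance, the load-time introspection pattern looks roughly like this (a minimal sketch of the kind of thing Compat.jl does; the function names are illustrative, not Compat's API):

~~~
# Pick whichever spelling the running Julia provides, so downstream code
# is written once against a single name.
if VERSION >= v"0.5.0-dev"
    readstr(io) = readstring(io)   # 0.5 name
else
    readstr(io) = readall(io)      # 0.4 name
end
~~~

A package can do the same against another package by checking `Pkg.installed("C")` and branching on the returned version number.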

While I value Stefan's optimism
>
>> and ideally continues to work
>
> and the experimental/scientific approach
>
>>  A.jl can introspect the version and interface of C.jl and adapt to it,
>
> still I could raise the question: And what if not?
>

The "ideally" applies to developer actions, not whether this is possible or
not – it is.


> And why is it so easy to publish a package creating dependency problems?
> Why are dependency problems not detected by automatic testing? And what is
> the recommended way if a package author isn't available? etc.
>

Dependency problems don't seem particularly common and they seem to be
resolved quickly when they come up. Automatic testing is the ideal way to
handle this, but someone has to build, maintain and host that testing
infrastructure. The problem of automatically detecting version interactions
is exponential in the number of packages and (and polynomial in the number
of versions with a very high degree), so some significant cleverness would
need to be applied in order to make this computationally practical.


[julia-users] Re: A Very Simple Benchmark for Brutal-force Loops in Several Languages: revised, Julia is fast!

2016-07-25 Thread dnm
Interesting. Did you use the updated Julia code? 

Have you done any comparisons between reading and writing  Numba JIT 
classes and Julia types in tight loops?

On Monday, July 25, 2016 at 10:41:48 AM UTC-4, dexto...@gmail.com wrote:
>
> Just for the sake of comprehensiveness, I ran your Python benchmark 
> through the Numba JIT library (which uses the same underlying LLVM 
> infrastructure that Julia does) and on my computer the Python code is 
> faster than Julia by 68%. Vanilla CPython is terrible for this kind of 
> simple explicit loop code, but Numba and other JIT libraries largely solve 
> that issue with minimal effort as long as the code is simple enough. That 
> by no means solves all of Python's issues in the context of numerical 
> programming and I'm sure the Julia benchmark could be improved as others 
> have already mentioned, but benchmarking Python this way isn't necessarily 
> representative of how a performance-conscious programmer would reasonably 
> approach a problem of this kind.
>


Re: [julia-users] Re: Which package downgrades other packages?

2016-07-25 Thread dnm
Is this something that will be addressed with the new package system?

On Monday, July 25, 2016 at 12:05:27 PM UTC-4, Andreas Lobinger wrote:
>
> Hello colleagues,
>
> On Monday, July 25, 2016 at 4:40:10 PM UTC+2, Stefan Karpinski wrote:
>>
>> The difference between the Julia package ecosystem and DLL hell is that 
>> DLLs expose static interfaces and cannot adapt to their environment. If 
>> libA.dll expects one interface from libC.dll and libB.dll expects another, 
>> you're just stuck – you can use libA.dll or libB.dll but not both (unless 
>> you do some crazy stuff with loading multiple copies of libC.dll). Julia 
>> packages are quite a bit more flexible: upon loading, A.jl can introspect 
>> the version and interface of C.jl and adapt to it, so that it works with a 
>> newer or older version correctly. If you want to be able to use A and B at 
>> the same time with a newer version of C in the scenario you describe, the 
>> solution is to update A so that it works with the newer version of C, and 
>> ideally continues to work with the older version as well.
>>
>
> Can you give us a good example of situation A.jl and C.jl you described?
>
> While I value Stefan's optimism 
>
>> and ideally continues to work
>
> and the experimental/scientific approach
>
>>  A.jl can introspect the version and interface of C.jl and adapt to it,
>
> still I could raise the question: And what if not?
>
> And why is it so easy to publish a package creating dependency problems? 
> Why are dependency problems not detected by automatic testing? And what is 
> the recommended way if a package author isn't available? etc.
>
> I do not have any answer to the above questions and I expect no simple 
> answers for someone...
> I agree with Stefan: structure is there to avoid these problems and a 
> reasonable, forward-looking package author can deal with it, but what to do 
> in cases where this has not happened.
>
>
>

Re: [julia-users] Re: Which package downgrades other packages?

2016-07-25 Thread Andreas Lobinger
Hello colleagues,

On Monday, July 25, 2016 at 4:40:10 PM UTC+2, Stefan Karpinski wrote:
>
> The difference between the Julia package ecosystem and DLL hell is that 
> DLLs expose static interfaces and cannot adapt to their environment. If 
> libA.dll expects one interface from libC.dll and libB.dll expects another, 
> you're just stuck – you can use libA.dll or libB.dll but not both (unless 
> you do some crazy stuff with loading multiple copies of libC.dll). Julia 
> packages are quite a bit more flexible: upon loading, A.jl can introspect 
> the version and interface of C.jl and adapt to it, so that it works with a 
> newer or older version correctly. If you want to be able to use A and B at 
> the same time with a newer version of C in the scenario you describe, the 
> solution is to update A so that it works with the newer version of C, and 
> ideally continues to work with the older version as well.
>

Can you give us a good example of situation A.jl and C.jl you described?

While I value Stefan's optimism 

> and ideally continues to work

and the experimental/scientific approach

>  A.jl can introspect the version and interface of C.jl and adapt to it,

still I could raise the question: And what if not?

And why is it so easy to publish a package creating dependency problems? 
Why are dependency problems not detected by automatic testing? And what is 
the recommended way if a package author isn't available? etc.

I do not have any answer to the above questions and I expect no simple 
answers for someone...
I agree with Stefan: structure is there to avoid these problems and a 
reasonable, forward-looking package author can deal with it, but what to do 
in cases where this has not happened.




Re: [julia-users] embedding: julia repl as a shared library

2016-07-25 Thread Yichao Yu
On Mon, Jul 25, 2016 at 11:18 AM, Joosep Pata  wrote:
>
>> I was under the impression that by "patch repl.c" you mean to patch it
>> somehow so that you can compile as a shared library, that will be very
>> bad and is intentionally not supported.
>> If you are talking about adding your llvm initialization stuff in this
>> file and compile it still as an executable, and if your current goal is
>> to get a julia binary that does not confuse LLVM then I think that's
>> the best way to do it and the approach itself is not "dirty" at all
>> (apart from possibly dirty things you need to do to "unconfuse" LLVM,
>> which you need to do anyway, independent of where you do it).
>
> Yes, that's what I wanted to do (re-compile the julia binary with my preinit 
> code), sorry for not being clear. If only 3 (or some N<10) lines of code were 
> needed to make a fully functional julia binary using libjulia, I suppose 
> repl.c would be a bit shorter as well.

Well, this IS actually the case. Most of the logic in `repl.c` is
actually not necessary for a working julia binary. Much of it (in
terms of line count) is actually only needed during (or even only for
debugging) bootstrapping. In other words, `true_main` should
basically only contain one `jl_set_ARGS` and one `jl_eval_string` if
it doesn't need to support bootstrapping.

>
> In fact, I found the way to implement what I needed also via dlopening 
> libjulia [1] as you suggested, but having a non-standard location for the 
> julia binary (wrt. julia source tree) is a real pain, so I think I'll just go 
> with the patch-and-recompile-binary approach.
>
> Thanks again for the clarifications!
>
> [1] https://github.com/jpata/ROOT.jl/blob/cxx/src/ui.cc


Re: [julia-users] embedding: julia repl as a shared library

2016-07-25 Thread Joosep Pata

> I was under the impression that by "patch repl.c" you mean to patch it
> somehow so that you can compile as a shared library, that will be very
> bad and is intentionally not supported.
> If you are talking about adding your llvm initialization stuff in this
> file and compile it still as an executable, and if your current goal is
> to get a julia binary that does not confuse LLVM then I think that's
> the best way to do it and the approach itself is not "dirty" at all
> (apart from possibly dirty things you need to do to "unconfuse" LLVM,
> which you need to do anyway, independent of where you do it).

Yes, that's what I wanted to do (re-compile the julia binary with my preinit 
code), sorry for not being clear. If only 3 (or some N<10) lines of code were 
needed to make a fully functional julia binary using libjulia, I suppose repl.c 
would be a bit shorter as well.

In fact, I found the way to implement what I needed also via dlopening libjulia 
[1] as you suggested, but having a non-standard location for the julia binary 
(wrt. julia source tree) is a real pain, so I think I'll just go with the 
patch-and-recompile-binary approach.

Thanks again for the clarifications!

[1] https://github.com/jpata/ROOT.jl/blob/cxx/src/ui.cc

Re: [julia-users] embedding: julia repl as a shared library

2016-07-25 Thread Yichao Yu
On Sun, Jul 24, 2016 at 3:19 PM, Yichao Yu  wrote:
> On Sun, Jul 24, 2016 at 3:18 PM, Yichao Yu  wrote:
>> On Sun, Jul 24, 2016 at 3:12 PM, Joosep Pata  wrote:
>>> Right, thanks for the tip. To confirm: `ui/repl.c` is still the code that
>>> gets compiled to the julia(-debug) binary, right?
>>
>> Yes.
>>
>>> If I call "Base._start()" via libjulia, I still need to reproduce the usual
>>> argv logic of the julia binary.
>>> I'll just patch `repl.c` to my needs then without changing the linking, it's
>>> dirty but seems better that re-implementing.

I was under the impression that by "patch repl.c" you mean to patch it
somehow so that you can compile as a shared library, that will be very
bad and is intentionally not supported.
If you are talking about adding your llvm initialization stuff in this
file and compile it still as an executable, and if your current goal is
to get a julia binary that does not confuse LLVM then I think that's
the best way to do it and the approach itself is not "dirty" at all
(apart from possibly dirty things you need to do to "unconfuse" LLVM,
which you need to do anyway, independent of where you do it).

>>
>> Sure, if you think literally two lines of code is much worse than a dirty
>> patch then go for it.
>
> Sorry, 3 lines, to be more precise.
>
>>
>>>
>>> On Sunday, 24 July 2016 20:43:54 UTC+2, Yichao Yu wrote:

 On Sun, Jul 24, 2016 at 2:39 PM, Yichao Yu  wrote:
 > On Sun, Jul 24, 2016 at 2:37 PM, Joosep Pata  wrote:
 >> I'd like to not re-implement all the REPL boiler-plate, like
 >> ~~~
 >> ios_puts("\njulia> ", ios_stdout);
 >> ios_flush(ios_stdout);
 >> line = ios_readline(ios_stdin);
 >> ~~~
 >> and so on.
 >
 > That's not the repl and you don't need to implement that.

 The only thing you need to do to get a repl after initialization is to
 call `Base._start()`. Simply `jl_eval_string("Base._start()")` should
 work.

 >
 >>
 >> In effect, I want to launch the usual julia REPL, but call some of my
 >> own
 >
 > Which is **NOT** in the repl.c
 >
 >> initialization procedures before julia_init.
 >> My motivation is that I want to call an external library that dies
 >> horribly
 >> due to the LLVM symbols present if loaded after julia_init is called,
 >> see
 >> also https://github.com/JuliaLang/julia/issues/12644.
 >> The only way I've managed to do it is to recompile the julia binary, I
 >> figured I could re-use the repl code by just dynamically loading it.
 >>
 >> On Sunday, 24 July 2016 19:55:00 UTC+2, Yichao Yu wrote:
 >>>
 >>> On Sun, Jul 24, 2016 at 1:21 PM, Joosep Pata 
 >>> wrote:
 >>> > Hi,
 >>> >
 >>> > I'd like to compile ui/repl.c into a shared library so that I could
 >>> > dlopen julia after some other initialization procedures that would
 >>> > otherwise
 >>> > conflict with the LLVM linked to julia.
 >>>
 >>> You should **NOT** compile `ui/repl.c` since it will fail as you saw.
 >>> You should just use `libjulia.so`, why is it not enough? `ui/repl.c`
 >>> is just a very thin wrapper and should have nothing to do with LLVM or
 >>> whatever conflict you saw.
 >>>
 >>> >
 >>> > I succeeded in doing that on OSX using:
 >>> >
 >>> > ~~~
 >>> > diff --git a/ui/Makefile b/ui/Makefile
 >>> > +julia-release: $(build_bindir)/julia$(EXE)
 >>> > $(build_private_libdir)/librepl.$(SHLIB_EXT)
 >>> > ...
 >>> > +$(build_private_libdir)/librepl.$(SHLIB_EXT): $(OBJS)
 >>> > +   @$(call PRINT_LINK, $(CXXLD) -shared $(CXXFLAGS)
 >>> > $(CXXLDFLAGS)
 >>> > $(LINK_FLAGS) $(SHIPFLAGS) $^ -o $@ -L$(build_private_libdir)
 >>> > -L$(build_libdir) -L$(build_shlibdir)
 >>> > ~~~
 >>> >
 >>> > so I can call julia dynamically as
 >>> > ~~~
 >>> >  my_init(); // initalize stuff that hides its LLVM symbols after
 >>> > loading
 >>> > ...
 >>> >  void* handle_julia = dlopen(LIBJULIAREPL, RTLD_NOW | RTLD_GLOBAL);
 >>> > ...
 >>> >  typedef int (*t_jl_main)(int, char**);
 >>> >  t_jl_main jl_main = (t_jl_main)dlsym(handle_julia, "main");
 >>> >  return jl_main(argc, argv);
 >>> > ~~~
 >>> >
 >>> > On linux, I get strange linker errors:
 >>> > `/usr/bin/ld: repl.o: relocation R_X86_64_TPOFF32 against
 >>> > `tls_states.12084' can not be used when making a shared object;
 >>> > recompile
 >>> > with -fPIC`
 >>> >
 >>> > As far as I can tell, julia uses fPIC throughout. Has anyone
 >>> > encountered
 >>> > something like this before? Google links to some old gcc bugs and a
 >>> > go
 >>> > linker issue but it's not evident if there is a fix.
 >>> >
 >>> > Cheers,
 >>> > Joosep


Re: [julia-users] Re: Which package downgrades other packages?

2016-07-25 Thread Jeffrey Kuhn
In the age of TB hard drives, package versioning is what node gets right. 
Packages can either be added globally or specific versions can be 
automatically downloaded into a subfolder of the current project directory. 
I wish Julia would allow projects with local package environments, rather 
than one big hidden `.julia` folder for everything.

On Sunday, July 24, 2016 at 2:26:29 PM UTC-5, David Anthoff wrote:
>
> The deeper problem though is: what if I want to use all the currently 
> installed packages at the same time, but don’t want the downgraded versions 
> of them? Say I want to use package A and B, and both have a dependency on 
> C, but A requires some older version of C, and that prevents me from 
> getting the latest version of B.
>
>  
>
> In some way the situation is a little like DLL hell on Windows in the 
> early 90s: we can only ever have one version of a dependency installed and 
> loaded at the same time in julia right now. I guess one way to “solve” this 
> would be to start adding new versions of packages whenever there is a 
> breaking change, i.e. when C introduced a breaking change it could have 
> registered a new version C2. Terrible, of course, but that is what happened 
> on Windows for things like the MS C runtime…
>
>  
>
> Are there any thoughts around this, how this could be solved for julia? My 
> guess is that these situations will only happen more often as the package 
> eco-system grows…
>
>  
>
> *From:* stefan.k...@gmail.com  [mailto:stefan.k...@gmail.com 
> ] *On Behalf Of *Stefan Karpinski
> *Sent:* Saturday, July 23, 2016 11:06 AM
> *To:* Julia Users >
> *Subject:* Re: [julia-users] Re: Which package downgrades other packages?
>
>  
>
> The way to do this would be to compute which packages' removal would allow 
> another package to be upgraded.
>
>  
>
> On Sat, Jul 23, 2016 at 1:59 PM, Tony Kelman  > wrote:
>
> Part of the issue here is dependency resolution is a global feasibility 
> problem, each package imposes a subset of constraints. So you can identify 
> active / free constraints at a solution, but it can be hard to assign blame 
> to a single package in general.
>
>
>
>
> On Saturday, July 23, 2016 at 7:37:50 AM UTC-7, Chris Rackauckas wrote:
>
> Another +1. When Optim.jl tagged v0.5, it took me too long to find out it was 
> responsible for rolling back a few of my packages, causing some tests to 
> break (especially since I didn't have master checked out for it, so I 
> wasn't expecting it to really change! I only tracked it down because of the 
> julia-users announcement). That's not Optim's fault, but an issue with the 
> package system for not making it explicit why it was occurring (at least it 
> didn't a month ago?). I think Pkg.update() tells you when a package is 
> rolled back, but not why.
>
>  
>
> IIRC, I was really hoping that Pkg.status() would tell me whenever a 
> package was not at its highest version due to another package, and tell 
> me which package was doing that. For example,
>
>  
>
> -CoolPkg 0.1 (Rolled back due to AwesomeFoo)
>
>
> Then it would be easy to see where I should checkout master, find how to 
> make them work together, and submit a pull request! But I don't know if 
> that would be difficult to implement.
>
> On Friday, July 22, 2016 at 7:24:30 PM UTC-7, Tony Kelman wrote:
>
> Maybe a useful function to write and submit to PkgDev would be to go through 
> all installed packages, check the METADATA requires file for all the 
> installed versions and display a list of upper-bounded dependencies and 
> which package is responsible for each. A little bit of code might go a long 
> way in making this more discoverable.
>
>  
>


Re: [julia-users] Re: What packages, features, other strengths would you recommend when showing Julia to molecular neurobiologists?

2016-07-25 Thread Stefan Karpinski
I think that the key is to take an informative approach rather than a
forceful one. If people are happy using whatever they're currently using,
don't try to force them to change. There are, however, usually people who
are in a great deal of pain trying to solve the problems they're tackling
with the tools they have – those people are often extremely relieved to
find something that can make their lives easier. People are themselves in
the best position to know this, so if you show them something like Julia
and what it can do, they'll know if they have a use for it or not. That
said, showing examples that connect with them and allow them to imagine how
to apply it in their work is key.

On Mon, Jul 25, 2016 at 7:18 AM, Tamas Papp  wrote:

> Hate to sound like a curmudgeon, but language evangelism frequently
> backfires, and if it is coming from a person not working in a particular
> problem domain, the best you can expect is a shrug. Which is fair I
> guess, "I don't know what you are doing but I am sure you would find
> language X a good match for it" doesn't sound too convincing.
>
> On the bright side, Julia is spreading fast in many communities, so if it
> is useful for those scientists, I am sure they will find it on their own
> quickly. When they are ready; and when the language is ready.
>
> On Mon, Jul 25 2016, Chris Rackauckas wrote:
>
> > It seems like most of what they do is biostatistics/bioinformatics. I
> would
> > show them PyCall and RCall. Knowing that you easily have all of those
> > libraries (and your previous libraries) is great. Also show them the
> > JuliaStats stuff.
> >
> > In fact, ask them what they'd want to add to Julia if they had the time.
> > You'll run the gamut and show them a package which already does it. This
> > happens all the time on the Gitter: someone new comes saying "hey I want
> to
> > learn Julia. It's new so it doesn't have many packages... does it have
> > something for this? Oh it does... this? Oh it does... this? Oh..., its
> > package system is actually pretty complete." This combined with the
> > R/Python/MATLAB glue really makes one confident that Julia at least has
> > enough to try on a real project (and get hooked).
> >
> > I'd also show them Plots.jl. It is also much nicer than other plotting
> > libraries I've used before. The fact that you can switch backends with
> the
> > same code means that you get all the new updates "for free" when backends
> > come out (I'm looking at GLVisualize!)
> >
> > Definitely show them the BioJulia group.
> >
> > Show them @parallel and pmap. If they have HPCs, show them how to just
> give
> > Julia the machinefile and together you already have multinode parallelism
> > for embarrassingly parallel problems.
> >
> > Last but not least, show them the community: julia-users, the Gitter
> > channels for chatting with the devs, etc. Knowing that there's always
> help
> > right there is really wonderful.
> >
> > On Monday, July 25, 2016 at 2:44:16 AM UTC-7, Job van der Zwan wrote:
> >>
> >> *TLDR: I'd like to show Julia to my colleagues, but don't have a clue
> >> which cool packages and features I should show off to them, because I
> don't
> >> do any scientific work myself.*
> >>
> >> Hi,
> >>
> >> I'm an interaction designer working for a research group at Karolinska
> >> Institute[0]. Basically, I'm a glorified front-end webdev. I don't do
> any
> >> scientific work myself, I'm just building a web-based interface for
> >> browsing and visualizing single-cell data for them. So my use-cases
> don't
> >> seem to align with Julia's strengths, but I like the design of the
> >> language, the ideas behind the project and have been following its
> >> development with great pleasure.
> >>
> >> Last week while watching a bunch of JuliaCon videos during a lunch
> break,
> >> one of my colleagues asked what the video was about. I tried to explain
> the
> >> Julia project to him, as well as the language's strengths and
> weaknesses.
> >> Sadly, I didn't really do a good job of it, since I don't actually
> program
> >> in it myself. He said it looked a lot like Matlab (his language of
> choice)
> >> and was interested in the free-and-open-source aspect. But he expected
> >> there to not be enough packages yet for him to work with it and was
> >> sceptical about whether switching to it would be worth it. I tried to
> >> explain that Julia can call out to Matlab code with practically no
> >> overhead, but he didn't really look convinced (and I didn't have a
> working
> >> Julia environment to show it off to him either). Jupyter was also a
> >> turn-off, since he doesn't like notebooks, but the Juno video compensated
> >> for that a lot.
> >>
> >> Basically, I'd like to show Julia to my colleagues, give them some
> >> pointers on where it might be fun to start playing with it, what are
> some
> >> of its amazing features *that matter to them*, but I don't have a clue
> of
> >> what I should focus on to do so.
> >>
> >> The researcher

[julia-users] Re: A Very Simple Benchmark for Brutal-force Loops in Several Languages: revised, Julia is fast!

2016-07-25 Thread dextorious
Just for the sake of comprehensiveness, I ran your Python benchmark through 
the Numba JIT library (which uses the same underlying LLVM infrastructure 
that Julia does) and on my computer the Python code is faster than Julia by 
68%. Vanilla CPython is terrible for this kind of simple explicit loop 
code, but Numba and other JIT libraries largely solve that issue with 
minimal effort as long as the code is simple enough. That by no means 
solves all of Python's issues in the context of numerical programming and 
I'm sure the Julia benchmark could be improved as others have already 
mentioned, but benchmarking Python this way isn't necessarily 
representative of how a performance-conscious programmer would reasonably 
approach a problem of this kind.


Re: [julia-users] Re: Which package downgrades other packages?

2016-07-25 Thread Stefan Karpinski
The difference between the Julia package ecosystem and DLL hell is that
DLLs expose static interfaces and cannot adapt to their environment. If
libA.dll expects one interface from libC.dll and libB.dll expects another,
you're just stuck – you can use libA.dll or libB.dll but not both (unless
you do some crazy stuff with loading multiple copies of libC.dll). Julia
packages are quite a bit more flexible: upon loading, A.jl can introspect
the version and interface of C.jl and adapt to it, so that it works with a
newer or older version correctly. If you want to be able to use A and B at
the same time with a newer version of C in the scenario you describe, the
solution is to update A so that it works with the newer version of C, and
ideally continues to work with the older version as well.

On Sun, Jul 24, 2016 at 3:26 PM, David Anthoff  wrote:

> The deeper problem though is: what if I want to use all the currently
> installed packages at the same time, but don’t want the downgraded versions
> of them? Say I want to use package A and B, and both have a dependency on
> C, but A requires some older version of C, and that prevents me from
> getting the latest version of B.
>
>
>
> In some way the situation is a little like DLL hell on Windows in the
> early 90s: we can only ever have one version of a dependency installed and
> loaded at the same time in julia right now. I guess one way to “solve” this
> would be to start adding new versions of packages whenever there is a
> breaking change, i.e. when C introduced a breaking change it could have
> registered a new version C2. Terrible, of course, but that is what happened
> on Windows for things like the MS C runtime…
>
>
>
> Are there any thoughts around this, how this could be solved for julia? My
> guess is that these situations will only happen more often as the package
> eco-system grows…
>
>
>
> *From:* stefan.karpin...@gmail.com [mailto:stefan.karpin...@gmail.com] *On
> Behalf Of *Stefan Karpinski
> *Sent:* Saturday, July 23, 2016 11:06 AM
> *To:* Julia Users 
> *Subject:* Re: [julia-users] Re: Which package downgrades other packages?
>
>
>
> The way to do this would be to compute which packages' removal would allow
> another package to be upgraded.
>
>
>
> On Sat, Jul 23, 2016 at 1:59 PM, Tony Kelman  wrote:
>
> Part of the issue here is dependency resolution is a global feasibility
> problem, each package imposes a subset of constraints. So you can identify
> active / free constraints at a solution, but it can be hard to assign blame
> to a single package in general.
>
>
>
>
> On Saturday, July 23, 2016 at 7:37:50 AM UTC-7, Chris Rackauckas wrote:
>
> Another +1. When Optim.jl tagged v0.5, it took me too long to find out it was
> responsible for rolling back a few of my packages, causing some tests to
> break (especially since I didn't have master checked out for it, so I
> wasn't expecting it to really change! I only tracked it down because of the
> julia-users announcement). That's not Optim's fault, but an issue with the
> package system for not making it explicit why it was occurring (at least it
> didn't a month ago?). I think Pkg.update() tells you when a package is
> rolled back, but not why.
>
>
>
> IIRC, I was really hoping that Pkg.status() would tell me whenever a
> package was not at its highest version due to another package, and tell
> me which package was doing that. For example,
>
>
>
> -CoolPkg 0.1 (Rolled back due to AwesomeFoo)
>
>
> Then it would be easy to see where I should checkout master, find how to
> make them work together, and submit a pull request! But I don't know if
> that would be difficult to implement.
>
> On Friday, July 22, 2016 at 7:24:30 PM UTC-7, Tony Kelman wrote:
>
> Maybe a useful function to write and submit to PkgDev would be to go through
> all installed packages, check the METADATA requires file for all the
> installed versions and display a list of upper-bounded dependencies and
> which package is responsible for each. A little bit of code might go a long
> way in making this more discoverable.
>
>
>


Re: [julia-users] Composite Type Array

2016-07-25 Thread maxent219
Thanks a lot, Cameron & Jared.

This makes much more sense now & the composite type seems more usable. I 
had erroneous & conflicting expectations of how I would interact with the 
events data structure after preallocating it. 

I'm going to try it out & see how it fits into a larger scale project. 

On Friday, July 22, 2016 at 3:19:26 PM UTC-4, Cameron McBride wrote:
>
>
> On Fri, Jul 22, 2016 at 3:09 PM, Cameron McBride  > wrote:
>
>> julia> function get_event(v::Vector{ExampleEvent}, i)
>>  n = length(v)
>>  if i > n
>>for j in n:i
>>push!(v, ExampleEvent("",0,0,0,0,0,0)) # default values
>>end
>>  end
>>  v[i]
>>end
>>
>
> haha, that creates one extra element (ExampleEvent row) on the first pass. 
> Whoops. Here is a quick fix. Likely better ways to do all this, just 
> kicking it further.
>
> julia> function get_event(v::Vector{ExampleEvent}, i)
>  n = length(v)
>  if i > n
>i1 = n > 0 ? n : 1
>for x in i1:i
>push!(v, ExampleEvent("",0,0,0,0,0,0)) # default values
>end
>  end
>  v[i]
>end
>
> Cameron
>
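
A slightly more general variant of the grow-on-demand accessor from this thread, appending exactly `i - length(v)` defaults whatever the starting length (the default field values are placeholders, as above):

~~~
function get_event!(v::Vector{ExampleEvent}, i)
    for _ in length(v)+1:i                       # append just enough defaults
        push!(v, ExampleEvent("", 0, 0, 0, 0, 0, 0))
    end
    v[i]
end
~~~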


[julia-users] Re: Interactive Animated PyPlot in IJulia

2016-07-25 Thread Thomas Hudson
Since Julia 0.4.6, this solution no longer seems to work: the code reverts 
to plotting only the final frame calculated.

Does anyone have any idea how to tweak the code and get identical 
on-the-fly plotting behaviour with PyPlot under Julia 0.4.6?

Thanks for any help you can give,

Tom

On Thursday, 27 November 2014 22:12:14 UTC+1, Christoph Ortner wrote:
>
> And here is the working code:
>
> [1]
> using Gadfly,Reactive,Interact,PyPlot
> myfig = figure()
> function myplot(data)
> withfig(myfig) do
> PyPlot.plot(data[1], data[2])
> axis([0,1,-.3,.3])
> end
> end
> x = linspace(0,1,100)
> myinput=Input((x,0*x))
> lift(myplot, myinput)
>
> [2]
> x = linspace(0,1,100)
> for t = -1:.1:1
> y = t * x .*(1-x)
> push!(myinput,(x, y))
> end
>
>
> On Thursday, 27 November 2014 21:11:22 UTC, Christoph Ortner wrote:
>>
>> Hi Steven,
>>
>> That worked! Thank you.
>>
>> (Though admittedly I did not fully understand your explanation.)
>>
>> All the best, 
>> Christoph
>>
>> On Thursday, 27 November 2014 19:04:12 UTC, Steven G. Johnson wrote:
>>>
>>> PyPlot, like the Python package of the same name, plots as a side 
>>> effect. You can use the withfig function to wrap PyPlot commands and make 
>>> them functional (returning the figure object as the withfig return value 
>>> rather than displaying it as a side effect). This allows Pyplot to be used 
>>> with @manipulate, but should also work with other Reactive functions. 
>>
>>

[julia-users] Re: splice! inconsistency with Vector{Vector{UInt8}}

2016-07-25 Thread 'George Marrows' via julia-users
To answer myself: I didn't read the docs correctly. Or I got my types 
mangled. Or both.

The final splice should be
  splice!(a, 1:0, Vector{UInt8}[UInt8[7,3]])
not
  splice!(a, 1:0, UInt8[7,3])

splice! splices in another collection, not a single element.

Interfaces are very informally defined in Julia, aren't they? I don't see 
any way to name an iterable type so that the definition of splice! 
could be, e.g.,
  function splice!{E, T<:Integer}(a::Vector{E}, r::UnitRange{T}, 
ins::Iterable{E})
... which would have helped me not to trip over this problem

George


On Saturday, 23 July 2016 21:58:10 UTC+3, George Marrows wrote:
>
> Hi. As per the docs on splice! and n:(n-1) ranges, the following works 
> fine to insert a value before location 1 in a Vector:
> x = [1,2,3]
> splice!(x, 1:0, 23)
> print(x)  # => [23,1,2,3]
>
> But the following behaves strangely and seems to corrupt the array in the 
> process:
> a = Vector{UInt8}[UInt8[0xf3,0x32,0x37]]
> splice!(a, 1:0, UInt8[7,3])   # =>  MethodError: `convert` has no method 
> matching convert(::Type{Array{UInt8,1}}, ::UInt8)
> print(a) # => [#undef,#undef,UInt8[0xf3,0x32,0x37]]
>
> Is this a bug, or have I simply not read the docs correctly?
>
> Many thanks, 
> George
>


Re: [julia-users] ReadOnlyMemoryError() on Windows 64

2016-07-25 Thread 'Bill Hart' via julia-users
Coincidentally we had a failure in the same test on Linux today (after 
running it hundreds of times before without issue).

My colleague had the bright idea of running that test many times so that we 
could reliably reproduce the error on Linux. We then found the bug, which 
was in fact a bug in our code (not properly initialising a struct being 
passed to C).

This is almost certainly what was causing the issue on Windows. So I think 
the problem is likely solved.

Bill.

On Sunday, 24 July 2016 23:35:42 UTC+2, Yichao Yu wrote:
>
> On Sun, Jul 24, 2016 at 5:22 PM, 'Bill Hart' via julia-users 
> > wrote: 
> > I built the dlls we make use of in our Nemo package a slightly odd way, 
> but 
> > everything worked, all tests passed. 
> > 
> > I decided not to be lazy and built the dlls the correct way, and all of 
> a 
> > sudden I get a ReadOnlyMemoryError() whilst running our test code. 
> > 
> > This is with either Julia 0.4.0 or 0.4.6 on Windows 64. 
> > 
> > depends22 shows no problems with the dlls and that they don't depend on 
> the 
> > msys dll. I'm quite sure I've built them correctly. 
> > 
> > Julia seems to think the dlls are valid, I just get that error. 
> > 
> > Does anyone have any insight into what this error is and what is causing 
> it? 
>
> The error means there's some memory error. It's impossible to tell 
> what's causing it without more detail. 
>
> > 
> > Bill. 
>


Re: [julia-users] splice! inconsistency with Vector{Vector{UInt8}}

2016-07-25 Thread Milan Bouchet-Valat
Le samedi 23 juillet 2016 à 11:37 -0700, 'George Marrows' via julia-
users a écrit :
> Hi. As per the docs on splice! and n:(n-1) ranges, the following
> works fine to insert a value before location 1 in a Vector:
> x = [1,2,3]
> splice!(x, 1:0, 23)
> print(x)  # => [23,1,2,3]
This example works, but only as a special case of '23' being taken as
equivalent to '[23]'.

> But the following behaves strangely and seems to corrupt the array in
> the process:
> a = Vector{UInt8}[UInt8[0xf3,0x32,0x37]]
> splice!(a, 1:0, UInt8[7,3])   # =>  MethodError: `convert` has no
> method matching convert(::Type{Array{UInt8,1}}, ::UInt8)
> print(a) # => [#undef,#undef,UInt8[0xf3,0x32,0x37]]
This example fails because 'a' is a vector of vectors, so you need to
pass a vector of vectors to splice!() too. Since you pass it a vector
of UInt8, it tries to insert UInt8 values into a, which isn't possible.
The failure comes from 'convert(Vector{UInt8}, 7)'.

Try with
splice!(a, 1:0, Vector{UInt8}[UInt8[7,3]])

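Putting the pieces together (Julia 0.4 syntax, matching the thread):

~~~
a = Vector{UInt8}[UInt8[0xf3, 0x32, 0x37]]
# The replacement argument must itself be a collection of Vector{UInt8}:
splice!(a, 1:0, Vector{UInt8}[UInt8[0x07, 0x03]])
# a == Vector{UInt8}[UInt8[0x07, 0x03], UInt8[0xf3, 0x32, 0x37]]
~~~
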
The array corruption is a different issue which is hard to fix:
checking whether conversion will succeed before resizing the array
hurts performance a lot, so currently if conversion fails the array is
resized anyway. This might be fixed at some point if a good solution is
found. See https://github.com/JuliaLang/julia/issues/15868


Regards

> Is this a bug, or have I simply not read the docs correctly?



Re: [julia-users] Re: What packages, features, other strengths would you recommend when showing Julia to molecular neurobiologists?

2016-07-25 Thread Tamas Papp
Hate to sound like a curmudgeon, but language evangelism frequently
backfires, and if it is coming from a person not working in a particular
problem domain, the best you can expect is a shrug. Which is fair I
guess, "I don't know what you are doing but I am sure you would find
language X a good match for it" doesn't sound too convincing.

On the bright side, Julia is spreading fast in many communities, so if it
is useful for those scientists, I am sure they will find it on their own
quickly. When they are ready; and when the language is ready.

On Mon, Jul 25 2016, Chris Rackauckas wrote:

> It seems like most of what they do is biostatistics/bioinformatics. I would 
> show them PyCall and RCall. Knowing that you easily have all of those 
> libraries (and your previous libraries) is great. Also show them the 
> JuliaStats stuff. 
>
> In fact, ask them what they'd want to add to Julia if they had the time. 
> You'll run the gamut and show them a package which already does it. This 
> happens all the time on the Gitter: someone new comes saying "hey I want to 
> learn Julia. It's new so it doesn't have many packages... does it have 
> something for this? Oh it does... this? Oh it does... this? Oh..., its 
> package system is actually pretty complete." This combined with the 
> R/Python/MATLAB glue really makes one confident that Julia at least has 
> enough to try on a real project (and get hooked).
>
> I'd also show them Plots.jl. It is also much nicer than other plotting 
> libraries I've used before. The fact that you can switch backends with the 
> same code means that you get all the new updates "for free" when backends 
> come out (I'm looking at GLVisualize!)
>
> Definitely show them the BioJulia group. 
>
> Show them @parallel and pmap. If they have HPCs, show them how to just give 
> Julia the machinefile and together you already have multinode parallelism 
> for embarrassingly parallel problems.
>
> Last but not least, show them the community: julia-users, the Gitter 
> channels for chatting with the devs, etc. Knowing that there's always help 
> right there is really wonderful. 
>
> On Monday, July 25, 2016 at 2:44:16 AM UTC-7, Job van der Zwan wrote:
>>
>> *TLDR: I'd like to show Julia to my colleagues, but don't have a clue 
>> which cool packages and features I should show off to them, because I don't 
>> do any scientific work myself.*
>>
>> Hi,
>>
>> I'm an interaction designer working for a research group at Karolinska 
>> Institute[0]. Basically, I'm a glorified front-end webdev. I don't do any 
>> scientific work myself, I'm just building a web-based interface for 
>> browsing and visualizing single-cell data for them. So my use-cases don't 
>> seem to align with Julia's strengths, but I like the design of the 
>> language, the ideas behind the project and have been following its 
>> development with great pleasure.
>>
>> Last week while watching a bunch of JuliaCon videos during a lunch break, 
>> one of my colleagues asked what the video was about. I tried to explain the 
>> Julia project to him, as well as the language's strengths and weaknesses. 
>> Sadly, I didn't really do a good job of it, since I don't actually program 
>> in it myself. He said it looked a lot like Matlab (his language of choice) 
>> and was interested in the free-and-open-source aspect. But he expected 
>> there to not be enough packages yet for him to work with it and was 
>> sceptical about whether switching to it would be worth it. I tried to 
>> explain that Julia can call out to Matlab code with practically no 
>> overhead, but he didn't really look convinced (and I didn't have a working 
>> Julia environment to show it off to him either). Jupyter was also a 
>> turn-off, since he doesn't like notebooks, but the Juno video compensated 
>> for that a lot.
>>
>> Basically, I'd like to show Julia to my colleagues, give them some 
>> pointers on where it might be fun to start playing with it, what are some 
>> of its amazing features *that matter to them*, but I don't have a clue of 
>> what I should focus on to do so.
>>
>> The researchers I work for are molecular neurobiologists. They're doing 
>> pretty well, having published in Science last year and this year, see 
>> here[1] for a list of publications. Currently Anaconda is the "lingua 
>> franca" platform, but some in the group prefer Matlab or R over Python. Of 
>> course, one of Julia's selling points is that it's a very "inclusive" 
>> language, so I definitely could show that, but I don't know what else to 
>> demonstrate. I'm hoping there are researchers here with similar enough 
>> use-cases for Julia who could give me some suggestions about what kind of 
>> things they might really like over their existing solutions.
>>
>> Cheers,
>> Job
>>
>> [0] http://linnarssonlab.org/
>> [1] http://linnarssonlab.org/publications/
>>



[julia-users] help with integrating websockets and protobuf

2016-07-25 Thread Jon Norberg
Anyone have any experience with protobuf/WebSockets?



[julia-users] Re: What packages, features, other strengths would you recommend when showing Julia to molecular neurobiologists?

2016-07-25 Thread Chris Rackauckas
It seems like most of what they do is biostatistics/bioinformatics. I would 
show them PyCall and RCall. Knowing that you easily have all of those 
libraries (and your previous libraries) is great. Also show them the 
JuliaStats stuff. 

In fact, ask them what they'd want to add to Julia if they had the time. 
You'll run the gamut and show them a package which already does it. This 
happens all the time on the Gitter: someone new comes saying "hey I want to 
learn Julia. It's new so it doesn't have many packages... does it have 
something for this? Oh it does... this? Oh it does... this? Oh..., its 
package system is actually pretty complete." This combined with the 
R/Python/MATLAB glue really makes one confident that Julia at least has 
enough to try on a real project (and get hooked).

I'd also show them Plots.jl. It is also much nicer than other plotting 
libraries I've used before. The fact that you can switch backends with the 
same code means that you get all the new updates "for free" when backends 
come out (I'm looking at GLVisualize!)

Definitely show them the BioJulia group. 

Show them @parallel and pmap. If they have HPCs, show them how to just give 
Julia the machinefile and together you already have multinode parallelism 
for embarrassingly parallel problems.
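
A minimal sketch of that workflow (the host file and the `simulate` function are placeholders):

~~~
# Start Julia with the workers listed in a machinefile, e.g.
#   julia --machinefile hosts.txt script.jl
# or add local workers explicitly:
addprocs(4)

@everywhere simulate(i) = sum(rand(10^6)) / 10^6   # stand-in for real per-task work

results = pmap(simulate, 1:100)   # distribute 100 independent tasks across the workers
~~~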

Last but not least, show them the community: julia-users, the Gitter 
channels for chatting with the devs, etc. Knowing that there's always help 
right there is really wonderful. 

On Monday, July 25, 2016 at 2:44:16 AM UTC-7, Job van der Zwan wrote:
>
> *TLDR: I'd like to show Julia to my colleagues, but don't have a clue 
> which cool packages and features I should show off to them, because I don't 
> do any scientific work myself.*
>
> Hi,
>
> I'm an interaction designer working for a research group at Karolinska 
> Institute[0]. Basically, I'm a glorified front-end webdev. I don't do any 
> scientific work myself, I'm just building a web-based interface for 
> browsing and visualizing single-cell data for them. So my use-cases don't 
> seem to align with Julia's strengths, but I like the design of the 
> language, the ideas behind the project and have been following its 
> development with great pleasure.
>
> Last week while watching a bunch of JuliaCon videos during a lunch break, 
> one of my colleagues asked what the video was about. I tried to explain the 
> Julia project to him, as well as the language's strengths and weaknesses. 
> Sadly, I didn't really do a good job of it, since I don't actually program 
> in it myself. He said it looked a lot like Matlab (his language of choice) 
> and was interested in the free-and-open-source aspect. But he expected 
> there to not be enough packages yet for him to work with it and was 
> sceptical about whether switching to it would be worth it. I tried to 
> explain that Julia can call out to Matlab code with practically no 
> overhead, but he didn't really look convinced (and I didn't have a working 
> Julia environment to show it off to him either). Jupyter was also a 
> turn-off, since he doesn't like notebooks, but the Juno video compensated 
> for that a lot.
>
> Basically, I'd like to show Julia to my colleagues, give them some 
> pointers on where it might be fun to start playing with it, what are some 
> of its amazing features *that matter to them*, but I don't have a clue of 
> what I should focus on to do so.
>
> The researchers I work for are molecular neurobiologists. They're doing 
> pretty well, having published in Science last year and this year, see 
> here[1] for a list of publications. Currently Anaconda is the "lingua 
> franca" platform, but some in the group prefer Matlab or R over Python. Of 
> course, one of Julia's selling points is that it's a very "inclusive" 
> language, so I definitely could show that, but I don't know what else to 
> demonstrate. I'm hoping there are researchers here with similar enough 
> use-cases for Julia who could give me some suggestions about what kind of 
> things they might really like over their existing solutions.
>
> Cheers,
> Job
>
> [0] http://linnarssonlab.org/
> [1] http://linnarssonlab.org/publications/
>


Re: [julia-users] Re: Calling all users of ParallelAccelerator.

2016-07-25 Thread pevnak
Hi Todd,
I have been looking at Latte and it does not seem to be useful for me, 
since I need some special constructs and they are just not available.
Nevertheless, I would like to ask whether Latte uses parallelization. In my 
own implementation, I am struggling to exploit multi-core hardware.
Thank you very much.
Best wishes,
Tomas




On Saturday, 23 July 2016 05:42:57 UTC+2, Todd Anderson wrote:
>
> You may also want to look at another IntelLabs project on GitHub called 
> Latte.  It provides a DSL for deep neural networks in Julia.
>
> Todd
>
> --
> *From: *pev...@gmail.com 
> *To: *"julia-users" >
> *Sent: *Friday, July 22, 2016 8:07:19 PM
> *Subject: *[julia-users] Re: Calling all users of ParallelAccelerator.
>
> Hi Todd,
>>
> I have tried several times to use ParallelAccelerator to speed up my toy 
> Neural Network library, but I never had any significant performance boost. 
> I like the idea of the project a lot, sadly I was never able to fully 
> utilise it.
>
> Best wishes,
> Tomas 
>
>

[julia-users] What packages, features, other strengths would you recommend when showing Julia to molecular neurobiologists?

2016-07-25 Thread Job van der Zwan
*TLDR: I'd like to show Julia to my colleagues, but don't have a clue which 
cool packages and features I should show off to them, because I don't do 
any scientific work myself.*

Hi,

I'm an interaction designer working for a research group at Karolinska 
Institute[0]. Basically, I'm a glorified front-end webdev. I don't do any 
scientific work myself, I'm just building a web-based interface for 
browsing and visualizing single-cell data for them. So my use-cases don't 
seem to align with Julia's strengths, but I like the design of the 
language, the ideas behind the project and have been following its 
development with great pleasure.

Last week while watching a bunch of JuliaCon videos during a lunch break, 
one of my colleagues asked what the video was about. I tried to explain the 
Julia project to him, as well as the language's strengths and weaknesses. 
Sadly, I didn't really do a good job of it, since I don't actually program 
in it myself. He said it looked a lot like Matlab (his language of choice) 
and was interested in the free-and-open-source aspect. But he expected 
there to not be enough packages yet for him to work with it and was 
sceptical about whether switching to it would be worth it. I tried to 
explain that Julia can call out to Matlab code with practically no 
overhead, but he didn't really look convinced (and I didn't have a working 
Julia environment to show it off to him either). Jupyter was also a 
turn-off, since he doesn't like notebooks, but the Juno video compensated 
for that a lot.

Basically, I'd like to show Julia to my colleagues, give them some pointers 
on where it might be fun to start playing with it, what are some of its 
amazing features *that matter to them*, but I don't have a clue of what I 
should focus on to do so.

The researchers I work for are molecular neurobiologists. They're doing 
pretty well, having published in Science last year and this year, see 
here[1] for a list of publications. Currently Anaconda is the "lingua 
franca" platform, but some in the group prefer Matlab or R over Python. Of 
course, one of Julia's selling points is that it's a very "inclusive" 
language, so I definitely could show that, but I don't know what else to 
demonstrate. I'm hoping there are researchers here with similar enough 
use-cases for Julia who could give me some suggestions about what kind of 
things they might really like over their existing solutions.

Cheers,
Job

[0] http://linnarssonlab.org/
[1] http://linnarssonlab.org/publications/