Re: [julia-users] ANN node-julia 1.0.0
Some basic questions: We know that Node blocks on CPU-intensive tasks. If I use the async option, are the calculations run separately? That is, does it allow the mode process to continue with its event loop? If the answer is yes, what happens when Julia is busy with a previous calculation and a new one is passed to it? Thanks so much for creating this.
Re: [julia-users] ANN node-julia 1.0.0
I meant the node process, not mode process. Blaming the autocorrect on the iPad . . .

On Saturday, January 17, 2015 at 10:53:36 AM UTC-5, Test This wrote:
> Some basic questions: We know that Node blocks on CPU-intensive tasks. If I use the async option, are the calculations run separately? That is, does it allow the mode process to continue with its event loop? If the answer is yes, what happens when Julia is busy with a previous calculation and a new one is passed to it? Thanks so much for creating this.
Re: [julia-users] ANN node-julia 1.0.0
On Saturday, January 17, 2015 at 10:53:36 AM UTC-5, Test This wrote:
> Some basic questions: We know that Node blocks on CPU-intensive tasks. If I use the async option, are the calculations run separately? That is, does it allow the node process to continue with its event loop?

The answer is yes, and the reason is that though node is single-threaded, native modules can be multi-threaded. The main thread accepts work, queues it along with the function that will receive the result, and returns immediately; meanwhile, a couple of other threads dequeue the work, evaluate it, and enqueue the result, and finally the main thread is signaled using uv_async_send.

> If the answer is yes, what happens when julia is busy with a previous calculation and a new one is passed to it?

Currently there's good news and bad news. The good news is that node-julia async calls will not block, and so node will not block; the bad news is that the evaluator running in a separate thread waits for the result before moving on to the next thing.

> Thanks so much for creating this.

Oh man, you're welcome, most definitely.
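The dispatch pattern described above can be sketched in plain JavaScript. This is a hypothetical single-threaded stand-in (node-julia's real queue lives in C++ across uv threads): enqueue() returns immediately, and a separate drain step plays the role of the worker that evaluates jobs strictly one at a time, which is also why a long job delays everything queued behind it.

```javascript
// Minimal sketch of the accept-then-queue pattern (hypothetical names).
// enqueue() never blocks the caller; drain() stands in for the worker
// thread that evaluates jobs in order and hands each result back.
class WorkQueue {
  constructor() { this.jobs = []; }
  enqueue(work, done) {          // main thread: queue and return immediately
    this.jobs.push({ work, done });
  }
  drain() {                      // "worker thread": one job at a time
    while (this.jobs.length > 0) {
      const { work, done } = this.jobs.shift();
      done(work());              // evaluate, then deliver the result
    }
  }
}

const q = new WorkQueue();
const results = [];
q.enqueue(() => 2 + 2, (r) => results.push(r));
q.enqueue(() => 3 * 3, (r) => results.push(r));
q.drain();
// results is now [4, 9]
```

In node-julia the equivalent of drain() runs on separate threads and signals completion back to the event loop via uv_async_send, so node itself never waits.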
Re: [julia-users] ANN node-julia 1.0.0
This is awesome. That would provide an easy way to scale a web application, at least to some extent, by moving the CPU-intensive calculation out of Node. Currently, Julia takes a long time to start up. Does the Julia process start when include myjuliafile.jl is called? If so, can one call this line somewhere when the server starts? Otherwise, the start-up time will increase the wait for the web clients.

On Saturday, January 17, 2015 at 12:32:36 PM UTC-5, Jeff Waller wrote:
> The answer is yes ... but the bad news is that the evaluator running in a separate thread waits for the result before moving on to the next thing.
Re: [julia-users] ANN node-julia 1.0.0
Yeah, considering issue #1 is "Installation is too complex!" I'm not sure whether it's exactly production-usable or widely known at this point. But of the linear-algebra-for-node/browsers projects I've seen out there, it appears to be the most sophisticated by a pretty substantial margin. Most others look to be simple wrappers of some small fraction of the reference netlib BLAS/LAPACK, or naive JS implementations that look a lot more like the reference versions at a source level than a properly optimized blocked, SIMD, multithreaded, etc., modern BLAS.

On Saturday, January 17, 2015 at 8:59:45 PM UTC-8, Jeff Waller wrote:
> Ok I'm up for this, it's a lot of stuff, hope the cut-and-paste install method works. ...
Re: [julia-users] ANN node-julia 1.0.0
Might be interesting to compare the performance here vs https://github.com/amd/furious.js

On Saturday, January 17, 2015 at 7:02:21 PM UTC-8, Test This wrote:
> This is awesome. That would provide an easy way to scale a web application ... Otherwise, the start-up time will increase the wait for the web clients.
Re: [julia-users] ANN node-julia 1.0.0
On Saturday, January 17, 2015 at 10:02:21 PM UTC-5, Test This wrote:
> This is awesome. That would provide an easy way to scale a web application, at least to some extent, by moving the CPU-intensive calculation out of Node.

And you get access to BLAS, Mocha (GPU even), JuMP, Optim, Stats. Good for fast access to website-oriented machine learning stuff: regressions, recommender systems, etc.

> Currently, Julia takes a long time to start up. Does the Julia process start when include myjuliafile.jl is called? If so, can one call this line somewhere when the server starts?

Because it's an embed, Julia exists/executes as part of the node process in another thread. The engine starts up the first time a call to eval, exec, or Script is made. For first-time stuff, especially loading big packages, there's going to be some lag because of the action of the JIT, but I haven't seen lag otherwise.

> Otherwise, the start-up time will increase the wait for the web clients.

There should not be re-JITing per connection, so I expect this to not be a problem. Feedback?
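Since the engine starts on the first call, the JIT cost can be paid once at server start-up rather than on a client's first request. A minimal sketch of that idea, where loadJulia is a hypothetical stand-in for whatever makes the first node-julia call (e.g. evaluating include("myjuliafile.jl")):

```javascript
// once() guarantees the expensive initializer runs a single time, so it
// can be called defensively from request handlers as well as at start-up.
function once(init) {
  let ran = false;
  return function () {
    if (!ran) { ran = true; init(); }
  };
}

let loadCount = 0;
// Pretend this closure is the slow first eval that boots the Julia engine.
const warmUp = once(() => { loadCount += 1; });

warmUp(); // at server start, before accepting connections
warmUp(); // later calls are no-ops
// loadCount === 1
```

The point is simply to move the one-time JIT/start-up lag ahead of the first web client, matching the "call this line somewhere when the server starts" suggestion above.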
Re: [julia-users] ANN node-julia 1.0.0
On Saturday, January 17, 2015 at 10:59:13 PM UTC-5, Tony Kelman wrote:
> Might be interesting to compare the performance here vs https://github.com/amd/furious.js

Ok, I'm up for this; it's a lot of stuff, hope the cut-and-paste install method works. The unit test page https://amd.github.io/furious.js/unittest.html looks like it's not all there, and they haven't updated lately (Aug). What's up?
Re: [julia-users] ANN node-julia 1.0.0
This is super cool. I wonder if it wouldn't be possible to allow Julia to operate on JavaScript typed arrays in place?

On Fri, Jan 16, 2015 at 4:20 PM, Kevin Squire kevin.squ...@gmail.com wrote:
> Hi Jeff, can you share a link? Cheers, Kevin

On Fri, Jan 16, 2015 at 1:06 PM, Jeff Waller truth...@gmail.com wrote:
> So I'm happy to announce version 1.0.0 of node-julia, a Julia engine embedded in node, and now io.js too. It's been a pretty long road and I owe many people (perhaps reading this now) a lot. I've said many times (maybe not on this forum) that enabling people is the important part, and I hope this tool does that. Some of the new features supported since my first update here (in Sept):
> * both asynchronous and synchronous processing uses libuv (inside joke)
> * use of JavaScript typed arrays where possible
> * Julia composites in JavaScript as (the opaque) JRef
> * functionalized Scripts
> All those keywords are probably not that interesting, but I can share this, which might be. I've done some early testing on simple matrix multiplications (will blog, but not done), and it turns out it's actually faster to copy the array from JavaScript into the Julia engine, multiply, and then copy the result back out than to use JavaScript directly for most matrices (maybe not 3x3). And when compared to the other popular linear algebra packages for node, Julia-within-node can be a lot faster -- sometimes 1000x faster.
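For context on the benchmark claim above, the plain-JavaScript baseline being compared against would look something like this naive triple-loop multiply (a sketch; the actual benchmark code isn't shown in the thread). This is what copying into the Julia engine's optimized BLAS and back reportedly beats for all but tiny matrices.

```javascript
// Naive row-major n x n matrix multiply over flat Float64Arrays -- the
// kind of pure-JS implementation node-julia was benchmarked against.
function matmul(a, b, n) {
  const c = new Float64Array(n * n);
  for (let i = 0; i < n; i++) {
    for (let k = 0; k < n; k++) {
      const aik = a[i * n + k];
      for (let j = 0; j < n; j++) {
        c[i * n + j] += aik * b[k * n + j];
      }
    }
  }
  return c;
}

// 2x2 example: [[1,2],[3,4]] * [[5,6],[7,8]] = [[19,22],[43,50]]
const r = matmul(new Float64Array([1, 2, 3, 4]),
                 new Float64Array([5, 6, 7, 8]), 2);
```

Even with the i-k-j loop order (which is cache-friendlier than the textbook i-j-k), this has none of the blocking, SIMD, or multithreading of a real BLAS, which is where the reported gap comes from.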
Re: [julia-users] ANN node-julia 1.0.0
On Friday, January 16, 2015 at 4:26:30 PM UTC-5, Stefan Karpinski wrote:
> This is super cool. I wonder if it wouldn't be possible to allow Julia to operate on JavaScript typed arrays in place?

Hmm, maybe! With some caveats. First, the good news.

Here's where the JavaScript array buffer is obtained:
https://github.com/waTeim/node-julia/blob/master/src/NativeArray.h#L59

Here's the relevant part of the copy from JavaScript to C++; it's pointer to pointer:
https://github.com/waTeim/node-julia/blob/master/src/request.cpp#L225

And here it is again from C++ to Julia, a buffer handoff:
https://github.com/waTeim/node-julia/blob/master/src/rvalue.cpp#L175

Caveats:
* This is all happening in separate threads.
* v8 has its own memory management; however, ArrayBuffers can be neutered:
https://github.com/joyent/node/blob/master/deps/v8/include/v8.h#L2761
* The node heap size is 2G, but maybe neutering means the buffer no longer takes part in that calculation.
* JavaScript typed arrays are 1-D only; there's an implicit row-major to column-major transformation going on.

So multidimensional stuff might be a pain, but vector transfer of ownership? Yeah, probably.